
ATi has put the smackdown on nVidia - Crossfire > SLi

Descent

Hella Constipated
#1
http://ati.com/products/crossfire/index.html

I'm going to let it speak for itself. I can't elaborate that much - let's just say it's like comparing Doom to Pong.

If you own a PCI-Express X800 or X850 line card, you're good to go. Pick up a Crossfire-ready motherboard and a Crossfire Edition card, which doesn't even have to be the same make or model, or come from the same company; it just has to be on the same product line - X800 or X850.

The two cards can also run at different clock speeds, even when overclocked.

And if you really want to cream your pants, check out the white papers.

I am really saving up every penny I can for a CrossFire rig, because it's futureproofing to the extreme. Spend $2000, never upgrade again. It's not even the gaming aspect - it's the ability to spend all this cash and never have to spend a penny more.

And this is PRECISELY why I've been advising people to not upgrade yet, and why I haven't upgraded yet.

I always said "Wait up for CrossFire - let's see who will come out on top."

But it's not even a matter of who may come out on top - it's a matter of who is. And ATi doesn't break any promises anymore - I'm 100% confident that CrossFire will kick ass. Their 9600XT won me over, and the 9700 Pro made me fall in love even more. I was still an nVidia fan, but for me to really like a platform - be it anything from a computer company, hardware manufacturer, gaming company (*cough* Sega *cough*), or GPU supplier - you've got to really impress me, because I test everything first and know what I like.

And Junglizm - if ya can, return the shit for that 939 upgrade.
 

Jung

???
Premium
#2
It's not even out yet and requires a special motherboard, of which there are currently very few. It is better than SLI, but dual video cards are a very niche market. Nvidia still has the upper hand with the 7800GTX right now (at least for a while).
 

Broken

Member smoked too much weed!
#3
Dunno~ Certain makers have known about it for a while and have been shipping boards for months now, well ok a month! ROFL
 

Descent

Hella Constipated
#4
junglizm said:
It's not even out yet and requires a special motherboard, of which there are currently very few. It is better than SLI, but dual video cards are a very niche market. Nvidia still has the upper hand with the 7800GTX right now (at least for a while).
Yes, but if you factor in the fact that it will be cheaper, and more powerful in the long run, I think nVidia will be one-upped before its release.

ATi hasn't produced anything of bad quality since the 9700 Pro. Software-wise, there's been only one bad driver build since then - the Catalyst 5.4 release.

I think ATi will win out. It's a better technology. ATi's cards have substantially more bang for the buck. They have (arguably) better drivers.

And the fact that you can keep your existing model of card and pair it with a whole different card as the master, or even the slave, makes things much easier.

And Junglizm - did you read everything they have there? Because it took me around an hour to soak it all in and think about it.
 

Jung

???
Premium
#5
And Junglizm - if ya can, return the shit for that 939 upgrade.
I haven't ordered everything yet; some things were out of stock (video card, motherboard). I have no intention of running a dual-card system though, be it SLI or Crossfire. These won't be as future-proof as you think, especially since ATI cards don't even support PS 3.0.

I'll probably just get an X850 XT or 7800 though. I don't want dual cards.
 
#6
Yeah, I don't think this will catch on till everyone decides to upgrade. The technology world moves too fast; I bet many people just bought the 7800 anyway.
 

Jung

???
Premium
#7
Descent said:
Yes, but if you factor in the fact that it will be cheaper, and more powerful in the long run, I think nVidia will be one-upped before its release.
I don't see how two $400 cards are cheaper than a single $500-$600 card, but ok.
I think ATi will win out. It's a better technology. ATi's cards have substantially more bang for the buck. They have (arguably) better drivers.
Its dual card technology IS better, I said that, but ATI's single card technology is still behind the 7800GTX at this point.
And Junglizm - did you read everything they have there? Because it took me around an hour to soak it all in and think about it.
Nope, but I've been reading up on it, and I read the [T]ardOCP review on it about a month ago.
 

Descent

Hella Constipated
#8
junglizm said:
I haven't ordered everything yet; some things were out of stock (video card, motherboard). I have no intention of running a dual-card system though, be it SLI or Crossfire. These won't be as future-proof as you think, especially since ATI cards don't even support PS 3.0.

I'll probably just get an X850 XT or 7800 though. I don't want dual cards.
I personally feel that SuperAA will kill any kind of eye candy that PS 3.0 can offer, but at a possibly lower resolution.

I'd personally take 14xAA over PS3.0 lighting any day. Let alone the fact that 14xAA is a fusion of both supersampling (FSAA) and multisampling (MSAA), so you get the eye candy advantage of full-scene anti-aliasing along with the performance advantage of multisample anti-aliasing.
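Here's a rough Python-flavored sketch of why that fusion is attractive - the sample counts and the little helper below are made up for illustration, not ATi's actual SuperAA sample layout:

Code:
# Illustrative only - hypothetical sample counts, NOT ATi's real SuperAA modes.
def combined_aa(msaa_samples_per_gpu, ssaa_passes):
    """Effective edge samples and relative shading cost when each GPU runs MSAA
    and the two GPUs' jittered outputs are blended (a supersample-style pass)."""
    edge_samples = msaa_samples_per_gpu * ssaa_passes  # coverage samples per pixel
    shading_cost = ssaa_passes                         # each pixel is shaded once per pass
    return edge_samples, shading_cost

# e.g. 6x MSAA on each of two offset GPUs:
print(combined_aa(6, 2))  # -> (12, 2): 12 edge samples for roughly 2x the shading cost

The point being: the multisample part piles on edge samples nearly for free, while the supersample part costs shading work but smooths textures and shader output too.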
 

Descent

Hella Constipated
#9
junglizm said:
I don't see how two $400 cards is cheaper than a single $500-$600 card, but ok.
It's the fact that you could pair up two $300 X800XL cards, and kick the shit out of a single card setup. The X800XL Crossfire Edition card may cost around $50 more, but that's just my guess, to account for the image processing chip.

If you want to get deeper into it, yes, the graphics solution is cheaper and the motherboards are more expensive, but the total cost of ownership is lower if you spend more up front and future-proof the system.
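To put some strictly hypothetical numbers on that TCO claim (the prices and upgrade cadence below are assumptions for illustration, not real quotes):

Code:
# Hypothetical prices, purely to illustrate the total-cost-of-ownership argument.
def total_cost(platform, cards):
    """Board/PSU/etc. plus every card bought over the period."""
    return platform + sum(cards)

dual_card_now    = total_cost(250, [300, 350])  # CrossFire board + two mid-range cards up front
single_plus_next = total_cost(180, [550, 500])  # cheaper board, one high-end card now, another later

print(dual_card_now, single_plus_next)  # 900 vs. 1230 with these made-up figures

Whether the dual-card route actually comes out ahead depends entirely on the numbers you plug in.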
 

Jung

???
Premium
#10
Descent said:
I personally feel that SuperAA will kill any kind of eye candy that PS 3.0 can offer, but at a possibly lower resolution.
Have you read up on Transparency FSAA yet? Furthermore, when more games start using PS 3.0 (which should be in the very near future), those ATI cards that don't support it will be missing out. I understand your opinion is that it won't matter, but it WILL be lacking in new technology.
Descent said:
It's the fact that you could pair up two $300 X800XL cards, and kick the shit out of a single card setup.
I'm still doubtful that those two cards will outperform a single 7800GTX. The price of the cards will be about the same, but the Crossfire capable motherboard adds a nice premium. Not to mention that it's doubtful you'll see any Nforce4 motherboards with Crossfire technology.

Dual video card technology is a silly fad, and will soon die out. I'm not going to waste my time, money or patience on a trivial technology.
but the total cost of ownership is lower if you spend more up front and future-proof the system.
I'm sorry, but you're not going to sell me on that. Whether you buy a single 6800Ultra, a single x850 XT, a single 7800GTX, an SLI combination or a crossfire combination, you WILL be future proof for quite some time to come. Crossfire is not some magical technology that will make your video card last forever. I'd be willing to bet that by the time the next gen cards come out, the dual card fad will be over or nearing its end.

Anyway, that's my take on it. Imo, it's not about ATI vs. Nvidia, it's about valid technology and horrible fads that die out before the technology even takes off.
 

Broken

Member smoked too much weed!
#11
Video Card fight!!! I bet My video card beats your video card.
 

Descent

Hella Constipated
#13
Junglizm said:
Dual video card technology is a silly fad, and will soon die out. I'm not going to waste my time, money or patience on a trivial technology.
What, like the Voodoo2? Or the two-chips-on-one-board Voodoo5? Or how about the Voodoo Rush, which had both a 2D chip from Alliance Semiconductor and a 3D Voodoo Graphics family chip on it? Or the ATi Rage Fury Maxx dual Rage-128 single board solution?

It's been around for ages. The only thing keeping us from doing it was the AGP spec, which didn't support two slots. Once we needed the extra bandwidth of AGP we couldn't go back to PCI.

PCI-E is changing that. It was never a "fad," it was a proven technology that died out because of another technology that it could not be utilized with.

Why am I even arguing this? It's cheaper than insane-ass transistor counts, it's more flexible for the end user, and it's been proven in the late 90's. Back then the ideal configuration was a 16MB nVidia Riva TNT card for 2D and some 3D work, and a dual 12MB Voodoo2 SLI setup. If you could not afford that, the next best thing was an ATi Rage Pro 8MB for 2D work, along with the Voodoo2 for 3D work. The ATi Rage Pro had stellar 2D performance because it was based on ATi's Mach64-VT chip, which was the fastest 2D chip of its time.

Junglizm said:
Have you read up on Transparency FSAA yet? Furthermore, when more games start using PS 3.0(which should be in the very near future), those ATI cards that don't support it will be missing out. I understand your opinion is that it won't matter, but it WILL be lacking in new technology.
Who gives a shit? No, honestly, who in their right mind would give a shit about PS3.0 enhanced lighting and performance when they could have a much more immersive experience with less jaggies? I think the amount of immersion the human mind would perceive with a more lifelike scene would heavily outweigh an improved lighting renderer for an already well-lit scene.


Junglizm said:
I'm sorry, but you're not going to sell me on that. Whether you buy a single 6800Ultra, a single x850 XT, a single 7800GTX, an SLI combination or a crossfire combination, you WILL be future proof for quite some time to come. Crossfire is not some magical technology that will make your video card last forever. I'd be willing to bet that by the time the next gen cards come out, the dual card fad will be over or nearing its end.

Anyway, that's my take on it. Imo, it's not about ATI vs. Nvidia, it's about valid technology and horrible fads that die out before the technology even takes off.
I already discussed this earlier. Everybody prefers more advanced technology. Unless you have the magical ability to bullshit developers and end users with specs that are false and unattainable in real-world performance, you can't win. That's what Sony did with the PlayStation 2, and it succeeded in kicking the Dreamcast out of the market: even though the Dreamcast had a faster GPU for T&L, double the framebuffer memory (8MB), superior FSAA quality and performance, and a much easier codebase to work with (DirectX/OpenGL), developers bailed.

But nVidia could not do that because not only has their product launched, but the PC gaming market and the console gaming market have two different mentalities. PC consumers are more tech-savvy, whereas console consumers are Average Joes. So just wait for the "Next-Gen" launch and you'll see my point as to upgrading your PC now - the new PS3 and Xbox 360 are crippled before launch. The XBox 360's CPU is only 2x faster than the XBox's, and the same goes for the PS3's. So while the developers will bail on the console industry, we will be functional.
 

Jung

???
Premium
#14
Descent said:
What, like the Voodoo2? Or the two-chips-on-one-board Voodoo5? Or how about the Voodoo Rush, which had both a 2D chip from Alliance Semiconductor and a 3D Voodoo Graphics family chip on it? Or the ATi Rage Fury Maxx dual Rage-128 single board solution?

It's been around for ages. The only thing keeping us from doing it was the AGP spec, which didn't support two slots. Once we needed the extra bandwidth of AGP we couldn't go back to PCI.
Translation: I hit reply without reading the linked article.

Dual video cards were asinine, overpriced and niche back then; they're the same way now and (imo) will continue to be in the future. There is no point in running dual cards and most people realize this. Only the most [T]ardcore fanboys care about getting 60FPS over what their eyes can even handle. Sure it looks impressive (sic) to put it in your sig on [T]ardOCP, but it's rather pointless to most people who are completely happy with the performance a $200 6600GT/x700 will give them. Furthermore, the people who are willing to spend $1000+ on an extra 10-20FPS (that they'll never actually notice) are going to upgrade when the new OMG FPS++ video cards come out anyway. Neither SLI nor Crossfire will change that.
PCI-E is changing that. It was never a "fad," it was a proven technology that died out because of another technology that it could not be utilized with.
It was a fad because it never really caught on. Sure there were technical limitations, but that's not why it failed; it failed because at that point there was no reason to run dual cards. Even if PCI-E had been around back then it wouldn't have mattered; none of the games needed that performance boost. Most people didn't care about, nor want to pay that much for, a few extra (negligible) FPS.

Furthermore, current video cards can't even saturate the AGP bus, much less PCI-E. There is a lot that can be (and is being) done in the single card market without the need to run two video cards.
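A quick back-of-the-envelope check in Python (the per-frame traffic figure is an assumption - most textures sit in local VRAM, so real bus traffic is usually far lower than people imagine):

Code:
# Rough peak figures: ~2.1 GB/s for AGP 8x, ~4 GB/s per direction for PCIe x16.
AGP_8X_GBPS, PCIE_X16_GBPS = 2.1, 4.0

def bus_utilisation(mb_per_frame, fps, bus_gbps):
    """Fraction of the bus's theoretical peak used by the given traffic."""
    return (mb_per_frame / 1024.0) * fps / bus_gbps

# Assume a (generous) 10 MB of new data streamed across the bus per frame at 60 fps:
print("AGP 8x:   %.0f%%" % (100 * bus_utilisation(10, 60, AGP_8X_GBPS)))    # ~28%
print("PCIe x16: %.0f%%" % (100 * bus_utilisation(10, 60, PCIE_X16_GBPS)))  # ~15%

Even with a generous guess at per-frame traffic, neither bus comes close to its limit.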
Why am I even arguing this?
I don't know either; it's just a difference of opinions. Your mind is stuck on FPS++ to the point of excess; I realize it doesn't really matter that much after a point.
it's more flexible for the end user, and it's been proven in the late 90's.
I disagree; it's much easier to upgrade a single card than to dick with two. It wasn't "proven in the 90s" because 90% of people didn't care about dual video cards. It was a fad. Granted, it was a fad that was ahead of its time.
No, honestly, who in their right mind would give a shit about PS3.0 enhanced lighting and performance when they could have a much more immersive experience with less jaggies? I think the amount of immersion the human mind would perceive with a more lifelike scene would heavily outweigh an improved lighting renderer for an already well-lit scene.
You obviously didn't read the article above, but that's a different story...

Pixel Shader 3 will be useful in the future. At 16, in all of your wisdom, you can call me out of my mind if you want. That doesn't change the fact that current cards that lack PS3 support will be antiquated when games start to use that technology more and more.

You're thinking too small; the ideal cards would support both. Advanced lighting will affect dynamics and immersive environments. Current cards can also play games just fine with FSAA and less jaggies, so you're arguing a moot point. Lack of PS3 support will hinder future proofing.
But nVidia could not do that because not only has their product launched
The first card in their 7xxx line has launched. I'm fairly certain that the GTX will be a low end GPU compared to what both ATI and Nvidia have on the horizon. But I'm not trying to turn this into Nvidia vs ATI, because I really don't care.
PC consumers are more tech-savvy, whereas console consumers are Average Joes.
On the whole, most PC gamers are clueless and buy whatever is more expensive/has larger numbers.
So just wait for the "Next-Gen" launch and you'll see my point as to upgrading your PC now - the new PS3 and Xbox 360 are crippled before launch. The XBox 360's CPU is only 2x faster than the XBox's, and the same goes for the PS3's.
You're looking at it from a very closed and uninformed point of view. A console will show drastic improvement with only double the performance of the previous gen. You can't compare their performance to that of PCs, because console processors have a lot less to deal with: a very small OS, memory management, and the currently playing game - that's pretty much it. The Xbox still has amazing graphics (for a console) today; with double the processing power and a beefier GPU it'll fare just fine. The same goes for the PS3.

I agree that both have been over hyped, but I'm sure they will impress us when they're released. At any rate, you don't have the information needed to prove your points, because there is little to no real world information right now. You're just formulating opinions based on ifs and buts.
So while the developers will bail on the console industry, we will be functional.
If you're honestly trying to compare the console gaming market with PC gaming, then I'm truly astonished. PC gaming always has been, and still is today, a niche market. Console gaming is for the masses. Most people can't and won't spend upwards of two grand just to play the newest games, especially when most big title games get released on both PC and console.

Your assumption that devs will bail is ill-formed and ignorant. (no offense) There just isn't enough information to tell right now. One thing is for certain, though: those consoles will sell like hotcakes, regardless of how small an improvement the technology is, and developers will jump on that. They will sell just because they're new and supposedly better. They will sell because the new hot games will be created for them. They will sell because, as you said earlier, console gamers are average (unknowledgeable) joes. Such is the consumer electronics industry.

I guess most of this is just something we'll have to agree to disagree on though.
 

Descent

Hella Constipated
#15
Junglizm said:
Translation: I hit reply without reading the linked article.

Dual video cards were asinine, overpriced and niche back then; they're the same way now and (imo) will continue to be in the future. There is no point in running dual cards and most people realize this. Only the most [T]ardcore fanboys care about getting 60FPS over what their eyes can even handle. Sure it looks impressive (sic) to put it in your sig on [T]ardOCP, but it's rather pointless to most people who are completely happy with the performance a $200 6600GT/x700 will give them. Furthermore, the people who are willing to spend $1000+ on an extra 10-20FPS (that they'll never actually notice) are going to upgrade when the new OMG FPS++ video cards come out anyway. Neither SLI nor Crossfire will change that.
Translation - not only do you preach from the RCP Bible and commit a cardinal sin, but you also fucked yourself historically there.

The hype around the Voodoo2 was serious at the time, and it's still fondly remembered today. The same went for nVidia's SLi at launch - and it's a little-known fact that they ACQUIRED ALL OF 3DFX'S ENGINEERS AFTER A BUYOUT.

And once again, it's not an FPS issue - it's futureproofing. I would love to be able to buy the hardware I'll need to run UT2007 now, which is already running on the 7800GTX line, because not only am I financially solid by 16-year-old standards, but I might not be then.

Catch my drift? Because if you look at the statistics, the only high-end card that has dropped significantly in price since it was "the cream of the crop" is the 9700 Pro. Launching at $400, it's now $130 for a new-in-box card. I bought mine open box off Newegg for that much when they were still easily garnering $160-200 last October.

And its capability is fading by the minute now. What was among the best upper-mid-range cards when I bought it has now become a mid-range card that does not stack up to the 6600GT at all.

Junglizm said:
You're looking at it from a very closed and uninformed point of view. A console will show drastic improvement with only double the performance of the previous gen. You can't compare their performance to that of PCs, because console processors have a lot less to deal with: a very small OS, memory management, and the currently playing game - that's pretty much it. The Xbox still has amazing graphics (for a console) today; with double the processing power and a beefier GPU it'll fare just fine. The same goes for the PS3.

I agree that both have been over hyped, but I'm sure they will impress us when they're released. At any rate, you don't have the information needed to prove your points, because there is little to no real world information right now. You're just formulating opinions based on ifs and buts.
Once again, like I have said before, I know that full well, and 100%. Lemme put it in perspective - the XBox 360 has three times the horsepower of the Dreamcast. Wow, look at how far we've come in seven years! We've gone from a console with the CPU performance of a P3-550 to a P4-1400-1500. Granted, that equals more, but personally I'm having my doubts about a 2.0-2.5GHz PC-equivalent console pushing a 24-pipe R520 GPU core - because a 2.5GHz PC cannot do a 32-pipe SLi setup, let alone a 16-pipe 6800U with performance bottlenecks.

It's just not right. We have always had consoles leapfrog consumer PCs - starting with the Nintendo Famicom in 1983. Then the Sega Master System (Mark III) in Japan. Then the Sega Genesis, the SNES, Sega CD, Neo Geo AES, Sega 32X/Saturn, PS-X, N64, Dreamcast, etc.

Why are they doing this? It's ass backwards! For the first time since the Famicom, the next-gen consoles will have less than half the power of a PC at launch. By August the Crossfire rollout will be over. The AMD Athlon 64 4800+ is a performance record setter.

It's fucking ridiculous. If the Nintendo Revolution has the same shitty SMP CPU design and architecture as the PS3 and XBox 360, it's all over. FYI - the XBox 360 has three VERY weak CPU cores - which you can never exploit fully anyway. Real-world performance ain't in the TFLOP rating they are claiming.

See where I'm going now? You could go further, but I have the specs of every Nintendo console since the Famicom, and every Sega console since the Master System, memorized, and you'd probably lose because my point would be proven.


Junglizm said:
Furthermore, current video cards can't even saturate the AGP bus, much less PCI-E. There is a lot that can be (and is being) done in the single card market without the need to run two video cards.
The issue was dual AGP slots. Everyone on the face of the Earth knows that there are no cards around that can saturate that bus.

Junglizm said:
I disagree; it's much easier to upgrade a single card than to dick with two. It wasn't "proven in the 90s" because 90% of people didn't care about dual video cards. It was a fad. Granted, it was a fad that was ahead of its time.
I'm sure everybody also ran S3 VirgeMX and Savage cards too, and didn't care about Unreal Tournament. Shit, I bet everyone WAS BONKERS OVER NEOMAGIC AS WELL! Hell, I know I still am seven years later! I love broken D3D HAL support, and never getting working 3D drivers. Why bother when there's software rendering and my laptop has a Voodoo2 equivalent chip just waiting to be unearthed? Everyone knows 3D acceleration is for pussies. So is mipmapping!

Junglizm said:
You obviously didn't read the article above, but that's a different story...

Pixel Shader 3 will be useful in the future. At 16, in all of your wisdom, you can call me out of my mind if you want. That doesn't change the fact that current cards that lack PS3 support will be antiquated when games start to use that technology more and more.

You're thinking too small; the ideal cards would support both. Advanced lighting will affect dynamics and immersive environments. Current cards can also play games just fine with FSAA and less jaggies, so you're arguing a moot point. Lack of PS3 support will hinder future proofing.
So were Glide 2.0, S3 MeTaL, ATi Rave, etc. You already admitted you don't keep up with video cards; just give up.

Junglizm said:
The first card in their 7xxx line has launched. I'm fairly certain that the GTX will be a low end GPU compared to what both ATI and Nvidia have on the horizon. But I'm not trying to turn this into Nvidia vs ATI, because I really don't care.
Neither do I. I still miss 3Dfx but you never hear me bitch. Well, except to BRiT.

Bottom line: Whoever kicks the most ass I'll root for. Throw in some bubble gum and I'm good to go.

Junglizm said:
On the whole, most PC gamers are clueless and buy whatever is more expensive/has larger numbers.
If by PC gamers you mean people who enjoy a Radeon 9200, that's fine. Because everyone knows I'm the guy who sweet-talked people into getting an 8500/9100 because it was still good, and they thanked me for it. Nobody else really played the same shit besides people who claimed Radeon 9800 superiority, who I rarely came across. $50 more for three frames per second?! I'm sold!

Ironically, most people I knew who had a self-installed 9800 Pro had a 256MB model. Which makes sense, because the 9700 Pro was 128MB only. But they weren't the guys who insisted on the 128MB 9800 Pro, who are the people I'm poking fun at here. People I knew like that usually had spyware on their machines, and their machines were made by HP. 'Nuff said.

Junglizm said:
If you're honestly trying to compare the console gaming market with PC gaming, then I'm truly astonished. PC gaming always has been, and still is today, a niche market. Console gaming is for the masses. Most people can't and won't spend upwards of two grand just to play the newest games, especially when most big title games get released on both PC and console.

Your assumption that devs will bail is ill-formed and ignorant. (no offense) There just isn't enough information to tell right now. One thing is for certain, though: those consoles will sell like hotcakes, regardless of how small an improvement the technology is, and developers will jump on that. They will sell just because they're new and supposedly better. They will sell because the new hot games will be created for them. They will sell because, as you said earlier, console gamers are average (unknowledgeable) joes. Such is the consumer electronics industry.

I guess most of this is just something we'll have to agree to disagree on though.
After the video game crash of 1983, people turned to PCs, but by 1984 that market softened too.

As for consoles - nobody will buy a console that does not have good games for it. And if they own one already, and nothing good is coming out, they won't buy software. Case in point: the last year the Atari 2600 was on the market (1983), and the first few years the Genesis was on the market, up until the release of Sonic The Hedgehog (1989-Q1 1991). Both of those libraries were pretty abysmal. And we all know what happened to Nintendo's market share after Sonic - pretty soon it was down from a solid 85% to 40%, and Sega controlled the other 60%. Nintendo's image problems killed the SNES for the first few years, and the GameCube as well. They really came this close to losing me after I bought a GameCube, because the library was NOTHING like the N64's, which totally won me over. I bought that over the Saturn. And if that new Zelda game doesn't come out on the GameCube, I'd personally sell it for a Saturn and pick up some Japanese-themed games, as opposed to kiddie-themed games, and give those a try.

I'm guessing the new console will take off because Nintendo consoles have always been developer-centric - it's been a key part of their business strategy since the days of the Famicom. The N64 was a little bit tougher to develop for than it should have been, but relatively good.

Developers want to work without limits - and the new console specs are posing some tough burdens.
 

Jung

???
Premium
#16
Out of all that :blahblah: this is the only thing that stood out.
You already admitted you don't keep up with video cards; just give up.
If you're going to quote/paraphrase me, then get it right. I said "Video cards are something I don't really keep up with until it's time to upgrade." It's that time, and I've been reading up. It's not like someone has to know everything about every chipset, RAM core or GPU heat sink created in the past 10 years (like you seem to :rolleyes:) to read the info and apply common sense.

Like I said, we just see things differently. Hardly anything you've said is rooted in fact though, and I'm sorry but fanboy opinion doesn't sway me.

Your analysis of the next gen consoles is plausible, but so are most of your opinions. I'll just leave it at that.
 

Broken

Member smoked too much weed!
#17
What is this, Computer Chamber/B&T? Who gives a flying fuck? Descent, support it, compile a driver and an install GUI for the device. Then you might have a clue of what we do. Until then read the white papers and buy into the bullshit. Shut UP!
 

Descent

Hella Constipated
#18
Junglizm said:
If you're going to quote/paraphrase me, then get it right. I said "Video cards are something I don't really keep up with until it's time to upgrade." It's that time, and I've been reading up. It's not like someone has to know everything about every chipset, RAM core or GPU heat sink created in the past 10 years (like you seem to) to read the info and apply common sense.

Like I said, we just see things differently. Hardly anything you've said is rooted in fact though, and I'm sorry but fanboy opinion doesn't sway me.

Your analysis of the next gen consoles is plausible, but so are most of your opinions. I'll just leave it at that.
Then why fight with me over it? I have a photographic memory. If I am interested in something it stays with me forever. I don't even have to think; I remember it for life after I read it once!
Broken said:
What is this, Computer Chamber/B&T? Who gives a flying fuck? Descent, support it, compile a driver and an install GUI for the device. Then you might have a clue of what we do. Until then read the white papers and buy into the bullshit. Shut UP!
Teehee...You got me. I'm teaching myself programming this summer, but when it comes to Windows video card drivers I know A LOT more than the average dude. Not in programming, but in how to fuck with shit like DirectX function forcing, etc. Why? Neomagic. Satan in corporate form. "Hey! Let's sell notebook graphics chips as fast as Voodoo2s and never code proper drivers, LOL! Fuck them - they can WUV their software rendering."

I have to say that, from all that Neomagic hacking, I probably got it working better than anyone else did. My 256AV, with its puny 2.5MB frame buffer, ran both UT99 and Descent: FreeSpace - The Great War at 60FPS! UT99 ran in 512x384, and FS1 only runs at 640x480. But it would run better in intensive parts if it had more VRAM :D.
 

Fire_ze_Missles

Martha Fuckin' Stewart
#19
I can't quote specs, and I can't tell you which nVidia drivers are running on my system. But what's the point of two cards if one cannot even fill the bandwidth for one slot?

We should work on filling the bandwidth available to one PCIe slot before charging ahead with dual-card technology that's not even needed.
 

Jung

???
Premium
#20
Descent said:
Then why fight with me over it?
I'm not. I know I said we could "agree to disagree" and "these are just opinions" a couple times. I don't expect you to always agree with me, but I will post my comments if I have any.
I'm teaching myself programming this summer
What language(s) are you interested in?