NVidia busted for over optimising?

UberSkippy

a.k.a. FuckTheBullShit
7,529
28
142
#1
I was reading up on stuff at work and came across this.

In short, it goes into detail about how NVidia is undersampling textures on both the 6800GT and the newer 7800 line of cards. Apparently they get such great speed by "over optimising" their cards and using low-quality texture filtering algorithms.

Basically, their anisotropic filtering set to 16x is only about as good as ATI's set to 2x. According to these guys, anyway.
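For what it's worth (my own sketch, not from the article): the application side only ever requests an anisotropy level; how many texture samples the driver actually takes per pixel is entirely up to the driver, which is exactly where this kind of "optimization" can hide. A minimal OpenGL example, assuming the EXT_texture_filter_anisotropic extension is available and a texture is already bound:

// Ask for 16x anisotropic filtering on the currently bound texture.
// This is only a request; the driver decides how aggressively it
// actually samples, which is what the undersampling claim is about.
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

void request_16x_af()
{
    GLfloat max_aniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);  // typically 16.0

    // Trilinear base filter plus the anisotropy request.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    max_aniso < 16.0f ? max_aniso : 16.0f);
}

So when two cards both report "16x AF", the image quality still comes down to what their drivers do with that request.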
 

jamesp

In Memory...
1,714
1
0
#2
Wow, I suspected some foul play because of the leaps and bounds NVIDIA is making over ATI, but damn. That is just downright wrong. They should be slapped with some sort of lawsuit.....:rolleyes:
 

Descent

Hella Constipated
7,686
109
157
#3
jamesp said:
Wow, I suspected some foul play because of the leaps and bounds NVIDIA is making over ATI, but damn. That is just downright wrong. They should be slapped with some sort of lawsuit.....:rolleyes:
Remember the shit they pulled with the static clip planes in 3DMark'03?

nVidia would have me as a customer if their textures looked better. When I upgraded from a Ti4200 to a 9600XT, which has the "Trilinear Optimizations" on by default, I nearly creamed my pants playing Day of Defeat, even though it was only sampling 6 pixels instead of 8.

Yes. It was that noticeable in a game based on 5-year-old technology at the time, which was in turn based on 7-year-old technology.

And my current card, a 9700 Pro, still blows my brother's 5900XT out of the water in both performance and image quality. The catch? I have a 128MB framebuffer. He has 256. And with texture detail set to high, the filtering still blows goats.

Why aren't you winning me over, nVidia? ATi's cards run cooler, are cheaper, and have better image quality. The only card of yours I use is the 6600GT, and it's the only good thing you've got going; with CrossFire's impending launch, ATi has the much more future-proof technology compared to SLI. Every machine I build is future-proof. Does it cost more? Yes. Is the end user more satisfied? Yes.

See the pattern nVidia?
 

CopyLifted

Funnier than a 5th grader
4,790
78
0
#4
If I'm not mistaken, this isn't the first time they have done this. I remember about 2-3 years ago they were doing the same thing when ATI released their 9800 Pro. At least if I remember correctly. :happysad:

Maybe that is what Descent is touching on above. I don't know. I try not to read his posts. :happysad:
 

UberSkippy

a.k.a. FuckTheBullShit
7,529
28
142
#5
You do remember correctly, Mr. CL. You get a star!

They basically did the exact same thing once before. I don't know what card that was on, but you'd think they'd learn to hide their shenanigans better.
 

bait

Hoodrat
32
0
0
#6
Sorry, I don't have the time, but I don't think that website is very credible, just by looking at its design and the fact that it does not cite any credible research.

Maybe that's because I have an overclocked BFG 6800 GT :p and yes, it's one of the 4 top-performing cards CURRENTLY on the general market.

From what I remember reading on Tom's Hardware Guide (a credible source), most of NVidia's kick comes from its drivers and their advances in technology when it comes to utilizing DirectX (almost every game or benchmark on Windows uses DirectX; if not, then it MUST use OpenGL). OpenGL games are rare; I remember only 2 in the past 10 years (since they are less efficient to make). Anyhow, you don't have to believe me yet, but I will look for the article that should show you another, more viable reason why NVidia is on top.

Just think of it this way:

NVidia = finesse

ATI = brute dumb power

(I've owned many ATI cards before.)
 

Jung

???
Premium
13,970
1,391
487
#7
bait said:
From what I remember reading on Tom's Hardware Guide (a credible source)
Hahaha, when did Tom's Hardware become a credible source?

Tom’s is consistently biased against Intel and ATI. You’d think that as an AMD/Nvidia guy I’d like that, but I can’t stand the fudged benchmark results. It’s kind of funny that their results differ so greatly from reputable sites like Anandtech, Xbit Labs and Tech Report. What’s even worse is that unknowledgeable consumers read that site, think they’re getting accurate information, and never bother to cross-reference anything. They spawn misinformation.

Don’t get me wrong though, Tom’s has some good information at times, but their bias and zealotry have no place in benchmarks or reviews.
NVidia's kick comes from its drivers and their advances in technology when it comes to utilizing DirectX
Nvidia cards excel in OpenGL games; ATI's excel in DirectX. You seem confused.

Notice here that Nvidia cards do better in GL-based games (Doom 3), whereas ATI's do better in DX-based games (HL2).
(since they are less efficient to make)
What? Do you understand anything about software development or graphics APIs? OpenGL is arguably a better technology, albeit kind of old hat; people use DirectX because it's easier to write in, not because GL is less efficient performance wise.

http://www.gamedev.net/reference/articles/article1775.asp
I've carefully avoided mentioning performance for most of this article. Why? There is a huge debate on the problem. There are many people who claim that OGL is faster than D3D, and of course, the reverse as well. I've been studying performance results: I've considered my own, my friends', and the results of major companies. I have come to the following conclusion: Performance is no longer an issue! The speed for both APIs has come out exactly even for well written programs. The performance can only be gauged per machine, and that by testing. There is no way to predict which will run faster. Like I mentioned earlier though, game programmers are going to have to write for both. It sucks, but that's the way it has to be, for now at least. And if you aren't going into games, well, there was really no debate to what API to learn anyway (OpenGL!). So as far as performance goes, I cannot really advise you. It would be best if you evaluate the two on your own.
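To make that concrete (my own sketch, not from that article): both APIs expose the very same sampler features, they just spell them differently, and both hand the request straight to the driver. Assuming a bound GL texture on one side and a hypothetical Direct3D 9 device pointer called device on the other:

// Same hardware feature, two spellings: trilinear + anisotropic filtering
// requested through OpenGL and through Direct3D 9.
#include <windows.h>
#include <GL/gl.h>
#include <d3d9.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT 0x84FE
#endif

// OpenGL: filtering state lives on the currently bound texture object.
void set_filtering_gl()
{
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0f);
}

// Direct3D 9: filtering state lives on the device's sampler stage.
void set_filtering_d3d9(IDirect3DDevice9* device)
{
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);
}

Neither version is inherently "less efficient"; any speed difference comes from the driver and the hardware underneath.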
(Btw, I am an Nvidia fan, and I don’t think that site is very credible either (I've seen no other report of this on any credible site.), but your baseless and incorrect comments don't help anything.)
 

BRiT

CRaZY
Founder
11,617
2,389
487
#8
junglizm said:
(Btw, I am an Nvidia fan, and I don’t think that site is very credible either (I've seen no other report of this on any credible site.), but your baseless and incorrect comments don't help anything.)
Just to weigh in on this...

As Jung said, Tom's Hardware is NOT a credible source by any stretch of the imagination.

As to the credibility of the 3DCenter site, there have been discussions about this exact issue on the forums at Beyond3D, THE PREMIUM GRAPHICS SITE. Beyond3D has the actual hardware developers and game developers posting in their forums. It is the pinnacle of 3D-related sites on the net. As for texture shimmering... some discussions go back to last year, with the NV40 cards on certain drivers. Here is the Beyond3D discussion on texture shimmering on the 6800 GT, even with optimizations disabled, from September 14, 2004. It's an issue that appears, gets fixed, and then re-appears periodically with Nvidia drivers. Rinse. Lather. Repeat. It's quite old hat now.

The Beyond3D discussion related to the G70/7800 series has been going since August 12th, 2005. It can be seen in the thread When enough is enough (AF quality on g70). I can also say that Damien Triolet (Tridam on Beyond3D, member since 2003) is a very credible and unbiased source. He even calls for LARGER sites to do their own investigation. He was the one who provided the original videos. Due to bandwidth concerns, he made them available to 3DCenter instead of publishing them on his own site, Hardware.fr.

From the postings on Beyond3D, the Nvidia 78.03 driver mostly fixes the issues, but there are still some extreme cases where shimmering is present. However, it looks like those might be caused by the specific game title and how it generates its textures using pixel shaders. That is why some of those extreme cases also show up on ATI's X800 series of cards.

And Bait, there are so many things wrong with the last section of your post, I don't even know where to begin to correct your ignorance. You could learn a lot from reading the Beyond3D forums.
 

Descent

Hella Constipated
7,686
109
157
#9
CopyLifted said:
If I'm not mistaken, this isn't the first time they have done this. I remember about 2-3 years ago they were doing the same thing when ATI released their 9800 Pro. At least if I remember correctly. :happysad:

Maybe that is what Descent is touching on above. I don't know. I try not to read his posts. :happysad:
They pulled the exact same shit I was talking about.

However, they failed to realize that developer versions of 3DMark03 exist where you can fly around the environments. Whoops. Remember when you would play Doom like a pussy with noclip on as a kid? It looked like that.

There was also Z-Buffer hacking involved that kept it from clearing itself.

Keep in mind Tom's Hardware reported NONE of this, then wrote an article called "ATi's Questionable Texture Filtering Called Into Question." They said in the article, "ATi cards have been known for their incredible image quality, up until the 9600."

This statement is complete horseshit, as anyone who has gamed on a 9700 Pro next to a buddy running a 9600XT (namely me) can tell you: you can't see the difference at all. I purchased BOTH of those cards and know them both like the back of my hand. I ran the exact same VisionTek 9600XT card he's running in my machine for a good half year before upgrading to my Sapphire Atlantis 9700 Pro.


Tom's Hardware munges benchmarks, cripples reviews, and uses these among other tactics to convert people to their own personal biases. They are a sham of a website, and only their "VGA Charts" articles seem to be credible when it comes to performance graphs.

However, a 9700 Pro scoring 0.3 fps higher than a 9800 Pro in UT2004 is cause for skepticism, in my opinion. It could just be Windows. But I personally wonder if someone made a typo in Excel, because it doesn't seem right that a card with a 320MHz core and 320MHz memory can outperform a card with an optimized version of the same architecture, a 380MHz core and 370MHz memory.
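Back-of-the-envelope (my own numbers, assuming both are 8-pipeline R300-family parts and using the clocks quoted above):

// Theoretical peak pixel fill rate = pipelines * core clock.
// This is a sanity check, not a benchmark model.
#include <cstdio>

int main()
{
    const int    pipelines = 8;      // pixel pipelines on both cards (assumed)
    const double core_9700 = 320.0;  // MHz, as quoted above
    const double core_9800 = 380.0;  // MHz, as quoted above

    const double fill_9700 = pipelines * core_9700;  // 2560 Mpixels/s
    const double fill_9800 = pipelines * core_9800;  // 3040 Mpixels/s

    std::printf("9700 Pro: %.0f Mpix/s, 9800 Pro: %.0f Mpix/s (+%.0f%%)\n",
                fill_9700, fill_9800, (fill_9800 / fill_9700 - 1.0) * 100.0);
    return 0;
}

That's roughly a 19% theoretical fill-rate advantage for the 9800 Pro, so it losing by any margin in the same bench smells like a data-entry error to me.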
 

jamesp

In Memory...
1,714
1
0
#10
Keep in mind that a lot of the performance and efficiency has to do with good drivers.
 

bait

Hoodrat
32
0
0
#11
junglizm said:
(since they are less efficient to make)
What? Do you understand anything about software development or graphics APIs? OpenGL is arguably a better technology, albeit kind of old hat; people use DirectX because it's easier to write in, not because GL is less efficient performance wise.
Jung, you seem to have taken a bias against everything I have said, but I would like to start my bias against you by correcting you on this point. My brother has taken several game development courses and has come across many possibilities for making games in either DirectX or OpenGL. He chooses OpenGL because he is hardcore open source, but DirectX is hella easier for companies to use to make games for the general gaming audience on Windows, because it comes bundled with all the APIs for joysticks and shit like that that OpenGL doesn't. Anyhow, I will tear your post apart later, but I've got to go to math class now.
 

Jung

???
Premium
13,970
1,391
487
#12
bait said:
Jung, you seem to have taken a bias against everything I have said
That’s only because everything you said WAS biased, or just an opinion.
My brother has taken several game development courses and has come across many possibilities for making games in either DirectX or OpenGL. He chooses OpenGL because he is hardcore open source, but DirectX is hella easier for companies to use to make games for the general gaming audience on Windows, because it comes bundled with all the APIs for joysticks and shit like that that OpenGL doesn't.
First, way to just repeat what I said earlier, in so many words.

Second, I’m sure your brother is a super awesome game developer, but I’ll take the opinions of well-known developers who are actually in the industry over some hearsay about someone’s brother. Of course, no offense to you or your brother.
 
slemaire195

322
0
0
#13
My media center has some kind of fancy NVidia graphics card, and all I can say is that it is a piece of FUCKING SHIT.

EA Games says that if you're going to buy a computer, you have to get ATI. NVidia definitely works, but is shit compared to ATI. I ended up buying this NVidia graphics card from Dell, and the ATI was only $129, as compared to NVid's $249. My brother's computer has an ATI All-In-Wonder something, and (while playing The Sims 2, my favorite game for PC) the Sims appear choppier and less detailed on my NVid than on his ATI.

It totally pisses me off that NVid will put some kind of bullshit feature, one that probably does nothing to accelerate graphics, into one of their "new" cards and then jack the prices way up.

Is there a way to switch out graphics cards?
 

UberSkippy

a.k.a. FuckTheBullShit
7,529
28
142
#14
slemaire195 said:
My media center has some kind of fancy NVidia graphics card, and all I can say is that it is a piece of FUCKING SHIT.

EA Games says that if you're going to buy a computer, you have to get ATI. NVidia definitely works, but is shit compared to ATI. I ended up buying this NVidia graphics card from Dell, and the ATI was only $129, as compared to NVid's $249. My brother's computer has an ATI All-In-Wonder something, and (while playing The Sims 2, my favorite game for PC) the Sims appear choppier and less detailed on my NVid than on his ATI.

It totally pisses me off that NVid will put some kind of bullshit feature, one that probably does nothing to accelerate graphics, into one of their "new" cards and then jack the prices way up.

Is there a way to switch out graphics cards?
Despite the fact that I started this thread, I like NVidia cards. They work just as well as ATI's. Some people will tell you ATI is the best; others will say NVidia all the way. In the end it's like Ford vs. Chevy. Regardless of what you hear, or even think, they're pretty much equal.

What you might try to do is download the newest drivers for your video card. They can work wonders on a card that's underperforming a little. Another thing to do would be to tell us WHAT Nvidia card you have.

Lastly, yes, you can change out a video card. I'd suggest you have someone help you with it, however, so that you don't screw it up. No offense, but from your post I gather that you may not know enough to swap a card safely. It's not hard at all.

Edit: One more thing: if you're running Media Center and he's running XP Home, you might see some differences there as well. I'm not sure why, but I know Media Center does sometimes use more resources.

Video cards aren't always the culprit for choppiness. If you've got a TON of other programs running in the background, you'll overload the system and get choppy play. Look at your system tray (lower right corner, by the clock); if it's got a bunch of icons, that's a good indicator you've got too much running.
 

Jung

???
Premium
13,970
1,391
487
#15
slemaire195 said:
My media center has some kind of fancy NVidia graphics card, and all I can say is that it is a piece of FUCKING SHIT.
What model is it? There are good ATI cards and there are good Nvidia cards. It’s possible you just have a lower-end or slower card, but that’s not to say that all Nvidia cards are bad.

It’s ridiculous to assume that all Nvidia cards are bad just because yours is slow.
EA Games says that if you're going to buy a computer, you have to get ATI.
Wow, the amount of misinformation in this thread is astounding.

http://www.bluesnews.com/cgi-bin/articles.pl?show=583

SANTA CLARA, CA - APRIL 10, 2003 - NVIDIA Corporation (Nasdaq: NVDA), the worldwide leader in visual processing solutions, today announced that it has formed a strategic relationship with Electronic Arts (Nasdaq: ERTS), the leading worldwide publisher of PC and console games. The two companies will work together to produce and market next-generation 3D content under the EA GAMES and EA SPORTS brands, which include such top-selling games and franchises as Madden Football, Command & Conquer, James Bond 007, Tiger Woods PGA TOUR Golf and Battlefield 1942. By supporting NVIDIA® graphics processing units (GPUs) for desktop and notebook PCs, including the new NVIDIA® GeForce™ FX family of GPUs, EA is able to deliver innovative and rock-solid gaming experiences to PC consumers worldwide.
http://www.eagames.com/official/battlefield/battlefield2/us/editorial.jsp?src=bf2_ships_062105
Additionally, EA worked with marketing and development partners nVidia, Creative Labs, Ideazon and Alienware to create the optimal gaming experience for the Battlefield community.
You were saying? So apparently EA is partnered with Nvidia but recommends ATI? Yeah, ok... :rolleyes:
NVidia definitely works, but is shit compared to ATI.
At this point, after demonstrating that you don’t know what you’re talking about, I’m sure we’re just supposed to take your word for that, right? How about no?

Look at benchmarks: ATI and Nvidia benchmark pretty close these days, depending on what the game was written in. Like I said previously, ATI does better at DirectX games while Nvidia does better at OpenGL games, but the differences are pretty small. Both companies are putting out great cards right now.
I ended up buying this NVidia graphics card from Dell, and the ATI was only $129, as compared to NVid's $249.
Do you understand that there are different classes of video cards? There are low-, middle- and high-end cards on the market at any given time, and quite obviously, they’re priced differently. If you’re comparing a $250 card with a $130 card, you’re obviously mismatching classes. That's like comparing a $50 WalMart TV to an LCD TV; highly laughable.
My brother's computer has an ATI All-In-Wonder something, and (while playing The Sims 2, my favorite game for PC) the Sims appear choppier and less detailed on my NVid than on his ATI.
Like I said earlier, there are different classes of cards. Your brother probably has a newer or better card than you do.

It totally pisses me off that NVid will put some kind of bullshit feature, one that probably does nothing to accelerate graphics, into one of their "new" cards and then jack the prices way up.
I’ve yet to see anything that actually points to them doing this on purpose or to hype their cards; it sounds like it was a driver problem that got fixed. I guess this is all relative, though, and depends on whether you’re an ATI fanboy or not…

Is there a way to switch out graphics cards?
Yes. If you have an onboard card, you can disable it and install a new card. If you have a PCI, AGP or PCI-Express card, you can just remove the old drivers, switch the cards out and install the new drivers.
 

jamesp

In Memory...
1,714
1
0
#16
To lemaire:

Media Center was made to be a media center, not a gaming OS. You can safely (probably) assume that, like Windows Server, it does not provide peak performance for games. It might not be the video card at all, but your OS.
 

Descent

Hella Constipated
7,686
109
157
#17
jamesp said:
To lemaire:

Media Center was made to be a media center, not a gaming OS. You can safely (probably) assume that, like Windows Server, it does not provide peak performance for games. It might not be the video card at all, but your OS.
Windows Server 2003 works just as well for gaming as XP does... Likewise with Media Center, if you exit the Media Center GUI.
 
slemaire195

322
0
0
#18
I've always thought that Media Center was like XP Pro, just with the MC program add-on.

And my graphics card is an NVidia GeForce 6800 with 256 MB, up to date with drivers and shit.

My specs:

3.02 GHz P4
1 GB RAM
180 GB HD
Dual-TV Tuner :)
MCE05
GeForce 6800 256MB
Blah blah... but The Sims' specs only call for a 1 GHz CPU, 256 MB RAM, 4 GB HD, and a minimum 32MB T&L graphics card. I surpass those minimum requirements by far. I make sure that there are no programs running, so nothing else takes up all the RAM. Even when I set all the enhancements to 0 (shadowing, mirroring, etc.), it still runs like shit.

And I am pretty good with computers; I'm just not the guy to run to if you want one built from scratch.

Hey, is there still a way to reset the Windows admin password by opening up the comp and pulling a plug and restarting? Or was that only for Windows ME?
 

Jung

???
Premium
13,970
1,391
487
#19
slemaire195 said:
I've always thought that Media Center was like XP Pro, just with the MC program add-on.
It is.
And my graphics card is an NVidia GeForce 6800 with 256 MB, up to date with drivers and shit.
Vanilla 6800s (meaning non-GT, non-Ultra 6800s) really aren’t all that great. In fact, the 6600GTs usually beat them out in benchmarks. They have fewer pixel pipelines and shader units than the GT and Ultra; that might be your problem.

That said, a 6800 should run Sims just fine. But, of course, there could be a number of things that are causing your problems that aren’t related to graphics. My suggestion would be to post your problem on the Nvidia forums.
Hey, is there still a way to reset the Windows admin password by opening up the comp and pulling a plug and restarting? Or was that only for Windows ME?
That was NEVER the Windows password; it was the BIOS password. ;)
 

Descent

Hella Constipated
7,686
109
157
#20
Junglizm said:
Vanilla 6800s (meaning non-GT, non-Ultra 6800s) really aren’t all that great. In fact, the 6600GTs usually beat them out in benchmarks. They have fewer pixel pipelines and shader units than the GT and Ultra; that might be your problem.

That said, a 6800 should run Sims just fine. But, of course, there could be a number of things that are causing your problems that aren’t related to graphics. My suggestion would be to post your problem on the Nvidia forums.
What the fuck, dude? The 6800 is a great card, and it KILLS my 9700 Pro in benchmarks. My 9700 Pro handles everything I throw at it at 1024x768 (well, besides F.E.A.R. which runs at 800x600), but Christ, this guy has got driver problems up the putz.