A new way of analyzing GPU performance?

thatbloke

Junior Administrator
So I saw this linked on the Multiplay forums and have just read through it all:

http://techreport.com/articles.x/21516/1

It makes for some VERY interesting reading. The stutter they find, particularly on the multi-GPU setups, is especially interesting for someone like me with more money than sense who likes to buy new (and expensive) shinies for his PC every 6 months...
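If anyone fancies poking at their own numbers, here's a rough sketch in Python of the kind of per-frame analysis the article does. I'm assuming a Fraps-style frametimes log where each row is a frame number plus a cumulative timestamp in milliseconds (adjust the parsing if your log differs), and the 50 ms threshold is just my own pick for a "noticeable hitch":

import csv

def load_frame_times(path):
    # Assumed Fraps-style layout: header row, then "frame, cumulative time in ms".
    timestamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header
        for row in reader:
            timestamps.append(float(row[1]))
    # Per-frame times are the gaps between consecutive timestamps.
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def report(frame_times_ms, hitch_ms=50.0):
    n = len(frame_times_ms)
    avg = sum(frame_times_ms) / n
    p99 = sorted(frame_times_ms)[int(0.99 * (n - 1))]   # 99th-percentile frame time
    beyond = sum(t - hitch_ms for t in frame_times_ms if t > hitch_ms)
    print(f"average FPS:            {1000.0 / avg:.1f}")
    print(f"99th percentile frame:  {p99:.1f} ms")
    print(f"time spent past {hitch_ms:.0f} ms: {beyond:.0f} ms")

report(load_frame_times("frametimes.csv"))

Two runs with the same average FPS can look completely different on those last two numbers, which seems to be the whole point of the piece.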

Thoughts?
 

Ki!ler-Mk1

Active Member
"What's going on? Let's slide on those magnification glasses again and take a closer look."


Thanks for that, thoroughly interesting.

If I read it right, near the end he's saying that with SLI, the frame times (and rates?) reported by Fraps don't line up with the final output from the card to the display.
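A made-up illustration of what I mean (the numbers are mine, not from the article): Fraps times the gaps between frame submissions, but with AFR the driver may meter delivery to the screen more evenly, so the log can look far jerkier than the picture does.

# Made-up numbers: gaps between frame submissions (what Fraps logs) versus
# gaps between frames actually reaching the display if the driver meters them.
submit_gaps_ms = [8.0, 25.0] * 8                  # classic AFR micro-stutter pattern
avg = sum(submit_gaps_ms) / len(submit_gaps_ms)
display_gaps_ms = [avg] * len(submit_gaps_ms)     # perfectly evened-out delivery

print("Fraps would log:   ", submit_gaps_ms[:4], "...")
print("Screen might show: ", display_gaps_ms[:4], "...")
print("Average FPS either way:", round(1000.0 / avg, 1))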
 

Silk

Well-Known Member
What I don't get is hearing over and over that we can't see above 30fps. I absolutely guarantee you that I can tell when a game is "only" running at 30fps.

One example: Alice: Madness Returns. It seemed "jerky" to me, so I looked into it, and the game was capped at 30fps by default. I uncapped it and it was smooth as silk (no pun intended).

I will admit I don't really notice the difference above 50fps though. I just wondered where this 30fps nonsense came from - 30fps is jittery to me.
 

Traxata

Junior Administrator
Because you don't have interlacing on a PC. TV broadcasts are 25 FPS for PAL (in the US they're roughly 30 FPS for NTSC), sent as interlaced fields.

I don't know if you download TV series online and watch them in VLC, but sometimes you may notice that the edges of people / cars / other objects in motion look fucked up. That's where the two fields are laced on top of each other. 3D does this to some degree, but with each image being different, so you get the effect.

Games like Alice are capped at 30FPS because they're console games ported to the PC... a TV, which has an interlacing decoder built in, is happy with 30FPS and you see it perfectly fine, whereas on a PC it gets all fucked up and that's why you see "jerky" images.
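To show what I mean about the laced edges, here's a toy sketch (nothing like a real deinterlacer, just weaving two fields of a moving block together):

WIDTH, HEIGHT = 16, 8

def field(obj_x):
    # One field: a 4-pixel-wide block sitting at column obj_x.
    return [["#" if obj_x <= x < obj_x + 4 else "." for x in range(WIDTH)]
            for _ in range(HEIGHT)]

early = field(4)   # lines captured first
late  = field(8)   # lines captured a moment later, the block has moved

# Weave them: even rows from the late field, odd rows from the early one.
woven = [late[y] if y % 2 == 0 else early[y] for y in range(HEIGHT)]

for row in woven:
    print("".join(row))   # the block's edges zig-zag, i.e. the combing you see in VLC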
 

Huung

Well-Known Member
Also, is it not 60FPS which is basically the limit for our eyes - not 30?...

(ignoring all the special interlacing etc - just at a basic level)
 

BiG D

Administrator
Staff member
Theoretically perhaps, but there's definitely a perceivable difference between 60 and 120 fps if you have the hardware capable of displaying it.
 

Spicypixel

New Member
I'd have pinned the highest I could notice at about 75. Anecdotal evidence is best.
It's worth noting that when a game reports 30fps, that's mostly an average, so you'll notice the drops down to 10 and the speed-ups to 50+ etc.
TL;DR: only the minimum frame rate matters.
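Quick made-up example of why the average hides it:

# Two made-up runs with identical average FPS but very different worst frames.
smooth = [33.3] * 10              # steady ~30fps, frame times in ms
hitchy = [23.0] * 9 + [126.0]     # mostly quick, one big stall

for name, times in (("smooth", smooth), ("hitchy", hitchy)):
    avg_fps = 1000.0 * len(times) / sum(times)
    worst_fps = 1000.0 / max(times)
    print(f"{name}: average {avg_fps:.0f} fps, worst frame {worst_fps:.0f} fps")

Both come out at an "average 30fps", but one of them has a frame that effectively runs at 8fps.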
 

Ki!ler-Mk1

Active Member
When I had a CRT I could tell if the fps was below the refresh rate (85), but since getting an LCD I'm happy with 35 for non-twitch gaming.

I often wonder why a constant 30 is valued over a fluctuating 25-50 though. I'm not referring exclusively to consoles.

It's a shame we can't set an fps cap rather than being stuck with stupid vsync at 60.
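Rolling your own cap is barely any code if you control the frame loop; just a sketch of the idea, not tied to any real engine:

import time

TARGET_FPS = 45                    # pick whatever cap you like, no vsync involved
FRAME_BUDGET = 1.0 / TARGET_FPS

def update(dt): pass               # stand-in for game logic
def draw(): pass                   # stand-in for rendering

def run(frames=300):
    last = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        update(now - last)
        last = now
        draw()
        leftover = FRAME_BUDGET - (time.perf_counter() - now)
        if leftover > 0:
            time.sleep(leftover)   # burn the rest of the frame's time budget

run()

Real limiter tools tend to spin for the last millisecond or so because sleep() is coarse, but you get the idea.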
 