Crossfire - I had no idea

a c 217 U Graphics card
March 4, 2013 2:30:45 PM

In another thread, someone posted a link to a site describing a new way to analyze frame data. The new method analyzes what is actually displayed on the screen, rather than what FRAPS gathers from the video card itself. Since what we see is what we care about most, this seems like a much better approach. Even that isn't perfect, since syncing the game to the display also matters, but it beats simply looking at frame rate.

Anyway, here are two links; the second one is the shocking one to me. It makes me think that AMD does nothing to sync the frame rate in Crossfire and just displays frames the moment they are created, which helps explain where the micro-stuttering comes from and may also be inflating FPS numbers.

http://techreport.com/blog/24415/a

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeFo...


a c 273 U Graphics card
March 4, 2013 2:40:47 PM

I really hope this thread does not degenerate into a fanboy war, because I think it's a subject that merits discussion.
a c 217 U Graphics card
March 4, 2013 2:45:09 PM

Mousemonkey said:
I really hope this thread does not degenerate into a fanboy war, because I think it's a subject that merits discussion.


Agreed.
a b U Graphics card
March 4, 2013 3:20:21 PM

I'm a little confused about the 3 frames being shown per picture in the article. Does this mean that only part of the screen is refreshed at a time? I've been thinking about adding another 7870 for cf and might want to hold off if it's going to be jittery. It already is in some games (Skyrim, Metro 2033) unless I use a framerate limiter and force vsync on + triple buffering.
a c 217 U Graphics card
March 4, 2013 3:30:27 PM

wanderer11 said:
I'm a little confused about the 3 frames being shown per picture in the article. Does this mean that only part of the screen is refreshed at a time? I've been thinking about adding another 7870 for cf and might want to hold off if it's going to be jittery. It already is in some games (Skyrim, Metro 2033) unless I use a framerate limiter and force vsync on + triple buffering.


It may help to understand the process involved in getting a frame onto the screen. There are two independent processes and a buffer that they share:

1) The GPU creates frames as fast as it can.

2) The GPU sends a completed frame to a display buffer, which is nothing more than a block of memory.

3) The monitor reads and displays the image in the display buffer every refresh cycle, like clockwork, regardless of whether the GPU is writing to it at the same time.

Because the GPU can be writing to the display buffer at the same time the monitor is reading it, the screen can end up showing pieces of several different images. Imagine the monitor is working through the display buffer showing image 1; once it is 1/3 of the way down, the GPU replaces the buffer contents with image 2. The monitor keeps updating the screen, but from the 1/3 mark it is now drawing image 2 rather than the image 1 it started with. Then, 2/3 of the way down, the GPU might update the buffer again, so the last third of the displayed image is image 3.

What is happening here is that with Crossfire, image 1 is created by GPU 1, image 2 is created by GPU 2, and after image 1 is sent to the display buffer, GPU 1 starts on image 3.

A short partial frame, only a couple of pixels high, is the result of GPU 1 finishing its image only a tiny bit before GPU 2 finishes its own. Almost immediately after GPU 1 sends image 1 to the display buffer, GPU 2 does the same thing, leaving GPU 1's image as a short sliver. (The order can go either way with a similar result.)

There needs to be a process that helps space out the frames, which currently is not taking place.
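
To make that concrete, here is a rough little simulation I threw together (my own illustration in Python; the resolution, frame arrival times and "runt" threshold are made-up numbers, not anything measured in the articles). It just works out how many scanlines of each frame end up on screen during one refresh when the buffer gets swapped mid-scanout:

Code:
# One 60 Hz refresh scans the buffer top to bottom while the GPUs keep
# flipping new frames into it. Frames 1 and 2 arrive almost together,
# the way two Crossfire GPUs finishing back to back would.
REFRESH_MS = 16.7
LINES = 1080

frame_flips = [(1, 2.0), (2, 2.3), (3, 11.0)]   # (frame id, arrival time in ms)

def scanout(flips):
    """Return (frame id, scanlines shown) for one refresh, top to bottom."""
    shown, current, last_line = [], 0, 0
    for frame_id, t in flips:
        line = min(LINES, int(t / REFRESH_MS * LINES))   # scanline reached at time t
        if line > last_line:
            shown.append((current, line - last_line))
        current, last_line = frame_id, line
    shown.append((current, LINES - last_line))
    return shown

for frame_id, lines in scanout(frame_flips):
    tag = "  <-- runt" if 0 < lines < 21 else ""
    print(f"frame {frame_id}: {lines} scanlines visible{tag}")

With these numbers, frame 1 ends up as a sliver of only ~19 scanlines, which is exactly the kind of runt frame those captured screenshots show, even though FRAPS would count it as a full frame.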
a b U Graphics card
March 4, 2013 3:41:52 PM

Thanks for the explanation. I think I get it now. This really makes me leery about trying Crossfire. I'm thinking AMD will work out a new driver to address this issue, though. The 13.2 beta is supposed to fix the frame latency issues for the 7950, or at least make them better.
a c 217 U Graphics card
March 4, 2013 3:44:19 PM

wanderer11 said:
Thanks for the explanation. I think I get it now. This really makes me leery about trying Crossfire. I'm thinking AMD will work out a new driver to address this issue, though. The 13.2 beta is supposed to fix the frame latency issues for the 7950, or at least make them better.


Unfortunately, these guys are using the latest drivers, and the problem is still present. The drivers are trying to fix the frame latency issue, but the fix seems to target an earlier point in the process than what actually reaches the display.
a b U Graphics card
March 4, 2013 4:08:18 PM

To me it looks like AMD is not doing frame metering, or is doing it badly. Since the 5xx series, Nvidia has used frame metering to smooth out SLI stuttering by delaying frames that "show up early". I'm assuming from this data that AMD cards are producing two frames very close together, with one swapped in less than 1 ms after the other, which causes a runt frame (a frame that barely shows up on the monitor). Those runt frames that get tossed aside are likely what is showing up as the low dips in the FPS data.

When you run at frame rates much higher than 60 fps on a 60 Hz screen, you get more tearing. Nvidia delays these frames via frame metering, which does cost some pure FPS performance, but it smooths out the tearing. Looking at this data, I suspect v-sync could reduce the tearing issue some, and possibly reduce it further on a 120 Hz monitor. It would have been nice to see PCPer test v-sync on a 120 Hz monitor with Crossfire.
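
Just to sketch the idea (this is my own toy version, not Nvidia's actual algorithm, and the frame times are invented): metering holds back a frame that finishes much earlier than the recent average gap, so the gaps between presented frames even out.

Code:
from collections import deque

# AFR pairs finishing back to back: gaps alternate ~0.5 ms / ~16 ms
completions = [0.0, 0.5, 16.7, 17.1, 33.4, 33.8, 50.1, 50.6]

def meter(completions, history=4):
    gaps = deque(maxlen=history)          # recent completion-to-completion gaps
    presented = [completions[0]]
    prev = completions[0]
    for t in completions[1:]:
        gaps.append(t - prev)
        prev = t
        target = sum(gaps) / len(gaps)    # aim for the recent average gap
        presented.append(max(t, presented[-1] + target))   # delay early frames
    return presented

for done, shown in zip(completions, meter(completions)):
    print(f"finished {done:5.1f} ms  ->  presented {shown:5.1f} ms")

After the first couple of warm-up frames the presented gaps settle around 8 ms instead of bouncing between 0.5 ms and 16 ms, at the cost of a small extra delay per frame, which is why pure FPS takes a slight hit.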
a c 217 U Graphics card
March 4, 2013 4:19:07 PM

JamesSneed said:
Looking at this data, I suspect v-sync could reduce the tearing issue some, and possibly reduce it further on a 120 Hz monitor. It would have been nice to see PCPer test v-sync on a 120 Hz monitor with Crossfire.


I was thinking the same thing. I think v-sync might remove the issue altogether, as v-sync forces frames to wait for the next refresh.
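
Roughly like this (my own sketch with made-up times; real v-sync also throttles the GPU once the buffers fill up, which I've left out):

Code:
def vsync_present(completion_ms, refresh_hz=60):
    period = 1000.0 / refresh_hz
    last_slot, out = -1, []
    for t in completion_ms:
        # hold each frame until the next refresh after it finishes,
        # and never put two frames into the same refresh
        slot = max(last_slot + 1, int(t // period) + 1)
        out.append(slot * period)
        last_slot = slot
    return out

frames = [2.0, 2.3, 18.5, 19.1, 35.0]    # unevenly finished frames (ms)
for done, shown in zip(frames, vsync_present(frames)):
    print(f"finished {done:5.1f} ms -> on screen at {shown:5.1f} ms")

In this example every frame lands exactly one refresh apart, so no tearing and no runts; on a 120 Hz monitor the wait per frame would be half as long, which is why that test would be so interesting.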
a c 217 U Graphics card
March 4, 2013 4:28:09 PM

I for one think these new tests are a tribute to Nvidia more than a mark against AMD. Nvidia has actually done something to help us, yet received no recognition for it. In fact, their attention to detail likely hurts their FPS numbers, the one thing every website has been using to compare them with the competition.
March 4, 2013 5:25:53 PM

Very interesting, nice find!

To the poster above me: I agree that they have done a lot to help with the stuttering, and recent benchmarks have shown Nvidia cards performing significantly better when compared with AMD. I'll find the link to the article where I found this one, but I believe it would add value to this discussion.

As people have mentioned above, wouldn't V-SYNC get rid of this issue as it tries to match the frames up between the GPU and display? Was this hypothesis tested?
a b U Graphics card
March 4, 2013 5:36:28 PM

It does sound true to me. I had a 2x 6870 setup and have to say I was never satisfied with the performance. I couldn't quite put my finger on it, but a single card always yielded a better experience than the two of them together. That was the reason I moved away from dual GPUs.
March 4, 2013 5:44:03 PM

http://www.tomshardware.co.uk/radeon-hd-7990-devil13-79...

http://www.tomshardware.co.uk/radeon-hd-7990-devil13-79...

This was covered in their '7990' review; there are things you can do about it. Adaptive v-sync beats everything AMD has developed so far, but RadeonPro seems to beat adaptive v-sync (though the difference is hardly worth noting). RadeonPro does show a few anomalies of its own, though.

http://www.tomshardware.co.uk/radeon-hd-7990-devil13-79...

http://www.tomshardware.co.uk/radeon-hd-7990-devil13-79...
a b U Graphics card
March 4, 2013 6:01:38 PM

Great link, very interesting stuff.
I currently have two SLI computers that get used for games every day (I'm addicted to PlanetSide 2 at the moment): one with SLI GTX 470s and one with GTS 250s.

A while back, I think with the 300-series drivers, Nvidia added an option (Maximum Pre-rendered Frames, default 3). With that driver I noticed the microstutter seemed to be gone. I don't know if that option addressed it or if it's something else, but microstutter does not bother me any more.

I did notice that with pre-rendered frames set to 3 it feels a little slow, or laggy. Set to 2 or 1 it feels great. Set to 0, I see, or I think I see, the microstutter.
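
To put rough numbers on that feeling, here's a little toy model I knocked up (my own sketch, not Nvidia's actual scheduler, and all the timings are invented): the CPU is allowed to queue up "depth" frames ahead of the GPU, there is one CPU hiccup in the middle, and you can see the trade-off between smooth frame delivery and input lag.

Code:
def simulate(depth, cpu_ms, gpu_ms=16.0):
    """Return (input lag per frame, gaps between displayed frames) in ms."""
    cpu_submit, gpu_start, gpu_done, latency = [], [], [], []
    for i, c in enumerate(cpu_ms):
        start = cpu_submit[i - 1] if i else 0.0
        if depth == 0 and i:
            start = max(start, gpu_done[i - 1])        # fully serialised
        elif i >= depth > 0:
            start = max(start, gpu_start[i - depth])   # wait for room in the queue
        submit = start + c
        g_start = max(submit, gpu_done[i - 1] if i else 0.0)
        g_done = g_start + gpu_ms
        cpu_submit.append(submit); gpu_start.append(g_start); gpu_done.append(g_done)
        latency.append(g_done - start)
    gaps = [round(b - a, 1) for a, b in zip(gpu_done, gpu_done[1:])]
    return [round(l, 1) for l in latency], gaps

cpu_ms = [8, 8, 8, 30, 8, 8, 8, 8]    # one CPU spike in the middle
for depth in (0, 1, 3):
    lag, gaps = simulate(depth, cpu_ms)
    print(f"pre-rendered frames = {depth}: frame gaps {gaps}, input lag {lag}")

With depth 3 the spike disappears from the frame gaps but the lag climbs toward ~60 ms; with depth 1 the lag stays around 32 ms and the hiccup shows up as one 30 ms gap; with depth 0 the frame gaps are longer overall and the hiccup is worst. That lines up pretty well with what I'm seeing.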
a c 273 U Graphics card
March 4, 2013 6:18:01 PM

bucknutty said:
Great link, very interesting stuff.
I currently have two SLI computers that get used for games every day (I'm addicted to PlanetSide 2 at the moment): one with SLI GTX 470s and one with GTS 250s.

A while back, I think with the 300-series drivers, Nvidia added an option (Maximum Pre-rendered Frames, default 3). With that driver I noticed the microstutter seemed to be gone. I don't know if that option addressed it or if it's something else, but microstutter does not bother me any more.

I did notice that with pre-rendered frames set to 3 it feels a little slow, or laggy. Set to 2 or 1 it feels great. Set to 0, I see, or I think I see, the microstutter.

That's the key word right there, mate: how can an ordinary user distinguish between what they "think" they are seeing and what they are "actually" seeing?
a c 664 U Graphics card
March 4, 2013 6:32:08 PM

I think this has been a Nvidia "secret" for quite some time, according to this quote from the Tech Report:
Quote:
In fact, at CES last week, I was discussing the latest developments with Nvidia's Tom Petersen, and he told me there was one question I failed to ask in my investigation of Radeon-versus-GeForce frame latencies: why did Nvidia do so well? Turns out, he said, Nvidia has started engineering its drivers with an eye toward smooth and consistent frame rendering and delivery. I believe that effort began at some point during the Fermi generation of GPUs, so roughly two years ago, max. Clearly, that focus paid dividends in our comparison last month of the GTX 660 Ti and the Radeon HD 7950.

http://techreport.com/review/24218/a-driver-update-to-r...

Meanwhile, AMD has literally only been made aware of the issue over the past few months:
Quote:
When we first published our rematch between the 7950 and the GTX 660 Ti, we pinged AMD to ask if they could explain the Radeon's struggles in recent games. AMD spokesman Antal Tungler told us that our article had "raised some alarms" internally at the company, and he said they hoped to have some answers for us "before the holiday."

http://techreport.com/review/24022/does-the-radeon-hd-7...

Which in turn led to the 13.2 beta Catalyst drivers in January 2013 that specifically addressed frame latencies.
http://techreport.com/review/24218/a-driver-update-to-r...

I think AMD will get a handle on the issue, provided it's not a hardware problem, but it is interesting to note the technological state of advancement between the two companies. AMD seems to be scrambling, while Nvidia appears to be several steps ahead.
a c 664 U Graphics card
March 4, 2013 6:37:06 PM

Mousemonkey said:
That's the key word right there, mate: how can an ordinary user distinguish between what they "think" they are seeing and what they are "actually" seeing?

I would say it's more easily apparent in the "feel" of the controls. These videos purport to show the difference in Skyrim when one card suffers from more hitching than the other.
http://techreport.com/review/24051/geforce-versus-radeo...
March 4, 2013 7:24:16 PM

If memory serves, this reminds me a little of my assembly days, when I was playing with the raster interrupt on the C64 so that I could 1) draw a new screen when the raster had just gone out of the visible area, and 2) split the screen into multiple zones at specific split points so that the 8 hardware sprites could be re-used (actually fewer than 8, since you wanted 'free' sprites available to cross screen zones). By building up each frame in a temporary store (a buffer), you get very smooth graphics from one frame to the next, so that when the raster interrupt kicks in for the next cycle, the switchover is immediate and clean. Ahhhh... memories.

Interesting stuff. The screen redraw rate (even now) is pretty much dependent on the frequency of each refresh cycle, and of course it is strictly limited by the monitor's hardware. It's a pity neither NVIDIA nor AMD has fully synchronised multi-GPU frame rendering... something I do find odd. I'm wondering if the obvious marketability of high FPS readings is keeping both companies from properly catering for differing frame times, since the net effect of doing this correctly would seem to be a lower average FPS reading.
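
The modern equivalent of that old trick is plain double buffering: draw into a back buffer and only swap it to the front during the vertical blank, so the scanout never shows half of one frame and half of another. A minimal sketch of the idea (my own mock-up, nothing vendor-specific, with the vblank simulated by a timer):

Code:
import time

REFRESH = 1 / 60.0            # pretend 60 Hz refresh

def wait_for_vblank(next_vblank):
    """Sleep until the next (simulated) vertical blanking interval."""
    delay = next_vblank - time.perf_counter()
    if delay > 0:
        time.sleep(delay)
    return next_vblank + REFRESH

front, back = bytearray(b"frame -1"), bytearray(b"frame -2")
next_vblank = time.perf_counter() + REFRESH

for frame in range(3):
    back[:] = f"frame {frame}".encode()   # render the new image off-screen
    next_vblank = wait_for_vblank(next_vblank)
    front, back = back, front             # swap pointers during vblank
    print("scanned out:", front.decode()) # the monitor only ever sees a whole frame

The catch, as these articles point out, is that with two GPUs racing each other you also have to decide when each one is allowed to swap, which is the part AMD doesn't seem to be handling yet.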

a c 217 U Graphics card
March 4, 2013 7:36:39 PM

I wouldn't say both companies have done nothing. Nvidia has added hardware and software syncing to help deliver more evenly spaced frames. However, some of it has to be predicted, as you don't know how long the next frame will take to render; you can only guess based on past frames.

As mentioned in some of the articles covering the latency issue, Nvidia has said they have been taking steps in this area for the past couple of years.
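
For example, the sort of guess a driver could make (purely my own illustration, not what Nvidia actually does) is an exponential moving average of recent frame times:

Code:
def predict_next(frame_times_ms, alpha=0.25):
    """Blend each new frame time into a running estimate."""
    estimate = frame_times_ms[0]
    for t in frame_times_ms[1:]:
        estimate = alpha * t + (1 - alpha) * estimate
    return estimate

history = [16.2, 15.8, 17.1, 16.4, 30.5, 16.0]   # made-up frame times with one spike
print(f"expected next frame time: ~{predict_next(history):.1f} ms")

Notice how the single 30 ms spike drags the estimate up to ~19 ms even though most frames take ~16 ms; that is exactly why pacing based on a guess can still get it wrong when frame times jump around.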
March 4, 2013 10:22:19 PM

Absolutely true... particularly for Nvidia.
I should have added that perhaps more emphasis should be placed on better optimising the core game graphics engine (i.e. by the developer) and how it feeds frame data to the graphics hardware, rather than relying so much on the two hardware companies. Then again, I am assuming that the game code has a say in this particular area.
These are interesting articles, and if proven correct, having one frame consist of as little as a few chopped-off lines of graphics data throws the importance of FPS into some doubt.
March 5, 2013 4:06:31 PM

17seconds said:
I think this has been a Nvidia "secret" for quite some time, according to this quote from the Tech Report:
Quote:
In fact, at CES last week, I was discussing the latest developments with Nvidia's Tom Petersen, and he told me there was one question I failed to ask in my investigation of Radeon-versus-GeForce frame latencies: why did Nvidia do so well? Turns out, he said, Nvidia has started engineering its drivers with an eye toward smooth and consistent frame rendering and delivery. I believe that effort began at some point during the Fermi generation of GPUs, so roughly two years ago, max. Clearly, that focus paid dividends in our comparison last month of the GTX 660 Ti and the Radeon HD 7950.

http://techreport.com/review/24218/a-driver-update-to-r...

Meanwhile, AMD has literally only been made aware of the issue over the past few months:
Quote:
When we first published our rematch between the 7950 and the GTX 660 Ti, we pinged AMD to ask if they could explain the Radeon's struggles in recent games. AMD spokesman Antal Tungler told us that our article had "raised some alarms" internally at the company, and he said they hoped to have some answers for us "before the holiday."

http://techreport.com/review/24022/does-the-radeon-hd-7...

Which in turn led to the 13.2 beta Catalyst drivers in January 2013 that specifically addressed frame latencies.
http://techreport.com/review/24218/a-driver-update-to-r...

I think AMD will get a handle on the issue, provided it's not a hardware problem, but it is interesting to note the technological state of advancement between the two companies. AMD seems to be scrambling, while Nvidia appears to be several steps ahead.


Good point, but I think the smarter move for AMD here would be to talk to the team behind RadeonPro and find a way to implement it automatically with Crossfire. RadeonPro seems to be miles ahead of AMD's own efforts and about as good as Nvidia's adaptive v-sync.
a c 217 U Graphics card
March 5, 2013 8:00:35 PM

This is likely only a problem when v-sync is off. Because Nvidia may be handicapping their FPS a bit without v-sync in order to provide more evenly spaced frames, I'd really love to see sites add a test with v-sync on while using a 144 Hz or 120 Hz monitor. There may still be problems syncing up to the action, but it is something that has not been explored yet and would be quite interesting to see, especially since many people play all their games with v-sync on.
a b U Graphics card
March 7, 2013 12:18:21 AM

bystander said:
This is likely only a problem when v-sync is off. Because Nvidia may be handicapping their FPS a bit without v-sync in order to provide more evenly spaced frames, I'd really love to see sites add a test with v-sync on while using a 144 Hz or 120 Hz monitor. There may still be problems syncing up to the action, but it is something that has not been explored yet and would be quite interesting to see, especially since many people play all their games with v-sync on.


Exactly, completely agree. I also would like to see the tests done.