GPU vs. CPU Upgrade: Extensive Tests

Tags:
  • Graphics Cards
  • Graphics
  • Product
May 15, 2008 7:52:33 AM

This is an endless topic of conversation, with everybody you meet having their own pet opinion: what brings better results, buying a faster graphics card or investing your cash in a more powerful processor? To find out, Tom's Hardware has taken a close look at the most important chips. In this article, the GeForce 6800 GT, 7950 GT, 8800 GT, 8800 GTS 512, 9600 GT 1024 and 9800 GTX are cross-tested for performance and paired with current CPUs like the E2160, E6750, Q6600 and X6800 EE.

http://www.tomshardware.com/reviews/cpu-gpu-upgrade,192...



A very welcome comparison!! Thanks :) 


May 15, 2008 8:07:13 AM

Thanks, a good insight for me as I'm deciding to upgrade my system from my PCI-slotted Dimension 1100.
May 15, 2008 8:37:56 AM

Maybe all the ppl saying, "get a quad, it t0tally pwnz !!" will shut up for awhile.
May 15, 2008 8:42:07 AM

^amen to that
May 15, 2008 9:03:30 AM

I've seen too many times where people have been told that running an S939 at 2.7 and higher wasn't fast enough for their card. At 2.7, an S939 equals about a 2.1 C2D, and this just shows what you'll get. The only thing that's important here is that some games are extremely CPU dependent, and for those select games, the fastest CPU affordable is usually a good idea. But having someone completely update a rig that's used primarily for gaming, running say an older OC'ed S939, just doesn't bring the benefits. Maybe 25% tops, whereas if they'd just buy a better card, they'd get a huge increase. Like I've always said, the best single purchase a gamer can make to better his gaming is a better gfx card.
May 15, 2008 9:51:59 AM

Well, S939 is getting crusty, you should dump that thing if you game ;) 

I meant more the ppl saying get a quad core over any dual core CPU, but then I just dumped my 2800+ with a 7600 GS for a 5400 and an 8800 GTS...and I can tell you that overall it is five times faster, not counting HD access, and that 5 times was measured, not guessed at...1960 or so to 9611 in 3DMark06 ;).


May 15, 2008 10:06:16 AM

This is what I'm saying though: someone came in wanting to upgrade from a 7900GT to an 88xx card. Someone asked what's the CPU? He was running an S939 at 2.7, and instead of saying get the 88xx, they said it'd be a waste and he'd have to upgrade his whole rig, or suffer. This shows that's just not true. There's really no difference between the S939 and S940 with AMD, very little advantage, and if you take that into consideration, along with this article, then it shows that you won't suffer as much as gain. It may turn out a little unbalanced/underperforming, but it'd still be a huge improvement in gaming.
May 15, 2008 12:27:00 PM

royalcrown said:
Maybe all the ppl saying, "get a quad, it t0tally pwnz !!" will shut up for awhile.

From this one review? Explain to me how a Q6600 @ 3.2GHz can't beat a Q6600 @ 2.4GHz with a 9800GTX, despite medium-low res with low FSAA and CPU scaling evident on other CPUs. On a quick glance, something doesn't add up. Likewise, a 7950GT beats a 9800GTX in Crysis? Why test low and very high in the same charts?

I'll look it over in depth later. But at a glance it sure doesn't pan out with Xbit's and other sites' findings: http://www.xbitlabs.com/articles/cpu/display/core2quad-...

BTW, just checked up on this. Firingsquad doesn't show any CPU scaling in COD4 even at low res:
http://www.firingsquad.com/hardware/intel_core_2_duo_e8...



May 15, 2008 12:35:57 PM

pauldh said:
From this one review? Explain to me how a Q6600 @ 3.2GHz can't beat a Q6600 @ 2.4GHz with a 9800GTX, despite medium-low res with low FSAA and CPU scaling evident on other CPUs. On a quick glance, something doesn't add up. Likewise, a 7950GT beats a 9800GTX in Crysis? Why test low and very high in the same charts?

I'll look it over in depth later. But at a glance it sure doesn't pan out with Xbit's and other sites' findings: http://www.xbitlabs.com/articles/cpu/display/core2quad-...

Yep, Tom's Hardware's reviews and benchmarks have been getting shaky lately. :sarcastic: 
May 15, 2008 12:37:25 PM

royalcrown said:
Well, S939 is getting crusty, you should dump that thing if you game ;) 

I meant more the ppl saying get a quad core over any dual core CPU, but then I just dumped my 2800+ with a 7600 GS for a 5400 and an 8800 GTS...and I can tell you that overall it is five times faster, not counting HD access, and that 5 times was measured, not guessed at...1960 or so to 9611 in 3DMark06 ;).




Careful now before you go trash-talking my rig..... ;) 
My crusty old Socket 939 4600 X2 paired with a BFG 8800GTS 512 OC scores just around 9780 in 3DMark06.
'Course the CPU and GPU are both overclocked just a little....

By the way, just why the hell are there no AMD processors in this test?
Did I miss something, or is this just an "Intel Only" club?
That would really shed some light on whether the processor is indeed instrumental with a powerful GPU.
May 15, 2008 12:41:59 PM

jaydeejohn, I see what you are saying. The X2s are still quite good at higher clocks. But at low clocks like 2.0-2.2GHz, they can quite often drag performance down. But FS's $500 gaming PC article shows just how well those CPUs can do overall. http://www.firingsquad.com/hardware/$500_gaming_pc_upgrade/page4.asp
May 15, 2008 1:11:56 PM

Ohh lol all NVidia cards and Intel CPU's...
May 15, 2008 1:28:35 PM

Security said:
Ohh lol all NVidia cards and Intel CPU's...


Well of course, what else in the world do you possibly need for "extensive tests" on CPU vs GPU performance? :sarcastic: 
May 15, 2008 1:39:44 PM

pauldh said:
From this one review? Explain to me how a Q6600 @ 3.2GHz can't beat a Q6600 @ 2.4GHz with a 9800GTX, despite medium-low res with low FSAA and CPU scaling evident on other CPUs. On a quick glance, something doesn't add up. Likewise, a 7950GT beats a 9800GTX in Crysis? Why test low and very high in the same charts?

I'll look it over in depth later. But at a glance it sure doesn't pan out with Xbit's and other sites' findings: http://www.xbitlabs.com/articles/cpu/display/core2quad-...

BTW, just checked up on this. Firingsquad doesn't show any CPU scaling in COD4 even at low res:
http://www.firingsquad.com/hardware/intel_core_2_duo_e8...


@pauldh,

I don't dispute all that, but the fact is right now, for GAMING, most devs are still cutting their teeth and LEARNING dual cores...much less maxing out and optimizing for dual, and the situation is only compounded for quads. If you can get a quad with the same clock for about the same price, sure, go for it, or if you do encoding and stuff for a living, or just a lot of stuff like that, again, go for it. GAMING, however, does not really benefit just yet from quad cores for what they end up costing vs a higher clocked dual core. When the devs really get better at multicore, prices will have dropped by then anyway, and current quads will be old tech compared to what will be out then.
May 15, 2008 1:41:47 PM

I can't see the point in using AMD stuff - it still performs the same basic functions, doesn't it? So the relationship between the basic functions should be the same regardless of who made the hardware.
May 15, 2008 1:42:07 PM

jitpublisher said:
Careful now before you go trash-talking my rig..... ;) 
My crusty old Socket 939 4600 X2 paired with a BFG 8800GTS 512 OC scores just around 9780 in 3DMark06.
'Course the CPU and GPU are both overclocked just a little....

By the way, just why the hell are there no AMD processors in this test?
Did I miss something, or is this just an "Intel Only" club?
That would really shed some light on whether the processor is indeed instrumental with a powerful GPU.



That is a good score! Did I mention mine was stock, so it will be blowing up AFTER yours?! :p  lol j/k
May 15, 2008 1:49:30 PM

^lol, yeah, I just got the 8800 about 2 months ago.
But I've been running the processor at 2.8 for about 2 years now. It will clock to ~3.0-3.1 and run stable (that's where it has to be to get that score); however, it does get pretty darn warm, so I don't leave it there most of the time. I would like to do a complete system upgrade sometime this summer though, just had other things come up that I had to spend the money on so far.
May 15, 2008 2:26:58 PM

I have a few rigs...

AMD X2 2.6 with dual 8800 GTS (G92) - 10k 3DMarks+ (had an FX-60, which is basically the same CPU with an unlocked multiplier...it scored 11k+ in 3DMark overclocked to 2.9)
2.5 Phenom w/ 3870 X2 and 3870 - 13k 3DMarks+ (can't get a decent overclock past 2.6...still tweaking. Got 14k 3DMarks+ at 2.7 but it was unstable in real world games)
Q9300 @ 3GHz w/ 3870 X2 and 3870 - 16k 3DMarks+ (at stock 2.5 it scores nearly the same as the Phenom, just a hair ahead; it's the overclocking potential that's huge here)
QX9650 @ 4GHz with 3 8800GTXs - 21k 3DMarks+ (at stock 3GHz I get 15k 3DMarks)
Gives you an idea of what's out there.

These are all competent at 1920x1080/1920x1200 with full eye candy and playable frame rates (i.e. COD4, Unreal 3, BioShock, Assassin's Creed, Frontlines: Fuel of War, Turok...save Crysis.... Crysis at medium to high is OK...@ 4GHz, very high is almost playable in tri-SLI).
May 15, 2008 2:37:55 PM

jitpublisher said:

By the way, just why the hell are there no AMD processors in this test?
Did I miss something, or is this just an "Intel Only" club?

Nah, AMD just failed to provide Socket 775 compatible processors.

It would be interesting to see if the scaling is different with AMD mainboards/CPUs/GPUs. Then again, how likely is that?
May 15, 2008 2:58:26 PM

Well, the tests showed a bit of what I was expecting.
For gaming experience, you get much more bang for your buck if you upgrade the GPU instead of the CPU.
The gain is much bigger if you have an average CPU and a top-end GPU than vice versa.

Nothing to see here.


"Till they take the GPU from my cold dead hands!!!"
May 15, 2008 3:44:24 PM

royalcrown said:
Maybe all the ppl saying, "get a quad, it t0tally pwnz !!" will shut up for awhile.

I think people say this because they went from an AMD XP or a P4 and upgraded straight to the Intel quad core. Then they think the massive improvement is because of the quad, when really what makes it so fast is the new architecture; they would have seen the same benefits by merely going Core 2 Duo.
May 15, 2008 10:19:37 PM

royalcrown said:
@pauldh,

I don't dispute all that, but the fact is right now, for GAMING, most devs are still cutting their teeth and LEARNING dual cores...much less maxing out and optimizing for dual, and the situation is only compounded for quads. If you can get a quad with the same clock for about the same price, sure, go for it, or if you do encoding and stuff for a living, or just a lot of stuff like that, again, go for it. GAMING, however, does not really benefit just yet from quad cores for what they end up costing vs a higher clocked dual core. When the devs really get better at multicore, prices will have dropped by then anyway, and current quads will be old tech compared to what will be out then.

Yeah, I agree there is really no real world gaming performance difference right now between the uber dual and quad core CPUs. They are all excellent and really pretty equal during actual gaming. Max playable settings in most games are typically GPU limited, not CPU limited, so if you have enough CPU then there would be little difference. The quads are just as good at gaming as the duals, which has been my point all along, so like you said, for the gamer, it comes down to price. Down the road they could potentially be better, like we already see in some games. The only thing I dispute is people who act like the quad is worse at gaming for whatever reason they care to pull out of a hat. And I do dispute people who were pushing the E8400 at any price, like back when it was more expensive than the Q6600.

Moving on to CPU scaling tests (low res, no FSAA), the Q6600 at high clocks tends to keep up with a dual core at even higher clocks. Look at Xbit for an example in all the games they test. But let's just say, for argument's sake, that clocked the same, the quad and the dual keep up with each other. What I pointed out above is, first, the 2.4GHz Q6600 edged out the 3.2GHz Q6600 (something is wrong). Second, the 2.67GHz E6750 beat the 3.2GHz Q6600 by 10 fps (something is really wrong). Xbit shows how well the Q6600 scales with higher clocks, even pulling ahead of an even higher clocked dual. I'm a little skeptical right off the bat because of the massive E6750 lead when clocked 500+ MHz less than the Q6600.
May 15, 2008 10:27:03 PM

pauldh said:
Yeah, I agree there is really no real world gaming performance difference right now between the uber dual and quad core CPUs. They are all excellent and really pretty equal during actual gaming. Max playable settings in most games are typically GPU limited, not CPU limited, so if you have enough CPU then there would be little difference. The quads are just as good at gaming as the duals, which has been my point all along, so like you said, for the gamer, it comes down to price. Down the road they could potentially be better, like we already see in some games. The only thing I dispute is people who act like the quad is worse at gaming for whatever reason they care to pull out of a hat. And I do dispute people who were pushing the E8400 at any price, like back when it was more expensive than the Q6600.

Moving on to CPU scaling tests (low res, no FSAA), the Q6600 at high clocks tends to keep up with a dual core at even higher clocks. Look at Xbit for an example in all the games they test. But let's just say, for argument's sake, that clocked the same, the quad and the dual keep up with each other. What I pointed out above is, first, the 2.4GHz Q6600 edged out the 3.2GHz Q6600 (something is wrong). Second, the 2.67GHz E6750 beat the 3.2GHz Q6600 by 10 fps (something is really wrong). Xbit shows how well the Q6600 scales with higher clocks, even pulling ahead of an even higher clocked dual. I'm a little skeptical right off the bat because of the massive E6750 lead when clocked 500+ MHz less than the Q6600.

Yep, those results simply conflict with all existing benchmarks before it. It's not possible that all existing benchmarks are wrong, therefore that one must be wrong. :sarcastic: 

Tomshardware is really slipping lately.
May 15, 2008 10:50:51 PM

dagger said:
Yep, those results simply conflict with all existing benchmarks before it. It's not possible that all existing benchmarks are wrong, therefore that one must be wrong. :sarcastic: 

Tomshardware is really slipping lately.

Well, I'm not going to jump all over them just yet, as I really have not read the review, only very quickly glanced at the Crysis and COD4 charts. But the COD4 results I had seen look like they were done on different systems altogether, not just different CPUs.

It seemed odd in Crysis that they had no performance boost in the GPU bench from 2.4GHz to 3.2GHz. At 1680x1050, all high, 2xAA/16xAF, I gained 4-5 fps in the GPU bench OC'ing my Q6600 from 2.4 to 3.0GHz. Yet at 12x10 high they gained 0.1 fps going to 3.2GHz. It's like they dropped the mem clocks or something. I will have to test 12x10 high with a single 8800GT to see if I have gains, or if it's only with SLI 8800GT. Is it possible the single 9800GTX was holding things back at 12x10 high?

How about you? In Crysis with your system, what difference do you see in the GPU bench by OC'ing your Q?
May 15, 2008 11:00:47 PM

pauldh said:
Well, I'm not going to jump all over them just yet, as I really have not read the review, only very quickly glanced at the Crysis and COD4 charts. But the COD4 results I had seen look like they were done on different systems altogether, not just different CPUs.

It seemed odd in Crysis that they had no performance boost in the GPU bench from 2.4GHz to 3.2GHz. At 1680x1050, all high, 2xAA/16xAF, I gained 4-5 fps in the GPU bench OC'ing my Q6600 from 2.4 to 3.0GHz. Yet at 12x10 high they gained 0.1 fps going to 3.2GHz. It's like they dropped the mem clocks or something. I will have to test 12x10 high with a single 8800GT to see if I have gains, or if it's only with SLI 8800GT. Is it possible the single 9800GTX was holding things back at 12x10 high?

How about you? In Crysis with your system, what difference do you see in the GPU bench by OC'ing your Q?

Well, I did notice that the frame rate drop in heavy battle sequences, when there are lots of AI blowing up lots of destructible environment, mostly went away at 3.6GHz compared to 2.4GHz. It's the difference between a constant 50fps and 45-50fps. :p 

Didn't actually bench it, too lazy. :D 
May 15, 2008 11:05:13 PM

lol

Can you overclock an 8400 GS to 8800 GT level?

No..

A $50 E2160 can be overclocked to E8200-level performance...

Making the decision between a graphics card or CPU upgrade for gaming pointless..




May 15, 2008 11:21:22 PM

Yeah, let's hope future Intel budget chips can OC / perform like a beast too.
May 16, 2008 2:18:11 AM

Paul, thanx for the link. Looking at CoD4, it shows a whole whopping 10% performance gain using an X6800 vs a 5200 X2 on an 8800GT. For the money, I'd be getting a better gfx card.
May 16, 2008 2:36:33 AM

Just wanted to post back crysis gpu bench performance numbers I did tonight. Glad I tested the single 8800GT and not just SLI 8800GT. Read on to see why.

AT 1280x1024 all high 0xaa/16xaf:
Q6600@3.0GHz SLI 8800GT - 47.03 fps ave
Q6600@2.4GHz SLI 8800GT - 41.72 fps ave
Q6600@3.0GHz Single 8800GT - 40.55 fps ave
Q6600@2.4GHz Single 8800GT - 40.56 fps ave

So basically, (edit: similar to what Tom's review showed with a single 9800GTX) with a single 8800GT at 12x10 high 0x/16x it was GPU limited...same performance with the Q6600 at stock or OC'ed to 3.0GHz. But with SLI 8800GT it was CPU limited at stock 2.4GHz and maybe even at 3.0GHz. With SLI, the 3.0GHz OC gave a 5.31 fps higher average (a 12.73% increase).


Now lets look at the settings I play Crysis at.

1680x1050 all high 2xaa/16xaf:
Q6600@3.0GHz SLI 8800GT - 37.83 fps ave
Q6600@2.4GHz SLI 8800GT - 34.94 fps ave
Q6600@3.0GHz Single 8800GT - 26.26 fps ave
Q6600@2.4GHz Single 8800GT - 26.25 fps ave

So even at these 16x10 high 2xaa/16xaf settings there was still a gain of 2.89 fps (8.27%) overclocking the Q6600. And as expected things were GPU bound again with the single 8800GT with no gain seen OC'ing the CPU.

Comparing % gain going from single 8800GT to SLI 8800GT:
12x10 high 0xaa/16xaf:
Q6600 @ 2.4GHz - (+ 2.86%)
Q6600 @ 3.0GHz - (+ 15.98%)

1680x1050 high 2xaa/16xaf:
Q6600 @ 2.4GHz - (+ 33.10%)
Q6600 @ 3.0GHz - (+ 44.06%)

To sum it up... If the GPU benchmark (note I say if) reflects real world Crysis gaming, then SLI owners should overclock their CPU. Single GPU owners probably would see no gain overclocking their CPU while running high details. In real gaming, with AI opponents and physics effects, the CPU may come into play more than in the Crysis GPU benchmark.
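
For anyone who wants to double-check those percentages, here is a minimal Python sketch that recomputes the gains from the fps averages listed above. The fps values are the only inputs; the script itself is just illustrative arithmetic, not part of the actual testing.

# Recompute the % gains quoted above from the averaged Crysis GPU-bench fps.
# Keys: (settings, CPU clock, GPU setup) -> average fps taken from the post.
results = {
    ("12x10 high 0xAA/16xAF", "2.4GHz", "single"): 40.56,
    ("12x10 high 0xAA/16xAF", "3.0GHz", "single"): 40.55,
    ("12x10 high 0xAA/16xAF", "2.4GHz", "SLI"): 41.72,
    ("12x10 high 0xAA/16xAF", "3.0GHz", "SLI"): 47.03,
    ("16x10 high 2xAA/16xAF", "2.4GHz", "single"): 26.25,
    ("16x10 high 2xAA/16xAF", "3.0GHz", "single"): 26.26,
    ("16x10 high 2xAA/16xAF", "2.4GHz", "SLI"): 34.94,
    ("16x10 high 2xAA/16xAF", "3.0GHz", "SLI"): 37.83,
}

def pct_gain(new, old):
    """Percentage improvement of new over old."""
    return (new - old) / old * 100.0

for settings in ("12x10 high 0xAA/16xAF", "16x10 high 2xAA/16xAF"):
    # Gain from overclocking the Q6600 (2.4GHz -> 3.0GHz) for each GPU setup.
    for gpus in ("single", "SLI"):
        gain = pct_gain(results[(settings, "3.0GHz", gpus)],
                        results[(settings, "2.4GHz", gpus)])
        print(f"{settings}, {gpus} 8800GT: CPU OC gain = {gain:+.2f}%")
    # Gain from adding a second card (single -> SLI) at each CPU clock.
    for clock in ("2.4GHz", "3.0GHz"):
        gain = pct_gain(results[(settings, clock, "SLI")],
                        results[(settings, clock, "single")])
        print(f"{settings}, Q6600 @ {clock}: single -> SLI gain = {gain:+.2f}%")

Run as-is, it prints the 12.73% and 8.27% CPU overclock gains with SLI, the near-zero gains with a single card, and the 2.86%/15.98% and 33.10%/44.06% single-to-SLI gains quoted above.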
May 16, 2008 2:40:19 AM

JAYDEEJOHN said:
Paul, thanx for the link. Looking at CoD4, it shows a whole whopping 10% performance gain using an X6800 vs a 5200 X2 on an 8800GT. For the money, I'd be getting a better gfx card.

For COD4, it sure looks like that is the best bet. Of course, FS probably uses a scripted timedemo, so real world gaming could very well be a little different. Driverheaven tests only real world gaming, and apples-to-apples settings unlike [H], so I like to compare results to theirs when possible.
May 16, 2008 2:42:50 AM

Great post and great depth. Glad you're doing this, it helps a lot of people. Here's a question: if an OC helps an SLI setup, could it be that the gains we aren't seeing from quad card setups are mainly CPU limitations?
May 16, 2008 2:57:54 AM

Thx. (edit: I got a little confused on Tom's charts. Their very high Crysis showed no scaling. Their 12x10 high in Crysis showed some scaling. COD4 showed no scaling with the Q6600.) Tom's review got me thinking whether the gains I saw when OC'ing my Q only came paired with SLI, since they had no Crysis CPU scaling with a single G92. Since it was the GPU bench, I knew I could duplicate their results, and as it turned out, with one card I got no scaling, just like them. Honestly I was kinda shocked to get results like Tom's 12x10. I never thought things would be all GPU bound at 12x10 high, no FSAA (I expected some CPU scaling with the extra 600MHz of CPU clock).

Good question on the quad GPUs. I'd guess it sure could be true. I would think if I had run 16x10 4x/16x instead of 2x/16x, then the two CPU speeds may have equaled out (GPU limited). So in the case of Quad SLI, I would think once you push 19x12 with AA, things should shift more to the GPUs. But at what point is it truly GPU bound? Not sure. Also, I guess if you push res & FSAA too far we could have a VRAM limit come into play.

I always like review sites to use massive GPU power like SLI 8800GTX for doing their CPU scaling tests. Then, like FS, I like to see low res no AA as well as, say, 16x12 4x/16x (depending on the game) to get an idea of scaling and gameplay.
May 16, 2008 3:03:46 AM

Yeah, a thorough, complete idea of what you'd have to expect in your gaming, no matter the game, res, or eye candy.
May 16, 2008 3:15:40 AM

I'm extremely glad that this review came out. I was toying with getting a Q9450 instead of an E2160 (original plan), but after reading it, I'll stick with the E2160 OC and get the Q9450 when prices on quads come down. With the extra $300 I'm saving, I can upgrade my rig elsewhere. Extremely helpful and informative. Thanks Tom's!!
May 16, 2008 3:23:59 AM

pauldh said:
Just wanted to post back crysis gpu bench performance numbers I did tonight. Glad I tested the single 8800GT and not just SLI 8800GT. Read on to see why.

AT 1280x1024 all high 0xaa/16xaf:
Q6600@3.0GHz SLI 8800GT - 47.03 fps ave
Q6600@2.4GHz SLI 8800GT - 41.72 fps ave
Q6600@3.0GHz Single 8800GT - 40.55 fps ave
Q6600@2.4GHz Single 8800GT - 40.56 fps ave

So basically, (edit: similar to what Tom's review showed with a single 9800GTX) with a single 8800GT at 12x10 high 0x/16x it was GPU limited...same performance with the Q6600 at stock or OC'ed to 3.0GHz. But with SLI 8800GT it was CPU limited at stock 2.4GHz and maybe even at 3.0GHz. With SLI, the 3.0GHz OC gave a 5.31 fps higher average (a 12.73% increase).


Now lets look at the settings I play Crysis at.

1680x1050 all high 2xaa/16xaf:
Q6600@3.0GHz SLI 8800GT - 37.83 fps ave
Q6600@2.4GHz SLI 8800GT - 34.94 fps ave
Q6600@3.0GHz Single 8800GT - 26.26 fps ave
Q6600@2.4GHz Single 8800GT - 26.25 fps ave

So even at these 16x10 high 2xaa/16xaf settings there was still a gain of 2.89 fps (8.27%) overclocking the Q6600. And as expected things were GPU bound again with the single 8800GT with no gain seen OC'ing the CPU.

Comparing % gain going from single 8800GT to SLI 8800GT:
12x10 high 0xaa/16xaf:
Q6600 @ 2.4GHz - (+ 2.86%)
Q6600 @ 3.0GHz - (+ 15.98%)

1680x1050 high 2xaa/16xaf:
Q6600 @ 2.4GHz - (+ 33.10%)
Q6600 @ 3.0GHz - (+ 44.06%)

To sum it up... If the GPU benchmark (note I say if) reflects real world Crysis gaming, then SLI owners should overclock their CPU. Single GPU owners probably would see no gain overclocking their CPU while running high details. In real gaming, with AI opponents and physics effects, the CPU may come into play more than in the Crysis GPU benchmark.

Lol, Tom's should fire whatever grub did their benches recently and hire you. :D 
May 16, 2008 3:29:15 AM

hispeed120 said:
I'm extremely glad that this review came out. I was toying with getting a Q9450 instead of an E2160 (original plan), but after reading it, I'll stick with the E2160 OC and get the Q9450 when prices on quads come down. With the extra $300 I'm saving, I can upgrade my rig elsewhere. Extremely helpful and informative. Thanks Tom's!!

If you look at the overall total of fps, you will see the Q6600 and a 9600GT outperforming the E2160 and a 9800GTX. The same goes if you OC both CPUs. That's kind of amazing as a whole, to see a CPU make up for a double-the-cost GPU difference. BUT, rather than look at that overall total, look at your games at the res you hope to play at, and then make your decision. In general, a C2D at 2.4GHz or above is quite good for gaming.
May 16, 2008 3:34:47 AM

Well, thx dagger. Still have not read over every game they benched, but apart from their COD4 results, I am starting to appreciate their review more. I still don't like the 7950GT being tested at low details and stuck in the Crysis charts. And their lack of Q6600 clock speed scaling doesn't add up with other reviews that tested at 10x7.