Bottlenecking Tests for 2009?
Tags:
- Future Article Ideas
- Bottleneck
- CPUs
- Tom's Hardware
Last response: in Site Feedback
Hey guys,
This thread made me realize that there's no article (AFAIK) covering bottlenecks caused by low-end CPUs to more contemporary PCIe cards. Maybe something to revisit?
Regards,
r_manic
Excellent idea, along with a brief explanation of what a bottleneck truly means. There have been many a thread about how a mid-range card is affected by a mid/low-range CPU.
I think an article like this maybe won't get ads, but it'll certainly get readers, much like the P2 vs i7 article did.
strangestranger
July 28, 2009 10:22:56 AM
wuzy
July 28, 2009 6:45:09 PM
Another PCIe lane-limitation test through taping would be most welcome, this time for the latest top-end single-GPU cards and PCIe 2.0 x16.
In fact, they should do it for every new generation of cards.
To BoM: Want more readers? Here's your chance. It's one of the few THG-original review ideas that's still selling like hot pancakes.
Master Exon
July 28, 2009 9:48:02 PM
strangestranger
July 30, 2009 12:04:24 AM
Those bandwidth articles are pathetic; they do nothing but run pointless tests without actually explaining the reasons why.
Look at any forum around the net and no one, no one, can actually explain why usage would go up.
The thinking is "the more powerful, the more usage", without any backing up of why.
Show me an article which explains, without conjecture, just cold hard facts about what causes bandwidth usage.
People will run tests, look at FPS differences (usually averages) and try to deduce things from them, but they show nothing, nothing at all of any use.
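To make the point concrete: one quick heuristic forum posters do use (though, as noted above, it only classifies the limit, it doesn't explain the cause) is resolution scaling. If average FPS barely improves when you drop the resolution, the GPU wasn't the limit in the first place. This is a minimal sketch of that reasoning; the function name, threshold, and FPS figures are all hypothetical, not from any actual benchmark.

```python
def bottleneck_hint(fps_low_res, fps_high_res, threshold=0.10):
    """Classify a run as CPU-limited or GPU-limited.

    fps_low_res:  average FPS at a low resolution (e.g. 800x600)
    fps_high_res: average FPS at a high resolution (e.g. 1920x1200)
    threshold:    relative FPS gain below which we call the run CPU-limited
    """
    # Relative gain from lowering the resolution. Lower resolution
    # reduces GPU load, so a large gain means the GPU was the limit.
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return "CPU-limited" if gain < threshold else "GPU-limited"

# Dropping resolution barely helps -> the CPU was the limit
print(bottleneck_hint(62.0, 60.0))    # -> CPU-limited
# Dropping resolution nearly doubles FPS -> the GPU was the limit
print(bottleneck_hint(110.0, 60.0))   # -> GPU-limited
```

Of course this says nothing about *why* bandwidth or CPU usage rises, which is exactly the gap in those articles.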
wuzy
July 30, 2009 8:26:09 AM
I find it's always the conclusion that many authors can get caught out on.
It has to be a fine balance between making it as specific as possible and not being caught out by generalisation (ah yes, I don't forget easily), while making the above understandable to the general public at the same time.
A large proportion of readers skip straight to the conclusion, I believe.
Being a little different here, this is what I do when I read a review of core components:
1. Scan-read through the intros.
2. Check the test platform (hardware AND software) to make sure there's no unfair bias (e.g. if an SSD is being tested with Windows XP, stop reading).
3. Start reading the data and look for real-world usage numbers, especially those relating to your work. Disregard all raw synthetic benchmarks unless I'm researching on an academic basis.
4. Read the author's conclusion. If it matches mine, he/she gets a big tick in my book (very hard to come by, btw). Partially matching: OK, a pass. Totally biased and unfair: BLACKLISTED!
^^ Agreed. I think synthetics are used because they may be popular, but they're often pointless. Matching real-world apps is a huge bonus, but apps in the same family/type differ too wildly. If the setup isn't made as close as it can be, without a mention of or caveat for any changes, it spells that the reviewer doesn't know his stuff and isn't at least following through. Some sites are smaller and simply don't always have on hand what's needed, which may or may not cloud the final verdict.
strangestranger
July 30, 2009 1:32:01 PM