Urgent question about Graphics Card for a CAD Lab

June 18, 2013 7:36:21 PM

I'm a professor who teaches Advanced AutoCAD (3D rendering), and we are speccing new systems for our four-year update. The problem we've been having is that the computers are usually fine for the first year or two, but then, because students are always pushing the systems to their limits, the renderings take forever and the machines crash more frequently during rendering.

In addition to CAD rendering, the computers are also used for Photoshop, word processing, SketchUp, and general computing.

Our contract is with Dell, and our budget is $3,000 per system, so I sent in a suggested spec for an Alienware Aurora with an Nvidia GeForce GTX 690 4GB GDDR5. This is a gaming card, to be sure, and I'm aware of the gaming vs. pro card differences for CAD. However, the response I received was for a system with a Quadro 4000 2GB. This is essentially an Nvidia GTX 550 Ti, which by the time we get the systems in the fall will be nearly three years old.

I know the Quadro drivers are optimized for CAD and 3D rendering, but my question is this:

Will the optimized drivers make up for the fact that the card is so far behind what is available on the gaming-card market? The price point of pro cards usually means you get a significantly older card, dollar for dollar. To me, this defeats the purpose of trying to make the systems as future-proof as possible. If the GTX 690 is significantly better than the Quadro 4000, even with the optimized drivers, how do I convince our tech department of this?

Thanks!
Fred
June 18, 2013 7:52:11 PM

first of all, Nvidia's 600 series has HORRIBLE CUDA performance per dollar. you'll be much better off with AMD's stream processors, even though each one is not as capable (you get more than double the number of them compared to the GTX 600 series).

the easiest way to convince you of this is with benchmarks: note how the 690 got destroyed by the much cheaper 7970 in compute and Maya rendering:

http://www.tomshardware.com/reviews/geforce-gtx-690-ben...

if you're happy to spend $1,000 per GPU, then get the GTX Titan. look at the 3D rendering performance in this review comparing the Titan, the 780, and the 680 (the 690 is basically two 680s):

http://www.tomshardware.com/reviews/geforce-gtx-780-per...


Also, Dell is correct in recommending a Quadro card for your purposes, as it would provide much more reliable performance. the architectures of the two card lines are inherently different, to suit different purposes. also keep in mind what gaming cards are built for: sure, they can work fast, but in the WORST CASE they're allowed to just stutter or crash, and in most cases all that generates is a few runt frames before the game moves on.

Lastly, how long are the renderings your students try to do? as you may know, it's impossible to catch every bug that hangs up a rendering session, especially in a teaching environment. perhaps it would be better to have the students render to individual frames and stitch the movie together afterwards? that way, if the renderer crashes, they can pick up from the frame where it crashed.
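the render-to-frames workflow also makes crash recovery mechanical. here's a minimal Python sketch of the resume logic, assuming frames are written out as numbered image files (the `frame_<n>.png` naming is purely illustrative, not any particular renderer's convention):

```python
import os
import re

def rendered_frames(output_dir):
    """Scan the output directory for files named frame_<n>.png and
    return the set of frame numbers that already finished."""
    done = set()
    for name in os.listdir(output_dir):
        match = re.fullmatch(r"frame_(\d+)\.png", name)
        if match:
            done.add(int(match.group(1)))
    return done

def remaining_frames(done, total_frames):
    """Frames still to render; after a crash, re-queue only these
    instead of restarting the whole animation."""
    return [f for f in range(total_frames) if f not in done]
```

so if a 300-frame animation dies at frame 117, a student re-runs the batch over `remaining_frames(rendered_frames(out_dir), 300)` and loses one frame of work instead of an evening.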
June 18, 2013 8:09:32 PM

what exactly were the specs of the Aurora you sent in? can you post them?
June 18, 2013 9:00:17 PM

Fred Grivas,

Your CAD workstation graphics card question is one I was forced to consider carefully in the recent past, as I made the shift to 3D CAD in 2010. The thing is, the requirements for image quality in 3D modeling and rendering establish very different priorities from gaming-oriented hardware > CPU, RAM, and GPU included.

It's tempting to believe that gaming / consumer graphics cards give higher performance for half the cost, but above a certain point the opposite is the fact. Certainly gaming cards are faster, but the cost of using a gaming card in an imaging workstation can be many times higher. Gaming priorities have evolved hardware aimed at the highest frame rates at the highest possible settings, while workstation cards focus on image quality and stability, which is also not as fast. And while benchmarks are very useful for making comparisons within their respective categories, benchmarks are quantitative instead of qualitative. Gaming frame rates simply can't be given the same weight as image quality in choosing workstation hardware.

Importantly, there are image creation tasks that a GTX or Radeon HD simply will not perform. When I substituted a GeForce GTX 285 (1GB) for the original Quadro FX 580 (512MB) in a Dell Precision T5400, the results at first were encouraging, smooth 3D model navigation being the greatest benefit, but when the model became large and I tried extracting 2D images and renderings, the results were disastrous > rendering artifacts and crashes, bizarre shadows, inability to use multiple lighting sources, limited anti-aliasing, textures that would sometimes fail to display in renderings, balky 3D navigation (if I didn't keep the image in constant motion, it would freeze), and an inability to open viewports in Solidworks. The GTX 285 (1GB) had been carefully chosen for its hardware and configuration similarities to the Quadro FX 5800 (4GB, CAD and video editing oriented): same GPU, 512-bit memory bus, and 240 CUDA cores. In general, the only difference between the two was the drivers, yet the two cards were just from different planets in workstation use.

I replaced the GTX 285 (cost new, $350) with a Quadro FX 4800 (cost new, $1,200), and while the Passmark rating declined slightly from 1909 to 1859 with the Quadro, all the quality and reliability glitches disappeared. Breathing could be resumed when running renderings. In terms of the time saved by not running renderings multiple times due to crashes and bizarre behavior, and the time not spent in diagnostic frustration, even if a GTX were ten times faster than a Quadro, it would still lose economically. For example, if a medium-sized architectural firm loses five hours in one month to failed renderings, that loss would buy a new Quadro K5000. In business, the extremity of overhead and lost revenue demands the highest reliability. On a different scale, if a server at amazon.com fails, it costs the firm $500,000 per hour, and a two-hour failure justifies replacing the entire multi-million-dollar server.
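The economics above reduce to a one-line break-even check. A hedged sketch in Python (the card prices are from my own case; the billing rate is an illustrative assumption, not a quoted figure):

```python
def breakeven_hours(workstation_card, gaming_card, billing_rate):
    """Billable hours of downtime at which the lost revenue equals
    the price premium of the workstation card."""
    premium = workstation_card - gaming_card
    return premium / billing_rate

# Quadro FX 4800 at $1,200 vs GTX 285 at $350, at an assumed
# $150/hour billing rate: the $850 premium is covered by less
# than six hours of avoided downtime.
hours = breakeven_hours(1200, 350, 150)
```

The point is not the exact numbers but how quickly any plausible rate covers the card premium.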

This and past experience were such that, after consideration, I decided the only route to a reliable, high-image-quality workstation had to be > Xeon > ECC RAM > Quadro. This conclusion was so complex and hard-won that in January I started a thread on this subject to describe the important advantages of that configuration for content creation systems >

http://www.tomshardware.com/forum/386333-33-quadro-gefo...

> and apparently others have had this same difficult decision to make, as that thread has 15 posts and has been read 11,500 times. The performance-to-cost numbers of GeForce / Radeon are just too tempting to those who don't have the relevant qualitative experience. The thing is, up to a certain level of image quality expectation, consumer cards do work, but as I discovered, the line over into vast time-wasting and frustration was crossed quickly.

Possibly you might find a convincing summary argument for your colleague in the thread linked above. In the last post there is a consideration of the AMD FirePro, with a quote from a Newegg FirePro user that makes a good summary of this difficult equation >


This user review on Newegg of the FirePro W8000 puts it very well and is obviously based on experience that I don't have, but with similar conclusions >

>""Pros: My Company bought one [Firepro W8000] for a new workstation build. It runs plenty fast... I ran a benchmark on it and purely from a graphics rating perspective, it's faster than a GTX 680. That means that it's about twice as expensive as gaming cards for the raw computing power you get... But this is Workstation graphics.

Cons: Drivers are still being updated constantly... Nvidia has a reputation for better drivers than AMD.

Other Thoughts: In terms of raw power, this card is great, better than the quadro 5000 (not sure about the k5000 yet). The problem you could run into is drivers and support for your application. As the drivers are optimized, the hardware will be able to utilize its full potential.. As I said before, the power in the GPU Passmark benchmark that I ran is faster than GTX 680. In actual applications it depends on what you're doing... Also it's hard to say where the W8000 will ultimately end up; even then though, you still have to look at your application. It's not easy to just say buy this since there are so many variables on the driver/software end of things. As for hardware, AMD has a reputation for giving you more for your money than Nvidia. ""< END
___________________________________

After a lot of thought, and because it's so easy to copy and paste, the following is my specification for my ideal, not outrageously priced, all-rounder imaging system. This system is designed for the applications I use > 2D / 3D CAD, 3D modeling for architecture and industrial design, rendering, and graphic design, in the context of trying for the best current cost / performance / reliability configuration. By coincidence, the cost is similar to your current proposed system budget >

BambiBoom PixelDozer Cadaedigrapharific IV®®©™℞©™℞_5.31.13

1. Xeon E5-1650 6-core 3.2 / 3.8GHz $600 ( http://ark.intel.com/products/64601 , Very strong computational power; the E5-1650 is ranked No. 14 on the Passmark CPU benchmarks. Six cores / twelve threads at this speed are sufficient for good rendering performance = much faster than the eight cores of my current Precision T5400 )

2. Noctua NH-D14 SE2011 140mm and 120mm PWM SSO CPU Cooler $90

3. ASUS P9X79 WS LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 SSI CEB $380

4. Kingston 32GB (4X 8GB) 240-Pin DDR3 SDRAM DDR3 1600 ECC Unbuffered Server Memory w/TS Intel Model KVR16E11/8I $300 ( The 32GB is due to using six or seven applications simultaneously, and sometimes multiple files up to 800MB )

5. NVIDIA Quadro K4000 3GB GDDR5 PCI Express 2.0 x16 Workstation Video Card $800

6. SAMSUNG 840 Pro Series MZ-7PD256BW 2.5" 256GB SATA III MLC Internal Solid State Drive (SSD) $250 (operating system and applications)

7. 2X WESTERN DIGITAL 1TB HARD DRIVE SATA 64MB 6 Gb/s WD AV-GP (RAID 1) $170 (Data and mirroring backup. I use a separate system for sound, or I would make these drives 4TB)

8. ASUS Black Blu-ray Burner SATA BW-12B1ST/BLK/G/AS $85

9. LIAN LI PC-A75 Black Aluminum ATX Full Tower Computer Case $182

10. SeaSonic X750 Gold 750W ATX12V V2.3/EPS 12V V2.91 SLI Ready 80 PLUS GOLD Certified Full Modular Active PFC Power Supply $150

11. Microsoft Windows 7 Ultimate SP1 64-bit - OEM $190

____________________________________________________

TOTAL > about $3,200

SAMSUNG S27A850T Matte Black 27" 5ms GTG Widescreen LED Backlight LCD Monitor 2560 x 1440 $730
SAMSUNG S24A850DW Matte Black 24" 5ms (GTG) Widescreen LED Backlight LCD Monitor 1920 x 1200 $430
_______________________________________________________

In the future, I would like to be able to run mechanical, structural, thermal, and fluid systems simulations, as well as high resolution animations of 3D models. And here, for entertainment purposes, is a Personal Supercomputer oriented towards high performance and optimized for GPU / CUDA accelerated applications, using the most demanding applications as performance references > particle, molecular (NAMD), aerodynamic, 3D CGI animation, video processing / rendering, thermal, structural, atmospheric, and oceanic modeling >

http://www.nvidia.com/object/gpu-accelerated-applicatio...

BambiBoom PixelCannon Cadaeditographarific Supermodeler VI ®™℞©™℞®©_ 6.1.13

1. (2) Intel Xeon E5-2687W Sandy Bridge-EP 3.1GHz (3.8GHz Turbo Boost) LGA 2011 150W 8-Core Server Processor BX80621E52687W $3,869.98 ($1,934.99 each) (Providing 16 cores / 32 threads. The E5-2687W is currently no. 2 on Passmark CPU benchmarks and the sixth fastest system in Passmark baseline ratings, using a Quadro 6000)

2. (2) COOLER MASTER Hyper 212 EVO RR-212E-20PK-R2 Continuous Direct Contact 120mm Sleeve CPU Cooler $67.98 ($33.99 each) (This is an open category, and the Noctua listed above may be a better choice here, but this particular cooler is said to be efficient and very quiet)

3. Intel S2600COE SSI EEB Server Motherboard Dual LGA 2011 DDR3 1600 $599.99

4. 128GB Kingston (8 x16GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) ECC Registered Server Memory $1238.98

5. LSI MegaRAID Internal Low-Power SATA/SAS 9260-8i 6Gb/s PCI-Express 2.0 w/ 512MB onboard memory RAID Controller Card, Single $498.99

6. PNY VCQ6000-PB Quadro 6000 6GB 384-bit GDDR5 PCI Express 2.0 x16 Workstation Video Card $3,658.99

7. NVIDIA TESLA K20 (900-22081-2220-000) GK110 5GB 320-bit GDDR5 PCI Express 2.0 x16 Workstation Video Card $3,499.99 (coprocessor for GPU accelerated applications)

8. (2) SAMSUNG 840 Pro Series MZ-7PD512BW 2.5" 512GB SATA III MLC Internal Solid State Drive (SSD) (RAID 0) $1,039.98 ($519.99 each) (OS and Applications)

9. (5) Western Digital RE WD4000FYYZ 4TB 7200 RPM 64MB Cache SATA 6.0Gb/s 3.5" Enterprise Internal Hard Drive (RAID 10) $2,149.95

10. LIAN LI PC-V2120X All Black Aluminum ATX Full Tower Computer Case $469.99

11. CORSAIR AXi AX1200i 1200W Digital ATX12V v2.31 and EPS 2.92 SLI Ready CrossFire Ready 80 PLUS PLATINUM Certified Full Modular Active PFC Power Supply $329.99

12. ASUS Black Blu-ray Burner SATA $79.99

13. Microsoft Windows 7 Ultimate SP1 64-bit - OEM $190

14. (2) NEC Display Solutions PA301W-BK-SV Black 30" 7ms Pivot, Swivel & Height Adjustable IPS Panel Widescreen Color-Critical Desktop Monitor $4,800 ($2,400 each)

____________________

Total = $22,448.79

____________________________________________

In summary, a system optimized for content creation has to be viewed as the result of an entirely different set of priorities, and of consequential hardware choices reverse-engineered from the intended applications, as compared to a system optimized for content consumption.


Cheers,

BambiBoom

> My current system and applications > [Dell Precision T5400 > 2X Xeon X5460 quad core @ 3.16GHz, 16 GB ECC, Quadro FX 4800 (1.5GB), WD RE4 / Seagate Barracuda 500GB, M-Audio 2496 "Audiophile" soundcard / Logitech Z2300 > Windows 7 Ult > AutoCad, Revit, Solidworks, Sketchup, Adobe CS MC, Corel Technical Designer, WP Office, MS Office > Monitor > HP 27" 2711x @ 1920 X 1080]
June 19, 2013 5:45:34 AM

When I used to work in the engineering field (I was an AutoCAD / 3ds Max professional; my largest project was creating the Reader's Digest chiller room in 3D before a mechanical renovation), I had a simple equation for the cost breakdown of a workstation: whatever the budget is for the workstation, half should go towards the graphics card.

With that in mind Fred, start with a Quadro K5000 - http://www.newegg.com/Product/Product.aspx?Item=N82E168... and build the system around that.

Ultimately, you want your students to be able to achieve what they're envisioning, and the only way to see that come to fruition years down the road is to have the most powerful GPU designed for the task.
June 19, 2013 10:40:31 AM

dingo07,

The Quadro K5000 would be my first choice for my own system if I had a very healthy budget in the $7-8,000 range. It's excellent in 2D, 3D, and processing power, has plenty of memory, and of course uses the Quadro drivers that provide all those benefits in workstation applications.

As much as I appreciate the K5000, I'm puzzled by the idea / policy of, "Whatever the budget is for the workstation, half should be used towards a graphics card." Certainly, with workstations, the graphics cards are expensive, but in consideration of our friend Fred Grivas' proposed CAD lab system, spending $1,800 on the graphics card for a $3,000 system is disproportionate. That would mean that the CPU, motherboard, RAM, case, PSU, and operating system could total only $1,200. And as a workstation-competent CPU that can exploit the K5000's potential in a 3D CAD and rendering platform, meaning a clock speed in the 3GHz-or-better range and six cores, would cost at least $1,000, there's very little left for the rest of the system. I would say a K5000 system should use, as a minimum, something like an E5-1660 (6-core, 3.3 / 3.9GHz) at about $1,100, but is more likely to have dual Xeons. The $3,200 system outlined in my previous post used the best cost / performance Xeon E5 at $600, and while a cheaper motherboard (-$100), reduced RAM (-$150), a lower quality PSU (-$40), and a cheaper case (-$60) would save about $350, adding a K5000 would bring the cost to about $3,850 ($3,200 - $350 - $800 (Quadro K4000) + $1,800 (Quadro K5000)).
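The substitution arithmetic in that last sentence, spelled out as a small Python sketch (all figures are the estimates from this post, not vendor quotes):

```python
def k5000_swap_total(base_total, component_savings, k4000_price, k5000_price):
    """Price of the $3,200 build after trimming components and
    swapping the Quadro K4000 for a K5000."""
    return base_total - sum(component_savings) - k4000_price + k5000_price

# cheaper motherboard, reduced RAM, lesser PSU, cheaper case
savings = [100, 150, 40, 60]
total = k5000_swap_total(3200, savings, k4000_price=800, k5000_price=1800)
# total comes to $3,850 - still roughly $850 over the $3,000 budget.
```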

In my view, in terms of budget / performance proportions, a workstation graphics card is more likely to cost in the 1/3 range, in this example meaning a K4000.

Certainly, I would enjoy seeing a specification for a system using the cost proportions you mention. It's possible, but I tend to believe that even when the system is quite expensive, $7-8,000, it may barely justify / allow a $3,500 Quadro 6000, because everything else will be substantially more expensive. One of the systems in the Passmark "Top 100" is a Xeon > Quadro 6000 system that uses two eight-core Xeon E5-2687W's ($3,900), 64GB RAM ($600), a $500 motherboard, possibly a GPU coprocessor like a Tesla C2075 (the CPU score was higher than two 2687W's combined), an LSI Logic RAID controller, and an unknown number of SSDs plus mechanical drives. I would say that system was at least $10-11,000, so the Quadro 6000 could represent a bit over 33-40%. Overall, a 50% rule for the GPU would seem to make it very difficult to achieve a balanced specification without serious weak links in the chain.

Interestingly, it would seem that gamers, who often express the idea that Quadros are a waste of money, may end up spending as much or more on the GPU proportionally than a workstation buyer, thanks to using multiple cards. Someone who buys three GTX 680's for an SLI configuration will be spending at least $1,200 on the GPUs. That system will probably have an i7-3930K or 3960X, the latter about $500 more expensive than a Xeon E5-1650, a $400 Rampage motherboard, and so on. A high end gaming system can be as expensive as a workstation that can earn $150 per hour, pay for itself in a month or two, and be useful for another four or five years.
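The pays-for-itself claim is easy to sanity-check. A sketch with assumed utilization (the $150/hour figure is from above; the billable-hours number is my own assumption for illustration):

```python
def payback_months(system_cost, hourly_rate, billable_hours_per_month):
    """Months until a revenue-earning workstation covers its own cost."""
    return system_cost / (hourly_rate * billable_hours_per_month)

# A $3,200 workstation billed at $150/hour for a modest 20
# billable hours a month pays for itself in about a month.
months = payback_months(3200, 150, 20)
```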

The Quadro K5000 is so good (look at how high the prices are even used) that it's worth really trying to fit it into a system budget, but I think it's not feasible in the $3,000 range.

Cheers,

BambiBoom


June 19, 2013 10:55:01 AM

bambiboom said:
As much as I appreciate the K5000, I'm puzzled by the idea / policy of, "Whatever the budget is for the workstation, half should be used towards a graphics card." [...] The Quadro K5000 is so good, it's worth really trying to fit into a system budget, but I think it's not feasible in the $3,000 range.



he can ditch Xeon and go Sandy Bridge-E. something like the 3930K would give him everything he needs and fit a K5000 into the budget. the difference between the enthusiast chips from Intel and server-grade chips (the Xeon series) isn't nearly as big for 3D CAD applications as, say, the difference between a gaming GTX 690 and a Quadro K5000.

June 19, 2013 11:26:08 AM

vmem said:
he can ditch Xeon and go Sandy Bridge-E. something like the 3930K would give him everything he needs and fit a K5000 into the budget. the difference between the enthusiast chips from Intel and server-grade chips (the Xeon series) isn't nearly as big for 3D CAD applications as, say, the difference between a gaming GTX 690 and a Quadro K5000.

vmem,

As I mentioned in my previous post, even with reasonable savings, a K5000 system whose other components have performance proportional to that GPU is still in the $3,800+ range. As an i7-3930K is $570 and the Xeon E5-1650 is $585, that swap saves only $15 of the $800-900 over budget. While it would be terrific if our friend Fred Grivas could use the K5000, $800 is almost 30% over budget.

Although your statement that "the difference between the enthusiast chips from Intel and server-grade chips (the Xeon series) isn't nearly as big for 3D CAD applications as, say, the difference between a gaming GTX 690 and a Quadro K5000" is correct to a degree, it does not address the role of the CPU in a systemic context, nor the fact that a K5000 just isn't feasible in a $3,000 budget. Without ECC error-correcting RAM (and therefore a Xeon), many of the important image quality benefits of the K5000 would be lost, mostly through a substantial increase in artifacting, incorrect shadows, and crude color gradients and transparency fields. The Xeon > ECC > Quadro chain is also essential for 10-bit color and multiple lighting sources in 3D CAD modeling / rendering. As well, Xeons offer multiple-CPU configurations: there are E7 10-core Xeons that can be used on an eight-CPU board with 4,096GB of RAM. That's 80 cores / 160 threads, more PCIe lanes than can be imagined, $38,000 for CPUs, a $3,000 motherboard, and about $11,000 in 32GB RAM modules. Yes, there are advantages to Xeons.

It's a sad (and expensive) story, but after twenty years of trying to save money and do workarounds, the CAD workstation world means always having to say (and pay) "Xeon , ECC, Quadro".

Computers- they're not just for shooting aliens anymore.

Cheers,

BambiBoom
June 19, 2013 11:37:20 AM

@bambiboom

good points, and I really appreciate you sharing your experiences and wisdom... I have learned a lot from your posts. and yeah, at the end of the day I think a K4000-based system or something similar would be a better fit for a $3,000-per-unit budget.

Intel did their homework when they crippled ECC support on the i7-3930K... it would've been such a great deal for the all-purpose home computer otherwise...
June 19, 2013 12:47:17 PM

vmem,

Yes, I agree completely about the practice of deleting certain features already present on a chip to protect the profits on the "business priced" one. There have been Xeon CPUs that have had the multiple-CPU bridge deleted, so the multiple-CPU version of the same chip, at a much higher price, has to be purchased.

This works with GeForces and Quadros too. I saw a thread on an overclockers' forum where someone had taken a Titan ($1,000) and, by deleting two resistors (a brave thing to do), made the Titan believe it was either a Quadro K5000 ($1,800), so it would run the Quadro drivers, or a Tesla K20 ($3,300), a case where less is more. I was thinking that someone could make a fortune converting GTX's to Quadros. "Two taps of the soldering iron in the right places saves up to $2,300" would look very good in the brochure.

I see in the system notes on your post that you're using an i5-2500K. I keep noticing these high on Passmark baselines, making amazing numbers. You made a very wise choice. There's an i5-2500K / Maximus V / 16GB / GTX 780 system with a rating of 5900 > CPU = 9567, 2D = 1071, 3D = 9355. The only 3D scores I've seen higher are Titans on an i7-3970X scoring in the 11000's. I was interested to see also that the top i5-2500K / GTX 680 system has 2D / 3D scores of 879 and 5230, still extremely good. It may be the way Passmark weights the tests, but the new Titan and GTX 700 series seem to be walking away with the prizes. The fact that the GTX 780 can operate at that level shows the i5-2500K has great potential: it's not a bottleneck. The 2500K also seems to have an affinity for one of my favorite GTX's, the 580.

Cheers,

BambiBoom
June 19, 2013 6:45:57 PM

Bambiboom

Wow, I never knew that you could convert a Titan to a K5000 or Tesla K20... that guy could make a fortune ($200 a pop should be easy to make)

Thanks for the compliment on my i5! I personally got it because it was "good enough" for my daily use and gaming, but one of the reasons i5-2500K chips score so high on benchmarks is that the hyper-threading on an i7 will often limit overclocking headroom, or sometimes even decrease scores on some benchmarks. most pro overclockers will disable hyper-threading on an i7.

I think the i5 will hold a special place in history as a landmark chip of the quad-core / 8-thread CPU era. Given its huge improvement over the Nehalem generation, and the small progress that has come since, Intel has yet to release a chip that can truly say it is faster than Sandy Bridge (overclocked Sandy Bridge chips still top the charts in many benchmarks and will not bottleneck any everyday application). Personally, I am looking forward to Haswell-E at the end of 2014 to see whether it will end the reign of the i5-2500K.

If desktop applications continue to improve, eventually we'll make better use of the i7's hyper-threading and be able to utilize more cores / threads. Between the rumored 8-core Haswell-E and the arrival of DDR4, we may enter a new age of desktop computing. The current trend has shown that desktop improvements no longer interest the mainstream consumer, so aside from making huge profits in the expanding server market, I think companies like Nvidia and Intel will push hard for software engineers to "invent" new applications that utilize what a desktop is capable of, in order to push their products; more cores / threads, more data, and more automation may well be in the very near future.

Cheers!