Power And Temperature Benchmarks
Let's shift our perspective from games to power usage.
We can see that the Radeon HD 5450 draws a bit less power than the Radeon HD 4550 and slightly less than the GeForce 210, despite the new card's vastly superior gaming performance. We also see that a higher power draw is the price the Radeon HD 4650 demands for its gaming performance. In the big scheme of things, though, a 43 W increase under load isn't bad at all.
All of these temperatures are acceptable, but the new Radeon HD 5450 fares particularly well for a passively-cooled card. It's worth noting that the GeForce 9500 GT here is a Gigabyte model fitted with a beefy aftermarket cooler, which explains its ability to keep load temperatures so low. The GeForce 210 also makes a great showing, but it is the only card in the bottom three that sports an active fan cooler instead of a passive unit.
What's the point of releasing a new graphics card that's worse than older cards? It runs DX11, but there's no way it could even run a supported game.
Not really; look at the specs. In CrossFire, these cards would cost $100 for a total of 160 shader cores. They still wouldn't hold a candle to a single $100 Radeon HD 5670 in gaming, which has 400 shader cores all by itself.
CrossFiring the 5450 would be a total waste.
How do you expect it to handle the increase in temps? Even if you got some good airflow inside the case, that won't be sufficient.
They needed a i7 and 1200W PSU to test this card... :)
Useless... Either get a good card or stick with integrated.