Bunch of X1800 reviews show up

Well, a few reviews are showing up, some XL, some claiming to be XTs.

<A HREF="http://www.hardocp.com/article.html?art=ODIy" target="_new">http://www.hardocp.com/article.html?art=ODIy</A>
<A HREF="http://www.beyond3d.com/reviews/ati/r520/" target="_new">http://www.beyond3d.com/reviews/ati/r520/</A>
<A HREF="http://techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=1" target="_new">http://techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=1</A>
<A HREF="http://www.hexus.net/content/item.php?item=3603" target="_new">http://www.hexus.net/content/item.php?item=3603</A>
<A HREF="http://www.guru3d.com/article/Videocards/262/" target="_new">http://www.guru3d.com/article/Videocards/262/</A>
<A HREF="http://www.extremetech.com/article2/0,1697,1867116,00.asp" target="_new">http://www.extremetech.com/article2/0,1697,1867116,00.asp</A>
<A HREF="http://www.firingsquad.com/hardware/ati_radeon_x1800_xt_xl/" target="_new">http://www.firingsquad.com/hardware/ati_radeon_x1800_xt_xl/</A>
<A HREF="http://www.driverheaven.net/reviews/r520reviewxvxv/" target="_new">http://www.driverheaven.net/reviews/r520reviewxvxv/</A>

Nothing too impressive yet (I don't know how parity is supposed to help make up for lost ground). It'll be interesting to see more in-depth tests at the highest settings.

COD2 results look good though, and some of the DH stuff looks better, especially when you take into account the Min FPS info.

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 

cleeve

Illustrious
Yep, barely performance parity and still no availability of the high-end part for at least a month.

Still, parity is better than flat-out losing the performance contest, which is the state Ati was in.

Then again, the Radeon X1K series has some really nifty AVIVO features for video enthusiasts, enhanced AA (that works with HDR, unlike the 7800 series!!!), and a GPU designed to act as a general-purpose math GPU (when an appropriate API comes out)...

It looks like the X1K series will compete based on its forward-looking features, but not on price, availability, or performance... right now, anyway.

Having said that, given a choice between an X1300 Pro or 6600GT, those features sure do tip the scales. And it looks like the X1600 can go toe-to-toe with the 12-pipe 6800. So given the choice, which of those cards would you pick? If AVIVO turns out to be as nifty as it's rumored to be, I can see a lot of people opting for a Radeon X1K rather than an Nvidia counterpart, assuming game performance is equal. If these 90nm Ati chips are as cheap to produce as they should be, Ati's margins will be higher than Nvidia's in these important low- and mid-range segments.

And wow, what those 16 pipelines can do! If Ati could release a 24- or 32-pipe card with this technology in a timely fashion, Nvidia would be hurting. But once more, that's nothing more than a rumored pipe-dream (although crashman seems to know a little more about it than he's allowed to let on).

So all in all, a mediocre arrival for the X1K cards, but they may have more up their sleeves in the long run than the performance numbers show.

But it certainly could have gone worse for Nvidia, and kudos to them for being out of the gate with 24-pipe cards available at launch.

________________
<b>Geforce <font color=red>6800 Ultra</b></font color=red>
<b>AthlonXP <font color=red>~3300+</b></font color=red> <i>(Barton 2500+ o/c 412 FSB @ 2266 Mhz)</i>
<b>3dMark05: <font color=red>5,275</b></font color=red>
 
Well, it (the XL) will 'compete' on performance, but not win flat out like we expected of whatever part they launched. PRICE/performance is always a tough thing to compare just at launch; they may have a competing product in the XL if the margins are good.

I agree, though, that this is a 'good base to build upon', where they brought all their technology transitions to market without providing an 'FX-like' part. But like I said months ago, they have to release a better part if they want to make up for lost time; however, if they simply want to forgo that and just 'compete', then these products are good for parity, and they only need to get price down to parity too.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 

knownalien

Distinguished
Jan 23, 2003
371
0
18,780
and a GPU designed to act as a general-purpose math GPU (when an appropriate API comes out)...

That might be the most interesting thing. Something for the future is always nice. But I am still curious how it could be implemented and under what circumstances it is needed. Could it aid the CPU? Could it do physics calculations during gameplay? But what about doing the other stuff a GPU is supposed to do during gameplay? Is there really THAT much headroom in the GPU?? Almost makes it sound like it's a dual core.

K8T NeoFIS2R
Athlon 64bit 3400
2X1024 OCZ DDR400
Maxtor 40, 120
Western Digital Raptor 74 Gig
ATI Radeon 9700 Pro
NEC LCD Monitor 1760NX
Antec Tru Power 550
Windows XP
 

cleeve

Illustrious
Yeah... actually, there have been hard-core scientists using GPUs as secondary calculation devices for a while. Which makes me wonder, are the Nvidia cards really limited in this respect? Once an API comes out, is it going to care what videocard is in there? I don't know anything about the tech except that it's there and Ati claims to have optimized for it.
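To make that a bit more concrete, here's a toy sketch in plain Python (this is not any real GPU API; the particle count and the little physics step are made-up placeholders). The point is just that "using the GPU for math" means running the same small function over a huge batch of independent elements, which is exactly what a physics or scientific workload can look like:

# Toy illustration of the "GPU as a math engine" idea in plain Python/NumPy.
# This is NOT a real GPU API; the particle count and the physics are made up.
import numpy as np

def physics_step(pos, vel, dt=0.01, gravity=-9.8):
    # The same tiny bit of math is applied to every particle independently,
    # which is exactly the kind of work a GPU can spread across its pipelines.
    vel = vel + np.array([0.0, 0.0, gravity]) * dt
    pos = pos + vel * dt
    return pos, vel

# 100,000 independent particles: the sort of batch size where a GPU shines
pos = np.random.rand(100000, 3)
vel = np.zeros((100000, 3))

for frame in range(10):        # ten simulated frames
    pos, vel = physics_step(pos, vel)

print(pos[:3])                 # a peek at the first few particle positions

On a GPU, each of those per-particle updates would run in parallel across its pipelines rather than being churned through by the CPU, which is why people keep asking whether there's headroom to spare for physics on top of rendering.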

The most interesting thing I thought when I learned that bit of info was... hmmm... I wonder if you could use a crossfire mainboard with, say, an X1800 and X1300, and use the X1800 exclusively for video and the X1300 exclusively as a physics board...

If that's doable in the foreseeable future, the Ageia guys are so screwed. It'll pit them up against the giants way before they're ready to compete with them.

________________
<b>Geforce <font color=red>6800 Ultra</b></font color=red>
<b>AthlonXP <font color=red>~3300+</b></font color=red> <i>(Barton 2500+ o/c 412 FSB @ 2266 Mhz)</i>
<b>3dMark05: <font color=red>5,275</b></font color=red>
 

knownalien

Distinguished
Jan 23, 2003
371
0
18,780
But with the two-card solution, one of the downsides THG mentioned was that the recent crop of CF setups had both cards doing ALL the calculations instead of splitting them up; it's as if both cards were computing 5 + 5, but one card displayed the "1" and the other the "0" of the result, 10. They would have to fix that first, making the two cards more independent (ironically), before they could even begin to tackle your idea. BUT, they may have already done this! We'll find out when the next wave of CF masters arrives.

K8T NeoFIS2R
Athlon 64bit 3400
2X1024 OCZ DDR400
Maxtor 40, 120
Western Digital Raptor 74 Gig
ATI Radeon 9700 Pro
NEC LCD Monitor 1760NX
Antec Tru Power 550
Windows XP
 

mpasternak

Distinguished
Apr 27, 2005
533
0
18,980
I think the biggest note about the X1xxx's is this:

Go buy one :p You can't. Paper launch. Nothing in stores yet. I live in ATI's hometown; I see their building from my bedroom.

NO RETAILERS HAVE ANY CARDS HERE.
 

knownalien

Distinguished
Jan 23, 2003
371
0
18,780
I see their building from my bedroom

You should go dig through their dumpster, see what you can find, then report back to us! lol

K8T NeoFIS2R
Athlon 64bit 3400
2X1024 OCZ DDR400
Maxtor 40, 120
Western Digital Raptor 74 Gig
ATI Radeon 9700 Pro
NEC LCD Monitor 1760NX
Antec Tru Power 550
Windows XP
 

pauldh

Illustrious
Thx for all the review links. I started reading Wavey's review but couldn't give it full attention with 3 kids in the room. I was re-reading sentences constantly. So I went on to less technical reads.

I think this story, as usual, will take some time to draw any real conclusions from. The X1800XL and 7800GT (with the new betas) seem to be pretty equal. The X1800XT seems to be a little above the 7800GTX even with the betas, but with only a few big victories. It was good for NV to get those betas into reviewers' hands, as without them they are far behind. I'd like to see the new betas for NV given a good look over, as well as retail versions of ATI's cards with their supplied drivers getting dug into.

COD2 with the 512MB XT did as expected, demolishing the 256MB 7800GTX. Maybe NV will have 512MB 7800GTXs available by the time the X1800XT can be purchased.

I guess overall I am left with a similar taste as you. NV's betas took the sting out of this launch, and at best it looks like a slight ATI performance lead at the very top end. Pricing will determine a lot, as will more AA-with-HDR tests, to see if it's a definite usable feature advantage over the GF7s. NV's drivers are probably tweaked about as much as they can be; we shall see if ATI can squeeze out more in the near future. It should be interesting to see if game developers can take advantage of ATI's architecture. Will the 7800GTX keep up with games released a year from now, or will the ATIs shine?



<A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
 

pickxx

Distinguished
Apr 20, 2004
3,262
0
20,780
I noticed that there were no overclocks in the reviews I read. I am curious to see what headroom there is. The 1800XL seems to me to be massively underclocked. Doesn't it use the same RAM as the 1800XT? Because the XT is 1500MHz and the XL is 1000MHz. I guess there could be something I am missing; that's why I wanted a reviewer to let me know.

Do you have any links where they try to OC any of the cards?

__________________________________________
Chaintech VNF3-250/A64 2800+/1GB(512x2) OCZ VX GOLD 2-2-2-5/BFG 6800GT/Thermaltake 420W/WD 200GB/Maxtor 300GB
 

Gamer_369

Distinguished
May 29, 2005
183
0
18,680
Ok, okay...
I've tried reading as many reviews as I could, but it can be a bit strenuous. I hope I gained enough knowledge to discuss this, but I won't rule out the possibility that I'm mistaken.

First off, I'm disappointed with ATI. I was led to believe that the high-end cards would be here at launch, but it turns out it'll be a month before we start to see the X1800 XTs. I guess in the meantime we can make do with an X1800 XL, but that's still not even out yet. (Should be this week, according to Anand.)

Now, aside from all the consumer aspects and availability, I have come to the conclusion that the R520 cards have an edge over the 7800s, technologically speaking.

At first, it appears like all ATI has done is bring features to their cards that nVidia already had with the Geforce 6 and 7 series. *However*, it also appears, in my opinion, that ATI was able to make such features much more efficient... such as SM3 with the threaded core architecture, as well as image quality with the Adaptive AA, HDR + AA, and the high-quality AF. Also the memory bandwidth with the ring bus.

This was a new architecture, whereas the 7800 was largely based on the Geforce 6 series cards.

Although another way to look at it is that the NV40 design must have been pretty darn good to be able to compete with this R520, and do so quite nicely.

Performance-wise, it seems Sanders may have been accurate with his pre-release benchmarks, as he has already pointed out in his latest review.

Price/performance-wise, I would still suggest the 7800 series to consumers (for now, anyway, based on the benchmarks of the X1800XL against the 7800 GT).
 

Action_Man

Splendid
Jan 7, 2004
3,857
0
22,780
Performance-wise, it seems Sanders may have been accurate with his pre-release benchmarks, as he has already pointed out in his latest review.

Uh no.

Some people are like slinkies....
Not really good for anything, but you can't help but smile when you see one tumble down the stairs.
 
Even if the performance were close to what Sanders said (which it isn't, IMO), it's almost irrelevant, as he didn't test/benchmark what he claimed he was testing.
As for his results actually being near the ones we see now (look at those benchies again and consider what 'close' means in comparison; if an X700PRO and R9800P are not close, then Sanders was definitely not close), I'd say if anything he'd be 'close' to XL performance, not XT, just like all the insiders said at the time of his review.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 

knownalien

Distinguished
Jan 23, 2003
371
0
18,780
The 90nm G72 will have a much smaller size than the 0.11-micron based G70 allowing for multiple GPUs to be utilized on one graphics card through SLI technology, providing an effective and attractive alternative to those who do not want to fork out money for two graphics cards, the sources claimed. <A HREF="http://www.digitimes.com/news/a20051005A7033.html" target="_new">link</A>

then read:

Right now, NVIDIA does not support four GPUs for SLI, as the firm presently does not intend to go in that direction. <A HREF="http://www.tomshardware.com/motherboard/20051004/one_gigabyte_motherboard_four_graphics_cards-04.html#running_four_pci_express_graphics_cards" target="_new">link</A>


To me it seems that Nvidia knows users will buy more than one card. So now they have a process whereby they intend to put "SLI" on one card. I think it is only a matter of time before they give us a solution which is still two cards, BUT each card has two GPUs on it. Could this be Nvidia's attempt to combat ATI's "general-purpose math GPU (when an appropriate API comes out)..."????

K8T NeoFIS2R
Athlon 64bit 3400
2X1024 OCZ DDR400
Maxtor 40, 120
Western Digital Raptor 74 Gig
ATI Radeon 9700 Pro
NEC LCD Monitor 1760NX
Antec Tru Power 550
Windows XP
 

addiarmadar

Distinguished
May 26, 2003
2,558
0
20,780
I noticed that there were no overclocks in the reviews I read. I am curious to see what headroom there is. The 1800XL seems to me to be massively underclocked. Doesn't it use the same RAM as the 1800XT? Because the XT is 1500MHz and the XL is 1000MHz. I guess there could be something I am missing; that's why I wanted a reviewer to let me know.

If the 1800s are on par with the 7800s, then there is no need to overclock them right now, since you won't see the difference. I also wonder if they clock-step them as well....

<i><font color=red>Only an overclocker can make a computer into a convectional oven.</i></font color=red>
 

pauldh

Illustrious
No "need" to OC is far from no "desire" to OC. :wink:

It will take some time, but as soon as retail versions come out we will see how they OC.

<A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
 
You do realize that they already have 2 GPUs on one card, like Gigabyte's 3D1; however, what this means is that you might finally be able to SLI those cards, which you can't right now due to core limits. It's not so much that nV knows people want it; they just don't want to be limited in the future the way they are now.

Could this be Nvidia's attempt to combat ATI's "general-purpose math GPU (when an appropriate API comes out)..."????
No, this would be to equal the ability ATi has to run more than 2 VPUs simultaneously. I doubt either will implement it immediately, but the possibility to do so without limitation lets them have a window into the future should they need it.

BTW, expect that by this time next year nV and ATi will start talking more about multi-core VPUs as well as flexible designs.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 

pauldh

Illustrious
We shall see how retail versions do, but <A HREF="http://www.driverheaven.net/reviews/r520reviewxvxv/overclocking.htm" target="_new">Driverheaven</A> found their X1800XT completely stable at 685/1790. Here's the kicker... that is with stock cooling and stock voltage!!! That's some serious MHz there. What are the OC'ing extremists going to get out of one of these? :eek:

<A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
 

knownalien

Distinguished
Jan 23, 2003
371
0
18,780
ATi will start talking more about multi-core VPUs as well as flexible designs.

My question becomes: in a modern-day system, what is ultimately the bottleneck for the graphics card?

1) still FSB?
2) onboard RAM on the graphics board
3) the GPU
4) the CPU
5) system RAM
6) PCI-e (x8 or x16) bandwidth?

I am curious. I guess put another way, what is the thing that stops graphics cards from running at phenomenal speeds? Heat?

K8T NeoFIS2R
Athlon 64bit 3400
2X1024 OCZ DDR400
Maxtor 40, 120
Western Digital Raptor 74 Gig
ATI Radeon 9700 Pro
NEC LCD Monitor 1760NX
Antec Tru Power 550
Windows XP
 

pickxx

Distinguished
Apr 20, 2004
3,262
0
20,780
Well, yes, heat will stop the card from working at high speeds... but actually what's limiting the cards for the most part is the CPU. If you look at benchmarks at lower resolutions, you will see a 6800GT be equal with the 7800GTX/GT and the 1800XT/XL, because the card can only go so fast when the CPU can't calculate any faster....

Once games start getting truly multithreaded for dual-core and stuff, then games will start to have twice the CPU power and frame rates should jump dramatically.

BUT!!!! The level of programming to do that is much, much harder, and from my understanding (not being a programmer) it would have to be built from the ground up; it can't just be add-on code to make it run well with dual cores.

With higher-end GPUs, even with the bottlenecks, you can increase the resolution, increase AA/AF, and turn beautiful things on... so it has an upside. But the cards are bottlenecked by even the fastest OC'd AMD FX chip...
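Here's a rough way to picture what's going on (a toy model in Python; every number in it is made up, not a benchmark): a frame can only finish when both the CPU work and the GPU work are done, so the slower of the two sets the frame rate, and at low resolution the GPU work shrinks until every card lands on the same CPU ceiling.

# Toy model of CPU vs. GPU limiting. All numbers are hypothetical, not benchmarks.
def fps(cpu_ms, gpu_ms_per_mpix, megapixels):
    # A frame is ready only when BOTH the CPU work and the GPU work are done,
    # so the slower of the two sets the frame time.
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * megapixels)
    return 1000.0 / frame_ms

CPU_MS = 12.0                                            # CPU work per frame (made up)
cards = {"mid-range card": 14.0, "high-end card": 7.0}   # GPU cost per megapixel (made up)

for name, cost in cards.items():
    low  = fps(CPU_MS, cost, 0.8)   # 1024x768 is about 0.8 megapixels
    high = fps(CPU_MS, cost, 2.0)   # 1600x1200 is about 2.0 megapixels
    print(name, round(low), "fps at 1024x768,", round(high), "fps at 1600x1200")

With those made-up numbers, both cards sit at the same 83 fps at 1024x768 because the CPU sets the ceiling, and the faster card only pulls away once the higher resolution pushes the GPU work past the CPU work.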

__________________________________________
Chaintech VNF3-250/A64 2800+/1GB(512x2) OCZ VX GOLD 2-2-2-5/BFG 6800GT/Thermaltake 420W/WD 200GB/Maxtor 300GB
 

knownalien

Distinguished
Jan 23, 2003
371
0
18,780
How in the hell did it come to this?!! CPUs slowing down graphics cards! lol

As for heat, I think the industry needs to move away from fans and to some kind of standard water-cooling method. Fans + dust + noise + hardware = crap. With water cooling, they would be able to ship higher-clocked devices stock.

K8T NeoFIS2R
Athlon 64bit 3400
2X1024 OCZ DDR400
Maxtor 40, 120
Western Digital Raptor 74 Gig
ATI Radeon 9700 Pro
NEC LCD Monitor 1760NX
Antec Tru Power 550
Windows XP
 

pickxx

Distinguished
Apr 20, 2004
3,262
0
20,780
WHAT?!?

Think about this like cars... yes, it would be sweet as hell if all cars were an Enzo... but they aren't. It would be sweet if they all came with a nice Bose system, but they don't. If they all had great power, handling, and amenities... then cars would all be $125,000 and people couldn't afford them.

Most of the time there is a need to create the lowest common product, the Civic/air-cooled card, that can be modified to get more power, but that's an option, not a requirement.

If someone wants a water-cooled GPU from the factory, some places make those... like BFG. If you want a modified car, a WRX STi is for you.

My card isn't loud at all, I have very little dust in my case (they're called filters; look into them), and I think graphics card makers are doing a great job of controlling heat and noise. Some people bitch about dual-slot solutions, but if they keep it cool, quiet, and running... I don't care personally.

__________________________________________
Chaintech VNF3-250/A64 2800+/1GB(512x2) OCZ VX GOLD 2-2-2-5/BFG 6800GT/Thermaltake 420W/WD 200GB/Maxtor 300GB
 

Snorkius

Splendid
Sep 16, 2003
3,659
0
22,780
The idea of using a GPU for math calculations has been around for years, and I think the whole 'optimized' thing is just a bowlful of advertising crap. "Optimize something that does math calculations to do math calculations! Great idea!" Just like putting a "Y2K Ready!" sticker on something.

Writing and implementing the appropriate code is the hard part.

Vainglory be my wicked guide.