Radeon 2 and 3 info @ Digit-life
noko
May 12, 2001 1:17:09 AM
Quote:
According to our sources ATI roadmap looks like this: <b>R200 (Radeon2) series</b>
. . R200 - 4 pipelines, 250MHz core, .15 micron, DX8.1 support, HydraVision technology support
. . AGP 4X bridge enables dual R200 configurations (MAXX) - samples May, mass production September
. . RV200 - 2 pipelines, 250MHz core, .15 micron, HydraVision - samples May, mass production September
<b>R300 series</b>
. . RL300 - 4 pipelines, .15 micron, DX9 support, HydraVision - early samples Q4'01
. . RV300 - 4 pipelines, 300MHz core, .15 micron, DX9 support, HydraVision - early samples Q4'01
. . R300 - 8 pipelines, 300MHz (or higher) core, .15 micron, DX9 support, HydraVision - early samples Aug'01
<A HREF="http://www.digit-life.com/news.html#989604871" target="_new">http://www.digit-life.com/news.html#989604871</A>
Radeon2 50MHz faster than the current GF3, MAXX-capable, and ready to dump on the market anytime from the get-go. Nvidia had better watch out; ATI is smoking, and soon the fire will blaze at Nvidia's feet. R300? 8 pipes!!!!! :smile: Let me smoke one please.
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/12/01 12:39 PM.</EM></FONT></P>
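To put the rumored specs in perspective, here is a back-of-envelope sketch (my arithmetic, not from the Digit-Life article): theoretical pixel fill rate is just pipelines times core clock, ignoring memory bandwidth, which in practice usually matters more.

```python
# Theoretical pixel fill rate = pipelines x core clock.
# Real-world throughput is almost always lower (memory-bandwidth bound).
def fill_rate_mpixels(pipelines, core_mhz):
    """Peak fill rate in megapixels per second."""
    return pipelines * core_mhz

# Figures straight from the rumored roadmap above.
rumored = {
    "R200":  (4, 250),
    "RV200": (2, 250),
    "RV300": (4, 300),
    "R300":  (8, 300),  # "or higher", per the roadmap
}

for chip, (pipes, mhz) in rumored.items():
    print(f"{chip}: {fill_rate_mpixels(pipes, mhz)} Mpixel/s peak")
```

A dual-R200 MAXX board would nominally double the R200 figure, though the Rage Fury MAXX showed that dual-chip scaling rarely reaches 2x in practice.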
Anonymous
May 12, 2001 3:30:30 AM
HolyGrenade
May 12, 2001 11:50:13 AM
rcf84
May 12, 2001 2:20:33 PM
noko
May 12, 2001 4:31:34 PM
My future predictions: (don't bet on it)
1. Rad2 toasts the GF3 initially; everyone is buying GameCubes as well. :cool:
2. nVidia fights back with the GF3 Ultra, which puts the GF3 at the same clock as the Rad2; the Xbox is released.
3. ATI lashes back with the Rad2 MAXX, which smashes the GF3 Ultra; no one wants the GF3 Ultra or the Xbox :redface:
4. nVidia releases the NV30 and renames it TForce (The Force), but it is not DirectX 9.0 compliant, has buggy drivers, and doesn't work in XP. :frown:
5. ATI releases the Rad3, fully DirectX 9.1 compliant; the GameCube is the hit of the world, every kid wants one, parents remember their first game machine, the Nintendo, and sales skyrocket.
6. ATI releases a second version of the FireGL card using the 8-pipe Rad3 and takes over the professional market from nVidia. :wink:
7. nVidia's stock plummets; the Xbox is cancelled due to poor sales and high production costs, plus too few games intriguing enough to play.
8. ATI buys out nVidia. :tongue:
9. ATI is declared an international monopoly and split up into pieces, at which point 3dfx is resurrected and puts out the Rampage Revenge chip, RR, which kills everyone.
10. nVidia is never heard of or thought about again. :smile:
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/12/01 12:34 PM.</EM></FONT></P>
HolyGrenade
May 12, 2001 5:16:50 PM
Lol! Good one!
Here's my prediction:
1. The Radeon 2 is released. It allows better DVD playback and retains better compressed-texture quality than the GF3, but is not as fast.
2. A new, faster GF3 is released and is DX8.1 compliant. The Xbox is released; everyone loves it, and developers love it for its familiar, easy programming language and architecture, and for not having to pay royalties to develop its games.
3. Graphics cards based on the NV30 are released. It is fully compliant with DX9.1, but very expensive. The Radeon 3 is released, but it's not as fast as the NV30, nor does it compare, given its lacking features. ATI's official stance: it is meant to compete with the GF3 Ultra.
4. nVidia releases a wicked platform for AMD chips.
5. The GameCube is released, but its proprietary game format and DVD incompatibility make it a big no-no for consumers. Developers also dislike having to pay royalties. Result: people start using them as stools.
6. An NV30 workstation is released using nVidia system and graphics architecture; it proves superior to the 3DLabs Wildcat series and is thus accepted in medium-level workstation graphics, while the FireGL remains entry-level 3D.
7. The nVidia president develops a heart and decides enough is enough: we need to lower our graphics card prices.
8. People put their GameCubes away with their Atari Jaguars and their Dreamcasts.
9. The Kyro 3 is finally released, and they realise everyone else has programmable shaders while they only have fixed T&L.
10. I acquire nVidia in a poker game, start taking over all the other technology companies, including AMD, Intel, ATI, IBM, Sun, Apple, Adobe, Macromedia & AOL Time Warner, and decide to launch my Global Domination Initiative.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
noko
May 12, 2001 6:25:53 PM
DarkPhire
May 12, 2001 6:31:36 PM
I am very curious to see how the Xbox sells. From what I've heard, Microsoft has done something right for once (along with Windows 2000). Could Microsoft be turning soft? Making things that work, finally? lol.
In any case, I don't think nVidia is gonna die anytime soon, unless for some weird reason they pull a 3dfx.
I WANT THE 3dfx GLIDE API BACK!!!!
--DarkPhire
rcf84
May 12, 2001 7:02:29 PM
Makaveli
May 12, 2001 9:34:59 PM
I'm looking forward to the ATI Radeon2; I will sell my current 64MB Radeon for it. As for performance and FPS,
I think it will be right up there with the GeForce 3, and the driver support will be much improved, as it already is.
As for image quality, it will still own Nvidia. But right now I like both companies, because they will only lower prices for us, the customers. As for Kyro, good luck to them, but I still see their card as a low-budget card.
What's most important to me when buying a new video card:
1. Price
2. Image Quality
3. Features
4. Performance
5. Compatibility
In that order!
rcf84
May 13, 2001 12:00:09 AM
Anonymous
a
b
U
Graphics card
May 13, 2001 12:09:45 AM
Radeon will improve their market share!
30-40% in 2002.
I think that Kyro will take the place left open by 3dfx...
Kyro 2/3/etc. winning 30-40% in 2002.
nVidia will enjoy last place, 20-30% in 2002.
Nvidia thinks that killing the competition (like 3dfx) is feasible!
No it's not!
It will only leave space open for others to occupy!!!
LOL
noko
May 13, 2001 12:32:25 AM
It was a dark period of time when the evil empire ruled through the Emperor Microsoft and Darth Nvidia. A small but well-trained and courageous band of ATI rebels led the assault to end the tyranny. The battle is fierce; the empire was surprised at the speed and accuracy of the attack. The Emperor issues an order to Lord Nvidia: "Get the X-Box ready, wipe them out... all of them." An evil smile is felt, with a cold chill sucking inside of you. Come look at this last attempt at true freedom, freedom of your mind and of course your darn video card... Wait a minute, the video cards are dead; it has to be shown using the GameCube: :frown:
<A HREF="http://cubemedia.ign.com/media/previews/image/roguelead..." target="_new">Die you Rebel scum!</A>
<A HREF="http://cubemedia.ign.com/media/previews/image/roguelead..." target="_new">Eat this . . . .</A>
<A HREF="http://www.tendogamers.com/screenshots/gcn/rogue/rs2-8...." target="_new">Mean while getting the Xbox ready . . .</A>
Well, while the X-Box is getting ready, compare these images between the GameCube and the movie stills. See how the prophecy is being fulfilled.
<A HREF="http://cubemedia.ign.com/media/previews/image/rogue2/ro..." target="_new">Uncanny realism</A>
<A HREF="http://cubemedia.ign.com/media/previews/image/rogue2/ro..." target="_new">Which is which?</A>
X-box is ready to shoot, oh no!!!!
<A HREF="http://xboxmedia.ign.com/media/previews/image/outlawgol..." target="_new">Which ball?</A>
<A HREF="http://xboxmedia.ign.com/media/previews/image/outlawgol..." target="_new">Your club is broken . . .</A>
Oh well the X-box missed and the rebels won. <b>HURRAY!!!!!!!!</b>
:smile: :smile: :smile: :smile: :smile:
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/12/01 08:57 PM.</EM></FONT></P>
HolyGrenade
May 13, 2001 12:40:13 AM
What's the basis of this prediction, and which market share are you talking about? ...desktop, entry-level gaming, performance gaming, low-end workstation, high-end workstation, mobile?
Current Market Status:
Desktop Graphics:
Mostly nVidia TNT2, followed by the Matrox Millennium G400 and ATI Rage Pro.
Entry-Level Gaming:
nVidia with its GeForce 1 and GeForce 2 MX. Then comes 3dfx, talking from the grave with its extensive range of cards from the Voodoo 3 and 4. ATI is quite far behind with its Radeon LE and VE. The Kyro 1 has a tiny amount of market share; Matrox has a bigger share than Imagination Technologies' Kyro.
Medium-Range Gaming:
nVidia again has the major share with the GeForce 2 GTS and GeForce 2 Pro. ATI has its Radeon 64 but only manages to capture a much smaller share. 3dfx also tastes a little of this sector with its Voodoo 5 range. The Kyro 2 is not released yet.
Performance Gaming:
Sales in this sector are much smaller than in the others, but the cost is higher. I don't think anything is competing with the GeForce 2 Ultra for this space.
Entry-Level Workstation:
ATI haven't had the FireGL long enough to claim profit in the low end. nVidia are again ruling this section with their Quadro 1 & 2 range. Oxygen from 3DLabs can do a bit as well.
Medium-Level Workstation:
The newer FireGLs come in this sector. But this section is ruled by 3DLabs with its Wildcat range.
High-End Mobile:
ATI rule this sector, but nVidia is starting to catch up.
Low-End Mobile:
I suppose people like neographics have a big chunk of the market share here.
See, nVidia is like a rolling stone now; it has huge momentum. It has a brand recognised and liked by OEMs. ATI used to be the OEM favourite, but they lost a lot of their customers, who now favour nVidia.
IMGTEC will have to work hard to get market share. I don't think they have a product to revolutionise the market and earn them a huge chunk of any sector to give them a market lead. Nor do they have a broad enough range of products to earn enough from each sector for an overall market lead. I don't think they'll cut it just yet. They'll just have to take third place in the entry-to-medium gaming sectors. Overall they'll be far behind the likes of nVidia, ATI and Matrox.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
HolyGrenade
May 13, 2001 12:49:22 AM
noko
May 13, 2001 12:52:40 AM
HolyGrenade
May 13, 2001 2:18:18 AM
I checked out IGN's comparison between the three main consoles. The red-ink bit on Microsoft's official comparison sheet was kinda funny, but the article was stupid. It was more like a bunch of guys ranting over other people having it better than them.
Their stance was that Microsoft and Sony gave theoretical figures, and I agree with that. But they were saying Nintendo was giving in-game figures and being extremely conservative. I disagree with that; no company will try to shoot themselves in the foot like that. The only reason a company will go out of routine and give realistic figures is that their theoretical numbers are considerably lower than the competition's.
IGN were also saying that the GameCube is "capable" of 256 audio channels but Nintendo are just saying it can do 64. That just doesn't make sense. In answer to the Xbox's 3D audio capabilities, IGN say the GameCube supports Dolby Pro Logic. No wonder Nintendo were keeping this quiet: Dolby Pro Logic is not at all suitable for gaming audio. It is a static surround algorithm. DirectSound3D, on the other hand, is dynamically processed 3D audio. That is what you need for games.
I did like the huge 12.8GB/s bandwidth of the GameCube. Personally I think the GameCube is just a gaming console for kids. The Xbox, in comparison, is an entertainment system you can place in the living room. It could have been made to look better, though. You cannot play DVDs on the GameCube. Also, coming out 6 months later than the Xbox, you would like it to have superior hardware.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
noko
May 13, 2001 4:07:29 AM
noko
May 13, 2001 4:27:07 AM
Quote:
<b>When is the Xbox coming out?</b> Fall of 2001 is the estimated release date from Microsoft at this point. As for exact dates, we've heard rumors saying that October is the launch month, and that October 22nd might just be the launch day. Other reports have suggested a mid-November launch. As for an actual date, Microsoft is expected to make an announcement at E3.
<b>How much is the Xbox going to cost?</b>
From the looks of things, the Xbox will release at $299 US. Not only have Microsoft officials commented that the system will come out at a price indicative of the current market, citing $300 as the current benchmark, but several online retailers have hinted that the 'Box will launch at $299 or less. Microsoft is expected to make an official announcement regarding the price in a matter of weeks, but until then, you'd be wise to plan on putting away about three bills for Bill, if you know what we mean.
<b>How many games will be available at launch?</b>
While no specific titles have been confirmed for the actual day and date of the launch, Microsoft has stated they're seeking to have somewhere between <b>12-20 titles</b> on store shelves the day the Xbox arrives, with upwards of 30+ titles in stores by the end of 2001.
http://xbox.ign.com/
What a joke, 12-20 titles for the X-box. No real confirmed date, and $299!
Quote:
Although an exact release date for NINTENDO GAMECUBE has not been announced, the current plan is to launch the system in Japan in July 2001 and in North America in October 2001. Keep checking Nintendo.com for updates as they become available!
http://www.nintendo.com/gamecube/index.html
I bet the price of the GameCube will be half the cost of the X-box, plus 3-6 times more games; no confirmed date either.
Anonymous
May 13, 2001 5:33:15 AM
My bro told me that they either had to add more RAM or the RAM cost was more than expected (can't remember which), and that the price of the GameCube would be more than what they wanted to put it out at. Anyone else hear anything about this?
One good thing about the Xbox is that its games can be easily converted to the computer. If the GameCube follows anything like its predecessor, it won't. Most if not all games for the N64 have bombed coming to the computer, because they were outdated by the time they made it or just plain didn't cut it. Nintendo might not want the games going to PC either, because then they might not get full royalties for the games, but I have no idea whether they would or not.
All in all, we've got some interesting times ahead.
OzzieBloke
May 13, 2001 6:16:17 AM
mpjesse
May 13, 2001 8:26:13 AM
If anyone should know better, it's ATI, with their unintelligent "MAXX" design. The Rage Fury MAXX is a perfect example: it performed almost no better than a single Rage. 3dfx learned its lesson on dual processors, but too late; they ended up being sold off. ATI isn't doing so well these days. They lost Apple, and they'll lose the mobile market soon too.
They need to learn that doubling up on GPUs is a half-assed way of doing things. nVidia has never done it (with the exception of their Quadro line, which is for high-end stuff), and look how much better each of their new GPUs performs.
The Radeon 3 will flop.
-MP Jesse
"Signatures Still Suck"
Anonymous
May 13, 2001 10:18:06 AM
HolyGrenade
May 13, 2001 11:09:48 AM
On "Ex Machina" on .tv they said the GameCube will suffer huge delays in Japan, which may carry over to the US and Europe. So you'd be lucky to see any GameCubes before 2002 in the US.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
HolyGrenade
May 13, 2001 11:12:38 AM
Negaverse23
May 13, 2001 3:00:02 PM
noko
May 13, 2001 8:51:59 PM
Negaverse23
May 13, 2001 10:03:30 PM
HolyGrenade
May 13, 2001 10:10:56 PM
Something like that. But every console manufacturer loses money in the first year or so of console sales. They make their money back by taking a cut of the publisher's profit on each game sale. Also, they charge developers to use their proprietary technology for programming the games. Microsoft, however, will not be charging developers for developing for the Xbox. After all, it is still only C++ using the DirectX API and some assembler optimisations.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
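The "lose money on the box, recoup through royalties" model described above can be sketched with toy numbers (all figures invented purely for illustration; they are not real Microsoft or Nintendo economics):

```python
import math

# "Razor and blades": sell the console at a loss, then recoup the loss
# through a per-game royalty taken from each game sold to that owner.
def games_to_break_even(loss_per_console, royalty_per_game):
    """How many games each owner must buy before the maker breaks even."""
    return math.ceil(loss_per_console / royalty_per_game)

# Hypothetical: a $100 loss per box and a $7 royalty per game sold.
print(games_to_break_even(100, 7))  # 15
```

This is also why waiving developer fees, as the post says Microsoft plans to, only pays off if it attracts enough extra titles to drive those per-owner game sales up.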
noko
May 14, 2001 3:05:55 AM
That's why I don't understand how Microsoft is going to make money on it, unless they are depending on their own games to pull them through. Here is something that may interest you; looks like the GameCube is more ready than expected:
<A HREF="http://www.msnbc.com/news/571518.asp#BODY" target="_new">http://www.msnbc.com/news/571518.asp#BODY</A>
HolyGrenade
May 14, 2001 9:09:35 AM
rcf84
May 14, 2001 3:30:10 PM
HolyGrenade
May 14, 2001 3:40:21 PM
noko
May 14, 2001 4:16:40 PM
Looks like ATI is lining up some heavy artillery, doesn't it? ATI has more than just 3D graphics that will hit the market. The next All-In-Wonder will probably have picture-in-picture as well as digital video effects: video where you can do unique transitions between two video streams. The original Radeon could take two video streams and wrap one onto a 3D object like a sphere or plane (but not limited to that) and animate it into the background while bringing the other into the foreground. So doing videos on your computer will be even easier, while being able to do real-time effects.
The original Radeon could output to an HDTV set directly, except ATI decided not to incorporate that capability in the Radeon drivers due to its inability to do the interlaced modes on direct output. The Radeon2 will be able to do both progressive and interlaced modes on direct output. What does that mean? It means that you will be able to hook up any HDTV to the Radeon and not only watch outstanding DVD but play your favorite games on your 45"-65" HDTV!! Imagine playing Quake III, Serious Sam, Unreal 2 or Max Payne at 1920x1080, or in any of the other 18 HDTV resolution formats, sitting in your recliner! How about using your HDTV as your monitor? You will be able to do it. Just think of having a high-resolution 65" monitor where you can browse the internet or do work!! Doing video editing on an HDTV set would be very easy. The potential is awesome.
ATI has been working on an HDTV tuner as well, so maybe that will also be incorporated into the All-In-Wonder. If you look at ATI's progress in video, you see their intent to bring the two worlds together. Nvidia's lack of progress in video will hurt them as time goes on. Like most chips, the Radeon is a work in progress. The Radeon2 will use more of those capabilities, and the future looks very bright for ATI. The product line will be untouched by anyone. Bye bye Nvidia. Just kidding :smile: .
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 12:33 PM.</EM></FONT></P>
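For a sense of what gaming at that HDTV resolution would actually demand of a card, here is a quick back-of-envelope comparison (my arithmetic, not ATI's) of per-frame pixel counts:

```python
# Pixels per frame at 1080-line HDTV vs a common desktop gaming mode.
def pixels_per_frame(width, height):
    return width * height

hdtv = pixels_per_frame(1920, 1080)    # 2,073,600 pixels
desktop = pixels_per_frame(1024, 768)  # 786,432 pixels

# Roughly 2.6x the fill-rate and framebuffer work per frame:
print(round(hdtv / desktop, 2))
```

So a card that manages 60 FPS at 1024x768 would, all else being equal, be pushed down toward the low 20s at 1920x1080; in practice memory bandwidth makes the hit even worse.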
HolyGrenade
May 14, 2001 4:22:07 PM
noko
May 14, 2001 4:30:46 PM
HDTV prices are dropping and will continue to do so. Plus, for doing editing professionally, you're talking $10,000-plus for a semi-professional setup, just for the equipment, not including the software. HDTVs go for $1000 and up, but prices are dropping presently.
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 11:25 PM.</EM></FONT></P>
HolyGrenade
May 14, 2001 4:36:25 PM
Whoa! In the UK a normal 32" Sony FD Trinitron Wega would put a hole the size of £1500-£2000 in your wallet. (One of the models does, however, have VGA inputs, so you can connect your "Sony VAIO" to the TV.) Also, they're all dual-tuner with channel scan and frame scan and all the other bells and whistles.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
noko
May 14, 2001 5:23:49 PM
That's cool, but not all HDTVs have VGA inputs. In fact, very few have VGA inputs. Hopefully that will change, but that is not the case now. Most of the owners who have HDTVs now do not have VGA inputs. So the Radeon2 will be unique: you won't have to buy another HDTV to use it. So what is your point? Buy only a VGA-input-capable HDTV instead of the one you really want or can afford? Do those HDTVs do editing with digital video effects? The Radeon2 can. The Radeon2 is a new breed of card; it won't just be a fancy fast game card (much faster than that GF3 junk) but a multimedia splendor.
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 01:40 PM.</EM></FONT></P>
HolyGrenade
May 14, 2001 6:35:00 PM
I think you got me wrong there. These are normal PAL TVs, but they do support NTSC and other PAL-variation video. They're not HDTVs. They're normal CRT TVs, though they are perfectly flat, horizontally and vertically.
What I was saying is, they are a bit expensive.
The Radeon 2 would seem great for a personal video-editing suite. I suppose it would be going against Matrox products. What would be great is digital video support using IEEE-1394 connections, either via ports on the card itself, on the motherboard, or on a separate card.
But I'm not sure the Radeon 2 will actually be faster than the GeForce 3. ATI did say the Radeons will be faster than GeForce 2s. Nonetheless, it seems likely to be a good overall card.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
rcf84
May 14, 2001 7:08:22 PM
noko
May 14, 2001 9:27:14 PM
Sorry, I thought we were talking about HDTVs. Here in the States HDTVs are getting more common; still kind of expensive, but the prices are rapidly coming down. Look at the benchmark of Giants below comparing a GF3, Radeon, GF2, MX and TNT. You will be surprised:
<A HREF="http://www.aceshardware.com/articles/reviews/GF3vsK2_Pa..." target="_new">Benchmark GF3/Radeon</A>
A very modern game with a lot of hardware features being used. The GF3 beats the Radeon by only 1 FPS!! on a T-Bird 1.33GHz; <b>on a Duron 850 the Radeon ties the GF3!!!</b>
This may just be a quirk of that one game and might not mean anything, but wait, check this out:
<A HREF="http://www.beyond3d.com/reviews/nvidia_gf3/visiontek/be..." target="_new">Benchmarks of Serious Sam, without any Anti-Aliasing, 64-Tap anisotropic in Blue</A>
Three series of tests were run at Beyond3d.com. Look at 1024x768 32-bit Extreme. Also notice how dramatically GF3 performance is hit when the quality settings are upped. What is interesting to know is that at this setting the GF3 is doing 8:1 anisotropic filtering while the Radeon is doing a higher 16:1 anisotropic filtering, both doing 64-tap (meaning taking 64 samples from the source texture for each pixel being applied to the output mapped texture). My Radeon with a slower T-Bird does 41.8 FPS with the exact same settings. The GF3 in this case is doing Radeon speeds. Yes, the GF3 can produce a lot of FPS with crappy output, but for it to do high-quality output its frame rate is drastically reduced to the level of the Radeon. In this test I did not overclock my stock 182.25MHz speed.
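For anyone wondering how both cards can be "64-tap" at different anisotropy ratios, here's a rough back-of-the-envelope sketch. This is a hypothetical simplified model for illustration, not ATI's or nVidia's actual filtering hardware: the total tap count is just the number of probes taken along the pixel footprint's major axis times the taps each probe costs (4 for a bilinear probe, 8 for a trilinear one).

```python
def aniso_tap_count(max_ratio, taps_per_probe):
    """Total texture samples per output pixel in this simplified model.

    max_ratio      -- anisotropy ratio, i.e. probes along the footprint's
                      major axis (8 for 8:1, 16 for 16:1)
    taps_per_probe -- cost of each probe: 4 (bilinear) or 8 (trilinear)
    """
    return max_ratio * taps_per_probe

# 8:1 anisotropy with trilinear probes: 8 probes * 8 taps = 64 taps
print(aniso_tap_count(8, 8))   # 64
# 16:1 anisotropy with cheaper bilinear probes: 16 probes * 4 taps = 64 taps
print(aniso_tap_count(16, 4))  # 64
```

So under these assumptions the same 64-tap budget can be spent either on fewer, higher-quality probes or on more, cheaper probes stretched further along the axis of anisotropy.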
I could point you to other sites if you wish, but when the GF3 is producing Radeon-quality images its frame rate dives and matches the Radeon. The Radeon 2 will have two more pipes and will be about 70MHz faster, with much faster RAM, better HyperZ, etc. <b>The Radeon 2 will quite frankly toast the GF3!!</b>
<A HREF="http://www.aceshardware.com/articles/reviews/GF3vsK2_Pa..." target="_new">Benchmark GF3/Radeon</A>
A very modern game with alot of features being used from the hardware. The GF3 beats the Radeon only by 1 FPS!! on a T-Bird 1.33ghz, <b>on a Duron 850 the Radeon ties the GF3!!!</b>
This maybe just a problem with that one game and doesn't mean nothing but wait, check this out:
<A HREF="http://www.beyond3d.com/reviews/nvidia_gf3/visiontek/be..." target="_new">Benchmarks of Serious Sam, without any Anti-Aliasing, 64-Tap anisotropic in Blue</A>
Three series of test was run at Beyond3d.com. Look at 1024x768 32bit Extreme. Also notice how dramatic the GF3 performance is hit when quality settings are upped. What is interesting to know at this setting the GF3 is doing 8:1 Anisotropic Filtering while the Radeon is doing a higher 16:1 Anisotropic Filtering, both doing 64-tap (meaning taking 64 samples from the source texture for each pixel being applied to the output mapped texture.) My Radeon with a slower T-Bird does 41.8 FPS with the exact same settings. The GF3 in this case is doing Radeon speeds. Yes the GF3 can produce allot of FPS with crappy output, but for it to do high quality output its frame rate is drastically reduced to the level of the Radeon. In this test I did not overclock my stock 182.25mhz speed.
I could lead you to other sites if you wish but when the GF3 is producing Radeon quality images its frame rate dives and matches the Radeon. The Radeon 2 will have two more pipes and will be about 70mhz faster with much faster ram, better HyperZ etc.. <b>The Radeon 2 will quite frankly toast the GF3!!</b>
HolyGrenade
May 14, 2001 11:18:52 PM
C'mon, this is a cheapshot I would expect from someone like powervr2 (no offence powervr2 :wink: ).
The GeForce 3 performs brilliantly on virtually all the benchmarks. Its image quality is also brilliant, except for the persisting S3TC texture degradation in translucent textures.
You can't compare the GeForce 3 with the Radeon. That is just vulgar, man!
The GeForce 3 is a truly superior card. Do you think the Radeon will be able to hold its own when all the high-poly shader games are released in the near future? The Radeon 2 will do OK, but we'll still have to wait and see if it is capable of outperforming the GeForce 3. Also, by the time it is released, the nv30 will be near its release date... so we'll see what happens with the pricing. The GeForce 3 is likely to have a few price drops by then. Besides, wait till the Det 12.xx drivers are released. They have quite a bit of GF3-specific code.
As for the Radeon 64, it is GeForce 2 GTS-class hardware. At AnandTech, doesn't it score on par with the GTS? This is where it was supposed to pull away from the nVidia cards with its HyperZ. It doesn't happen, though.
I'm not totally convinced about nVidia's Z-occlusion culling techniques, just as I have doubts about HyperZ. nVidia has implemented fast Z-buffer clearing, early Z checking and Z culling. The most effective method, the Z-culling query, is supported neither in DX8 nor in OpenGL 1.2. They don't even have the OpenGL extensions for it in their Det 6.50. I think we have to wait till 12.xx.
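To show where the early-Z-checking savings come from, here's a minimal sketch; an illustrative software model, not nVidia's or ATI's actual pipeline. The whole point is that the depth test runs before the expensive texturing/shading stage, so occluded fragments never pay for texture fetches:

```python
def rasterize_pixel(x, y, frag_depth, zbuffer, shade):
    """Early-Z rejection for one fragment (simplified model).

    zbuffer maps (x, y) -> nearest depth seen so far (smaller = closer).
    shade is the expensive texture/lighting function; it is skipped
    entirely when the fragment is occluded.
    """
    if frag_depth >= zbuffer.get((x, y), float("inf")):
        return None                # occluded: no texture fetches, no shading
    zbuffer[(x, y)] = frag_depth   # fragment wins: record its depth
    return shade(x, y)             # only visible fragments pay the full cost

# Hypothetical usage: second fragment at the same pixel is farther away,
# so it is rejected before shading ever runs.
zb = {}
rasterize_pixel(0, 0, 0.5, zb, lambda x, y: "shaded")   # shaded
rasterize_pixel(0, 0, 0.9, zb, lambda x, y: "shaded")   # rejected (None)
```

Of course this only helps when geometry happens to arrive roughly front-to-back, which is why the savings vary so much from game to game.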
Talking about HSR, tile-based rendering also falters when it comes to 3D textures and some other features I can't remember. Just thought I'd mention that since I've started ranting.
Oh well! I think I should stop now.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
noko
May 14, 2001 11:54:44 PM
Not cheap at all; it's backed by testing, unless you want to ignore testing results. The GF3 is supposed to be the next generation of card, and here we have a lowly Radeon keeping up with it. Let's all bow down to nVidia, ahmmmmmmmm - - ahmmmmmmmmm - - - ahmmmmmmmmm. <b>NOT!!!</b> Do you want to see more mediocre results? I'm not disputing that the GF3 can pump out FPS; I'm arguing that when the GF3 is producing high-quality images like the Radeon, its performance dives hard. REAL HARD. Here are some images of the Radeon doing its thing at max anisotropic filtering; all are from the Radeon:
<A HREF="http://home.cfl.rr.com/noko/shot.jpg" target="_new">A</A>
<A HREF="http://home.cfl.rr.com/noko/shot1.jpg" target="_new">N</A>
<A HREF="http://home.cfl.rr.com/noko/shot2.jpg" target="_new">I</A>
<A HREF="http://home.cfl.rr.com/noko/shot3.jpg" target="_new">S</A>
<A HREF="http://home.cfl.rr.com/noko/shot4.jpg" target="_new">O</A>
<A HREF="http://home.cfl.rr.com/noko/shot5.jpg" target="_new">T</A>
<A HREF="http://home.cfl.rr.com/noko/shot6.jpg" target="_new">R</A>
<A HREF="http://home.cfl.rr.com/noko/shot7.jpg" target="_new">O</A>
<A HREF="http://home.cfl.rr.com/noko/shot8.jpg" target="_new">P</A>
<A HREF="http://home.cfl.rr.com/noko/shot9.jpg" target="_new">I</A>
The Radeon is pumping out around 45-85 FPS doing these images. Do you want comparison shots between the Radeon and the GF3? You might be surprised again.
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 09:27 PM.</EM></FONT></P>
<A HREF="http://home.cfl.rr.com/noko/shot.jpg" target="_new">A</A>
<A HREF="http://home.cfl.rr.com/noko/shot1.jpg" target="_new">N</A>
<A HREF="http://home.cfl.rr.com/noko/shot2.jpg" target="_new">I</A>
<A HREF="http://home.cfl.rr.com/noko/shot3.jpg" target="_new">S</A>
<A HREF="http://home.cfl.rr.com/noko/shot4.jpg" target="_new">O</A>
<A HREF="http://home.cfl.rr.com/noko/shot5.jpg" target="_new">T</A>
<A HREF="http://home.cfl.rr.com/noko/shot6.jpg" target="_new">R</A>
<A HREF="http://home.cfl.rr.com/noko/shot7.jpg" target="_new">O</A>
<A HREF="http://home.cfl.rr.com/noko/shot8.jpg" target="_new">P</A>
<A HREF="http://home.cfl.rr.com/noko/shot9.jpg" target="_new">I</A>
The Radeon is pumping out around 45-85 FPS doing these images. Do you want comparison shots between the Radeon and the GF3? You might be surprised again.
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 09:27 PM.</EM></FONT></P>
rcf84
May 14, 2001 11:54:50 PM
noko
May 14, 2001 11:59:35 PM
rcf84
May 15, 2001 12:19:11 AM
noko
May 15, 2001 1:26:03 AM
OzzieBloke
May 15, 2001 8:20:16 AM
The possums are at it again... blimey!
At least noko is giving some numbers and pictures to look at here...
But wasn't the GF3 all about improving visual quality, not frame rates? If that's the case, and the original Radeon is on par for visual quality, then I would think the Radeon 2 is going to make the GF3 hurt. I don't know how badly, but hurt it will, at least until they bring out the "GF3 ultra" or "GF3 super" or "GF3 I creamed my undies" or whatever it is they are producing.
Cow with legs spread wide either dead or playing 'cello.
HolyGrenade
May 15, 2001 8:55:06 AM
<b>16:1 anisotropy and 8:1 anisotropy, both at 64 tap</b>
What did you mean there? Did you mean 64-tap anisotropy with 16-sample trilinear?
I didn't quite get you there. And oh yeah, hold the excitement until the product is released.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
HolyGrenade
May 15, 2001 9:25:56 AM
nVidia is releasing the nv17, which will be a budget version of the GeForce 3. It will be priced at around $150 and is set to outperform the GF2 GTS and Pro. It will have TwinView. Testing cards should be available after June, closely followed by the benchmark cards, while consumer versions are set for an autumn release. This chip will also be integrated in the nVidia Crush-based motherboards, but I'm thinking those will probably have just one monitor output. The motherboards in the UK will be about £120.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>