AMD = Getting silly with too many cores

Tags:
  • CPUs
  • Quad Core
  • AMD
June 8, 2006 11:10:08 PM

OK...I don't know about AMD's claims...
#1...Quad Core. Acceptable, it's out front on the roadmap and I believe there will be a good market for it...EVEN THOUGH software is JUST NOW starting to get written for dual cores, much less quad cores...So it'll be a long while before most applications see real benefits from a quad core.

#2...8 cores? That's just getting stupid. Unless their architecture drops to like 50% of the normal price...there's no way anyone but a rich fanboy is going to afford that. Not only that, it's like building a damn server room. I thought the beauty of a desktop was that it FITS ON A DESK! Won't the mobos have to be huge to fit as many cores as they plan to create? Also, that's going to get hotter than hell...You'll need an air conditioner bolted to the side of the case for cooling!

I suppose as technology improves (as it always does)...they can make the chips smaller and fit more onto a regular-sized mobo...but I think that's getting unrealistic...ESPECIALLY predicting it as soon as 2008.

Bottom Line: They will definately have something better to counter Conroe next year, I'm sure. Unfortunately, I'll probably end up buying a Conroe because I can't wait that long. But this whole 8+ cores on one ATX mobo thing, I think, is just silly.

A penny for your thoughts to all of you.


June 8, 2006 11:45:33 PM

Quote:
OK...I don't know about AMD's claims... [original post snipped; quoted in full above]


#1 Quad cores won't be out until K8L (a year or more away), and by then there will be larger demand for more cores.

#2 8 cores across 4 procs is strictly enthusiast territory right now... I don't actually know an enthusiast who needs 8 threads going at once. Though 8 cores would be fine in a 2-proc setup...

I doubt 4x4x4x4 will see the light of day...unless they aim it at a specific market like movie making.

4 procs on a desktop PC seems stupid.
June 8, 2006 11:49:16 PM

Quote:
OK...I don't know about AMD's claims... [original post and previous reply snipped; quoted in full above]

4X4 is not 4 procs; it's 2 dual-core procs with 4 GPUs. 4X4 is just AMD's way of getting HTX onto the desktop. Imagine having a flip-chip G81(?) sitting next to your FX62.
June 8, 2006 11:50:42 PM

Then imagine it having no bandwidth.
June 8, 2006 11:53:39 PM

Quote:
4X4 is not 4 procs; it's 2 dual-core procs with 4 GPUs. 4X4 is just AMD's way of getting HTX onto the desktop. Imagine having a flip-chip G81(?) sitting next to your FX62.


You haven't read the original article....an AMD guy was talking about how they should release 4x4x4x4, not 4x4...in a sense, 4 procs, each dual-core = 8 cores.
June 8, 2006 11:58:10 PM

You're getting the 8 cores from having 2 quad-core CPUs sometime in the near future. But I do agree that it's overkill for current software.
Anonymous
June 8, 2006 11:59:46 PM

Well, right now dual core can be useful for many "power users." You're stating that games and other programs are becoming multithreaded. So I can see myself 2 years from now running the same number of programs, but all of them multi-threaded and using your 4/8 cores.

I don't think it's THE best way to achieve higher performance, but that's the way the major players are heading. Also, with AMD's approach you might see more dedicated CPUs for specific tasks; you could then build some truly customized computers that fit your needs exactly and are pretty scalable...

One final comment: I took one multithreading class in university, pretty basic (with MPI and pthreads). If it's possible to make a part of the program multithreaded at all, then going from 2 to 4 to 8 threads is really easy; the b**** is making it multithreaded in the first place, and more threads may bring diminishing returns.
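
EDIT: to illustrate, here's a minimal pthreads sketch (my own toy example, in the spirit of that class exercise, not anything AMD-specific). Once the loop is split across threads at all, going from 2 to 4 to 8 workers is a one-constant change:

/* Parallel array sum: NUM_THREADS is the only knob. */
#include <pthread.h>
#include <stdio.h>

#define NUM_THREADS 4            /* try 2, 4, or 8 -- nothing else changes */
#define N 1000000

static double data[N];
static double partial[NUM_THREADS];

static void *worker(void *arg) {
    long id = (long)arg;
    long chunk = N / NUM_THREADS;
    long start = id * chunk;
    long end = (id == NUM_THREADS - 1) ? N : start + chunk;
    double sum = 0.0;
    for (long i = start; i < end; i++)
        sum += data[i];
    partial[id] = sum;           /* each thread owns one slot: no locking */
    return NULL;
}

int main(void) {
    pthread_t threads[NUM_THREADS];
    for (long i = 0; i < N; i++)
        data[i] = 1.0;
    for (long t = 0; t < NUM_THREADS; t++)
        pthread_create(&threads[t], NULL, worker, (void *)t);
    double total = 0.0;
    for (long t = 0; t < NUM_THREADS; t++) {
        pthread_join(threads[t], NULL);
        total += partial[t];
    }
    printf("sum = %.0f using %d threads\n", total, NUM_THREADS);
    return 0;
}

Making the iterations independent in the first place is the hard part; the thread count is trivial.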
June 9, 2006 12:05:31 AM

Quote:
#1 Quad cores won't be out until K8L (a year or more away)... [previous reply snipped; quoted in full above]

#1 K8L dual cores are coming in Q1 2007, and the K8L quad cores, running at lower GHz, will be out in 2H 2007. Both of these chips will work on the AM2 4X4. The 1H 2008 quad cores will come up in GHz and require the new Socket F, because AM2 won't handle the HT3 throughput. The K8L is an HT3 design but is backward compatible, and it won't force a mobo change until high-performing quad cores create a bottleneck.

#2 8 cores is only with 2 CPUs in the quad-core design on the 4X4 mobo.

4X4 makes for a massive upgrade path and will without question see the light of day.
June 9, 2006 12:06:16 AM

Quote:
Then imagine it having no bandwidth.


Word.
June 9, 2006 12:11:04 AM

Somewhat relevant: MS has a PPT presentation on multithreading that includes two case studies, PGR3 and Kameo. Yeah, it's for a console, but it's still interesting and related.

Link.
June 9, 2006 12:43:13 AM

Quote:
Then imagine it having no bandwidth.



Well, if you believe what they say at AnandTech and other places after Analyst Day, you will see that DDR2 in a pool will give PLENTY OF BANDWIDTH.

I'll find the links where someone you may believe says that this is a possibility and will provide more than enough bandwidth for the CPU and GPU.


From AnandTech's Analyst Day coverage:

This applies directly to companies like AGEIA with their PhysX card which, when used in a game, must communicate bi-directionally with the CPU before a frame can be sent to the GPU for rendering. Additionally, GPU makers could easily take advantage of this technology to tie the graphics card even more tightly to the CPU and system memory. In fact, this would serve to eliminate one of the largest differences between PCs and game consoles. The major advantage that still remains on console systems (aside from their limited need for backwards compatibility compared to the PC) is the distance from the CPU to the GPU. There is huge bandwidth and low latency between these two subsystems in a console, and many games are written to take advantage of (or even depend on) the ability to actively share the rendering workload between the CPU and GPU on a very low level. Won't it be ironic if we start seeing high performance Xbox 360 and PS3 emulators only a couple years after their release? This is the kind of thing that could make it possible.

This is for you, stupid!

Stick that in your Vegemite sandwich!!!
June 9, 2006 12:48:03 AM

It mentions bandwidth between the CPU and GPU, not between the GPU and memory, which is nowhere near enough.
June 9, 2006 12:52:46 AM

A GeForce 7900 GTX has 51.2GB/s. You can't get anywhere near that on DDR2.

Quote:
This is for you, stupid!


For yourself, you mean.
June 9, 2006 12:56:17 AM

Isn't Quad SLI silly?
June 9, 2006 1:02:20 AM

Quote:
Isn't Quad SLI silly?


Word.
June 9, 2006 1:03:18 AM

Don't feel too bad. Next year Intel will also have new processors out to counter AMD's counter. Could be a hard year or years for AMD.
June 9, 2006 1:05:47 AM

Quote:
It mentions bandwidth between the CPU and GPU, not between the GPU and memory, which is nowhere near enough.


So you're saying that ATI's Xenos didn't figure out that eDRAM on the die can give MORE THAN ENOUGH BANDWIDTH?

IF ATI or nVidia can do away with the silicon for the card and the GDDR3, and replace them with NUMA and DDR2 plus eDRAM on-die, it will get TOO MUCH BANDWIDTH.

How much is Xenos getting? The eDRAM in Xenos gets 52GB/s here.
That means that since HT3 will get 26GB/s and 1200MHz DDR2 will get close to 15GB/s, the 22.5GB/s courtesy of the 128-bit GDDR3 memory interface running at 700MHz is not so much anymore.


Altogether, 4x4 can have 15GB/s + 25GB/s, or 40GB/s, using FAST DDR2 to feed the GPU.


I think it will happen. I told you nVidia wanted a socket, and I even posted a link from Anand.
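
EDIT: for anyone who wants to check these numbers, every per-interface figure in this thread comes from the same back-of-envelope formula: peak GB/s = effective MT/s x bus width in bytes. A quick sketch (the inputs are the figures claimed in this thread, not vendor-confirmed specs):

#include <stdio.h>

/* Peak bandwidth: GB/s = mega-transfers/s * bus_bits / 8 / 1000. */
static double gbps(double mega_transfers, int bus_bits) {
    return mega_transfers * bus_bits / 8.0 / 1000.0;
}

int main(void) {
    /* 7900 GTX GDDR3: 256-bit bus at 1600 MT/s -> 51.2 GB/s */
    printf("7900 GTX GDDR3:         %.1f GB/s\n", gbps(1600, 256));
    /* 128-bit GDDR3 at 700MHz (1400 MT/s effective) -> 22.4 GB/s */
    printf("128-bit GDDR3 @700MHz:  %.1f GB/s\n", gbps(1400, 128));
    /* dual-channel (128-bit) DDR2 at 1200 MT/s -> 19.2 GB/s peak;
       the ~15GB/s quoted above is presumably a derated real-world figure */
    printf("dual-channel DDR2-1200: %.1f GB/s\n", gbps(1200, 128));
    return 0;
}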
June 9, 2006 1:10:05 AM

Quote:
A GeForce 7900 GTX has 51.2GB/s. You can't get anywhere near that on DDR2.

This is for you, stupid!


For yourself, you mean.


That's why I said STUPID. Embedded DRAM will fix that, and cost less than 256-512MB of GDDR3 plus the cooling for it and the PCB. Two 16-bit HT3 links will do 51.2GB/s. Maybe Rambus can get some business for their XDR RAM, which is A LOT faster than GDDR3. They even have a chipset that can be used for the HT3 connection.


Basically, your attempt to have all the good ideas is not working. This is a good idea, and OTHERS that I linked to are saying it too.

Just go and swing next to an aborigine. Well, maybe I shouldn't put aborigines as low as you.
June 9, 2006 1:13:08 AM

I think AMD's strategy is a good bet considering it's about to be in a position where it has clearly inferior chips (crap, I own one!)... Look at Nvidia... 4 GPUs?!? Crap, I own one of those too! In both cases, companies are doing whatever they can to help make up for their under-performing products... It's not like either company can pull a killer new product out of a magic hat.

So, in summary... I agree with the original poster... it is getting silly. I will admit I own two dual-core systems (AMD desktop and Intel laptop), and my brain can imagine how dual core is useful... background tasks and overhead chores can be passed off to one core while leaving another for gaming. But 8?!? 16?!? How about 1,000,000?!? At some point, we'll need better products... not just more of them.
June 9, 2006 1:13:44 AM

eDRAM is expensive, and you need tiling, like I mentioned before.

Quote:
How much is Xenos getting? The eDRAM in Xenos gets 52GB/s here.


No it's not. It's 32GB/s between the parent and daughter die, and it shares 22GB/s with Xenon.
June 9, 2006 1:14:45 AM

Quote:
Just go and swing next to an aborigine. Well, maybe I shouldn't put aborigines as low as you.


Stop knocking them, racist.
June 9, 2006 1:18:38 AM

Quote:
Embedded DRAM will fix that, and cost less than 256-512MB of GDDR3 plus the cooling for it and the PCB.


BS.

Quote:
Two 16-bit HT3 links will do 51.2GB/s.


Wow, that's completely useless if you don't have the memory bandwidth to go with it.

Quote:
Maybe Rambus can get some business for their XDR RAM, which is A LOT faster than GDDR3.


No it's not. They're roughly the same, and GDDR4 is faster, not to mention cheaper.

Quote:
Basically, your attempt to have all the good ideas is not working.


WTF are you rambling about?

Quote:
This is a good idea, and OTHERS that I linked to are saying it too.


Like? The only thing you've linked to is about increasing speed between the CPU and GPU, not the GPU and memory.

Give up, racist.
June 9, 2006 1:20:08 AM

Quote:
It mentions bandwidth between the CPU and GPU... [earlier exchange snipped; quoted in full above]
I have to agree with Action_Man: the bandwidth of system memory isn't enough. Graphics cards today are using GDDR3 @ 1600MHz (800MHz if you want to state the real clock), which is twice as fast and has lower latency than the DDR2-800 that's been introduced into the mainstream this year.

Plus, it would have to hop through the CPU's memory controller to get to system memory, which adds more latency. GPUs have their memory chips connected directly to them for a reason: latency. It would be a step backward to put the GPU in that socket. It would work, but the performance would be equal at best, and much worse in reality. I mean, just imagine the number of calls the GPU has to make to memory with 32 pipes processing at full speed.

And BTW, I think the eDRAM in Xenos is about 10MB, and it's a cache.

Of course, if I've said anything incorrect, I'm open to being corrected.
June 9, 2006 1:21:40 AM

Quote:
I think AMD's strategy is a good bet... [post snipped; quoted in full above]


Since when has software kept up with hardware? Would you rather developers extracted out to 8 threads first? At least now devs have a REAL CHIP that will allow them to study ways to thread more efficiently. So unless you're not around in two years, SW WILL use 4 threads. It can be done now, but it's not worth it because most people DON'T HAVE more than ONE or TWO available threads. Besides, that kind of SW has to be designed from the ground up ON THE HW, so the hardware has to come first.
June 9, 2006 1:33:41 AM

Quote:
Isn't Quad SLI silly?


Word.

I don't think any number of cores will be silly until I live to see the day of computers with 100 cores on one die, smaller than a man's fist. E.g., "Intel Centurion" or "AMD Dual50 Core".
June 9, 2006 1:37:57 AM

Anyone else remember when the 486DX came out?
"What can that do that my SX won't?"
Maybe it's progress, maybe it's just more gratuitous junk.
Time will tell.
June 9, 2006 1:43:07 AM

AMD's plans are like Dr. Mephisto's (South Park, season 1) new creature, the five-arsed monkey. Genetically engineering additional arses is neither evolutionary nor revolutionary.
June 9, 2006 1:43:27 AM

Quote:
It mentions bandwidth between the CPU and GPU... [earlier exchange and reply snipped; quoted in full above]


HT3 has the LOWEST interconnect latency there is. The size doesn't matter; what matters is that it's designed such that even IF 1200MHz DDR2 (which can be achieved at CAS 4 with 1066) doesn't have 25.6GB/s, it can feed the CACHE at its own rate while the eDRAM runs at its own speed.

Let's do the math.

15,000,000,000 bytes/sec ÷ 2,800,000,000 cycles/sec ≈ 5 bytes/cycle. IF the eDRAM is working at, say, 20 bytes/cycle (~56GB/s at that clock), then it would only be necessary to read from system RAM every 4th cycle of the GPU. And since you can have 4GB in the DDR2 banks, you can stage more data there, like an L3 does for an L1. By basing the instruction stream on a suitable boundary size, it's possible to feed a higher-bandwidth drain from a lower-bandwidth source. Just like the way hand pulleys work, or L1 cache in a CPU: L1 is faster than L2 and system RAM, but superscalar CPUs still aren't using all of its bandwidth.


It's about efficiency, not just bandwidth. Can the 7900GTX actually move 52GB/s through the I/O system, or is that the INTERNAL BANDWIDTH OF THE CHIP?

Just like partition-to-partition copies on a disk are faster than disk-to-disk transfers, it depends on:

1. Your interconnect speed
2. The amount of buffer you can provide
3. How large the blocks of data are
4. The speed of the CPU


How do you think HTX is going to work in Socket F or AM2 if DDR2 can't provide the speed? People are talking orders of magnitude using the same memory subsystem, so most of the bandwidth is internal to the chip, not from the RAM.

Basically, you should just admit that this is a possibility, and it still may be announced by the end of the year.
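
EDIT: here's that staging argument in code form (my framing; the 15GB/s feed, 2.8GHz clock, and 20 bytes/cycle figures are this thread's claims, not measurements):

#include <stdio.h>

/* Back-of-envelope for the cache-staging argument above. */
int main(void) {
    double feed_gbps = 15.0;   /* what pooled system DDR2 can supply */
    double clock_ghz = 2.8;    /* consumer clock rate */
    double need_bpc  = 20.0;   /* bytes consumed per cycle out of eDRAM */

    double supply_bpc = feed_gbps / clock_ghz;        /* ~5.4 bytes/cycle */
    double edram_frac = 1.0 - supply_bpc / need_bpc;  /* share eDRAM must serve */

    printf("DDR2 supplies %.1f of the %.0f bytes/cycle needed,\n",
           supply_bpc, need_bpc);
    printf("so the eDRAM working set must cover ~%.0f%% of accesses.\n",
           edram_frac * 100.0);
    return 0;
}

In other words, the scheme only works as well as the eDRAM hit rate, which is exactly the efficiency point above.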
June 9, 2006 1:56:55 AM

Quote:
between the parent and daughter



Ooooo, that sounds hot! Any pics?
June 9, 2006 2:01:39 AM

It was always a possibility; I never denied that.

Before the end of the year? No way.

I think they announced it to get everybody to say "WOW," and here is why I think that:

1. The graphics card market is huge! You have a lot of players and a lot of brands, and they all make money off it. Consumers like the choice of cards, and card makers differentiate their products with OCs, game bundles, warranties, and features. Plugging that chip into a socket is going to mean strictly ATI or Nvidia, and they aren't marketing companies; they're mostly design and somewhat marketing. They market their technology, but the card makers do most of the marketing.

2. The ecosystem for add-in GPU cards is already well established, and ATI and Nvidia won't be looking to upset partners they've been supplying for a long time.

3. The current interfaces (AGP & PCIe) are not limiting factors; there is excess bandwidth in the interface, so there is no need at this point in time.

4. If 4x4 actually takes off in any big way, would you want to have to put your GPU in the socket, or another CPU? I don't see how it would make sense given you have a PCIe slot just sitting there empty.

Of course, feel free to disagree; it's an opinion, after all.
June 9, 2006 2:03:08 AM

Quote:
It mentions bandwidth between the CPU and GPU, not between the GPU and memory, which is nowhere near enough.



Wow, just....wow.

"Additionally, GPU makers could easily take advantage of this technology to tie the graphics card even more tightly to the CPU and system memory."
June 9, 2006 2:06:13 AM

It doesn't say to use system memory exclusively; it'd only be for things like texturing, sort of like with the PS3.
June 9, 2006 2:35:02 AM

Quote:
It was always a possibility; I never denied that... [post snipped; quoted in full above]



What you don't seem to understand is that AMD ALREADY HAS A MARKET IN HPC for the HTX socket/slot. They are trying to get support from ATI and nVidia for high-end Quadros and FireGLs (and GTXs/X1000s), since it will sell more of their chips.

nVidia WANTS a socket


Plus, HTX is also a slot, so if the board gets an HTX slot it can still have another socket for a second CPU.

The possibilities for mix and match are endless. Of course it may piss Intel off, but hey, I guess AMD could say "this is for the Opteron launch."
June 9, 2006 2:38:19 AM

Quote:
nVidia WANTS a socket


Where does that say they WANT a socket? They're trying it; wow, big deal.
June 9, 2006 2:48:34 AM

Hmm...everyone thinks I'm an idiot for saying 4x4x4x4 = 4 processors = 8 cores:
Quote:
AMD wants 4 processors
"My definition [of the technology] actually is 4x4x4x4x4. Four processors, 4 GPUs, fed by 4 [GB] of memory, four hard drives and four times the fun."


I don't see ANYTHING wrong with AMD's current roadmap (I wish everything was sooner, maybe?), but I thought this thread was about the AMD spokesman talking about 4x4x4x4, since we already nailed the 4x4 topic into the ground.
June 9, 2006 3:36:56 AM

Quote:
Anyone else remember when the 486DX came out?
"What can that do that my SX won't?"
Maybe it's progress, maybe it's just more gratuitous junk.
Time will tell.


Umm, it was significantly faster, and it came out a few years before your SX, which was just a DX with the math coprocessor zapped by Intel in 1991.

On topic: let them put more cores in there; I can use them for rendering. The real question I have is how Vista will handle these extra sockets with regard to licensing. I'm sure that info is out there; I just haven't bothered to research it, as I anticipate Vista being another slug of an OS that will eat one of my future CPU cores.

The thing that gets me is that Intel will have their quad cores out in Q1, and multi-socket workstation motherboards are a given. So I don't really get what all the hype is about.
June 9, 2006 3:40:55 AM

I'm sorry to say this; I've been an AMD fan for a while now, but this whole core deal is getting more than stupid. If they continue doing that, I'm not sure I'll stick with AMD for my next gaming rig. I can maybe see 2-4 cores; more than that is a plain waste of money, period.
June 9, 2006 3:57:45 AM

Quote:
Isn't Quad SLI silly?


Isn't arguing about CPUs that aren't yet available for purchase silly?

Isn't arguing silly?

Face it: There will always be Intel fanboys, and there will always be AMD fanboys.

Intel fanboys will always think that Intel pwns all, and that anyone who disagrees is an AMDumbass.

AMD fanboys will always think the opposite.

The fanboys will always argue about which CPU maker is better.

And I will always think of them as idiots.

In the meantime, I'll be grateful that I have a computer and Internet access, unlike many starving children living in Africa to whom AMD is just a funny way to arrange strange letters from a foreign language.
June 9, 2006 4:00:59 AM

Quote:
I'm sorry to say this; I've been an AMD fan for a while now, but this whole core deal is getting more than stupid. If they continue doing that, I'm not sure I'll stick with AMD for my next gaming rig. I can maybe see 2-4 cores; more than that is a plain waste of money, period.


Maybe your money.... Certainly not an enthusiast's money.

If you believe in 4 cores right now (4x4), why not 8 in the future?!
June 9, 2006 4:11:53 AM

Quote:
Isn't arguing silly?... [fanboy post snipped; quoted in full above]

You're joking, or joining the Peace Corps, right?
June 9, 2006 4:23:33 AM

8 cores is not silly at all, but the people who would buy it right now are. I could see 8 cores in the next 2 to 3 years, but right now I don't think it's really necessary; if they do launch it, I have no doubt most consumers won't buy it, because it's overkill. :)
June 9, 2006 4:26:41 AM

Keep your pennies - you need them so you can attend school, learn how to spell "definitely," and take a course on "remaining silent and being thought a fool, rather than opening your mouth and removing ALL doubt."

8 cores is dumb, right?

Yeah. Read the following. See if you spot a pattern:

"The world has a market for maybe 8 computers."
"640 kilobytes should be enough for anyone."
"I'll NEVER need a gigabyte of RAM! NEVER!"
"You know, even if you type non-stop and never sleep, you'll never fill up a 10MB hard drive."
"This computer will last me 10 years!"
"Who would ever need a 32-bit computer, except for scientists?"

Now run along home with your silly commentary; maybe there's someone there who might take you seriously for more than 500 milliseconds.
June 9, 2006 4:27:48 AM

Quote:
Isn't arguing silly?... [fanboy post snipped; quoted in full above]

You're joking, or joining the Peace Corps, right?

I'm not joking; have you ever heard of One Laptop Per Child? These are $100 PCs, with specs similar to PCs of 1998, that will be given to children in Africa so that they can educate themselves over the Internet. Go into the wiki, and then click "hardware" to see the specs of these machines. They have 400MHz AMD processors, 128MB of RAM, and 512MB of flash memory used for storage. They're running a stripped-down version of Linux. Just consider yourself lucky that you have food and clean water, let alone instant access to pretty much whatever the hell you want over the Internet.
June 9, 2006 4:29:48 AM

Quote:
Two 16-bit HT3 links will do 51.2GB/s.

I don't really want to get involved in this debate again, but I thought I'd point out that that is completely off. HT3.0 runs at 5.2GT/s, so a single 16-bit link will only provide 10.4GB/s. Two 16-bit links in bidirectional operation will provide 20.8GB/s of aggregate bandwidth. Even using 32-bit links, the HyperTransport Consortium quotes lower theoretical maximums than you do.

Quote:
1.8 GHz, 2.0 GHz, 2.4 GHz and 2.6 GHz Clock Support
41.6 GB/s Aggregate Bandwidth
20.8 GB/s (166.4 Gb/s) per Link

http://www.hypertransport.org/tech/tech_htthree.cfm?m=3

41.6GB/s aggregate over 2 links may sound like a lot, but that assumes AMD even implements all of the HT3.0 features, which is unlikely. HT2.0 runs at up to 1.4GHz with 32-bit links, providing up to 11.2GB/s per link, while AMD only uses 1GHz with 16-bit links. They'll likely stay with 16-bit, since I believe 32-bit links have higher latencies. They're unlikely to use 2.6GHz: even though that's in spec, it doesn't leave much margin for error, and it'll be more costly for motherboard makers. A 2GHz link is likely, since it's a nice round doubled marketing figure and leaves plenty of room for overclockers. 16-bit 2GHz links give 16GB/s of aggregate bandwidth.
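
If you want to check my figures: HT links are double-pumped, so per-direction GB/s = 2 x link clock (GHz) x width in bytes, and the aggregate doubles that for the two directions. A quick sketch reproducing the numbers above (my helper name, same arithmetic as the consortium's table):

#include <stdio.h>

/* HyperTransport peak bandwidth: links are double-pumped (DDR),
 * so GT/s = 2 * clock_GHz, and one direction moves GT/s * width/8 GB/s. */
static double ht_gbs(double clock_ghz, int width_bits) {
    return 2.0 * clock_ghz * width_bits / 8.0;
}

int main(void) {
    /* HT3.0, 16-bit @ 2.6GHz: 10.4 GB/s per direction, 20.8 aggregate */
    printf("HT3 16-bit @2.6GHz: %.1f GB/s/dir, %.1f aggregate\n",
           ht_gbs(2.6, 16), 2.0 * ht_gbs(2.6, 16));
    /* HT3.0, 16-bit @ 2.0GHz: 8.0 per direction, 16.0 aggregate */
    printf("HT3 16-bit @2.0GHz: %.1f GB/s/dir, %.1f aggregate\n",
           ht_gbs(2.0, 16), 2.0 * ht_gbs(2.0, 16));
    /* HT2.0, 32-bit @ 1.4GHz: 11.2 GB/s per direction */
    printf("HT2 32-bit @1.4GHz: %.1f GB/s/dir\n", ht_gbs(1.4, 32));
    return 0;
}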
June 9, 2006 4:33:50 AM

Quote:
Keep your pennies... [post snipped; quoted in full above]


I'm sure that someday there will be a market for 64-bit software and 8-core CPUs; that day just hasn't come yet.
June 9, 2006 4:44:03 AM

Yeah, maybe a couple of years down the road. But it's still a little soon, and personally I think they should get 64-bit running well before jumping to 20 cores. For me, I still have a software issue here and a driver issue there. I know this isn't AMD's fault, but maybe taking things one step at a time would work better for the consumer, because we can't even use 64-bit stuff fully yet. And sure, no normal person will own a 4x4x4x4; the people who own a computer like this will have the software that needs such setups.
June 9, 2006 4:52:32 AM

Quote:
Yeah, maybe a couple of years down the road... [post snipped; quoted in full above]


We'll have to see what happens with Windows Vista. It's amazing how it took them like 50 years to put two chips on one die; a year later, they're already talking about four.

Moore's Law will have to be amended to say that the number of cores doubles every time a fanboy posts a link to an article from The Inquirer.
June 9, 2006 5:30:08 AM

OK, I had a much bigger post, but I just erased it all because it was too long and drawn out.

My vote: AMD is going to have some problems getting the public to take up this new design, mainly because of cost and heat/cooling. I believe that for at least the first year or more, the market for this design will be very slim.