
NVIDIA's Big Bang II coming soon


What will NVIDIA's Big Bang II be?

Total: 95 votes (13 blank votes)

  • GT200 die shrink: 20%
  • Brand new arch: 20%
  • SLI II: 14%
  • RV770 killer: 9%
  • "The True Way It's Meant To Be Played + Priced": 9%
  • OMGWTFITSGONNAPWNNOMATTERWHAT: 10%
  • Call it quits: 22%
July 24, 2008 6:33:36 PM

http://www.tweaktown.com/news/9858/nvidia_s_next_big_ba...

Quote:
Now, those things may be stating the obvious as far as progression goes for future driver releases, but the particularly interesting thing to note is the mention of the Big Bang II which this spy shot indicates is arriving in September.

Given NVIDIA called the introduction of SLI Big Bang I prior to its unveiling, this next big bang certainly has us all wondering.

What could it be?


Don't you guys just love speculation? :lol:  Please vote!


July 24, 2008 6:51:04 PM

I want to see a graphics board where you can swap the processor and RAM; have it fully customizable. Or I want to see true dual-core processors.
July 24, 2008 7:11:57 PM

cal8949 said:
I want to see a graphics board where you can swap the processor and RAM; have it fully customizable. Or I want to see true dual-core processors.


True words. :D  However, I don't think that's what NVIDIA's BB2 is all about... I believe ASUS had a prototype based on the concept you described, but it never made its way to the streets.
July 24, 2008 7:33:11 PM

That's a toughie. I'd like to see a card with AT LEAST 8800GT performance (preferably 4870X2 performance!) that uses a fraction of the power of current cards.
What I think we'll get is a very modest bump in performance, but at least the first of a set of working APIs to let a GPU handle a lot more tasks, not just physics, and useful demo code to go with them.
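To make that a bit more concrete, here's a minimal CUDA-style sketch of the GPU doing plain, non-graphics math. Purely illustrative: the kernel and buffer names are made up, and this isn't anything NVIDIA has announced, just the general pattern any "GPU as extra compute" API boils down to.

#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: blend two arbitrary data buffers, element by element.
// Nothing graphics-specific here; the GPU just runs one lightweight
// thread per element, thousands at a time.
__global__ void mixBuffers(const float* a, const float* b, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global element index
    if (i < n)
        out[i] = 0.5f * (a[i] + b[i]);
}

int main()
{
    const int n = 1 << 20;                           // ~1 million elements
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float* hostA   = new float[n];
    float* hostB   = new float[n];
    float* hostOut = new float[n];
    for (int i = 0; i < n; ++i) { hostA[i] = 1.0f; hostB[i] = 2.0f; }

    // Device-side buffers; copy the inputs over.
    float *devA, *devB, *devOut;
    cudaMalloc((void**)&devA, bytes);
    cudaMalloc((void**)&devB, bytes);
    cudaMalloc((void**)&devOut, bytes);
    cudaMemcpy(devA, hostA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(devB, hostB, bytes, cudaMemcpyHostToDevice);

    // One launch puts 4096 blocks x 256 threads = ~1M threads in flight.
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    mixBuffers<<<blocks, threadsPerBlock>>>(devA, devB, devOut, n);

    // Copy the result back and sanity-check it.
    cudaMemcpy(hostOut, devOut, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %f (expected 1.5)\n", hostOut[0]);

    cudaFree(devA); cudaFree(devB); cudaFree(devOut);
    delete[] hostA; delete[] hostB; delete[] hostOut;
    return 0;
}

The interesting part is the launch: one call puts on the order of a million threads in flight, which is why people keep eyeing the GPU for work beyond graphics and physics.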
July 24, 2008 7:52:38 PM

It's called SteadyShot; it comes standard on almost every digital camera nowadays.

Tip:

Stop drinking 15 cups of coffee before attempting to snap shots of a PC monitor...

Being able to actually READ the print on the screen might help with the confusion.
July 24, 2008 7:55:05 PM

Onus said:
That's a toughie. I'd like to see a card with AT LEAST 8800GT performance (preferably 4870X2 performance!) that uses a fraction of the power of current cards.
What I think we'll get is a very modest bump in performance, but at least the first of a set of working APIs to let a GPU handle a lot more tasks, not just physics, and useful demo code to go with them.


That would be nice, although very unlikely... :C Also, if we think about current GPUs' temperatures...
Anonymous
July 24, 2008 7:56:32 PM

I think it's likely the BIG BANG II will be a new cross-connect between cards that will allow better scaling... also, they will announce it in conjunction with the GT200 refresh...

Although I hope it's a new arch... it's highly unlikely.
July 24, 2008 8:09:41 PM

Quote:
I think it's likely the BIG BANG II will be a new cross-connect between cards that will allow better scaling... also, they will announce it in conjunction with the GT200 refresh...

Although I hope it's a new arch... it's highly unlikely.



This is somewhat what I was leaning toward as well, but we'll see, I suppose.
July 24, 2008 8:12:14 PM

Isn't most of SLI handled through drivers and software though? I don't see a new connector making that huge of an impact.
July 24, 2008 8:20:54 PM

Quote:
What I think we'll get is a very modest bump in performance, but at least the first of a set of working APIs to let a GPU handle a lot more tasks, not just physics, and useful demo code to go with them.


Hybrid SLI with one GPU committed to traditional GPU tasks and the second as an additional 'CPU'??
July 24, 2008 8:30:02 PM

What cal8949 said would be awesome; I've been dreaming about something like that... Imagine how cheap that solution could be; upgrading every few months would actually be possible.
I know ASUS had a prototype, but they only made like... 6, I think.
July 24, 2008 8:32:34 PM

I'd have to say it's just a new version of SLI that will scale about 23% better than the current SLI layout... performance tweaks, really...

I don't have time to look it up again, but there's a company out there that claims to have scaled SLI and X-Fyre to 80-95%. That's right, both SLI and X-Fyre. With that said, it's not hard to believe that Nvidia knows how to scale SLI better; they're just milking it like a hog...
July 24, 2008 8:37:25 PM

A new version of SLI that allows you to use mixed cards
July 24, 2008 8:37:50 PM

Possibly, or even a single GPU able to handle work in addition to traditional graphics tasks, particularly when not gaming.
...something that might begin to justify their assertions regarding the CPU becoming unnecessary. Imagine something able to make Intel sweat enough to start cutting prices, or maybe stopping the "new socket every time the Grinch sneezes" obsolescence game.
July 24, 2008 8:42:57 PM

The Dual Core GPU... 'nuff said
July 24, 2008 8:55:26 PM

Doubt it's anything to do with SLI.

I'm guessing a new architecture; I've heard some whispers for some time now, and the current line of GPUs are all rehashed 8800s. NVIDIA's due for something that's 100% new.

It could also be a partnership of some sort, or something to do with onboard physics.
July 24, 2008 8:57:20 PM

What if it's all the goodness that we hope for, all put into one big KA-POW!?!?!
July 24, 2008 9:05:31 PM

It is useless marketing hype to stop people from buying 4870 X2s....
Anonymous
July 24, 2008 9:54:00 PM

WHAT IF... the big bang II is the announcement that world peace may now commence

hehe
July 24, 2008 10:03:21 PM

Quote:
WHAT IF... the big bang II is the announcement that world peace may now commence

hehe


Most people wouldn't care since their framerate wouldn't improve and they wouldn't be able to start flame wars. ;D

(Or actually nobody would do anything because the "world" would be "KABOOM", if that's what you meant ;D)
July 25, 2008 12:10:56 AM

Considering it says this update will bring various improvements like:
Display connectivity
Quality improvements
Performance improvements
OGL 3.0 support

Sounds more driver-based than hardware-based.

Sounds like new algorithms, improved SLI including PhysX on a second card, not just the primary, plus probably an updated control panel, and of course it will launch next to the G200b to give it more than one day of press.
July 25, 2008 1:12:23 AM

mathiasschnell said:
Isn't most of SLI handled through drivers and software though? I don't see a new connector making that huge of an impact.


Could reduce CPU overhead - but that might be solvable through drivers as well.

I could see a pretty big upset with Intel and BR04 chips if that were the case, though, heh; but anyway, TGGA has a good point and that'd be nice too.
July 25, 2008 1:33:04 AM

I can see the GTX 280+ with so high a clock that they'll have to move it outside the case because of the heat, but they will put a spin on it and say it's the next logical step. It's the only thing they can do to counter the 4870X2 at such short notice.
July 25, 2008 2:55:24 AM

They are actually going to start selling graphics upgrades as full Nvidia "TESLA GAMER" units. It'll only cost $12,000 for a graphics upgrade, it'll be fully external to your regular PC, and it'll come with a KOOL KASE TOO!

: DDDDDDDDD
July 25, 2008 4:16:38 AM

Who cares? It'll be another overpriced NVIDIA product. The truth of the matter is that even if they bring out a card that eclipses everything in the graphics world, what software needs it besides the resource-hungry Crytek engine?
July 25, 2008 4:26:11 AM

NVIDIA's first Big Bang driver was the one that added support for SLI, so this new driver should be something big... As far as I'm concerned, SLI started showing its colors only recently... so was it truly a big bang?
July 25, 2008 5:07:41 AM

This one is EASY... It's driver-level volt mods in nTune. You know how they're greyed out now... Well, BANG! Soon they won't be. The new GTX 200 series and, I believe, the 9800GX2 have a software-modifiable voltage regulator chip on board. I know a few card builders have offered this before, but now I believe it will be available to all! This excites me because I will be able to put away my soldering iron now. Anyway, think about it... When SLI (the first Big Bang) came out, the performance gains weren't what they are today. And if I am right, the ability to adjust the vGPU and vMEM on your card will allow for SLI-like performance gains. On a simple watercooling loop my two 8800GTXs score 20596 in 3DMark06 with their hard volt mods. They run at 729 core, 1944 shader and 2430 memory, and remember, each MHz counts for more on an 8800GTX or Ultra card due to the 24 ROPs and 384-bit memory bus, compared to the 16 ROPs and 256-bit of the 9800xxx and 8800GTS 512 series. It's what I think anyway... Later!
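For what it's worth, the memory-bus half of that claim is simple arithmetic. Treat the numbers as a rough sketch, assuming the 2430 figure above is the effective memory data rate in MHz:

  bandwidth = (bus width in bits / 8) x effective data rate
  384-bit bus: 48 bytes per transfer x 2430 MHz = ~116.6 GB/s
  256-bit bus: 32 bytes per transfer x 2430 MHz = ~77.8 GB/s

Same memory clock, 50% more bandwidth on the 384-bit card, so each MHz of memory overclock buys more on an 8800GTX/Ultra than on a 256-bit part.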
July 25, 2008 6:20:29 AM

I know what it is: it's Duke Nukem on TWIMTBP!!!! And he will be driving his green CUDA!!!! And it won't be DX10.1!!!! And it'll have Ageia physics!!!! And it'll stop anyone from buying an R700!!!! And they bought Larrabee!!!! We get one screen shot, and a crappy one at that, and this gives us all a bone to chew on, while nVidia's been dropping prices to keep competitive, not exactly explaining why they charged so much to begin with, and slowly lowering their prices instead of making a quick response. I keep hearing G350, I hear there's a 9800+ coming, now this. Where's the beef?
July 25, 2008 6:47:51 AM

JAYDEEJOHN said:
I know what it is: it's Duke Nukem on TWIMTBP!!!! And he will be driving his green CUDA!!!! And it won't be DX10.1!!!! And it'll have Ageia physics!!!! And it'll stop anyone from buying an R700!!!! And they bought Larrabee!!!! We get one screen shot, and a crappy one at that, and this gives us all a bone to chew on, while nVidia's been dropping prices to keep competitive, not exactly explaining why they charged so much to begin with, and slowly lowering their prices instead of making a quick response. I keep hearing G350, I hear there's a 9800+ coming, now this. Where's the beef?



^^ Hahahaha :lol: 

I wonder where concrum is... I'm sure he can shed some colorful, out-of-this-world insight on this...
July 25, 2008 6:51:24 AM

My problem with all this is that nVidia released one card after another, doing nothing and charging more, sometimes by just changing the name, and then came out with their new cards, priced very high and somewhat disappointing in performance. Then, after the 4xxxs were released, they "came out" with the 9800GTX+, which still isn't out yet but got reviewed alongside the 4850 previews. They don't quickly lower their prices, there are rumors of a 350, and now this. nVidia needs something, and on the cheap too, to get themselves back in people's good graces. They've burned many a bridge, their AIB partners as well. I know it's not "official", but I'm betting this is coming from a green PR outfit somewhere. I don't have a problem with that, but they'd better deliver, astound us so to speak, and then it'll be right again, because if this is a flop, or just rumor, or anything I've mentioned, nVidia's just dug themselves in even more. I hope that's not the case.
July 25, 2008 7:45:39 AM

I doubt this is anything more than desperate marketing - they're up sh!t creek and they know it.

In times of desperation and a fall from grace, they have applied the first rule of Marketing 101.

"Find something successful that you've done. Add a +1. Issue a press release".
July 25, 2008 1:30:55 PM



Too early for that: I don't think DX11 will be coming THAT soon. Well, that's possible, but I also read reports stating that it should arrive by 2009 Q3 or Q4, close to Windows 7's launch (but also compatible with Vista). I'll try to find the link. Anyway, since Microsoft has been saying that DX11 is "based on and compatible with DX10", do you guys think that the HD 48xx will be able to do it, since they already have a Tessellation Engine and so on?

Perhaps TGGA knows this one.
July 25, 2008 5:01:57 PM

DX10.1 cards will have much greater compatibility with DX11. DX10-only cards will be the same as they are now. Not sure, I thought it was done in software somehow, thus making it possible. Unless you're in Africa or at the zoo (sometimes it resembles it here, though) there's never an Ape around when you need one, heheh. A little more on DX11: http://www.shacknews.com/onearticle.x/53810 On my link, read the third bullet point listed. This will help current gfx makers and won't change the playing field (a la Larrabee), and there's NO mention of ray tracing.
July 25, 2008 5:30:48 PM

big bang 2 sounds like the title to a porno
July 25, 2008 7:05:32 PM

fugben said:
big bang 2 sounds like the title to a porno


HAHAHA! Boy, I can imagine Huang as the star, opening his "can of whoop-ass"! :lol:  :lol:  :lol: 
July 25, 2008 7:07:30 PM

JAYDEEJOHN said:
DX10.1 cards will have much greater compatibility with DX11. DX10-only cards will be the same as they are now. Not sure, I thought it was done in software somehow, thus making it possible. Unless you're in Africa or at the zoo (sometimes it resembles it here, though) there's never an Ape around when you need one, heheh. A little more on DX11: http://www.shacknews.com/onearticle.x/53810 On my link, read the third bullet point listed. This will help current gfx makers and won't change the playing field (a la Larrabee), and there's NO mention of ray tracing.


Thanks a lot, Jaydee! I was waiting for your comments, hahaha! Have you already voted? :wahoo: 
July 25, 2008 7:26:20 PM

I just did; I voted for SLI, mainly because I'm thinking more of a software-related scenario. Maybe more GPGPU functionality and SLI improvements. These are great things if we forget that maybe the competition can either do them already or could just as easily; either way, it's a step in the right direction. I just don't think nVidia should be patting themselves on their collective backs unless it's truly something astounding.
July 25, 2008 7:53:12 PM

NVIDIA's Big Bang II revealed, jaydee: http://www.theinquirer.net/gb/inquirer/news/2008/07/25/...

Quote:
WHAT DO YOU do when you suddenly find yourself in second place, trailing badly with no hope for the rest of the year? You stop artificially crippling your drivers and spin it to the users as magnanimous, welcome to Nvidia's Big Bang II.


Kinda what you said. :lol: 
July 27, 2008 5:38:32 AM

They announce that they borrowed some money and bought AMD/ATI, then they say **** YOU all and raise all their prices by $100 and shut down ATI. Oh and add a Bang at the end.
July 27, 2008 2:49:54 PM

It'd be nice to see something like a GPU core at CPU speed :p  2.2 GHz, but I don't know that much about graphics cards, so I dunno if that is possible.
Anonymous
July 27, 2008 4:52:28 PM

lol, what a lame big bang

Either way, I think it will be announced in conjunction with the release of the GTX 280 refresh... maybe the rumored GTX 350... or just a die shrink of the GTX 280. Nobody knows.
July 28, 2008 12:42:32 AM

...Skynet?
July 28, 2008 10:04:22 AM

Nvidia's new "Big Bang 2" is a victim of Nvidia's confusing naming scheme. It's not, in fact, the sequel to Nvidia's first "Big Bang" event; it's named Big Bang 2 because it's a sequel to THE Big Bang.

Nvidia have designed a new GPU, leapfrogging AMD in process, with so many transistors packed so tightly into such a small package, running so hot and requiring so much power in one GPU, that its only equal in history is the original infinite-temperature, infinite-energy, infinite-density singularity just preceding the original Big Bang. They are calling this the "singularity GPU", hence the "Big Bang 2" project name.

Nvidia observers are uncertain, however, as to whether the new singularity GPU will turn out to be a real Big Bang 2 or just a black hole out of which money, heat and energy cannot escape.
July 28, 2008 12:01:19 PM

What does that say for the "theory" of Intelligent Design?
July 28, 2008 12:06:15 PM

Onus said:
What does that say for the "theory" of Intelligent Design?


I believe intelligence in design went out the window about the time accountants, committees and focus groups got in on the design process... since then, intelligent design has been purely theory.
July 28, 2008 1:15:16 PM

This is very much going to be a driver-related thing, not hardware. To be honest, nVidia very badly NEEDS a brand-spanking-new architecture now that RV770 finally has ATi beating the crap out of G80/G92/GT200 (which are all effectively identical). They aren't going to get out of their mess without it, and every day they go without one sees them lose some combination of profit margins and market share; all they can do is attempt to control the ratio of what they lose.

However, nVidia placed WAY too much of a bet on GT200, which they were confident would, thanks to having vastly more die area than any known integrated circuit in history, give them an unbeatable edge... They certainly didn't believe it would turn out to be an utter lemon in the face of RV770, or that such a chip could rival it in pure performance, the one strength they sacrificed everything else for, in spite of using only 44.4% as much silicon.

So no, don't expect nVidia to pull a rabbit out of its hat and somehow be competitive with RV770 anytime this year, as much as they'd like you to believe it so you'll put off that Radeon 4850/70 purchase... And I'm pretty sure that they're not sleeping all that well at night either, worrying about how whatever they have next is going to fare, given that it appears it may be greeted shortly out of the gate by RV870, which among other things will apparently shrink down from 55nm to either 40 or 45nm.
warezme said:
The Dual Core GPU... 'nuff said

Gah, to think that people still believe in this sort of thing...

To be honest, a modern GPU is already multi-core; RV770, for instance, can be said to have 160 cores: each block of 5 stream processors is wholly independent of the others and possesses a degree of its own memory in the form of registers, and the GPU handles hundreds of simultaneous threads, many times what an Intel quad-core can. Similarly, one could label GT200 a 240-core GPU, as likewise each of its stream processors can independently process its own data and instructions.

The whole "dual-core" thing comes from a misconception that GPUs are simply CPUs for graphics... Which is simply not true. They may be integrated circuits (ICs) designed for computing, but the similarities effectively end there.

GPUs are designed for math-intensive applications with massive arrays of data that become ever larger with each new game (up till Crysis, at least) and can be repeated indefinitely, because you've got a LOT of pixels to process per frame and a target of at least 60 frames per second. As a result, unlike CPUs, GPUs can be made parallel and will scale extremely well. Additionally, a GPU, unlike a CPU, has pretty much always come with its own integrated memory controller, as well as all the I/O functions it needs: the PCI/AGP/PCI-e/whatever interface as well as the output interfaces (RAMDAC, DVI/HDMI). As such, a GPU is more akin to a complete "system on a chip" than to a single component. And it would make utterly zero sense to clone most of those parts to make a "dual-core" chip; on dual-GPU boards they are retained because they are still used, just as they are on dual-CPU boards.
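If anyone wants to see the "a GPU is already many-core" point in actual numbers, here's a small CUDA sketch that just queries the card and prints how many thread contexts it can keep resident at once. It assumes the standard CUDA runtime API and a CUDA-capable card in slot 0; the exact figures depend on the GPU.

#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        fprintf(stderr, "No CUDA-capable device found\n");
        return 1;
    }

    // Rough count of thread contexts the card can keep resident at once.
    const int residentThreads = prop.multiProcessorCount * prop.maxThreadsPerMultiProcessor;

    printf("GPU:                          %s\n", prop.name);
    printf("Multiprocessors (SMs):        %d\n", prop.multiProcessorCount);
    printf("Warp size:                    %d\n", prop.warpSize);
    printf("Max resident threads per SM:  %d\n", prop.maxThreadsPerMultiProcessor);
    printf("Max resident threads, total:  %d\n", residentThreads);
    return 0;
}

On a GT200-class part (30 multiprocessors, up to 1024 resident threads each) that last number comes out around 30,000, versus the four hardware threads of an Intel quad-core, which is why calling a GPU "dual-core" rather misses the point.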
July 28, 2008 1:42:54 PM

Honestly, BIG BANG II will be:

A die shrink.
An SLI optimization.
Every G80 (or better) will run physics.

Of those 3, 2 have been sitting on the drivers team's desk for the last 6 months. They have competition now, so they are being launched. If my "speculation" is true, I would be even more pissed, because Nvidia just took a breather in the middle of the race. If so, they deserve to be outgunned.
July 28, 2008 10:42:22 PM

Dual core GPU... hehehe

I think this Big Bang II is more of a Big Marketing campaign to get people guessing so they won't buy the ATI cards quite as fast.

Supporting SLI on Intel chipsets would be an actual Big Bang of sorts, but I don't see NV going that path.

Edit: I love how "Call it quits" is winning...hehe