CyberPower Unveils Fang Taipan Gaming Laptop
CyberPower released a new gaming laptop with the dramatic name "Fang Taipan".
The manufacturer claims it is the most "customizable gaming laptop" on the market. Buyers can configure the system with up to two Nvidia GPUs, a choice of Core i7 Ivy Bridge processors, and various memory options.
The casing is based on the Clevo P370EM whitebook design, which integrates a 17.3-inch 1920x1080-pixel display with an integrated 2.0 megapixel webcam, fingerprint sensor, and an illuminated keyboard. The base system with an i7-3630QM processor, 16 GB memory, a 60 GB SSD, a 1 TB HDD, a DVD writer, two Nvidia GTX 670M GPUs, and a Windows 8 pre-order is priced at $1,819. Step up to an i7-3940XM processor, 32 GB memory, two GTX 680M GPUs, two 512 GB SSDs, a Blu-ray disc burner, and a dedicated sound card and you are looking at spending about $5,000.
According to Cyberpower, the Taipan, which got its name from the venomous Australian Taipan snake, weighs about 8.6 lbs in a basic configuration with one GPU.
I'd rather get a single GTX 680M (which they offer), but how does going from two GTX 670Ms to a single GTX 680M justify a $250+ increase? Shouldn't the price go down a bit?
Seems kind of a rip-off at this price.
Totally agree, but you should also mention that you got this off their site; I was confused until I double-checked your info. Most PC manufacturers will try to rip you off on hardware if they have the ability to do so. Apple isn't the only computer company trying to rip people off (Apple just happens to be best at it).
There are a few games where there's no FPS improvement, or even an FPS drop. I don't remember the names exactly. I suppose you got lucky that your games' developers actually took their time to make sure their stuff is SLI/CF compatible.
It's because of the price of the 680M. Compare it to the 7970M, which is only slightly slower, and it's about $240 less. The 680M at the moment is definitely not worth its price because of the gap.
Microstuttering isn't caused by not pushing enough frames; it's caused by the timing of when frames are displayed. Even behemoths like dual 7970s or dual 680s can be affected by microstuttering. It isn't found in every SLI/CrossFire setup, but it exists nonetheless. Some people are also more sensitive to small frame-time variations than others.
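To make the frame-timing point concrete: two traces can have the identical average FPS while one feels smooth and the other stutters. A minimal sketch with made-up frame times (not real benchmark data; the alternating short/long pattern just mimics uneven SLI pacing):

```python
# Two hypothetical frame-time traces in milliseconds, both averaging ~60 FPS.
smooth = [16.7] * 12           # evenly paced frames
stutter = [8.0, 25.4] * 6      # alternating short/long frames, same total time

def avg_fps(frame_times_ms):
    """Average frames per second over the whole trace."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def pacing_jitter(frame_times_ms):
    """Mean absolute change between consecutive frame times.
    High jitter means visible microstutter even at a 'good' average FPS."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

print(avg_fps(smooth), pacing_jitter(smooth))    # same FPS, zero jitter
print(avg_fps(stutter), pacing_jitter(stutter))  # same FPS, large jitter
```

Both traces report the same average FPS, so an FPS counter alone won't show the problem; the jitter (frame-to-frame variation) is what the eye picks up.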
Using a 7970M or a 680M, the performance is similar to a desktop HD 7870.
They center the touchpad according to the location of the keyboard. So if the keyboard is shifted to the left (leaving room for the 10-key numeric pad, 17 keys total), then they center it under the keyboard.
Yeah, then you update your driver and it usually fixes the problem. In Borderlands 2, for instance, I was seeing 99% usage on GPU 1 and 5-10% on GPU 2. Nvidia released an update and voilà, an even split of usage! It's usually up to the graphics card companies to provide SLI support, and Nvidia is VERY good about it; that's one of the reasons I prefer them over AMD. I've never really noticed micro-stuttering, which is weird because my eyes are VERY sensitive to things like screen tearing without v-sync on.
Apparently the mobile GTX 680M is based on the desktop GTX 670 chip, while the mobile 670M is roughly equivalent to an underclocked desktop GT 640. I can kind of understand the price difference.
http://www.geforce.com/whats-new/articles/introducing-the-geforce-gtx-680m-mobile-gpu/
I really wish companies would stop rebranding chips as a new generation when they most certainly aren't!
Also, I had a look at the specs again:

GPU       CUDA cores  Core clock  Memory clock  Memory bus  Bandwidth
GTX 680M  1344        720 MHz     1800 MHz      256-bit     115.2 GB/s
GTX 675M  384         620 MHz     1500 MHz      256-bit     96 GB/s
GTX 670M  336         598 MHz     1500 MHz      192-bit     72 GB/s
GTX 660M  384         835 MHz     2000 MHz      128-bit     64 GB/s
Now, that is ridiculous separation. You lose nearly a THOUSAND CUDA cores in the step down from the 680M.
Also, how come the 660M seemingly has more power than the 670M and the 675M? Is that down to architecture differences?
Don't ya hate it when companies take so many liberties with the brand name and number?
The 675M and 670M are Fermi chips; they're just rebranded 580M and 570M parts at higher clock speeds.