
What's a Mobility X1400 equivalent to?

Last response: in Graphics & Displays, November 14, 2006 6:17:56 PM

What's a Mobility X1400 equivalent to? In desktop GPU levels, of course. (Like, is it equal to a GF 7300?)


November 14, 2006 6:20:36 PM

That would be my rough guess.
November 14, 2006 6:47:14 PM

Probably just like an X1300 SE.
November 15, 2006 4:17:43 PM

Sorry, I'm not as well versed in ATI tech. An X1300 SE is the lowest-level X1***, right? So it would be great for older games (Homeworld 2 mods, Dawn of War), OK for reasonably new games (Half-Life 2, Civ 4), poor for stuff like Prey, and a slideshow for Crysis...
November 15, 2006 5:11:43 PM

Pretty much. Half-Life 2 should work decently though, Prey maybe at terribly low settings, and Crysis... a slideshow for sure.
November 15, 2006 5:30:42 PM

Really? What about stuff like Civ 4? I know Supreme Commander would have to be at extra low, and Company of Heroes at, what, mid-low?

By the way, what are some ways I can reduce bottlenecks (without changing hardware; this is a laptop)? My dad's laptop has a 2.0 GHz Core 2 and 1 GB of RAM, but only an X1400. Obviously it's GPU-bottlenecked (and maybe a bit RAM-limited), and I obviously can't overclock. Is there any way I can increase performance at all?
November 15, 2006 6:01:30 PM

Nothing you can do, really. Just try everything and lower settings until it works.

The Core 2 will help a bit at lower resolutions, for sure.
November 16, 2006 3:51:14 AM

So where does the GPU start to take more of the bottleneck? I'll guess that at 1024x768 it's about even, at 800x600 it's the CPU, and at 1280x1024 it's the GPU. Correct?
November 16, 2006 12:49:45 PM

Both the GPU and CPU will bottleneck at all resolutions, but the GPU starts to become more important at 800x600 and definitely by 1024x768.

A lot of it has to do with the game, though; not all games are equal, and some need a lot of CPU.
November 16, 2006 9:12:54 PM

There is no clear point where the bottleneck shifts; it depends on your settings, the game, and even the kind of scene in the game. My favorite example is Oblivion: in a city, even a Core 2 Extreme might bottleneck, say, a Radeon X800 GTO; almost whatever you do, in Oblivion's cities your CPU determines your performance. On the flip side, in an FPS title like Prey, you can go with a weak CPU and still get good performance with a solid graphics card.
November 17, 2006 3:55:57 PM

Can the CPU compensate for a weak GPU by some margin? For instance, can the Core 2 take some of the graphics burden off the X1400? I know that before separate GPUs, CPUs did the graphics. Obviously the compensation would be minor, but will the Core 2 help the X1400 at all (in graphics)?
November 17, 2006 4:01:03 PM

The Core 2 won't take the graphics burden from the video card, but at lower resolutions, where the graphics card isn't struggling as much, the Core 2 will finish its own tasks faster.

Bottom line: faster CPU = faster FPS at lower resolutions.
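The intuition in this thread can be sketched as a toy model: a frame is ready only when both chips are done, so FPS is limited by whichever takes longer. All the per-frame costs below are invented for illustration; they are not X1400 or Core 2 measurements.

```python
# Toy bottleneck model: FPS = 1000 / max(cpu_ms, gpu_ms).
# CPU work per frame is roughly resolution-independent; GPU work
# scales with the number of pixels rendered. Numbers are made up.

CPU_MS = 15.0             # hypothetical CPU time per frame (ms)
GPU_MS_PER_MPIXEL = 25.0  # hypothetical GPU time per million pixels (ms)

def fps(width: int, height: int, cpu_ms: float = CPU_MS) -> float:
    """Frames per second under the max(CPU, GPU) frame-time model."""
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1_000_000
    return 1000.0 / max(cpu_ms, gpu_ms)

for w, h in [(800, 600), (1024, 768), (1280, 1024)]:
    print(f"{w}x{h}: {fps(w, h):.1f} fps")
```

With these made-up numbers the crossover lands between 800x600 (CPU-bound, so a faster CPU raises FPS) and 1024x768 (GPU-bound, so the CPU no longer matters), which matches the rule of thumb above.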
November 17, 2006 4:45:04 PM

I have a Dell Inspiron 1505 with a Mobility X1400 and honestly, I believe it was NOT money well spent... especially when I got it (back then it was a $149 upgrade). The variety of games it can play is way too limited. Sims 2? It runs OK, I guess. Star Wars Galaxies? Barely acceptable. I haven't dared attempt anything too modern on it (BF2, Oblivion, etc.) and wish I'd just kept my money and bought a Nintendo DS Lite instead of opting for that upgrade. It's a 4-pixel-pipe card... that puts it WELL behind an Nvidia 6600 GT (8 pipes). Yes, everyone here rags on the Intel 950 video chip, but I think it would have worked fine for me and saved me some money.
November 18, 2006 7:04:08 AM

Yeah, I was worried about that... My dad *needs* a light laptop (he likes the IBM Lenovo ThinkPads)... If I could have convinced him to get a normal-sized laptop I could have gotten a much higher-level card... Still, ANYTHING is better than the old Radeon 7500 his previous laptop used...
November 18, 2006 2:27:21 PM

As long as dad isn't into gaming, he won't be disappointed with it... and I'm sure it'll handle Vista's fancy new interface just fine.
November 18, 2006 2:49:14 PM

Yeah, that's true. It just stinks that it's the same laptop I use for gaming on trips...

So it can run Vista with all the goodies? Even though it's a low-level DX9 card? Or will there be one or two DX10-only features of Vista?
November 18, 2006 3:03:24 PM

X1300 with HyperMemory.
November 18, 2006 3:03:51 PM

From what I see on the ATI site, the X1400 is actually an X1300 core (I concluded this from the transistor count).

X1400 specs

X1300 specs

Unfortunately, there is no information on the clock frequencies (for either).

I was checking with a friend for laptops with something other than Intel graphics, but the price difference from the GMA 950 to the X1400 is too big, IMO. And I don't think the X1400 can handle recent games, so it's really no different from the Intel chip.
November 18, 2006 3:37:21 PM

My X1400 is clocked at 432 MHz. The 1400 has 128 MB of dedicated memory and will utilize an additional 128 MB of shared RAM for a total of 256 MB.

It is a gaming weakling, but with people saying the GMA 950 does OK with Vista... that tells me this one will do just fine.
November 18, 2006 7:44:27 PM

Just out of curiosity, how would it fare against an AGP GeForce 6600 GT? That's my desktop GPU...
November 18, 2006 7:49:14 PM

A 4-pipe card (X1400) vs. an 8-pipe card (6600 GT)?

It would be a massacre.