Solved

What distro to use to resurrect an old Thunderbird rig?

Last response: in Linux/Free BSD
July 29, 2009 7:18:27 AM

I have an Athlon T-bird rig I'm trying to resurrect from being terribly slow to usable again. It's been running Windows XP for years with 448MB of RAM but one of the RAM slots has kicked the bucket so I'm down to 384MB now. I have run XP on it with 256MB but I'm sure you can imagine what fun that would be especially after using the same installation for 3 years straight. But I digress.

I am wondering if the newer versions of the common distributions would be too hefty for 384MB RAM and an 850MHz T-bird (it's degrading so I've toned down the overclock). I wouldn't be running KDE because it will probably suck up too much of the precious little RAM I have.

Full specs:

850MHz Athlon T-bird.
384MB PC133
9800 Pro 128MB (overkill to the max but I had a spare card lying around and it was better than the old GF2 MX400 that the PC already had :p )
Soundblaster Live! 16-bit. I think it's the platinum version.
ASUS A7V (VIA KT133 chipset)
160GB 7200RPM Samsung PATA drive

Best solution

July 29, 2009 8:45:15 AM

Try Xubuntu and Fedora with XFCE and see how well it runs :) 

If Xubuntu and Fedora XFCE are too slow try Slax.
July 29, 2009 2:33:07 PM

Will do :)  Never used Slax though.
July 29, 2009 3:56:04 PM

Ubuntu 9.04 and Fedora 11 with Gnome run ok on a trashy Celeron 900 with 512MB RAM booting from SD. This is not ideal but it runs, so your 850MHz Athlon T-bird should run ok on XFCE which is faster and less demanding than Gnome or KDE.

A netbook spin of Fedora or Ubuntu should run ok too.

Debian is usually good for older machines but it can be a pain to set up.

Good luck :) 
July 29, 2009 4:08:43 PM

Just to throw a spanner into the works, how about considering FreeBSD running the XFCE4 desktop. That should run pretty sweetly on that setup.
July 29, 2009 8:28:59 PM

Xubuntu served me well on an older laptop, but then again it was a P4 @ 2GHz, so I don't know how well Xubuntu will scale down from there. Also, you could just ditch the desktop environments altogether and go with a lightweight window manager like one of the *boxes (e.g. Openbox, Blackbox, Fluxbox) or FVWM. If you want to tear out even more bloat, you should go with Debian, or if you are really adventurous (or you have a computer cluster at your disposal for compiling), you could go with Gentoo. I'd not recommend Gentoo on a machine that slow, though, because compiling will take _FOREVER_. Arch Linux might be a good way to get the speed and flexibility of Gentoo without all the compiling.

--Zorak
July 30, 2009 12:23:53 AM

Ok no more suggestions of compiling Gentoo, I'm not really much good with Linux as is :p  I wasn't even able to get Arch set up properly in a VM lol. I've never tried any *BSD OSs either. The system will mostly be used for running OO.o so it doesn't need to be super snappy but anything should be better than running XP SP2 and waiting for 5 minutes before the system is barely usable.

I'm also trying Linux Mint because it doesn't really require any setup beyond disabling the Terminal Fortune quotes.

EDIT: Forgot to mention in the OP that it will be using a 160GB 7200RPM Samsung PATA drive
July 30, 2009 9:17:38 AM

In that case I'd say http://www.xubuntu.org/ is probably your best bet. It's the same setup as Ubuntu so a snap to get up and running. I do like ijack's suggestion though.
July 30, 2009 5:09:03 PM

There are a few lightweight versions of PCLinuxOS too. It usually has a nice setup straight from install. I've dabbled with other distros over the years and this is the one I've always seemed happiest with.
July 31, 2009 2:13:58 AM

randomizer said:
Ok no more suggestions of compiling Gentoo, I'm not really much good with Linux as is :p  I wasn't even able to get Arch set up properly in a VM lol. I've never tried any *BSD OSs either. The system will mostly be used for running OO.o so it doesn't need to be super snappy but anything should be better than running XP SP2 and waiting for 5 minutes before the system is barely usable.

I'm also trying Linux Mint because it doesn't really require any setup beyond disabling the Terminal Fortune quotes.

EDIT: Forgot to mention in the OP that it will be using a 160GB 7200RPM Samsung PATA drive


Heh, compiling much on an old 850 MHz T-bird would be a pretty darn slow process unless you use distcc to let your other machines help along with the process. I am currently running Gentoo on my X2 4200+ desktop and it takes a long time to compile things; I couldn't imagine how slow an 850 T-bird would be. I personally like Debian (running Gentoo on my desktop for a specific reason) and Debian Lenny XFCE would run very nicely on your machine.
July 31, 2009 2:21:53 AM

Well this PC has no Ethernet capabilities and only USB1.1 for an 802.11g NIC (I don't have a PCI card for it) so I can't imagine that I'd get enough network throughput to help with the compiling that much. :( 
July 31, 2009 5:22:11 PM

You could compile on your good machine and copy it over, just a question of setting the right flags and some pen drive action.
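On Gentoo, this "compile on the fast box, carry it over" idea is supported directly through binary packages, so it's less about compiler flags and more about emerge options. A rough sketch, assuming both machines run Gentoo with compatible CHOST/CFLAGS settings (the package name and mount point are placeholders):

```shell
# On the fast machine: build a binary .tbz2 package without installing it
emerge --buildpkgonly app-office/openoffice

# Binary packages land under $PKGDIR (default /usr/portage/packages);
# copy them to the pen drive
cp /usr/portage/packages/All/*.tbz2 /mnt/pendrive/

# On the slow machine: point PKGDIR at the pen drive and install
# from the pre-built binaries only (no compiling)
PKGDIR=/mnt/pendrive emerge --usepkgonly app-office/openoffice
```

The catch is that the binaries must match the target's architecture and USE flags, so on mismatched machines you'd want a chroot or cross-compile environment on the fast box.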
August 1, 2009 1:57:07 AM

Googling time I guess :p 
August 1, 2009 2:32:15 AM

Fedora XFCE and Xubuntu should take about 20-60 minutes to install.

If you have to compile everything it could take 3 weeks ;) 

:) 
August 1, 2009 5:55:59 AM

randomizer said:
Well this PC has no Ethernet capabilities and only USB1.1 for an 802.11g NIC (I don't have a PCI card for it) so I can't imagine that I'd get enough network throughput to help with the compiling that much. :( 


Want my NIC?

I'll give it for free if you want it, just gotta find it first.
August 1, 2009 5:57:32 AM

I actually got rid of an old 10/100 card recently, didn't think I'd need it now that everything is integrated 100/1000. I was wrong... :( 
August 1, 2009 6:01:02 AM

I have a gigabit Intel NIC but that's for me /evil
August 1, 2009 6:46:52 PM

Quote:
Fedora XFCE and Xubuntu should take about 20-60min to install.

If you have to compile everything it could take 3 weeks ;) 

:) 


Nah. He could probably do it in anywhere between a week and 4 days ;)  Really though, when I was setting up Gentoo, I'd say the reason it took me as long as it did was that I'd never configured a kernel before, so I had to look up a boatload of things about my hardware and I read up on all of the different bells and whistles that you can compile in. The actual compiling time in total was probably about half a day, but then again I have a quad core machine and I was able to run parallel compile jobs (make -j) which helps things along a bit. But yeah, if he had a compile farm at his disposal things could go MUCH faster :D 
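For reference, the parallel-build trick above is done through make's jobs flag rather than GCC itself, and a common rule of thumb is cores + 1 jobs. A sketch (Linux-specific, since it reads /proc/cpuinfo):

```shell
# Count logical CPUs the old-fashioned way, then add one
CORES=$(grep -c ^processor /proc/cpuinfo)

# On Gentoo this line goes in /etc/make.conf so emerge picks it up
echo "MAKEOPTS=\"-j$((CORES + 1))\""

# For a one-off build you'd pass it to make directly, e.g.:
#   make -j$((CORES + 1))
```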

Anyways, it sounds like he has already made his decision for the most part. I'd say Xubuntu or Fluxbuntu will probably serve that machine well, and if it doesn't, he can go to something smaller like DSL or Puppy Linux and add extras from there.

--Zorak
August 1, 2009 11:33:52 PM

It depends on how many packages you're compiling and your hardware :) 

I was guessing / kidding / exaggerating a bit ( thus the ;)  ), but it's all very relative.

If you're only compiling the kernel and a couple of other major components then it will probably take a few hours on modern hardware.

If you try to compile 10,000 packages it will take a long while, even if you have a small compile farm.

You might think 10,000 is a large number but in fact it's not that large, Fedora offers about 8,000 pre-compiled packages and Debian about 25,000 and several other distros offer 10,000 or more pre-compiled packages. If you try to compile that many packages from source it will take quite some time, even if you have a lot of hardware at your disposal.

So it depends on how low-level you want to get with Gentoo or LFS and other factors.

:) 
August 2, 2009 1:49:58 AM

Well it would be a "fun" project in my spare time. But like Zorak, I've never configured a kernel so I'd spend a lot longer researching how to do it than actually doing it. Hopefully the compilers can use as many threads as possible :D 
August 2, 2009 2:06:06 AM

Yeah, it would be fun to do it some day and it would be a good educational experience for sure :)  but this isn't something most users should attempt.

If enough people decided to compile Linux from source they'd crash the power grid in their respective countries.

gcc will use 100% of your CPU cores for hours on end while doing this. The thermal stress alone could kill your computer if it's not up to it ( you absolutely have to have good airflow and AC for proper cooling ).
August 2, 2009 2:22:45 AM

Good airflow? Nope. A/C? Nope. Looks like I'm out of luck :p  Well I am not one for being afraid of a bit of CPU heat, given I took my E6600 up to 70C idle temps once :lol:  I'll undervolt my i7 920 and see if it can handle it, I know it exceeds 80C while encoding video with relatively cool ambient temps of under 24C.
August 2, 2009 2:53:18 AM

gcc is so good at stressing the CPU and every other component in your computer that many computer companies and hardcore geeks use it to burn in their computers to test them for stability ( similar to the way prime95 works but much worse ).

If you push the CPU to 100% utilization and all your other parts for a couple of weeks straight it's very likely that any subpar components in your system will fail, possibly catastrophically.

gcc doesn't just kill the CPU, it pounds your memory, chipset, hard drive and everything else into the ground and just about every individual electronic component on those devices as well.

If your voltage regulators are not properly cooled their built-in thermal shutdown protection circuitry might kick in and your computer could power down or catch fire and your caps could explode, especially if they are bootleg or low quality caps.

Really terrible things can happen if you push the components to their temperature limit for a certain period of time.
August 2, 2009 3:03:56 AM

That's why things have warranties. If a component can't handle consistently high loads then it isn't up to standard. :D  Although I don't really want to pound my hard drive; being mechanical, even if it's functioning fine the wear and tear is just being accelerated by overuse.

I really need a new case with some decent fans. The only 120mm fan I have is my PSU fan, and I only have a 92mm and an 80mm case fan, neither of which pushes a great deal of air.
August 2, 2009 3:06:09 AM

randomizer said:
Good airflow? Nope. A/C? Nope. Looks like I'm out of luck :p  Well I am not one for being afraid of a bit of CPU heat, given I took my E6600 up to 70C idle temps once :lol:  I'll undervolt my i7 920 and see if it can handle it, I know it exceeds 80C while encoding video with relatively cool ambient temps of under 24C.


Dat's why you buy CPUs with a low TDP value.

August 2, 2009 3:08:44 AM

TDP != power consumption.
August 2, 2009 3:39:23 AM

Yeah :) 

And TDP is often manipulated by the marketing department.

Most consumer PCs are not built to run at 100% utilization 24/7/365; it sucks, but it's sometimes true.

A well built server on the other hand is supposed to be engineered to run at 100% utilization 24/7/365 which is why nice servers usually have RAM with error correction, tons of fans that are louder than the space shuttle during launch and high quality components certified to run at higher temperatures and loads.

:) 
August 2, 2009 4:14:14 AM

linux_0 said:
Yeah :) 

And TDP is often manipulated by the marketing department.



I'm just saying you'd have less heat to deal with.
August 2, 2009 6:31:17 AM

linux_0 said:
gcc will use 100% of your CPU cores for hours on end while doing this. The thermal stress alone could kill your computer if it's not up to it ( you absolutely have to have good airflow and AC for proper cooling ).
A little over-dramatic I think. Gentoo users regularly compile the kernel and/or large parts of user space without problem. And FreeBSD users will also regularly compile the kernel and most of user space in one go. I've never had any problems doing this on any computer. I must have run Gentoo and/or FreeBSD on at least 20 computers by now. Even an old PPC mac mini doesn't balk at installing Gentoo from scratch.

As an aside, I'm quite surprised to hear Linux enthusiasts say they've never configured (so presumably never compiled) the kernel.

PS. It'll take maybe a day to compile all of Gentoo on the setup you describe, and the CPU isn't running at 100% all of this time. You get bursts of activity interspersed with quite long periods of downloading and running ./configure scripts. No way is this measured in weeks.
August 2, 2009 7:05:11 AM

How do you compile?
August 2, 2009 7:07:38 AM

Less heat is always a good thing :) 

While the CPU gets most of the attention, every single electronic component on the motherboard and inside the computer has its own environmental limits.

If your CPU is running at 68F or 20C that's great but if your RAM is running at 176F or 80C you'll be in trouble.

Some components cannot exceed 185F or 85C, although lower temperatures can still cause damage over time.

:) 
August 2, 2009 7:13:45 AM

@ijack I've seen sparks fly before, literally! :) 

@amdfangirl

make menuconfig
make
make modules_install


:) 
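For completeness, those three commands leave out actually installing the kernel image itself; the typical full 2.6-era sequence (run from the kernel source tree, standard paths assumed) looks something like:

```shell
make menuconfig        # choose drivers; writes .config
make                   # build bzImage and the modules
make modules_install   # install modules under /lib/modules/<version>/
make install           # on most distros, copies the image to /boot
                       # and runs the installkernel/bootloader hook
```

Some people skip `make install` and copy `arch/x86/boot/bzImage` to /boot by hand, then edit the bootloader config themselves.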
August 2, 2009 7:15:42 AM

Is it auto-compile?
August 2, 2009 7:25:48 AM

@ijack

I wasn't talking about Gentoo or LFS specifically, I was saying if you try to compile the 10,000 packages that the various distributions usually offer pre-compiled on an older system such as the one randomizer has, it would take a very long time.

The 2.6.x kernel alone can take 20-40 minutes to compile depending on your configuration options and your hardware.

Apache can take 20 minutes to several hours depending on what you're doing to it and how many times you mess it up ;) 
August 2, 2009 7:35:15 AM

amdfangirl said:
Is it auto-compile?




Well... make does all the hard work for you, so you don't have to gcc each file by hand :) 

Without make, it would be a nightmare to compile.

By my count 2.6.30.4 has 12073 .c files and 10598 .h files.

:) 
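That kind of count is easy to reproduce with find; a sketch, with the kernel tree assumed to be unpacked at a hypothetical path:

```shell
# Count C sources and headers in a source tree (path is a placeholder)
SRC=/usr/src/linux-2.6.30.4
find "$SRC" -name '*.c' 2>/dev/null | wc -l   # number of .c files
find "$SRC" -name '*.h' 2>/dev/null | wc -l   # number of .h files
```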
August 2, 2009 7:39:09 AM

amdfangirl said:
How do you compile?
If that was directed at me, just the normal means for the distros in question. "emerge --update" for Gentoo, "make buildworld" for FreeBSD.
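For anyone curious, the FreeBSD source rebuild is a fixed sequence documented in the Handbook; roughly (run from /usr/src, GENERIC kernel config assumed, with the mergemaster and backup steps omitted for brevity):

```shell
make buildworld                      # rebuild the userland
make buildkernel KERNCONF=GENERIC    # rebuild the kernel
make installkernel KERNCONF=GENERIC  # install the new kernel
# reboot into the new kernel (single-user mode recommended), then:
make installworld                    # install the new userland
```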
August 2, 2009 7:44:07 AM

Ijack said:
As an aside, I'm quite surprised to hear Linux enthusiasts say they've never configured (so presumably never compiled) the kernel.

I'm not a Linux enthusiast, I don't even run Linux on my current system (I had a basic setup on my old rig). I haven't found the initiative to dive into the setup of it yet. The problem for me is finding a need or use for it other than just to play around with.

linux_0 said:
Without make, it would be a nightmare to compile.

By my count 2.6.30.4 has 12073 .c files and 10598 .h files.


No kidding!
August 2, 2009 7:44:14 AM

linux_0 said:
@ijackThe 2.6.x kernel alone can take 20-40 minutes to compile depending on your configuration options and your hardware.
That would be on a severely limited machine, or a severely misconfigured kernel. The main thing, I find, that takes time when compiling the kernel is all the modules. But the point of a custom kernel is that you don't enable modules for 123 Ethernet cards, just the one that you actually have. On a modern PC the kernel takes a few minutes to compile; on my PPC Mac Mini (the slowest machine I use) maybe 10 or 15 minutes. Most drivers I'll compile into the kernel, with just a handful of modules.
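The practical way to know which drivers you can safely drop is to inventory the hardware first; a sketch (assumes the pciutils package is installed):

```shell
# List every PCI device the kernel will need a driver for
lspci

# Narrow it down, e.g. to just the network card, before visiting menuconfig
lspci | grep -i ethernet
```

Armed with that list, you enable only the matching drivers in `make menuconfig` and answer "n" to everything else.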
August 2, 2009 7:46:49 AM

linux_0 said:
By my count 2.6.30.4 has 12073 .c files and 10598 .h files.
But, realistically, you don't even use half of those files. Most of them are for hardware, or filesystems, or networking options, that you won't be creating modules for. But you do have to configure the kernel to remove all those unwanted drivers.
August 2, 2009 7:48:52 AM

randomizer said:
I'm not a Linux enthusiast, I don't even run Linux on my current system (I had a basic setup on my old rig). I haven't found the initiative to dive into the setup of it yet. The problem for me is finding a need or use for it other than just to play around with.



No kidding!
Fair enough. In that case you certainly won't want to be looking at Gentoo!
August 2, 2009 7:50:18 AM

Out of interest, what size could you cut a typical Ubuntu installation down to by compiling it yourself? I think Ubuntu is about 2-4GB or something at the moment.
August 2, 2009 7:52:18 AM

Ijack said:
Fair enough. In that case you certainly won't want to be looking at Gentoo!

The only Gentoo I've used was a pre-compiled version while resurrecting a PPC PowerBook G4, which subsequently had its HDD fail.
August 2, 2009 7:57:00 AM

Well that's what make menuconfig is for :) 

A n00b might be lost inside make menuconfig or make xconfig for several hours.

A pro could do it in 30 seconds or a few minutes, or in no time at all if you already have a .config :D 

If you compile a generic kernel the way Red Hat and Ubuntu might, it could take a long time because they do include 500 Ethernet cards and 2,000 doohickey drivers.

If you compile an optimized kernel for your system with only a couple of drivers included it might get done in 8 minutes.
August 2, 2009 8:08:00 AM

randomizer said:
Out of interest, what size could you cut a typical Ubuntu installation down to by compiling it yourself? I think Ubuntu is about 2-4GB or something at the moment.



Good question, who knows :) 

Ubuntu desktop is about 2-4GB.

Ubuntu Server is less, about 1GB IIRC, maybe less.

Debian can be even less than Ubuntu Server.

About 15 years ago you could get a barebones Linux distribution installed if you had about 100MB of disk space ( not RAM, disk space ).

With uClinux you can get it well below 100MB ( 8-32MB IIRC ) but you can't do much with it; all you get is BusyBox and a couple of tools.
August 2, 2009 8:12:34 AM

I should try it :D  Although working out which drivers are which and which drivers I need would probably take me years :lol: 
August 2, 2009 8:24:56 AM

A default Ubuntu desktop install will fit on a 4GB SD card or SSD.

It will not fit onto a 2GB SD card, USB flash drive or SSD. You'd have to whip out a turbocharged swiss army chainsaw to get it to fit ;) 

I haven't tried the alternate desktop, but Ubuntu Server should fit on a 2GB device.
August 2, 2009 8:33:03 AM

randomizer said:
I should try it :D  Although working out which drivers are which and which drivers I need would probably take me years :lol: 




:lol: 

I'm sure we could arrange to have a .config and a pci network card smuggled over to you ;) 

Let's just hope Australian customs doesn't read this forum.

:) 
August 2, 2009 8:34:25 AM

What is the earliest kernel version which supports all 8 threads of the i7? Alongside possibly compiling a distro on it, I want to compare the time to encode some video with HandBrake on Windows versus Linux. I don't know if I should expect any noticeable difference (it's still x264), but perhaps the file system difference might show some improvement/detriment.
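One way to make that comparison reasonably fair is to use the command-line front end on both operating systems and time the identical command; a sketch with hypothetical file names (HandBrakeCLI ships for both Windows and Linux):

```shell
# Time the same x264 encode of the same source file on each OS
time HandBrakeCLI -i source.mkv -o encoded.mp4 -e x264 -b 1500
```

Keeping the encoder, preset, and source identical isolates the OS, scheduler, and filesystem as the only variables.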
August 2, 2009 8:42:14 AM

I don't know about the i7 but on the P4 Hyper-Threading could actually cost you about 5-10% in lost performance.

There was also talk that Hyper-Threading created some major security issues at the hardware level.