Does anyone know how to fragment a drive effectively?
I tried searching, but most topics that come up are about how to defragment. I've set up a new system with a big hard drive and not much on it, so this gives me the opportunity to test different defragmenters and file systems before the final install.
I created a new 60GB partition and set out to fragment it badly. Almost nothing I tried worked. I ran defrag until it reported zero percent fragmentation, copied in 2 GB of files from a backup, and then tried deleting random folders. Nothing. Then I tried replacing those folders, and that didn't help either. Actually, on one occasion I did get it to work by replacing a folder; defrag showed a red strip that proved it. But now I can't repeat that. I also created a FAT32 partition for fun, hoping things would fragment worse there, but it didn't seem to matter.
Finally I dumped 1.5GB of pictures onto the clean partition and ran a program that resized each file to a different size. That didn't help either, though I would have thought it would.
This is not really that important, but it would have been nice to do those tests. Does anyone know of a really effective way to fragment a drive? LOL
This might work. I would try it on a logical drive (not the one with your operating system).
Fill the disk with small files (i.e., equal to the cluster size, which used to be typically 64 KB, or slightly smaller), in track sequence. After nearly filling the disk, delete every 4th or 5th file, or for more fragmentation delete every 10th. Now load an equivalent number of bytes as large files (say around 500 KB to 1 MB). If you know a little basic programming, you could automate this very easily.
Example: 60 gig drive X. Write 900,000 64 KB files, then delete every 10th file. Then write 3,840 1.5 MB files back.
900,000 x 64 KB = 57.6 GB (I don't like to fill a disk 100%).
Deleting every 10th file frees 90,000 x 64 KB = 5,760 MB, and 5,760 MB / 1.5 MB = 3,840 files of 1.5 MB each.
PLEASE NOTE I used dec, not hex. This is just to give an idea.
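If anyone wants to automate retiredchief's recipe, here's a rough Python sketch. The directory path, file names, and default counts are just placeholders; scale them up for a real test partition, and note that whether the large files actually end up fragmented still depends on the file system's allocator, so treat this as an experiment, not a guarantee.

```python
import os

def fragment_dir(path, small_count=1000, small_size=64 * 1024,
                 delete_every=10, big_size=int(1.5 * 1024 * 1024)):
    """Fill `path` with cluster-sized files, delete every Nth one to
    punch holes in the allocation, then write large files that have
    to be placed into those scattered gaps."""
    os.makedirs(path, exist_ok=True)
    chunk = b"\0" * small_size
    # Step 1: fill with small, cluster-sized files.
    for i in range(small_count):
        with open(os.path.join(path, f"small_{i:06d}.bin"), "wb") as f:
            f.write(chunk)
    # Step 2: delete every Nth file, leaving small free gaps behind.
    deleted = 0
    for i in range(0, small_count, delete_every):
        os.remove(os.path.join(path, f"small_{i:06d}.bin"))
        deleted += 1
    # Step 3: write the freed bytes back as large files.
    freed = deleted * small_size
    big_count = freed // big_size
    for i in range(big_count):
        with open(os.path.join(path, f"big_{i:04d}.bin"), "wb") as f:
            f.write(b"\0" * big_size)
    return deleted, big_count
```

For a real run you'd point it at a scratch partition, e.g. `fragment_dir(r"X:\frag_test", small_count=900_000)`, then check the defrag report.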
A cluster is made up of some number of 512-byte sectors; at least that was the case back in the old FAT32 days, or was that FAT16?
I wrote a basic program that generated random bytes and saved them. I don't remember where I put it, I'll try and dig it up for you.
I have one here that can fill your memory with random variables if you want...
Thanks retiredchief. I do program, but I didn't have an easy way to delete every 10th file and didn't really feel like installing a compiler or language, etc. So I tried something similar. Instead of using 60GB, I made a 1GB partition, which is easier to handle. I then copied in 1GB of about 100K picture files. Then I ran an automatic JPG resizer to shrink every file to 25% of its original size, which left a few hundred megs free. Finally I copied in one huge file, and the drive was horribly fragmented, LOL. It was mostly in the red. So now I have a way to do it. I also found that if the drive is small, it's easier to see the red fragments. Thanks for the ideas.
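For anyone who'd rather script that than run a JPG resizer, here's a small Python sketch of the same idea: fill a directory, truncate every file to 25% of its size (freeing a gap after each one), then write one big file into the freed space. All the names and sizes here are made up for illustration, and how badly the big file actually fragments will vary by file system.

```python
import os

def shrink_and_refill(path, file_count=200, file_size=100 * 1024,
                      keep_fraction=0.25):
    """Mimic the resizer trick: fill a directory, shrink every file
    in place, then write one big file into the scattered free space."""
    os.makedirs(path, exist_ok=True)
    names = []
    for i in range(file_count):
        name = os.path.join(path, f"pic_{i:05d}.bin")
        with open(name, "wb") as f:
            f.write(b"\xff" * file_size)
        names.append(name)
    # Shrink each file in place, leaving a free gap after every file.
    kept = int(file_size * keep_fraction)
    for name in names:
        with open(name, "r+b") as f:
            f.truncate(kept)
    # One big file now has to be allocated across all those gaps.
    freed = (file_size - kept) * file_count
    with open(os.path.join(path, "big.bin"), "wb") as f:
        f.write(b"\0" * freed)
    return freed
```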
Very interesting idea daft. I'm sure with a little modification it could work. Also thanks frozenlead that might be interesting as well.
Kryzzay. I see what you are thinking, but what you don't know is I have about 2 or 3 things I'd like to test out. So it's not as simple as a disk defragment for me. I was trying to keep it short, but I guess it makes people wonder why, lol.
But in terms of defraggers: some people on the net are claiming certain programs are 5x to 10x faster than Windows defrag. When you're dealing with 1TB of info, that will save time. I kind of doubt I'd see a 10x improvement, but that was only one thing I wanted to test. I did like Auslogics. I think the author of Diskeeper is a control freak, and unless it's way faster, I won't use it.
But again, I set up a 250GB hard drive 4 years ago without much planning, and now I have a bit of a mess. This time I'm setting up about a terabyte, and I'd like to plan out all the partitions, file systems (FAT32/NTFS), and defraggers in a way that keeps it from turning into a mess.
Sure, everyone talks about fragmentation in FAT32 vs. NTFS. But people have a tendency to either exaggerate, or not really know how a file system responds, or just repeat what they read. I just wanted to do a few simple tests for fun and see what really happens. Most of my stuff will be NTFS, but a few small partitions will be FAT32. And yes, there are good reasons for this.
Being able to purposely fragment a partition gives me a feel for what can happen. Plus I've always wanted to see it in action. So I see it as spending a little time to plan my drives out correctly before installing them since I don't like to try to move partitions later. That said, I think I'm saving time and trouble in the future, not wasting it. But thanks for the input.
If you want, you can create huge files without using a compression program. Make an unnecessarily large txt file, go to the directory where the file is, make a copy of it under a different name, and keep both in the same directory.
Then open a command prompt and use:
copy /b file1.txt+file2.txt combined.txt
This combines the two files into one. Repeat, and depending on how large the original files were, it will grow by that much every time.
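The same doubling trick is easy to script if you'd rather not retype the copy command. A quick Python sketch (note it reads the whole file into memory each pass, so don't push it too far on a huge file):

```python
import os

def double_file(path, times):
    """Append the file to itself `times` times, doubling its size
    each pass (same idea as the copy /b concatenation trick)."""
    for _ in range(times):
        with open(path, "rb") as f:
            data = f.read()
        with open(path, "ab") as f:
            f.write(data)
    return os.path.getsize(path)
```

Starting from a 1 MB file, `double_file(path, 10)` leaves you with a 1 GB file.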
This program is under 100K and can defragment drives as you'd expect, but it has an added "shotgun" mode. Shotgun mode does what it implies: it takes all the files matching your file mask and shoots bits of them all over the drive.