Archived from groups: microsoft.public.windowsxp.basics
"Paul Power" <Paul Power@discussions.microsoft.com>
wrote in news:A7BF59B8-321B-4037-AA5A-FD28698FAA65@microsoft.com:
> Having a rip roaring discussion about how often one should format
> their hard drive. One person suggests doing it every 3-4 months.
> Another says that MS recommends every 2 years (I would like to see
> documentation on that). Anyone have an opinion? Or solid
> documentation?
It is not reformatting that is important; it is rewriting the data that
matters. I'm no physics PhD, so the following is my layman's view of
what happens. Hard drives are magnetic recording devices. They use
magnetic dipoles (or is it magnetic domains?) to record ones and zeros.
There is magnetic stress when neighboring dipoles are aligned
differently from each other, and they tend to realign over time (i.e.,
there is torque pulling them back toward their most relaxed
arrangement). This makes the data "soft". Magnetic retentivity refers
to the residual magnetic field left behind after erasing the media, but
it also represents the media's ability to retain a magnetic field, and
it wanes over time, so you end up with soft areas on your hard drive
(i.e., your data fades away if it never gets exercised by rewriting it).
Reading the data is nondestructive (i.e., reading the data doesn't
change it, so nothing forces a rewrite).
During a read, the magnetic flux generates an electrical current in the
read head (it detects flux reversals), or the resistance of the read
head changes due to the magnetic flux (i.e., a magnetoresistive read
head), but the data is never changed and then restored, so it is never
refreshed. If reading the bits were destructive, requiring them to be
rewritten, then they would always get refreshed simply by being read.
I haven't seen anything yet that says reading the data is destructive
and forces a rewrite of it (that would really slow down the read
operation, since each read would have to be followed by a write). So
just refresh the data yourself (i.e., force a rewrite of it).
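The "force a rewrite yourself" idea can be sketched at the file level. This is a hypothetical illustration only: modern filesystems and drive firmware may remap or cache sectors, so a logical rewrite is not guaranteed to hit the same physical spot on the platter.

```python
import os

def refresh_file(path, chunk_size=1024 * 1024):
    """Read each chunk of a file and write the same bytes back in place,
    forcing the drive to re-record the magnetic domains for that data."""
    with open(path, "r+b") as f:
        offset = 0
        while True:
            f.seek(offset)
            chunk = f.read(chunk_size)
            if not chunk:
                break
            f.seek(offset)
            f.write(chunk)          # rewrite the identical bytes in place
            offset += len(chunk)
        f.flush()
        os.fsync(f.fileno())        # push the rewrite past OS write caches
```

The fsync matters: without it the OS may satisfy the "rewrite" entirely in the buffer cache and never touch the media at all.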
Typically, files change position often enough to leave varying open
spots (unused sectors) on the disk, so defragmenting all the partitions
is often sufficient to force a refresh of the data (because it got
moved). A single defrag won't move all the data, but over years the
files probably move around enough through use and defragging that those
that didn't get moved before do get moved later. The other method is to
use a refresh utility that reads the data, copies it to a holding area
(one that was first verified okay for use, and that verifies any data
written to it to ensure readability), wipes the old spot with several
different bit patterns, and then rewrites (moves) the data back.
Alternatively, if sufficient free space is available, such a utility
can do the bit-pattern overwrites in the unused area, move the data
over there, and verify the new copy before erasing it from the old spot
and updating the file table (akin to defragging, but you are also
exercising and verifying the new spot to which the data gets moved).
Even running 'chkdsk /r' will probably be sufficient in most cases.
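The refresh-utility procedure described above can be sketched against an in-memory "disk". This is a hypothetical simplification: a real utility like this works on raw sectors below the filesystem, not on a Python bytearray.

```python
PATTERNS = (b"\x00", b"\xff", b"\x55", b"\xaa")  # alternating-bit exercise patterns

def refresh_region(disk: bytearray, start: int, length: int) -> None:
    """Refresh disk[start:start+length]: copy the data to a holding area,
    verify the copy, exercise the old spot with several bit patterns,
    then write (and verify) the original data back."""
    holding = bytes(disk[start:start + length])          # copy to holding area
    assert holding == bytes(disk[start:start + length])  # verify the copy reads back
    for p in PATTERNS:                                   # wipe with various bit patterns
        disk[start:start + length] = p * length
        assert bytes(disk[start:start + length]) == p * length
    disk[start:start + length] = holding                 # move the data back
    assert bytes(disk[start:start + length]) == holding  # verify the restored data
```

The 0x55/0xAA patterns flip every bit cell both ways, which is why utilities use them to exercise marginal media rather than just writing zeros.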
The problem doesn't crop up much anymore for hard drives unless you are
archiving them as backups for long periods of time (but then, relying
on a backup *device* rather than just the media means you also risk the
mechanicals of the device failing due to physical or electrical shock,
or just plain component failure, whereas with removable media you can
just replace the drive and continue using the media). I've mostly seen
the problem with floppies; the solution is to copy the files into a
temp directory on the hard drive, format the floppy, and copy the files
back, which effectively refreshes the bits on the floppy (although the
FORMAT program does only minimal surface testing and no alignment
checking). You could defragment the floppy to effect a refresh (but
only for those sectors that actually got moved), but that would take
longer than just moving the files, formatting, and moving the files
back, plus you may not have sufficient free space to perform the defrag.
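That file-level refresh of a removable volume (copy off, format, copy back) can be sketched like this. `format_volume` is a hypothetical stand-in for the real FORMAT step, simulated here by simply emptying a directory:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def format_volume(volume: Path) -> None:
    """Hypothetical stand-in for FORMAT: wipe the volume's contents."""
    shutil.rmtree(volume)
    volume.mkdir()

def refresh_volume(volume: Path) -> None:
    """Copy all files to a temp directory, reformat the volume, and copy
    the files back, so every bit on the media gets rewritten."""
    checksums = {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
                 for p in volume.iterdir() if p.is_file()}
    with tempfile.TemporaryDirectory() as tmp:
        staging = Path(tmp)
        for p in volume.iterdir():
            shutil.copy2(p, staging / p.name)   # copy files off the volume
        format_volume(volume)                   # rewrite the media's structures
        for p in staging.iterdir():
            shutil.copy2(p, volume / p.name)    # copy the files back
    for p in volume.iterdir():                  # verify nothing was corrupted
        assert hashlib.sha256(p.read_bytes()).hexdigest() == checksums[p.name]
```

The checksum pass at the end is the part the original FORMAT-and-copy trick lacks: it confirms the data actually survived the round trip.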
The drive hardware (the platters) may itself develop soft spots, which
might be discovered by using Scandisk with its surface-check option or
by running 'chkdsk /r', but remember that these utilities allow for
recovery through retries. A spot might be tested a maximum of 15 times
before it is declared bad and marked unusable, but what if it passes on
the 14th retry? Do you really want to use media that is near failure
but manages to just pass as okay? Other, better third-party utilities
will probably give much more control and detail regarding soft spots
and the quality and current state of your media. Data retention wanes
over time and can go soft enough to cause problems, especially at
elevated temperatures, so you'll want to refresh your bits. Defragging
might be enough, but only if fragmentation is allowed to become
excessive so that lots of sectors get moved on each defrag; and who
wants to let their mass storage subsystem get progressively slower from
increasing fragmentation just to up the chance that more files get
moved around? A refresh utility will exercise the data in place even if
the partition has little or no fragmentation. A reformat is not
necessary unless you feel that repeated defragging won't exercise ALL
files and you don't want to use a disk refresh utility.
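The retry concern (a sector "passing" only on the 14th try) can be illustrated with a toy read loop. Everything here is hypothetical: `read_sector` stands in for a real raw-sector read, and the threshold values are illustrative, not anything a real utility documents.

```python
MAX_RETRIES = 15     # give up and mark the sector bad after this many attempts
MARGINAL_AT = 5      # needing this many tries means the spot is near failure

def classify_sector(read_sector, sector: int) -> str:
    """Retry a sector read up to MAX_RETRIES times and classify the result:
    'good' (read cleanly), 'marginal' (passed, but only after many retries,
    i.e., media you may not want to trust), or 'bad' (never read back)."""
    for attempt in range(1, MAX_RETRIES + 1):
        if read_sector(sector):
            return "marginal" if attempt >= MARGINAL_AT else "good"
    return "bad"

# A toy flaky read: sector 7 only succeeds on its 14th attempt.
attempts = {}
def flaky_read(sector):
    attempts[sector] = attempts.get(sector, 0) + 1
    return attempts[sector] >= 14 if sector == 7 else True
```

The point of the three-way classification is exactly the objection in the text: a plain pass/fail check would report sector 7 as "okay" and keep using it.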
Be aware that defragging will never touch the MBR (master boot record).
Typically the only part of the MBR that ever changes is the partition
table, and then only *if* you change or move partitions (and even then
only a portion of the partition table entry for those partitions gets
changed, unless you also change the partition type or the other
attributes recorded in the entry). The bootstrap program (the first 440
bytes) and the disk signature bytes of the MBR don't get touched by
whatever maintenance you do within the partitions, because the MBR is
not in any partition. Defrag also won't touch the boot sector (first
sector) of the partition(s). Portions of large files may never get
moved by defrag, so those areas of the disk never get refreshed. If
this is something you are really concerned about, get a utility that
refreshes the disk. I haven't bothered with one for around 6 or 8 years
(but then, I'm not using hard drives to archive backups, and the
desktops usually get upgraded every 3 years, or less, with bigger
drives). Overall, hard drives will fail (crash, get electrically or
physically shocked, seize up), so they usually don't survive long
enough for problems from soft data or media to crop up, or they get
upgraded (by replacement) long before then. However, I have had
floppies get weak after about 3 years, and a "refresh" gets them
working again.
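For reference, the MBR layout discussed above can be shown by parsing a synthetic 512-byte sector. The offsets are the standard ones for an MBR with an NT disk signature: boot code in bytes 0-439, the 4-byte disk signature at offset 440, four 16-byte partition table entries starting at 446, and the 0x55AA boot signature at 510.

```python
import struct

def parse_mbr(sector: bytes) -> dict:
    """Split a 512-byte MBR into its fixed-offset pieces."""
    assert len(sector) == 512
    return {
        "bootstrap": sector[0:440],                       # boot code
        "disk_signature": struct.unpack_from("<I", sector, 440)[0],
        "partition_table": [sector[446 + i * 16: 446 + (i + 1) * 16]
                            for i in range(4)],           # four 16-byte entries
        "boot_signature": sector[510:512],                # must be b"\x55\xaa"
    }

# Build a synthetic MBR: zeroed boot code, a disk signature, empty table.
mbr = bytearray(512)
struct.pack_into("<I", mbr, 440, 0xDEADBEEF)
mbr[510:512] = b"\x55\xaa"
```

None of these regions belong to any partition, which is why per-partition tools like defrag can never refresh them.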
As far as refresh utilities go, the only one I can recall at the moment
is SpinRite. My hard drives don't physically exist long enough for me
to care; i.e., they die or get replaced long before the data or media
goes "soft". Scandisk for Windows 9x/ME has an option to check the
surface. Its surface check isn't anywhere near as thorough as
SpinRite's, but it was good enough for me. Windows NT/2000/XP has
'chkdsk /r' to check the disk surface, and that should be sufficient
for most users, along with doing defrags. You already have sufficient
tools included in Windows for "probably good enough" refreshing of your
disk media. However, if you're ever in dire straits with critical data
too soft to read reliably from the disk, you'll need something better
to retrieve it, if possible, and to maintain that disk thereafter; and
if sufficiently burned by that experience, you'll probably start
maintaining your other drives, too. If "probably good enough"
protection from data or media becoming soft is not good enough, then
you'll need to check what disk refresh utilities are available.
I'm sure by now there has to be something other than SpinRite that does
equally or more thorough data and media checking, alignment analysis,
data recovery, and refresh patterns with verification. Running
'chkdsk /r', Scandisk with its surface-check option, or PartitionMagic
(which can recover from bad sectors) are "regular octane" solutions,
but some folks want "premium octane" preventative maintenance and
disaster recovery tools. I lost my SpinRite in a move, so I couldn't
get an upgrade at a discounted price, and I never bothered buying a
full version after that; but then, I haven't again run into a hard
drive exhibiting soft data or soft spots (that the simpler tools
already available couldn't handle well enough). Ontrack has their
EasyRecovery DataRecovery software, but I don't know how it compares
against the cheaper-priced SpinRite. SpinRite hasn't had an upgrade in
something like 5 years, but I hear version 6 is being worked on (I
haven't kept up on what is happening with it).
--
_________________________________________________________________
******** Post replies to newsgroup - Share with others ********
Email: lh_811newsATyahooDOTcom and append "=NEWS=" to Subject.
_________________________________________________________________