[Tfug] Defragging *nix

Ronald Sutherland ronald.sutherland at gmail.com
Mon Oct 29 09:58:42 MST 2007


One thing I got from that is it's a bit of a holy war, of the vi vs. emacs
type. Nowadays I take holy wars to indicate that both groups are having
problems, and just can't bring themselves to detach from their broken ideas.
They need to merge their ideas together, but then the result is not black
and white. In other words, spread the files out, but not too much: keep
them clustered near the midpoint of the active part of the disk, where the
read/write head should idle so it has minimum seek time to any file. It
sounds like NTFS and ext[2|3] spread the files out on the disk, although
some clustering may be visible in NTFS.
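
To put a number on that midpoint idea, here is a toy Python sketch (a
made-up track range and a uniform spread of files, not anything either
file system actually does): parking the head at the middle gives the
lowest average and worst-case seek distance.

    import random

    # Toy model: tracks 0..999, files spread uniformly over the active area.
    random.seed(1)
    tracks = [random.randint(0, 999) for _ in range(10000)]

    # For each candidate park position, report average and worst-case seek.
    # The midpoint (500) wins on both counts for a uniform spread.
    for park in (0, 250, 500, 750, 999):
        seeks = [abs(t - park) for t in tracks]
        print(park, sum(seeks) / len(seeks), max(seeks))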

I write test programs for electronic widgets, and have been doing that for
some time. Back in the Win 3.1/95/98 days I used Basic/VB for most stuff.
Some test programs would make a single file and keep growing it, others
would make a new file each time a test finished, and in my most screwed-up
efforts I would make many result files and modify those files with new data
for each test stage in the production process (board, first-functional,
isolation, burnin, final-functional). I found problems with all of these
methods. I did not like defragmenting, but it helped some, and I'm sure it
accelerated HD failure on some setups. I was very happy when I used NTFS
for the first time (NT 3.51); it gave me hope, but there were no drivers
for my instruments (GPIB). I stopped playing with defragmenting when I got
NT4 on the production test computers. It was some time before I started
having problems with NTFS, but for the most part I am able to manage those
issues by not letting huge numbers of files build up (automatic archiving
after results age a configurable amount of time).
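
The archiving is nothing fancy; a rough sketch of the idea in Python (the
paths and the 90-day cutoff here are made up, the real cutoff is
configurable):

    import os, shutil, time

    RESULTS = "/test/results"   # hypothetical results directory
    ARCHIVE = "/test/archive"   # hypothetical archive directory
    MAX_AGE = 90 * 24 * 3600    # configurable age, here 90 days

    os.makedirs(ARCHIVE, exist_ok=True)
    now = time.time()
    for name in os.listdir(RESULTS):
        path = os.path.join(RESULTS, name)
        # Move any result file that hasn't been touched in MAX_AGE seconds,
        # so the live directory never builds up huge numbers of files.
        if os.path.isfile(path) and now - os.path.getmtime(path) > MAX_AGE:
            shutil.move(path, os.path.join(ARCHIVE, name))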

I haven't had the nerve to put Linux computers on the production floor,
even though I use it for version management at work and for most stuff at
home. One of these days I guess I need to try filling up the file system
using those methods to see when it craps out. I think Linux is used in
industry as much for compatibility as anything; for example, we have a fair
amount of CNC-type assembly equipment based on DOS, OS/2, and some other
real-time stuff. Linux can deal with most anything they can crap out, and
sometimes this stuff does not offer (or we can't find) the defragment
tools. It would be a very risky operation, so I would image the data first,
and then defrag on Linux only if absolutely needed.
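
Imaging first is cheap insurance; a minimal sketch in Python (the device
and output paths are placeholders, and you'd run it as root with the file
system unmounted):

    CHUNK = 1 << 20  # copy in 1 MiB chunks

    # Raw copy of the whole device to an image file before touching it,
    # so a botched defrag can be rolled back by writing the image back.
    with open("/dev/sdb", "rb") as src, open("/backup/sdb.img", "wb") as dst:
        while True:
            buf = src.read(CHUNK)
            if not buf:
                break
            dst.write(buf)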

On 10/29/07, Earl <earljviolet at juno.com> wrote:
>
> >>I noticed a disk defragmenting program in the Ubuntu repositories that
> >>set me to wondering, "Has anyone used such a thing?  Under what
> >>circumstances?"
>
> Above is my original question.  I read the thread from a short while back
> and found it quite interesting and revealing.  However, my question was not
> "why or why not defrag?"
>
> Earl
>
> "God made man, Sam Colt made men equal."
> "The fates guide him who will, him who won't they drag" Spengler
>
>


