connery@bnrmtv.UUCP (Glenn Connery) (02/27/86)
Various people have asked for a way to mark off bad sectors on the AT hard disk (the CMIs seem to develop them like a spreading disease). The Norton Utilities 3.x includes a program called DISKTEST (or DT) which does this. You can test all your sectors or files by running it, and if it finds unallocated bad sectors it will ask if you want to mark them off. The only problem is that it does this by read-testing only (which makes sense, since it doesn't want to destroy anything), so it often misses something that really is a problem. It also doesn't adjust the retry-on-error handler, so sometimes marginal sectors slip by and you have to run it a few times in a row. It would be nice if you could set the 'sensitivity' (i.e., the number of retries before a failure counts as an error), and if it would also do write tests on unallocated sectors.

Anyway, for what it's worth, and given that I don't like the rest of the package very much, this program has saved me a lot of grief with my CMI drive.
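The 'sensitivity' idea above can be sketched out: a scanner only flags a sector as bad if every attempt (the first read plus all retries) fails, so lowering the retry count makes marginal sectors more likely to be caught. This is a hypothetical sketch, not how DT actually works; `read_sector` and the simulated flaky drive are invented for illustration.

```python
def read_verify(read_sector, num_sectors, max_retries=3):
    """Read-test each sector, retrying on failure.

    A sector counts as bad only if all 1 + max_retries attempts
    fail.  A smaller max_retries is a more 'sensitive' scan: a
    marginal sector that succeeds on a retry will be flagged.
    """
    bad = []
    for sector in range(num_sectors):
        for attempt in range(1 + max_retries):
            if read_sector(sector):      # True = read succeeded
                break
        else:
            bad.append(sector)           # every attempt failed
    return bad

# Simulated drive (hypothetical): sector 7 always fails; sector 3
# fails on the first attempt only -- a 'marginal' sector.
attempts = {}
def flaky_read(sector):
    attempts[sector] = attempts.get(sector, 0) + 1
    if sector == 7:
        return False
    if sector == 3:
        return attempts[sector] > 1
    return True

print(read_verify(flaky_read, 10, max_retries=3))  # [7]
attempts.clear()
print(read_verify(flaky_read, 10, max_retries=0))  # [3, 7]
```

With retries allowed, only the hard failure (sector 7) is reported; with none, the marginal sector 3 shows up too, which is why a run of DT with fixed retry behavior can miss sectors one pass and catch them the next.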
bill@hp-pcd.UUCP (bill) (02/28/86)
I'll attest to the worthiness of Norton's DT and NU utilities! A while back I inadvertently deleted every file on my AT's hard disk (don't ask). It took about two days with NU to recover the 150 or 200 files that I didn't have backup copies of (there were about 1800 files on the disk). I chalked the rest up as a long-needed clean-up.

Anyway, the drive had been giving me numerous problems up until that time. I put all the recovered files on floppy, then reformatted the hard disk, and finally ran DT on it to mark the bad clusters. (DT will mark a cluster as unusable only if it is not currently in use by a file; otherwise, it just tells you which cluster has a problem and does nothing about it. Also, DOS provides for marking bad "clusters", not "sectors". On a typical hard disk there might be four 512-byte sectors in a 2K-byte cluster, so a 20M hard disk has about 11,000 clusters.) DT found and marked fifteen bad clusters on the first try.

Everything ran smoothly after that for about six months. Then, about a week or so ago, errors started cropping up again. I ran DT again and it located four new bad clusters (it took about five passes to catch all four). I think I'll start running DT once every week or two. While the cluster failure rate on this hard disk is ridiculous, at least DT keeps me aware of how fast it's degrading and prevents DOS from using the bad areas.

The only problems with it are [1] it doesn't always spot a cluster that's marginally bad, so you end up having to run DT several times before you're satisfied that they've all been found, and [2] if DT finds a bad cluster that's currently in use by a file, I have yet to see it pin down the name of that file, although it does do a "file read check" that appears to try to do just that. You do get the number of the bad cluster, but I've always had to manually figure out which sectors those are and work backwards from the data that's there to determine what file it's in.
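The sector/cluster arithmetic above checks out. Assuming a common 20M AT geometry of 615 cylinders, 4 heads, and 17 sectors per track (an assumption; the post doesn't give the drive's geometry):

```python
SECTOR_BYTES = 512
SECTORS_PER_CLUSTER = 4               # typical for an early AT hard disk
cluster_bytes = SECTOR_BYTES * SECTORS_PER_CLUSTER

# Hypothetical but common 20M geometry: 615 cyls x 4 heads x 17 sectors/track
total_bytes = 615 * 4 * 17 * SECTOR_BYTES

print(cluster_bytes)                  # 2048 -- the 2K cluster
print(total_bytes // cluster_bytes)   # 10455 -- roughly the 11,000 clusters cited
```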
Then I (save, if possible, and) delete the file, and then run DT again.

bill frolik
hp-pcd!bill
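The "work backwards from the cluster number" step can be automated in principle: each directory entry records a file's first cluster, and the FAT links each cluster to the next, so following every file's chain tells you who owns a bad cluster. This is a toy sketch of that idea with an invented in-memory FAT, not a real DOS disk reader; `file_owning_cluster`, the `directory` dict, and the cluster numbers are all hypothetical.

```python
EOF = 0xFFF8  # values at or above this mark end-of-chain in a FAT16-style table

def file_owning_cluster(bad_cluster, directory, fat):
    """directory maps filename -> first cluster.  Walk each file's
    FAT chain; return the name of the file containing bad_cluster,
    or None if the cluster is unallocated."""
    for name, cluster in directory.items():
        while cluster < EOF:
            if cluster == bad_cluster:
                return name
            cluster = fat[cluster]       # next cluster in this file's chain
    return None

# Toy disk: FOO.TXT occupies clusters 2 -> 5 -> 9, BAR.EXE occupies 3 -> 4.
fat = {2: 5, 5: 9, 9: EOF, 3: 4, 4: EOF}
directory = {"FOO.TXT": 2, "BAR.EXE": 3}

print(file_owning_cluster(9, directory, fat))   # FOO.TXT
print(file_owning_cluster(6, directory, fat))   # None -- free cluster
```

On a real disk you'd read the FAT and the (possibly nested) directories off the drive instead of using literals, but the chain-walking logic is the same, which is presumably what DT's "file read check" is attempting internally.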