Can NTFS compression speed up slow notebook HDDs?

Performance, hardware, software, general buying and gaming discussion.
Puppy
Senior ThinkPadder
Posts: 2261
Joined: Sat Oct 30, 2004 4:52 am
Location: Prague, Czech Republic

Can NTFS compression speed up slow notebook HDDs?

#1 Post by Puppy » Sun Aug 21, 2005 3:46 pm

Does anyone have experience with NTFS compression as a way to speed up loading large files, typically the virtual disk files of virtual machines such as Microsoft Virtual PC or VMware? My configuration is a 4,200 rpm HDD and a 1.6 GHz Dothan.

Bob Collins
Junior Member
Posts: 279
Joined: Sun Apr 25, 2004 2:16 pm
Location: Palm Beach Gardens, FL

#2 Post by Bob Collins » Sun Aug 21, 2005 9:49 pm

As I understand your question, you are asking whether compression, in particular NTFS compression, can speed things up.

Simply put, no. Compression may help the drive seem faster after a good defrag, because there is less data to scan through to find files. The catch is that when you go to use a compressed file, it must be decompressed on the fly, which is of course slower than reading the same file as if it had not been compressed.

The nice thing about NTFS compression is that it is transparent to the user. It does free up HD space, but the cost is exactly what you are after: speed.
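To put a rough number on that decompress-on-the-fly cost, here is a small Python sketch. Note the assumption: NTFS actually uses its own LZNT1 algorithm, which Python's standard library does not expose, so zlib at its fastest setting stands in purely to illustrate the principle. The zeroed buffer mimics the best case for compression, such as an empty region of a virtual disk image.

```python
import time
import zlib

# Highly redundant data, similar to an unused region of a virtual
# disk image -- the best case for any compression scheme.
data = b"\x00" * (16 * 1024 * 1024)  # 16 MB

# level=1 is zlib's fastest mode, standing in for NTFS's
# lightweight LZ-style compression (an assumption, not LZNT1 itself).
compressed = zlib.compress(data, level=1)

start = time.perf_counter()
restored = zlib.decompress(compressed)
decompress_seconds = time.perf_counter() - start

ratio = len(compressed) / len(data)
print(f"compressed to {ratio:.2%} of original size")
print(f"decompressed 16 MB in {decompress_seconds * 1000:.1f} ms")
```

The interesting question for a slow 4,200 rpm drive is whether the milliseconds the CPU spends decompressing are smaller than the time saved by reading fewer bytes off the platter.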

Your best bet, if you do not already do it, is to defrag the drive every week. Fragmented files will hurt system performance tremendously. The built-in defragmenter is a stripped-down version of the one from Executive Software (assuming you are running some form of Windows NT, 2000, or XP) and as such does not allow scheduled defragging. The full version includes a scheduler. I have that at the office and have my systems defrag nightly; they always run smoothly.

Your best bet for disk speed will be a faster-RPM drive, like the Hitachi 7,200 rpm jobs. I noticed a nice increase in my T22 merely by going from the 4,200 rpm original drive to a 5,400 rpm one.
Bob
701C, 600X, T22, G4 Powerbook

JonathanGennick
Junior Member
Posts: 302
Joined: Fri Feb 25, 2005 1:03 pm
Location: Munising, MI, USA

#3 Post by JonathanGennick » Tue Aug 23, 2005 8:41 pm

The idea of using compression to boost performance of an I/O bound system does have merit. I know at least one database administrator who picked up a performance boost on some (not all) I/O bound queries by using Oracle's table compression. In his case, the gain from having to read fewer blocks from disk far outweighed the overhead of decompression.

But database management systems do a lot more I/O than we do on our ThinkPads. I tend to doubt that compression would improve performance for reading and writing small files such as applications. For large files? The only real way to find out would be to try it both ways and see whether you can measure improved performance for the tasks that most concern you.
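Trying it both ways could be as simple as copying the same large file into a compressed folder and an uncompressed one (on Windows, `compact /c` marks a folder compressed) and timing a full sequential read of each. A minimal Python sketch of the timing part; the two paths at the bottom are hypothetical placeholders, not real files:

```python
import time

def time_read(path, block_size=1024 * 1024):
    """Sequentially read the whole file; return (seconds, total bytes)."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(block_size):
            total += len(chunk)
    return time.perf_counter() - start, total

# Hypothetical example paths: the same .vhd copied into a
# compressed and an uncompressed NTFS folder.
# for path in (r"C:\compressed\disk.vhd", r"C:\plain\disk.vhd"):
#     seconds, nbytes = time_read(path)
#     print(f"{path}: {nbytes / seconds / 1e6:.1f} MB/s")
```

One caveat if you do this: reboot (or otherwise flush the file cache) between runs, or the second read will come from RAM and tell you nothing about the disk.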

Bob Collins
Junior Member
Posts: 279
Joined: Sun Apr 25, 2004 2:16 pm
Location: Palm Beach Gardens, FL

#4 Post by Bob Collins » Wed Aug 24, 2005 8:00 am

Good point on the DB tools; however, I would think that table compression is quite different from NTFS compression. As I understand it, NTFS compression is similar to zipping and as such requires decompression on read, and then recompression on write if changes are made to the file.

As noted, a ThinkPad, or really any system under "normal" use, will not see any benefit from this compression.

I would be interested if you tried it and ran some quantifiable tests.
Bob
701C, 600X, T22, G4 Powerbook

JonathanGennick
Junior Member
Posts: 302
Joined: Fri Feb 25, 2005 1:03 pm
Location: Munising, MI, USA

#5 Post by JonathanGennick » Wed Aug 24, 2005 9:37 am

Bob, I forget which compression algorithm Oracle uses. I believe that rows are compressed when written to a block and uncompressed when read. (Note that being written to a block doesn't necessarily translate into disk I/O, because blocks get buffered.) I should look it up sometime and refresh my memory.

But in the end, I agree with you. I'm skeptical that compression would translate into a significant performance improvement for the typical sort of I/O one does on a laptop.

If I had the time, though, I'd try it on some large files. It would be an interesting experiment to run. But at the moment I just don't have the time.

As my colleagues and I often say when discussing things Oracle, one good test is worth a thousand conjectures :)
