In this age of digital content, data collections in normal households are growing rapidly. Whereas just a few years ago, hard drives with a couple hundred gigabytes were perfectly sufficient, today multiple-terabyte disks are commonplace.

The Linux filesystems ext2, ext3, and ext4 don't need that much attention, but over time, after innumerable writes and deletes, data becomes fragmented. This slows down not just the hard disk itself, but also the entire system – sometimes noticeably. Thus, power users, even on Linux, are advised to reorganize their data occasionally.

Fragmentation primarily affects larger files that will not fit completely into the free space on a hard drive: For lack of a sufficiently large, contiguous area, the file ends up spread across several segments (see the "Theory" box). When such a file is read, the read heads move to a new position several times to gather the pieces. This movement takes time and keeps increasing on heavily fragmented disks.

User data isn't the only contributor to fragmentation; the operating system itself encourages the big divide. Power users who like to experiment and often try out software also contribute to fragmentation by installing new programs and then deleting them. Ultimately, even generally useful tools like BleachBit or Rpmorphan can aggravate the situation under certain circumstances by removing files that are no longer needed: Deletions between allocated segments leave non-contiguous free blocks behind. Partitions on which more than half of the capacity is used fragment data more heavily, because the lack of free space forces the system to spread larger files over an increasing number of areas.

To minimize file fragmentation, common filesystems like ext2 and its successors ext3 and ext4 include mechanisms to counteract the effect. For example, in an attempt to write data to contiguous free space on the hard disk, filesystems keep data to be written in RAM for a period of time until its final size is determined. Moreover, filesystems reserve free block groups on the hard disk so that growing files can be stored in one piece.

Even such forward-looking mechanisms cannot deal with the problem entirely. As with many problems, however, Linux includes on-board solutions for which other operating systems require additional, and often expensive, software.
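Before reaching for a defragmenter, you can check how badly individual files are actually affected by querying their extent count with the filefrag utility from e2fsprogs. The following Python sketch is only an illustration of that idea, not part of the original article: it assumes filefrag is installed and prints a line ending in "<n> extents found" (the current e2fsprogs format), and it simply ranks the given files by extent count. A single extent means the file is stored contiguously.

```python
#!/usr/bin/env python3
"""Rough fragmentation check: rank files by the number of extents they occupy.

A minimal sketch that wraps the filefrag tool from e2fsprogs; it assumes
filefrag is installed and that its summary line ends with "<n> extents found".
"""

import re
import subprocess
import sys
from pathlib import Path


def extent_count(path: Path) -> int:
    """Return how many extents (contiguous runs of blocks) a file occupies."""
    out = subprocess.run(
        ["filefrag", str(path)], capture_output=True, text=True, check=True
    ).stdout
    match = re.search(r"(\d+) extents? found", out)
    if match is None:
        raise RuntimeError(f"unexpected filefrag output: {out.strip()}")
    return int(match.group(1))


if __name__ == "__main__":
    # List the given files worst first, i.e., most extents at the top.
    files = [Path(p) for p in sys.argv[1:]]
    for f in sorted(files, key=extent_count, reverse=True):
        print(f"{extent_count(f):6d} extents  {f}")
```

Called as, say, `python3 fragcheck.py /var/log/*.log` (the script name is hypothetical), the sketch lists the most fragmented of those files first. Depending on kernel and filesystem, filefrag may need elevated privileges for some files.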