Question:
Tell me about defragmentation in detail?
Ashika A
2008-03-29 01:46:08 UTC
Tell me about defragmentation in detail?
Ten answers:
?
2008-03-29 02:04:41 UTC
defragmentation is a process that reduces the amount of fragmentation in file systems. It does this by physically reorganizing the contents of the disk to store the pieces of each file close together and contiguously. It also attempts to create larger regions of free space using compaction to impede the return of fragmentation. Some defragmenters also try to keep smaller files within a single directory together, as they are often accessed in sequence.



Fragmentation occurs when the operating system cannot or will not allocate enough contiguous space to store a complete file as a unit, but instead puts parts of it in gaps between other files (usually those gaps exist because they formerly held a file that the operating system has subsequently deleted or because the operating system allocated excess space for the file in the first place). Larger files and greater numbers of files also contribute to fragmentation and consequent performance loss. Defragmentation attempts to alleviate these problems.



A common strategy to optimize defragmentation and to reduce the impact of fragmentation is to partition the hard disk(s) in a way that separates partitions of the file system that experience many more reads than writes from the more volatile zones where files are created and deleted frequently. In Microsoft Windows, the contents of directories such as "\Program Files" or "\Windows" are modified far less frequently than they are read. The directories that contain the users' profiles are modified constantly (especially with the Temp directory and Internet Explorer cache creating thousands of files that are deleted in a few days). If files from user profiles are held on a dedicated partition (as is commonly done on UNIX systems), the defragmenter runs better since it does not need to deal with all the static files from other directories. For partitions with relatively little write activity, defragmentation performance greatly improves after the first defragmentation, since the defragmenter will need to defrag only a small number of new files in the future.



In fact, in a modern multi-user operating system, an ordinary user cannot defragment the system disks since superuser access is required to move system files. Additionally, file systems such as NTFS (and most Unix/Linux filesystems) are designed to decrease the likelihood of fragmentation.[2][3] Improvements in modern hard drives such as RAM cache, faster platter rotation speed, and greater data density reduce the negative impact of fragmentation on system performance to some degree, though increases in commonly used data quantities offset those benefits. However, modern systems profit enormously from the huge disk capacities currently available, since partially filled disks fragment much less than full disks.[4] In any case, these limitations of defragmentation have led to design decisions in modern operating systems like Windows Vista to automatically defragment in a background process but not to attempt to completely defragment a volume, because doing so would produce only negligible performance gains.





2008-03-29 02:49:59 UTC
Windows Disk Defragmenter is a computer program included in Microsoft Windows designed to increase access speed (and sometimes increase the amount of usable space) by rearranging files stored on a disk to occupy contiguous storage locations, or defragmenting. The purpose is to optimize the time it takes to read and write files to/from the disk by minimizing head travel time and maximizing the transfer rate.
?
2016-10-08 03:40:02 UTC
Defragmentation isn't a demanding task. It is typically performed once a week, or perhaps once a month, depending on how you use your computer. If you have lots of files and other data, defragment your computer at least once a month. Also, missing a defragmentation will not harm your Windows installation.
Palagawad
2008-03-29 02:01:42 UTC
A term used for the process of scanning the file system and rejoining any split files back into consecutive pieces.



Defragmentation is the process of locating the noncontiguous fragments of data into which a computer file may be divided as it is stored on a hard disk, and rearranging the fragments to restore them into fewer fragments or into the whole file. Defragmentation reduces data access time and allows storage to be used more efficiently.



For example, a computer program must access various files on your hard drive every time you try to run it. If those files are spread out on opposite sides of your hard drive instead of gathered and organized neatly, as they are supposed to be, your computer will have to work extra hard and take extra time to access the information it needs.
Off1c3r
2008-03-29 01:51:13 UTC
Files on your hard drive are made up of many pieces, and sometimes those pieces get spread out across the hard drive instead of sitting next to each other. When you access such a file, the hard drive has to search all over the place to gather it, which slows your access and shortens the hard drive's lifespan by making it do extra work. Defragmenting picks up all those pieces and puts them next to each other so the file can be read and accessed faster.. it's also neater =)
2008-03-29 01:53:03 UTC
To learn more about defragmentation, type the word into Google.
techguru_94
2008-03-29 03:37:11 UTC
Check out www.dotcomworld.blogspot.com. It is a popular blog and has good stuff; I came across it on www.pirating.us. I am sure you would find your answer there. Look for your answer in the archive.
Anirudh R
2008-03-29 02:07:39 UTC
Go to Control Panel, Properties, and defragment.
Deb
2008-03-29 01:50:10 UTC
Read this - -

http://www.webopedia.com/TERM/D/Defrag.html



and this - -

http://www.webopedia.com/TERM/F/fragmentation.html



or do you want to know "how to defrag my computer" ???



Aditya T
2008-03-29 01:55:23 UTC
Siddharth, don't post if you don't know.

You will be banned for spamming.

Defragmentation

From Wikipedia, the free encyclopedia




In the context of administering computer systems, defragmentation is a process that reduces the amount of fragmentation in file systems. It does this by physically reorganizing the contents of the disk to store the pieces of each file close together and contiguously. It also attempts to create larger regions of free space using compaction to impede the return of fragmentation. Some defragmenters also try to keep smaller files within a single directory together, as they are often accessed in sequence.

Contents




* 1 Motivation

* 2 Causes

o 2.1 Example

* 3 Common countermeasures

o 3.1 Partitioning

* 4 Problems

o 4.1 Immovable files

o 4.2 Fragmentation buildup

* 5 Myths

* 6 Approach and defragmenters by file system type

* 7 See also

* 8 References

* 9 Sources

* 10 External links



Motivation



Sequential reading and writing of data on a heavily fragmented file system is slowed down because the time needed for the disk heads to move between fragments, and for the disk platter to rotate into position, increases (see seek time and rotational delay). For many common operations, the performance bottleneck of the entire computer is the hard disk; thus the desire to process more efficiently encourages defragmentation. Operating system vendors often recommend periodic defragmentation to keep disk access speed from degrading over time.
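The cost of those extra seeks can be sketched with a back-of-envelope model. The seek time, rotational latency, and transfer rate below are assumed round numbers for a 2008-era desktop drive, not measurements of any real hardware:

```python
# Toy model: time to read one file from a spinning disk, contiguous
# vs. split into many fragments. Each fragment costs one seek plus
# one average rotational delay before transfer can begin.
SEEK_MS = 9.0          # assumed average seek time per fragment (ms)
ROTATIONAL_MS = 4.2    # assumed average rotational latency (ms)
TRANSFER_MB_S = 80.0   # assumed sustained transfer rate (MB/s)

def read_time_ms(file_mb, fragments):
    """Total read time: positioning overhead per fragment, plus transfer."""
    positioning = fragments * (SEEK_MS + ROTATIONAL_MS)
    transfer = file_mb / TRANSFER_MB_S * 1000.0
    return positioning + transfer

contiguous = read_time_ms(100, 1)     # 100 MB file in one piece
fragmented = read_time_ms(100, 500)   # same file in 500 fragments
print(f"contiguous: {contiguous:.0f} ms, fragmented: {fragmented:.0f} ms")
```

Under these assumptions the fragmented read takes several times longer than the contiguous one, even though the amount of data transferred is identical: the positioning overhead grows linearly with the number of fragments.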



Fragmented data also spreads over more of the disk than it needs to. Thus, one may defragment to gather data together in one area, before splitting a single partition into two or more partitions (for example, with GNU Parted or PartitionMagic).



Defragmenting can increase the life-span of the hard drive itself, by minimizing head movement and simplifying data access operations.



Causes



Fragmentation occurs when the operating system cannot or will not allocate enough contiguous space to store a complete file as a unit, but instead puts parts of it in gaps between other files (usually those gaps exist because they formerly held a file that the operating system has subsequently deleted or because the operating system allocated excess space for the file in the first place). Larger files and greater numbers of files also contribute to fragmentation and consequent performance loss. Defragmentation attempts to alleviate these problems.



Example



Consider the following scenario:



An otherwise blank disk has five files, A, B, C, D and E, each using 10 blocks of space (for this section, a block is an allocation unit of that system; it could be 1 KB, 100 KB or 1 MB and is not any specific size).

1. On a blank disk, all of these files are allocated one after the other.

2. If file B is deleted, there are two options: leave the space for B empty and use it again later, or compress all the files after B so that the empty space follows it. The latter could be time-consuming if there were hundreds or thousands of files to move, so in general the empty space is simply left there, marked in a table as available for later use, then reused as needed.[1]

3. Now, if a new file, F, is allocated 7 blocks of space, it can be placed into the first 7 blocks of the space formerly holding file B, and the 3 blocks following it remain available.

4. If another new file, G, is added and needs only three blocks, it can occupy the space after F and before C.

5. If F subsequently needs to be expanded, the space immediately following it is no longer available, so there are two options: add a new block somewhere else and indicate that F has a second extent, or move F to someplace where it can be created as one contiguous file of the new, larger size. The latter may not be possible: the file may be larger than any one contiguous space available, or conceivably so large that the operation would take an undesirably long time. The usual practice is therefore simply to create an extent somewhere else and chain the new extent onto the old one.

Repeat this practice hundreds or thousands of times and eventually the file system has many free segments in many places, and many files spread over many extents. If, as a result of free space fragmentation, a newly created file (or a file which has been extended) has to be placed in a large number of extents, access time for that file may become excessively long.
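The example above can be replayed with a tiny block map. The 60-block disk size is an assumption (the article does not give one), and the allocator is a simple first-fit sketch, not any real file system's algorithm:

```python
# Replay the example: five 10-block files A..E, delete B, allocate
# F (7 blocks) and G (3 blocks), then extend F by 2 blocks.
def allocate(disk, name, blocks):
    """First-fit: place `blocks` units of `name` into the earliest
    free ('.') slots; a file may end up split across several extents."""
    placed = 0
    for i, slot in enumerate(disk):
        if slot == "." and placed < blocks:
            disk[i] = name
            placed += 1

disk = ["."] * 60                  # assumed disk size: 60 blocks
for f in "ABCDE":
    allocate(disk, f, 10)          # (1) A..E laid out contiguously
for i, slot in enumerate(disk):
    if slot == "B":
        disk[i] = "."              # (2) delete B, leaving a 10-block gap
allocate(disk, "F", 7)             # (3) F fills 7 of B's old blocks
allocate(disk, "G", 3)             # (4) G takes the remaining 3
allocate(disk, "F", 2)             # (5) F grows: new extent after E
print("".join(disk))
# -> AAAAAAAAAAFFFFFFFGGGCCCCCCCCCCDDDDDDDDDDEEEEEEEEEEFF........
```

After step (5), F occupies two extents: seven blocks in B's old gap and two more after E, exactly the chained-extent outcome the text describes.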



The process of creating new files, and of deleting and expanding existing files, may sometimes be colloquially referred to as churn, and can occur not only at the level of the general root file system but in subdirectories as well. Fragmentation occurs not only at the level of individual files, but also when different files in a directory (and perhaps its subdirectories) that are often read in sequence start to "drift apart" as a result of churn.



A defragmentation program must move files around within the free space available to undo fragmentation. This is an intensive operation and cannot be performed on a file system with no free space. The reorganization involved in defragmentation does not change the logical location of the files (defined as their location within the directory structure).
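The simplest conceivable defragmenter for the toy block map used in the example is a compactor: gather each file's blocks into one contiguous run and push all free space to the end. This sketch rebuilds the map wholesale, which sidesteps the hard part of real defragmenters, i.e. shuffling data within limited free space; it only illustrates the target layout, and, as the text notes, leaves every file's logical (directory) location untouched:

```python
def defragment(disk):
    """Rebuild a block map so each file's blocks are contiguous and
    free space ('.') is gathered at the end. Files keep the order in
    which their first extent appears on disk."""
    order = []                       # file names, by first appearance
    counts = {}                      # blocks owned by each file
    for slot in disk:
        if slot == ".":
            continue
        if slot not in counts:
            counts[slot] = 0
            order.append(slot)
        counts[slot] += 1
    out = []
    for name in order:
        out.extend([name] * counts[name])
    out.extend(["."] * (len(disk) - len(out)))
    return out

print("".join(defragment(list("AAFF..A.F"))))
# -> AAAFFF...
```

A fragmented map like `AAFF..A.F` (A and F each in two extents, free space in two gaps) becomes `AAAFFF...`: one extent per file, one contiguous free region.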



Common countermeasures



Partitioning



A common strategy to optimize defragmentation and to reduce the impact of fragmentation is to partition the hard disk(s) in a way that separates partitions of the file system that experience many more reads than writes from the more volatile zones where files are created and deleted frequently. In Microsoft Windows, the contents of directories such as "\Program Files" or "\Windows" are modified far less frequently than they are read. The directories that contain the users' profiles are modified constantly (especially with the Temp directory and Internet Explorer cache creating thousands of files that are deleted in a few days). If files from user profiles are held on a dedicated partition (as is commonly done on UNIX systems), the defragmenter runs better since it does not need to deal with all the static files from other directories. For partitions with relatively little write activity, defragmentation performance greatly improves after the first defragmentation, since the defragmenter will need to defrag only a small number of new files in the future.



Problems



Immovable files



The presence of immovable system files, especially a swap file, can impede defragmentation. These files can be safely moved when the operating system is not in use. For example, ntfsresize moves these files to resize an NTFS partition.



Fragmentation buildup



On systems without fragmentation resistance, fragmentation builds upon itself when left unhandled, so periodic defragmentation is necessary to keep the disk performance at peak and avoid the excess overhead of less frequent defragmentation.
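How churn compounds into fragmentation can be shown with a toy simulation: repeatedly create and delete random-sized files under first-fit allocation and count how many surviving files end up in multiple extents. The disk size, file sizes, and create/delete probabilities below are arbitrary choices for illustration, not a model of any real workload:

```python
import random

random.seed(1)                      # fixed seed for reproducibility
DISK = 1000
disk = [None] * DISK                # None = free block
live = []                           # ids of files currently on disk
next_id = 0

def alloc(fid, size):
    """First-fit allocation; fails (rolls back nothing) if disk is full."""
    slots = [i for i in range(DISK) if disk[i] is None][:size]
    if len(slots) < size:
        return False
    for i in slots:
        disk[i] = fid
    return True

def free_file(fid):
    for i in range(DISK):
        if disk[i] == fid:
            disk[i] = None

def extent_count(fid):
    """Number of contiguous runs of blocks belonging to `fid`."""
    runs, prev = 0, False
    for slot in disk:
        cur = slot == fid
        if cur and not prev:
            runs += 1
        prev = cur
    return runs

for step in range(2000):            # churn: ~50/50 create vs. delete
    if live and random.random() < 0.5:
        free_file(live.pop(random.randrange(len(live))))
    else:
        size = random.randint(1, 20)
        if alloc(next_id, size):
            live.append(next_id)
        next_id += 1

fragmented = sum(1 for fid in live if extent_count(fid) > 1)
print(f"{fragmented} of {len(live)} live files are fragmented")
```

Left unhandled, the gaps created by deletions force later allocations into ever more extents, which is exactly why periodic defragmentation (or a fragmentation-resistant allocator) is needed to keep performance at peak.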



Myths



In fact, in a modern multi-user operating system, an ordinary user cannot defragment the system disks since superuser access is required to move system files. Additionally, file systems such as NTFS (and most Unix/Linux filesystems) are designed to decrease the likelihood of fragmentation.[2][3] Improvements in modern hard drives such as RAM cache, faster platter rotation speed, and greater data density reduce the negative impact of fragmentation on system performance to some degree, though increases in commonly used data quantities offset those benefits. However, modern systems profit enormously from the huge disk capacities currently available, since partially filled disks fragment much less than full disks.[4] In any case, these limitations of defragmentation have led to design decisions in modern operating systems like Windows Vista to automatically defragment in a background process but not to attempt to completely defragment a volume because doing so would only produce negligible performance gains.[5]



Approach and defragmenters by file system type



* FAT: DOS 6.x and Windows 9x systems come with a defragmentation utility called Defrag. The DOS version is a limited version of Norton SpeedDisk,[6] and the Windows version is licensed from Diskeeper.

* NTFS: Windows 2000 and newer include a defragmentation tool based on Diskeeper. NT 4 and below do not have built-in defragmentation utilities. Unfortunately, the integrated defragmenter does not consolidate free space, so a heavily fragmented drive with many small files may still have no large consecutive free space after defragmentation. Any new large file will then immediately be split into small fragments, with an immediate impact on performance. This can happen even if overall disk usage is less than 60%.[7]

* ext2 (Linux) uses an offline defragmenter called e2defrag, which does not work with its successor ext3, unless the ext3 filesystem is temporarily down-graded to ext2. Instead, a filesystem-independent defragmenter like Shake[1] may be used.

* VxFS has an fsadm utility meant to also perform defrag operations.

* JFS has a defragfs[2] utility on IBM operating systems.

* HFS Plus in 1998 introduced a number of optimizations to the allocation algorithms in an attempt to defragment files while they're being accessed without a separate defragmenter.[citation needed]


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.