Before performing any backup, it is important to determine the size requirements of the data. Consider the following:
Is there enough backup media available for the backup to complete successfully?
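As a rough check, the total size of the data can be compared against the capacity of the available media, for example with du (the path below is only an illustrative assumption):

    # Report the total size of the tree to be backed up, so it can
    # be compared against the capacity of the backup media.
    du -sh /home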
Is the backup going to require multiple volumes? If the backup cannot be stored on a single backup medium, will the tool need multivolume capability? dd can be made multivolume-capable through scripting.
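As a sketch of one such script, the dd stream can be piped through split to cut it into volume-sized chunks; the device name and chunk size here are illustrative assumptions:

    # Read the partition raw and cut the stream into 650 MB pieces,
    # one piece per volume (device name and size are assumptions).
    dd if=/dev/sda1 bs=64k | split -b 650M - /backup/sda1.img.

    # Restore by concatenating the pieces back into one stream.
    cat /backup/sda1.img.* | dd of=/dev/sda1 bs=64k

GNU tar, by contrast, offers native multivolume support through its -M (--multi-volume) option.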
Is backup space a concern? Does the backup device support hardware compression? If space is a concern and the device does not support hardware compression, the backup stream from any of these tools can be piped through gzip or bzip2 for compression before being redirected to the backup device.
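For instance, a tar stream can be compressed in transit before reaching the device; the tape device name below is an assumed example:

    # Write the archive to stdout, compress it, and redirect the
    # compressed stream to the tape device (assumed to be /dev/st0).
    tar -cf - /home | gzip -c > /dev/st0

bzip2 can be substituted for gzip in the same pipeline when a higher compression ratio matters more than speed.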
Large datasets such as operating systems or databases should be backed up with dump or dd, because they are much faster than the other tools. dd is efficient only when the device (partition) is more full than empty, because it copies every bit on the device, used or not.
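A minimal sketch of both approaches, assuming illustrative device names:

    # Full (level 0) dump of a filesystem to tape,
    # recording the run in /etc/dumpdates (-u).
    dump -0uf /dev/st0 /dev/sda2

    # Raw bit-for-bit image of the same partition with dd.
    dd if=/dev/sda2 of=/backup/sda2.img bs=1M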
Small datasets and user data can be backed up easily with cpio, tar, or dump. If there are no constraints such as locked files or special file attributes, cpio and tar may be more appropriate than dump. Larger datasets, however, should use dump wherever possible.
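For example, a user's home directory could be archived with either tool; the paths are illustrative assumptions:

    # cpio reads the list of files to archive from find on stdin.
    find /home/user -depth -print | cpio -ov > /backup/user.cpio

    # tar archives the directory tree directly.
    tar -cvf /backup/user.tar /home/user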