





Presentation Transcript


  1. To those who suggest using "rsync": I have a couple of questions for you.

  2. I looked at the site and the documentation.

  3. * The largest test was 24 MB. I am looking at servers holding gigabytes of data; will it scale to that? Also, someone on the mailing list raised the issue of "rsync" consuming a great deal of CPU.

  4. * It does not address replicating all the file attributes (as far as I can tell).

  5. * It does not address the initial synchronization problem.

  6. Thanks

  7. --- Richard Sharpe <sharpe@ns.aus.com>

  8. > wrote:

  9. >At 12:50 PM 8/12/00 +0930, Dan Shearer wrote:

  10. >>On Sat, 12 Aug 2000, Daryl Tester wrote:

  11. >>

  12. >>> Stephen Donaldson wrote:

  13. >>>

  14. >>> > I would suggest that, given the limited

  15. >>> > input so far, you identify the most commonly used files and cron them

  16. >>> > regularly..

  17. >>>

  18. >>> I think you need to re-read the question. It was a single file

  19. >>> of 2 gig in size.

  20. >>>

  21. >>> I have done something in a commercial environment for a relatively

  22. >>> large file, which involves:

  23. >>>

  24. >>> 1) Running the transaction file on a mirrored drive.

  25. >>> 2) Breaking the mirror (to get a "reasonably consistent" snapshot).

  26. >>> 3) Backing up the non-used ex mirror.

  27. >>> 4) Merging the drive back with its original mirror.

  28. >>>

  29. >>> Steps 2-4 are performed every 15 minutes. Works OK, but probably not

  30. >>> for 2 gigs of data.
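Steps 2-4 above can be sketched with Linux software RAID; the poster used a hardware mirror, so mdadm here is a stand-in, and the device names are assumptions (the commands need root and a real md array, so this is an illustration, not something to paste in):

```shell
# 2) Break the mirror: fail and detach one half of the RAID-1 set.
mdadm /dev/md0 --fail /dev/sdb1 --remove /dev/sdb1
# 3) Back up the detached half (a point-in-time snapshot of the data).
dd if=/dev/sdb1 of=/backup/snapshot.img bs=1M
# 4) Re-add it; the kernel resynchronises it against the live half.
mdadm /dev/md0 --add /dev/sdb1
```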

  31. >>

  32. >>A more efficient way to do steps 2-4 is to use rsync,

  33. >>http://rsync.samba.org.

  34. >

  35. >Well, that depends. At a site I am involved with, we currently rsync about

  36. >5GB of data between two machines every 20 minutes. It chews up so much CPU

  37. >that users notice it. The problem is it has to check every file in that 5GB

  38. >or so worth of data.

  39. >

  40. >In addition, losing up to 20 minutes worth of data, and the

  41. >inconsistencies between the two sets is not a good thing.

  42. >

  43. >>There are ways of doing this sort of thing at the filesystem level and

  44. >>also at the disc block level. Neither are trivial to set up. rsync is a

  45. >>good interim, simple solution.

  46. >

  47. >So, we are looking at using drbd, which is one of the solutions I imagine

  48. >Dan is talking about.

  49. >

  50. >>Dan
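For reference, the drbd mentioned above mirrors a block device between two machines at the disk-block level, so only changed blocks cross the wire and there is no per-file scan at all. A minimal resource definition sketch in drbd.conf syntax (hostnames, disks, and addresses are all assumptions):

```
# Illustrative DRBD resource; every name here is an assumption.
resource r0 {
  protocol C;                 # synchronous replication
  on alpha {
    device    /dev/drbd0;
    disk      /dev/sda7;
    address   10.0.0.1:7788;
    meta-disk internal;
  }
  on beta {
    device    /dev/drbd0;
    disk      /dev/sda7;
    address   10.0.0.2:7788;
    meta-disk internal;
  }
}
```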
