Copying files: does Windows write to disk if files are identical?



































I have a directory with a lot of books in PDF format (around 2GB in size).



While reading I often leave notes and annotations in the files. Occasionally I make a backup on an external hard drive. Since I do not remember which files have been modified since the last backup, I just copy and overwrite the whole thing.



Does Windows check if files with the same name are identical (by content) before overwriting? If not, how would I approach doing exactly this?

































  • No, Windows copy is not smart enough to do that. There may be some third-party copy tools that can.

    – Moab, Oct 13 at 21:43











  • SyncBackPro (2BrightSparks) can do this: it only copies changed or new files. It is a third-party tool, but top-notch; I use it to synchronize two large folders totaling 100 GB.

    – John, Oct 13 at 23:23






  • I use Robocopy (robocopy C:\source C:\destination /DCOPY:DAT /R:1 /W:1) for such tasks. It works for me, but if you try it, be sure to test it first on a source and destination where it does not matter if something goes wrong.

    – SimonS, Oct 14 at 9:53






  • Windows files have an "archive" bit that is set on write. Many backup tools will use this.

    – Stop Harming Monica, Oct 14 at 13:21






  • GoodSync Pro is my go-to Windows equivalent of rsync. It does a bitwise diff, which is what you're referring to.

    – Blairg23, Oct 15 at 23:03

















Tags: windows, hard-drive, file-transfer, copy






asked Oct 13 at 21:37 by Doflaminhgo
edited Oct 16 at 9:03 by Mr Ethernet






6 Answers




































Robocopy.



Windows cannot differentiate between identical and modified files if you copy using Windows Explorer.



Windows can differentiate between identical and modified files if you copy using Robocopy, which is a file-copy utility included in Windows (Vista, 7, 8.1 and 10).



There's no need to use third-party tools.



You can save this script as a batch file and re-run it whenever you want to perform a backup:



robocopy /e c:\PDFs p:\PDFs



  • Whenever a PDF file is annotated and the changes are saved, both its Last Modified and Size attributes will change in the source folder, triggering an overwrite of the corresponding file in the destination folder the next time Robocopy is invoked.

  • Robocopy compares the Size and Last Modified attributes of all files that exist in both the source and destination. If either attribute is different, then the destination file will be overwritten. Note: these values only need to be different, not necessarily newer or larger; even if a source file has an older Last Modified or smaller Size attribute, the source will still overwrite the destination.

  • The /e (or /s) switch isn't necessary if all of your PDFs are in the root of the source folder. If you want to include subfolders, use /s; if you want to include empty subfolders as well, use /e.

  • I would assign the backup drive a letter further along in the alphabet, so there's no risk of another drive being inadvertently assigned the drive letter used in the script, causing the backup to fail at some point in the future. I used P here for PDF.

That simple script is all you need.
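For illustration, the comparison Robocopy applies by default can be sketched in Python. This is a simplified stand-in, not the real tool: the function name is made up, it handles a single directory level (no /e-style recursion), and it mimics only the size-plus-timestamp check described above.

```python
import os
import shutil

def sync(src_dir, dst_dir):
    """Copy only files whose size or modification time differs from the
    destination copy, mimicking Robocopy's default metadata comparison.
    Returns the names of the files that were copied."""
    os.makedirs(dst_dir, exist_ok=True)
    copied = []
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        dst = os.path.join(dst_dir, name)
        if not os.path.isfile(src):
            continue  # subfolders would need recursion, like /e
        s = os.stat(src)
        if os.path.exists(dst):
            d = os.stat(dst)
            # skip when both size and mtime match: the files are "identical"
            # in the metadata sense, regardless of content
            if s.st_size == d.st_size and int(s.st_mtime) == int(d.st_mtime):
                copied_needed = False
                continue
        shutil.copy2(src, dst)  # copy2 preserves the timestamp
        copied.append(name)
    return copied
```

Note that, exactly as with Robocopy, "different" does not mean "newer": a source file with an older timestamp still overwrites the destination.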





























  • Is it really wise to "skip" failed copies when making backups?

    – Jacob Raihle, Oct 14 at 14:19






  • Just a side note: Robocopy does not check the content, it only checks metadata. But yeah, in this case I suppose Robocopy should suffice.

    – Albin, Oct 14 at 16:10







  • @JacobRaihle good question. The retry and wait parameters have default values of 1 million attempts and 30 seconds respectively. You can always use smaller values for both, but locked files often require user intervention to unlock, and short-term reattempts generally fail regardless. So this is personal preference, but I like to let Robocopy race through all the files without getting hung up for a few minutes on a single one. If a file does fail, this is reported in the job summary anyway. In general, I find that tomorrow's backup will scoop up any file that was locked today.

    – Mr Ethernet, Oct 14 at 19:21







  • "Overwrite every PDF on your backup drive that doesn't have a matching Last Modified timestamp": does that take into account whether the destination timestamp is newer than the source? It doesn't seem relevant in this question, but still worth knowing.

    – Gary, Oct 15 at 15:11











  • @Gary Robocopy doesn't distinguish between older vs. newer or smaller vs. bigger files; it only cares that they are different. So even if a file in the destination folder is newer and/or bigger, Robocopy will still happily overwrite it with the older and/or smaller file from the source folder. It senses that the files are different and resolves the difference by pulling data across in the same direction every time: source to destination.

    – Mr Ethernet, Oct 16 at 0:04





































Windows does not do this. It will, however, prompt you to overwrite files with the same name, and you can manually select whether you want to.

For an easier solution, use FreeFileSync to compare the folders and overwrite only changed/updated files (use the Mirror option and select "File time and size" in Comparison Settings).

























































    Yes and no! Windows Explorer only checks metadata (file size, dates, etc.).

    But you could use a script, e.g. PowerShell (see here), which comes with (most) Windows versions, or third-party tools that let you compare/copy files using file checksums, e.g. MD5 or SHA-1 hashing (see here and/or use a search engine).

    I myself like to use the software Checksum Compare (see here); it lets you compare files and directories including file checksums, and it works from a USB pen drive.

    If you don't need to compare the files' content and just want to copy "newer" files, you can use any advanced copy method like xcopy, robocopy, etc.

    Note: the different hashing methods have upsides and downsides (mainly reliability vs. speed). For me, MD5 is more than enough for this type of file comparison, but that's a personal preference. See here for further info on that topic.
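A minimal sketch of the checksum comparison described above, in Python (hashlib is in the standard library; the function name is just for illustration). Reading in chunks keeps memory flat even for large PDFs:

```python
import hashlib

def files_identical(path_a, path_b, chunk_size=1 << 20):
    """Compare two files by MD5 digest, reading in 1 MiB chunks so
    large files don't have to fit in memory."""
    digests = []
    for path in (path_a, path_b):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        digests.append(h.hexdigest())
    return digests[0] == digests[1]
```

Swapping hashlib.md5 for hashlib.sha1 (or sha256) is a one-word change if you prefer a stronger hash at some speed cost.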





























    • @Mehrdad what do you mean?

      – Albin, Oct 14 at 7:54






    • @Mehrdad MD5 is broken in a security sense. In a deduplicating sense it's ideal, being fast. What's your threat model, when the OP is using it to check the uniqueness of their own files, and they have the originals right there?

      – Chris H, Oct 14 at 8:12







    • @Mehrdad more like 25 years. Yeah, I wouldn't use it for password hashing or checking the integrity of a file, but for simple file comparison it's a fast and easy algorithm. And it was just an example! Technically you are right, though using e.g. SHA-1 will slow down the comparison process significantly (I think around 1/5, but don't quote me on that; it all depends on the implementation and the use case). For me it wouldn't be worth it, given the relatively small chance a collision would occur. Anyway, I included your comment in my answer, thanks.

      – Albin, Oct 14 at 8:33






    • "SHA1 is still going to be considerably faster than reading the files from even a fast SSD." In order to SHA-hash the files, you still need to read them all...

      – Milney, Oct 14 at 11:50






    • @wrecclesham it sort of does: when it finds conflicts in the file name, it shows you the changes in size/date and asks you to resolve them manually, for a single case or for all similar cases.

      – Albin, Oct 14 at 18:11




































    In short: No.

    Windows doesn't do that in a straightforward way.

    Well, it does, but like everything in Windows it's ambiguous at best. You will be prompted on name conflicts, and depending on your Windows version you get a more or less understandable dialog with several options to choose from, plus an additional note ("blah blah, different size, newer"). You can then, one by one, choose whether or not to keep the modified file, and you have the option of applying this to all "identical" matches.
    Now of course it's Windows, so you have no guarantee that "newer" actually means newer, and you do not know what "identical" means (is it just the name collision, is it the size change, is it the modification date, or is it everything?).



    Alternatives



    There exists a huge variety of file-sync programs, both free and commercial, which are somewhat better insofar as they check whether a file has been modified before overwriting it; rsync is the traditional mother of all tools, but also a tad less user-friendly than some people may wish.
    However, I do not recommend any of these, because they do not fundamentally make things better.

    Personally, if you are not afraid of a little command line (you could always make a batch file!), I'd recommend Matt Mahoney's excellent zpaq. It is basically ZIP, except it compresses much better, and it does deduplication on the fly.



    How is that better?



    Well, checksum-comparing tools are all nice and good. Especially when you go over the network, nothing can beat rsync running on both ends; it's just awesome. But while a typical sync tool will do the job just fine (and better than Explorer), this is not what it's best at.



    Writing to an external drive, whether or not you compare checksums, has a couple of things you need to cope with:



    • Access time on the drive (abysmal)

    • Latency over USB or what you use (getting better but still kinda abysmal)

    • Bandwidth (actually pretty good nowadays)

    • Drive writes (and amplifications)

    • Drive reads

    In order to compare checksums, you first have to read the files. Full stop. Which means that for a couple of thousand files, you pay for the latency of traversing the directory structure, opening files over a high-latency link, and reading the files several thousand times, plus transferring them in small units over a high-latency wire. That sucks big time; it is a very expensive process.



    Then, you must write the files that have changed, again with several high-latency operations such as opening files, and overwriting data, and again one by one. This sucks twice because not only is it inherently unsafe (you lose the file being overwritten if your cat stumbles over the USB cable) but also with modern shingling harddrives (such as many external drives), it can be excruciatingly slow, down to single-megabyte-per-second if you are unlucky. That, and the latency of thousands of small transfers adding up.

    A well-written file copy tool may be able to deal with the safety issue by copying a temporary file, and atomically renaming it afterwards (but this adds even more overhead!).



    Now, an archive format like zpaq will create an archive that contains the checksums of the files already, they can be read in quickly and sequentially from one location. It then locally (locally means "on your side of the cable" where you presumably have a reasonably fast disk connected via SATA or M.2 or something) compares checksums, compresses differences, and append-only writes the compressed data sequentially to the existing archive. Yes, this means that the archive will grow a little over time because you carry a whole history around. Alas, get over it, the cost is very moderate thanks to diffing and compression.



    This method is faster, and safer at the same time. If you pull the cable mid-operation, your current backup is interrupted (obviously!). But you do not lose your previous data. All transactions that go over "slow" links are strictly sequential, large transfers, which maximizes throughput.
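    The change-detection half of this archive approach can be sketched in Python. This is an illustration of the idea only, not zpaq's actual format: the manifest name and function are made up, there is no compression or recursion, and whole files are hashed in one read.

```python
import hashlib
import json
import os

def changed_since_last_run(src_dir, manifest_path):
    """Sketch of archive-style change detection: checksums from the
    previous run are read sequentially from a single manifest file, so
    deciding what to append costs local hashing only, never a read
    from the slow external drive. Returns the changed file names."""
    try:
        with open(manifest_path) as f:
            old = json.load(f)
    except FileNotFoundError:
        old = {}  # first run: everything counts as changed
    new, changed = {}, []
    for name in sorted(os.listdir(src_dir)):
        path = os.path.join(src_dir, name)
        if not os.path.isfile(path):
            continue
        with open(path, "rb") as f:
            digest = hashlib.sha1(f.read()).hexdigest()
        new[name] = digest
        if old.get(name) != digest:
            changed.append(name)  # these would be appended to the archive
    with open(manifest_path, "w") as f:
        json.dump(new, f)
    return changed
```

    The point of the sketch: the expensive per-file reads all happen on the local (fast) side, and only the changed set needs to cross the slow link, written sequentially.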





























    • Robocopy is included in all versions of Windows since Vista and will do exactly what the OP wants. So I would say, "In short: yes... if you ask it nicely!"

      – Mr Ethernet, Oct 14 at 9:25












    • @wrecclesham: While it is true that Robocopy will do the job, it is nowhere near something that someone with "normal" skill can easily grok, nor is it well suited to the task, because it is susceptible to exactly the high-latency-link problem that I pointed out (in particular because Robocopy does extra work in order to be reliable, i.e. copy-then-rename).

      – Damon, Oct 14 at 9:31






    • 1) Super User describes itself as a "Q&A for computer enthusiasts and power users". A simple Robocopy script isn't going to confuse the target audience of this site; I don't think you're giving Super User's user base enough credit! 😉 2) Your "high-latency link" theory doesn't apply here. The OP is backing up a tiny amount of data via USB, not via some high-latency WAN link. Only a relatively small number of modified PDFs will be included. I use Robocopy to back up ~1 TB of similar files to a USB drive, and subsequent nightly runs, including only modified files, take less than 1 minute.

      – Mr Ethernet, Oct 14 at 10:25











    • @wrecclesham: I consider myself a computer enthusiast (with ~34 years of practice), but I wouldn't want to use robocopy. Alas, different people feel comfortable with different tools. But my point remains: while it is true that RC will only write a tiny amount, it must still read a lot, or rely on file modification times alone, which isn't safe (or USN journals, which aren't guaranteed to be present at all, or pristine). Its work is strictly non-sequential, which means that latency adds up. USB latency is very noticeable (millisecond range). "Milli" multiplied by "many" is significant.

      – Damon, Oct 14 at 10:39











    • "It must still read a lot... or rely on file modification times alone." Robocopy only checks metadata, which means it hardly needs to read any data at all for skipped files, which lets it handle identical files very efficiently. If the OP highlights a PDF, the file's timestamp will change. Unmodified PDFs will have identical Last Modified timestamps in both places. Robocopy can therefore differentiate between files to be skipped and those to be overwritten. You make an interesting point, but I'm not sure I understand where the inherent risk is here. What could theoretically go wrong?

      – Mr Ethernet, Oct 14 at 10:52





































    XCOPY /D will only copy files if the source is newer than the destination.
    (Use XCOPY /S /D for a recursive copy.)
































    • XCOPY does not take the contents of the files into consideration.

      – Bert, Oct 16 at 13:24




































    Microsoft SyncToy



    Microsoft makes a great Windows PC app called SyncToy where you specify pairs of "left folder and right folder", and choose to echo changes from left to right, contribute changes from left to right without deleting right, or synchronize between left and right. There is a user interface for previewing the changes before committing.



    If a file is detected as identical, it will be skipped over, which is the functionality that you are looking for.



    I have been using the Echo mode for about 10 years to incrementally mirror changes from my desktop PC to an external drive.



    https://www.microsoft.com/en-us/download/details.aspx?id=15155


































      6 Answers
      6






      active

      oldest

      votes








      6 Answers
      6






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      62


















      Robocopy.



      Windows cannot differentiate between identical and modified files if you copy using Windows Explorer.



      Windows can differentiate between identical and modified files if you copy using Robocopy, which is a file-copy utility included in Windows (Vista, 7, 8.1 and 10).



      There's no need to use third-party tools.



      You can save this script as a batch file and re-run it whenever you want to perform a backup:



      robocopy /e c:PDFs p:PDFs



      • Whenever a PDF file is annotated and the changes are saved, both its Last Modified and Size attributes will change in the source folder, triggering an overwrite of the corresponding file in the destination folder the next time Robocopy is invoked.

      • Robocopy compares the Size and Last Modified attributes of all files that exist in both the source and destination. If either attribute is different, then the destination file will be overwritten. Note: these values only need to be different, not necessarily newer or larger; even if a source file has an older Last Modified or smaller Size attribute, the source will still overwrite the destination.

      • the /e (or /s) switch isn't necessary if all of your PDFs are in the root of the folder referenced in the script but if you want to include Subfolders then use /s. If you want to include subfolders and Empty subfolders, then use /e.

      • I would assign the backup drive a letter further along in the alphabet, so there's no risk of another drive being inadvertently assigned the drive letter used in the script, causing the backup to fail at some point in the future. I used P here for PDF.

      That simple script is all you need.






      share|improve this answer























      • 9





        Is it really wise to "skip" failed copies when making backups?

        – Jacob Raihle
        Oct 14 at 14:19






      • 7





        Just a side note: Rococopy does not check the content, it only checks metadata. But yeah, in this case I suppose robocopy should suffice.

        – Albin
        Oct 14 at 16:10







      • 4





        @Jacob Raihle good question. The retry and wait parameters have default values of 1 million attempts and 30 seconds respectively. You can always use smaller values for both but locked files often require user intervention to unlock and reattempts in the short term generally fail regardless. So this is personal preference but I like to let RC race through all the files without getting hung up for a few minutes on a single one. If a file does fail, this will be reported in the job summary report anyway. In general, I find that tomorrow's backup will scoop up any file that was locked today.

        – Mr Ethernet
        Oct 14 at 19:21







      • 1





        overwrite every PDF on your backup drive that doesn't have a matching Last Modified timestamp does that take into account if the destination timestamp is newer than source? It doesn't seem relevant in this question, but still worth knowing.

        – Gary
        Oct 15 at 15:11











      • @Gary Robocopy doesn't distinguish between older vs. newer or smaller vs. bigger files, it only cares that they are different. So even if a file in the destination folder is newer and/or bigger, Robocopy will still happily overwrite it with the older and/or smaller file from the source folder. It senses that the files are different and resolves the difference by pulling data across in the same direction every time: source to destination.

        – Mr Ethernet
        Oct 16 at 0:04
















      62


















      Robocopy.



      Windows cannot differentiate between identical and modified files if you copy using Windows Explorer.



      Windows can differentiate between identical and modified files if you copy using Robocopy, which is a file-copy utility included in Windows (Vista, 7, 8.1 and 10).



      There's no need to use third-party tools.



      You can save this script as a batch file and re-run it whenever you want to perform a backup:



      robocopy /e c:PDFs p:PDFs



      • Whenever a PDF file is annotated and the changes are saved, both its Last Modified and Size attributes will change in the source folder, triggering an overwrite of the corresponding file in the destination folder the next time Robocopy is invoked.

      • Robocopy compares the Size and Last Modified attributes of all files that exist in both the source and destination. If either attribute is different, then the destination file will be overwritten. Note: these values only need to be different, not necessarily newer or larger; even if a source file has an older Last Modified or smaller Size attribute, the source will still overwrite the destination.

      • The /e (or /s) switch isn't necessary if all of your PDFs are in the root of the folder referenced in the script. If you want to include subfolders, use /s; if you also want to include empty subfolders, use /e.

      • I would assign the backup drive a letter further along in the alphabet, so there's no risk of another drive being inadvertently assigned the drive letter used in the script, causing the backup to fail at some point in the future. I used P here for PDF.

      That simple script is all you need.
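As an illustration of the rule described in the bullets above (not Robocopy itself), the size-plus-timestamp check can be sketched in a few lines of Python; the function names here are made up for the example:

```python
import os
import shutil

def needs_copy(src: str, dst: str) -> bool:
    """Mimic Robocopy's default check: copy if the destination is
    missing, or if size or last-modified time differ (in either
    direction -- newer/older does not matter)."""
    if not os.path.exists(dst):
        return True
    s, d = os.stat(src), os.stat(dst)
    return s.st_size != d.st_size or int(s.st_mtime) != int(d.st_mtime)

def mirror(src_dir: str, dst_dir: str) -> list:
    """One-way sync: overwrite destination files whose metadata differs."""
    copied = []
    for name in os.listdir(src_dir):
        src, dst = os.path.join(src_dir, name), os.path.join(dst_dir, name)
        if os.path.isfile(src) and needs_copy(src, dst):
            shutil.copy2(src, dst)   # copy2 preserves the timestamp
            copied.append(name)
    return copied
```

On a second run, nothing is copied, because copy2 carried the timestamp over and both attributes now match — the same reason a repeated Robocopy run only touches changed files.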





























      • 9





        Is it really wise to "skip" failed copies when making backups?

        – Jacob Raihle
        Oct 14 at 14:19






      • 7





        Just a side note: Robocopy does not check the content, it only checks metadata. But yeah, in this case I suppose Robocopy should suffice.

        – Albin
        Oct 14 at 16:10







      • 4





        @Jacob Raihle good question. The retry and wait parameters have default values of 1 million attempts and 30 seconds respectively. You can always use smaller values for both but locked files often require user intervention to unlock and reattempts in the short term generally fail regardless. So this is personal preference but I like to let RC race through all the files without getting hung up for a few minutes on a single one. If a file does fail, this will be reported in the job summary report anyway. In general, I find that tomorrow's backup will scoop up any file that was locked today.

        – Mr Ethernet
        Oct 14 at 19:21







      • 1





        overwrite every PDF on your backup drive that doesn't have a matching Last Modified timestamp does that take into account if the destination timestamp is newer than source? It doesn't seem relevant in this question, but still worth knowing.

        – Gary
        Oct 15 at 15:11











      • @Gary Robocopy doesn't distinguish between older vs. newer or smaller vs. bigger files, it only cares that they are different. So even if a file in the destination folder is newer and/or bigger, Robocopy will still happily overwrite it with the older and/or smaller file from the source folder. It senses that the files are different and resolves the difference by pulling data across in the same direction every time: source to destination.

        – Mr Ethernet
        Oct 16 at 0:04














      edited Oct 16 at 8:27

























      answered Oct 14 at 8:55









      Mr Ethernet

      2,726 reputation, 6 silver badges, 21 bronze badges























      6


















      Windows does not do this automatically. It will, however, prompt you when file names collide, and you can choose manually whether to overwrite.



      For an easier solution, use FreeFileSync to compare the folders and overwrite only changed or updated files (use the Mirror option and select "File time and size" in Comparison Settings).














































          answered Oct 14 at 3:35









          xypha

          2,119 reputation, 2 gold badges, 14 silver badges, 30 bronze badges
























              4


















              Yes and no! Windows Explorer only checks metadata (file size, dates, etc.).



              But you could use a script, e.g. PowerShell (see here), which comes with (most versions of) Windows, or third-party tools that let you compare/copy files using file checksums, e.g. MD5 or SHA-1 hashing (see here and/or use a search engine).



              I myself like to use the software Checksum Compare (see here); it lets you compare files and directories including file checksums, and it works from a USB pen drive.



              If you don't need to compare the files' content and just want to copy "newer" files, you can use any advanced copy tool such as xcopy, robocopy, etc.



              Note: the different hashing methods have upsides and downsides (mainly reliability vs. speed). For me, MD5 is more than enough for this type of file comparison, but that's a personal preference. See here for further info on that topic.
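A minimal sketch of the checksum comparison described above, using Python's hashlib (MD5 as an example, swap in "sha1" or "sha256" the same way; the helper names are my own):

```python
import hashlib

def file_digest(path: str, algo: str = "md5", chunk: int = 1 << 20) -> str:
    """Hash a file in chunks so large files never need to fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def same_content(a: str, b: str, algo: str = "md5") -> bool:
    """True if both files hash identically -- a content check,
    unlike Explorer's metadata-only comparison."""
    return file_digest(a, algo) == file_digest(b, algo)
```

Note that this reads both files in full, which is exactly the cost trade-off discussed in the comments: a content check can never be cheaper than reading the data.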





























              • 1





                @Mehrdad what do you mean?

                – Albin
                Oct 14 at 7:54






              • 24





                @Mehrdad MD5 broken in a security sense. In a deduplicating sense it's ideal, being fast. What's your threat model, when the OP is using it to check the uniqueness of their own files, when they have the originals right there?

                – Chris H
                Oct 14 at 8:12







              • 1





                @Mehrdad more like 25 years, yeah, I wouldn't use it for password hashing or checking the integrity of a file, but for simple file comparison it's a fast and easy algorithm. And it was just an example! But technically you are right, although using e.g. SHA-1 will slow down the comparison process significantly (I think around 1/5, but don't quote me on that - it all depends on the implementation and the use case). For me it wouldn't be worth it, given the relatively small chance a collision would occur. Anyway, I included your comment in my answer, thanks.

                – Albin
                Oct 14 at 8:33






              • 10





                "SHA1 is still going to be considerably faster than reading the files from even a fast SSD." - In order to SHA hash the files, you still need to read them all...

                – Milney
                Oct 14 at 11:50






              • 2





                @wrecclesham it sort of does, when it finds conflicts in the file name it shows you changes in size/date and asks you to resolve them manually for a single or for all similar cases

                – Albin
                Oct 14 at 18:11























              edited Oct 15 at 18:57

























              answered Oct 13 at 21:47









              Albin

              3,194 reputation, 1 gold badge, 16 silver badges, 36 bronze badges











              3


















              In short: No



              Windows doesn't do that in a straightforward way.



              Well, it does, but like everything in Windows it's ambiguous at best. You will be prompted for name conflicts, and depending on your Windows version, you get a more or less understandable dialog with several options to choose from, with an additional note ("Blah blah, different size, newer"). You can then, one by one, choose whether or not to keep the modified file, and you have the option of applying this to all "identical" matches.

              Now of course it's Windows, so you have no guarantee that "newer" actually means newer, and you do not know what is "identical" (is it just the name collision, is it the size change, is it the modification date, or is it everything?).



              Alternatives



              There exists a huge variety of file sync programs, both free and commercial, which are somewhat better insofar as they check whether a file has been modified before overwriting it, rsync being the traditional mother-of-all-tools, but also being a tad less user-friendly than some people may wish.

              However, I do not recommend any of these because they are not fundamentally making things better.



              Personally, if you are not afraid of a little commandline (could always make a batch file!) I'd recommend Matt Mahoney's excellent zpaq. This is basically ZIP, except it compresses much better, and it does deduplication on the fly.



              How is that better?



              Well, checksum-comparing tools are all well and good. Especially when you go over the network, nothing can beat rsync running on both ends; it's just awesome. But while a typical sync tool will do the job just fine (and better than Explorer), this is not what it's best at.



              Writing to an external drive, whether or not you compare checksums, has a couple of things you need to cope with:



              • Access time on the drive (abysmal)

              • Latency over USB or whatever link you use (getting better, but still kinda abysmal)

              • Bandwidth (actually pretty good nowadays)

              • Drive writes (and write amplification)

              • Drive reads

              In order to compare checksums, you first have to read in the files. Full stop. Which means that for a couple of thousand files, you pay the latency of traversing the directory structure, of opening files over a high-latency link, and of reading files several thousand times, plus transferring the data in small units over a high-latency wire. That sucks big time; it is a very expensive process.



              Then, you must write the files that have changed, again with several high-latency operations such as opening files and overwriting data, and again one by one. This sucks twice: not only is it inherently unsafe (you lose the file being overwritten if your cat stumbles over the USB cable), but with modern shingled (SMR) hard drives, as used in many external drives, it can be excruciatingly slow, down to a single megabyte per second if you are unlucky. That, and the latency of thousands of small transfers adds up.

              A well-written file copy tool can deal with the safety issue by copying to a temporary file and atomically renaming it afterwards (but this adds even more overhead!).
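              The copy-then-rename trick looks roughly like this as a batch sketch (file names are hypothetical; note that "move" within the same NTFS volume is a rename, though strict atomicity is not guaranteed on every filesystem):

```bat
rem Copy to a temporary name first, so a half-written file
rem never replaces the existing good copy.
copy /Y "C:\docs\report.pdf" "E:\backup\report.pdf.tmp"
rem Only if the copy succeeded, rename it into place.
if not errorlevel 1 move /Y "E:\backup\report.pdf.tmp" "E:\backup\report.pdf"
```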



              Now, an archive format like zpaq creates an archive that already contains the checksums of the files, and they can be read quickly and sequentially from one location. It then compares checksums locally ("locally" meaning on your side of the cable, where you presumably have a reasonably fast disk attached via SATA or M.2 or something), compresses the differences, and appends the compressed data sequentially to the existing archive. Yes, this means the archive will grow a little over time, because you carry the whole history around. Alas, get over it; the cost is very moderate thanks to diffing and compression.



              This method is faster and safer at the same time. If you pull the cable mid-operation, the current backup run is interrupted (obviously!), but you do not lose your previous data. All transfers that go over the "slow" link are strictly sequential, large transfers, which maximizes throughput.






                Robocopy is included in all versions of Windows since Vista and will do exactly what the OP wants. So I would say, "In short: yes... if you ask it nicely!"

                – Mr Ethernet
                Oct 14 at 9:25












              • @wrecclesham: While it is true that robocopy will do the job, it is nowhere near something that someone with "normal" skill can easily grok, nor is it well-suited for the task because it is susceptible to exactly the high-latency-link problem that I pointed out (in particular because robocopy does extra work in order to be reliable, i.e. copy-then-rename).

                – Damon
                Oct 14 at 9:31






              • 2





                1) Super User describes itself as a "Q&A for computer enthusiasts and power users". A simple Robocopy script isn't going to confuse the target audience of this site. I don't think you're giving Super User's userbase enough credit! 😉 2) Your "high-latency link" theory doesn't apply here. The OP is backing up a tiny amount of data via USB, not via some high-latency WAN link. Only a relatively small number of modified PDFs will be included. I use Robocopy to back up ~1 TB of similar files to a USB drive and subsequent nightly runs, including only modified files, take less than 1 minute.

                – Mr Ethernet
                Oct 14 at 10:25











              • @wrecclesham: I consider myself a computer enthusiast (with ~34 years of practice) but I wouldn't want to use robocopy. Alas, different people feel comfortable with different tools. But my point remains: While it is true that RC will only write a tiny amount, it must still read a lot, or rely on file modification times alone, which isn't safe (or USN journals, which isn't guaranteed to be present at all, or pristine). Its work is strictly non-sequential which means that latency adds up. USB latency is very noticeable (millisecond range). "Milli" multiplied with "many" is significant.

                – Damon
                Oct 14 at 10:39











              • "it must still read a lot... or rely on file modification times alone". Robocopy only checks metadata, which means that it hardly needs to read any data at all for skipped files, which allows it to handle identical files very efficiently. If the OP highlights a PDF, the file's timestamp will change. Unmodified PDFs will have identical Last Modified timestamps in both places. Robocopy can therefore differentiate between files to be skipped and those to be overwritten. You make an interesting point but I'm not sure I understand where the inherent risk is here. What could theoretically go wrong?

                – Mr Ethernet
                Oct 14 at 10:52
















              edited Oct 14 at 9:25

























              answered Oct 14 at 9:22









              Damon
              3,837 reputation · 3 gold badges · 17 silver badges · 26 bronze badges














              • 3





                Robocopy is included in all versions of Windows since Vista and will do exactly what the OP wants. So I would say, "In short: yes... if you ask it nicely!"

                – Mr Ethernet
                Oct 14 at 9:25












              • @wrecclesham: While it is true that robocopy will do the job, it is nowhere near something that someone with "normal" skill can easily grok, nor is it well-suited for the task because it is susceptible to exactly the high-latency-link problem that I pointed out (in particular because robocopy does extra work in order to be reliable, i.e. copy-then-rename).

                – Damon
                Oct 14 at 9:31






              • 2





                1) Super User describes itself as a "Q&A for computer enthusiasts and power users". A simple Robocopy script isn't going to confuse the target audience of this site. I don't think you're giving Super User's userbase enough credit! 😉 2) Your "high-latency link" theory doesn't apply here. The OP is backing up a tiny amount of data via USB, not via some high-latency WAN link. Only a relatively small number of modified PDFs will be included. I use Robocopy to back up ~1 TB of similar files to a USB drive and subsequent nightly runs, including only modified files, take less than 1 minute.

                – Mr Ethernet
                Oct 14 at 10:25











              • @wrecclesham: I consider myself a computer enthusiast (with ~34 years of practice) but I wouldn't want to use robocopy. Alas, different people feel comfortable with different tools. But my point remains: While it is true that RC will only write a tiny amount, it must still read a lot, or rely on file modification times alone, which isn't safe (or USN journals, which isn't guaranteed to be present at all, or pristine). Its work is strictly non-sequential which means that latency adds up. USB latency is very noticeable (millisecond range). "Milli" multiplied with "many" is significant.

                – Damon
                Oct 14 at 10:39











              • "it must still read a lot... or rely on file modification times alone". Robocopy only checks metadata, which means that it hardly needs to read any data at all for skipped files, which allows it to handle identical files very efficiently. If the OP highlights a PDF, the file's timestamp will change. Unmodified PDFs will have identical Last Modified timestamps in both places. Robocopy can therefore differentiate between files to be skipped and those to be overwritten. You make an interesting point but I'm not sure I understand where the inherent risk is here. What could theoretically go wrong?

                – Mr Ethernet
                Oct 14 at 10:52













              • 3





                Robocopy is included in all versions of Windows since Vista and will do exactly what the OP wants. So I would say, "In short: yes... if you ask it nicely!"

                – Mr Ethernet
                Oct 14 at 9:25












              • @wrecclesham: While it is true that robocopy will do the job, it is nowhere near something that someone with "normal" skill can easily grok, nor is it well-suited for the task because it is susceptible to exactly the high-latency-link problem that I pointed out (in particular because robocopy does extra work in order to be reliable, i.e. copy-then-rename).

                – Damon
                Oct 14 at 9:31






              • 2





                1) Super User describes itself as a "Q&A for computer enthusiasts and power users". A simple Robocopy script isn't going to confuse the target audience of this site. I don't think you're giving Super User's userbase enough credit! 😉 2) Your "high-latency link" theory doesn't apply here. The OP is backing up a tiny amount of data via USB, not via some high-latency WAN link. Only a relatively small number of modified PDFs will be included. I use Robocopy to back up ~1 TB of similar files to a USB drive and subsequent nightly runs, including only modified files, take less than 1 minute.

                – Mr Ethernet
                Oct 14 at 10:25











              • @wrecclesham: I consider myself a computer enthusiast (with ~34 years of practice) but I wouldn't want to use robocopy. Alas, different people feel comfortable with different tools. But my point remains: While it is true that RC will only write a tiny amount, it must still read a lot, or rely on file modification times alone, which isn't safe (or USN journals, which isn't guaranteed to be present at all, or pristine). Its work is strictly non-sequential which means that latency adds up. USB latency is very noticeable (millisecond range). "Milli" multiplied with "many" is significant.

                – Damon
                Oct 14 at 10:39











              • "it must still read a lot... or rely on file modification times alone". Robocopy only checks metadata, which means that it hardly needs to read any data at all for skipped files, which allows it to handle identical files very efficiently. If the OP highlights a PDF, the file's timestamp will change. Unmodified PDFs will have identical Last Modified timestamps in both places. Robocopy can therefore differentiate between files to be skipped and those to be overwritten. You make an interesting point but I'm not sure I understand where the inherent risk is here. What could theoretically go wrong?

                – Mr Ethernet
                Oct 14 at 10:52








              3




              3





              Robocopy is included in all versions of Windows since Vista and will do exactly what the OP wants. So I would say, "In short: yes... if you ask it nicely!"

              – Mr Ethernet
              Oct 14 at 9:25






              Robocopy is included in all versions of Windows since Vista and will do exactly what the OP wants. So I would say, "In short: yes... if you ask it nicely!"

              – Mr Ethernet
              Oct 14 at 9:25














              @wrecclesham: While it is true that robocopy will do the job, it is nowhere near something that someone with "normal" skill can easily grok, nor is it well-suited for the task because it is susceptible to exactly the high-latency-link problem that I pointed out (in particular because robocopy does extra work in order to be reliable, i.e. copy-then-rename).

              – Damon
              Oct 14 at 9:31





              @wrecclesham: While it is true that robocopy will do the job, it is nowhere near something that someone with "normal" skill can easily grok, nor is it well-suited for the task because it is susceptible to exactly the high-latency-link problem that I pointed out (in particular because robocopy does extra work in order to be reliable, i.e. copy-then-rename).

              – Damon
              Oct 14 at 9:31




              2




              2





              1) Super User describes itself as a "Q&A for computer enthusiasts and power users". A simple Robocopy script isn't going to confuse the target audience of this site. I don't think you're giving Super User's userbase enough credit! 😉 2) Your "high-latency link" theory doesn't apply here. The OP is backing up a tiny amount of data via USB, not via some high-latency WAN link. Only a relatively small number of modified PDFs will be included. I use Robocopy to back up ~1 TB of similar files to a USB drive and subsequent nightly runs, including only modified files, take less than 1 minute.

              – Mr Ethernet
              Oct 14 at 10:25





              1) Super User describes itself as a "Q&A for computer enthusiasts and power users". A simple Robocopy script isn't going to confuse the target audience of this site. I don't think you're giving Super User's userbase enough credit! 😉 2) Your "high-latency link" theory doesn't apply here. The OP is backing up a tiny amount of data via USB, not via some high-latency WAN link. Only a relatively small number of modified PDFs will be included. I use Robocopy to back up ~1 TB of similar files to a USB drive and subsequent nightly runs, including only modified files, take less than 1 minute.

              – Mr Ethernet
              Oct 14 at 10:25













              @wrecclesham: I consider myself a computer enthusiast (with ~34 years of practice) but I wouldn't want to use robocopy. Alas, different people feel comfortable with different tools. But my point remains: While it is true that RC will only write a tiny amount, it must still read a lot, or rely on file modification times alone, which isn't safe (or USN journals, which isn't guaranteed to be present at all, or pristine). Its work is strictly non-sequential which means that latency adds up. USB latency is very noticeable (millisecond range). "Milli" multiplied with "many" is significant.

              – Damon
              Oct 14 at 10:39





              @wrecclesham: I consider myself a computer enthusiast (with ~34 years of practice) but I wouldn't want to use robocopy. Alas, different people feel comfortable with different tools. But my point remains: While it is true that RC will only write a tiny amount, it must still read a lot, or rely on file modification times alone, which isn't safe (or USN journals, which isn't guaranteed to be present at all, or pristine). Its work is strictly non-sequential which means that latency adds up. USB latency is very noticeable (millisecond range). "Milli" multiplied with "many" is significant.

              – Damon
              Oct 14 at 10:39













              "it must still read a lot... or rely on file modification times alone". Robocopy only checks metadata, which means that it hardly needs to read any data at all for skipped files, which allows it to handle identical files very efficiently. If the OP highlights a PDF, the file's timestamp will change. Unmodified PDFs will have identical Last Modified timestamps in both places. Robocopy can therefore differentiate between files to be skipped and those to be overwritten. You make an interesting point but I'm not sure I understand where the inherent risk is here. What could theoretically go wrong?

              – Mr Ethernet
              Oct 14 at 10:52
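
The metadata-only comparison described above can be sketched in Python. This illustrates the idea (skip when size and last-modified time match), not robocopy's actual implementation; the 2-second slack is an assumption to allow for FAT's coarse timestamp granularity:

```python
import os

def should_skip(src: str, dst: str, mtime_slack: float = 2.0) -> bool:
    """Skip the copy when the destination exists with the same size
    and (within mtime_slack seconds) the same modification time.
    Illustrative sketch only, not robocopy's real logic."""
    if not os.path.exists(dst):
        return False
    s, d = os.stat(src), os.stat(dst)
    return s.st_size == d.st_size and abs(s.st_mtime - d.st_mtime) <= mtime_slack
```

No file data is read at all: `os.stat` touches only directory metadata, which is why a metadata-based sync pass over unchanged files is so cheap.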






              "it must still read a lot... or rely on file modification times alone". Robocopy only checks metadata, which means that it hardly needs to read any data at all for skipped files, which allows it to handle identical files very efficiently. If the OP highlights a PDF, the file's timestamp will change. Unmodified PDFs will have identical Last Modified timestamps in both places. Robocopy can therefore differentiate between files to be skipped and those to be overwritten. You make an interesting point but I'm not sure I understand where the inherent risk is here. What could theoretically go wrong?

              – Mr Ethernet
              Oct 14 at 10:52






























              XCOPY /D will only copy files if the source is newer than the destination.
              (Use XCOPY /S /D for a recursive copy.)
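
The /D rule ("copy only when the source is newer") can be sketched like so; this is an illustration of the documented decision, not XCOPY itself:

```python
import os

def xcopy_d_would_copy(src: str, dst: str) -> bool:
    """Mimic XCOPY /D's decision: copy when the destination is missing
    or the source's last-modified time is strictly newer.
    (Sketch of the documented rule, not the real tool.)"""
    if not os.path.exists(dst):
        return True
    return os.path.getmtime(src) > os.path.getmtime(dst)
```

Note the asymmetry with a size-plus-timestamp comparison: a destination file that is *newer* than the source is left alone, even if its contents differ.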
































              • XCOPY does not take the contents of the files into consideration

                – Bert
                Oct 16 at 13:24















              answered Oct 15 at 4:19









              Daniel Klugh

              29, 1 bronze badge

































              Microsoft SyncToy



              Microsoft makes a great Windows PC app called SyncToy where you specify pairs of "left folder and right folder", and choose to echo changes from left to right, contribute changes from left to right without deleting right, or synchronize between left and right. There is a user interface for previewing the changes before committing.



              If a file is detected as identical, it will be skipped over, which is the functionality that you are looking for.



              I have been using the Echo mode for about 10 years to incrementally mirror changes from my desktop PC to an external drive.



              https://www.microsoft.com/en-us/download/details.aspx?id=15155
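
SyncToy's Echo behavior (mirror left to right, skipping files detected as identical) can be sketched as below. Treating "identical" as same size and modification time is an illustrative assumption, not SyncToy's actual detection logic:

```python
import os
import shutil

def echo(left: str, right: str) -> list:
    """Mirror `left` into `right`, skipping files whose size and mtime
    already match (illustrative notion of 'identical', not SyncToy's).
    Returns the relative paths that were actually copied."""
    copied = []
    for root, _dirs, files in os.walk(left):
        rel = os.path.relpath(root, left)
        os.makedirs(os.path.join(right, rel), exist_ok=True)
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(right, rel, name)
            if os.path.exists(dst):
                s, d = os.stat(src), os.stat(dst)
                if s.st_size == d.st_size and abs(s.st_mtime - d.st_mtime) < 2:
                    continue  # detected as identical: no write to disk
            shutil.copy2(src, dst)  # copy2 preserves the timestamp
            copied.append(os.path.join(rel, name))
    return copied
```

Because `copy2` preserves timestamps, a second run over an unchanged tree copies nothing, which is exactly the incremental-mirror behavior described above.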
















                  answered Oct 16 at 13:23









                  StalePhish

                  101, 2 bronze badges


































