How do you manage backup files?

Luttogtokh Palamdorj 8 years ago in General updated by fbilki (Moderator / Admin (AUS)) 7 years ago 7
Dear All,

How do you manage backup files?

Should they be saved every hour?

How do you find a file that was saved months ago in a project?

I too have wondered what the best backup system would be.
My current setup is pretty basic. When I am reminded that a loss of data would be catastrophic, I simply zip the project file, date it and put it on the company server or on an external hard drive. This is not a very good system because it relies on my memory and takes up a fair bit of storage.
I have yet to implement it, but I was thinking of using a macro to FCOPY only crucial files and export formsets to an external location. I think this would keep the storage low and make it easy to back up my work frequently.
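For what it's worth, the idea above can be sketched in a few lines of Python (rather than a MicroMine macro). The paths and the list of "crucial" extensions here are placeholders, not MicroMine's actual requirements; adjust them to your own project:

```python
import shutil
from datetime import datetime
from pathlib import Path

# Hypothetical locations - point these at your own project and backup drive.
PROJECT = Path("C:/Projects/MyMine")
BACKUP_ROOT = Path("E:/Backups")

# Only copy the file types we consider crucial (an assumption, not a rule).
CRUCIAL = {".DAT", ".STR", ".FRM"}

def backup_crucial_files(project=PROJECT, backup_root=BACKUP_ROOT):
    """Copy crucial files into a dated subfolder, preserving relative paths."""
    dest = backup_root / datetime.now().strftime("%Y-%m-%d_%H%M")
    copied = []
    for f in project.rglob("*"):
        if f.is_file() and f.suffix.upper() in CRUCIAL:
            target = dest / f.relative_to(project)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(target)
    return copied
```

Because each run lands in its own dated folder, storage stays low and old backups remain easy to find by date.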
To answer your other questions: I save at basically every pause. I don't think there is a way to just jump back in time a month in your project unless you set out to be able to do this initially, somehow outside of MicroMine. I know MicroMine automatically backs up the FLDVAL.BDB file, but it doesn't contain your project data.
I use a program called FreeFileSync to take care of my backups, which go to a portable hard disk and a NAS.

It's pretty easy to use: you define the folder pairings between your working disk and your backup disk, use filters to include or exclude specific files and folders, and set the synchronisation mode.

Here is how I do it:

I use Two Way mode for backing up to the portable hard disk, because I use that disk as a medium for syncing my work between (up to four) different machines. To ensure the disk contains the latest version I always sync each machine straight after startup (to copy modified files onto it) and sync again before I shut it down (to copy modified files off).

I use Mirror mode for backing up to the NAS. This simply creates an exact copy of the source folders, including any file deletions.

I have a third backup configuration that I call the Archive. This one is set to Update mode and I use it to move projects and documents that I no longer want on my computer's hard disk. Unlike mirror mode this one never deletes files. I've created an Archive folder, which is normally empty, on each computer. I only move files into it when I need to archive them, and then I delete them once the archiving is done.

You can also run FreeFileSync in real-time mode using a secondary utility called RealtimeSync. One neat feature is that you can define the time delay between when a file is modified and when it gets backed up, which is ideal for backing up Micromine files that take a few minutes to create. Of course, it doesn't help if you're half-way through your second edit when the software starts syncing based on the first one!

FreeFileSync is free and open source software. I've been using it for years and it is extremely well designed and very reliable. Updates are published on a monthly basis.

But a word of caution: the developer makes money by embedding third-party software in the installer. That just means you need to carefully read every word during installation to ensure you disable the third-party software. I've never been "infected", although there have been times when I've had to re-read the installer a couple of times. (The embedded software is different every time.)

With respect to finding backed-up files, I'd just use the normal Windows search tools.

Geoff, like you I once used batch files to do my backups, but they were quickly abandoned when I discovered this application.

I hope this helps.

Thanks for that info on backups, mirroring, etc. I do the same as you with a notebook from my home office, placing the data files on a portable hard drive. I then go into the office, transfer it to a dedicated hard drive on the desktop at work, and have to ensure that I have the latest files on that machine for others to access, either via VPN or over the network. I also place the data on the network for security and as a permanent record, as that gets backed up to the NAS drive.
You have provided an excellent solution to my dilemma, and thanks for the suggestion. Cheers, Peter
For what it is worth, I use a product called GoodSync to manage backups. It can run in real time or on a schedule and is very reasonably priced (with no 3rd-party software). Beyond Compare 4 is also very useful.
Before expanding on backup issues I think it is important to differentiate between backup and archive. Backup is really a mechanism to recover lost data immediately, whereas archive is about long-term storage and retrieval of data (often just the prime data and important derivatives; a lot of working files may not be necessary). Archiving is a bigger topic and should involve keeping at least three copies at two locations.

For now I will just discuss good backup strategies. I'm old enough to remember when backups were written to magnetic tape, and a good strategy was developed by most mainframe and minicomputer owners. This was known as the generational backup scheme, most commonly the three-generation or "grandfather-father-son" method. At its most basic, the method is a series of steps where a complete copy of the data is written to removable media such as tape, CD or DVD (many projects these days might require several DVDs); this first copy is the grandfather. At the next scheduled backup period, say the next week or next day, another complete copy of the data is made, which of course includes the changes made during that period. This is the father. At the next scheduled backup, the third copy, or son, is produced. After that third generation the backup tapes can be recycled: the grandfather becomes the new son, the son becomes the father and the father moves on to become the grandfather. (CDs or DVDs would probably not be reusable; read/write CDs and DVDs are not very reliable if reused often.) This could be as simple as moving position in the storage rack or box.

This is a very reliable system, but it has the disadvantage that changes only get backed up when the next full backup is due. So many companies also implemented an incremental backup at more frequent intervals, which only backed up things that had changed since the last full backup. For example, incremental backups could be run daily, with only a smaller amount of data stored each time. Recovering most data would then require perhaps only two tapes: the son (the last full backup) and the incremental for the previous day. Normally these incremental tapes would also be recycled at the time of each full backup, but with a three-week cycle you could only recover lost (or corrupted) data from up to three weeks earlier.
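The selection step of an incremental backup is simple enough to sketch: pick everything modified since the last full backup. This is just an illustration of the idea by file timestamp, not a complete backup tool:

```python
from pathlib import Path

def files_changed_since(root, last_full_backup_time):
    """Return files under root modified after the last full backup.

    This is the selection step of an incremental backup: only these
    files need copying, which keeps the daily backup small compared
    with a full copy of the whole project.
    """
    root = Path(root)
    return [f for f in root.rglob("*")
            if f.is_file() and f.stat().st_mtime > last_full_backup_time]
```

The time of the last full backup would need to be recorded somewhere (a timestamp file is the usual trick) for this to be driven automatically.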

Move forward a few decades and we now have much larger data sets, and traditional removable media is in general too slow and of limited capacity. Storage on hard disks is now much cheaper, so you should probably choose to use external (USB-style) hard drives instead of removable media. The same strategies work well here, and some such USB hard drives come with software to formalise these steps. In general they still have to be manually instigated!

Alternatively, if you have an extensive LAN with some spare space you can use, the task of creating three generations of "save sets" (they are not really true backups, as they will require some manual work to recover files, particularly if it is only a selective recovery) that are zipped versions of the project is pretty easy. [BTW, 7zip.org is an open-source project that provides powerful compression tools; its 7-Zip Ultra format gives better than 100 times compression on Micromine ASCII files and even 4 times compression on binary files. A word of warning: a lot of sites offer versions of 7-Zip but the download is full of crapware, so spend a bit of time to find the original 7zip.org project and get your download from there.] At a simpler level (without file compression) you can just use Xcopy (or Robocopy), fcopy (in Linux), etc. However, you must be careful with this approach to avoid having thousands of near-duplicate files in near-duplicate directories all over the place!
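As a rough illustration, a dated, labelled save set can be produced with a short script. This sketch uses Python's standard zip support rather than 7-Zip, so the compression ratios will be far more modest than the figures quoted above:

```python
import zipfile
from datetime import datetime
from pathlib import Path

def make_save_set(project_dir, dest_dir, label="SON"):
    """Zip an entire project directory into a dated, labelled archive.

    Internal paths are kept unchanged, so a later restore reproduces
    the original directory layout exactly - avoiding the scattered
    near-duplicate files the Xcopy approach can create.
    """
    project_dir = Path(project_dir)
    stamp = datetime.now().strftime("%d-%m-%y")
    archive = Path(dest_dir) / f"{label}{stamp}.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in project_dir.rglob("*"):
            if f.is_file():
                # Store paths relative to the parent, so the project
                # folder name itself is preserved inside the archive.
                zf.write(f, f.relative_to(project_dir.parent))
    return archive
```

Running it weekly with labels SON, FAT and GFAT would give a zipped version of the three-generation scheme directly.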

The next great tactic, when your LAN has plenty of available space, is using file and folder synchronisation tools. These vary from very manual free downloads to sophisticated offerings that will back up locally as well as onto the net via FTP or even P2P connections. I have used such a program called Synchron (which I believe is still free to download), mainly for automating the handling of digital photos (I have a massive collection). It allows you to easily implement the three-generation and incremental backup strategy discussed earlier, and if your storage is permanently connected you can automatically schedule your backups at regular intervals. I have heard good reports from those using GoodSync (thanks also, Keith), which sounds to have these features too (maybe better) and is not expensive. Your external hard drive might already have such software installed, called Backup or similar.

Unfortunately there are a lot of less-than-satisfactory versions of file sync around, so spend a bit of time making sure what you have suits you. A few common limitations: file restore is very complex; the tool does not monitor file deletions and renames; file collisions (you can get two different versions of a file with different content); and you may need a programmable job scheduler that can detect key actions (e.g. plugging in a USB drive, or updated files in a given directory) and then run the appropriate sync. If you are interested in using sync, try one of the many free or inexpensive offerings first before committing to anything really expensive. [BTW, many online/cloud backup services include such sync software as their principal backup tool, but some of the datasets we work on are so large that even with decent broadband access it might take most of a month, and your full data allowance, just to upload a day or so of working files.]
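For anyone curious what a "Mirror" sync actually does underneath, here is a bare-bones sketch: copy new or changed files one way, then delete anything the source no longer has. Real tools like FreeFileSync or GoodSync add conflict detection, filters and scheduling on top of this core:

```python
import shutil
from pathlib import Path

def mirror(src, dst):
    """One-way mirror: make dst an exact copy of src, deletions included.

    Copies new or newer files (compared by modification time) and then
    removes anything in dst that no longer exists in src - which also
    shows why a mirror faithfully propagates accidental deletions.
    """
    src, dst = Path(src), Path(dst)
    dst.mkdir(parents=True, exist_ok=True)
    src_files = {f.relative_to(src) for f in src.rglob("*") if f.is_file()}
    for rel in src_files:
        s, d = src / rel, dst / rel
        if not d.exists() or s.stat().st_mtime > d.stat().st_mtime:
            d.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(s, d)  # copy2 keeps mtimes, so reruns skip unchanged files
    for f in list(dst.rglob("*")):
        if f.is_file() and f.relative_to(dst) not in src_files:
            f.unlink()  # mirror deletions too
```

The deletion loop is exactly the behaviour that separates true mirroring from a plain copy, and the reason the limitations above matter.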

No matter how you do the syncing, it is vital with Micromine that you maintain the working directory integrity (i.e. copy the whole directory, keep the same names, and don't rename files by, for example, adding dates). This meta detail should be carried in an enveloping upper-level directory (e.g. SON19-03-15). This then gets renamed FATHER19-03-15 (or FAT19-03-15), but all the subdirectory and file names remain the same. This means you can use simple existing file-browsing tools to easily find and recover whole projects down to just a couple of files. After the third generation, just delete (or archive if you really want) the GRANDFATHER19-03-15 (or GFAT19-03-15) before you start the new SON09-04-15.
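The generation renaming described above is easy to automate. A sketch, assuming the short FAT/GFAT prefixes and dated folder names from this post (only the prefix changes; everything inside keeps its original name):

```python
import shutil
from pathlib import Path

def rotate_generations(backup_root):
    """Rotate generation folders: delete GFAT*, FAT* -> GFAT*, SON* -> FAT*.

    After rotation you create a fresh SONdd-mm-yy folder for the new
    backup. Because only the enveloping folder's prefix changes, all
    subdirectory and file names stay intact for easy browsing/recovery.
    """
    root = Path(backup_root)
    for d in list(root.glob("GFAT*")):
        shutil.rmtree(d)                         # retire the oldest generation
    for d in list(root.glob("FAT*")):
        d.rename(root / ("GFAT" + d.name[3:]))   # father -> grandfather
    for d in list(root.glob("SON*")):
        d.rename(root / ("FAT" + d.name[3:]))    # son -> father
```

(If you prefer the full FATHER/GRANDFATHER names, the prefix lengths would need adjusting accordingly.)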

The next logical step is doing the syncing often (i.e. as soon as anything changes), and this is called disk mirroring. It is quite a complex issue technically and is typically solved with special hardware, one form of which is now known as RAID (originally Redundant Arrays of Inexpensive Disks). While very few PCs or workstations have such arrays built in these days, the technology did become popular in NAS (network attached storage) devices. There have been a few flops (an internet search should highlight the bad guys), but I have a trusty Netgear Stora unit with two 2GB disks (mirroring each other) that is reaching its 7th birthday without problems (and in a few crucial situations it has proven fast enough and invaluable). BTW, the Stora has had its share of less-than-complimentary reviews, which I don't understand. The network attachment means file transfer is slowed down to the speed of your network, but it makes the storage easier to share and online all the time. This is great for avoiding physical disasters like disk crashes. I have had 4 external drive crashes and two laptop drives fail in the same 7 years, and I have lost only a handful of work, easily regenerated, and nothing significant (I did run into some Adobe Lightroom problems, but that is a separate issue). There is, however, a compromise in how often data is backed up: for example, I have set my Stora to do its "mirror" backup every 2 hours from my desktop but just daily from my laptops.

The BIG secret that no one seems willing to spill about NAS and RAID is that such technology will diligently back up corrupted files, scrambled directories and unintended Delete Alls, etc. (i.e. user errors), and then keep perfect copies of these corrupted files. You of course have the window of one backup period to fix such errors, if you notice them. Thus NAS and RAID are great for physical security (at the expense of the extra hardware), but they are not the full backup story: they may not protect you against user errors. …Shhh, this seems to be a secret!!!

Sorry this post got a little long, but I thought the context might help. Disk mirroring can help avoid physical disasters, but the traditional three-generation method, especially as applied in an automated file-sync environment, still has its place for confident overall backup.

Inspired by this thread, I took some time to sort out my backup system this week.

Norm, thanks for the context, which certainly spurred me on to update my system. It previously relied on scheduled Windows 'archiving' to NAS with periodic copying of files to an external hard drive, neither of which was satisfactory.

Frank, thanks for the software recommendation and the description of your system. FreeFileSync, run through a batch file via Windows Task Scheduler, works very well and is incredibly flexible, addressing most of the steps and issues mentioned by Norm, including generational 'versioning' of files.

As an aside, I think it's great that contributors to this forum are happy to provide help & advice about topics that are not strictly MM Software related. I've come across a bunch of helpful tips and tricks applicable to my work beyond the use of MM.
Thanks for the very nice feedback, Leon. Thanks also to Norm for providing such useful information in the first place. It's very gratifying to see the forum bringing our users, staff and clients alike, together in an open environment, and I'm glad it's given you some good ideas.