Here at Newicon, we’re big on data. We’re also big on data integrity and security.
Many of our clients have web applications or eCommerce stores upon which their business relies. These are driven, most commonly, by MySQL or SQL Server databases. Were they to lose the valuable data stored in those databases, it would be more than a small inconvenience; it could be catastrophic and even leave them bankrupt!
I know that sounds rather over-dramatic, but it would be foolish to underestimate the importance of sales order data for an eCommerce store owner, or indeed the list of projects in our own task system…
To this end, our servers all run nightly backups, with remote copies for redundancy. Lately, however, we have found certain aspects lacking. The key problems were the efficient maintenance of the backups and the amount of space they take up on each server. Although the backups are incremental (and so only store updated files), on servers with less disk capacity they can cause problems such as running out of space entirely.
The long and short of it is, we found ourselves needing a more ‘us’ method of backups…
Creating Our Own Backups
This is where Node.js and Amazon S3 come in. For the uninitiated, here’s a very, very brief explanation of each:
Amazon S3 is one of the suite of tools that form Amazon Web Services. S3, or ‘Simple Storage Service’, allows for storage and retrieval of files, using an API or web ‘console’.
You can connect to Amazon S3 using any number of SDKs (software development kits), but we opted for Node.js because we have the required skill sets throughout the company to be able to develop for it. This means that anyone in the team can pick it up and make improvements or fix bugs.
For the majority of our clients, there are two types of files that need backing up: databases and uploaded media. We’re not concerned about the code itself, as all of our projects are stored in git repositories. Therefore, we focussed our efforts on these two areas and on finding the best method for checking and backing them up.
The database backups were simple enough to implement: we would simply run a dump on each required database and upload the output file. As we value security as well as integrity, we decided to encrypt the output files using openssl after compressing them. These encrypted, compressed files would then be both secure and much smaller than the original data.
For the media, we decided that rather than upload all of the files each time, we would check the timestamps and only upload files that had been modified since the last backup. Thankfully, the S3 API is very quick and efficient, so we can verify each file when we upload it, or store a local cache of file metadata for very large sites or systems. S3 also provides file versioning, meaning that we can upload with impunity and retrieve an older version of the same file if we really need it. Win-win!
Once we had the basics of an upload script, we needed to make it easily installable, maintainable and executable across all of our servers. For this, NPM (Node Package Manager) came to the rescue. Using the Node.js standards, we were able to create a package that contains all of the code and can be called directly from the command line. The package can be installed on any machine with Node.js and NPM installed. It can even be used to back up files from your local machine!
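Making a package callable from the command line uses npm’s standard `bin` mechanism: a `package.json` entry maps a command name to a script. This is a generic sketch of that convention, not NIBU’s actual `package.json`:

```json
{
  "name": "nibu",
  "version": "1.0.0",
  "description": "Automated backup tool",
  "bin": {
    "nibu": "./bin/nibu.js"
  },
  "preferGlobal": true
}
```

With this in place, `npm install -g nibu` puts a `nibu` command on the PATH of any machine with Node.js and NPM installed.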
So, today we are proud to introduce NIBU (Newicon Backups), our very own backup solution.
From today, we’ve installed NIBU on all of our client servers free of charge. This will replace the previous backup solution across the board, although hopefully our clients will never need to use it…
If you’re a developer or sysadmin with the need for an automated backup service built using the latest tools, check out NIBU on NPM: https://www.npmjs.org/package/nibu.
If you’re a business owner and don’t already host your website or application with us, get in touch to find out the other awesome things we do to look after all of our valued customers.
If you’re interested in working with Newicon on your next digital project, get in touch now.