The Discourse website is set up to back up its own data, but the backups
live on the same virtual machine that hosts the website.
As we use the website more and more, I think it would be good to keep
weekly backups of the website AND the virtual machine on another server.
So, if anyone has access to a machine I can ssh into and store the backups
on, do let me know. Backing up the website would involve the following steps
(sketched in the script after the list):
1. Shut down the VM.
2. Create a Vagrant box image.
3. Start the VM back up.
4. Log in to the remote server and copy the Vagrant box image over.
5. Remove old images (optional).
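Here is a rough sketch of what a weekly cron job for those steps could look like, in Python. The paths, hostname, and retention count are placeholders, not real values:

```python
#!/usr/bin/env python3
"""Weekly backup sketch: package the Discourse VM as a Vagrant box
and copy it to a remote server. Hostnames and paths are placeholders."""

import datetime
import subprocess

VAGRANT_DIR = "/home/discourse/vm"    # assumed path to the Vagrantfile
REMOTE = "backup@backup.example.org"  # hypothetical remote server
REMOTE_DIR = "~/discourse-backups"    # hypothetical remote directory
KEEP = 4                              # how many old images to keep

def run(cmd):
    """Run a command in the Vagrant directory, failing loudly on errors."""
    subprocess.run(cmd, cwd=VAGRANT_DIR, check=True)

stamp = datetime.date.today().isoformat()
box = f"discourse-{stamp}.box"

run(["vagrant", "halt"])                       # 1. shut down the VM
run(["vagrant", "package", "--output", box])   # 2. create the box image
run(["vagrant", "up"])                         # 3. start the VM back up
run(["scp", box, f"{REMOTE}:{REMOTE_DIR}/"])   # 4. copy image to the remote
# 5. (optional) prune old images on the remote, keeping the newest KEEP
run(["ssh", REMOTE,
     f"ls -t {REMOTE_DIR}/discourse-*.box | tail -n +{KEEP + 1} | xargs -r rm"])
```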
My estimate is that this would mean about half an hour of downtime every week. If we don't want this level of backup, we can just periodically download the website backup instead (the last one was 4.9 MB).
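If we go that lighter route, something like this could pull the newest backup tarball off the VM. The host and backup path here are guesses based on a standard Discourse install, so adjust to taste:

```python
# Fetch the newest Discourse backup tarball; host and path are assumptions.
import subprocess

HOST = "discourse@forum.example.org"  # hypothetical VM host
BACKUP_GLOB = "/var/discourse/shared/standalone/backups/default/*.tar.gz"

# Find the newest backup on the host, then copy it to the current directory.
newest = subprocess.run(
    ["ssh", HOST, f"ls -t {BACKUP_GLOB} | head -n 1"],
    capture_output=True, text=True, check=True,
).stdout.strip()
subprocess.run(["scp", f"{HOST}:{newest}", "."], check=True)
```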
Another option is that Discourse can back up directly to Amazon S3 (presumably without downtime), but again we'd need somebody with an S3 account.
I also have access to Sugar Labs (OLPC) servers in Cambridge, Boston, but
would ideally prefer not to use those for this purpose (though I can if need be).