In a previous article, I described how I migrated some websites from a bare metal server to a virtual machine running on the same server.
It’s actually three virtual machines: one runs the development environment, one the test environment, and one production. The three are basically identical in setup and installed software, but production has more resources assigned to it (two processors instead of one, and more virtual disk space for data).
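Giving the production VM its extra resources is a one-liner each from the host. A minimal sketch, assuming a VM named "sites-prod" and a data disk at a placeholder path (both names are hypothetical):

```shell
# Assign a second CPU to the production VM (it must be powered off first)
VBoxManage modifyvm "sites-prod" --cpus 2

# Grow the virtual data disk to a larger cap, here 200 GB (size is in MB)
VBoxManage modifymedium disk ~/VirtualBox\ VMs/sites-prod/data.vdi --resize 204800
```

The dev and test VMs simply keep the defaults, which is what keeps the three environments otherwise identical.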
The next challenge was how to automate the development process and avoid having to manually copy the files each time an update is performed.
There are several ways to do this, of course, but ideally they all involve some form of DevOps methodology.
As I announced a few days ago, I was looking into how to migrate my websites to a virtual server environment using VirtualBox.
Installation and configuration were pretty straightforward and basically the same as for the original websites: the operating system remains Ubuntu 18.04 LTS and the software environment is identical. However, this was a good opportunity to clean up some things that had become outdated.
My company website ofehrmedia.com now runs on a newer version of the Zotonic Erlang CMS (0.51 at the time of writing). There was no problem migrating the content and database from the previous version (namely 0.39).
My sun.ofehr.space website is still running on the Yaws web server, but some of the data acquisition code needed to be updated, as the source format changed. Thankfully, we are close to the solar minimum of solar cycle 24, so there is time for a bigger update of how data on solar events is collected and displayed. For the time being, SDO videos are no longer produced: there was an API change on Helioviewer.org which is fixed now, but I decided to redo the whole process of how this data is acquired and processed.
The earth.ofehr.space website is also still running on Yaws. A handful of sources for earthquake data (namely Iceland, Turkey, Mexico, Switzerland, the Philippines, and some others) were dropped, as they make it exceedingly difficult to acquire the data, and I’ve decided it’s not worth my time. I will spend my efforts on improving the data display for the remaining sources.
The planets.ofehr.space website is also running on Yaws and currently only displays data on near-Earth objects.
Now, the interesting part will be to see how the VirtualBox environment behaves in production and how easy it is to do DevOps style development with it.
I am currently running some websites on bare metal servers, and while I am not (yet) prepared to move these to virtual servers in the cloud, I do want to virtualize them and run them on Oracle’s VirtualBox.
Most of the migration was straightforward, of course. I duplicated the Ubuntu 18.04 LTS environment in a VirtualBox VM and moved the configuration and files over. For the collected data I created a separate storage container that expands as needed.
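An expanding data disk of this kind can be set up from the host with two commands. A sketch, assuming a VM named "sites-prod", a SATA controller named "SATA", and a placeholder file path (all hypothetical names):

```shell
# Create a dynamically allocated ("Standard" variant) 100 GB disk;
# it only consumes host space as data is actually written to it
VBoxManage createmedium disk --filename ~/VirtualBox\ VMs/sites-prod/data.vdi \
    --size 102400 --variant Standard

# Attach it to the VM on the second SATA port
VBoxManage storageattach "sites-prod" --storagectl "SATA" \
    --port 1 --device 0 --type hdd --medium ~/VirtualBox\ VMs/sites-prod/data.vdi
```

Inside the guest, the new disk then just needs a filesystem and a mount point like any other drive.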
There was only one issue, with networking. I used a bridged adapter in the network settings, but the box was only reachable from the host operating system, not from other machines. That is fixed now, though I’m not sure how; it’s one of those “change settings multiple times until it works” kinds of fixes.
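For reference, the bridged setup can also be driven from the command line, which makes the settings easier to inspect and reproduce than clicking through the GUI. A sketch, with a hypothetical VM name and host interface:

```shell
# Bridge the VM's first NIC onto the host's physical interface
# (replace enp3s0 with the actual host adapter name)
VBoxManage modifyvm "sites-prod" --nic1 bridged --bridgeadapter1 enp3s0

# Allowing promiscuous mode on the bridged NIC is one setting that,
# in my understanding, sometimes matters for reachability from other machines
VBoxManage modifyvm "sites-prod" --nicpromisc1 allow-all

# Verify what the VM is actually configured with
VBoxManage showvminfo "sites-prod" | grep -i nic
```

I can’t say for certain which of these was the decisive change in my case, but scripting them at least makes the working configuration repeatable.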
The development and test environments have been moved, and the development environment is set up so I can edit the files and connect to the database. The only thing left to figure out is how best to publish changes from development to test to production. This should happen with the least possible effort and the highest possible degree of automation. Still working on that…