The goal was to replace the existing solution, which published to Twitter directly from the script producing the maps of earthquake locations, with a new solution that publishes to Twitter asynchronously and implements a better throttling mechanism. The old approach ran into Twitter rate limits several times during periods of high seismic activity, which led to the account being blocked or shadow-banned because it was identified as publishing spam.
Now, after having successfully created the maps, the application places a message in a RabbitMQ queue. The queue is checked periodically, and each message is published to Twitter after a short random delay of between 1 and 4 seconds.
You can see the published tweets here. This is the development and test account, and the application is still in development mode, so the links may or may not work.
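The throttling side of that scheme can be sketched roughly as follows. This uses Python's standard-library queue as a stand-in for the RabbitMQ queue (with Pika you would consume from a channel instead), and the function and parameter names here are illustrative, not the actual implementation:

```python
import queue
import random
import time

def publish_with_throttle(q, post_fn, min_delay=1.0, max_delay=4.0):
    """Drain pending messages, sleeping a random 1-4 s before each post
    so the account does not look like a spam bot during quake swarms."""
    while True:
        try:
            message = q.get_nowait()
        except queue.Empty:
            return
        # Random delay between posts is the throttling mechanism
        time.sleep(random.uniform(min_delay, max_delay))
        post_fn(message)  # e.g. a call into a Twitter API client

# Example run with a dummy post function (zero delay for the demo)
q = queue.Queue()
q.put("example earthquake map tweet")
posted = []
publish_with_throttle(q, posted.append, min_delay=0.0, max_delay=0.0)
```

The random delay, rather than a fixed one, also spreads out bursts of messages produced at the same time.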
So I have been doing some work with RabbitMQ using Python and Pika. Namely, I want to add message queues to some of my applications so that they can do work asynchronously and the Python program does not block for any significant amount of time.
For that I installed rabbitmq-server and the python3-pika package on an Ubuntu 19.04 box.
An example program I wrote (actually adapted from an existing one) some time ago shows the speed-up achieved when switching from CPU to GPU; it also served to test the setup and benchmark my systems.
The example can be run as a Jupyter notebook or in a terminal.
The program generates a fractal image, the Mandelbrot set, and measures the time the computation takes.
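The core of that benchmark is the escape-time iteration computed for each pixel. Here is a minimal pure-Python sketch of that computation; the actual program presumably vectorizes it and offloads it to the GPU, and the function and parameter names here are illustrative:

```python
def mandel(cr, ci, max_iter=255):
    """Return the iteration at which z escapes for c = cr + ci*j,
    or max_iter if it stays bounded (the point is in the set)."""
    zr = zi = 0.0
    for i in range(max_iter):
        # z <- z^2 + c, done on the real and imaginary parts
        zr, zi = zr * zr - zi * zi + cr, 2.0 * zr * zi + ci
        if zr * zr + zi * zi >= 4.0:  # |z| >= 2: the orbit has escaped
            return i
    return max_iter

# Iteration counts for a small grid over the classic viewport
image = [[mandel(-2.0 + x * 3.0 / 64, -1.5 + y * 3.0 / 64)
          for x in range(64)] for y in range(64)]
```

Since every pixel is computed independently, the problem is embarrassingly parallel, which is exactly why the GPU version wins so clearly.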
Unsurprisingly, there is a considerable speed-up when switching from CPU to GPU. On my systems the CPU version takes around 4.4 s to compute the image, while the GPU version does it in around 0.3 s – roughly a 15-fold speed-up, and a considerable time saver, I’d say.
The other interesting thing here is how easy it is to use your NVIDIA card with Python and take advantage of these speed-ups.
My code is available on GitHub; feel free to use, comment or share.
CERN had its OpenDays on September 14 and 15. As the LHC is in Long Shutdown 2 (LS2) for upgrades until early 2021, this was a good possibility for CERN to present itself and its work to the public.
Both days drew huge crowds and lines for underground visits were long – at one point waiting times for ATLAS visits were 3 hours.
I arrived on Sunday, September 15 shortly before 10 a.m. and, after getting my wrist band at the check-in tent, went straight for transport to a remote site – I already know part of the Meyrin site, and ATLAS was already overcrowded, so I went to the bus stop in search of bus F to go to the CMS experiment site. Unfortunately, I couldn’t find this bus, so I decided to jump on the one going to the LHCb site. Good choice!
Nice day in Lucerne and an excellent opportunity to learn about CSCS’s work and interact with the staff. This was my second CSCS Lab Day, and although I am not working in the HPC field, I learned a lot. This event is interesting because it is focused on the interaction of HPC users with the CSCS infrastructure, so you can get a lot of information about containers, virtualization and the CSCS user environment without being overwhelmed by all the HPC-specific stuff.
The day started with a talk given by Prof. Domenico Giardini, ETH Zurich, who described how the seismometer of the InSight Mars mission was developed and deployed, and what results have been obtained so far.
As I announced a few days ago, I was looking into how to migrate my websites to a virtual server environment using VirtualBox.
The installation and configuration were pretty straightforward and basically the same as on the original websites; the operating system remains Ubuntu 18.04 LTS and the software environment is identical. However, this was a good opportunity to clean up some things that had become outdated.
My company website ofehrmedia.com now runs on a newer version of the Zotonic Erlang CMS (at the time of writing this is 0.51). There was no problem migrating the content and database from the previous version (namely 0.39).
My sun.ofehr.space website is still running on the Yaws web server, but some of the data acquisition code needed to be updated, as the source format changed. Thankfully, we are close to the solar minimum of solar cycle 24, so there is time for a bigger update on how data on solar events is collected and displayed. For the time being, SDO videos are no longer produced: there was an API change on Helioviewer.org that is fixed now, but I decided to redo the whole process of how this data is acquired and processed.
earth.ofehr.space is also still running on the Yaws web server. A handful of sources for earthquake data – namely Iceland, Turkey, Mexico, Switzerland, the Philippines and some others – were ditched, as they make it exceedingly difficult to acquire the data, and I’ve decided it’s not worth my time. I will spend my efforts on improving the data display for the remaining data sources.
The planets.ofehr.space website is also running on the Yaws web server and currently only displays data on near-Earth objects.
Now, the interesting part will be to see how the VirtualBox environment behaves in production and how easy it is to do DevOps style development with it.
Not exactly news, as it dates back to April 2019, but still interesting since we are very close to the end of solar cycle 24 and the solar minimum.
NOAA and NASA experts sat together and came up with a preliminary prediction of what to expect from the next solar cycle, number 25.
It’s expected to be a weak cycle, similar to the current one. As the cycles have been getting weaker and weaker since solar cycle 21, this might mean that the decline in solar activity is coming to an end with solar cycle 25.
According to this preliminary prediction solar cycle 24 is to end somewhere between July 2019 and September 2020. Solar cycle 25 is expected to peak between 2023 and 2026 with around 95 to 130 sunspots.
The number of sunspots is an indication of solar activity and of space weather events such as radio blackouts, geomagnetic storms and solar radiation storms.
Today, we have a new active region on the Sun with two sunspots. The magnetic configuration in the Hale classification is given as β, which means that there might be some solar flare activity, but not very likely above C class.
This is a rare event, as solar cycle 24 is drawing to a close, with a minimum predicted somewhere between July 2019 and September 2020. During this time there are few to no active regions, and thus hardly any solar flares, on the Sun.
I am currently running some websites on bare metal servers and while I am not prepared (yet) to move these to virtual servers in the cloud, I do want to virtualize them and run them on Oracle’s VirtualBox.
Most of the migration is straightforward, of course. I duplicated the Ubuntu 18.04 LTS environment in a VirtualBox VM and moved the configuration and files over. For the collected data I created a separate storage container that expands as needed.
There was only one issue, in networking. I used a bridged adapter in the network settings, but the box was only reachable from the host operating system, not from other machines. That is fixed now – not sure how, though. It’s one of those “change settings multiple times until it works” type of fixes.
Currently the development and test environments are moved, and the development environment is set up so I can edit the files and connect to the database. The only thing left to figure out is how best to publish the changes from development to test to production. This should happen with the least possible effort and the highest degree of automation possible. Still working on that…
While this is not a weather blog, I still think this is impressive. The Atlantic hurricane season started quietly in June, with not much activity until the end of August. Now we are reminded that the hurricane season lasts till November by this impressive system called «Dorian».
That’s how the NOAA weather satellite saw the hurricane on August 31 early morning UTC.