Very minimalist approach by the Swiss government, which basically contents itself with issuing flyers containing useful tips on how to protect yourself and others from COVID-19.
Authorities mainly content themselves with telling citizens what they SHOULD be doing, like not using public transport at peak times, or keeping at least 2 metres' distance from each other, as if that were even remotely possible for working people. Disinfection of public transport – or at least of the most critical, frequently touched parts like the handrails? No one does that.
Other than that, the authorities do nothing. Apparently, some medical equipment is already running low (with only 210 cases) and medical personnel are told to work without protective gear, but no attempt is made to procure more.
I also wonder whether there are any cases in which the virus was transmitted by grabbing a contaminated flyer or free newspaper, of which tens of thousands are read by hundreds of thousands of people every day.
I am relieved to know that the Federal Office of Public Health says that “The top priority for the Swiss Federal Council is the health of the public.” Everything okay then…
It’s not even a severe epidemic (yet) and hopefully won’t be, but one lesson to be learnt is already obvious.
No one, absolutely no one, is really prepared for such an event. Neither in terms of crisis management, nor in terms of economic and financial consequences. Switzerland has a bit over 100 coronavirus cases, and health-care resources are already running low and personnel are overstretched. In terms of economic costs, it is starting to eat into the substance: orders from abroad are low, tourism is down, banks' shares are tanking, etc.
But thankfully, Federal and Cantonal governments have decided to “remain seized of the matter” and continue to observe the situation closely 😂
A lot has been said about how you can avoid being infected with the new #Coronavirus. One idea is to let people preferably work from home, also called “home office”.
Certainly a good idea in many instances! However, one needs to prepare for the possibility that in an epidemic outbreak, ISPs and cloud service providers might also be affected. The people who work there might be infected too and unable to work, so services might be degraded or not available at all. So, while the situation with COVID-19 is nowhere near that scenario, it is still something to think about when you are planning for business continuity!
Personally, I have not considered this scenario at all until I got an e-mail from Microsoft:
Then I saved this file and enabled the service with
sudo systemctl enable atlas-jupyter.service
After that I could start it by typing
sudo systemctl start atlas-jupyter.service
And the Jupyter server now starts without problems every time I boot the virtual machine. I don't have to log in, and I can also start the virtual machine in headless mode, meaning no console (GUI) is necessary.
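For completeness, a minimal unit file along these lines might look like the following. This is a sketch, not my exact file: the user name, working directory, and Jupyter options are assumptions for illustration.

```ini
# /etc/systemd/system/atlas-jupyter.service (illustrative example)
[Unit]
Description=Jupyter server for the ATLAS OpenData VM
After=network.target

[Service]
# User and working directory are assumptions, adapt to your VM
User=atlas
WorkingDirectory=/home/atlas
ExecStart=/usr/bin/jupyter notebook --no-browser --ip=0.0.0.0
Restart=on-failure

[Install]
WantedBy=multi-user.target
```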
This is just a quick note for people using ATLAS Experiment OpenData with Oracle VirtualBox VMs on Fedora. The installation of the ATLAS VM is wonderfully described here. I just have two quick notes: one for users who run Oracle VirtualBox on Fedora (31) and one for the VM itself.
First, you’ll need to install the VirtualBox Extension Pack to be able to start the VM, because the VM needs USB 2.0 support enabled. Setting the USB settings to USB 1.1 will not fix this: the VM will start, but will get stuck during boot. On Ubuntu, the VirtualBox Extension Pack is provided as an Ubuntu package and can be installed with the command
sudo apt install virtualbox-ext-pack
On Fedora there is no such package, so you’ll have to get the VirtualBox Extension Pack from the website via the above link and install it manually. This is quite easy, though.
Click the download link and choose Open with Oracle VM VirtualBox. After installation the ATLAS VM starts without issue.
Particle physics is kind of a hobby of mine, and for some time now it has even been possible to get access to some of the data generated with the LHC accelerator at CERN. One such dataset is from the LHCb experiment, which gives you access to data about decays of B mesons to three hadrons. The largest file is some 636 MB (B2HHH_MagnetDown.root), which I chose to start exploring.
Exploring means that at the start I do not know exactly what kind of data is in there. So I had to do some exploring and decided to start with just reading out some data and drawing graphs with it. For starters, I am not primarily interested in the physics, but in how to work with the data and what I can use it for. So I study the tools and the programming, but it is clear that at the same time as I analyse the data, I will have to study the physics behind it; otherwise it is not possible to make useful evaluations.
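As a sketch of the kind of first calculation such an exploration leads to: the file stores the momentum components of the three hadron candidates per decay, from which an invariant mass can be computed. The function below is a minimal, self-contained version of that formula; the kaon-mass hypothesis and the example momenta are purely illustrative, not values from the dataset.

```python
import numpy as np

KAON_MASS = 493.677  # MeV/c^2 (PDG value), mass hypothesis for the hadrons

def invariant_mass(px, py, pz, masses):
    """Invariant mass (MeV/c^2) of a set of particles, given their
    momentum components (MeV/c) and assumed rest masses (MeV/c^2)."""
    px, py, pz, masses = (np.asarray(a, dtype=float)
                          for a in (px, py, pz, masses))
    # Relativistic energy of each particle: E^2 = p^2 + m^2 (with c = 1)
    energy = np.sqrt(px**2 + py**2 + pz**2 + masses**2)
    # Total four-momentum of the system, then M^2 = E_tot^2 - |p_tot|^2
    p_tot_sq = px.sum()**2 + py.sum()**2 + pz.sum()**2
    return float(np.sqrt(energy.sum()**2 - p_tot_sq))

# Illustrative (made-up) momenta for three kaon candidates, in MeV/c
m_candidate = invariant_mass([1000.0, -500.0, 200.0],
                             [300.0, 800.0, -400.0],
                             [5000.0, 4000.0, 3000.0],
                             [KAON_MASS] * 3)
```

Running this over all candidates in the file and histogramming the result is essentially what the first plots boil down to.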
I was lucky enough to be able to participate in the 3rd FCC Workshop from January 13 to 17, 2020 and got a first-hand look behind the scenes of the planning of the Future Circular Collider (FCC), which is supposed to come after the current Large Hadron Collider (LHC) has gone through its High Luminosity (HL) upgrades and needs to be replaced sometime in the beginning of the 2040s. That sounds like a long time, but as was pointed out, there is a lot of civil engineering work to be done, namely digging the 100 km circumference tunnel, and this needs to be started soon.
The FCC is actually several colliders, which together are referred to as FCC-INT. The first to be implemented is the FCC-ee collider, an electron-positron collider. Here, there is some competition between the circular FCC-ee (at CERN), the linear ILC (in Japan) and the CLIC (at CERN) designs.
If you’re new to this subject, I recommend reading Circular and Linear e+e− Colliders: Another Story of Complementarity by Alain Blondel and Patrick Janot (arXiv:1912.11871). In a nutshell, FCC-ee is the front-runner if you plan to do more than just Higgs physics, namely electroweak (EW), flavour and top physics as well as Beyond the Standard Model (BSM) physics, and if you want to keep the road open to a proton-proton (hadron) collider called FCC-hh. Current thinking seems to be that FCC-ee is favoured, with possible synergies if the ILC (or even CLIC) is built in Japan.
What I profited from most in these five intense days was getting some points drawn that I can now connect, especially in QCD and EFT, BSM physics, but also collider technologies, the software used to do particle physics, and the data acquisition (DAQ) process.
I now have a much better general understanding of the actual data that is being collected. Unfortunately, with my Windows 10 notebook, I couldn’t really participate in the software workshop. This is corrected now: it’s running Fedora 31, which turns out to be noticeably faster…
I enjoyed my stay at CERN. Nice international atmosphere. Some buildings could use a make-over, though :-).
The rendering of the scenes happens fully on the client side inside the browser and can require a substantial amount of memory and computing resources if you work with complex scenes. However, this one is simple enough to be enjoyed on systems without too many resources.
TLE stands for “Two-Line Element set”: a specially formatted text file containing, for each object being tracked, two lines of object data plus a header line giving the name of the object. It can be obtained from the Celestrak website and typically looks like this:
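To make the format concrete, here is an illustrative element set (an old ISS TLE; treat the numbers as an example, not current data) together with a small Python sketch that pulls two fields out of line 2 using the standard fixed-column layout:

```python
# Illustrative TLE (an old ISS element set; values are examples only)
TLE_NAME = "ISS (ZARYA)"
TLE_LINE1 = "1 25544U 98067A   20029.83269710  .00001015  00000-0  25981-4 0  9993"
TLE_LINE2 = "2 25544  51.6449 336.4797 0005166  68.2056  34.4249 15.49132201210876"

def parse_line2(line2):
    """Extract inclination (degrees) and mean motion (revolutions/day)
    from TLE line 2. TLE fields live in fixed columns: inclination in
    columns 9-16 and mean motion in columns 53-63 (1-indexed)."""
    inclination = float(line2[8:16])
    mean_motion = float(line2[52:63])
    return inclination, mean_motion

incl, mm = parse_line2(TLE_LINE2)
period_minutes = 1440.0 / mm  # minutes per day / revolutions per day
```

Note that the fields are fixed-width rather than whitespace-separated; the mean motion, for example, runs straight into the revolution number at the end of line 2, so naive splitting on spaces will not work.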
CesiumJS is very cool and lets you visualise data with a few lines of code. Of course, you can write complex applications with it. The point to consider is that the rendering of the scenes you create does happen fully on the client side and can require a large amount of RAM and processing power.
Here’s a visualisation of what the code does. It also lets you zoom, pan and rotate.
I’m currently working with my RTL-SDR device to catch ADS-B messages from nearby airplanes. Luckily, there’s an app for that called dump1090 that works with my device out of the box. I run it on FreeBSD 12.1 and it catches the messages well. Unfortunately, it is a bit behind and I am not sure the app is still maintained. In any case, the problem is with the way the app produces JSON files and uses Google Maps to visualise the captured data. So I am currently rewriting part of it to produce a GeoJSON-formatted output file and a web page that uses Mapbox (OpenStreetMap) instead of Google. The C code currently compiles on Linux but not (yet) on FreeBSD, but I am confident I can have a working base version by the end of next week (depending on my schedule, of course).
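The GeoJSON side of this is straightforward. Here is a rough Python sketch of the output format I am aiming for; the aircraft field names (hex, flight, lat, lon, altitude) are assumptions modelled on what dump1090 reports, and the actual rewrite is in C.

```python
import json

def planes_to_geojson(planes):
    """Turn a list of aircraft dicts into a GeoJSON FeatureCollection
    string. The input keys are assumed field names for illustration."""
    features = [
        {
            "type": "Feature",
            # GeoJSON mandates [longitude, latitude] coordinate order
            "geometry": {"type": "Point",
                         "coordinates": [p["lon"], p["lat"]]},
            "properties": {"hex": p["hex"],
                           "flight": p["flight"].strip(),
                           "altitude_ft": p["altitude"]},
        }
        for p in planes
    ]
    return json.dumps({"type": "FeatureCollection", "features": features})

sample = [{"hex": "4b1617", "flight": "SWR123 ", "lat": 47.45,
           "lon": 8.55, "altitude": 38000}]
doc = planes_to_geojson(sample)
```

A file in this shape can be handed directly to Mapbox as a GeoJSON source, which is the whole point of moving away from the app’s custom JSON.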
Here’s a sneak peek at the current layout (which will be improved once the backend and data display work well enough).
The goal of this project is to learn and demonstrate how to visualise real-time data and to learn how to work with signals from antennas – but that will be another project…
I wrote a quick example program in Python. The code consumes data in JSON format, uses Pandas to work the data and Matplotlib to display the data. It is a Jupyter notebook, but can easily be adapted to work standalone. Find the code on GitHub.
The data is from the SWPC website and contains monthly predictions of the sunspot number and the solar 10.7 cm flux. This data is important, for example, for radio amateurs. The data is valid as of November 11, 2019 but will change over time. The current data can be found at services.swpc.noaa.gov.
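The core of the notebook boils down to a few lines of Pandas and Matplotlib. The sketch below uses a small inline sample instead of fetching from the network, and the field names (time-tag, predicted_ssn, predicted_f10.7) as well as the values are my assumptions about the shape of the SWPC JSON product, for illustration only.

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # render off-screen so this also runs headless
import matplotlib.pyplot as plt

# Inline sample shaped like the SWPC prediction JSON
# (field names assumed, values illustrative)
sample = [
    {"time-tag": "2019-11", "predicted_ssn": 1.9, "predicted_f10.7": 68.0},
    {"time-tag": "2020-11", "predicted_ssn": 8.5, "predicted_f10.7": 72.3},
    {"time-tag": "2022-12", "predicted_ssn": 55.0, "predicted_f10.7": 105.0},
]

df = pd.DataFrame(sample)
df["time-tag"] = pd.to_datetime(df["time-tag"])  # "2019-11" -> 2019-11-01
df = df.set_index("time-tag")

fig, ax = plt.subplots()
df["predicted_ssn"].plot(ax=ax, label="Predicted sunspot number")
df["predicted_f10.7"].plot(ax=ax, label="Predicted 10.7 cm flux (sfu)")
ax.legend()
fig.savefig("solar_prediction.png")
```

Swapping the inline sample for a download of the real JSON file gives essentially the notebook on GitHub.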
As you can see, activity is predicted to stay very low until December 2022. The new solar cycle, Solar Cycle 25, is believed either to have already started or to be starting soon, by the end of 2019.