Particle physics is something of a hobby of mine, and for some time now it has even been possible to get access to some of the data generated by the LHC accelerator at CERN. One such dataset is from the LHCb experiment, which gives you access to data about decays of B mesons into three hadrons. The largest file is some 636 MB (B2HHH_MagnetDown.root), which I chose to start exploring.
Exploring means that at the start I do not know exactly what kind of data is in there. So I did some exploring and decided to start by simply reading out some data and drawing graphs with it. For starters, I am not primarily interested in the physics, but in how to work with the data and what I can use it for. So I am studying the tools and the programming, but it is clear that while I analyse the data I will also have to study the physics behind it; otherwise it is not possible to make useful evaluations.
I was lucky enough to be able to participate in the 3rd FCC Workshop from January 13 to 17, 2020, and got a first-hand look behind the scenes of the planning of the Future Circular Collider (FCC), which is supposed to come after the current Large Hadron Collider (LHC) has gone through its High Luminosity (HL) upgrades and needs to be replaced somewhere in the beginning of the 2040s. That sounds like a long time, but as was pointed out, there is a lot of civil engineering work to be done – namely digging the 100 km circumference tunnel – and this needs to be started soon.
The FCC is actually several colliders, referred to collectively as FCC-INT. The first to be implemented is FCC-ee, an electron-positron collider. Here there is some competition between the circular FCC-ee (at CERN), the linear ILC (in Japan) and the linear CLIC (at CERN) designs.
If you’re new to this subject I recommend reading Circular and Linear e+e− Colliders: Another Story of Complementarity by Alain Blondel and Patrick Janot (arXiv:1912.11871). In a nutshell, FCC-ee is the front-runner if you plan to do more than just Higgs physics – namely electroweak (EW), flavour and top physics as well as Beyond the Standard Model (BSM) physics – and if you want to keep the road open to a proton-proton (hadron) collider called FCC-hh. Current thinking seems to be that FCC-ee is favoured, with possible synergies should the ILC (or even CLIC) be built in Japan as well.
What I profited from most in these five intense days was getting a number of dots drawn that I can now connect: especially in QCD, effective field theory (EFT) and BSM physics, but also collider technologies, the software used to do particle physics, and the data acquisition (DAQ) process.
I now have a much better general understanding of the actual data that is being collected. Unfortunately, with my Windows 10 notebook I couldn’t really participate in the software workshop – this has been corrected: it is now running Fedora 31, which turns out to be noticeably faster…
I enjoyed my stay at CERN. Nice international atmosphere. Some buildings could use a make-over, though :-).
The rendering of the scenes happens fully on the client side inside the browser and can require a substantial amount of memory and computing resources if you work with complex scenes. However, this one is simple enough to be enjoyed on systems without too many resources.
A TLE, which stands for “Two-Line Element set”, is a specially formatted text file containing two lines of orbital data for each object being tracked, plus a header line giving the name of the object. It can be obtained from the Celestrak website and typically looks like this:
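To show how the fixed-width format works, here is a small parser that pulls a few fields out of a TLE using the standard NORAD column positions. The element set in the example is constructed purely for illustration – it is not a real, current object; fetch live TLEs from Celestrak for actual tracking.

```python
def parse_tle(name, line1, line2):
    """Extract a few fields from a TLE using the standard fixed columns."""
    return {
        "name": name.strip(),
        "catalog_number": line1[2:7].strip(),
        "epoch_year": int(line1[18:20]),           # two-digit year
        "epoch_day": float(line1[20:32]),          # fractional day of year
        "inclination_deg": float(line2[8:16]),
        "raan_deg": float(line2[17:25]),
        "eccentricity": float("0." + line2[26:33].strip()),  # decimal point is implied
        "mean_motion_rev_per_day": float(line2[52:63]),
    }

# Made-up element set for illustration only (not a live object).
name = "EXAMPLE SAT"
l1 = "1 25544U 98067A   20343.50000000  .00001000  00000-0  10000-4 0  9999"
l2 = "2 25544  51.6400 100.0000 0001000  90.0000 270.0000 15.50000000123456"

elements = parse_tle(name, l1, l2)
# The orbital period in minutes follows directly from the mean motion.
period_min = 24 * 60 / elements["mean_motion_rev_per_day"]
print(elements["inclination_deg"], round(period_min, 1))
```

Libraries like CesiumJS (or sgp4 on the Python side) do this parsing for you, but it helps to know which columns the numbers live in.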
CesiumJS is very cool and lets you visualise data with a few lines of code. Of course, you can also write complex applications with it. The point to consider is that the rendering of the scenes you create happens fully on the client side and can require a large amount of RAM and processing power.
Here’s a visualisation of what the code does. It also lets you zoom, pan and rotate.
I wrote a quick example program in Python. The code consumes data in JSON format, uses Pandas to work with the data and Matplotlib to display it. It is a Jupyter notebook, but can easily be adapted to run standalone. Find the code on GitHub.
The data is from the SWPC website and contains monthly predictions of the sunspot number and the 10.7 cm solar flux – data that is important, for example, for radio amateurs. The data is valid as of November 11, 2019, but will change over time. The current data can be found at services.swpc.noaa.gov.
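The JSON → Pandas step can be sketched in a few lines. The field names (`time-tag`, `predicted_ssn`, `predicted_f10.7`) reflect my reading of the SWPC predicted-solar-cycle feed – treat them as assumptions and check the live data at services.swpc.noaa.gov; the sample records here are made-up illustrative values, not real predictions.

```python
import json
import pandas as pd

# Illustrative sample in the assumed SWPC schema (values are made up).
sample = """[
  {"time-tag": "2019-12", "predicted_ssn": 2.0, "predicted_f10.7": 68.0},
  {"time-tag": "2020-01", "predicted_ssn": 3.0, "predicted_f10.7": 69.0},
  {"time-tag": "2020-02", "predicted_ssn": 4.5, "predicted_f10.7": 70.5}
]"""

df = pd.DataFrame(json.loads(sample))
df["time-tag"] = pd.to_datetime(df["time-tag"])  # "YYYY-MM" -> timestamp
df = df.set_index("time-tag")
print(df["predicted_ssn"].max())

# In the notebook, plotting is then a one-liner, e.g.:
# df.plot(y=["predicted_ssn", "predicted_f10.7"], subplots=True)
```

For the real feed you would replace `sample` with the response body fetched from the SWPC URL.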
As you can see, the activity is predicted to stay very low until December 2022. The new solar cycle, Solar Cycle 25, is believed either to have already started or to be starting by the end of 2019.
BOINC is open-source software provided by the University of California, Berkeley, intended for people to contribute their computers’ spare computing time to calculations for scientific projects. Examples of such projects are Einstein@Home, SETI@Home and LHC@Home, among many others.
The BOINC client can be run standalone or together with Oracle’s VirtualBox; some projects indeed require VirtualBox to run.
I have been running the BOINC client software for several years now on different platforms such as Fedora, Ubuntu, FreeBSD and Windows 10, and have contributed to a handful of science projects, but mostly to LHC@Home.
BOINC also has a server part that lets you host your own science projects. If you have a lot of computations to do and need additional computing power, you might want to look at this solution. A BOINC server consists of several parts, such as an Apache HTTP server and a MySQL database server, so setting one up by hand is a bit tedious. Thankfully, volunteers provide Docker containers and VirtualBox VMs you can download and use. Details can be found here.
CERN held its Open Days on September 14 and 15. As the LHC is in Long Shutdown 2 (LS2) for upgrades until early 2021, this was a good opportunity for CERN to present itself and its work to the public.
Both days drew huge crowds and lines for underground visits were long – at one point waiting times for ATLAS visits were 3 hours.
I arrived on Sunday, September 15, shortly before 10 a.m., and after getting my wristband at the check-in tent went straight for transport to a remote site – I already know part of the Meyrin site, and ATLAS was already overcrowded – so I went to the bus stop in search of Bus F, to go to the CMS experiment site. Unfortunately, I couldn’t find this bus, so I decided to jump on the one going to the LHCb site. Good choice!
Not exactly news, as it dates back to April 2019, but still interesting since we are very close to the end of Solar Cycle 24 and the solar minimum.
NOAA and NASA experts sat down together and came up with a preliminary prediction of what to expect from the next solar cycle, number 25.
It’s expected to be a weak cycle, similar to the current one. As the cycles have been getting weaker and weaker since Solar Cycle 21, this might mean that the decline in solar activity is coming to an end with Solar Cycle 25.
According to this preliminary prediction, Solar Cycle 24 is to end somewhere between July 2019 and September 2020. Solar Cycle 25 is expected to peak between 2023 and 2026 with around 95 to 130 sunspots.
The number of sunspots is an indicator of solar activity and of space weather events such as radio blackouts, geomagnetic storms and solar radiation storms.
Today we have a new active region on the Sun with two sunspots. Its Hale magnetic classification is given as β, which means there might be some solar flare activity, though flares above C class are not very likely.
This is a rare event, as Solar Cycle 24 is drawing to a close, with the minimum predicted somewhere between July 2019 and September 2020. During this time there are few to no active regions – and thus hardly any solar flares – on the Sun.
While this is not a weather blog, I still think this is impressive. The Atlantic hurricane season started quietly in June, with not much activity until the end of August. Now we are reminded by this impressive system called «Dorian» that the hurricane season lasts until November.
That’s how the NOAA weather satellite saw the hurricane in the early morning of August 31 (UTC).
I have been using BOINC software to participate in scientific computing projects for around four years and contributed to several projects such as Einstein@Home, SETI@Home, Asteroids@Home and my personal favourite LHC@Home.
Having started out fetching work directly from LHC@Home, I later switched to a pool with Gridcoin. I am now switching back and letting my boxes crunch exclusively for LHC@Home.
My four BOINC clients now use a local Squid proxy specially configured for LHC@Home and CernVM-FS. While my number of machines probably does not do much to cut down on network usage, it’s something I tried some years ago and had abandoned. Apparently, LHC@Home now recommends running a local proxy if you have several crunchers in your network.
I am happy to oblige!