In recent weeks I’ve been working a lot on the Trigger game (Live page.) These updates have included an overhaul of the style, the addition of new pages, a rewrite of the “Spy mode”, new particles, tweaked graphics, and many more changes behind the scenes. The code was significantly refactored to make it easier to extend and understand, now that this has become a collaborative project. The game has been tested in a few schools and shown to be a good success with children (and adults), and it seems to have a bright future. I also added sounds, music, and a simple music player.
Like the Science Shift Simulator, this is a game that emerged as a subset of the aDetector project. The player has to save events that match the given criteria, just as a trigger does in real life. The results are then combined across all the players in a given team to determine the final score, and the scores can be combined across experiments to make a “discovery”. This is still in development, which will continue over the next weeks and months.
This is another cooperative multiplayer game aimed at showing the public (especially high school pupils) how particle physics research actually takes place. Any number of players can take part and they are split into “Team ATLAS” and “Team CMS”. The score of each team is determined by the performance of the players on each “shift” they take at the trigger, and the final scores are combined for the discovery of the Higgs boson. There is also a “spy” mode where people can see the events as they are submitted.
- Challenge: This project needs an attractive, fast, and realistic detector model.
- Solution: Having already developed a decent detector model for the aDetector project, I simply used a two dimensional version for this project. I then split the detector finely in \(\eta\) and \(\phi\) to make interactions between the particles and the detector easier. The aesthetics went through a few iterations before settling on the final design. However, further optimisations and aesthetic changes are anticipated as development continues. (Resolved, to be revisited.)
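As an illustration of that kind of fine segmentation, a particle direction can be mapped to a detector cell with a couple of divisions. This is a sketch using made-up cell counts and coverage, not the game’s actual values:

```python
import math

N_ETA, N_PHI = 40, 64   # illustrative segmentation, not the real values
ETA_MAX = 2.5           # illustrative coverage in pseudorapidity

def cell_index(eta, phi):
    """Map a particle direction to an (i_eta, i_phi) detector cell."""
    if abs(eta) >= ETA_MAX:
        return None  # outside the detector acceptance
    i_eta = int((eta + ETA_MAX) / (2 * ETA_MAX) * N_ETA)
    i_phi = int((phi % (2 * math.pi)) / (2 * math.pi) * N_PHI)
    return i_eta, i_phi
```

With cells indexed like this, checking which particles touch which part of the detector becomes a constant-time lookup rather than a geometric search.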
- Challenge: This game puts a bit of a strain on my server.
- Solution: My web space uses a shared server, so many HTTP requests from the same client can look like a Denial of Service (DoS) attack, resulting in throttling of the requests. There are two main strategies to overcome this. The first option is to bundle several requests into one, reducing the total number of requests and the load on the server. This solution has not been implemented yet. The second option is to change the server settings. I do not have access to these, but as development continues I intend to move to a different server that can handle many very small requests. (To be revisited.)
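As a sketch of the first (not yet implemented) option, several queued submissions could be packed into one request body so the server sees a single HTTP request instead of many. The payload format here is purely hypothetical:

```python
import json

def bundle_events(pending_events):
    """Pack several queued event submissions into one request body.
    The client flushes its queue periodically instead of sending
    one request per event."""
    return json.dumps({"count": len(pending_events),
                       "events": pending_events})

# Instead of one POST per event, flush the whole queue at once:
queue = [{"event_id": 1, "saved": True}, {"event_id": 2, "saved": False}]
body = bundle_events(queue)
```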
- Challenge: This game needs cross platform and cross device support.
- Solution: This game was initially intended to be played with an iPad, but I did not have an iPad for testing. On the day of the game’s release I had to tweak the settings so that responses to mouse events were slower, making it easier to play on a tablet device. These settings are trivially changed to support multiple devices. (Resolved.)
- Challenge: The game should be responsive to the inputs of the user.
- Solution: Initially the game did not confirm success when a user clicked on the screen, and this led to confusion. As a result I added a big green “tick” for success and a big red “cross” for failure to inform the user of the status of the event. (Resolved.)
- Challenge: The game needed an animated histogram for the final result.
I don’t normally put lots of screenshots up, but I’m quite proud of the aesthetics here, so here are the three main screens of the main game:
The design went through a few iterations before settling on the current choice:
The user has access to several controls to interact with the application. They can choose how to view the detector, using Cartesian coordinates and two Euler angles (with the roll axis suppressed). The most expensive parts of the process are the generation of the event displays and the generation of the particle table. By default these are only updated after a certain interval, to allow the user to accumulate a significant number of events without being slowed down by the graphics. To save time the detector itself is rendered once in a cutaway view, and the particle tracks are overlaid on the saved image. Eventually the user will be able to get a full event display, including the detector response to the particles, with glowing detector components and so on.
The user has access to collections of particles, including electrons, muons, pions, kaons, photons, and protons. From these they can construct other particles, making selections as they do so. Once they have made parent particles they can then plot kinematic variables including mass, momentum, transverse momentum, and helicity angle. This should, in principle, allow students to learn how to recreate particles and how to separate signal from background effectively.
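The combination step can be sketched with simple four-vectors. The function names and the mass window below are illustrative, not the application’s actual code:

```python
import itertools, math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle combination.
    Each particle is a four-vector (E, px, py, pz)."""
    e = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

def combine(list_a, list_b, mass_window):
    """Build parent candidates from two daughter lists, keeping
    only combinations whose mass falls inside the window."""
    lo, hi = mass_window
    return [(a, b) for a, b in itertools.product(list_a, list_b)
            if lo < invariant_mass(a, b) < hi]
```

For example, two back-to-back 45.6 GeV muon candidates combine to a parent of mass 91.2 GeV, which would pass a window around the Z boson mass.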
Given the large amount of information available, the user has access to a number of tabs which can be collapsed out of view. This allows the user to run the application without the expensive canvas and DOM updates, and thus collect many more events.
This is still a work in progress, with reconstruction of particles being the next main priority. Eventually the user would be able to load their favourite detector geometry and beam conditions, then perform their analysis, saving the output in xml files and possibly uploading these to a server. This would allow users to act as “players” in “physics campaigns”, including the SPS experiments, HERA experiments, B factories, LEP experiments, and LHC experiments. This is, of course, a very ambitious goal, and one which has been ongoing for over a year at this point.
See other posts tagged with aDetector.
- Challenge: A sophisticated model for the detector was needed.
- Solution: The detector is split up by subdetector, with each subdetector having its own characteristic responses to different particles. The detector is split up in cylindrical coordinates, \((\rho,\eta,\phi)\), with each subdetector also being split into modules. Individual modules then react to the particles for reconstruction purposes. Thus with a few key parameters even a sophisticated model can be stored in a few variables that can be tuned quickly and easily. (Resolved.)
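As an illustration of how few variables such a model needs, the per-subdetector parameters might be stored along these lines. The names and numbers are invented for the sketch:

```python
# Illustrative per-subdetector response parameters: which kinds of
# particle each subdetector responds to, and what fraction of the
# particle's energy a module absorbs.
SUBDETECTORS = {
    "tracker": {"responds_to": {"charged"},                    "absorption": 0.01},
    "ecal":    {"responds_to": {"charged", "photon"},          "absorption": 0.90},
    "hcal":    {"responds_to": {"charged", "photon", "hadron"},"absorption": 0.95},
    "muon":    {"responds_to": {"muon"},                       "absorption": 0.10},
}

def deposit(subdetector, particle_tags, energy):
    """Energy a module records for a particle carrying the given tags."""
    params = SUBDETECTORS[subdetector]
    if not (particle_tags & params["responds_to"]):
        return 0.0
    return energy * params["absorption"]
```

Tuning the model then amounts to editing a handful of numbers rather than touching the reconstruction code.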
- Challenge: The detector should have a three dimensional view that the user can control.
- Solution: The detector is drawn using a wireframe with transparent panels. This is a method I developed in 2009 for a now defunct PHP generated SVG based visualisation of the BaBar electromagnetic calorimeter, which I used to show the absorbed dose as a function of detector region and time. The drawing algorithm is not perfect, as panels are drawn in order from furthest from the user to closest. This is sufficient for most purposes, but occasionally panels will intersect causing strange artefacts. Eventually this should be replaced with a much more stable, robust, and fast implementation, such as in three.js. (Resolved, to be revisited.)
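The back-to-front ordering described is the classic painter’s algorithm. A minimal sketch, where the panel and camera representations are assumptions:

```python
def draw_panels(panels, camera, draw):
    """Painter's algorithm: sort panels by distance from the camera and
    draw the furthest first, so nearer panels are painted over them.
    Each panel is a list of 3D vertices; `draw` renders one panel."""
    def depth(panel):
        # distance from the camera to the panel's centroid
        cx = sum(v[0] for v in panel) / len(panel)
        cy = sum(v[1] for v in panel) / len(panel)
        cz = sum(v[2] for v in panel) / len(panel)
        return ((cx - camera[0]) ** 2 +
                (cy - camera[1]) ** 2 +
                (cz - camera[2]) ** 2) ** 0.5
    for panel in sorted(panels, key=depth, reverse=True):
        draw(panel)  # breaks down for intersecting panels, as noted above
```

Sorting by centroid depth is exactly why intersecting panels cause artefacts: a single depth per panel cannot capture a panel that is partly in front of and partly behind another, which is what a z-buffered renderer such as three.js fixes.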
- Challenge: Particles should be modelled realistically and physically correctly.
- Solution: Particles are modelled with the most important parameters (mass, charge, decay modes etc) taken from the PDG. Their kinematic properties are modelled using special four vector classes, and decay products “inherit” from their parents in a consistent manner. Particles at the highest layer of the tree are assigned their four momenta, and then their decay products are decayed, inheriting the boost and production vertex from their parents. This is repeated recursively until all unstable particles are decayed. So far this does not take spin into account, as everything is decayed using a phase space model. Particles with large widths have their lineshape modelled using a Breit-Wigner shape. As a result, particles have realistic looking jets and four momentum is conserved. This required the development of special libraries to handle these decays and kinematic constraints. (Resolved, to be revisited.)
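The decay-and-boost step can be illustrated in one dimension: decay the parent in its rest frame, then boost the daughters by the parent’s motion so they “inherit” it. This is a simplified sketch, not the actual library:

```python
import math

def two_body_decay(parent_e, parent_pz, parent_m, m1, m2):
    """Decay a parent with lab-frame (E, pz) and mass parent_m into two
    daughters of masses m1 and m2, emitted along the boost axis for
    simplicity. Returns the two daughter (E, pz) pairs; energy and
    momentum are conserved by construction."""
    # daughter energies and momentum in the parent rest frame
    e1 = (parent_m**2 + m1**2 - m2**2) / (2 * parent_m)
    e2 = (parent_m**2 + m2**2 - m1**2) / (2 * parent_m)
    p = math.sqrt(max(e1**2 - m1**2, 0.0))
    # boost back to the lab frame (daughters inherit the parent's motion)
    beta = parent_pz / parent_e
    gamma = parent_e / parent_m
    d1 = (gamma * (e1 + beta * p), gamma * (p + beta * e1))
    d2 = (gamma * (e2 - beta * p), gamma * (-p + beta * e2))
    return d1, d2
```

Applying this recursively down the tree, with a random decay axis per vertex and a Breit-Wigner draw for broad resonances, gives the behaviour described above.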
- Challenge: Particle decay trees must be traversed consistently.
- Solution: This is harder than it sounds! Every time an event is generated, particles are recursively decayed for as long as phase space allows. The particles must then be traversed and displayed in the table in a consistent manner. Ensuring that all particles are listed hierarchically without missing anything out takes some care. (Resolved.)
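The traversal itself can be sketched as a depth-first walk that emits one indented row per particle. The dictionary representation is an assumption for the sketch:

```python
def list_particles(particle, depth=0, rows=None):
    """Depth-first traversal of a decay tree, returning one indented row
    per particle so the table lists every particle exactly once, with
    daughters shown under their parents."""
    if rows is None:
        rows = []
    rows.append("  " * depth + particle["name"])
    for daughter in particle.get("daughters", []):
        list_particles(daughter, depth + 1, rows)
    return rows
```

Because each particle is visited exactly once, the table stays consistent however deep the recursive decays go.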
- Challenge: Particle lists had to be prepared for the user.
- Solution: The user has access to a handful of “building blocks” to play with. These are taken from the (generated) list of particles per event and filtered by particle type. Further lists can be created or filtered from these lists, and parent particles can be reconstructed by combining lists. This means having to provide special classes to handle the particles and ensure that no particles are reconstructed recursively (leading to infinite loops). Apart from using reconstructed particles instead of generated particles, this has been resolved. (Resolved, to be revisited.)
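One way to guard against recursive reconstruction is to track each candidate’s final-state constituents and forbid overlapping combinations. A sketch, not the actual classes:

```python
def constituents(particle):
    """Set of final-state particle ids a candidate is built from."""
    if not particle.get("daughters"):
        return {particle["id"]}
    out = set()
    for d in particle["daughters"]:
        out |= constituents(d)
    return out

def can_combine(a, b):
    """Two candidates may only be combined if they share no
    final-state particle, so nothing is ever its own ancestor."""
    return not (constituents(a) & constituents(b))
```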
- Challenge: The user needs to be able to make histograms.
- Solution: I had made histograms for other projects, including the Reflections and Box plotter projects, so this was fairly easy to do. Even so, being able to create new histograms of arbitrary scales and variables on the fly meant that this histogram class had to be more robust than in previous projects. (Resolved.)
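A minimal version of such an on-the-fly histogram class might look like this sketch (linear binning only; the real class presumably handles more):

```python
class Histogram:
    """Fixed-binning histogram that can be created on the fly for any
    variable and range; out-of-range values go to under/overflow so
    no fill is ever silently lost."""
    def __init__(self, nbins, lo, hi):
        self.nbins, self.lo, self.hi = nbins, lo, hi
        self.bins = [0] * nbins
        self.underflow = self.overflow = 0

    def fill(self, x, weight=1):
        if x < self.lo:
            self.underflow += weight
        elif x >= self.hi:
            self.overflow += weight
        else:
            i = int((x - self.lo) / (self.hi - self.lo) * self.nbins)
            self.bins[i] += weight
```

Keeping explicit under/overflow counters is the kind of robustness that matters when the axis range is chosen by the user at run time.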
One of my more far reaching projects is the LGBT CERN group. It is a diverse group with people from across the world, and one of the issues that we care about is safety in different nations. This project keeps track of progress in different nations.
The information is stored in xml files which are then read and used to create maps. The user can step forward and backward through the history, or let it autoplay. There are two versions of the page, one which uses Google Maps and one which has a custom map that is more colourblind friendly.
There is scope to extend this project for other uses, and I also have maps showing the state of the EU and its history.
This project is currently unfinished and needs some further cleaning up of the code when time allows. It was based on Google Maps API experience from the Railway tickets project.
- Challenge: Finding vectors for the national borders was not easy.
- Solution: After much searching I found some useful vectors for the national borders. They are used here with two caveats: they are not my intellectual property, and they are not small in size. As a result I am not too keen to share this project with the wider world. (Resolved)
- Challenge: Making a colourblind friendly Google Map is not easy.
- Solution: I wanted to make the map colourblind friendly, and the simplest way to do this was to use striped fills. This is not a satisfactory solution using Google Maps, so I made a second page where it is a reasonable solution. This meant that I had to make a page that draws maps from scratch, including panning and zooming. This was easier than expected and may lead to other map-based projects in the future. (Resolved)
- Challenge: This project used striped fills in the polygons.
- Solution: One of the more difficult parts of this project was developing an algorithm that intersected multiple polygons in a consistent and sensible way. After much experimentation and development, this was achieved. This used knowledge developed on a previous and unfinished project that creates city skyline graphics. (Resolved)
- Challenge: I needed an xml reader.
A few years ago I found myself in need of a simple program to make pixel art. At the time I had access to the PHP suite of graphical libraries, and the ability to manipulate the HTML DOM, so armed with these tools I put together a simple tool that would allow me to create small images online, which I called the Painter. It was not particularly efficient, but it got the job done, and did so for free. (Now that the HTML canvas is supported pretty much everywhere that tool needs to be rewritten for the latest generation of browser technology.) During the development of this tool I found myself wanting to solve the following problem: given a two dimensional rectangular array of squares, how can I find all squares that reside within a given boundary? It’s a fairly straightforward problem, but one that requires keeping track of two lists of squares, and one that should scale, as far as possible, with \(n^2\) or lower. After some head scratching and experimenting with a few lines of code I found the solution I needed, tweaked it a bit, and was fairly impressed with its elegance.
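The problem described is essentially a breadth-first flood fill: keep a “frontier” list of squares to visit and a “seen” set of squares already found, so each square is handled exactly once. A sketch of the idea, with a hypothetical `inside` boundary test and `neighbours` accessor:

```python
from collections import deque

def squares_within(start, inside, neighbours):
    """All squares reachable from `start` without leaving the boundary.
    `inside(sq)` says whether a square is within the boundary and
    `neighbours(sq)` yields its adjacent squares. Linear in the
    number of squares visited."""
    seen = {start}          # squares already found
    frontier = deque([start])  # squares whose neighbours we still need to check
    while frontier:
        sq = frontier.popleft()
        for n in neighbours(sq):
            if n not in seen and inside(n):
                seen.add(n)
                frontier.append(n)
    return seen

# Example: a grid where the boundary is a 3x3 box.
def neighbours(sq):
    x, y = sq
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

region = squares_within((1, 1),
                        lambda s: 0 <= s[0] < 3 and 0 <= s[1] < 3,
                        neighbours)
```

Because the boundary test and the neighbour accessor are the only inputs, the same few lines work for pixels in the Painter or for any grid-of-components detector.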
A few months ago I found myself having to deal with the CMS Preshower system. I wanted to make a map of the system so that I could visualise how it looked in three dimensions. The only things I really knew were that it was arranged in a grid of square components (each square component subsequently arranged in strips) and that I had a tool where if I could access a single square I could access all of its neighbours. Since the physical extent of the preshower system gave me an obvious boundary I found myself with a very familiar sounding problem… Without even needing to look up the code from the Painter tool I wrote down the solution in a few lines and in about 10 minutes. What might have taken a few hours of framing the question, determining how to find the solution, and subsequently implementing it was reduced to a trivially solved problem with a minimal footprint of CPU time and memory. This later led on to the development of the \(\eta-\phi\) map, which is still under development and may eventually lead to more breakthroughs for the CMS experiment.
I make it a point to not invest much time in a project unless it’s going to teach me a new skill or show me how a new technology or feature works. By exploring what was possible with a few rudimentary tools I was able to unwittingly give myself the solution to a real world problem that saved a lot of headaches and time. Whether the extra projects are worth the extra time is a different discussion entirely (I personally think they are, and also think that what I learn in physics programming today I can use in private enterprise tomorrow) and one that deserves some serious thought. For now I am happy that I have a huge pool of experience to draw on that only keeps growing in time.
In the world of particle physics it’s possible to simulate (our best approximation of) the fabric of reality. The bad news is that this comes at a cost. Nearly all software is made in-house by scientists. Many of them are excellent programmers, but many aren’t. The documentation is often missing steps and there are sometimes dependencies on outdated packages. To make matters worse the documentation is often sparse and points to the wrong resources.
Today I’m trying to use two packages called Pythia and Delphes, which are used to simulate particle collisions and the respective detector responses. So far I’ve had to visit the Pythia page at least four times (including outdated pages that no longer exist) to find the correct location of the package. Once downloaded and installed, it turns out it requires some additional packages to run the parts I need. That means going down another rabbit warren of pages and outdated links to find out how to get FastJet and HepMC.
On one hand keeping all these packages separate and linking to each home page makes things more versatile and faster to develop. On the other hand having a single recipe or even a single installer can work wonders and get people using these projects more quickly and easily. That’s what the CMS experiment does, and it gets people past the first hurdle. All you need to do is a simple cmsrel <name> and then cmsenv and you have a fully working, internally consistent copy of the CMS software infrastructure. Each step that gets added to the process is a hurdle between the user and the end result, which often leads to frustration to the point where people simply give up trying to make things work. There have been several times in the past where I’ve made the decision that continuing to pursue a given route is just pointless. That’s why whenever I make a package I try to give simple all-in-one instructions to make things as easy as possible.
In any case this blog is supposed to be a useful repository of knowledge about the things I make and do, so here’s the link to the recipe: Physics cheatsheets.
This project was the first step in a larger project that scraped data from pages for analysis.
This page takes input from Wikipedia (although at the moment this is extracted manually from the main page) and then produces a C script which summarises the data for further analysis. ROOT is then used to make a plot showing the data. The idea behind the project was to make data scraping easier and to give me practice at it. If I ever get sufficient time I intend to create some resources to make data scraping easier and make sharing of publicly accessible data simpler. The data are taken from an outdated Wikipedia page which is missing many sources.
- Challenge: The data needs to be scraped from the page easily.
- Solution: To save time I manually copied the data from Wikipedia for offline analysis, but this should be fixed to read the data online. (Resolved, to be revisited)
- Challenge: The output needs to be some kind of source file, for example C or python.
- Solution: The source file I chose was a ROOT macro for convenience. Ideally the user should be able to choose what output they want (including xml) to best suit their needs. (Resolved, to be revisited)
In 2014 the CMS experiment at CERN released their 2010 data to the public for analysis. As a quick exercise I decided to analyse the dimuon mass spectrum to show that this could be done in a reasonable amount of time.
The input file is the 2010 Run B comma separated values file. The python script then produces same sign and opposite sign mass spectra and zooms in on the interesting regions.
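The heart of such a script is the invariant mass calculation and the charge pairing. This sketch assumes a simplified muon tuple rather than the actual CSV columns:

```python
import math

def dimuon_mass(mu1, mu2):
    """Invariant mass of a muon pair; each muon is a tuple
    (E, px, py, pz, charge)."""
    e = mu1[0] + mu2[0]
    px = mu1[1] + mu2[1]
    py = mu1[2] + mu2[2]
    pz = mu1[3] + mu2[3]
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

def split_by_sign(pairs):
    """Separate opposite-sign pairs (signal-rich) from same-sign
    pairs (background-like) before histogramming."""
    opposite = [p for p in pairs if p[0][4] * p[1][4] < 0]
    same = [p for p in pairs if p[0][4] * p[1][4] > 0]
    return opposite, same
```

The same-sign spectrum has no resonant contribution, which is why comparing the two shows the resonances so clearly.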
- Challenge: The main challenge was to make this project as quickly as possible.
- Solution: This project uses python and existing ROOT libraries for maximal development speed. The other data format available required using CMSSW and a virtual machine. In principle using CMSSW should be straightforward, but I decided against it because the software was already four years old and support would be minimal or non-existent, even to current CMS physicists. (Resolved)
This project creates an infographic showing the ROOT colors and how they relate to each other.
Physicists use ROOT to make plots and it’s often useful to be able to easily browse the color space. ROOT provides a color wheel, but I find the rectangular display very useful as well. In principle, colors in the same column should suit each other, which makes the rectangular display more useful than the color wheel.
One of my responsibilities as a CMS physicist is developing C++ and python code using the framework, CMSSW. The coding standards for CMSSW are very high, with strict checking and warnings treated as errors, so even the slightest lapse in safety causes compile errors. I just came across a build error because I tried to pass a const instance of a class as an argument to a method and then call a non const method of that class, giving the error message:
error: passing ‘const PixelMatchSParameters’ as ‘this’ argument of ‘float PixelMatchSParameters::sPhi1B()’ discards qualifiers
It had been a while since I’d seen this error, so I had to look it up, and when I did and realised the problem, a smile spread across my face. There’s something beautiful about how strongly typed C is, and how strict C++ forces you to be when passing around objects. When the code is clean and safe it’s something I can be proud of, and that’s one of the reasons I love C++. (On the other hand there are times when C++ makes me want to pull my hair out, but they are often the result of poorly documented code, or working with someone else’s perverse design choices.)
I’ve been developing in C and C++ for about a decade now, but so far I have never created my own standalone project from scratch, without building upon some other framework. Perhaps it’s time I did.