Tag Archives: xml


One of the projects I have wanted to develop for a long time is a browser based particle physics experiment simulator. Such a project would generate events using Monte Carlo methods and simulate their interactions with the detector. I made this partly as an educational aid, partly as a challenge to myself, and partly because at the time I was feeling some frustration with the lack of real analysis in my job. As expected for a CPU intensive Javascript application, this reaches the limits of what is possible with current technology quite rapidly.


Live page V1
Live page V2
Live page V3
Live page V4
Live page V5
GitHub repository


The model for the detector is saved in an XML file which is loaded using the same methods developed in the Marble Hornets project. Particle data are currently stored in Javascript source files, but will eventually use XML as well. The particle interactions are simulated by first choosing a process (eg \(e^+e^-\to q\bar{q}\)) and then decaying the particles. Jets are formed by popping \(q\bar{q}\) pairs out of the vacuum while phase space allows and then arranging the resulting quarks into hadrons. Bound states are then decayed according to their Particle Data Group (PDG) branching fractions, using phase space decays. The paths of the particles are propagated through the detector using a stepwise helix propagation. Energy depositions in the detector are estimated according to the characteristic properties of the detector components. A list of particles is then compiled based on the particles produced, and these can be used to reconstruct parent particles.
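The stepwise helix propagation can be sketched roughly as follows. This is not the project's actual code: the field strength, units, and the shape of the particle object are all illustrative.

```javascript
// Illustrative sketch of stepwise helix propagation for a charged particle
// in a uniform solenoidal field B along z. All names and values are invented.
function propagateHelix(p, steps, ds) {
  // p: { x, y, z, px, py, pz, q } with momenta in GeV and charge in units of e.
  const B = 1.5;                         // assumed field strength in tesla
  const pT = Math.hypot(p.px, p.py);     // transverse momentum
  // Curvature kappa = 0.3 * q * B / pT (pT in GeV, B in T, radius in metres);
  // a neutral particle (q = 0) travels in a straight line.
  const kappa = 0.3 * p.q * B / pT;
  const tanLambda = p.pz / pT;           // dip angle controls the z advance
  let phi = Math.atan2(p.py, p.px);      // direction of transverse motion
  let { x, y, z } = p;
  const path = [];
  for (let i = 0; i < steps; i++) {
    x += ds * Math.cos(phi);             // advance one step along the track
    y += ds * Math.sin(phi);
    z += ds * tanLambda;
    phi -= kappa * ds;                   // bend the direction each step
    path.push({ x, y, z });
  }
  return path;
}
```

The appeal of the stepwise form is that the detector response can be sampled at each step, rather than solving for intersections with detector surfaces analytically.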

The user has access to several controls to interact with the application. They can choose how to view the detector, using Cartesian coordinates and two Euler angles (with the roll axis suppressed). The most expensive parts of the process are the generation of the event displays and the generation of the particle table. By default these are only updated after a certain interval, to allow the user to accumulate a significant number of events without being slowed down by the graphics. To save time the detector itself is rendered once in a cutaway view, and the particle tracks are overlaid on the saved image. Eventually the user will be able to get a full event display, including the detector response to the particles, with glowing detector components etc.
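The interval-based updating amounts to a simple counter around the expensive redraw, something like this (names invented for illustration):

```javascript
// Sketch of throttled updates: the expensive redraw only fires every
// `interval` events, so event generation is not graphics-bound.
function makeThrottledUpdater(redraw, interval) {
  let count = 0;
  return function onEvent() {
    count += 1;
    if (count % interval === 0) redraw(count); // redraw every Nth event
  };
}
```

With an interval of 100, a run of 1000 generated events triggers only 10 canvas and DOM updates.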

The user has access to collections of particles, including electrons, muons, pions, kaons, photons, and protons. From these they can construct other particles, making selections as they do so. Once they have made parent particles they can then plot kinematic variables including mass, momentum, transverse momentum, and helicity angle. This should, in principle, allow students to learn how to recreate particles and how to separate signal from background effectively.
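The arithmetic behind reconstructing a parent and plotting its kinematics is just four-vector addition. A minimal sketch (not the project's actual four-vector class):

```javascript
// Combine two daughter four-vectors into a parent candidate.
function addFourVectors(a, b) {
  return { E: a.E + b.E, px: a.px + b.px, py: a.py + b.py, pz: a.pz + b.pz };
}

// Invariant mass: m^2 = E^2 - |p|^2 (natural units, c = 1).
function invariantMass(v) {
  const m2 = v.E * v.E - (v.px * v.px + v.py * v.py + v.pz * v.pz);
  return Math.sqrt(Math.max(m2, 0)); // clamp tiny negative rounding errors
}

// Transverse momentum: component of momentum perpendicular to the beam (z) axis.
function transverseMomentum(v) {
  return Math.hypot(v.px, v.py);
}
```

Two back-to-back 1 GeV photons, for example, combine to a parent at rest with an invariant mass of 2 GeV.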

Given the large amount of information available, the user has access to a number of tabs which can be collapsed out of view. This allows the user to run the application without the expensive canvas and DOM updates, and thus collect many more events.

This is still a work in progress, with reconstruction of particles being the next main priority. Eventually the user would be able to load their favourite detector geometry and beam conditions, then perform their analysis, saving the output in XML files and possibly being able to upload these to a server. This would allow users to act as “players” with “physics campaigns”, including the SPS experiments, HERA experiments, B factories, LEP experiments, and LHC experiments. This is, of course, a very ambitious goal, and one which has been ongoing for over a year at this point.

See other posts tagged with aDetector.


Challenge: A sophisticated model for the detector was needed.
Solution: The detector is split up by subdetector, with each subdetector having its own characteristic responses to different particles. The detector is split up in cylindrical coordinates, \((\rho,\eta,\phi)\), with each subdetector also being split into modules. Individual modules then react to the particles for reconstruction purposes. Thus with a few key parameters even a sophisticated model can be stored in a few variables that can be tuned quickly and easily. (Resolved.)
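The module lookup this implies can be sketched as a simple binning in \(\eta\) and \(\phi\) per subdetector. The bin edges and object shape here are invented for illustration:

```javascript
// Locate the module a particle hits from its (eta, phi) coordinates.
// A subdetector is described by a few parameters rather than per-module data.
function findModule(subdetector, eta, phi) {
  const iEta = Math.floor((eta - subdetector.etaMin) / subdetector.etaWidth);
  // Wrap phi into [0, 2*pi) before binning, since phi is periodic.
  const twoPi = 2 * Math.PI;
  const phiWrapped = ((phi % twoPi) + twoPi) % twoPi;
  const iPhi = Math.floor(phiWrapped / subdetector.phiWidth);
  if (iEta < 0 || iEta >= subdetector.nEta) return null; // outside acceptance
  return { iEta, iPhi };
}
```

This is what lets a whole subdetector live in a handful of tunable variables: module positions are computed, never stored.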
Challenge: The detector should have a three dimensional view that the user can control.
Solution: The detector is drawn using a wireframe with transparent panels. This is a method I developed in 2009 for a now defunct PHP generated SVG based visualisation of the BaBar electromagnetic calorimeter, which I used to show the absorbed dose as a function of detector region and time. The drawing algorithm is not perfect, as panels are drawn in order from furthest from the user to closest. This is sufficient for most purposes, but occasionally panels will intersect causing strange artefacts. Eventually this should be replaced with a much more stable, robust, and fast implementation, such as in three.js. (Resolved, to be revisited.)
Challenge: Particles should be modelled realistically and physically correctly.
Solution: Particles are modelled with the most important parameters (mass, charge, decay modes etc) taken from the PDG. Their kinematic properties are modelled using special four vector classes, and decay products “inherit” from their parents in a consistent manner. Particles at the highest layer of the tree are assigned their four momenta, and then their decay products are decayed, inheriting the boost and production vertex from their parents. This is repeated recursively until all unstable particles are decayed. So far this does not take spin into account, as everything is decayed using a phase space model. Particles with large widths have their lineshape modelled using a Breit-Wigner shape. As a result, particles have realistic looking jets and four momentum is conserved. This required the development of special libraries to handle these decays and kinematic constraints. (Resolved, to be revisited.)
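The Breit-Wigner sampling, for instance, can be done with a one-liner via the inverse CDF of the Cauchy distribution (the non-relativistic lineshape). The mass and width values below are illustrative, and the random source is injectable only to make the sketch testable:

```javascript
// Sample a resonance mass from a (non-relativistic) Breit-Wigner lineshape
// centred on m0 with full width gamma, using the Cauchy inverse CDF.
function sampleBreitWigner(m0, gamma, rand = Math.random) {
  // rand() = 0.5 gives the peak; the tails fall off as 1 / (m - m0)^2.
  return m0 + (gamma / 2) * Math.tan(Math.PI * (rand() - 0.5));
}
```

In practice the sampled mass would also be clipped to the phase space available in the parent decay, otherwise the Cauchy tails occasionally produce unphysical values.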
Challenge: Particle decay trees must be traversed consistently.
Solution: This is harder than it sounds! Every time an event is generated, particles are recursively decayed for as long as phase space allows. The particles must then be traversed and displayed in the table, in a consistent manner. Ensuring that all particles are listed hierarchically without missing anything out takes some care. (Resolved.)
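The core of such a traversal is a depth-first walk that records each particle exactly once along with its depth, which is what a hierarchical table needs. A minimal sketch with invented field names:

```javascript
// Depth-first traversal of a decay tree: lists every particle once,
// parents before daughters, with the nesting depth for table indentation.
function traverseTree(particle, depth = 0, out = []) {
  out.push({ name: particle.name, depth });
  for (const d of particle.daughters || []) {
    traverseTree(d, depth + 1, out); // recurse into each decay product
  }
  return out;
}
```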
Challenge: Particle lists had to be prepared for the user.
Solution: The user has access to a handful of “building blocks” to play with. These are taken from the (generated) list of particles per event and filtered by the particle type. Further lists can be created or filtered from these lists, and parent particles can be reconstructed by combining lists. This means having to provide special classes to handle the particles and ensure that no particles are reconstructed recursively (leading to infinite loops). Apart from using reconstructed particles instead of generated particles, this has been resolved. (Resolved, to be revisited.)
Challenge: The user needs to be able to make histograms.
Solution: I had made histograms for other projects, including the Reflections and Box plotter projects, so this was fairly easy to do. Even so, being able to create new histograms of arbitrary scales and variables on the fly meant that this histogram class had to be more robust than those of previous projects. (Resolved.)
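The essence of such a histogram class is only a few lines; a sketch with an arbitrary range and bin count (out-of-range entries are simply dropped here, though an overflow bin is another common choice):

```javascript
// Minimal fixed-bin histogram: arbitrary range [lo, hi) split into nBins bins.
class Histogram {
  constructor(nBins, lo, hi) {
    this.nBins = nBins;
    this.lo = lo;
    this.hi = hi;
    this.bins = new Array(nBins).fill(0);
  }
  fill(x) {
    if (x < this.lo || x >= this.hi) return; // drop under/overflow
    const i = Math.floor((x - this.lo) / (this.hi - this.lo) * this.nBins);
    this.bins[i] += 1;
  }
}
```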


Screenshot of aDetector V5

Marble Hornets

Marble Hornets is a web series that currently spans the course of about five years. The premise of the series is that a student film maker started filming an amateur movie in 2006, but abandoned the film due to “unworkable conditions on set”. It quickly turns into a horror series that brings with it some novel techniques. However, what really inspired me to make this project was the asynchronous storytelling, which requires the viewer to piece together the true chronology based on the context of the videos. I’m an active participant on one of the busiest discussion boards for this series and regularly update this project as each new video is released.


Live page
GitHub repository


So far, the Marble Hornets project has two main aspects to it. The initial work began with the Marble Hornets player, a collection of YouTube videos that are manipulated by the YouTube JS API. The user can autoplay all the videos in the series, filter based on many parameters, and even create their own playlists. The player is made so that the user can autoplay everything in chronological order in full screen mode. The data was originally stored in Javascript files, but has since been moved to XML files to make maintenance and data entry easier and more portable. After creating the player I added a lot of further information, including links to the wikia pages, YouTube videos and forum threads for each video, as well as the Twitter feed, a real world map of filming locations and other interesting links, turning the player into a hub of information.

A previous version of the player was adapted to make the my_dads_tapes player, although I lost interest in the series and stopped updating that player. At some point I intend to automate some of the player manipulation so that a user can create their own player for any YouTube account, which would automatically source the relevant information and download the thumbnails.

The Marble Hornets player

The second aspect of the project is more interesting and challenging, and it is the automated creation of infographic images to help clarify the information known about the series. These files are shared with the community on the forums, and help users discuss the series. Videos are split into scenes, which contain characters and items. The scenes are sorted chronologically (although in many cases the chronological ordering is ambiguous, and the consensus opinion is usually taken in these cases) and then the characters and items are represented by arrows which flow from scene to scene. The scenes are arranged in columns to give the location of the characters, or the owners of the items. Users can create and enter their own XML files to make their own infographics, and automatically dump XML snippets by clicking on the scenes they wish to select. The users can filter by characters, camerapersons, items, and seasons. The scenes are colour coded to represent the sources of the footage.

Part of the master timeline
Part of the (even more complicated) items infographic


Challenge: The web series is told out of order, so one of the biggest problems to solve was sorting the scenes in order, when the order was sometimes ambiguous.
Solution: This was solved by following the fan-made “comprehensive timeline” when in doubt, and sorting the scenes with dates, and in the case of multiple scenes per day, by time. The scenes are assigned timestamps, and the videos are assigned release dates. With this in place, the scenes and videos can then be sorted quickly and easily. (Resolved)
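The sort itself reduces to a two-level comparator: date first, then time for scenes sharing a day. A sketch with illustrative field names (ISO-style date strings compare correctly as plain strings):

```javascript
// Order scenes chronologically: by date, then by time-of-day within a date.
// Scenes without a time sort before timed scenes on the same day.
function sortScenes(scenes) {
  return [...scenes].sort((a, b) =>
    a.date !== b.date
      ? a.date.localeCompare(b.date)
      : (a.time || "").localeCompare(b.time || ""));
}
```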
Challenge: The data has to be stored in an organised manner.
Solution: Initially this was solved by declaring objects directly. In order to make the data more portable and to allow other users to contribute, I wrote and parsed XML files, so that nearly all the data is stored semantically. One of the infographics keeps track of the exchange of possessions between characters, and this data is not yet accounted for in the XML files. (Resolved, to be revisited)
Challenge: This project required extensive use of the YouTube JS API.
Solution: Thanks to this project I can now queue up videos, make them autoplay, and use a timer to move between sections of video. (Resolved)
Challenge: The video stills had to respond to the user’s actions.
Solution: The video player in this project allows the user to mouseover the video stills, with dynamic changes of style as the user does so, making it more aesthetically pleasing. This had to be responsive and with little overhead, and the result works well. (Resolved)
Challenge: The timeline has to be laid out properly on the page.
Solution: This means that each element must be given enough space to display all the text and images, and then be arranged to give the character arrows sufficient space. This has been achieved using a text wrapping function, and parsing the lists of objects multiple times to ensure sufficient spacing. Editing this code is not a pleasant experience, though; it could certainly benefit from some refactoring. (Resolved, to be revisited)
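A text wrapping function of the kind described can be as simple as a greedy word-by-word fill; this sketch wraps on a character budget rather than measured pixel widths, which the real layout would need:

```javascript
// Greedy text wrapping: start a new line whenever adding the next word
// would exceed the maximum line length.
function wrapText(text, maxChars) {
  const lines = [];
  let line = "";
  for (const word of text.split(/\s+/)) {
    if (line && (line + " " + word).length > maxChars) {
      lines.push(line); // current line is full; flush it
      line = word;
    } else {
      line = line ? line + " " + word : word;
    }
  }
  if (line) lines.push(line);
  return lines;
}
```

Knowing the number of wrapped lines per element is what lets the layout pass reserve vertical space before the arrows are routed.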
Challenge: The player should allow the user to create a custom playlist.
Solution: The user can create a playlist by filtering by characters etc, and choosing to play in release order or chronological order. The player also allows the user to watch all scenes in order. (Resolved)
Challenge: There have been many, many challenges with this project, so I may add more as they occur to me!
Solution: (Resolved)