First collisions at Belle II (symmetrymagazine.org)
34 points by digital55 on April 27, 2018 | 13 comments



This is an exciting time for Belle II to come online.

Although every couple of months you hear rumours of some experiment seeing deviations from the Standard Model of particle physics (rumours that later fizzle out as more data is collected and systematic errors are better understood), there have been a few deviations recently that look promising.

Specifically, the B mesons mentioned in the article seem to be decaying in unexpected ways, which could indicate a violation of “lepton universality,” the idea that all leptons should be treated “equally” by the electroweak force.

As a machine complementary to LHCb at CERN, where those results are being seen, Belle II could confirm any potential deviations and find the first evidence for physics beyond the Standard Model.

That said, it’s very difficult to model the bound states of B mesons, so it might all turn out to be a misunderstanding, but it’ll be interesting to see how it pans out.
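For concreteness, the cleanest of those observables is a ratio of branching fractions between muon and electron final states (the specific channel below is the one LHCb quotes; the article doesn’t spell it out):

    R_K = BR(B+ -> K+ mu+ mu-) / BR(B+ -> K+ e+ e-)  ~ 1 in the Standard Model

A measurement of R_K persistently and significantly away from one would be direct evidence that the electroweak force does not couple to muons and electrons identically.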


Question - I understand that the particles are guided into collisions by using magnets and directing the path of the two particles. What I don't understand is, how do the researchers know exactly where the particles are located before they have collided? Would not their exact position have to be known to record the collision data and for taking pictures of the collision? Thank you ahead of time for any explanation you provide.


It's essentially statistical. First, the beam. The beam at a collider is actually made out of longitudinal 'buckets' of particles. The buckets are timed to cross at the interaction point inside the detector, where magnetic lens arrangements squeeze the beam down to as small a physical size as is practical. By controlling the beam, you know exactly when to look for collision products, and the expected collision rates are calculated from particle distributions and cross-sections.
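As a rough sketch of that last step, the expected rate is just instantaneous luminosity times cross-section (the numbers below are illustrative placeholders, roughly in the ballpark of e+e- -> B Bbar production at the Upsilon(4S), not Belle II's actual operating figures):

    # Expected event rate = instantaneous luminosity x cross-section.
    # Placeholder values for illustration only.
    luminosity = 1e34                          # instantaneous luminosity, cm^-2 s^-1
    cross_section_nb = 1.1                     # cross-section in nanobarns
    cross_section = cross_section_nb * 1e-33   # 1 nb = 1e-33 cm^2

    rate_hz = luminosity * cross_section
    print(f"expected B-pair events per second: {rate_hz:.0f}")  # ~11 Hz at these numbers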

Next, the detectors. There are a variety of detectors around the interaction point that serve different purposes. Broadly speaking, they measure how much energy is deposited and in which direction it was going. That's what physicists are interested in, so they can work out what those particles were.

Everything at an experiment like this is modeled and calculated. At the very bottom you have fundamental physics, particles being created and destroyed, modeled in a simulation tool such as Pythia. Once you know what's supposed to come out, you can simulate how the products behave in your detectors, by building a model in a tool like Geant. If you model from end to end, you essentially test physics theory against the hard bedrock of reality.
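A toy version of that chain, with the generator and the detector model each collapsed to a single function (real workflows use Pythia and Geant4; everything below is a stand-in):

    import random

    def generate_event():
        """Stand-in for a physics generator: one 'true' particle energy in GeV."""
        return random.uniform(0.5, 5.0)

    def detector_response(true_energy, resolution=0.05):
        """Stand-in for detector simulation: Gaussian energy smearing."""
        return random.gauss(true_energy, resolution * true_energy)

    # Generate many events, push them through the 'detector', then compare.
    truth = [generate_event() for _ in range(100_000)]
    measured = [detector_response(e) for e in truth]
    bias = sum(m - t for m, t in zip(measured, truth)) / len(truth)
    print(f"mean reconstruction bias: {bias:.4f} GeV")  # ~0 for an unbiased detector

Comparing the simulated distributions against what the real detector records is exactly the end-to-end test described above.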


They don't take pictures; the particles are recorded by a variety of detectors. For example, a wire chamber records particles through the ionization they cause in a very dilute gas. Those ions are then accelerated by the electric field in the chamber and cause a blip when they hit one of the wires. Because the particles are tracked all the way through the detector, their state is known both before and after the collision.

Then there are calorimeters that absorb the debris from the collision and measure its energy.
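A minimal sketch of the tracking part: given which wires fired and where, fit a line through the hits to recover the trajectory (a real detector fits helices in a magnetic field; the geometry and noise below are made up):

    import numpy as np

    # Hypothetical wire-chamber hits: z along the beam axis, x transverse.
    # True track is x = 0.3*z + 1.0; hits carry ~200 um of measurement noise.
    rng = np.random.default_rng(0)
    z = np.linspace(0.0, 10.0, 8)                      # wire layer positions, cm
    x = 0.3 * z + 1.0 + rng.normal(0, 0.02, z.size)    # noisy ionization hits

    # Least-squares straight-line fit: x = slope*z + intercept.
    slope, intercept = np.polyfit(z, x, deg=1)
    print(f"fitted track: x = {slope:.3f}*z + {intercept:.3f}")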


They aren't tracking individual particles; they're tracking "packets" of many particles. There are many packets in flight at any time.


They're two beams of particles that cross at a point. It's not adequately captioned, but I think the image is a cross-section view, with the beam direction "into/out of" the screen.


Being fascinated by the physics of it, I'd find it really interesting to read a more detailed description of the H/W and S/W involved. I guess they get a few GB per collision, which surely needs some hefty number-crunching CPUs, data storage and smart statistical (big data) analysis.


It's all in the open. I have an acquaintance on the project, although we haven't talked about it in some time.

https://indico.hep.pnnl.gov/event/14/session/1/material/0/0....


The storage, network and compute needs of these projects are really exciting. You get a couple of gigs to a couple dozen gigs per measurement run, and you're running a lot of measurements. As the presentation shows, they are looking at a petabyte per month. That's a lot of stuff to crunch. CERN is also pushing two- to three-digit petabytes of storage around on Ceph for the LHC.

Genome analysis has similarly interesting requirements. A sequencer for a human genome outputs something like 100-200 Gb of raw data to store and process, and then you have some computationally tricky analysis to run on that.


Raw event sizes in modern collider experiments are typically hundreds of kB to a few MB. The trigger rate at Belle II is about 30 kHz. It's comparable to an LHC experiment in terms of data output, but it has the advantage of lower particle multiplicity, making reconstruction and simulation a lot easier.
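Back-of-the-envelope, using only the figures quoted in this thread (30 kHz trigger rate, ~300 kB per event as a middle value, a petabyte per month to storage):

    # Raw rate off the trigger vs. sustained rate into storage.
    trigger_rate_hz = 30_000
    event_size_bytes = 300 * 1024

    raw = trigger_rate_hz * event_size_bytes    # bytes/s off the trigger
    sustained = 1e15 / (30 * 24 * 3600)         # 1 PB/month in bytes/s

    print(f"raw:       {raw / 1e9:.1f} GB/s")        # ~9.2 GB/s
    print(f"sustained: {sustained / 1e9:.2f} GB/s")  # ~0.39 GB/s

The gap between the two is what further online filtering and compression would have to close before anything reaches long-term storage.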


Only one woman in the entire lab? :(


1. I see at least two in the photograph in the article.

2. Only a fraction of the people involved were there for that photo.


Maybe if they release all job application files, we can search for science beyond the Standard Model to explain this asymmetry.



