### 321 Mega Prime!

On 6 September 2021, 07:16:28 UTC, PrimeGrid's 321 Search found the Mega Prime:

**3*2^17748034-1**

The prime is 5,342,692 digits long and enters **Chris Caldwell's “The Largest Known Primes Database”** ranked 18th overall. The discovery was made by Marc Wiseler (**McDaWisel**) of Ireland using an AMD Ryzen 9 5900X 12-core processor with 16GB RAM, running Microsoft Windows 10 Professional x64 Edition. This computer took about 2 hours, 45 minutes to complete the primality test using LLR2. Marc Wiseler is a member of the **Storm** team. The prime was verified on 6 September 2021, 11:47 UTC, by an Intel(R) Core(TM) i7-9800X CPU @ 3.80GHz with 32GB of RAM, running CentOS. This computer took 2 hours and 41 minutes to complete the primality test using LLR2. For more details, please see the **official announcement**.
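As an aside, the quoted digit count follows directly from logarithms: a number of the form k·2^n−1 has floor(log10(k) + n·log10(2)) + 1 decimal digits. A quick stdlib-only Python check (an illustration, not part of PrimeGrid's LLR2 tooling) reproduces the 5,342,692 figure without ever building the multi-million-digit integer:

```python
import math

def riesel_digits(k, n):
    """Decimal digit count of k * 2**n - 1, computed via logarithms.
    (Subtracting 1 cannot change the digit count unless k * 2**n were
    an exact power of 10, which it never is for odd k > 1.)"""
    return int(math.log10(k) + n * math.log10(2)) + 1

print(riesel_digits(3, 17748034))  # 5342692
```

The same formula can be cross-checked against a direct construction for small exponents, e.g. `riesel_digits(3, 100) == len(str(3 * 2**100 - 1))`.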
Categories: Project news

### World Record Generalized Cullen Prime!

On 28 August 2021, 09:10:17 UTC, PrimeGrid’s Generalized Cullen/Woodall Prime Search found the largest known Generalized Cullen prime:

**2525532*73^2525532+1**

Generalized Cullen numbers are of the form n*b^n+1. Generalized Cullen numbers that are prime are called Generalized Cullen primes. For more information, please see **“Cullen prime” in The Prime Glossary**. The prime is 4,705,888 digits long and enters **Chris Caldwell's The Largest Known Primes Database** ranked 1st for Generalized Cullen primes and 24th overall. Base 73 was one of 10 primeless Generalized Cullen bases for b ≤ 121 that PrimeGrid is searching. The remaining bases are 13, 29, 47, 49, 55, 69, 101, 109 & 121. The discovery was made by Tom Greer (**tng**) of the United States using an Intel(R) Core(TM) i9-10920X CPU @ 3.50GHz with 16GB RAM, running Microsoft Windows 10 Professional x64 Edition. This computer took about 10 hours, 40 minutes to complete the primality test using LLR2. Tom is a member of the **Antarctic Crunchers** team. The prime was verified on 28 August 2021, 18:01 UTC, by an Intel(R) Core(TM) i7-9800X CPU @ 3.80GHz with 32GB of RAM, running CentOS. This computer took 3 hours and 39 minutes to complete the primality test using LLR2. For more details, please see the **official announcement**.
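For reference, the form and the quoted size of such a number can be checked with a few lines of stdlib Python (an illustration only, not PrimeGrid's LLR2 code):

```python
import math

def gc_number(n, b):
    """Generalized Cullen number: n * b**n + 1."""
    return n * b**n + 1

def gc_digits(n, b):
    """Decimal digit count of n * b**n + 1, via logarithms, so the
    multi-million-digit integer never has to be constructed."""
    return int(math.log10(n) + n * math.log10(b)) + 1

print(gc_digits(2525532, 73))  # 4705888
```

For small cases the logarithmic count agrees with a direct construction, e.g. `gc_digits(3, 73) == len(str(gc_number(3, 73)))`.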
Categories: Project news

### Validator Outage

Hey Everyone,

MilkyWay@home is currently experiencing an outage for the Separation Validator. I brought it back up once, and then it crashed again. I am trying to bring it back up. In the meantime, connections to the download/upload servers may stop and start intermittently as I work on the server.

Thanks for your patience, I will keep you updated as things change.

Tom


Categories: Project news

### The lucky ones!

Whenever Einstein@Home finds a new neutron star, there is always one workunit in which that discovery "stands out" from the noise with the highest statistical significance. The volunteers whose computers processed that workunit receive framed discovery certificates, signed by myself and by the Principal Investigator of the experiment/collaboration that provided the data. Over the past decade, we have sent out more than a hundred of these, and we are now preparing a new batch. If you are interested, here is

Categories: Project news

### Thanks for supporting SixTrack at LHC@Home and updates

Dear volunteers,

All members of the SixTrack team would like to thank each of you for supporting our project at LHC@Home. Recent weeks have seen a significant increase in workload, and your constant help did not pause even during the Christmas holidays, which we really appreciate!

As you know, we are interested in simulating the dynamics of the beam in ultra-relativistic storage rings such as the LHC. As in other fields of physics, the dynamics is complex and can be decomposed into a linear and a non-linear part. The former brings the expected performance of the machine within reach, whereas the latter might dramatically affect the stability of the circulating beam. While the former can be analysed with the computing power of a laptop, the latter requires BOINC, and hence you! In fact, we perform very large scans of parameter spaces to see how non-linearities affect the motion of beam particles in different regions of the beam phase space and for different values of key machine parameters. Our main observable is the dynamic aperture (DA), i.e. the boundary between stable (bounded) and unstable (unbounded) particle motion.
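To make the idea concrete, here is a toy sketch in Python (our illustration; SixTrack itself is a far more detailed tracking code): the 2D Hénon map, a textbook model of a ring with a single sextupole kick per turn, tracked for many turns while the starting amplitude is scanned outward. The tune value 0.205, the turn count and the escape threshold below are arbitrary assumptions chosen for the demo, not SixTrack parameters.

```python
import math

def henon_track(x, p, mu, turns=1000, bound=10.0):
    """Track one particle through the 2D Henon map: a sextupole-like
    quadratic kick followed by a linear rotation by phase mu each turn.
    Returns True if the motion stays bounded for the given turns."""
    c, s = math.cos(mu), math.sin(mu)
    for _ in range(turns):
        x, p = c * x + s * (p + x * x), -s * x + c * (p + x * x)
        if x * x + p * p > bound * bound:
            return False  # amplitude blew up: unstable initial condition
    return True

def dynamic_aperture(mu, step=0.01, amax=5.0, turns=1000):
    """Scan the initial amplitude outward in small steps; the last stable
    amplitude approximates the dynamic aperture along this phase-space line."""
    a = 0.0
    while a + step < amax and henon_track(a + step, 0.0, mu, turns):
        a += step
    return a

print(f"estimated DA for tune 0.205: {dynamic_aperture(2 * math.pi * 0.205):.2f}")
```

Small amplitudes behave almost linearly and survive, while large ones escape within a few turns; the boundary between the two regimes is the DA. A real study repeats such scans over many machine configurations and phase-space directions, which is where your BOINC tasks come in.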

The studies mainly target the LHC and its luminosity upgrade, the so-called HL-LHC. Thanks to this new accelerator, by ~2035 the LHC will be able to deliver to the experiments ten times more data than is foreseen in the first 10–15 years of LHC operation, in a comparable time. We are in full swing designing the upgraded machine, and the present operation of the LHC is a unique occasion to benchmark our models and simulation results. Deep knowledge of the DA of the LHC is essential to properly tune the working point of the HL-LHC.

If you have crunched simulations named "workspace1_hl13_collision_scan_*" (Frederik), then you have helped us map the effects on dynamic aperture of the unavoidable magnetic errors expected from the new HL-LHC hardware, and identify the best working point of the machine and correction strategies. Tasks with names like "w2_hllhc10_sqz700_Qinj_chr20_w2*" (Yuri) focus on the magnets responsible for squeezing the beams before colliding them; due to their prominent role, these magnets, though very few in number, have such a big impact on the non-linear dynamics that the knobs controlling the linear part of the machine can offer relevant remedial strategies.

Many recent tasks aim at relating the beam lifetime to the dynamic aperture. The beam lifetime is a measured quantity that tells us how long the beams are going to stay in the machine, based on the current rate of losses. A theoretical model relating beam lifetime and dynamic aperture was developed, and a large simulation campaign has started to benchmark the model against the many measurements taken with the LHC in the past three years. One set of studies, named "w16_ats2017_b2_qp_0_ats2017_b2_QP_0_IOCT_0" (Pascal), considers the unavoidable multipolar errors of the magnets as the main source of non-linearities, whereas tasks named "LHC_2015*" (Javier) take into account the parasitic encounters near the collision points, i.e. the so-called "long-range beam-beam effects".

One of our users (Ewen) is carrying out two studies thanks to your help. In 2017, DA was directly measured for the first time in the LHC at top energy, with nonlinear magnets on either side of the ATLAS and CMS experiments used to vary the DA. He wants to see how well the simulated DA compares to these measurements. The second study looks systematically at how the time dependence of DA in simulation depends on the strength of linear transverse coupling and the way it is generated in the machine. In fact, some previous simulations and measurements at injection energy have indicated that linear coupling between the horizontal and vertical planes can have a large impact on how the dynamic aperture evolves over time.

In all this, your help is fundamental, since you let us carry out the simulations and studies we are interested in, running the tasks we submit to BOINC. Hence, the warmest "thank you" to you all!

Happy crunching to everyone, and stay tuned!

Alessio and Massimo, for the LHC SixTrack team.


Categories: Project news

### LHC@home down-time due to system updates

Tomorrow, Wednesday 24/1, the LHC@home servers will be unavailable for a short period while our storage backend is taken down for a system update.

Today, Tuesday 23/1, some of the Condor servers that handle CMS, LHCb and Theory tasks will be down for a while. Regarding the ongoing issues with upload of files, please refer to this thread.

Thanks for your understanding and happy crunching!


Categories: Project news