%% This BibTeX bibliography file was created using BibDesk.
%% http://bibdesk.sourceforge.net/
%% Created for Fabio Bellini at 2017-02-28 14:54:59 +0100
%% Saved with string encoding Unicode (UTF-8)
@article{Alduino:2017ehq,
author = "Alduino, C. and others",
title = "{First Results from CUORE: A Search for Lepton Number
Violation via $0\nu\beta\beta$ Decay of $^{130}$Te}",
collaboration = "CUORE",
journal = "Phys. Rev. Lett.",
volume = "120",
year = "2018",
number = "13",
pages = "132501",
doi = "10.1103/PhysRevLett.120.132501",
eprint = "1710.07988",
archivePrefix = "arXiv",
primaryClass = "nucl-ex",
SLACcitation = "%%CITATION = ARXIV:1710.07988;%%"
}
@article{Alduino:2016vtd,
author = "Alduino, C. and others",
title = "{Measurement of the two-neutrino double-beta decay half-life
of $^{130}$Te with the CUORE-0 experiment}",
collaboration = "CUORE",
journal = "Eur. Phys. J.",
volume = "C77",
year = "2017",
number = "1",
pages = "13",
doi = "10.1140/epjc/s10052-016-4498-6",
eprint = "1609.01666",
archivePrefix = "arXiv",
primaryClass = "nucl-ex",
SLACcitation = "%%CITATION = ARXIV:1609.01666;%%"
}
@article{Artusa:2014lgv,
author = "Artusa, D. R. and others",
title = "{Searching for neutrinoless double-beta decay of $^{130}$Te with CUORE}",
collaboration = "CUORE",
journal = "Adv. High Energy Phys.",
volume = "2015",
year = "2015",
pages = "879871",
doi = "10.1155/2015/879871",
eprint = "1402.6072",
archivePrefix = "arXiv",
primaryClass = "physics.ins-det",
SLACcitation = "%%CITATION = ARXIV:1402.6072;%%"
}
@inproceedings{Adams:2018nek,
author = "Adams, D. Q. and others",
title = "{Update on the recent progress of the CUORE experiment}",
booktitle = "{28th International Conference on Neutrino Physics and
Astrophysics (Neutrino 2018) Heidelberg, Germany, June
4-9, 2018}",
collaboration = "CUORE",
url = "https://doi.org/10.5281/zenodo.1286904",
year = "2018",
eprint = "1808.10342",
archivePrefix = "arXiv",
primaryClass = "nucl-ex",
SLACcitation = "%%CITATION = ARXIV:1808.10342;%%"
}
\documentclass[a4paper]{jpconf}
\usepackage{graphicx}
\bibliographystyle{iopart-num}
%\usepackage{citesort}
\begin{document}
\title{CUORE experiment}
\author{CUORE collaboration}
%\address{}
\ead{cuore-spokesperson@lngs.infn.it}
\begin{abstract}
CUORE is a ton-scale bolometric experiment searching for the neutrinoless double beta decay of $^{130}$Te.
The detector started taking data in April 2017 at the Laboratori Nazionali del Gran Sasso of INFN, in Italy.
The projected CUORE sensitivity to the neutrinoless double beta decay half-life of $^{130}$Te is 9$\times$10$^{25}\,$y after five years of live time.
In 2018 the CUORE computing and storage resources at CNAF were used for data processing and for the production of the Monte Carlo simulations used for a preliminary measurement of the 2$\nu$ double beta decay of $^{130}$Te.
\end{abstract}
\section{The experiment}
The main goal of the CUORE experiment~\cite{Artusa:2014lgv} is to search for Majorana neutrinos through the neutrinoless double beta decay (0$\nu$DBD): $(A,Z) \rightarrow (A, Z+2) + 2e^-$.
0$\nu$DBD has never been observed, and its half-life is expected to be longer than 10$^{25}$\,y.
CUORE searches for the 0$\nu$DBD of a particular tellurium isotope ($^{130}$Te) using thermal detectors (bolometers). A thermal detector is a sensitive calorimeter that measures the energy deposited by a single interacting particle through the temperature rise induced in the calorimeter itself.
This is accomplished by using suitable detector materials (dielectric crystals) and by operating the detector at very low temperature (in the 10 mK range) in a dilution refrigerator. In such conditions even a small energy release in the crystal results in a measurable temperature rise. The temperature change is measured by a dedicated thermal sensor, an NTD germanium thermistor glued onto the crystal.
The bolometers act at the same time as source and detector for the sought signal.
The CUORE detector is an array of 988 TeO$_2$ crystals operated as bolometers, for a total TeO$_2$ mass of 741$\,$kg.
The tellurium used for the crystals has natural isotopic abundances ($\sim$\,34.2\% of $^{130}$Te), thus the CUORE crystals contain overall 206$\,$kg of $^{130}$Te.
The bolometers are arranged in 19 towers; each tower is composed of 13 floors of 4 bolometers each.
A single bolometer is a cubic TeO$_2$ crystal, 5$\,$cm on a side, with a mass of 0.75$\,$kg.
CUORE will reach a sensitivity to the $^{130}$Te 0$\nu$DBD half-life of $9\times10^{25}$\,y.
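This figure can be read through the standard approximation for the half-life sensitivity of a background-limited bolometric experiment,
\[
S^{0\nu} \propto \varepsilon\,\frac{a_I\,N_A}{W}\,\sqrt{\frac{M\,T}{b\,\Delta E}}\,,
\]
where $\varepsilon$ is the detection efficiency, $a_I$ the isotopic abundance of $^{130}$Te, $N_A$ the Avogadro number, $W$ the molar mass of TeO$_2$, $M$ the detector mass, $T$ the live time, $b$ the background index and $\Delta E$ the energy resolution in the region of interest.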
The cool down of the CUORE detector was completed in January 2017, and after a few weeks of pre-operation and optimization, the experiment started taking physics data in April 2017.
The first CUORE results were released in summer 2017 and were followed by a second data release with an extended exposure in autumn 2017~\cite{Alduino:2017ehq}.
The same data release was used in 2018 to produce a preliminary measurement of the 2-neutrino double-beta decay~\cite{Adams:2018nek}.
In 2018 CUORE acquired less than two months' worth of physics data, due to cryogenic problems that required a long stop of the data taking.
\section{CUORE computing model and the role of CNAF}
The CUORE raw data consist of files containing the continuous data stream of $\sim$1000 channels, recorded by the DAQ at a sampling frequency of 1 kHz. Triggers are implemented in software and saved in a custom format based on the ROOT data analysis framework.
The non event-based information is stored in a PostgreSQL database that is also accessed by the offline data analysis software.
The data taking is organized in runs, each run lasting about one day.
Raw data are transferred from the DAQ computers to the permanent storage area at the end of each run.
CUORE produces about 20$\,$TB of raw data per year.
A full copy of data is maintained at CNAF and preserved also on tape.
The main instance of the CUORE database is located on a computing cluster at the Laboratori Nazionali del Gran Sasso and a replica is synchronized at CNAF.
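As an illustration, offline jobs can read the non event-based information from the replica. The following minimal sketch in Python uses the \texttt{psycopg2} PostgreSQL driver; the host name, database name and table schema are hypothetical, for illustration only.
\begin{verbatim}
import psycopg2

# Hypothetical replica host and run-metadata schema (illustration only).
conn = psycopg2.connect(host="cuore-db-replica.cnaf.infn.it",
                        dbname="cuore", user="reader")
with conn, conn.cursor() as cur:
    cur.execute("SELECT run_number, start_time, stop_time "
                "FROM runs WHERE start_time::date = %s", ("2018-05-01",))
    for run_number, start, stop in cur.fetchall():
        print(run_number, start, stop)
conn.close()
\end{verbatim}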
The full analysis framework at CNAF is operational and kept up to date with the official CUORE software releases.
The CUORE data analysis flow consists of two steps.
In the first level analysis the event-based quantities are evaluated, while in the second level analysis the energy spectra are produced.
The analysis software is organized in sequences.
Each sequence consists of a collection of modules that scan the events in the ROOT files sequentially, evaluate the relevant quantities and store them back in the events.
The analysis flow comprises several fundamental steps: pulse amplitude estimation, detector gain correction, energy calibration, the search for events in coincidence among multiple bolometers, and the evaluation of the pulse-shape parameters used to select physical events.
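A minimal sketch of this sequence-of-modules pattern is shown below in Python; the module names and the event representation are hypothetical and do not reflect the actual CUORE software API.
\begin{verbatim}
# Each module evaluates a quantity and stores it back in the event.
class AmplitudeEvaluation:
    def process(self, event):
        event["amplitude"] = max(event["samples"]) - event["baseline"]
        return event

class GainCorrection:
    def __init__(self, gain):
        self.gain = gain
    def process(self, event):
        event["corrected_amplitude"] = event["amplitude"] / self.gain
        return event

def run_sequence(events, modules):
    # Scan the events sequentially through every module of the sequence.
    for event in events:
        for module in modules:
            event = module.process(event)
    return events

events = [{"samples": [0.0, 1.2, 3.4, 2.0], "baseline": 0.1}]
run_sequence(events, [AmplitudeEvaluation(), GainCorrection(gain=0.98)])
\end{verbatim}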
The CUORE simulation code is based on the GEANT4 package, of which the 4.9.6 release and the 10 series releases up to 10.03 have been installed.
The goal of this work is to evaluate, given the present knowledge of material contaminations, the background index reachable by the experiment in the region of interest of the energy spectrum (0$\nu$DBD is expected to produce a peak at 2528\,keV).
Depending on the specific efficiency of the simulated radioactive sources (sources located outside the lead shielding are highly inefficient), a Monte Carlo simulation can occupy from 5 to 500 computing nodes, with durations of up to some weeks.
Recently Monte Carlo simulations of the CUORE calibration sources were also performed at CNAF.
Thanks to these simulations, it was possible to produce calibration sources with an activity specifically optimized for the CUORE needs.
In 2018 the CNAF computing resources were exploited for the production of a preliminary measurement of the 2-neutrino double-beta decay of $^{130}$Te.
In order to obtain this result, which was based on the 2017 data, both the processing of the experimental data and the production of Monte Carlo simulations were required.
In the last two months of the year a data reprocessing campaign was performed with an updated version of the CUORE analysis software.
This reprocessing campaign, which also included the new data acquired in 2018, made it possible to verify the scalability of the CUORE computing model to the amount of data that CUORE will have to process a few years from now.
\section*{References}
\bibliography{cuore}
\end{document}
%% This BibTeX bibliography file was created using BibDesk.
%% http://bibdesk.sourceforge.net/
%% Created for Fabio Bellini at 2018-02-24 11:10:52 +0100
%% Saved with string encoding Unicode (UTF-8)
@article{Azzolini:2018tum,
author = "Azzolini, O. and others",
title = "{CUPID-0: the first array of enriched scintillating
bolometers for $0\nu\beta\beta$ decay investigations}",
collaboration = "CUPID",
journal = "Eur. Phys. J.",
volume = "C78",
year = "2018",
number = "5",
pages = "428",
doi = "10.1140/epjc/s10052-018-5896-8",
eprint = "1802.06562",
archivePrefix = "arXiv",
primaryClass = "physics.ins-det",
SLACcitation = "%%CITATION = ARXIV:1802.06562;%%"
}
@article{Azzolini:2018dyb,
author = "Azzolini, O. and others",
title = "{First Result on the Neutrinoless Double-$\beta$ Decay of
$^{82}Se$ with CUPID-0}",
collaboration = "CUPID-0",
journal = "Phys. Rev. Lett.",
volume = "120",
year = "2018",
number = "23",
pages = "232502",
doi = "10.1103/PhysRevLett.120.232502",
eprint = "1802.07791",
archivePrefix = "arXiv",
primaryClass = "nucl-ex",
SLACcitation = "%%CITATION = ARXIV:1802.07791;%%"
}
@article{Azzolini:2018yye,
author = "Azzolini, O. and others",
title = "{Analysis of cryogenic calorimeters with light and heat
read-out for double beta decay searches}",
journal = "Eur. Phys. J.",
volume = "C78",
year = "2018",
number = "9",
pages = "734",
doi = "10.1140/epjc/s10052-018-6202-5",
eprint = "1806.02826",
archivePrefix = "arXiv",
primaryClass = "physics.ins-det",
SLACcitation = "%%CITATION = ARXIV:1806.02826;%%"
}
@article{Azzolini:2018oph,
author = "Azzolini, O. and others",
title = "{Search of the neutrino-less double beta decay of$^{82}$
Se into the excited states of$^{82}$ Kr with CUPID-0}",
collaboration = "CUPID",
journal = "Eur. Phys. J.",
volume = "C78",
year = "2018",
number = "11",
pages = "888",
doi = "10.1140/epjc/s10052-018-6340-9",
eprint = "1807.00665",
archivePrefix = "arXiv",
primaryClass = "nucl-ex",
SLACcitation = "%%CITATION = ARXIV:1807.00665;%%"
}
@article{DiDomizio:2018ldc,
author = "Di Domizio, S. and others",
title = "{A data acquisition and control system for large mass
bolometer arrays}",
journal = "JINST",
volume = "13",
year = "2018",
number = "12",
pages = "P12003",
doi = "10.1088/1748-0221/13/12/P12003",
eprint = "1807.11446",
archivePrefix = "arXiv",
primaryClass = "physics.ins-det",
SLACcitation = "%%CITATION = ARXIV:1807.11446;%%"
}
@article{Beretta:2019bmm,
author = "Beretta, M. and others",
title = "{Resolution enhancement with light/heat decorrelation in
CUPID-0 bolometric detector}",
year = "2019",
eprint = "1901.10434",
archivePrefix = "arXiv",
primaryClass = "physics.ins-det",
SLACcitation = "%%CITATION = ARXIV:1901.10434;%%"
}
@article{Azzolini:2019nmi,
author = "Azzolini, O. and others",
title = "{Background Model of the CUPID-0 Experiment}",
collaboration = "CUPID",
year = "2019",
eprint = "1904.10397",
archivePrefix = "arXiv",
primaryClass = "nucl-ex",
SLACcitation = "%%CITATION = ARXIV:1904.10397;%%"
}
\documentclass[a4paper]{jpconf}
\usepackage{graphicx}
\bibliographystyle{iopart-num}
%\usepackage{citesort}
\begin{document}
\title{CUPID-0 experiment}
\author{CUPID-0 collaboration}
%\address{}
\ead{stefano.pirro@lngs.infn.it}
\begin{abstract}
With their excellent energy resolution, efficiency, and intrinsic radio-purity, cryogenic calorimeters are well suited to the search for neutrino-less double beta decay (0$\nu$DBD).
CUPID-0 is an array of 24 Zn$^{82}$Se scintillating bolometers used to search for 0$\nu$DBD of $^{82}$Se.
It is the first large mass 0$\nu$DBD experiment exploiting a double read-out technique: the heat signal to accurately measure particle energies and the light signal to identify the particle type.
CUPID-0 has been taking data since March 2017 and has obtained several outstanding scientific results.
The CUPID-0 data processing environment configured on the CNAF computing cluster has been used for the analysis of the first period of data taking.
\end{abstract}
\section{The experiment}
Neutrino-less Double Beta Decay (0$\nu$DBD) is a hypothesized nuclear transition in which a nucleus decays emitting only two electrons.
This process cannot be accommodated in the Standard Model, as the absence of emitted neutrinos would violate lepton number conservation.
Among the several experimental approaches proposed for the search of 0$\nu$DBD, cryogenic calorimeters (bolometers) stand out for the possibility of achieving excellent energy resolution ($\sim$0.1\%), efficiency ($\ge$80\%) and intrinsic radio-purity. Moreover, the crystals that are operated as bolometers can be grown starting from most of the 0$\nu$DBD emitters, enabling the test of different nuclei.
The state of the art of the bolometric technique is represented by CUORE, an experiment composed of 988 bolometers for a total mass of 741 kg, presently in data taking at Laboratori Nazionali del Gran Sasso.
The ultimate limit of the CUORE background suppression resides in the presence of $\alpha$-decaying isotopes located in the detector structure.
The CUPID-0 project \cite{Azzolini:2018dyb,Azzolini:2018tum} was conceived to overcome this limitation.
The main breakthrough of CUPID-0 is the addition of independent devices to measure the light signals emitted from scintillation in ZnSe bolometers.
The different properties of the light emission of electrons and $\alpha$ particles will enable event-by-event rejection of $\alpha$ interactions, suppressing the overall background in the region of interest for 0$\nu$DBD of at least one order of magnitude.
The detector is composed of 26 ultra-pure ZnSe bolometers of $\sim$500\,g each, enriched to 95\% in $^{82}$Se (the 0$\nu$DBD emitter) and faced with Ge disk light detectors operated as bolometers.
CUPID-0 is hosted in a dilution refrigerator at the Laboratori Nazionali del Gran Sasso and started the data taking in March 2017.
The first scientific run (Phase I) ended in December 2018, collecting 9.95 kg$\times$y of ZnSe exposure.
These data were used to set new limits on the $^{82}$Se 0$\nu$DBD~\cite{Azzolini:2018dyb,Azzolini:2018oph} and to develop a full background model of the experiment~\cite{Azzolini:2019nmi}.
Phase II will start in June 2019 with an improved detector configuration.
\section{CUPID-0 computing model and the role of CNAF}
The CUPID-0 computing model is similar to the CUORE one, the only differences being the sampling frequency and the working point of the light-detector bolometers.
The full data stream is saved in ROOT files, and a derivative trigger is generated in software with a channel-dependent threshold.
%Raw data are saved in Root files and contain events in correspondence with energy releases occurred in the bolometers.
Each event contains the waveform of the triggering bolometer and those geometrically close to it, plus some ancillary information.
The non-event-based information is stored in a PostgreSQL database that is also accessed by the offline data analysis software.
The data taking is arranged in runs, each run lasting about two days.
Details of the CUPID-0 data acquisition and control system can be found in \cite{DiDomizio:2018ldc}.
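As an illustration of the software derivative trigger described above, a minimal sketch in Python is shown below; the sampling parameters, thresholds and injected pulse are purely illustrative.
\begin{verbatim}
import numpy as np

def derivative_trigger(waveform, threshold, dead_samples=1000):
    # Fire where the discrete derivative of the data stream exceeds
    # the channel threshold, vetoing re-triggers in the dead window.
    derivative = np.diff(waveform)
    triggers, last = [], -dead_samples
    for i, d in enumerate(derivative):
        if d > threshold and i - last >= dead_samples:
            triggers.append(i)
            last = i
    return triggers

thresholds = {"channel_01": 0.05, "channel_02": 0.08}  # hypothetical
stream = np.random.normal(0.0, 0.01, 10000)            # noise baseline
stream[4000:4010] += np.linspace(0.0, 1.0, 10)         # rising edge
stream[4010:] += 1.0                                   # flat top
print(derivative_trigger(stream, thresholds["channel_01"]))
\end{verbatim}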
Raw data are transferred from the DAQ computers (LNGS) to the permanent storage area (located at CNAF) at the end of each run.
A full copy of data is also preserved on tape.
The data analysis flow consists of two steps; in the first level analysis, the event-based quantities are evaluated, while in the second level analysis the energy spectra are produced.
The analysis software is organized in sequences.
Each sequence consists of a collection of modules that scan the events in the ROOT files sequentially, evaluate some relevant quantities and store them back in the events.
The analysis flow comprises several key steps: pulse amplitude estimation, detector gain correction, energy calibration, and the search for events in coincidence among multiple bolometers.
The new tools developed for CUPID-0 to handle the light signals are introduced in \cite{Azzolini:2018yye,Beretta:2019bmm}.
The main instance of the database was located at CNAF, and the full analysis framework was used to analyze data until November 2017; a web page for offline reconstruction monitoring was maintained.
Since the flood at the INFN Tier 1, we have been using the database on our DAQ servers at LNGS.
%During 2017 a more intense usage of the CNAF resources is expected, both in terms of computing resourced and storage space.
\section*{References}
\bibliography{cupid-biblio}
\end{document}

\documentclass[a4paper]{jpconf}
\usepackage{graphicx}
\usepackage{hyperref}
\usepackage{todonotes}
\begin{document}
\title{DAMPE data processing and analysis at CNAF}
\author{G. Ambrosi$^1$, G. Donvito$^5$, D. F. Droz$^6$, M. Duranti$^1$, D. D'Urso$^{2,3,4}$, F. Gargano$^{5,\ast}$, G. Torralba Elipe$^{7,8}$}
\address{$^1$ INFN Sezione di Perugia, Perugia, IT}
\address{$^2$ Universit\`a di Sassari, Sassari, IT}
\address{$^3$ ASDC, Roma, IT}
\address{$^4$ INFN - Laboratori Nazionali del Sud, Catania, IT}
%\address{$^3$ Universit\`a di Perugia, I-06100 Perugia, Italy}
\address{$^5$ INFN Sezione di Bari, Bari, IT}
\address{$^6$ University of Geneva, Gen\`eve, CH}
\address{$^7$ Gran Sasso Science Institute, L'Aquila, IT}
\address{$^8$ INFN - Laboratori Nazionali del Gran Sasso, L'Aquila, IT}
\address{DAMPE experiment \url{http://dpnc.unige.ch/dampe/},
\url{http://dampe.pg.infn.it}}
\ead{* fabio.gargano@ba.infn.it}
\begin{abstract}
DAMPE (DArk Matter Particle Explorer) is one of the five satellite missions in the framework of the Strategic Pioneer Research Program in Space Science of the Chinese Academy of Sciences (CAS). DAMPE was launched on 17 December 2015 at 08:12 Beijing time into a sun-synchronous orbit at an altitude of 500 km. The satellite is equipped with a powerful space telescope for high energy gamma-ray, electron and cosmic ray detection.
The CNAF computing center is the mirror of DAMPE data outside China and the main data center for Monte Carlo production. It also supports the user data analysis of the Italian DAMPE Collaboration.
\end{abstract}
\section{Introduction}
\begin{figure}[ht]
\begin{center}
\includegraphics[width=20pc]{dampe_layout_2.jpg}
\end{center}
\caption{\label{fig:dampe_layout} DAMPE telescope scheme: a double layer of the plastic scintillator strip detector (PSD);
the silicon-tungsten tracker-converter (STK) made of 6 tracking double layers; the imaging calorimeter with about 31 radiation lengths thickness, made of 14 layers of Bismuth Germanium Oxide (BGO) bars in a hodoscopic arrangement and finally
the neutron detector (NUD) placed just below the calorimeter.}
\end{figure}
DAMPE is a space telescope for high energy cosmic-ray detection.
In Fig. \ref{fig:dampe_layout} a scheme of the DAMPE telescope is shown. At the top, the plastic scintillator strip detector (PSD) consists of a double layer of scintillating plastic strips, which serves as an anti-coincidence detector and measures the particle charge. It is followed by a silicon-tungsten tracker-converter (STK), which is made of 6 tracking layers. Each tracking layer consists of two layers of single-sided silicon strip detectors measuring the position in the two orthogonal views perpendicular to the pointing direction of the apparatus. Three layers of tungsten plates with a thickness of 1~mm are inserted in front of tracking layers 3, 4 and 5 to promote photon conversion into electron-positron pairs. The STK is followed by an imaging calorimeter of about 31 radiation lengths, made up of 14 layers of Bismuth Germanium Oxide (BGO) bars placed in a hodoscopic arrangement. The total thickness of the BGO and the STK corresponds to about 33 radiation lengths, making it the deepest calorimeter ever used in space. Finally, in order to detect the delayed neutrons resulting from hadron showers and to improve the electron/proton separation power, a neutron detector (NUD) is placed just below the calorimeter. The NUD consists of 16 boron-doped plastic scintillator plates, each 1~cm thick and 19.5 $\times$ 19.5 cm$^2$ in area, read out by a photomultiplier.
The primary scientific goal of DAMPE is to measure electrons and photons with much higher energy resolution and energy reach than achievable with existing space experiments. This will help to identify possible Dark Matter signatures, and may also advance our understanding of the origin and propagation mechanisms of high energy cosmic rays, possibly leading to new discoveries in high energy gamma-ray astronomy.
DAMPE was designed to have unprecedented sensitivity and energy reach for electrons, photons and heavier cosmic rays (protons and heavy ions). For electrons and photons, the detection range is 2 GeV-10 TeV, with an energy resolution of about 1.5\% at 100 GeV. For protons and heavy ions, the detection range is 100 GeV-100 TeV, with an energy resolution better than 40\% at 800 GeV. The geometrical factor is about 0.3 m$^2$ sr for electrons and photons, and about 0.2 m$^2$ sr for heavier cosmic rays. The angular resolution is 0.1$^{\circ}$ at 100 GeV.
\section{DAMPE Computing Model and Computing Facilities}
As a Chinese satellite, DAMPE data are collected via the Chinese space communication system and transmitted to the China National Space Administration (CNSA) center in Beijing. From Beijing data are then transmitted to the Purple Mountain Observatory (PMO) in Nanjing, where they are processed and reconstructed.
On the European side, the DAMPE collaboration consists of research groups from INFN and the Universities of Perugia, Lecce and Bari, and from the Department of Particle and Nuclear Physics (DPNC) of the University of Geneva in Switzerland.
\subsection{Data production}
PMO is the designated center for DAMPE data production. Data are downloaded 4 times per day, each time the DAMPE satellite passes over the Chinese ground stations (almost every 6 hours). Once transferred to PMO, the binary data downloaded from the satellite are processed to produce a stream of raw data in ROOT \cite{root} format (the {\it 1B} data stream, $\sim$7 GB/day), and a second stream that includes the orbital and slow control information (the {\it 1F} data stream, $\sim$7 GB/day). The {\it 1B} and {\it 1F} streams are used to derive calibration files for the different subdetectors ($\sim$400 MB/day). Finally, data are reconstructed using the DAMPE official reconstruction code, and the so-called {\it 2A} data stream (ROOT files, $\sim$85 GB/day) is produced. The total data volume produced per day is $\sim$100 GB.
Data processing and reconstruction activities are currently supported by a computing farm of more than 1400 computing cores, able to reprocess 3 years of DAMPE data in 1 month.
\subsection{Monte Carlo Production}
The analysis of DAMPE data requires large amounts of Monte Carlo simulation to fully understand the detector capabilities, the measurement limits and the systematics. In order to facilitate work-flow handling and management, and to enable effective monitoring of a large number of batch jobs in various states, a NoSQL meta-data database using MongoDB \cite{mongo} was developed, with a prototype currently running at the Physics Department of Geneva University. Database access is provided through a web front-end and command-line tools based on the Flask web toolkit \cite{flask}, with a client back-end of cron scripts that run on the selected computing farm.
The design and completion of this work-flow system were heavily influenced by the implementation of the Fermi-LAT data processing pipeline \cite{latpipeline}
and the DIRAC computing framework \cite{dirac}.
Once submitted, each batch job continuously reports its status to the database through outgoing HTTP requests.
To that end, computing nodes must have outgoing connectivity enabled. Each batch job implements a work-flow in which input and output data transfers are performed (and their return codes reported), as well as the actual running of the job payload (which is defined in the meta-data description of the job). Dependencies between productions are implemented at the framework level, and jobs are only submitted once their dependencies are satisfied.
Once the events are generated, a secondary job is initiated, which performs the digitization and reconstruction of the existing MC data in bulk with a given release. This process is set up via a cron job at DPNC and occupies up to 200 slots in a computing queue with a 6-hour limit.
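A status report of the kind described above could be implemented with a plain HTTP request, as in the following minimal Python sketch; the endpoint URL, payload fields and job identifier are hypothetical, for illustration only.
\begin{verbatim}
import requests

def report_status(job_id, status, return_code=None):
    # Outgoing HTTP request towards the MongoDB-backed workflow database.
    payload = {"job_id": job_id, "status": status,
               "return_code": return_code}
    r = requests.post("https://dampe-workflow.example.org/api/status",
                      json=payload, timeout=30)
    r.raise_for_status()

report_status("mc-allElectron-v6r0p0-001234", "RUNNING")
\end{verbatim}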
\subsection{Data availability}
DAMPE data are available to the Chinese Collaboration through the PMO institute, and are made accessible to the European Collaboration by transferring them from PMO to CNAF, and from there to the DPNC.
Every time new {\it 1B}, {\it 1F} or {\it 2A} data files are available at PMO, they are copied, using the GridFTP \cite{gridftp} protocol, into the DAMPE storage area at CNAF. From CNAF, a copy of each stream to the Geneva computing farm is triggered every 4 hours via rsync. Dedicated batch jobs are submitted once per day to asynchronously verify the checksums of the data newly transferred from PMO to CNAF and from CNAF to Geneva.
Data verification and copy processes are managed through a dedicated User Interface (UI), \texttt{ui-dampe}.
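A daily verification job of this kind could look like the following minimal Python sketch; the manifest format and file paths are assumptions for illustration.
\begin{verbatim}
import hashlib

def md5sum(path, chunk=8 * 1024 * 1024):
    # Stream the file in chunks to keep memory usage constant.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verify(manifest):
    # manifest: lines of "<md5> <path>" produced at the source site.
    bad = []
    with open(manifest) as f:
        for line in f:
            expected, path = line.split()
            if md5sum(path) != expected:
                bad.append(path)
    return bad
\end{verbatim}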
The connection to China passes through the Orientplus \cite{orientplus} link of the G\'eant Consortium \cite{geant}. The data transfer rate is currently limited by the connection of the PMO to the China Education and Research Network (CERNET), which has a maximum bandwidth of 100 Mb/s; the PMO-CNAF copy process is therefore used only for the daily data production.
To transfer data towards Europe in case of DAMPE data re-processing, and to share with China the Monte Carlo data generated in Europe, a dedicated DAMPE server has been installed at the Institute of High Energy Physics (IHEP) in Beijing, connected to CERNET with a 1 Gb/s bandwidth. Data synchronization between this server and PMO is done by manual hard-drive exchange.
To simplify user data access across Europe, an XRootD federation has been implemented: an XRootD redirector has been set up in Bari, with end-point XRootD server installations (providing the actual data) at CNAF, Bari and Geneva. These end-points provide unified read access for users in Europe.
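As an illustration, a user in Europe could fetch a file through the federation with the standard \texttt{xrdcp} client, here wrapped in Python; the redirector host and file path are hypothetical.
\begin{verbatim}
import subprocess

# The redirector resolves the request to whichever end-point
# (CNAF, Bari or Geneva) actually holds the file.
subprocess.run(
    ["xrdcp",
     "root://xrootd-redirector.example.org//dampe/2A/sample.root",
     "/tmp/sample.root"],
    check=True)
\end{verbatim}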
\section{CNAF contribution}
The CNAF computing center is the mirror of DAMPE data outside China and the main data center for Monte Carlo production.\\
In 2018, a dedicated user interface, 300 TB of disk space and 7.8k HS06 of computing power have been allocated for the DAMPE activities.
\section{Activities in 2018}
DAMPE activities at CNAF in 2018 have been related to data transfer, Monte Carlo production and data analysis.
\subsection{Data transfer}
The daily transfer of data from PMO to CNAF, and onwards from CNAF to Geneva, has been performed throughout the year.
The daily transfer rate has been about 100 GB from PMO to CNAF and more than 100 GB from CNAF to PMO.
The step between PMO and CNAF is performed, as described in the previous sections, via the GridFTP protocol.
Two strategies have instead been used to copy data from CNAF to PMO: via \texttt{rsync} from the UI and via \texttt{rsync} managed by batch jobs.
DAMPE data have been reprocessed three times during the year, and a dedicated copy task was carried out to transfer the new production releases, in addition to the ordinary daily copy.
\subsection{Monte Carlo Production}
\iffalse
\begin{figure}
\begin{center}
\includegraphics[width=30pc]{CNAF_HS06_2017}
\end{center}
\caption{\label{fig:hs06_2017} CPU time consumption, in terms of HS06 (blue solid for daily computation, dashed for the average over the entire year). The red solid line corresponds to the annual pledge and the green dotted line corresponds to the job efficiency computed in a 14-day sliding window.}
\end{figure}
\fi
\begin{figure}[ht]
\begin{center}
\includegraphics[width=35pc]{figure_cnaf.png}
\end{center}
\caption{\label{fig:figure_cnaf} Status of completed simulation production at CNAF.}
\end{figure}
\begin{figure}[ht]
\begin{center}
\includegraphics[width=35pc]{figureCNAF2018.png}
\end{center}
\caption{\label{fig:figure_cnaf_2018} Status of completed simulation production at CNAF in 2018.}
\end{figure}
\iffalse
\begin{figure}[ht]
\begin{center}
\includegraphics[width=35pc]{figure_all.png}
\end{center}
\caption{\label{fig:figure_all} Status of completed simulation production at all DAMPE simulation sites.}
\end{figure}
\fi
As the main data center for Monte Carlo production, CNAF has been strongly involved in the Monte Carlo campaign.
At CNAF almost 300 thousand jobs have been executed, for a total of about 3 billion Monte Carlo events.
The Monte Carlo campaign is still ongoing for different particle species and energy ranges.
In figure \ref{fig:figure_cnaf} the status of completed simulation production at CNAF is shown.
During 2019 we will perform a new full simulation campaign with an improved version of our simulation code: this is crucial for all the forthcoming analyses.
\subsection{Data Analysis}
Most of the DAMPE analysis in Europe is performed at CNAF, whose role has been crucial for all the DAMPE publications, such as the Nature paper on the direct detection of a break in the TeV cosmic-ray spectrum of electrons and positrons \cite{nature}.
\section{Acknowledgments}
The DAMPE mission was funded by the strategic priority science and technology projects in space science of the Chinese Academy of Sciences, and in part by the National Key Program for Research and Development and the 100 Talents program of the Chinese Academy of Sciences. In Europe, the work is supported by the Italian National Institute for Nuclear Physics (INFN), the Italian University and Research Ministry (MIUR), and the University of Geneva. We extend our gratitude to INFN-T1 for their continued support, also beyond providing computing resources.
\section*{References}
\begin{thebibliography}{9}
\bibitem{root} Antcheva I. {\it et al.} 2009 {\it Computer Physics Communications} {\bf 180} 12, 2499 - 2512, \newline https://root.cern.ch/guides/reference-guide.
\bibitem{mongo} https://www.mongodb.org
\bibitem{flask} http://flask.pocoo.org
\bibitem{latpipeline} Dubois R. 2009 {\it ASP Conference Series} {\bf 411} 189
\bibitem{dirac} Tsaregorodtsev A. et al. 2008 {\it Journal of Physics: Conference Series} {\bf 119} 062048
\bibitem{gridftp} Allcock W, Bresnahan J, Kettimuthu R and Link M 2005 The Globus striped GridFTP framework and server {\it Proc. ACM/IEEE SC 2005 Conference (SC'05)} p 54, doi:10.1109/SC.2005.72, \newline http://www.globus.org/toolkit/docs/latest-stable/gridftp/
\bibitem{nature} Ambrosi G {\it et al.} (DAMPE Collaboration) 2017 Direct detection of a break in the teraelectronvolt cosmic-ray spectrum of electrons and positrons {\it Nature} {\bf 552} 63--66
\bibitem{orientplus} http://www.orientplus.eu
\bibitem{geant} http://www.geant.org
\bibitem{cernet} http://www.cernet.edu.cn/HomePage/english/index.shtml
\bibitem{asdc} http://www.asdc.asi.it
\end{thebibliography}
\end{document}
\documentclass[a4paper]{jpconf}
\usepackage{graphicx}
\bibliographystyle{iopart-num}
%\usepackage{citesort}
\begin{document}
\title{DarkSide program at CNAF}
\author{S. Bussino, S. M. Mari, S. Sanfilippo}
\address{INFN and Universit\`{a} degli Studi Roma 3}
\ead{bussino@fis.uniroma3.it; stefanomaria.mari@uniroma3.it; simone.sanfilippo@roma3.infn.it}
\begin{abstract}
DarkSide is a direct dark matter search program based at the underground Laboratori Nazionali del Gran Sasso
(\textit{LNGS}), looking for the rare nuclear recoils (possibly) induced by so-called Weakly
Interacting Massive Particles (\textit{WIMPs}). It is based on a dual-phase Time Projection Chamber filled with liquid
argon (\textit{LAr-TPC}) from underground sources. The prototype project is a LAr-TPC with a $(46.4\pm0.7)$\,kg
active mass, the DarkSide-50 (\textit{DS-50}) experiment, which is installed inside a 30\,t organic liquid scintillator
neutron veto, in turn installed at the center of a 1\,kt water Cherenkov veto against the residual flux of cosmic
muons. DS-50 has been taking data since November 2013 with Atmospheric Argon (\textit{AAr}) and, since April 2015, has
been operated with Underground Argon (\textit{UAr}) highly depleted in radioactive ${}^{39}$Ar. The exposure of 1422
kg\,d of AAr has demonstrated that operating DS-50 for three years in a background-free condition is feasible,
thanks to the excellent performance of the pulse shape analysis. The first release of results from an exposure
of 2616 kg\,d of UAr has shown no dark matter candidate events. This is the most sensitive dark matter search performed
with an argon-based detector, corresponding to a 90\% CL upper limit on the WIMP-nucleon spin-independent cross section
of $2\times10^{-44}$\,cm$^2$ for a WIMP mass of 100\,GeV$/c^2$. DS-50 will be operated until the end of 2019.
Building on the experience of DS-50, the DS-20k project has been proposed, based on a new LAr-TPC of more than 20 tonnes.
\end{abstract}
\section{The DS-50 experiment}
The existence of dark matter is now established from different gravitational effects, but its nature is still a deep mystery. One possibility, motivated by other considerations in elementary particle physics, is that dark matter consists of new undiscovered elementary particles. A leading candidate explanation, motivated by supersymmetry theory (\textit{SUSY}), is that dark matter is composed of as-yet undiscovered Weakly Interacting Massive Particles (\textit{WIMPs}) formed in the early universe and subsequently gravitationally clustered in association with baryonic matter \cite{Good85}. Evidence for new particles that could constitute WIMP dark matter may come from upcoming experiments at the Large Hadron Collider (\textit{LHC}) at CERN or from sensitive astronomical instruments that detect radiation produced by WIMP-WIMP annihilations in galaxy halos. The thermal motion of the WIMPs comprising the dark matter halo surrounding the galaxy and the Earth should result in WIMP-nuclear collisions of sufficient energy to be observable by sensitive laboratory apparatus. WIMPs could in principle be detected in terrestrial experiments through their collisions with ordinary nuclei, giving observable low-energy $<$100 keV nuclear recoils. The predicted low collision rates require ultra-low background detectors with large (0.1-10 ton) target masses, located in deep underground sites to eliminate neutron background from cosmic ray muons. The DarkSide program is the first to employ a Liquid Argon Time Projection Chamber (\textit{LAr-TPC}) with low levels of ${}^{39}Ar$, together with innovations in photon detection and background suppression.
The DS-50 detector is installed in Hall C at Laboratori Nazionali del Gran Sasso (\textit{LNGS}) at a depth of 3800 m.w.e.\footnote{The meter water equivalent (m.w.e.) is a standard measure of cosmic ray attenuation in underground laboratories.}, and it will continue taking data until the end of 2019. The project will continue with DarkSide-20k (\textit{DS-20k}) and \textit{Argo}, a multi-ton detector with an expected sensitivity improvement of two orders of magnitude. The DS-50 target volume is hosted in a dual-phase TPC that contains argon in both the liquid and gaseous phases, the latter on top of the former. The scattering of WIMPs or background particles in the active volume induces prompt scintillation light, called S1, and ionization. Electrons which do not recombine are drifted by an electric field of 200 V/cm applied along the z-axis. They are then extracted into the gaseous phase above the extraction grid and accelerated by an electric field of about 4200 V/cm. Here a secondary, larger signal due to electroluminescence takes place, the so-called S2. The light is collected by two arrays of 19 3"-PMTs on each side of the TPC, corresponding to a 60\% geometrical coverage of the end plates and 20\% of the total TPC surface. The detector is capable of reconstructing the position of the interaction in 3D: the z-coordinate is easily computed from the electron drift time, while the time profile of the S2 light collected by the top plate PMTs allows the reconstruction of the \textit{x} and \textit{y} coordinates. The LAr-TPC can exploit Pulse Shape Discrimination (\textit{PSD}) and the ratio of scintillation to ionization (S1/S2) to reject $\beta/\gamma$ background in favor of the nuclear recoil events expected from WIMP scattering \cite{Ben08, Bou06}.\\ Events due to neutrons from cosmogenic sources and from radioactive contamination in the detector components, which also produce nuclear recoils, are suppressed by the combined action of the neutron and cosmic-ray vetoes. The first, in particular, is a 4.0 m diameter stainless steel sphere filled with 30 t of borated liquid scintillator acting as a Liquid Scintillator Veto (\textit{LSV}). The sphere is lined with \textit{Lumirror} reflecting foils and is equipped with an array of 110 Hamamatsu 8"-PMTs with low-radioactivity components and high-quantum-efficiency photocathodes. The cosmic-ray veto is an 11 m diameter, 10 m high cylindrical tank filled with high purity water, acting as a Water Cherenkov Detector (\textit{WCD}). The inside surface of the tank is covered with a laminated \textit{Tyvek-polyethylene-Tyvek} reflector and is equipped with an array of 80 ETL 8"-PMTs with low-radioactivity components and high-quantum-efficiency photocathodes.
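A widely used PSD variable for argon scintillation is the prompt fraction of the S1 signal, conventionally integrated over the first 90\,ns of the pulse,
\[
f_{90} = \frac{\int_{0}^{90\,\mathrm{ns}} S1(t)\,\mathrm{d}t}{\int_{0}^{\infty} S1(t)\,\mathrm{d}t}\,,
\]
so that nuclear recoils, which are dominated by the fast scintillation component, yield larger $f_{90}$ values than electron recoils.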
The exposure of 1422 kg\,d of AAr has demonstrated that operating DS-50 for three years in a background-free condition is feasible, thanks to the excellent performance of the pulse shape analysis. The first release of results from an exposure of 2616 kg\,d of UAr has shown no dark matter candidate events. This is the most sensitive dark matter search performed with an argon-based detector, corresponding to a 90\% CL upper limit on the WIMP-nucleon spin-independent cross section of $2\times10^{-44}$\,cm$^2$ for a WIMP mass of 100\,GeV$/c^2$ \cite{Dang16}.
\section{DS-50 at CNAF}
The data readout in the three detector subsystems is managed by dedicated trigger boards: each subsystem is equipped with a user-customizable FPGA unit, in which the trigger logic is implemented. The inputs and outputs of the different trigger modules are processed by a set of electrical-to-optical converters, and the communication between the subsystems uses dedicated optical links. To keep the TPC and veto readouts aligned, a pulse per second (\textit{PPS}) generated by a GPS receiver is sent to the two systems, where it is acquired and interpolated with a resolution of 20 ns to allow offline confirmation of event matching.
To acquire data, the DarkSide detector uses a DAQ machine equipped with a storage buffer of 7 TB. Raw data are processed and automatically sent to the CNAF farm via a 10 Gbit optical link (with approximately 7 hours of delay). At CNAF, data are housed on a disk storage system of about 1 PB net capacity, with a part of the data (300 TB) backed up on the tape library. Raw data from CNAF and processed data from LNGS are then semi-automatically copied to the Fermi National Accelerator Laboratory (\textit{FNAL}) via a 100 Gbit optical link. Part of the reconstructed data (RECO files) are sent back to CNAF via the same link, at a rate of about 0.5 TB/month. Data processed and analyzed at FNAL are compared with the analysis performed at CNAF. The INFN Roma 3 group maintains and follows, step by step, the overall transfer procedure and arranges the data management.
\section{The future of DarkSide: DS-20k}
Building on the successful experience in operating the DS-50 detector, the DarkSide program will continue with DS-20k, a direct WIMP search using a two-phase Liquid Argon Time Projection Chamber (LAr-TPC) with an active (fiducial) mass of 23 t (20 t), to be built in the coming years. The optical sensors will be Silicon Photomultiplier (\textit{SiPM}) matrices with very low radioactivity. Operation of DS-50 demonstrated a major reduction in the dominant ${}^{39}$Ar background when using argon extracted from an underground source, even before applying pulse shape analysis. Data from DS-50, in combination with MC simulations and analytical modelling, also show that a rejection factor greater than $3\times10^9$ for the discrimination between electron and nuclear recoils is achievable. The expected large rejection factor, along with the use of the veto system and of silicon photomultipliers in the LAr-TPC, is the key to large LAr-TPC detector masses, while maintaining an experiment in which fewer than 0.1 background events are expected to occur within the WIMP search region during the planned exposure.
Thanks to the measured ultra-low background, DS-20k will have sensitivity to WIMP-nucleon cross sections of
$1.2\times10^{-47}\ cm^2$ and $1.1\times10^{-46}\ cm^2$ for WIMPs respectively of
$1\ TeV/c^2$ and $10\ TeV/c^2$ mass, to be achieved during a 5 yr run producing an exposure of 100 t yr free from any instrumental background.
DS-20k could then extend its operation to a decade, increasing the exposure to 200 t yr, reaching a sensitivity of $7.4\times10^{-48}\ cm^2$ and $6.9\times10^{-47}\ cm^2$ for WIMPs respectively of $1\ TeV/c^2$ and $10\ TeV/c^2$ mass.
DS-20k will be more than two orders of magnitude larger in size than DS-50 and will utilize SiPM technologies. Therefore, the collaboration plans to build a prototype detector of intermediate size, called DS-Proto, incorporating the new technologies for their full validation. The choice of a mass scale of about 1\,t allows a full validation of the technological choices for DS-20k. DS-Proto will be built at CERN; data taking is foreseen to start in 2020.
\section{DS-Proto at CNAF}
Data from DS-Proto will be stored and managed at CNAF. The construction, operation, and commissioning of DS-Proto will allow the validation of the major innovative technical features of DS-20k. Data taking will start in 2020. The computing resources have been evaluated according to the data throughput, trigger rate and duty cycle of the experiment. A computing power of about 1 kHS06 and 300 TB (net) of disk space are needed to fully support DS-Proto data taking and data analysis in 2020. In order to perform the CPU-demanding Monte Carlo production at CNAF, 30 TB (net) and 2 kHS06 are needed. The DS-Proto data taking is foreseen to last a few years, requiring a total disk space of the order of some PB and a computing capacity of several kHS06.
%However, the goal of DS-20k is a background free exposure of 100 ton-year of liquid Argon which requires further suppression of ${}^{39}Ar$ background with respect to DS-50. The project \textit{URANIA} involves the upgrade of the UAr extraction plant to a massive production rate suitable for multi-ton detectors. The project \textit{ARIA} instead involves the construction of a very tall cryogenic distillation column in the Seruci mine (Sardinia, Italy) with the high-volume capability of chemical and isotopic purification of UAr.\\ The projected sensitivity of DS-20k and Argo reaches a WIMP-nucleon cross section of $10^{-47}\ cm^2$ and $10^{-48}\ cm^2$ respectively, for a WIMP mass of 100 $GeV/cm^2$, exploring the region of the parameters plane down to the irreducible background due to atmospheric neutrinos.
\section*{References}
\begin{thebibliography} {17}
\bibitem{Good85} M.~W.~Goodman, E.~Witten, Phys. Rev. D {\bf 31} 3059 (1985);
\bibitem{Loo83} H.~H.~Loosli, Earth Plan. Sci. Lett. {\bf 63} 51 (1983);
\bibitem{Ben07} P.~Benetti et al. (WARP Collaboration), Nucl. Inst. Meth. A {\bf 574} 83 (2007);
\bibitem{Ben08} P.~Benetti et al. (WARP Collaboration), Astropart. Phys. {\bf 28} 495 (2008);
\bibitem{Bou06} M.~G.~Boulay, A.~Hime, Astropart. Phys. {\bf 25} 179 (2006);
\bibitem{Dang16} D.~D'Angelo et al. (DARKSIDE Collaboration), Il nuovo cimento C {\bf 39} 312 (2016).
\end{thebibliography}
\end{document}