Title |
Data Quality Monitoring Framework for the ATLAS experiment at LHC |
Submitted |
24-JAN-07 07:25 (UTC -06:00) |
Classification |
Controls and Monitoring Systems |
Modified |
22-MAR-07 16:46 (UTC -05:00) |
Session |
PS1A |
Presentation |
Poster |
Presenter |
Alina Corso-Radu |
Paper ID |
PS1A004 |
Author(s) |
Alina Corso-Radu, Serguei Kolos (UCI, Irvine), Michael Hauschild (CERN, Geneva), Haleh Hadavand, Robert Kehoe (SMU, Dallas) |
Abstract |
Data Quality Monitoring (DQM) is an important part of the data-taking process of HEP experiments. DQM involves analyzing monitoring data with user-defined algorithms and relaying a summary of the analysis results while the data is being processed. In the online environment, DQM provides the shift crew with prompt information about the current run, so that problems can be spotted and addressed early. During offline reconstruction, DQM performs more complex analysis of physics quantities, and the results are used to assess the quality of the reconstructed data. The ATLAS Data Quality Monitoring Framework (DQMF) is a distributed software system providing DQM functionality in the online environment. The DQMF achieves a scalable architecture by distributing the execution of the analysis algorithms over a configurable number of DQMF agents running on different nodes connected over the network. The core of the DQMF is designed to depend only on software common to the online and offline environments (such as ROOT), and so it is also used for offline data quality assessment. This paper describes the main requirements, the architectural design, and the implementation of the DQMF. |
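The pattern the abstract describes, agents applying user-defined check algorithms to monitoring histograms and emitting a summary status, can be illustrated with a minimal sketch. This is not the DQMF API: the real framework is C++ built on ROOT, and the names below (`DQResult`, `check_mean`, the toy bin contents) are invented for this example.

```python
# Sketch of a user-defined DQ check, in the spirit of the DQMF:
# an algorithm inspects a histogram and returns a color-coded status.
# All names here are illustrative, not the actual DQMF interface.
from enum import Enum

class DQResult(Enum):
    GOOD = "Green"
    WARNING = "Yellow"
    BAD = "Red"

def check_mean(bin_values, expected, warn_tol, bad_tol):
    """Flag a histogram whose entry-weighted mean bin index
    drifts from the expected value by more than a tolerance."""
    total = sum(bin_values)
    if total == 0:
        return DQResult.BAD  # an empty histogram is itself a problem
    mean_bin = sum(i * v for i, v in enumerate(bin_values)) / total
    deviation = abs(mean_bin - expected)
    if deviation <= warn_tol:
        return DQResult.GOOD
    if deviation <= bad_tol:
        return DQResult.WARNING
    return DQResult.BAD

# A DQMF agent would periodically apply its configured checks
# to histograms published by the data-taking system:
hist = [0, 2, 10, 30, 10, 2, 0]  # toy bin contents, peaked at bin 3
print(check_mean(hist, expected=3.0, warn_tol=0.5, bad_tol=1.0))
```

In the real framework the summary results are relayed to shifters while data taking proceeds; here the status is simply printed.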
|