Introduction

Data processing is an important aspect of particle physics experiments, since the volume of data to be handled is quite large: a single LEP experiment produces on the order of a terabyte of data per year. As a result, every particle physics laboratory has a large data processing centre, even though more than 50% of the computation is actually carried out at universities or other research establishments. Particle physicists from various countries are in close contact on a continental and worldwide basis, exchanging information mainly via preprints and conferences. The similarities in experimental devices and problems, and the close collaboration, favour the adoption of common software methodologies that sometimes develop into widely used standard packages. Examples are the histogramming, fitting and data presentation package HBOOK, its graphics interface HPLOT [2] and the Physics Analysis Workstation (PAW) system [3], all of which have been developed at CERN.

HBOOK is a subroutine package for handling statistical distributions (histograms and Ntuples) in a Fortran scientific computing environment. It presents results graphically on the line printer, and can optionally draw them on graphics output devices via the HPLOT package. PAW integrates the functionality of the HBOOK and HPLOT (and other) packages into an interactive workstation environment, providing the user with a coherent and complete working environment, from reading a (mini)DST, through data analysis, to preparing the final data presentation.
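As an illustration of the kind of Fortran interface HBOOK provides, the sketch below books a one-dimensional histogram, fills it, and prints it on the line printer. It is a minimal sketch only: the histogram identifier, title, binning, store size and fill values are arbitrary illustrative choices, and a real analysis would fill the histogram from event data.

      PROGRAM HEXAMP
*     Minimal sketch of the basic HBOOK calling sequence:
*     book a 1-dimensional histogram, fill it, print it.
      PARAMETER (NWPAWC=10000)
      COMMON /PAWC/ HMEMOR(NWPAWC)
*     Declare to HBOOK how many words of store are available in /PAWC/
      CALL HLIMIT(NWPAWC)
*     Book histogram 10: title, 50 bins between 0. and 1.
*     (the last argument 0. selects standard bin-content storage)
      CALL HBOOK1(10,'Example distribution',50,0.,1.,0.)
*     Fill the histogram, here with a simple deterministic ramp,
*     each entry carrying unit weight
      DO 10 I=1,1000
         X = REAL(I)/1000.
         CALL HF1(10,X,1.)
   10 CONTINUE
*     Print histogram 10 on the line printer
      CALL HPRINT(10)
      END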

These packages are available from the CERN Program Library (see the copyright page for distribution conditions). They are presently in use at several hundred computer installations throughout the world.


