The following is a summary of decisions made at the BRAHMS Software
meeting held at BNL, Dec 6-10, 1999.
-
Databases
-
Draw diagrams depicting steps of the analysis and what is put into and
taken out of data bases. Anders will provide a status report in January
before his classes start. (AH, IAB)
-
Define some schema and generate a prototype MySQL database to be used as
the run database (a schema sketch follows this list). (KO, JHL)
-
Collect from each detector subsystem representative a list of what types
of calibration data need to be kept in the calibration database.
-
Collect from each detector subsystem representative information on what
needs to be kept in the run control database including frequency of updates.
-
Send out Dieter’s list regarding databases and get each detector subsystem
representative to fill in the blanks for their respective detector.
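As a concrete starting point, a minimal sketch of a possible run-database
table is given below as a ROOT macro, assuming ROOT's TSQLServer MySQL
interface. The table layout, column names, and connection parameters are
illustrative assumptions only, not the schema to be defined.

  // runDbPrototype.C -- ROOT macro sketching a possible run-database table.
  // The layout below is an illustrative assumption, not an agreed schema.
  #include "TSQLServer.h"

  void runDbPrototype()
  {
    // Host, database, and account names are placeholders.
    TSQLServer *db = TSQLServer::Connect("mysql://localhost/brahmsrun",
                                         "brahms", "password");
    if (!db) return;
    // One row per run: times, trigger setup, a magnet current, a comment.
    db->Query("CREATE TABLE IF NOT EXISTS runinfo ("
              " runno     INT PRIMARY KEY,"
              " starttime DATETIME,"
              " stoptime  DATETIME,"
              " trigsetup VARCHAR(64),"
              " d1current FLOAT,"
              " comment   TEXT)");
    delete db;
  }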
-
Online monitoring
-
Develop monitoring base class more (KH)
-
Develop a "super monitor" which would have a set of monitor objects, presumably
one per detector, for use on runs. (KH)
-
Obtain monitor objects which inherit from the base class (BrBaseMonitor)
from each detector subsystem representative (see the sketch after this list).
-
Develop a way to directly compare "reference" arrays with arrays that are
being monitored at a given time. (KH)
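To make the intended structure concrete, a minimal sketch of a detector
monitor inheriting from the base class follows. The method names and the
BrTofMonitor class are invented for illustration; the actual interface of
BrBaseMonitor is still being developed (KH), so only the inheritance
pattern should be taken from this.

  // Hypothetical TOF monitor; the real BrBaseMonitor interface may differ.
  #include "TH1F.h"

  class BrBaseMonitor {                // stand-in for the actual base class
  public:
    virtual ~BrBaseMonitor() { }
    virtual void Book()  = 0;          // create histograms
    virtual void Event() = 0;          // fill histograms for one event
  };

  class BrTofMonitor : public BrBaseMonitor {
  public:
    void Book()  { fAdc = new TH1F("tofAdc", "TOF ADC", 256, 0., 1024.); }
    void Event() { /* loop over TOF digits and call fAdc->Fill(adc) */ }
  private:
    TH1F *fAdc;
  };

  // A "super monitor" would hold one such object per detector and simply
  // forward the Book()/Event() calls to each of them.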
-
HV
-
Contact Claus regarding changes since last summer to underlying framework
of HV control program. (KO)
-
Raw data
-
Iteration of recordID’s for data (KO)
-
One recordID for both ZDC’s
-
Change current scheme to make "space" for TPC’s
-
Circulate new list of recordID's (KO); an illustrative sketch follows.
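Purely to illustrate the kind of list to be circulated, a sketch is given
below; every numerical value here is a placeholder and does not reflect
the actual assignments.

  // Hypothetical recordID assignments -- values are placeholders only.
  enum EBrRecordID {
    kRecBB   = 1,      // beam-beam counters
    kRecZDC  = 2,      // one recordID covering both ZDC's
    kRecTof  = 3,
    // ... other detectors ...
    kRecTPC1 = 10,     // block reserved to make "space" for the TPC's
    kRecTPC2 = 11,
    kRecTPC3 = 12
  };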
-
Simulations
-
Use RCF farms to do simulations of MRS (FV)
-
Use NBI computers to generate 20-30k events of FS data (IAB)
-
Start a BRAHMS-private MDC (BrMDC) to exercise the RCF computers as well as
the BRAHMS software in its current configuration.
-
Code organization
-
Move online code to the main BRAHMS CVS repository (named oncs)
The following is a summary of the topics discussed at the
software meeting.
A copy of the agenda can be found here.
-
Databases
-
Slides from A. Holm's presentation are here,
with a discussion of the slides here.
-
A long discussion on the underlying implementation of the database.
This had originally been assumed to be Objectivity, since that is an
object-oriented database which should map onto ROOT. This assumption has
been called into question after several months of attempts to implement
a prototype. In addition, observations were made about what
the other RHIC experiments are doing with databases. Here
is a summary of the database investigation by C. Holm (postscript).
-
In addition to the underlying database, there need to be access methods
in BRAT which are transparent to the user. This would provide, in
principle (although questionably in practice), the possibility of changing
the database in mid-course, if needed, without users having to change their
code or their understanding of how to access the database. Some prototype
database managers with access methods have been investigated and implemented
in BRAT. These currently read simple flat files which act as
the "databases" for the moment. They are used in all of the
digitization, reconstruction, and track-matching code that exists so far.
Two prototype database managers exist: one for detector geometry
and the other for detector parameters. A lot of work remains to
further specify which parameters should go into these two databases,
as well as to implement the other databases, but these can serve as a
prototype. Here
are slides that were presented giving some details of what has been done so far.
Of course, for complete details there is always the code, which is by definition
the most up to date.
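As a rough illustration of what transparent access could look like from
user code, consider the sketch below. The class and method names are
invented for this example and need not match the actual prototype
managers in BRAT.

  // User code asks a manager for geometry; whether the answer comes from
  // a flat file (today) or a real database (later) is hidden from it.
  class BrGeometryDbManager {
  public:
    static BrGeometryDbManager *Instance() {   // single shared instance
      static BrGeometryDbManager mgr;
      return &mgr;
    }
    // Fill x,y,z with the global position of the named detector volume.
    void GetPosition(const char *volume, double &x, double &y, double &z);
  };

  void userAnalysis()
  {
    double x, y, z;
    BrGeometryDbManager::Instance()->GetPosition("TPM1", x, y, z);
    // ... the user never sees which back end produced the answer.
  }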
-
Discussion of different types of databases needed for BRAHMS running and
analysis. These databases include:
-
Geometry DB
-
Beamline, detectors, magnets
-
local position (x, y) of pads relative to the TPC origin, for example
(see the transformation sketch after this list)
-
global position (x,y,z) of TPC on frame
-
Field Maps
-
measured Bx, By, Bz on a grid, for one to several current settings
-
update frequency: order of years
-
Calibration
-
TPC gain and t0 for each channel; dead, hot, and noisy channels
-
TOF
-
DC
-
Update frequency: order of hours or days
-
Slow control
-
Rapidly changing data
-
TPC gas (temperature, pressure, drift velocity)
-
other detectors? (need input from all detector subsystems)
-
update frequency: seconds, minutes
-
Slowly changing data
-
magnet currents, hall probes, HV
-
update frequency: 5 minutes
-
Stable data
-
TPC pedestal: mean, RMS, and timebin for each channel
-
update frequency: days
-
Run DB: discussed in more detail below
-
Reconstruction DB: no discussion
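As an illustration of the geometry-database example above (local pad
positions versus the global TPC placement), a minimal coordinate
transformation is sketched below. It assumes, for simplicity, a single
rotation about the vertical axis plus a translation; the real chain may
involve several such steps.

  #include <cmath>

  // Convert a pad's local (x, y) in the TPC frame to global (x, y, z),
  // given the TPC origin (X0, Y0, Z0) and a rotation angle about the
  // vertical (y) axis.  A simplified sketch only.
  void LocalToGlobal(double xl, double yl,
                     double X0, double Y0, double Z0, double angle,
                     double &xg, double &yg, double &zg)
  {
    xg = X0 + xl * std::cos(angle);
    yg = Y0 + yl;
    zg = Z0 - xl * std::sin(angle);
  }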
-
Discussion on BRAHMS Run Data Base. A list of what should be in the
BRAHMS Run Data Base is here.
-
BRAHMS Analysis
-
The data life cycle in BRAHMS.
-
Data
analysis steps in BRAHMS
-
Discussion of Hough
transforms. A prototype Hough transform analysis was applied to events
of the highest density in one of the TPC's and was shown to have similar
tracking efficiency to the current track finder used in BRAT. The
particular analysis done used x-z and y-z projections. Further analysis
applying x-y projections in addition to the others appeared to improve
the track finding efficiency even more.
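For reference, the core of such a projection-based Hough transform is an
accumulator over line parameters. A minimal sketch for the x-z projection,
assuming straight tracks x = a + b*z, is given below; the parameter ranges
and binning are illustrative only.

  #include <cstddef>
  #include <vector>

  // Each hit votes for all (a, b) bins consistent with it; peaks in the
  // accumulator correspond to track candidates in the x-z projection.
  const int    kNa = 100,   kNb = 100;
  const double kAmin = -20., kAmax = 20.;   // intercept range [cm]
  const double kBmin = -0.5, kBmax = 0.5;   // slope range

  void HoughFill(const std::vector<double> &x, const std::vector<double> &z,
                 int acc[kNa][kNb])
  {
    for (std::size_t i = 0; i < x.size(); ++i) {
      for (int jb = 0; jb < kNb; ++jb) {
        double b = kBmin + (jb + 0.5) * (kBmax - kBmin) / kNb;
        double a = x[i] - b * z[i];               // intercept implied by hit
        int ja = int((a - kAmin) / (kAmax - kAmin) * kNa);
        if (ja >= 0 && ja < kNa) ++acc[ja][jb];   // cast the vote
      }
    }
  }

Combining the peaks found in the x-z and y-z (and, per the discussion
above, x-y) accumulators then yields the track candidates.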
-
Vertex
Finding
-
Momentum
Determination
-
Geometrical
Acceptance
-
ROOT Trees
-
BRAHMS data files generated by BRAT are in the form of ROOT trees.
These trees are, however, in a format which organizes the raw data according
to different classifications. This classification is naturally by
detector in the early part of the data analysis, but the data will probably
be grouped differently as the analysis proceeds. The organization is by event nodes
(BrEventNode) which are kept in a list in an event (BrEvent). This organization
is not a convenient way of looking at physics data in the later stages of the
analysis, in the way that the ROOT developers envisioned. There was
discussion on how to generate some automatic scripts or programs which
would take as input the type of analysis one wants to do and convert
a BRAHMS event file into a "typical" ROOT tree for physics analysis.
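A rough sketch of what such a conversion could look like: read the BRAHMS
events, pull out the reconstructed quantities, and fill a flat ROOT tree
of simple per-track variables. The branch layout and the event-node name
in the comments are placeholders; the real converter would be generated
from the user's choice of analysis.

  // flatten.C -- sketch of converting BRAHMS event files into a plain
  // ROOT tree for physics analysis.  The branch layout is illustrative.
  #include "TFile.h"
  #include "TTree.h"

  void flatten()
  {
    TFile out("physics.root", "RECREATE");
    Float_t p, theta, phi;               // per-track summary variables
    TTree tree("physTree", "flat physics tree");
    tree.Branch("p",     &p,     "p/F");
    tree.Branch("theta", &theta, "theta/F");
    tree.Branch("phi",   &phi,   "phi/F");

    // Event loop (pseudo-code): for each BrEvent in the input file, get
    // the event node holding matched tracks, e.g.
    //   BrEventNode *node = ev->GetEventNode("MatchedTracks"); // name assumed
    // and for each track set p, theta, phi and call tree.Fill().

    tree.Write();
    out.Close();
  }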
-
Scripts for reconstruction
-
A set of scripts was generated in consultation with RCF personnel for MDC-I
which executed jobs on the CRS. These worked for the purposes of MDC-I
and are a good starting point for automatic scripts for the "real"
data analysis.
-
Further automation:
-
Analysis Master: Take a set of runs, assemble a program and prepare job
scripts of the type described above to submit to the CRS
-
Need a script which would obtain information from the database to help
prepare run scripts (a sketch follows this list)
-
The progress of jobs in the CRS needs to be monitored
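A sketch of what database-driven script preparation might look like: for
each run number, write a CRS job script with the input and output file
names filled in. Everything below, including the file-name conventions and
the executable name, is an assumption about a tool that does not yet
exist; a real version would query the run database instead of deriving
names from the run number.

  // makeJobs.C -- sketch of a generator for CRS reconstruction job scripts.
  #include <cstdio>

  void makeJobs(int firstRun, int lastRun)
  {
    for (int run = firstRun; run <= lastRun; ++run) {
      char name[64];
      sprintf(name, "job_run%05d.csh", run);   // one script per run
      FILE *f = fopen(name, "w");
      if (!f) continue;
      // A real tool would take input files and options from the run DB.
      fprintf(f, "#!/bin/csh\n");
      fprintf(f, "bratmain -i run%05d.root -o rec%05d.root\n", run, run);
      fclose(f);
    }
  }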
-
Simulations
-
Status
and Plans of GBRAHMS
-
Discussion of what types of simulations to do, and where, in the coming weeks
-
NBI, BNL; role of Romanian collaborators
-
see decisions above
-
BRAT
Status