
# Workbook for BaBar Offline Users - A Quick Tour of the BaBar Offline World

This section of the Workbook will take you through your first BaBar offline analysis job. It assumes that you have already set up your account as described in the previous Workbook section, Logging In. By the end of this first job you will have analyzed and histogrammed BaBar data. This will be accomplished using the software release analysis-30 (18.6.2a) and reading actual BaBar data from one of the production databases.

Commands that the user might enter are written in bold red, to make them easier to find when browsing the page later.

Click on the following link for a Summary of Quicktour Commands.

## Introduction

**NOTE!** If you are returning to the quick tour after having left in the middle, then there are a few commands you need to issue now, before you forget:

If you have logged off since you were last here:


> cd ~/ana30
ana30> srtpath  <enter> <enter>
ana30> cond18boot


srtpath sets up the correct architecture, and cond18boot sets up a path to the database.

If you have not logged off, and have been away for more than 24 hours, then you need to enter:

> klog

The system will then prompt you for your password (the same password that you use to log in to a SLAC computer). Running klog renews your AFS token (see the Unix section of the workbook).

In certain cases, the srtpath, cond18boot and klog commands may be redundant, but they can't hurt, and will save you much confusion and frustration.

This section will be somewhat different from the other sections of the workbook in that it will ask you to type some commands without explaining exactly what they do. All later sections of the workbook will introduce concepts in a more detailed way and will provide references for still more detailed reading. We hurry you through some explanations in this first section because we want you to get your hands on actual BaBar data right away, and we want you to get a general sense that you can run something.

Users who are new to Unix should review the Workbook's section on How to Enter Unix Commands before proceeding.

## Trouble Shooting

With a system as complex as BaBar software, many things can go wrong - from simply trying to execute commands from the wrong directory, to the disks storing data being unavailable. As the number of these errors grew, the quicktour grew to include "if this goes wrong, do..." statements. These have now been moved to a separate page called TroubleShooting.

As new problems and common errors are found, the trouble-shooting page will be updated. Where there are useful comments in the trouble-shooting page, the quicktour has links to the relevant section of that page. At the end of the trouble-shooting section, there is a link to take you back to the place in the quicktour where you were. Please email the workbook editor if you find any problems or if you find any solutions that you feel should be documented in the trouble-shooting page.

## Getting Started

### Unix Space

BaBar executables can be very large. The BetaMiniApp executable created and used to run the following example analysis job uses about 50MB of disk space. To be on the safe side you should make sure that there is at least 80MB of free space in your unix account before starting the quick tour.
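If you want to verify the free space yourself, the sketch below is a generic sh snippet (not a BaBar tool): it reads the available kilobytes for your home area from df and compares against the 80MB (81920 KB) threshold mentioned above. On AFS, fs listquota gives the authoritative numbers; df is only an approximation there.

```shell
#!/bin/sh
# Check that at least 80 MB (81920 KB) is free before starting the tour.
# 'df -kP' prints POSIX-format output; column 4 of the second line is
# the available space in kilobytes.
avail_kb=$(df -kP "$HOME" | awk 'NR==2 {print $4}')
if [ "$avail_kb" -ge 81920 ]; then
    echo "OK: ${avail_kb} KB free"
else
    echo "WARNING: only ${avail_kb} KB free; request more quota first"
fi
```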

For more information about your AFS unix quota and a link to the request form for more disk space, read the related section from the Unix chapter of the Workbook.

### Select a Host Computer

It is recommended that you log onto a yakut machine to run the QuickTour at SLAC as this runs the Scientific Linux 3 operating system required for release analysis-30. If you are running the quicktour at your local institution, you should speak to a computing expert to find an appropriate machine to run the quicktour.

## Set up your release

### Create a Test Release

A test release is a version of the BaBar code which typically contains a small amount of private user code, with the remainder coming from a full BaBar release, that is, a consistent set of all the BaBar code.

You are about to create a new "release directory" for your work, which we will call ana30. You will build the rest of your job in this release directory.

Start by going to your home directory:

> cd

(The unix prompt will be indicated by ">" unless the command must be issued from a particular directory, in which case the deepest subdirectory will appear. For example, if the command must be entered from "A/B/C >", the prompt will be shown as "C >".)

Now create the new release directory, and specify the scratch area to be used for the library and binary files:

> newrel -s $BFROOT/build/<first_letter>/<username> -t analysis-30 ana30

Note: since you have to put in your own username and initial, you cannot simply cut and paste the above command. For example, if your username is "zephir", then you use the "-s $BFROOT/build/z/zephir" option to put the test release in your scratch directory. If you do not have a scratch directory, run the build command "bbrCreateBuildArea" to create it. Then you should be able to run the newrel command above.

analysis-30 is the name of the release. (Note that this release is also known as 18.6.2a.) The last argument (ana30) is the (arbitrary) name that you choose for your copy of the release.

The system should respond with something like:

newrel version: 1.13
GNU Make version 3.79.1,
Build OPTIONS = Linux24SL3_i386_gcc323-Debug-native-Objy-Optimize-Fastbuild-
Ldlink2-SkipSlaclog-Static-Lstatic
Linux yakut06 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:51:04 CDT 2005 i686 athlon
i386 GNU/Linux  [uname -a]
[Warning]: ./bin/Linux24SL3_i386_gcc323 (and/or) /afs/slac.stanford.edu/g/
babar/dist/releases/18.6.2a/bin/Linux24SL3_i386_gcc323 is not in PATH, type
'srtpath' to fix PATH.
-> installdirs:
Creating database/GNUmakefile from release 18.6.2a
next, addpkg, checkout or ln -s to your packages, then gmake installdirs
remember to run srtpath. (see man page of srtpath about setting it up)



### Setting up the machine architecture to use

Now, change into the test release directory:

 > cd ana30

Next, select the machine architecture you will use. **NOTE!** You need to do this every time you have logged out and back in again:

 ana30> srtpath <enter> <enter>

where <enter> means you should hit the "enter" key
to accept the release number (it should say 18.6.2a - if you
are offered "newest", you are not in the test release
directory!) and the default operating system that you are offered.

The system should respond with something like:
enter release number (CR=18.6.2a):
> enter
Select/enter BFARCH (CR=1):
1) Linux24SL3_i386_gcc323      [prod][test][active][default]
2) Linux24RHEL3_i386_gcc323    [default2]
> enter


(The "srt" in srtpath stands for Software Release Tools. You will learn about it at the Workbook's SRT page.)

If you have a problem with the above command, check that you have correctly performed the HEPiX part of Logging In.

### Add Packages to Your Test Release

Now that you have your own copy of a release, you need to add software packages to help with your analysis. BaBar's software packages for analysis-30 are stored in:

$BFDIST/releases/analysis-30

You will need to add the following packages to your test release:

• workdir - the base package from which analysis jobs are run.
• BetaMiniUser - a standard package for analysis of BaBar data.
• (optional: some updated packages with code improvements)

To add a software package to your test release, you use the command "addpkg". This checks out the package from CVS, and modifies the various subdirectories of the release as required to incorporate the package. (CVS is something you will learn about later in the workbook.)

To begin, add workdir to your release directory by issuing the command:

 ana30> addpkg workdir

The system should respond:

Offline Release 18.6.2a uses workdir version V00-04-20, will check that out
cvs checkout: Updating workdir
U workdir/.cvsignore
U workdir/.rootrc
U workdir/GNUmakefile
U workdir/README
U workdir/RooAlias.C
U workdir/RooLogon.C
U workdir/pawlogon.kumac
cvs checkout: Updating workdir/exercise_KstarGamma_BR
cvs checkout: Updating workdir/kumac
U workdir/kumac/babar.kumac
U workdir/kumac/bbrpath.kumac
U workdir/kumac/mkportrait.kumac
U workdir/kumac/psportrait.kumac
cvs checkout: Updating workdir/results
GNU Make version 3.79.1,
Build OPTIONS = Linux24SL3_i386_gcc323-Debug-native-Objy-Optimize-Fastbuild-
Ldlink2-SkipSlaclog-Static-Lstatic
Linux yakut06 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:51:04 CDT 2005 i686 athlon
i386 GNU/Linux [uname -a]
-> installdirs:

Now set up some symbolic links and create some required directories in the workdir package directory by issuing the command:

 ana30> gmake workdir.setup

The system should respond with something like:

GNU Make version 3.79.1,
Build OPTIONS = Linux24SL3_i386_gcc323-Debug-native-Objy-Optimize-Fastbuild-
Ldlink2-SkipSlaclog-Static-Lstatic
Linux yakut06 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:51:04 CDT 2005 i686 athlon
i386 GNU/Linux [uname -a]
-> workdir.setup: (Wed Jan 18 22:31:39 PST 2006)
RELDIR not specified. Defaults to ../
Release directory set to ../

Now it is time to add BetaMiniUser, the analysis package, to your test release. Still from your release directory, add the BetaMiniUser package using the command:

 ana30> addpkg BetaMiniUser

The system should respond:

Offline Release 18.6.2a uses BetaMiniUser version V00-04-00, will check that out
cvs checkout: Updating BetaMiniUser
U BetaMiniUser/.cvsignore
U BetaMiniUser/AppUserBuild.cc
U BetaMiniUser/AppUserBuildBase.cc
U BetaMiniUser/AppUserBuildBase.hh
U BetaMiniUser/AppUserBuildRoo.cc
U BetaMiniUser/BetaMiniPatches.tcl
U BetaMiniUser/BetaMiniPidKilling.tcl
U BetaMiniUser/BetaMiniUserProduction.tcl
U BetaMiniUser/GNUmakefile
U BetaMiniUser/History
U BetaMiniUser/MyDstarAnalysis.cc
U BetaMiniUser/MyDstarAnalysis.hh
U BetaMiniUser/MyDstarAnalysisSnippet.tcl
U BetaMiniUser/MyDstarMicroAnalysis.tcl
U BetaMiniUser/MyDstarMiniAnalysis.tcl
U BetaMiniUser/MyDstarPhysics.tcl
U BetaMiniUser/MyHistManager.cc
U BetaMiniUser/MyHistManager.hh
U BetaMiniUser/MyK0Analysis.cc
U BetaMiniUser/MyK0Analysis.hh
U BetaMiniUser/MyK0MicroAnalysis.tcl
U BetaMiniUser/MyK0MiniAnalysis.tcl
U BetaMiniUser/MyMiniAnalysis.cc
U BetaMiniUser/MyMiniAnalysis.hh
U BetaMiniUser/MyMiniAnalysis.tcl
U BetaMiniUser/MyReadList.cc
U BetaMiniUser/MyReadList.hh
U BetaMiniUser/MyReadList.tcl
U BetaMiniUser/MyReadListSnippet.tcl
U BetaMiniUser/MyReskimStreamPhysics.tcl
U BetaMiniUser/MyTimePointAnalysis.cc
U BetaMiniUser/MyTimePointAnalysis.hh
U BetaMiniUser/MyTimePointAnalysis.tcl
U BetaMiniUser/NamedScalers.cc
U BetaMiniUser/NamedScalers.hh
U BetaMiniUser/README
U BetaMiniUser/RewriteMini.tcl
U BetaMiniUser/bdbMini.tcl
U BetaMiniUser/bdbMiniPhysProdSequence.tcl
U BetaMiniUser/bdbMiniPhysics.tcl
U BetaMiniUser/bdbMiniQA.tcl
U BetaMiniUser/bin_BetaMiniApp.mk
U BetaMiniUser/bin_BetaMiniRooApp.mk
U BetaMiniUser/bin_BetaMiniUser.mk
U BetaMiniUser/binlist
U BetaMiniUser/btaMini.tcl
U BetaMiniUser/btaMiniPhysProdSequence.tcl
U BetaMiniUser/btaMiniPhysics.tcl
U BetaMiniUser/btaMiniQA.tcl
U BetaMiniUser/link_BetaMiniUser.mk
GNU Make version 3.79.1,
Build OPTIONS = Linux24SL3_i386_gcc323-Debug-native-Objy-Optimize-Fastbuild-
Ldlink2-SkipSlaclog-Static-Lstatic
Linux yakut06 2.4.21-37.ELsmp #1 SMP Wed Sep 28 12:51:04 CDT 2005 i686 athlon
i386 GNU/Linux [uname -a]
-> installdirs:

### Incorporating the latest code fixes in your release

BaBar code is constantly improving. Unfortunately, this means that software is often made available before it is perfect. So before you begin, you need to check if there are additional packages required for your test release. These additional packages are called "extra tags". They might be bug fixes, or improvements related to specific aspects of the code. The complete list of extra tags of packages recommended to be used with each release can be found by following the link below:

Looking under "analysis-30", you see that there is one "core bug fix". Therefore you need to add this package:

 ana30> addpkg KanModules V01-07-02-01

Again, the system responds with a long message:

Will check out version V01-07-02-01 for KanModules
cvs checkout: Updating KanModules
U KanModules/AbsKanEventInput.hh
U KanModules/AbsKanEventOutput.hh
U KanModules/AbsKanOutputStream.hh
U KanModules/ClcKanEventInput.cc
U KanModules/ClcKanEventInput.hh
U KanModules/DnaHackIfdNonOwningProxy.cc
U KanModules/DnaHackIfdNonOwningProxy.hh
U KanModules/DnaKanEventOutput.cc
U KanModules/DnaKanEventOutput.hh
U KanModules/DnaKanOutputStream.cc
U KanModules/DnaKanOutputStream.hh
U KanModules/GNUmakefile
U KanModules/KanAbsFilter.cc
U KanModules/KanAbsFilter.hh
U KanModules/KanCondAliasFilter.cc
U KanModules/KanCondAliasFilter.hh
U KanModules/KanConfig.cc
U KanModules/KanConfig.hh
U KanModules/KanConfigCommand.cc
U KanModules/KanConfigCommand.hh
U KanModules/KanConfigKeyFilter.cc
U KanModules/KanConfigKeyFilter.hh
U KanModules/KanCreateCM.cc
U KanModules/KanCreateCM.hh
U KanModules/KanEventIdFilter.cc
U KanModules/KanEventIdFilter.hh
U KanModules/KanEventInput.cc
U KanModules/KanEventInput.hh
U KanModules/KanEventMerge.cc
U KanModules/KanEventMerge.hh
U KanModules/KanEventOutput.cc
U KanModules/KanEventOutput.hh
U KanModules/KanEventSequenceFilter.cc
U KanModules/KanEventSequenceFilter.hh
U KanModules/KanEventUpdate.cc
U KanModules/KanEventUpdate.hh
U KanModules/KanFilterCreator.cc
U KanModules/KanFilterCreator.hh
U KanModules/KanFilterEvents.cc
U KanModules/KanFilterEvents.hh
U KanModules/KanInputCommand.cc
U KanModules/KanInputCommand.hh
U KanModules/KanInputContainer.cc
U KanModules/KanInputContainer.hh
U KanModules/KanInputContainerList.cc
U KanModules/KanInputContainerList.hh
U KanModules/KanInputEventReaderKey.cc
U KanModules/KanInputEventReaderKey.hh
U KanModules/KanListAdder.cc
U KanModules/KanListAdder.hh
U KanModules/KanNullFileLocations.dat
U KanModules/KanOutputCommand.cc
U KanModules/KanOutputCommand.hh
U KanModules/KanOutputContainer.cc
U KanModules/KanOutputContainer.hh
U KanModules/KanOutputStream.cc
U KanModules/KanOutputStream.hh
U KanModules/KanRunFilter.cc
U KanModules/KanRunFilter.hh
U KanModules/KanVerboseConfig.cc
U KanModules/KanVerboseConfig.hh
U KanModules/link_KanModules.mk
GNU Make version 3.79.1,
Build OPTIONS = Linux24SL3_i386_gcc323-Debug-native-Objy-Optimize-Fastbuild-Ldlink2-SkipSlaclog-Static-Lstatic
Linux yakut02 2.4.21-37.0.1.ELsmp #1 SMP Thu Jan 19 17:17:30 CST 2006 i686 athlon
i386 GNU/Linux [uname -a]
-> installdirs:

In general, when you add packages, the required syntax is:

 ana30> addpkg PackageName V0X-YY-ZZ

The labels like "V0X-YY-ZZ" are called tags. Because BaBar code is always changing, there is more than one version of each package, and each one has its own tag. When you add extra numbered tags you replace analysis-30's default packages with newer packages that have code improvements.

On the extra tags page, there are also a few "optional" packages. You need to check out optional packages only if your analysis job specifically requires the fix they are described as providing.

When you add extra tags, you then need to check the dependencies of your code. This is necessary because code in some existing packages in the release might have dependencies on the updated code you just added. In that case, you need to check out the current versions of those packages so that the code changes can be correctly accounted for. To find any dependencies on updated packages, issue the command:

 ana30> checkdep

The system should respond:

Using glimpse index of 18.6.2a
cvs diff: Diffing BetaMiniUser
cvs diff: Diffing KanModules
cvs diff: Diffing workdir
cvs diff: Diffing workdir/kumac
checkdep: You should add for recompilation:
-- NOTHING

This NOTHING means that there are no (more) dependencies, so you can continue. Sometimes, checkdep may give you a list of packages to add. These can be added without a tag number, as they are the analysis-30 versions --- you add them just to ensure that they will be recompiled "knowing" about the new tags you have added.

To see a list of all your packages and their tags, type:

 ana30> showtag

The system should respond:

BetaMiniUser V00-04-00
KanModules V01-07-02-01
workdir V00-04-20

Once you have carried out any compulsory instructions on the Extra Tags page, you have an up-to-date, working copy of analysis-30 and can start to do some physics analysis.

### Add or modify analysis code

The point of having your own test release is that you can modify its code to carry out your own analysis. You will learn more about how BaBar code works and how to modify it later in the workbook. For now, just use the following command to copy the code you need for the quicktour to your BetaMiniUser directory:

ana30> cp $BFROOT/www/doc/workbook/NewExamples/NTrkExample/* BetaMiniUser/
Now move one of these files, snippet.tcl, to workdir:
ana30> mv BetaMiniUser/snippet.tcl workdir/

### Before You Compile and Link - set the OO_FD_BOOT variable, and make a .bbobjy file

Before you can run an analysis job you need to set the conditions database environment. This database contains information about the status of the machine at the time the data was recorded, for example the precise alignment of the detector components. **NOTE**, if you have logged out and back in, this command is necessary to reestablish the conditions database environment.

To do this, you need to set your OO_FD_BOOT environment variable to point to the database where the data you will use is stored. You set the database conditions to use by typing:

 ana30> cond18boot

The system should respond with:

Setting OO_FD_BOOT to /afs/slac/g/babar-ro/objy/databases/boot/physics/V7/ana/conditions/BaBar.BOOT


Now there is just one more thing to do. In your release directory (ana30 in this example), make a file called .bbobjy. This file will contain a single line with an FDID number that has been assigned to you. To find your FDID numbers, enter:

> GetFDID

Using any editor, for example emacs, create your .bbobjy file containing the sole line:

FD_NUMBER = ****
where **** is one of the 4-digit FDID numbers assigned to you. This file is needed in order for part of the gmake command used below (specifically, the gmake database.import part) to work.
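The file creation can be sketched in plain sh. The FDID value 1234 below is a placeholder, not a real assignment; substitute one of the 4-digit numbers that GetFDID reports for you, and note that the file must sit in your release directory (~/ana30 in this tour):

```shell
#!/bin/sh
# Create the .bbobjy file in the release directory.
# NOTE: 1234 is a placeholder FDID; use one reported by GetFDID.
reldir="$HOME/ana30"          # release directory used in this tour
mkdir -p "$reldir"
echo "FD_NUMBER = 1234" > "$reldir/.bbobjy"
cat "$reldir/.bbobjy"         # prints: FD_NUMBER = 1234
```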

## Running a First Example Analysis Job

### Batch Processing

In the next two sections you are going to ask the system to perform some CPU-intensive tasks. When you have a job that requires heavy use of computing resources, you should send it to the batch queue. This ensures that you "wait your turn" to use CPU resources. The command to submit a job to the batch queue is:
bsub -q QUEUE -o LOGFILE JOB_COMMAND

JOB_COMMAND is the command that you would type at the prompt if you were submitting the job directly. LOGFILE is your choice of name for the log file. QUEUE is the name of the queue. In general, you should use the bldrecoq queue to compile and link (next section) and the kanga queue to run your job. For more information about submitting jobs to the batch system, see the batch processing section of the Workbook.

### Compile and Link the Job

You are now ready to compile and link the job.

To compile and link, you will use the command "gmake all". This is a "smart" command that executes several gmake commands in the correct order:

• gmake installdirs
• gmake database.import
• gmake schema
• gmake include
• gmake lib
• gmake bin
• gmake javalinks
• gmake python
Each of these gmake commands is called a target. The most important targets are lib to compile your code, and bin to link your code.

Many of these targets are not needed for your simple BetaMiniUser job. Experienced users are able to select only the targets that they need. But unless you are quite sure of what you are doing, it is best to use "gmake all" just to be safe.

Users on SLAC machines should compile by using the batch queue bldrecoq. Users working at other sites should ask their local system experts which queue to use, and which commands are required to submit jobs.

### Submit the Compilation Job

Submit the job for batch compilation and linking by using the following command:

ana30> bsub -q bldrecoq -o all.log gmake all


This command sends the job "gmake all" to the bldrecoq queue, and writes the output of the compilation step to the file all.log in your workdir directory.

The system should respond with something like:

Job <268570> is submitted to queue <bldrecoq>.

Note that you can kill a pending batch job by using the command:

 ana30> bkill <your_jobid>

You can also kill all of your jobs with the command

bkill 0

(This can be a particularly useful command at times.)

You can check progress of a SLAC batch job by using the command

 > bjobs

The system should respond with something like:

JOBID   USER    STAT  QUEUE      FROM_HOST   EXEC_HOST   JOB_NAME   SUBMIT_TIME
268570  penguin RUN   bldrecoq   yakut06     bldlnx04    gmake all  Jan 18 22:54

Once the job starts running, it should only take a few minutes. For more information about submitting and checking batch jobs, see:

Your job is done when bjobs responds:

No unfinished job found

Once the compilation is finished check that the lib files (files ending in ".a") have been generated by your gmake lib command:

 ls -l lib/$BFARCH/

The system should respond:

total 8772
-rw-r--r--    1 penguin  br    3061168 Mar  2 20:01 libBetaMiniUser.a
-rw-r--r--    1 penguin  br    5918288 Mar  2 20:02 libKanModules.a
drwxr-xr-x    6 penguin  br       2048 Mar  2 19:36 templates

Also check that the executables have been generated:

 ls -l bin/$BFARCH/

The system should respond:

total 50157
-rwxr-xr-x    1 penguin  br       51359689 Mar  2 20:04 BetaMiniApp
-rw-r--r--    1 penguin  br             73 Mar  2 20:04 Index


You should always use the long list command (ls -l) to check the date and time, and make sure that a new binary file was produced. Accidentally using an old binary is a common mistake.

As another check, compare your all.log file with the sample all.log. BaBar code usually gives lots of confusing warning messages. Ignore anything that says "warning." Worry only about things that say "error."
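A quick way to apply this rule is to grep the log for errors only. The snippet below creates a tiny stand-in all.log (your real log will be much longer, produced by the bsub job) so that the command is runnable anywhere:

```shell
#!/bin/sh
# Stand-in log for demonstration; your real all.log comes from the bsub job.
printf 'warning: unused variable\nlinking BetaMiniApp\n' > all.log
# Show only real errors; warnings are expected and can be ignored.
grep -i 'error' all.log || echo "no errors found"
# prints: no errors found
```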

### Finding some Data to run on - BbkDatasetTcl

This section contains a brief walk-through of how to find data for your analysis.

There is more information about finding data (both real and simulated) in the Workbook section Find Data.

The basic tool for finding and accessing data is BbkDatasetTcl. To find out what real and Monte Carlo (simulated) data sets are available, from any SLAC computer you can type

 > BbkDatasetTcl

This will produce a long list of data sets. (To keep the list from flying by too fast to be read, you may wish to pipe this command to less ("BbkDatasetTcl | less"), or send the output to a file ("BbkDatasetTcl >& BbkDatasetTcl.txt").)
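The same paging and capturing patterns work for any command with long output. Since BbkDatasetTcl exists only on BaBar machines, the sketch below uses seq as a stand-in command; note also that ">&" is csh redirection, and the sh equivalent is "> file 2>&1":

```shell
#!/bin/sh
# Stand-in for a command with very long output, like BbkDatasetTcl.
seq 1 1000 > listing.txt 2>&1   # sh equivalent of csh's ">& listing.txt"
wc -l < listing.txt             # confirm all 1000 lines were captured
```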

A sample of the output from this command is here:

AllEventsSkim-Run5-OffPeak-R18b
AllEventsSkim-Run5-OffPeak-R18b-Run5a-V01
AllEventsSkim-Run5-OnPeak-R18b
AllEventsSkim-Run5-OnPeak-R18b-Run5a-V01
B0ToRhoPRhoM-Run1-OffPeak-R18b
B0ToRhoPRhoM-Run1-OnPeak-R18b
B0ToRhoPRhoM-Run2-OffPeak-R18b
B0ToRhoPRhoM-Run2-OnPeak-R18b
B0ToRhoPRhoM-Run3-OffPeak-R18b
B0ToRhoPRhoM-Run3-OnPeak-R18b
B0ToRhoPRhoM-Run4-OffPeak-R18b
B0ToRhoPRhoM-Run4-OnPeak-R18b
B0ToRhoPRhoM-Run5-OffPeak-R18b
B0ToRhoPRhoM-Run5-OffPeak-R18b-Run5a-V01
B0ToRhoPRhoM-Run5-OnPeak-R18b
B0ToRhoPRhoM-Run5-OnPeak-R18b-Run5a-V01
BCCC03a3body-Run1-OffPeak-R18b
BCCC03a3body-Run1-OnPeak-R18b
BCCC03a3body-Run2-OffPeak-R18b


The names of the different data sets give you a good idea of what they are for. "AllEvents" and "AllEventsSkim" are generic large data sets, with a minimum of selection. Sets with the names of specific decays or particles, like "B0ToRhoPRhoM" above, are skims, subsets of AllEventsSkim produced with a special selector for a particular analysis.

Monte Carlo (simulated) data sets are also listed by BbkDatasetTcl. Their names begin with the prefix "SP-". For example, further down in the list you have:

SP-1235-TwoPhotonTwoTrack-Run3-R18b
SP-1235-TwoPhotonTwoTrack-Run4-R18b
SP-1235-TwoPhotonTwoTrack-Run5-R18b
SP-1237
SP-1237-A0-R18b
SP-1237-A0-Run1-R18b
SP-1237-A0-Run2-R18b
SP-1237-A0-Run3-R18b
SP-1237-A0-Run5-R18b
SP-1237-AllEventsSkim-R18b
SP-1237-AllEventsSkim-Run1-R18b
SP-1237-AllEventsSkim-Run2-R18b
SP-1237-AllEventsSkim-Run3-R18b
SP-1237-AllEventsSkim-Run5-R18b
SP-1237-B0ToRhoPRhoM-R18b
SP-1237-B0ToRhoPRhoM-Run1-R18b
SP-1237-B0ToRhoPRhoM-Run2-R18b


These are all Monte Carlo sets of various types. 1235 and 1237 are *mode numbers* that tell you what kind of decay is being simulated. For example, 1237 is one of the most popular mode numbers: the mode number for generic B0B0bar decays.

You will learn more about the different data and Monte Carlo sets in the Find Data section of the Workbook.

As an example, we will study Monte Carlo simulated B0-B0bar meson decays appropriate to the Run4 (approximately 2004) data-taking period. From the above list, we see that the appropriate set is SP-1237-Run4. (In some older data sets, these collections are also known as SP-B0B0bar-Run4.)

The next step is to produce a file that will tell your BetaMiniApp executable where the SP-1237-Run4 collections are. To do this, from your workdir enter the command:


BbkDatasetTcl -ds SP-1237-Run4


The output from this command should look something like:

BbkDatasetTcl: wrote SP-1237-Run4.tcl
Selected 209 collections, 23118000/23118000 events, ~0.0/pb, from bbkr18 at ral
****************** WARNING ******************
1027 collections (of 1236) are not available in bbkr18 at ral.
tcl file only lists local collections.
Specify BbkDatasetTcl --nolocal to include all collections.
****************** WARNING ******************


For the moment, we will not worry about this warning message.

In the directory from which you ran the BbkDatasetTcl command there should now be a file called SP-1237-Run4.tcl The file looks like this:

## This file was generated automatically on 2006/01/19-00:00:20-PST
## by user penguin on host yakut06 from /u/br/penguin/ana30/workdir
## using: BbkDatasetTcl -ds SP-1237-Run4
## version Id: BbkTcl.pm,v 1.47 2005/10/19 13:54:44 adye Exp
## Selected dataset from bbkr18 at ral:
##   SP-1237-Run4 (update of production datasets) created 2005/04/13-16:47:04-PST by douglas

# 138000/138000 events selected from 69 on-peak runs, added to dataset at 2005/11/04-22:50:14-PST, lumi = ~0.0/pb
lappend inputList /store/SP/R18/001237/200309/18.6.0b/SP_001237_013238
# 48000/48000 events selected from 24 on-peak runs, added to dataset at 2005/11/05-04:48:59-PST, lumi = ~0.0/pb
lappend inputList /store/SP/R18/001237/200309/18.6.0b/SP_001237_013240
# 6000/6000 events selected from 3 on-peak runs, added to dataset at 2005/11/05-04:48:58-PST, lumi = ~0.0/pb
lappend inputList /store/SP/R18/001237/200309/18.6.0b/SP_001237_013270
# 138000/138000 events selected from 69 on-peak runs, added to dataset at 2005/11/03-22:47:54-PST, lumi = ~0.0/pb
...
...
lappend inputList /store/SP/R18/001237/200407/18.6.1a/SP_001237_041783
# 124000/124000 events selected from 62 on-peak runs, added to dataset at 2006/01/17-22:50:33-PST, lumi = ~0.0/pb
lappend inputList /store/SP/R18/001237/200407/18.6.1a/SP_001237_042167

## Total selected: 209 collections, 23118000/23118000 events, ~0.0/pb
## (Note! The luminosity reported here is approximate. The full
##  and accurate value must be obtained using BbkLumi.
##  In addition, the event count and luminosity reported in each Tcl
##  file are based on the values in the entire original collection(s),
##  and not the subset (e.g. ...%selectEventSequence=1-50000) used to
##  define the Tcl file.)
## Last collection added to dataset: 2006/01/17-22:50:33-PST


Each of the "lappend inputList" lines is a command (in tcl-language) that tells BetaMiniApp the location of a Monte Carlo collection. This file is used as input into your Beta job.
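As a sanity check, you can count the "lappend inputList" lines in the generated file and compare the count with the "Selected 209 collections" message. The sketch below builds a two-line stand-in file so the command is runnable anywhere; run the same grep on your real SP-1237-Run4.tcl:

```shell
#!/bin/sh
# Build a tiny stand-in for the generated tcl file.
cat > sample.tcl <<'EOF'
## header comment
lappend inputList /store/SP/R18/001237/200309/18.6.0b/SP_001237_013238
lappend inputList /store/SP/R18/001237/200309/18.6.0b/SP_001237_013240
EOF
# Count collections; on the real file this should match the reported total.
grep -c '^lappend inputList' sample.tcl
# prints: 2
```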

Now move SP-1237-Run4.tcl to workdir (if it is not already there):

 >  mv SP-1237-Run4.tcl ~/ana30/workdir/


### Set up the job

BetaMiniUser contains a lot of tcl files which control the BetaMiniApp executable when it is run. For a specific analysis, you will need a few more tcl files to control the specific details of your jobs. You have already copied the tcl files you need from $BFROOT/www/doc/workbook/NewExamples/NTrkExample/:

• MyMiniAnalysis.tcl A modified version of the MyMiniAnalysis.tcl file found in BetaMiniUser.
• snippet.tcl A "snippet" tcl file used to perform some setup.

If you take a look at the file snippet.tcl, you will see that it performs three tasks. First, it sets some important tcl parameters:


set ConfigPatch MC
set levelOfDetail cache
set BetaMiniTuple root
set histFileName myHistogram.root



This tells BetaMiniApp that you are using Monte Carlo (MC) events and that you want the result in a root-format ntuple called myHistogram.root. (cache refers to the level of detail, which you will learn about later.) Important: If you were running on real data instead of Monte Carlo simulated data, you would have to change the ConfigPatch from MC to Run1 or Run2.

Finally, snippet.tcl passes the job to BetaMiniUser/MyMiniAnalysis.tcl:


sourceFoundFile BetaMiniUser/MyMiniAnalysis.tcl


MyMiniAnalysis.tcl in turn does some more setup and then passes the job to btaMini.tcl, which is the main tcl file.

### Run the job

You are now ready to run the job.

Remember that if you come to this step after a fresh login, you need to go to the release directory (cd ana30), execute the srtpath script, and reenter the cond18boot command.

Jobs are always run from workdir. Go to the workdir directory:

ana30> cd workdir

To start the job, type:

ana30/workdir> BetaMiniApp snippet.tcl

The system should start to respond very quickly with a chunk of output which looks something like:

set BetaMiniReadPersistence Kan                     ;# not set, using default
set levelOfDetail cache                                     ;# set to default
set ConfigPatch MC                                          ;# set to default
set BetaMiniTuple root                                      ;# set to default
set histFileName myHistogram.root           ;# default is MyMiniAnalysis.root
BetaMiniOptions.tcl::Warning: ConfigPatch is obsolete.  Please set MCtruth directly
BetaMiniOptions.tcl::Warning: levelOfDetail        "cache"
BetaMiniOptions.tcl::Warning: MCTruth              "true"
BetaMiniOptions.tcl::Warning: ErrLoggingSeverity   "warning"
BetaMiniOptions.tcl::Warning: BetaMiniTuple        "root"
BetaMiniOptions.tcl::Warning: histFileName         "myHistogram.root"
BtaProdCreateSequence.tcl done
MicroLists being called
BetaMiniSequence.tcl::Error: Turning off broken physics code
BetaMiniSequence.tcl::Error: Disabling TrkEffTableCreateor:
this module looks for a track efficiency file on beginJob
# MyMiniQA not set
Everything                                      (enabled)
BetaMiniSequence                                (enabled)
BetaMiniInitSequence                            (enabled)
BetaMiniEnvSequence                             (enabled)
RecEventControlSequence                         (enabled)
EvtCounter                                      (enabled)
event counter
RecTimeStampFilter                              (enabled)
timestamp filter
GenBuildEnv                                     (enabled)
Build General Environment
MatBuildEnv                                     (enabled)
Build Materials
BdbCondInitSequence                             (enabled)
BdbTclManagerModule                             (enabled)
Manages operations with BdbTclModuleParmList
CdbBdbInit                                      (enabled)
Initialize Conditions database access
CdbEvtLoadStateId                               (enabled)
Load Cdb StateId into transient eventstore
CfgInitSequence                                 (enabled)
CfgBuildEnv                                     (enabled)
Build configDB Environment
CfgSetKeyModule                                 (enabled)
stores configKey in the environment
PepBuildEnv                                     (enabled)
build Pep environment
L3TBuildEnv                                     (enabled)
Build L3Trigger environment
L3TConfigModule                                 (enabled)
Create L3 configuration
SvtInitSequence                                 (enabled)

SvtBuildEnv                                     (enabled)
initialize Geometry for the Svt
DchInitSequence                                 (enabled)
DchBuildEnv                                     (enabled)
Build Dch environment
DchCondMonitor                                  (disabled)
Dch conditions monitor
DrcInitSequence                                 (enabled)
DrcEnvModuleSequence                            (enabled)
DrcBuildEnv                                     (enabled)
DIRC - Setup environement for the DIRC
DrcBuildGeom                                    (enabled)
DIRC - Create the DrcDetector
DrcInitEvent                                    (enabled)
DIRC - Initializations at the beginning of an Event
EmcInitSequence                                 (enabled)
EmcBuildEnv                                     (enabled)
Build Emc Environment
EmcBuildGeom                                    (enabled)
Builds Emc detector model
EmcLoadPid                                      (enabled)
Loads environmenty with Emc PID factory
EmcEdgeCorrLoader                               (enabled)
Load Emc Edge Correction into environment
EmcLoadCalibToo                                 (enabled)
Loads environment with relevant EMC calibrator(Too) proxy
EmcLoadDigiCalib                                (enabled)
Loads environment with relevant EMC digi calibrator proxies
IfrInitSequence                                 (enabled)
IfrBuildEnv                                     (enabled)
IFR - build IfrEnv
IfrVstModule                                    (enabled)
IFR - visitor manager
IfrPidObjyLoader                                (enabled)
IFR - read Pid Calibration from Objy
IfrMuCalib                                      (enabled)
IFR - mu/pi calibration procedure
IfrNeutralCalib                                 (enabled)
IFR - neutral calibration procedure
TrkBuildEnv                                     (enabled)
Build Tracking environment
EffTablePInitSequence                           (enabled)
EffTableBuildPEnv                               (enabled)
EffTable Bdb conditions environment
HbkTupleEnv                                     (disabled)
Build HbkTuple Manager
RooTupleEnv                                     (enabled)
Build RooTuple Manager
PdtInit                                         (enabled)
initialize Particle Data Table
BtaBuildEnv                                     (enabled)
Put the BtaEnv object into the environment
BtaInitEvent                                    (enabled)
Beta- event initialisation
EffTableSequence                                (enabled)
EffTableLoader                                  (enabled)
Load Efficiency Table Proxies
BetaMiniReadSequence                            (enabled)
CdbEvtLoadStateIdHistory                        (enabled)
CdbEvtLoadStateId clone
KanCreateCM                                     (enabled)
Create KanConversionManager
HdrKanLoadHdr                                   (enabled)
HdrKanLoad clone
TagKanLoadTag                                   (enabled)
TagKanLoad clone
KanEventUpdateTag                               (enabled)
KanEventUpdate clone
BetaMiniTagFilterSequence                       (enabled)
G4KanLoadTru                                    (enabled)
G4KanLoad clone
L1FctKanLoadTru                                 (enabled)
L1FctKanLoad clone
TrkKanLoadAod                                   (enabled)
TrkKanLoad clone
EmcKanLoadAod                                   (enabled)
EmcKanLoad clone
IfrKanLoadAod                                   (enabled)
IfrKanLoad clone
PidKanLoadAod                                   (enabled)
PidKanLoad clone
L1FctKanLoadAod                                 (enabled)
L1FctKanLoad clone
L1GltKanLoadAod                                 (enabled)
L1GltKanLoad clone
RecoKanLoadAod                                  (enabled)
RecoKanLoad clone
BtaMiniKanLoadCnd                               (enabled)
BtaMiniKanLoad clone
KanEventUpdateAllTheRest                        (enabled)
KanEventUpdate clone
BetaMiniDetectorSequence                        (enabled)
BetaMiniPidSequence                             (enabled)
PidExpandChargedSummary                         (enabled)
Expand PidChargedSummary to separate lists
BetaMiniTrkSequence                             (enabled)
KalFit                                          (enabled)
Kalmanize Mini tracks
KalMiniRX                                       (enabled)
Repair Mini tracks
TrkMakePid                                      (enabled)
Pid info from tracks
BetaMiniDrcSequence                             (enabled)
DrcCreatePidInfoFromMini                        (disabled)
DIRC - create DrcPidInfo from DrcTrack
DrcIdentify                                     (disabled)
DIRC - Particle Id Module - first pass
DrcCleanAssociation                             (disabled)
DIRC - Clean Associations
DrcSecondPass                                   (disabled)
DIRC - Particle Id Module - second pass
DrcCheckReco                                    (disabled)
DIRC - check Reco (NA)
BunchT0MiniSequence                             (enabled)
InitBunchT0                                     (disabled)
MC Truth Bunch time
DchTrkBunchT0                                   (disabled)
Track Bunch T0 Finder
DrcCheckEventT0                                 (disabled)
DIRC - Check Event T0 using Dirc data
TrkT0Faker                                      (enabled)
Add T0 from tag-db to BunchList
BunchT0TagSet                                   (disabled)
Set bunch T0 tags
PrintBunch                                      (disabled)
Printout Bunch t0s
BetaMiniSvtSequence                             (enabled)
SvtHitReco                                      (disabled)
makes Clusters and Hits from rootClusters
SvtMakePid                                      (disabled)
Create SvtPidInfo list
BetaMiniDchSequence                             (enabled)
DchMiniHitsSequence                             (disabled)
DchMiniMakeHits                                 (enabled)
Reconstitute Default DchHit list
DchMakeHitMap                                   (enabled)
Create map of DchHit pointers to cells
DchPidMakeHitMap                                (disabled)
Create dE/dx list
DchMakePid                                      (disabled)
Create DchPidInfo list
DchMakePidMap                                   (enabled)
Index PID vs. tracks
BetaMiniEmcSequence                             (enabled)
EmcMakeMiniReco                                 (enabled)
Creates Bump, Cluster, and shared digi lists
EmcSetClusterCalibrator                         (enabled)
change cluster calibrator for EmcCands
EmcTrackMatch                                   (enabled)
Match tracks to bumps
BtaFixMultiBumps                                (enabled)
Fix problem in early Kan Emc persistence
EmcCreateUniqueList                             (enabled)
Make a unique list of EmcCands
EmcMakeMiniCandLists                            (enabled)
Creates the various lists of EmcCands from EmcListBank
BetaMiniIfrSequence                             (enabled)
IfrMakeExpansion                                (enabled)
Expand compactified minis
BetaMiniNeutralHadSequence                      (disabled)
NeutralHadSequence                              (enabled)
NeutralHadMatch                                 (enabled)
Emc and Ifr match
MakeNeutralHad                                  (enabled)
make IFR neutral hadrons
NeutralHadMatchAll                              (enabled)
Emc and Ifr match method2
MakeNeutralHadAll                               (enabled)
make all neutral hadrons
NeutralHadNtuple                                (disabled)
neutral hadrons Ntuple
BetaMiniTruSequence                             (enabled)
GTrkFillDaughters                               (enabled)
Fill daughters field of GTracks
BtaLoadMcCandidates                             (enabled)
Load Default MC Beta Candidates
BtaLoadGHitAssoc                                (enabled)
Beta - MC GHit associator
BetaMiniBtaSequence                             (enabled)
BtaProdCreateSequence                           (enabled)
BtaLoadEvtInfo                                  (enabled)
Load default EventInfo
PhysInitEvent                                   (enabled)
Beta- event initialization
PhysCreateAlias                                 (enabled)
create aliases for ALists
BetaLoadMiniSequence                            (enabled)
BtaLoadBeamSpot                                 (enabled)
load the BeamSpot into the EventInfo
LoadMiniBtaCandidates                           (enabled)
load Beta lists from mini objects
LoadEventInfoSequence                           (enabled)
BtaLoadBeamSpot                                 (enabled)
load the BeamSpot into the EventInfo
BtaGoodTrackSequence                            (enabled)
GoodTrackVeryLooseSelection                     (enabled)
Selection of good track for primary vertex
GoodTrackAccSelection                           (enabled)
Selection of good track within acceptance
GoodTrackLooseSelection                         (enabled)
Selection of good track for primary vertex
GoodTrackAccLooseSelection                      (enabled)
Selection of good track  within acceptance plus loose cuts
GoodTrackTightSelection                         (enabled)
Selection of good track for primary vertex
GoodPhotonLooseSelection                        (enabled)
Selection of loose good photons
GoodNeutralLooseAccSelection                    (enabled)
Selection of loose good photons
GoodPhotonDefaultSelection                      (enabled)
Selection of default good photons
TrkMicroDispatch                                (enabled)
Create bitmaps for tracks
TrkEffTableCreator                              (disabled)
Create tracking efficiency tables
VtxEvent                                        (enabled)
search the vertex of the event
BetaMiniUtilitySequence                         (enabled)
CpuCheck                                        (enabled)
Check CPU time remaining
Signal                                          (enabled)
intercept ^C
PrintParms                                      (disabled)
print parameter values
ReportFPE                                       (enabled)
handle and report floating point exceptions
MyMiniAnalysis                                  (enabled)
MyMiniAnalysis
NTrkExample                                     (enabled)
Workbook example module
>


You can ignore the warning and error messages for now, provided your output ends with the last few lines shown above.

At this point, BetaMiniApp realizes that it needs user input, so it pauses the job.

The prompt ">" indicates that you are now "inside" your job, or inside the framework. When you are inside the framework, you can change settings, query the executable, and control the analysis job.

To begin, you need to set the collection to tell the executable where to find the data you are going to run on. To do this you need to talk to an input module:

> mod talk KanEventInput

This should give you a KanEventInput prompt:

KanEventInput>

Now you enter commands from the tcl file SP-1237-Run4.tcl. The first "lappend inputList" line in SP-1237-Run4.tcl is:

(do NOT type this line:)
lappend inputList /store/SP/R18/001237/200309/18.6.0b/SP_001237_013238


To pass this collection name to KanEventInput, you use the same command except that "lappend inputList" is replaced by "input add":

KanEventInput>  input add /store/SP/R18/001237/200309/18.6.0b/SP_001237_013238

To confirm that the collection has been added to the input list, type:
 KanEventInput>  input list

to which the program responds:

KanEventInput> input list
Collections:
/store/SP/R18/001237/200309/18.6.0b/SP_001237_013238
Components:
hdr
tag
tru
aod
cnd


Now you are done talking to KanEventInput, so you can return to the framework (">"):


KanEventInput> exit
>
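Incidentally, the module-talk prompt is still a Tcl prompt, so when a collection list is long you may be able to avoid retyping each name. The following is only a sketch (whether sourcing a file works at the module prompt depends on your framework setup); it relies on SP-1237-Run4.tcl filling the inputList variable with its lappend lines, as shown above:

```tcl
# Sketch only: read the collection names from the tcl file and feed
# each one to the input module, rather than typing them by hand.
mod talk KanEventInput
source SP-1237-Run4.tcl                      ;# lappend's names onto inputList
foreach coll $inputList { input add $coll }  ;# pass each one to the module
input list                                   ;# confirm what was added
exit
```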


Next, you need to start the job, and tell the system how many events to process. Let's begin with three:

 > ev beg -nev 3

The system should respond with:

CdbBdbInit::CdbBdbInit.cc(164):CdbBdbInit: Using CDB view "::".
CfgSetKeyModule:Config background key at beginJob is 0
EmcBuildGeom::BdbTclModuleProxy.cc(123): CONTAINER  DOES NOT EXIST
IfdStrKey(EmcEdgeCorrTheta)IfdStrKey(EmcEdgeCorrPhi)IfdStrKey(EmcEdgeCorrThetaSigma)
IfdStrKey(EmcEdgeCorrPhiSigma)IfdStrKey(EmcNeuCorrScale)IfdStrKey(EmcNeuCorrSigma)
EmcEdgeCorrLoader:begin Job
EmcDetector::applyGlobal: The following alignment constants are loaded:
Alignment for element 0Emc Global Barrel
Translation vector = (0.22842,0.0568511,0.151441)
Rotation vector = (0.000823079, 0.000951132, 0.00305323)
Alignment for element 0Emc Global Endcap
Translation vector = (0.434573,-0.151269,0.254797)
Rotation vector = (0, 0, 0.00398447)
These constants will be used to align the EMC

BtaBuildPidEnv::AbsEnv.cc(399):Overriding BtaEnv object in global environment.
CompPi0ListMerger:CompPi0ListMerger begin Job
KanEventInput::KanFileReg.cc(348):KanFileReg: read UUID c574a7e0-4bc0-11da-82cd-d80a67dede29
Adding /store/SP/R18/001237/200309/18.6.0b/SP_001237_013238
KanEventInput::KanEventInput.cc(594):Opening collection /store/SP/R18/001237/200309/18.6.0b/
SP_001237_013238
KanEventInput::KanTreeBase.cc(241):No Branch called Emc
KanEventInput::KanTreeBase.cc(241):No Branch called Emc_CandListBank
KanEventInput::KanTreeBase.cc(241):No Branch called L1Glt
KanEventInput::KanTreeBase.cc(241):No Branch called Reco_EmcCands
KanEventInput::KanTreeBase.cc(241):No Branch called Bta
EvtCounter: processing event # 1 [ 1d:ffffffff:0572ea/49a0ef99:K ]
EmcDetector::applyGlobal: The following alignment constants are loaded:
Alignment for element 0Emc Global Barrel
Translation vector = (0,0,0)
Rotation vector = (0, 0, 0)
Alignment for element 0Emc Global Endcap
Translation vector = (0,0,0)
Rotation vector = (0, 0, 0)
These constants will be used to align the EMC

EmcDigiCalib Constants Info for /emc/EmcSrcCalType:
Begin: Tue Jan  1 16:00:01 1901 (local time) 0 ns, End: Tue Jan  1 16:00:01 1991 (local time)
0 ns, Created: Mon Sep 30 12:16:09 2002 (local time) 264456200 ns
EmcDigiCalib Constants Info for /emc/EmcBhabhaType:
Begin: Tue Jan  1 16:00:01 1901 (local time) 0 ns, End: Fri Jan  1 00:00:01 1999 (local time)
0 ns, Created: Tue Oct  1 15:27:54 2002 (local time) 941506800 ns
EmcTrackMatch::EmcGeomTrkMatchMethod.cc(1575):Use track match constants from ASCII file.
EmcTrackMatch::EmcGeomTrkMatchMethod.cc(1579):Use track match constants from ASCII file.
EvtCounter: processing event # 2 [ 1d:ffffffff:0572ea/49a0f302:R ]
EvtCounter: processing event # 3 [ 1d:ffffffff:0572ea/49a0f66b:X ]

Again, you can ignore any warning messages, but should investigate any error messages you see.

Buoyed by this success, let's run a few more events:

> ev cont -nev 37


making 40 in all. You should see something like:


EvtCounter: processing event # 4 [ 1d:ffffffff:0572ea/49a0f9d4:L ]
EvtCounter: processing event # 5 [ 1d:ffffffff:0572ea/49a0fd3d:S ]
...
...
EvtCounter: processing event # 39 [ 1d:ffffffff:0572ea/49a1712f:S ]
EvtCounter: processing event # 40 [ 1d:ffffffff:0572ea/49a17498:Y ]


You may also see some messages like this:

LoadMiniBtaCandidates::BtaCandBase.cc(627): attempt to call
BtaCandBase::setType("Upsilon(4S)") for a composite
BtaCandidate whose 5 daughters have total charge -1
Is it not a decay ?  Could it be an error ?   Anyway, the charge is set
to the PdtEntry one


Don't worry about those messages.

Finally, end the Beta analysis session by typing:

> exit

The system should respond very quickly with:

TagBGFMultiHadron:TagBGFMultiHadron: endJob summary:
Events processed: 0
Events passed   : 0
Events prescaled: 0
TagInspector:TagInspector: endJob summary:
Events processed: 0

MyMiniAnalysis::MyMiniAnalysis.cc(116):
match vs charge
*-----*-----*-----*
0.5|  220|  585|  239|
-0.5|    0|   52|    0|
*-----*-----*-----*
-1.5  -0.5   0.5
UsrCandRefCheck:end job
GoodPhotonSemiLoose_BFlav:GoodPhotonSemiLoose_BFlav selected 0 of 0 candidates.
KLHNotPionGTL_BFlav:KLHNotPionGTL_BFlav selected 0 of 0 candidates.
KLHNotPion_BFlav:KLHNotPion_BFlav selected 0 of 0 candidates.
KLHVeryTight_BFlav:KLHVeryTight_BFlav selected 0 of 0 candidates.
KLHTight_BFlav:KLHTight_BFlav selected 0 of 0 candidates.
KLHLoose_BFlav:KLHLoose_BFlav selected 0 of 0 candidates.
KLHVeryLoose_BFlav:KLHVeryLoose_BFlav selected 0 of 0 candidates.
GoodTracksLoose_BFlav:GoodTracksLoose_BFlav selected 0 of 0 candidates.
GoodTracksVeryLooseHard_BFlav:GoodTracksVeryLooseHard_BFlav selected 0 of 0 candidates.
GoodTracksVeryLooseSoft_BFlav:GoodTracksVeryLooseSoft_BFlav selected 0 of 0 candidates.
GoodTracksSemiVeryLoose_BFlav:GoodTracksSemiVeryLoose_BFlav selected 0 of 0 candidates.
GoodTracksVeryLoose_BFlav:GoodTracksVeryLoose_BFlav selected 0 of 0 candidates.
CompBFast3body:Selected 0 candidates.
BchToEtacKch_Final:BchToEtacKch_Final end Job
B0ToEtacKs_Final:B0ToEtacKs_Final end Job
GoodPhotonTightAccSelection:GoodPhotonTightAccSelection selected 0 of 0 candidates.
CompPi0ListMerger:CompPi0ListMerger end Job
GoodTracksVisibleESelection:GoodTracksVisibleESelection selected 0 of 0 candidates.
GoodPhotonsVisibleESelection:GoodPhotonsVisibleESelection selected 0 of 0 candidates.
TaggingMcControl:TaggingMcControl selected 0 of 0 candidates.
TightGammaTagSelection:TightGammaTagSelection selected 0 of 0 candidates.
LooseGammaTagSelection:LooseGammaTagSelection selected 0 of 0 candidates.
TaggingDispatch:TaggingDispatch selected 0 of 0 candidates.
MergedPi0MicroSelectionTight:MergedPi0MicroSelectionTight selected 0 of 0 candidates.
MergedPi0MicroSelectionLoose:MergedPi0MicroSelectionLoose selected 0 of 0 candidates.
KlongEmcTightMicroSelection:KlongEmcTightMicroSelection selected 0 of 0 candidates.
KlongEmcLooseMicroSelection:KlongEmcLooseMicroSelection selected 0 of 0 candidates.
KlongIfrTightMicroSelection:KlongIfrTightMicroSelection selected 0 of 0 candidates.
KlongIfrLooseMicroSelection:KlongIfrLooseMicroSelection selected 0 of 0 candidates.
TightGLHProtonSelection:TightGLHProtonSelection selected 0 of 0 candidates.
VeryTightLHProtonSelection:VeryTightLHProtonSelection selected 0 of 0 candidates.
TightLHProtonSelection:TightLHProtonSelection selected 0 of 0 candidates.
LooseLHProtonSelection:LooseLHProtonSelection selected 0 of 0 candidates.
VeryLooseLHProtonSelection:VeryLooseLHProtonSelection selected 0 of 0 candidates.
GrlTightProtonSelection:GrlTightProtonSelection selected 0 of 0 candidates.
GrlDefaultProtonSelection:GrlDefaultProtonSelection selected 0 of 0 candidates.
GrlLooseProtonSelection:GrlLooseProtonSelection selected 0 of 0 candidates.
TightGLHPionMicroSelection:TightGLHPionMicroSelection selected 0 of 0 candidates.
VeryTightLHPionMicroSelection:VeryTightLHPionMicroSelection selected 0 of 0 candidates.
TightLHPionMicroSelection:TightLHPionMicroSelection selected 0 of 0 candidates.
LooseLHPionMicroSelection:LooseLHPionMicroSelection selected 0 of 0 candidates.
VeryLooseLHPionMicroSelection:VeryLooseLHPionMicroSelection selected 0 of 0 candidates.
PidRoyPionSelectionNotKaon:PidRoyPionSelectionNotKaon selected 0 of 0 candidates.
PidRoyPionSelectionLoose:PidRoyPionSelectionLoose selected 0 of 0 candidates.
TightGLHKaonMicroSelection:TightGLHKaonMicroSelection selected 0 of 0 candidates.
NotPionLHKaonGTLMicroSelection:NotPionLHKaonGTLMicroSelection selected 0 of 0 candidates.
VeryTightLHKaonMicroSelection:VeryTightLHKaonMicroSelection selected 0 of 0 candidates.
TightLHKaonMicroSelection:TightLHKaonMicroSelection selected 0 of 0 candidates.
LooseLHKaonMicroSelection:LooseLHKaonMicroSelection selected 0 of 0 candidates.
VeryLooseLHKaonMicroSelection:VeryLooseLHKaonMicroSelection selected 0 of 0 candidates.
NotPionLHKaonMicroSelection:NotPionLHKaonMicroSelection selected 0 of 0 candidates.
NotPionNNKaonMicroSelection:NotPionNNKaonMicroSelection selected 0 of 0 candidates.
VeryTightNNKaonMicroSelection:VeryTightNNKaonMicroSelection selected 0 of 0 candidates.
TightNNKaonMicroSelection:TightNNKaonMicroSelection selected 0 of 0 candidates.
LooseNNKaonMicroSelection:LooseNNKaonMicroSelection selected 0 of 0 candidates.
VeryLooseNNKaonMicroSelection:VeryLooseNNKaonMicroSelection selected 0 of 0 candidates.
PidSimpleKaonSelectionTight:PidSimpleKaonSelectionTight selected 0 of 0 candidates.
PidSimpleKaonSelectionLoose:PidSimpleKaonSelectionLoose selected 0 of 0 candidates.
PidRoyKaonSelectionNotPionOrUnknown:PidRoyKaonSelectionNotPionOrUnknown selected 0 of 0 candidates.
PidRoyKaonSelectionNotPion:PidRoyKaonSelectionNotPion selected 0 of 0 candidates.
PidRoyKaonSelectionDefault:PidRoyKaonSelectionDefault selected 0 of 0 candidates.
PidRoyKaonSelectionLoose:PidRoyKaonSelectionLoose selected 0 of 0 candidates.
NotPionKaonGTLMicroSelection:NotPionKaonGTLMicroSelection selected 0 of 0 candidates.
NotPionKaonMicroSelection:NotPionKaonMicroSelection selected 0 of 0 candidates.
VeryTightKaonMicroSelection:VeryTightKaonMicroSelection selected 0 of 0 candidates.
TightKaonMicroSelection:TightKaonMicroSelection selected 0 of 0 candidates.
LooseKaonMicroSelection:LooseKaonMicroSelection selected 0 of 0 candidates.
VeryLooseKaonMicroSelection:VeryLooseKaonMicroSelection selected 0 of 0 candidates.
PidLHElectronSelector:PidLHElectronSelector selected 0 of 0 candidates.
PidRoyElectronSelectionLoose:PidRoyElectronSelectionLoose selected 0 of 0 candidates.
VeryTightElectronMicroSelection:VeryTightElectronMicroSelection selected 0 of 0 candidates.
TightElectronMicroSelection:TightElectronMicroSelection selected 0 of 0 candidates.
LooseElectronMicroSelection:LooseElectronMicroSelection selected 0 of 0 candidates.
VeryLooseElectronMicroSelection:VeryLooseElectronMicroSelection selected 0 of 0 candidates.
NoCalElectronMicroSelection:NoCalElectronMicroSelection selected 0 of 0 candidates.
NNVeryTightMuonSelectionFakeRate:NNVeryTightMuonSelectionFakeRate selected 0 of 0 candidates.
NNTightMuonSelectionFakeRate:NNTightMuonSelectionFakeRate selected 0 of 0 candidates.
NNLooseMuonSelectionFakeRate:NNLooseMuonSelectionFakeRate selected 0 of 0 candidates.
NNVeryLooseMuonSelectionFakeRate:NNVeryLooseMuonSelectionFakeRate selected 0 of 0 candidates.
NNVeryTightMuonSelection:NNVeryTightMuonSelection selected 0 of 0 candidates.
NNTightMuonSelection:NNTightMuonSelection selected 0 of 0 candidates.
NNLooseMuonSelection:NNLooseMuonSelection selected 0 of 0 candidates.
NNVeryLooseMuonSelection:NNVeryLooseMuonSelection selected 0 of 0 candidates.
LikeTightMuonSelection:LikeTightMuonSelection selected 0 of 0 candidates.
LikeLooseMuonSelection:LikeLooseMuonSelection selected 0 of 0 candidates.
LikeVeryLooseMuonSelection:LikeVeryLooseMuonSelection selected 0 of 0 candidates.
VeryTightMuonMicroSelection:VeryTightMuonMicroSelection selected 0 of 0 candidates.
TightMuonMicroSelection:TightMuonMicroSelection selected 0 of 0 candidates.
LooseMuonMicroSelection:LooseMuonMicroSelection selected 0 of 0 candidates.
VeryLooseMuonMicroSelection:VeryLooseMuonMicroSelection selected 0 of 0 candidates.
MinimumIoniziongMuonMicroSelection:MinimumIoniziongMuonMicroSelection selected 0 of 0 candidates.
GoodTrackAccSelection:GoodTrackAccSelection selected 436 of 459 candidates.
GoodTrackVeryLooseSelection:GoodTrackVeryLooseSelection selected 370 of 459 candidates.
IfrMakeChargedPid:IfrMakeChargedPid end Job
IfrMuMuCalib::IfrMuMuCalib.cc(144):_mumucal is null!
TagFilterByValue:TagFilterByValue: endJob summary:
Events processed: 0
Events passed   : 0
TagFilterByName:TagFilterByName: endJob summary:
Events processed: 0
Events passed   : 0
Events prescaled: 0
EmcEdgeCorrLoader: end Job
DchBuildEnv::AbsEnv.cc(241):Overriding DchEnv object in global environment.
EvtCounter:EvtCounter:     total number of events=40
total number of events processed=40
total number of events skipped=0
EvtCounter:Total CPU usage: 6 User: 6 System: 0
Framework is exiting now.
DrcDetector: number of sets to destroy: 51


If you want to look at 40 other events, rather than just running on the first 40, you can skip, say, 2000 events with:

 > mod talk KanEventInput
KanEventInput> first set 2001
KanEventInput> exit
>


before you issue the event begin (ev beg) or event continue (ev cont) commands.
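Once you are comfortable with these commands, the whole interactive session can be captured in a small Tcl file and sourced at the framework prompt, which makes reruns painless. The sketch below reuses only commands from this tour; the file name myJob.tcl is made up:

```tcl
# myJob.tcl -- hypothetical replay of the interactive session above.
# At the framework prompt:  > source myJob.tcl
mod talk KanEventInput
input add /store/SP/R18/001237/200309/18.6.0b/SP_001237_013238
first set 2001            ;# optional: skip the first 2000 events
exit
ev beg -nev 40            ;# process 40 events
exit                      ;# leave the framework
```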

Two very useful commands are "help" and "exit." If you get stuck, you can type "help" for a list of options, or "exit" to exit, at the framework prompt (">") or any module prompt ("ModuleName>").

## View the Resulting Histograms

The example job will create an ntuple named myHistogram.root. Files ending in ".root" are meant to be analyzed with ROOT, a physics analysis package. You will be learning more about how to use ROOT in the ROOT tutorial. For now you'll just do one simple task: start ROOT, open the ntuple file, and look at a histogram.

To start a ROOT session, type:

ana30/workdir> bbrroot

(Note: "bbrroot" accesses a BaBar wrapper script for the ROOT package. In standard installations of ROOT, you just use the command "root".)

This should give you a popup window showing a naked lady with tree roots instead of legs. Then the system will say:

  *******************************************
*                                         *
*        W E L C O M E  to  R O O T       *
*                                         *
*   Version  4.04/02b       3 June 2005   *
*                                         *
*  You are welcome to visit our Web site  *
*          http://root.cern.ch            *
*                                         *
*******************************************

FreeType Engine v2.1.9 used to render TrueType fonts.
Compiled on 23 June 2005 for linux with thread support.

CINT/ROOT C/C++ Interpreter version 5.15.169, Mar 14 2005
Type ? for help. Commands must be C++ statements.
Enclose multiple statements between { }.


Open the file:

root[0] > TFile f("myHistogram.root");

If the file is found, there will be no response.

If you instead get an error message about the file not being found, check that you started ROOT from the workdir directory, and that your job's output matched the example output shown earlier.

List the available histograms:

   root[1] > f.ls();

The system should respond:

TFile**         myHistogram.root        Created for you by RooTupleManager
TFile*         myHistogram.root        Created for you by RooTupleManager
KEY: TH1F     h1d1;1  MC reco abs mtm difference
KEY: TH1F     h1d2;1  Reco track momentum
KEY: TH1F     h1d3;1  Tracks per Event
KEY: TH1F     h1d4;1  TagInspector Status


Each TH1F is a 1-dimensional histogram of floats. h1d3, "Tracks per Event", is the one that was created by the NTrkExample code that you added to BetaMiniUser.

 root[2] > h1d3->Draw();

If you wait a few seconds, then a new window with the histogram should appear. It should look like this:

After your heart stops beating wildly at this sight, you can end the ROOT session using the command:

 root[3] > .q

The system will respond:

This is the end of ROOT -- Goodbye

and return you to your yakut prompt.

The example root file generated from this exercise is here. We will revisit this file in the Workbook section Analysis in ROOT I.
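For later sessions you may prefer to collect the interactive ROOT steps into a macro. The sketch below is hypothetical (the name viewTracks.C is made up), but it uses only the file and histogram names from this exercise, and saves the plot to a file in addition to displaying it:

```cpp
// viewTracks.C -- hypothetical macro gathering the steps above.
// Run it from workdir with:  bbrroot viewTracks.C
void viewTracks() {
  TFile f("myHistogram.root");          // open the ntuple file
  f.ls();                               // list the available histograms
  TH1F *h = (TH1F*)f.Get("h1d3");       // the "Tracks per Event" histogram
  if (h) {
    h->Draw();                          // display it
    gPad->Print("tracksPerEvent.ps");   // and save it as PostScript
  }
}
```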

## A useful tip

A very useful command to use when in a release (after typing "srtpath") is

srtglimpse [expr]

where [expr] is any expression (e.g. a word, such as "SoftRelTools" or "histogram"). This searches for the expression in all subdirectories in the parent directory of your release (including those packages you haven't even checked out!). This can be extremely useful when trying to unravel the dependencies of various packages, or just to see how a particular expression is used; for example, srtglimpse NTrkExample would find every package that mentions the example module you ran in this tour.

## Conclusion

You have now demonstrated that you can run BaBar Offline software from your desktop. You have created a release directory, set up to use the database, compiled, linked and run a standard analysis job. You have viewed the resulting histograms using ROOT.

The next two sections of the Workbook will complete your general introduction to BaBar, describing the BaBar Detector and pointing you to sources of BaBar information other than this Workbook. Thereafter, the Workbook will go into more detail on all aspects of BaBar Offline computing. It will include detailed explanations of all of the commands you used in this Quick Tour, give you still more commands to use, and point you at other more detailed reference materials.

Before running a full analysis job, you should also read other relevant sections of the BaBar Workbook, for example the Workbook Analysis section.

Congratulations on having come this far.

## Back to Workbook Front Page

Author: Joseph Perl
Contributors: Christopher Hearty
Paul Harrison, Tracey Marsh, James Weatherall, Leon Rochester
Last modification: 2 March 2006
Last significant update: 19 Jan 2006