AlastairDewhurstSandbox

Revision 18 - 2010-11-04 - AlastairDewhurst

 

Introduction

This page is a work in progress and contains information to remind myself how to run things.

Running ATLAS jobs locally

Log into heplnx109.pp.rl.ac.uk and run kinit.
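A minimal login sketch (username is a placeholder for your own account):

ssh username@heplnx109.pp.rl.ac.uk
kinit    # obtain a fresh Kerberos ticket

Then set up the release and test area: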

source /gridsoft/SL5/atlas/software/15.6.6/setup.sh -tag=15.6.6,setup,32
export CMTPATH=${HOME}/testarea/15.6.6:${CMTPATH}
cd testarea/15.6.6/

The analysis

Local conditions access

MC Data

Real Data

To analyze real data you must set the following environment variables:
export FRONTIER_SERVER="(serverurl=http://lcgft-atlas.gridpp.rl.ac.uk:3128/frontierATLAS)"
export ATLAS_POOLCOND_PATH=/gridsoft/SL5/atlas/local/conditions/
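These must be exported in the same shell that runs Athena. A quick sanity check (a trivial sketch):

echo $FRONTIER_SERVER
ls $ATLAS_POOLCOND_PATH    # should list the locally mirrored conditions files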

Running on the grid

source setup.sh -tag=15.6.10,32,setup
source /afs/cern.ch/atlas/offline/external/GRID/DA/panda-client/latest/etc/panda/panda_setup.sh
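A minimal pathena submission sketch (the input dataset is taken from the local test file below; the jobOptions name and output dataset are placeholders, not from this page):

voms-proxy-init -voms atlas
pathena MyJobOptions.py --inDS mc09_valid.108407.Pythia_directJpsimu4mu4.merge.AOD.e347_s585_s582_r812_r814_tid096141 --outDS user.dewhurst.test1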

Location of files

A local file for testing purposes:
athenaCommonFlags.FilesInput = [ "/home/hep/dewhurst/data/mc09_valid.108407.Pythia_directJpsimu4mu4.merge.AOD.e347_s585_s582_r812_r814_tid096141/AOD.096141._000006.pool.root.1"]

There is a list of interesting physics runs, which can be found here. You can find the runs you are looking for with the ATLAS run query tool. For the initial data-taking period the B-Physics group is putting their ntuples here.

Downloading files to RAL PPD

To download files directly to the dCache storage element at RAL, do the following:
source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.sh
export DQ2_LOCAL_SITE_ID=UKI-SOUTHGRID-RALPP_SCRATCHDISK
voms-proxy-init -voms atlas
dq2-get -S srm://heplnx204.pp.rl.ac.uk:8443/srm/managerv2?SFN=/pnfs/pp.rl.ac.uk/data/atlas/atlasralppdisk/dewhurst/ user.JamesCatmore.155697.f261_m496.muonstream_StandardGRL_prelim_muon_yellowPlus_7TeV-v02-pass1.staco/
Make sure your working directory is not inside the dCache element (i.e. do cd $HOME first), otherwise you will get an error that you are in a read-only file system.
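Before downloading, the dataset contents can be inspected with the DQ2 client set up above, e.g.:

dq2-ls -f user.JamesCatmore.155697.f261_m496.muonstream_StandardGRL_prelim_muon_yellowPlus_7TeV-v02-pass1.staco/    # list files and sizes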

Getting ROOT to work

To get ROOT to work correctly, do:
export LD_LIBRARY_PATH=/opt/d-cache/dcap/lib:/opt/d-cache/dcap/lib64:/opt/glite/lib:/opt/glite/lib64:/opt/globus/lib:/opt/lcg/lib:/opt/lcg/lib64:/opt/classads/lib64/:/opt/c-ares/lib:/opt/ppd/root/5.26.00/x86_64-slc5-gcc34-opt/root/lib/
export ROOTSYS=/opt/ppd/root/5.26.00/x86_64-slc5-gcc34-opt/root/
export PATH=/usr/kerberos/bin:/opt/d-cache/srm/bin:/opt/d-cache/dcap/bin:/opt/edg/bin:/opt/glite/bin:/opt/globus/bin:/opt/lcg/bin:/usr/local/bin:/bin:/usr/bin:/cern/pro/bin:${ROOTSYS}/bin:/usr/local/sbin:/sbin:/usr/sbin
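A quick check that the environment took effect (sketch):

which root                # should point into /opt/ppd/root/5.26.00/...
root-config --version     # should report 5.26/00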

J/psi Information

All information is being stored on the SharePoint, which can be found here. This information is likely to become out of date rapidly. There is a daily BPhys meeting at 3pm (UK time) via EVO. Plots of di-muon candidates can be found here.

Meetings of note:

Bd J/psi K0* Analysis

This section contains details on the status of the Bd -> J/psi K0* analysis.

Athena analysis

This is done using the Bd2JpsiKstar code written by Adam Barton. The currently available ntuples are not using the

Ntuple slimming

This is currently done using Pavel's script. Julie has run this code; the output can be found on:
heplnx111.pp.rl.ac.uk
ls -lrth /scratch/jk67/DataAug2010_ntuples/
total 2.7G
-rw-r--r-- 1 jk67 atlas 7.6M Sep  7 13:26 DataAug2010_BdJpsiKstar_jpsiMassConstraint_periodB.root
-rw-r--r-- 1 jk67 atlas  11M Sep  7 13:27 DataAug2010_BdJpsiKstar_jpsiMassConstraint_periodC.root
-rw-r--r-- 1 jk67 atlas 500M Sep  7 13:54 DataAug2010_BdJpsiKstar_jpsiMassConstraint_periodD.root
-rw-r--r-- 1 jk67 atlas 1.6G Sep  7 14:57 DataAug2010_BdJpsiKstar_jpsiMassConstraint_periodE.root
-rw-r--r-- 1 jk67 atlas 586M Sep  8 09:04 DataAug2010_BdJpsiKstar_jpsiMassConstraint_periodF.root

Analysis code

The code is currently in:
/home/hep/dewhurst/BdJpsiK0star
To run it, start ROOT and at the prompt do:
.L BdJpsiKstar_analysis.cpp+
analysis("/scratch/jk67/DataAug2010_ntuples/", -99,"")

FTS script development

The aim is to write a script which plots the numbers that can be found at: http://lcgwww.gridpp.rl.ac.uk/cgi-bin/fts-mon/fts-mon-dev.pl

The work is done as root@lcgwww.gridpp.rl.ac.uk.
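A possible starting point (a sketch only; the page layout is not documented here, so the extraction pattern is a placeholder):

# fetch the FTS monitoring page and pull out the raw numbers for plotting
curl -s 'http://lcgwww.gridpp.rl.ac.uk/cgi-bin/fts-mon/fts-mon-dev.pl' | grep -Eo '[0-9]+' > fts-numbers.txt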

 

-- AlastairDewhurst - 2009-10-09
