Updating HCAL validation test for HTCondor batch submission #25678

Merged: 2 commits, Jan 17, 2019
2 changes: 1 addition & 1 deletion Validation/CaloTowers/test/CaloScan/Merge.sh
@@ -8,7 +8,7 @@ file="$(ls | grep -i "DQM")"
echo $file

#clean directory
rm -r pi50_*.py *.log LSFJOB_* pi50_*.root
rm -r pi50_*.py *.log LSFJOB_* pi50_*.root conf.py mc.root
echo "Changing the name of DQM file"
if [ "$#" -ne 1 ]; then
echo "Give One version name for DQM root file"
16 changes: 11 additions & 5 deletions Validation/CaloTowers/test/CaloScan/README
@@ -10,20 +10,26 @@

(6) cd Validation/CaloTowers/test/CaloScan

(7) ./make_configs.csh
NB: before step (7) one of the desired templates (template*.py_*)
has to be renamed into simple template.py (which is used in step (7) below)

(8) ./submit_batch.csh
(7) ./make_configs.csh
creates 50 job configs (each processing a 1k-event "slice" of the input file)

(8) ./submit_batch.csh (LSF batch) or ./submit_HTCondor.csh (HTCondor batch)

NB: it uses batch submission
(batch.csh for LSF or batch_HTCondor.csh for HTCondor) at CERN

NB: it uses batch submission (batch.csh) to lxbatch at CERN
with input file
/afs/cern.ch/cms/data/CMSSW/Validation/HcalHits/data/620/mc_pi50_eta05.root
Each of the 50 jobs uses 1K of the total 50K input events.

In ~30-40 min (in the submission directory, /scan in this case)
the results of 50 batch jobs will be arriving.
the results of 50 batch jobs will start arriving.
Once all 50 jobs have finished and 50 *.root files have appeared locally,

(9) ./Merge.sh 10_2_0 (e.g the extension name for resulting DQM file)
(9) ./Merge.sh 10_2_0 (e.g. the extension name for resulting DQM file)

It will do the following things:

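For orientation, a minimal sketch of the command sequence the README steps above describe, run inside an existing CMSSW area; the template variant name is illustrative, not taken from the PR:

#!/bin/csh
# hedged sketch of steps (6)-(9)
cd Validation/CaloTowers/test/CaloScan
cp template.py_condor template.py       # pick the desired template variant (suffix assumed)
./make_configs.csh                      # writes pi50_1.py ... pi50_50.py
./submit_HTCondor.csh                   # or ./submit_batch.csh for LSF
# ...wait until all 50 pi50_*.root files have come back, then label the DQM output:
./Merge.sh 10_2_0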
34 changes: 34 additions & 0 deletions Validation/CaloTowers/test/CaloScan/batch_HTCondor.csh
@@ -0,0 +1,34 @@
#!/bin/csh

setenv num ${1}
echo '===> num.' ${num}
echo ' '

setenv WORKDIR ${PWD}
echo '===> Local working dir ' ${WORKDIR}

setenv name pi50

setenv MYWORKDIR ${2}
cd ${MYWORKDIR}

echo ' '
echo '===> Remote submission dir ' ${MYWORKDIR}
echo ' '

eval `scramv1 runtime -csh`

#------------------

cd ${WORKDIR}

cp ${MYWORKDIR}/${name}_${num}.py conf.py

rfcp /afs/cern.ch/cms/data/CMSSW/Validation/HcalHits/data/620/mc_pi50_eta05.root mc.root

cmsRun conf.py > & output.log

ls -lrt

rfcp output.root ${MYWORKDIR}/${name}_${num}.root
rfcp output.log ${MYWORKDIR}/${name}_${num}.log
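To make the argument passing concrete: for job number 7, condor_submit (see submit.sub and submit_HTCondor.csh below) ends up running this wrapper as sketched here; the submission directory path is illustrative:

# hedged example of one job's invocation (arguments: slice number, submission directory)
./batch_HTCondor.csh 7 /afs/cern.ch/work/u/user/CaloScan
# the wrapper copies pi50_7.py to conf.py, stages mc_pi50_eta05.root as mc.root,
# runs cmsRun, and copies pi50_7.root and pi50_7.log back to the submission directory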
@@ -3,18 +3,12 @@
from DQMServices.Core.DQMEDHarvester import DQMEDHarvester

process = cms.Process("CONV")
process.load("Configuration.StandardSequences.Reconstruction_cff")
process.load("Configuration.StandardSequences.GeometryRecoDB_cff")

process.load("Configuration.StandardSequences.FrontierConditions_GlobalTag_cff")
from Configuration.AlCa.autoCond import autoCond
process.GlobalTag.globaltag = autoCond['mc']

process.load("FWCore.MessageLogger.MessageLogger_cfi")
process.MessageLogger.cerr.FwkReport.reportEvery = 1000

process.load("DQMServices.Core.DQM_cfg")
process.DQM.collectorHost = ''
process.load("DQMServices.Core.DQMStore_cfi")
process.load("DQMServices.Components.MEtoEDMConverter_cfi")

process.maxEvents = cms.untracked.PSet(
input = cms.untracked.int32(-1)
@@ -46,7 +40,32 @@
'file:pi50_22.root',
'file:pi50_23.root',
'file:pi50_24.root',
'file:pi50_25.root'
'file:pi50_25.root',
'file:pi50_26.root',
'file:pi50_27.root',
'file:pi50_28.root',
'file:pi50_29.root',
'file:pi50_30.root',
'file:pi50_31.root',
'file:pi50_32.root',
'file:pi50_33.root',
'file:pi50_34.root',
'file:pi50_35.root',
'file:pi50_36.root',
'file:pi50_37.root',
'file:pi50_38.root',
'file:pi50_39.root',
'file:pi50_40.root',
'file:pi50_41.root',
'file:pi50_42.root',
'file:pi50_43.root',
'file:pi50_44.root',
'file:pi50_45.root',
'file:pi50_46.root',
'file:pi50_47.root',
'file:pi50_48.root',
'file:pi50_49.root',
'file:pi50_50.root'
)
)
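The PoolSource above now lists all 50 per-job outputs; a quick hedged check (not part of the PR) that they have all arrived before the harvesting step is run:

# count the per-job ROOT files that have come back; expect 50
ls pi50_*.root | wc -l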

@@ -57,6 +76,11 @@
Workflow = '/HcalValidation/'+'Harvesting/'+str(cmssw_version)
process.dqmSaver.workflow = Workflow

process.hcaldigisClient = DQMEDHarvester("HcalDigisClient",
outputFile = cms.untracked.string('HcalDigisHarvestingME.root'),
DQMDirName = cms.string("/") # root directory
)

process.calotowersClient = DQMEDHarvester("CaloTowersClient",
outputFile = cms.untracked.string('CaloTowersHarvestingME.root'),
DQMDirName = cms.string("/") # root directory
@@ -70,4 +94,5 @@
process.EDMtoME *
process.calotowersClient *
process.hcalrechitsClient *
process.hcaldigisClient *
process.dqmSaver)
7 changes: 7 additions & 0 deletions Validation/CaloTowers/test/CaloScan/submit.sub
@@ -0,0 +1,7 @@
arguments = $(par1) $(par2)
executable = batch_HTCondor.csh
output = pi50_$(par1).out
error = pi50_$(par1).err
log = pi50_$(par1).log
+JobFlavour = "workday"
queue
12 changes: 12 additions & 0 deletions Validation/CaloTowers/test/CaloScan/submit_HTCondor.csh
@@ -0,0 +1,12 @@
#!/bin/csh
set i=1

while ( ${i} < 51 )

#echo ' '
echo 'i='${i}
#echo 'dir='${PWD}

condor_submit par1=${i} par2=${PWD} submit.sub
@ i = ${i} + "1"
end
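As a usage note, each pass of the loop above is equivalent to submitting one slice by hand; a hedged example for slice 7, run from the CaloScan directory:

condor_submit par1=7 par2=${PWD} submit.sub
condor_q                     # monitor the job
# pi50_7.out, pi50_7.err and pi50_7.log appear in the submission directory,
# and batch_HTCondor.csh copies pi50_7.root back when the job finishes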