
hhntup

News

The master branch is now dedicated to analysis of the new ATLAS data format (xAOD). For production of skims from the old format (D3PD), please refer to the d3pd branch.

Dependencies

  • Latest 5.X ROOT with PyROOT enabled.

  • rootpy:

    git clone git://github.com/rootpy/rootpy.git
    cd rootpy
    python setup.py install --user
    
  • goodruns:

    pip install --user goodruns
    
  • PyYAML:

    pip install --user pyyaml
    
  • ConfigObj:

    pip install --user configobj
    
  • externaltools:

    git clone git://github.com/htautau/externaltools.git
    
  • lumi:

    git clone git://github.com/htautau/lumi.git
    
  • hhntup:

    git clone git://github.com/htautau/hhntup.git
    
  • TauSpinnerTool:

    svn co svn+ssh://${USER}@svn.cern.ch/reps/atlasoff/PhysicsAnalysis/TauID/TauSpinnerTool/trunk TauSpinnerTool
    

Place externaltools, lumi, and TauSpinnerTool in the same directory as hhntup to satisfy the symlinks in hhana. See the READMEs in externaltools and TauSpinnerTool for further instructions. Python 2.6 or later is required (2.7 is preferred).
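
For example, the resulting layout might look like this (the top-level directory name is arbitrary):

    work/
    ├── externaltools/
    ├── hhntup/
    ├── lumi/
    └── TauSpinnerTool/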

xAOD Migration

  • Analysis release used (see the setup sketch after this list):

    Base, 2.0.17
    
  • Dataset used:

    mc14_8TeV.147808.PowhegPythia8_AU2CT10_Ztautau.merge.AOD.e2372_s1933_s1911_r5591_r5625
    
  • To run the test:

    skim --local-test mc12_hadhad_xaod
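
Setting up this analysis release typically looks like the following, assuming the standard ATLASLocalRootBase environment is available (adapt to your site):

    setupATLAS
    rcSetup Base,2.0.17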
    

Build and setup

Now build the C extension module for jet cleaning in the higgstautau package:

make lib

Before running tests locally:

source setup.sh

Skimming

The skimming is performed by the hhskim.py script.

Run the skims on the grid (after setting up the panda client and your VOMS proxy with the phys-higgs production role):

./skim --yall mc11_hadhad mc12_hadhad \
              data11_hadhad data12_hadhad \
              embed11_hadhad embed12_hadhad
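
For reference, with an ATLASLocalRootBase environment a typical panda-client and proxy setup is:

    setupATLAS
    localSetupPandaClient
    voms-proxy-init -voms atlas:/atlas/phys-higgs/Role=production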

Running a local test of the skimming

The samples are organized into several blocks defined in skims.cfg. Each block follows this template:

[block_name]
 student = hhskim.py
 dataset = dataset_block (defined in datasets.cfg)
 version = version_number
 testinput = /path/to/the/input/files/for/test
 dest = SFU-LCG2_LOCALGROUPDISK,

For each block, modify the variable testinput according to your own setup.
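
For example, a block for a local test of the mc12 hadhad skim might look like this (the version number and testinput path below are placeholders):

    [mc12_hadhad]
    student = hhskim.py
    dataset = mc12_hadhad
    version = 1
    testinput = /path/to/local/mc12_hadhad/files
    dest = SFU-LCG2_LOCALGROUPDISK,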

Run the test:

./skim --yall block_name --local-test

The output will be created in the main directory as:

hhskim_dataset_block.root

Creating ntuples

After the skims have finished and been downloaded, update the paths in higgstautau/datasets_config.yml and then rebuild the datasets database:

./dsdb --reset hh
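
The structure of datasets_config.yml is defined by the higgstautau package itself; the update amounts to pointing the relevant path entries at the directory holding the downloaded skims. A purely hypothetical entry (the keys here are invented for illustration):

    # hypothetical keys; see higgstautau/datasets_config.yml for the real schema
    hh:
        path: /cluster/data/hhskim_v1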

Then launch the batch jobs that create all the analysis ntuples (nominal and systematics) with:

./run-all


Issues

Adding LepLep channel to d3pd branch

Most of my work involves applying event selection filters in higgstautau/leplep/filters.py, mimicking the selection procedure in the leplep code here: https://svnweb.cern.ch/trac/atlasphys/browser/Physics/Higgs/HSG4/software/leplep/MVA_8TeV/Preselection/trunk/Common/analysis.C

The required filters are listed as:

Cut[0] = "begin";
Cut[1] = "GRL/Filt";
Cut[2] = "trigger";
Cut[3] = "primVtx";
Cut[4] = "jet-clean";
Cut[5] = "LArError";
Cut[6] = "tau-veto";
Cut[7] = "2-lepton";
Cut[8] = "OS";
Cut[9] = "trigMatch";
Cut[10] = "AuxiCut";

I believe that cuts 1-5 are already implemented in hhntup, but when I start checking the trigger matching code I will also check the trigger cut mentioned here. I assume the primary vertex calculation will remain the same?

I've already implemented a TauVeto (minus some overlap information). It shares many similarities with the tau selection criteria in higgstautau/filters.py, but is isolated so that it only affects the leplep-specific code.

Next I will implement the selection of two leptons and the requirement that they have opposite sign. This should be done before the weekend.
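
A minimal sketch of such a filter, assuming the rootpy EventFilter interface used by the existing filters in higgstautau/filters.py (the event.leptons collection and the charge attribute are placeholders for whatever the upstream leplep selection provides):

    from rootpy.tree.filtering import EventFilter

    class TwoLeptonOS(EventFilter):
        """Keep events with exactly two selected leptons of opposite charge."""
        def passes(self, event):
            # require exactly two leptons surviving the upstream selection
            if len(event.leptons) != 2:
                return False
            lep1, lep2 = event.leptons
            # opposite sign: the product of the charges must be negative
            return lep1.charge * lep2.charge < 0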

Lepton Tools

OK, so I'm still missing some obvious things like smearing and calibration, but even in the event selection the leplep code requires the mediumPlusPlus electron ID, defined in https://svnweb.cern.ch/trac/atlasoff/browser/Reconstruction/egamma/egammaAnalysis/egammaAnalysisUtils/tags/egammaAnalysisUtils-00-04-58/Root/IsEMPlusPlusDefs.cxx

There is a value called mediumPP in the ntuples but I'm wary about using it since it removes most of the events and is ignored in the leplep code.

Will this be included in the xAOD egamma tools, or do we need to add it to the D3PD? Are there any other tools that we're missing?

Missing variables

Going through some of the lepton-specific cuts, I've found several variables that aren't included in higgstautau/hadhad/branches.py.

I'm adding the following variables to the input branches:

"mu_staco_ptcone40"
"mu_staco_trackz0pv"
"el_ptcone40"

I guess these will also be needed for lepton selection in the lephad channel? Anything else?
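
A hypothetical sketch of the change (the actual structure of higgstautau/hadhad/branches.py may differ):

    # hypothetical; see higgstautau/hadhad/branches.py for the real layout
    branches += [
        "mu_staco_ptcone40",
        "mu_staco_trackz0pv",
        "el_ptcone40",
    ]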
