
Fermi-LAT datasets

This repository contains pre-computed Fermi-LAT datasets that can be used for analysis with Gammapy. Currently the following datasets are included:

| Name | Energy Min. | Energy Max. | Region  | IRFs   | Exposure  | Zenith Cut |
|------|-------------|-------------|---------|--------|-----------|------------|
| 2FHL | 50 GeV      | 2 TeV       | all-sky | Pass 8 | 80 months | 105 deg    |
| 3FHL | 10 GeV      | 2 TeV       | all-sky | Pass 8 | 84 months | 105 deg    |

A more detailed description and listing of analysis parameters can be found in the corresponding sub-folders for the 2FHL and 3FHL datasets.

Get the data

To get the data just clone this repository to your local machine using git:

git clone https://github.com/gammapy/gammapy-fermi-lat-data

Now define the environment variable GAMMAPY_FERMI_LAT_DATA to point to the path where the repository is located on your machine:

export GAMMAPY_FERMI_LAT_DATA=path/to/gammapy-fermi-lat-data

In addition, you have to download the latest Galactic diffuse model, either directly by clicking here or using the command line:

wget https://fermi.gsfc.nasa.gov/ssc/data/analysis/software/aux/gll_iem_v06.fits -P path/to/fermi/diffuse/dir
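If wget is not available, a minimal Python alternative using only the standard library might look like the following; the target directory is the same placeholder path/to/fermi/diffuse/dir as above:

```python
import urllib.request
from pathlib import Path

# Placeholder target directory; same role as path/to/fermi/diffuse/dir above
diffuse_dir = Path("path/to/fermi/diffuse/dir")
diffuse_dir.mkdir(parents=True, exist_ok=True)

url = "https://fermi.gsfc.nasa.gov/ssc/data/analysis/software/aux/gll_iem_v06.fits"
# The file is several hundred MB, so the download may take a while
urllib.request.urlretrieve(url, diffuse_dir / "gll_iem_v06.fits")
```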

Note: if you have an installation of the Fermi Science Tools, you already have the Galactic diffuse model. It is located at path/to/fermi/science/tools/refdata/fermi/galdiffuse/gll_iem_v06.fits.

Define the environment variable FERMI_DIFFUSE_DIR to point to the directory where the gll_iem_v06.fits file is contained:

export FERMI_DIFFUSE_DIR=path/to/fermi/diffuse/dir
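As a quick sanity check, you can verify from Python that both environment variables are set and point to existing locations; this is just a small sketch, nothing Gammapy-specific:

```python
import os
from pathlib import Path

# Raises KeyError if either variable is not set in the current environment
data_dir = Path(os.environ["GAMMAPY_FERMI_LAT_DATA"])
diffuse_dir = Path(os.environ["FERMI_DIFFUSE_DIR"])

# The repository root and the diffuse model file should both exist
assert data_dir.is_dir(), f"{data_dir} is not a directory"
assert (diffuse_dir / "gll_iem_v06.fits").is_file(), "gll_iem_v06.fits not found"

print("Fermi-LAT data setup looks OK")
```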

Work with the data

Once you've cloned the repository and defined the environment variables to point to the corresponding directories, the data is ready to be used. Please check the examples provided in the docstring of the FermiLATDataset class in Gammapy, or check out the tutorial Fermi-LAT data with Gammapy.
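For orientation, accessing one of the datasets might look roughly like the sketch below. The import location, the constructor argument (assumed here to be the fermi_3fhl_data_config.yaml file from this repository, assumed to live in a 3fhl sub-folder) and the attribute names are assumptions that depend on your Gammapy version, so check the FermiLATDataset docstring for the authoritative API:

```python
import os
from gammapy.datasets import FermiLATDataset  # import path may differ between Gammapy versions

# Assumption: the 3FHL sub-folder ships a YAML config consumed by FermiLATDataset
config = os.path.join(
    os.environ["GAMMAPY_FERMI_LAT_DATA"], "3fhl", "fermi_3fhl_data_config.yaml"
)
dataset = FermiLATDataset(config)

# Attribute names below (events, exposure) are assumptions;
# see the class docstring for the actual interface
print(dataset)
events = dataset.events
exposure = dataset.exposure
```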

Data preparation for contributors

Every dataset includes a data preparation bash script, which runs the Fermi-LAT science tools to compute events, exposure, livetime and PSF for a given set of analysis parameters. Running the scripts to prepare the datasets in this repository requires the Fermi-LAT science tools to be installed. In addition, the following environment variables have to be set (see the sketch after the list):

* `FERMI_FT1_FILE`: Path to the event files list.
* `FERMI_FT2_FILE`: Path to the merged spacecraft file.
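For contributors, here is a hedged sketch of setting these variables and launching one of the preparation scripts from Python; the script name is the one mentioned in the issue below, and the FT1/FT2 paths are placeholders for your own files:

```python
import os
import subprocess

# Placeholder paths; point these at your own event file list (FT1)
# and merged spacecraft file (FT2)
os.environ["FERMI_FT1_FILE"] = "/path/to/ft1_file_list.txt"
os.environ["FERMI_FT2_FILE"] = "/path/to/spacecraft_merged.fits"

# Script name as used for the 3FHL dataset in this repository; the Fermi-LAT
# science tools must be installed and on the PATH for it to succeed.
subprocess.run(["bash", "fermi_3fhl_data_prepare.sh"], check=True)
```

In practice you would simply export the two variables in your shell and run the script directly; the Python wrapper above only illustrates which inputs the script expects.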

Normal users don't have to run the script but can just start their analyses from the datasets provided in this repository.

gammapy-fermi-lat-data's Issues

Are the correct event files shipped?

I read through the scripts that prepare the data, such as fermi_3fhl_data_prepare.sh. In the script the following variables are defined:

  • EVENTS
  • EVENTS_SELECTED

The latter is the output of gtselect. The tools that run after gtselect, however, take EVENTS as an input parameter, e.g. here. Also in fermi_3fhl_data_config.yaml the event list keyword seems to point to EVENTS.

So is it correct to use EVENTS as input for the other tools, and also to use it in the analysis even though there is EVENTS_SELECTED?
