The TASOC Photometry module

[Badges: ADS bibcode 2021AJ....162..170H · GitHub Actions tests (devel branch) · Codecov coverage · Hits-of-Code · license]

This module provides the basic photometry setup for the TESS Asteroseismic Science Operations Center (TASOC).

The code is available through our GitHub organisation (https://github.com/tasoc/photometry) and full documentation for this code can be found on https://tasoc.dk/code/.

Note

Even though the full code and documentation are freely available, we strongly encourage users not to attempt to use this code to generate their own photometry from TESS. Instead, please use the fully processed data products from the full TASOC pipeline, which are available from TASOC and MAST. If you are interested in working on the details of the processing, we welcome you to join the T'DA working group.

Installation instructions

  • Start by making sure that you have Git Large File Storage (LFS) installed. You can verify that it is installed by running the command:

    >>> git lfs version
  • Go to the directory where you want the Python code to be installed and simply download it or clone it via git as:

    >>> git clone https://github.com/tasoc/photometry.git .
    
  • All dependencies can be installed using the following command. It is recommended to do this in a dedicated virtualenv or similar:

    >>> pip install -r requirements.txt

How to run tests

You can test your installation by going to the root directory where you cloned the repository and running the command:

>>> pytest

Running the program

Just trying it out

To try out the code straight after installation, you can run the photometry code directly. This will automatically load some test input data and run the photometry (see more details below and in the full documentation).

>>> python run_tessphot.py --starid=182092046

The number is the TIC number of the star, and it can be replaced with any TIC number that is available in the TODO-list (see below).

Set up directories

The next thing to do is to set up the directories where input data is stored and where output data (e.g. lightcurves) should be put. This is done by setting the environment variables TESSPHOT_INPUT and TESSPHOT_OUTPUT. Depending on your operating system and shell, this is done in slightly different ways.

The directory defined in TESSPHOT_INPUT should contain all the data in FITS files that need to be processed. The FITS files can be structured into sub-directories as you wish and may also be GZIP compressed (*.fits.gz). When the different programs run, some of them will also add more files to the TESSPHOT_INPUT directory. The directory in TESSPHOT_OUTPUT is used to store all the lightcurve FITS files that will be generated at the end.
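
For example, in a bash-like shell (the paths below are placeholders and should point to your own directories):

    >>> export TESSPHOT_INPUT=/path/to/tess/input
    >>> export TESSPHOT_OUTPUT=/path/to/tess/output

On Windows (cmd.exe), the equivalent would be set TESSPHOT_INPUT=... and set TESSPHOT_OUTPUT=....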

Make star catalogs

The first program to be run is run_make_catalog.py, which will create full catalogs of all stars known to fall on or near the TESS detectors during a given observing sector. These catalogs are created directly from the TESS Input Catalog (TIC), and since this is such a huge table, the program relies on internal databases running at TASOC at Aarhus University. You therefore need to be connected to the TASOC network at Aarhus University to run this program. The program is simply run as shown here for sector 14 (see the full documentation for more options):

>>> python run_make_catalog.py 14

Prepare photometry

The next step is to prepare the photometry of individual stars by doing all the operations that require the full-size FFI images, such as:

  • Estimating the sky background for all images.
  • Estimating spacecraft jitter.
  • Creating an average image.
  • Restructuring data into HDF5 files for efficient I/O operations.

The program can simply be run like the following, which will create a number of HDF5 files (*.hdf5) in the TESSPHOT_INPUT directory.

>>> python run_prepare_photometry.py
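
Once it has finished, you can get a quick overview of what ended up in one of the generated HDF5 files. A minimal sketch using h5py (the file name is a placeholder, and the internal group/dataset layout may differ between versions):

    import h5py

    # List the names of all groups and datasets stored in one of the generated files:
    with h5py.File('sector14_camera1_ccd1.hdf5', 'r') as hdf:
        hdf.visit(print)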

Make TODO list

A TODO-list is a list of targets that should be processed by the photometry code. It includes information about which cameras and CCDs they fall on and which photometric methods they should be processed with. A TODO-list can be generated directly from the catalog files (since these contain all targets near the field-of-view) and the details stored in the HDF5 files. In order to create a full TODO list of all stars that can be observed, simply run the command:

>>> python run_make_todo.py

This will create the file todo.sqlite in the TESSPHOT_INPUT directory, which is needed for running the photometry. See the full documentation for more options.
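
Since todo.sqlite is an ordinary SQLite database, it can also be inspected with the Python standard library. A minimal sketch, assuming the main table is named todolist (as referenced in the issues below); the exact column layout may differ between versions:

    import os
    import sqlite3

    # Open the TODO-list created by run_make_todo.py:
    todo_file = os.path.join(os.environ['TESSPHOT_INPUT'], 'todo.sqlite')
    conn = sqlite3.connect(todo_file)
    conn.row_factory = sqlite3.Row

    # Peek at the first few targets scheduled for photometry:
    for row in conn.execute("SELECT * FROM todolist LIMIT 5"):
        print(dict(row))

    conn.close()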

Running the photometry

The photometry can be run on a single star with the command:

>>> python run_tessphot.py --starid=182092046

Here, the number is the TIC identifier of the star. The program accepts various other command-line parameters; try running:

>>> python run_tessphot.py --help

This is very useful for testing different methods and settings.

Contributing to the code

You are more than welcome to contribute to this code! Please contact Rasmus Handberg or Derek Buzasi if you wish to contribute.

photometry's People

Contributors

benjaminpope, hvidy, jonasshansen, miklnl, rhandberg

photometry's Issues

NaN pixels not plotted correctly in plots.plot_image

There is a problem with plotting images containing NaNs with plots.plot_image. Either the NaNs are not plotted with the correct color (default black) or pixels outside the (vmin, vmax) range are mistakenly plotted with black, as if they were NaN.

This also affects the movies being generated by run_ffimovie.
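
A generic matplotlib pattern for avoiding this kind of confusion (an illustration, not the current plot_image implementation) is to give NaN pixels their own "bad" colour on a copy of the colormap, and reserve set_under/set_over for pixels outside the (vmin, vmax) range:

    import copy
    import numpy as np
    import matplotlib.pyplot as plt

    img = np.random.rand(32, 32)
    img[10:12, 10:12] = np.nan  # simulate missing pixels

    cmap = copy.copy(plt.get_cmap('viridis'))
    cmap.set_bad(color='black')   # NaN pixels are drawn in black
    cmap.set_under(color='grey')  # pixels below vmin get their own colour
    cmap.set_over(color='white')  # pixels above vmax get their own colour

    plt.imshow(img, cmap=cmap, vmin=0.1, vmax=0.9, origin='lower', interpolation='nearest')
    plt.colorbar(extend='both')
    plt.show()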

Barycentric time correction for FFIs

For the FFIs, the barycentric Julian date is calculated for a target in the middle of the CCD. We should properly recalculate the barycentric time for each individual target, since we are spanning a large area of the sky.
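
For reference, a per-target correction could be computed with astropy along these lines (a sketch with made-up coordinates and timestamps, still anchored to the geocentre rather than the spacecraft; see also the related issue on using TESS's location below):

    import astropy.units as u
    from astropy.time import Time
    from astropy.coordinates import SkyCoord, EarthLocation

    # Mid-exposure timestamps of the FFIs (Julian dates, TDB scale), anchored to the geocentre:
    times = Time([2458683.5, 2458684.5], format='jd', scale='tdb',
                 location=EarthLocation.from_geocentric(0, 0, 0, unit=u.m))

    # Sky position of the individual target (example values):
    target = SkyCoord(ra=84.3*u.deg, dec=-80.5*u.deg)

    # Light travel time to the solar-system barycentre along this line of sight:
    ltt = times.light_travel_time(target, kind='barycentric')
    bjd = times + ltt
    print(bjd.jd)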

Add record of final photometry method to diagnostics

We should add a method_used column to the diagnostics table indicating the type of photometry that was used in the end. This can be different from the method defined in the todolist table, since the code can choose to change the method of photometry depending on the details of the current target.

This will also help in the data validation step afterwards, making it much easier to pick out targets processed with different methods.
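
A minimal sketch of what the schema change could look like on an existing todo.sqlite (illustration only; the real change belongs in the code that writes the diagnostics table):

    import sqlite3

    conn = sqlite3.connect('todo.sqlite')
    # Add the new column recording the photometric method that was actually used:
    conn.execute("ALTER TABLE diagnostics ADD COLUMN method_used TEXT")
    conn.commit()
    conn.close()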

Add photometric errors to output FITS files

For some reason, errors on the extracted photometric fluxes were never included. For TPF files this is almost trivial (at least for aperture photometry), but will require a little more work for FFIs.

  • Provide per-pixel uncertainties for the photometry
  • Propagate uncertainties in FFI background estimation and subtraction
  • Aperture photometry: return photometric errors
  • Halo photometry: return photometric errors

Add tests of command-line interfaces

No tests are run on the command-line interfaces in the root directory. Most of the underlying functionality is already being tested, but the CLIs themselves are not.

Output units of mag2flux

Query

In the utility function photometry.utilities.mag2flux(mag, zp=20.60654144), what are the units of the output flux? Is it in e-/s or in physical units of W/m^2?

Also, what is the zp parameter in the function? What does it represent?
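
For context, a zero-point based magnitude-to-flux conversion conventionally has the form sketched below. This is only the generic relation, not necessarily the library's exact implementation; which physical units the result is calibrated to is precisely what the question above asks and is not answered here:

    def mag2flux(mag, zp=20.60654144):
        """Generic zero-point relation: the flux corresponding to a given magnitude."""
        return 10.0**(-0.4*(mag - zp))

    # Example: flux of a 10th-magnitude star, in whatever units the zero-point defines
    print(mag2flux(10.0))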

TaskManager: missing photometry_skipped entries

In some cases an entry is not created in the photometry_skipped table when a target has its status changed to SKIPPED. This means that we do not have a direct record of which target caused the "skipping".

This is not a serious problem, since this table is currently not used for anything, but it should be fixed anyway.

Preparation for faster FFI cadence

We need to ensure that the pipeline will perform the same way when the FFI cadence is changed from 30 min to 10 min. This will most likely require only small changes, but it needs to be thoroughly investigated.

This is obviously tightly linked to #19.

Flagging of asteroids

Flagging of known solar-system objects (e.g. asteroids) passing through the pixels used for photometry.

Will create new flags both on pixel-level and in photometry quality.

Problem with TESS overlapping cameras

We need to change the naming-scheme of the light curve FITS files output by the photometry to include the TESS camera and CCD in the name.

We have found that we otherwise run into problems when processing FFIs, because TESS cameras 1 and 2 have a tiny overlap region in sky coverage, which means that a target can potentially be observed by two cameras simultaneously. Since the file names currently do not contain the camera/CCD, files are overwritten depending on which was processed first.

This has the unfortunate side-effect that targets from different CCDs are effectively mixed, causing problems for the subsequent cotrending algorithms in https://github.com/tasoc/corrections.

Transfer of quality flags from TPFs to FFIs

We should investigate whether it would be a good idea to check the TPFs for a given camera and match the quality flags to the ones in the corresponding FFIs. We have seen examples where the FFI quality flags are not as well populated as those of the TPFs.
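
A rough sketch of what such a matching could look like, using only the standard TESS TPF columns (TIME and QUALITY); the FFI timestamps and cadence below are placeholders:

    import numpy as np
    from astropy.io import fits

    # Read timestamps and quality flags from a 2-minute target pixel file (placeholder path):
    with fits.open('tess_target_pixel_file.fits') as hdu:
        tpf_time = hdu[1].data['TIME']
        tpf_quality = hdu[1].data['QUALITY']

    # For each FFI mid-time, adopt the bitwise OR of the TPF flags falling within that exposure:
    ffi_time = np.array([1325.01, 1325.03])  # placeholder FFI mid-times (days)
    ffi_exptime = 30.0/1440.0                # 30-minute FFI exposure in days
    ffi_quality = np.zeros(len(ffi_time), dtype='int32')
    for k, t in enumerate(ffi_time):
        inside = np.abs(tpf_time - t) < ffi_exptime/2
        if inside.any():
            ffi_quality[k] = np.bitwise_or.reduce(tpf_quality[inside])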

Use TESS's location in barycentric time correction

As an extension to #6, we are currently recalculating the timestamps of the targets in FFIs, using their positions on sky, but anchoring them to the Earth frame. We really should use TESS's location instead. The effect is small (order of ~1 sec), but should still be done.

Prepare sometimes freezes when using multiple CPUs

There seems to be a problem with the "prepare" step freezing on Sector 14 data when using multiple processors. The problem does not occur when running on a single process.

Steps to reproduce the behavior:

  1. Run run_prepare.py on Sector 14, camera 1, CCD 1.
  2. The program freezes after ~450 iterations for camera 1, CCD 1. Nothing happens until the program is killed.
  3. If the same is run, but the number of processes is forced to 1, the program runs through without problems.

Expected behavior
Expect the program to run through using multiple processors.

Desktop:

  • OS: Linux
  • Python version: 3.6.3
  • Version: 6.0

Test PSF Photometry performance

  • Test the accuracy of PSF photometry in different crowding conditions.
  • Create metrics for when PSF photometry is better than aperture photometry.

HDF5 speedup

Investigate whether it is possible to speed up the I/O of the HDF5 files used for storing the FFI backgrounds and images, by chunking the stored data in a more clever way than the defaults. The HDF5 files are currently created in the prepare_photometry.py script.
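
For reference, chunking and compression are chosen when the datasets are created; a minimal h5py sketch (the dataset name, shape and chunk size here are made up for illustration):

    import numpy as np
    import h5py

    # Stand-in for a small stack of FFI cut-outs (time, rows, columns):
    images = np.random.rand(20, 256, 256).astype('float32')

    with h5py.File('example.hdf5', 'w') as hdf:
        # Chunk by whole images, so that reading a single FFI touches exactly one chunk:
        hdf.create_dataset('images', data=images,
                           chunks=(1, 256, 256),
                           compression='gzip', shuffle=True)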

Parallel downloads hang on HTTP error

If a parallel download (like the SPICE kernels) fails with an HTTP error, the program hangs. This does not happen with sequential downloads, so it is an issue with exceptions not being propagated correctly when running in multiple threads.

Simple way to reproduce:

  1. Add several (>1) non-existing files to the list of SPICE kernels in spice.py.
  2. Run run_download_cache.py
  3. The program now hangs...
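
A generic pattern that avoids this class of hang (an illustration, not the project's downloader) is to collect the futures and call result() on each, so that an exception raised in a worker thread is re-raised in the main thread instead of being silently lost:

    import urllib.error
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor, as_completed

    def download(url):
        # Raises urllib.error.HTTPError on e.g. a 404 response:
        with urllib.request.urlopen(url, timeout=30) as response:
            return response.read()

    urls = ['https://example.com/file1', 'https://example.com/file2']  # placeholders

    with ThreadPoolExecutor(max_workers=4) as executor:
        futures = {executor.submit(download, url): url for url in urls}
        for future in as_completed(futures):
            try:
                data = future.result()  # re-raises any exception from the worker thread
            except urllib.error.HTTPError as err:
                print(f"Download of {futures[future]} failed: {err}")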

Replace Travis CI with GitHub Actions

The time has come for replacing Travis CI with GitHub Actions.

This is because Travis CI, which we use for "Continuous Integration" (i.e. "tests"), has changed its pricing model to be less favourable for open-source projects like ours. At the same time, GitHub Actions has matured a lot and can do all the same things that Travis CI could. It is also fully integrated into GitHub, so we can get rid of a third-party dependency.

Another positive is that macOS testing seems to be much better supported on GitHub Actions.

Correct known timestamp offset

It has been discovered that the timestamps reported in all TESS data before Sector 19, data release 26, have a constant offset of ~2 seconds. We should include a correction of the timestamps in the lightcurves, so that they are on the correct time scale.

Implement PSF Photometry

Create class for performing PSF photometry.

  • Create class for PSF photometry.
  • Make the tessphot program call the PSF photometry if the method is manually selected.

Preparation for 20s cadence

In preparation for 20s cadence data, we have a couple of things that need to be addressed:

  • The cadence in seconds should be stored in the TODO-lists, to help distinguish 120s and 20s TPFs from each other. This also means that the following steps in the pipeline (data validation and corrections) should use this instead of datasource.
  • Detection of cosmic rays. For 20s cadence there will not be any cosmic-ray mitigation (on board the spacecraft, and likely not on the ground either), so we need to look into detecting cadences where the stamps were hit by cosmic rays.
