triceratops's Introduction

triceratops


A tool for vetting and validating TESS Objects of Interest.

See Giacalone et al. (2021) for more information about this tool.

Installation

You can install the most recently released version of this tool via PyPI:

$ pip install triceratops

Usage

triceratops is designed to be used in a Jupyter notebook (with Python 3.6 or higher). See the notebook in the examples/ directory for a brief tutorial, or check out the documentation.
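
For orientation, here is a minimal usage sketch. The target(...) and calc_probs(...) call signatures below are taken from tracebacks quoted in the issues further down; the TIC ID, sectors, period, and light-curve arrays are placeholders, and the tutorial notebook also covers aperture and transit-depth setup steps omitted here.

import numpy as np
import triceratops.triceratops as tr

tic_id = 123456789              # placeholder TIC ID
sectors = np.array([1])         # sectors in which the candidate was observed

target = tr.target(ID=tic_id, sectors=sectors)

# A detrended, phase-folded light curve of the candidate (user-supplied arrays):
time = np.linspace(-0.2, 0.2, 500)       # days from mid-transit (placeholder)
flux = np.ones_like(time)                # normalized flux (placeholder)
flux_err = 1e-3 * np.ones_like(time)     # per-point uncertainties (placeholder)
P_orb = 3.5                              # candidate orbital period in days (placeholder)

target.calc_probs(time=time, flux_0=flux,
                  flux_err_0=np.mean(flux_err), P_orb=P_orb)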

Attribution

If you use triceratops, please cite both the paper and the code.

Paper citation:

@ARTICLE{2021AJ....161...24G,
       author = {{Giacalone}, Steven and {Dressing}, Courtney D. and {Jensen}, Eric L.~N. and {Collins}, Karen A. and {Ricker}, George R. and {Vanderspek}, Roland and {Seager}, S. and {Winn}, Joshua N. and {Jenkins}, Jon M. and {Barclay}, Thomas and {Barkaoui}, Khalid and {Cadieux}, Charles and {Charbonneau}, David and {Collins}, Kevin I. and {Conti}, Dennis M. and {Doyon}, Ren{\'e} and {Evans}, Phil and {Ghachoui}, Mourad and {Gillon}, Micha{\"e}l and {Guerrero}, Natalia M. and {Hart}, Rhodes and {Jehin}, Emmanu{\"e}l and {Kielkopf}, John F. and {McLean}, Brian and {Murgas}, Felipe and {Palle}, Enric and {Parviainen}, Hannu and {Pozuelos}, Francisco J. and {Relles}, Howard M. and {Shporer}, Avi and {Socia}, Quentin and {Stockdale}, Chris and {Tan}, Thiam-Guan and {Torres}, Guillermo and {Twicken}, Joseph D. and {Waalkes}, William C. and {Waite}, Ian A.},
        title = "{Vetting of 384 TESS Objects of Interest with TRICERATOPS and Statistical Validation of 12 Planet Candidates}",
      journal = {\aj},
     keywords = {Exoplanet astronomy, Astrostatistics, Planet hosting stars, Exoplanets, 486, 1882, 1242, 498, Astrophysics - Earth and Planetary Astrophysics, Astrophysics - Instrumentation and Methods for Astrophysics, Astrophysics - Solar and Stellar Astrophysics},
         year = 2021,
        month = jan,
       volume = {161},
       number = {1},
          eid = {24},
        pages = {24},
          doi = {10.3847/1538-3881/abc6af},
archivePrefix = {arXiv},
       eprint = {2002.00691},
 primaryClass = {astro-ph.EP},
       adsurl = {https://ui.adsabs.harvard.edu/abs/2021AJ....161...24G},
      adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

Code citation:

@MISC{2020ascl.soft02004G,
       author = {{Giacalone}, Steven and {Dressing}, Courtney D.},
        title = "{triceratops: Candidate exoplanet rating tool}",
     keywords = {Software, NASA, TESS},
         year = 2020,
        month = feb,
          eid = {ascl:2002.004},
        pages = {ascl:2002.004},
archivePrefix = {ascl},
       eprint = {2002.004},
       adsurl = {https://ui.adsabs.harvard.edu/abs/2020ascl.soft02004G},
      adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

Help

If you are having trouble getting triceratops working on your machine, I recommend installing it in a fresh conda environment. You can download the latest distribution of Anaconda from the Anaconda website. After doing so, run the following in a terminal:

$ conda create -n myenv python=3.8
$ conda activate myenv
(myenv) $ pip install triceratops jupyterlab

You can replace myenv with an environment name of your choice. To exit this environment, run:

(myenv) $ conda deactivate

To delete this environment, run:

$ conda remove --name myenv --all

triceratops's People

Contributors

hpparvi, martindevora, orionlee, stevengiacalone

triceratops's Issues

More contrast curves

Is there an easy and fast way to inject more than one contrast curve (considering the case of imaging observations at different wavelengths)?
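
For reference, this is how I'm currently passing a single curve. A sketch only: the contrast_curve_file and filt keyword names come from the calc_probs signature quoted in a traceback further down, the file path and band value are placeholders, and target, time, flux, flux_err, and P_orb are assumed to be set up as in the Usage section above.

import numpy as np

target.calc_probs(time=time, flux_0=flux, flux_err_0=np.mean(flux_err),
                  P_orb=P_orb,
                  contrast_curve_file="speckle_562nm.txt",   # placeholder path to one curve
                  filt="Vis")                                # band of that curve (placeholder)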

TypingError in triceratops.target.calc_probs

Hi there. I'm new to triceratops and am trying it out to help with a TESS paper I'm finalizing (I need the kind of statistical validation that TRICERATOPS can provide).

I finally got all the modules to load properly in a dedicated conda environment, and I swapped the shell to bash so that lightkurve will run (Catalina seems to default to zsh, which causes issues). I'm working through the TESS tutorial notebook and hit the following issue in the 5th code cell, where we first call:
target.calc_probs(time=lc.time.value, flux_0=lc.flux.value, flux_err_0=np.mean(lc.flux_err.value), P_orb=P_orb)

I get the following error (I haven't edited the tutorial file at all):

---------------------------------------------------------------------------
TypingError                               Traceback (most recent call last)
File <timed exec>:9, in <module>

File /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/triceratops/triceratops.py:731, in target.calc_probs(self, time, flux_0, flux_err_0, P_orb, contrast_curve_file, filt, N, parallel, drop_scenario, verbose, flatpriors, exptime, nsamples, molusc_file, trilegal_fname)
    725 if verbose == 1:
    726     print(
    727         "Calculating TP scenario "
    728         + "probabilitiey for " + str(ID) + "."
    729         )
--> 731 res = lnZ_TTP(
    732     time, flux, flux_err, P_orb,
    733     M_s, R_s, Teff, Z,
    734     N, parallel, self.mission,
    735     flatpriors,
    736     exptime, nsamples
    737     )
    738 # self.res_TTP = res
    739 j = 0

File /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/triceratops/marginal_likelihoods.py:144, in lnZ_TTP(time, flux, sigma, P_orb, M_s, R_s, Teff, Z, N, parallel, mission, flatpriors, exptime, nsamples)
    142             continue
    143         if (incs[i] >= inc_min) & (coll[i] == False):
--> 144             lnL[i] = -0.5*ln2pi - lnsigma - lnL_TP(
    145                 time, flux, sigma, rps[i],
    146                 P_orb[i], incs[i], a, R_s, u1, u2,
    147                 eccs[i], argps[i],
    148                 exptime=exptime, nsamples=nsamples
    149                 )
    151 N_samples = 100
    152 idx = (-lnL).argsort()[:N_samples]

File /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/triceratops/likelihoods.py:186, in lnL_TP(time, flux, sigma, R_p, P_orb, inc, a, R_s, u1, u2, ecc, argp, companion_fluxratio, companion_is_host, exptime, nsamples)
    152 def lnL_TP(time: np.ndarray, flux: np.ndarray, sigma: float, R_p: float,
    153            P_orb: float, inc: float, a: float, R_s: float,
    154            u1: float, u2: float, ecc: float, argp: float,
   (...)
    157            exptime: float = 0.00139,
    158            nsamples: int = 20):
    159     """
    160     Calculates the log likelihood of a transiting planet scenario by
    161     comparing a simulated light curve and the TESS light curve.
   (...)
    184         Log likelihood (float).
    185     """
--> 186     model = simulate_TP_transit(
    187         time, R_p, P_orb, inc, a, R_s, u1, u2,
    188         ecc, argp,
    189         companion_fluxratio, companion_is_host,
    190         exptime, nsamples
    191         )
    192     return 0.5*(np.sum((flux-model)**2 / sigma**2))

File /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/triceratops/likelihoods.py:50, in simulate_TP_transit(time, R_p, P_orb, inc, a, R_s, u1, u2, ecc, argp, companion_fluxratio, companion_is_host, exptime, nsamples)
     48 # step 1: simulate light curve assuming only the host star exists
     49 tm.set_data(time, exptimes=exptime, nsamples=nsamples)
---> 50 flux = tm.evaluate_ps(
     51     k=R_p*Rearth/(R_s*Rsun),
     52     ldc=[float(u1), float(u2)],
     53     t0=0.0,
     54     p=P_orb,
     55     a=a/(R_s*Rsun),
     56     i=inc*(pi/180.),
     57     e=ecc,
     58     w=(90-argp)*(pi/180.)
     59     )
     60 # step 2: adjust the light curve to account for flux dilution
     61 # from non-host star
     62 if companion_is_host:

File /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/models/ma_quadratic.py:183, in QuadraticModel.evaluate_ps(self, k, ldc, t0, p, a, i, e, w, copy)
    180 if ldc.size != 2 * self.npb:
    181     raise ValueError("The quadratic model needs two limb darkening coefficients per passband")
--> 183 flux = quadratic_model_s(self.time, k, t0, p, a, i, e, w, ldc,
    184                          self.lcids, self.pbids, self.epids, self.nsamples, self.exptimes, self.npb,
    185                          self.ed, self.ld, self.le, self.kt, self.zt, self.interpolate)
    186 return squeeze(flux)

File /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/numba/core/dispatcher.py:468, in _DispatcherBase._compile_for_args(self, *args, **kws)
    464         msg = (f"{str(e).rstrip()} \n\nThis error may have been caused "
    465                f"by the following argument(s):\n{args_str}\n")
    466         e.patch_message(msg)
--> 468     error_rewrite(e, 'typing')
    469 except errors.UnsupportedError as e:
    470     # Something unsupported is present in the user code, add help info
    471     error_rewrite(e, 'unsupported_error')

File /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/numba/core/dispatcher.py:409, in _DispatcherBase._compile_for_args.<locals>.error_rewrite(e, issue_type)
    407     raise e
    408 else:
--> 409     raise e.with_traceback(None)

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
Failed in nopython mode pipeline (step: nopython frontend)
Failed in nopython mode pipeline (step: nopython frontend)
Cannot unify float64 and array(float64, 1d, C) for 't0.2', defined at /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/orbits/taylor_z.py (320)

File "../../../../../../../../Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/orbits/taylor_z.py", line 320:
def find_contact_point(k: float, point: int, y0, vx, vy, ax, ay, jx, jy, sx, sy):
    <source elided>
    i = 0
    while abs(t2 - t0) > 1e-6 and i < 100:
    ^

During: typing of assignment at /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/orbits/taylor_z.py (320)

File "../../../../../../../../Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/orbits/taylor_z.py", line 320:
def find_contact_point(k: float, point: int, y0, vx, vy, ax, ay, jx, jy, sx, sy):
    <source elided>
    i = 0
    while abs(t2 - t0) > 1e-6 and i < 100:
    ^

During: resolving callee type: type(CPUDispatcher(<function find_contact_point at 0x7fac899713a0>))
During: typing of call at /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/orbits/taylor_z.py (360)

During: resolving callee type: type(CPUDispatcher(<function find_contact_point at 0x7fac899713a0>))
During: typing of call at /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/orbits/taylor_z.py (360)


File "../../../../../../../../Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/orbits/taylor_z.py", line 360:
def t14(k: float, y0, vx, vy, ax, ay, jx, jy, sx, sy):
    t1 = find_contact_point(k, 1, y0, vx, vy, ax, ay, jx, jy, sx, sy)
    ^

During: resolving callee type: type(CPUDispatcher(<function t14 at 0x7fac8995cdc0>))
During: typing of call at /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/models/numba/ma_quadratic_nb.py (611)

During: resolving callee type: type(CPUDispatcher(<function t14 at 0x7fac8995cdc0>))
During: typing of call at /Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/models/numba/ma_quadratic_nb.py (611)


File "../../../../../../../../Applications/anaconda/envs/triceratops/lib/python3.8/site-packages/pytransit/models/numba/ma_quadratic_nb.py", line 611:
def quadratic_model_s(t, k, t0, p, a, i, e, w, ldc, lcids, pbids, epids, nsamples, exptimes, npb, edt, ldt, let, kt, zt, interpolate):
    <source elided>
    y0, vx, vy, ax, ay, jx, jy, sx, sy = vajs_from_paiew(p, a, i, e, w)
    half_window_width = 0.025 + 0.5 * t14(k[0], y0, vx, vy, ax, ay, jx, jy, sx, sy)
    ^

Can't say I really know what's going on, but it seems pretty deep inside the triceratops source code. Any insight on how to fix this?

Clarification on the nearby stars included in the analysis

This is a question and a feature suggestion regarding the nearby stars included in the analysis.

Question: For nearby stars with missing stellar parameters (mass, radius, or Teff), are they included in the analysis?

If they are excluded, it might be helpful to issue a warning.


Suggestion: in the paper, it is stated that stars with little flux contribution to the aperture are excluded using a relative transit depth measure.

It might be helpful to include the relative transit depth measure in target.stars, so that users can easily identify nearby stars included in the analysis.

Currently, I am doing something like this:

# stars with δs >= 1 are excluded from further analysis (section 2.1 of the paper)
target.stars["rel_tdepth"] = tdepth / target.stars["fluxratio"]

target.stars[target.stars["rel_tdepth"] < 1]  # show stars still included in the analysis

Target.plot_fits not returning figure

Hi,

I'd like to be able to save the generated image when using target.plot_fits. However, plt.show() is called at the end of the method and the figure is not returned, so I'm unable to save it, which is needed when running headless (on a server, for instance). Is there any way I can do this right now, or does the code need to be changed a little?
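
In the meantime, the workaround I'm experimenting with is roughly the sketch below. It assumes plot_fits draws on the current pyplot figure and that plt.show() is a no-op on the non-interactive Agg backend; the plot_fits argument names are guesses modelled on calc_probs, so check the docstring.

import matplotlib
matplotlib.use("Agg")               # must run before anything imports pyplot
import matplotlib.pyplot as plt
import numpy as np

# target, time, flux, flux_err set up as usual for calc_probs
target.plot_fits(time=time, flux_0=flux, flux_err_0=np.mean(flux_err))   # argument names assumed
plt.gcf().savefig("triceratops_fits.png", dpi=200)                       # save the current figure
plt.close("all")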

Kind regards.

Pytransit issues

I followed the instructions and created a new conda environment with python=3.8, but I get the following error when I try to import triceratops in my project notebook (.ipynb):
ModuleNotFoundError: No module named 'pytransit.models.roadrunner.opmodel'
I am running in a GitHub Codespace (POSIX) because the same error also appeared on my M1 MacBook Air. Is there any way to fix this that doesn't require editing the source?

calc_probs() raises an HTTP 400 error from TRILEGAL if the target instance is used for a long time

Scenario:

  • When I do some analysis with a working target instance, stop for a while (e.g., for a lunch break, or overnight), and come back to continue, target.calc_probs() raises an HTTP 400 error.
  • The reason is that the TRILEGAL data, temporarily hosted on their remote server, has been removed.

Error:

HTTPError                                 Traceback (most recent call last)
File <timed exec>:9

File ~/dev/triceratops/triceratops/triceratops.py:706, in target.calc_probs(self, time, flux_0, flux_err_0, P_orb, contrast_curve_file, filt, N, parallel, drop_scenario, verbose, flatpriors, exptime, nsamples, molusc_file)
    704 if self.trilegal_fname is None: 
    705     output_url = self.trilegal_url
--> 706     trilegal_fname = save_trilegal(output_url, self.ID)
    707 else:
    708     trilegal_fname = self.trilegal_fname

File ~/dev/triceratops/triceratops/funcs.py:306, in save_trilegal(output_url, ID)
    304 else:
    305     for i in range(1000):
--> 306         last = read_csv(output_url, header=None)[-1:]
    307         if last.values[0, 0] != "#TRILEGAL normally terminated":
    308             print("...")

...

File /usr/lib/python3.8/urllib/request.py:649, in HTTPDefaultErrorHandler.http_error_default(self, req, fp, code, msg, hdrs)
    648 def http_error_default(self, req, fp, code, msg, hdrs):
--> 649     raise HTTPError(req.full_url, code, msg, hdrs, fp)

HTTPError: HTTP Error 404: Not Found
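
A workaround sketch that seems to help (not an official API: it assumes save_trilegal is importable from triceratops.funcs and returns the local filename, and that calc_probs accepts a trilegal_fname keyword, both as the tracebacks in this section suggest): fetch the TRILEGAL output right after the target is built, while the temporary URL is still live, and reuse the local copy later.

import numpy as np
import triceratops.triceratops as tr
from triceratops.funcs import save_trilegal

target = tr.target(ID=tic_id, sectors=sectors)                    # placeholders as in the Usage sketch
trilegal_fname = save_trilegal(target.trilegal_url, target.ID)    # download while the URL is live

# ...hours later the remote URL may be gone, but the local copy still works:
target.calc_probs(time=time, flux_0=flux, flux_err_0=np.mean(flux_err),
                  P_orb=P_orb, trilegal_fname=trilegal_fname)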

[Question] Use triceratops probabilities as priors

Hello, I'm working on a new neural network to perform vetting of exoplanet candidates. I was reading the ExoMiner paper (https://ui.adsabs.harvard.edu/abs/2022ApJ...926..120V/abstract), where the authors improve their results by combining the ExoMiner scores with the prior probabilities computed by Armstrong et al. 2021 using the method described by Bryson & Morton 2017. Having looked at those papers, I thought I might also improve my neural network results by combining them with the TRICERATOPS probabilities. However, I'm not sure whether that would be correct and/or what the proper way to choose P(s=1 | TRICERATOPS) and P(s=0 | TRICERATOPS) would be.
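
Concretely, the naive combination I have in mind is something like the sketch below: treat the TRICERATOPS planet probability as a prior and fold in the network score as a likelihood ratio via odds multiplication. This is purely my own guess at a Bryson & Morton-style combination, not an API of either tool.

def combine(p_triceratops: float, p_network: float) -> float:
    # Convert both probabilities to odds, multiply, and convert back.
    prior_odds = p_triceratops / (1.0 - p_triceratops)
    likelihood_ratio = p_network / (1.0 - p_network)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

print(combine(0.9, 0.8))   # ~0.973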

Do you have any suggestions or want to chat more about it?

Kind regards and thanks in advance.
Martín.

v1.0.8 returns empty probs when Ms < 0.1

Hi, after version 1.0.8 was released (M_s < 0.1 is now allowed), execution still works for "standard" scenarios. However, even though it now also runs when some stars lack mass or radius info, the final probability results are empty. I'm attaching a Jupyter notebook together with the light curve it uses to demonstrate the issue.

example.zip

Kind regards and thank you in advance.

[Bug] HTTP Error 404: Not Found

This error appears at random when my machine is in the middle of FPP and/or NFPP calculations, usually after 2 or more hours of running. Please fix it.

---------------------------------------------------------------------------
HTTPError                                 Traceback (most recent call last)
File <timed exec>:4

File ~/anaconda3/lib/python3.11/site-packages/triceratops/triceratops.py:706, in target.calc_probs(self, time, flux_0, flux_err_0, P_orb, contrast_curve_file, filt, N, parallel, drop_scenario, verbose, flatpriors, exptime, nsamples, molusc_file)
    704 if self.trilegal_fname is None: 
    705     output_url = self.trilegal_url
--> 706     trilegal_fname = save_trilegal(output_url, self.ID)
    707 else:
    708     trilegal_fname = self.trilegal_fname

File ~/anaconda3/lib/python3.11/site-packages/triceratops/funcs.py:306, in save_trilegal(output_url, ID)
    304 else:
    305     for i in range(1000):
--> 306         last = read_csv(output_url, header=None)[-1:]
    307         if last.values[0, 0] != "#TRILEGAL normally terminated":
    308             print("...")

File ~/anaconda3/lib/python3.11/site-packages/pandas/io/parsers/readers.py:1024, in read_csv(filepath_or_buffer, sep, delimiter, header, names, index_col, usecols, dtype, engine, converters, true_values, false_values, skipinitialspace, skiprows, skipfooter, nrows, na_values, keep_default_na, na_filter, verbose, skip_blank_lines, parse_dates, infer_datetime_format, keep_date_col, date_parser, date_format, dayfirst, cache_dates, iterator, chunksize, compression, thousands, decimal, lineterminator, quotechar, quoting, doublequote, escapechar, comment, encoding, encoding_errors, dialect, on_bad_lines, delim_whitespace, low_memory, memory_map, float_precision, storage_options, dtype_backend)
   1011 kwds_defaults = _refine_defaults_read(
   1012     dialect,
   1013     delimiter,
   (...)
   1020     dtype_backend=dtype_backend,
   1021 )
   1022 kwds.update(kwds_defaults)
-> 1024 return _read(filepath_or_buffer, kwds)

File ~/anaconda3/lib/python3.11/site-packages/pandas/io/parsers/readers.py:618, in _read(filepath_or_buffer, kwds)
    615 _validate_names(kwds.get("names", None))
    617 # Create the parser.
--> 618 parser = TextFileReader(filepath_or_buffer, **kwds)
    620 if chunksize or iterator:
    621     return parser

File ~/anaconda3/lib/python3.11/site-packages/pandas/io/parsers/readers.py:1618, in TextFileReader.__init__(self, f, engine, **kwds)
   1615     self.options["has_index_names"] = kwds["has_index_names"]
   1617 self.handles: IOHandles | None = None
-> 1618 self._engine = self._make_engine(f, self.engine)

File ~/anaconda3/lib/python3.11/site-packages/pandas/io/parsers/readers.py:1878, in TextFileReader._make_engine(self, f, engine)
   1876     if "b" not in mode:
   1877         mode += "b"
-> 1878 self.handles = get_handle(
   1879     f,
   1880     mode,
   1881     encoding=self.options.get("encoding", None),
   1882     compression=self.options.get("compression", None),
   1883     memory_map=self.options.get("memory_map", False),
   1884     is_text=is_text,
   1885     errors=self.options.get("encoding_errors", "strict"),
   1886     storage_options=self.options.get("storage_options", None),
   1887 )
   1888 assert self.handles is not None
   1889 f = self.handles.handle

File ~/anaconda3/lib/python3.11/site-packages/pandas/io/common.py:728, in get_handle(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)
    725     codecs.lookup_error(errors)
    727 # open URLs
--> 728 ioargs = _get_filepath_or_buffer(
    729     path_or_buf,
    730     encoding=encoding,
    731     compression=compression,
    732     mode=mode,
    733     storage_options=storage_options,
    734 )
    736 handle = ioargs.filepath_or_buffer
    737 handles: list[BaseBuffer]

File ~/anaconda3/lib/python3.11/site-packages/pandas/io/common.py:384, in _get_filepath_or_buffer(filepath_or_buffer, encoding, compression, mode, storage_options)
    382 # assuming storage_options is to be interpreted as headers
    383 req_info = urllib.request.Request(filepath_or_buffer, headers=storage_options)
--> 384 with urlopen(req_info) as req:
    385     content_encoding = req.headers.get("Content-Encoding", None)
    386     if content_encoding == "gzip":
    387         # Override compression based on Content-Encoding header

File ~/anaconda3/lib/python3.11/site-packages/pandas/io/common.py:289, in urlopen(*args, **kwargs)
    283 """
    284 Lazy-import wrapper for stdlib urlopen, as that imports a big chunk of
    285 the stdlib.
    286 """
    287 import urllib.request
--> 289 return urllib.request.urlopen(*args, **kwargs)

File ~/anaconda3/lib/python3.11/urllib/request.py:216, in urlopen(url, data, timeout, cafile, capath, cadefault, context)
    214 else:
    215     opener = _opener
--> 216 return opener.open(url, data, timeout)

File ~/anaconda3/lib/python3.11/urllib/request.py:525, in OpenerDirector.open(self, fullurl, data, timeout)
    523 for processor in self.process_response.get(protocol, []):
    524     meth = getattr(processor, meth_name)
--> 525     response = meth(req, response)
    527 return response

File ~/anaconda3/lib/python3.11/urllib/request.py:634, in HTTPErrorProcessor.http_response(self, request, response)
    631 # According to RFC 2616, "2xx" code indicates that the client's
    632 # request was successfully received, understood, and accepted.
    633 if not (200 <= code < 300):
--> 634     response = self.parent.error(
    635         'http', request, response, code, msg, hdrs)
    637 return response

File ~/anaconda3/lib/python3.11/urllib/request.py:563, in OpenerDirector.error(self, proto, *args)
    561 if http_err:
    562     args = (dict, 'default', 'http_error_default') + orig_args
--> 563     return self._call_chain(*args)

File ~/anaconda3/lib/python3.11/urllib/request.py:496, in OpenerDirector._call_chain(self, chain, kind, meth_name, *args)
    494 for handler in handlers:
    495     func = getattr(handler, meth_name)
--> 496     result = func(*args)
    497     if result is not None:
    498         return result

File ~/anaconda3/lib/python3.11/urllib/request.py:643, in HTTPDefaultErrorHandler.http_error_default(self, req, fp, code, msg, hdrs)
    642 def http_error_default(self, req, fp, code, msg, hdrs):
--> 643     raise HTTPError(req.full_url, code, msg, hdrs, fp)

HTTPError: HTTP Error 404: Not Found

HTTP connection issue

When trying to create a target object I am getting the following HTTP connection error:

requests.exceptions.ConnectionError: HTTPConnectionPool(host='stev.oapd.inaf.it', port=80): Max retries exceeded with url: /cgi-bin/trilegal_1.6 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f5ffb4bdf50>: Failed to establish a new connection: [Errno 113] No route to host'))

[Bug] Error when kepler target not in catalog

Hello, for some reason some KIC targets are not in the Vizier catalog J/ApJS/229/30/catalog used by TRICERATOPS, and this causes an error that aborts the execution of the code, e.g. KIC 2569583. As this catalog and the K2 ones are only used to retrieve ra and dec, I'd like to ask for the option to supply both values in the TRICERATOPS target constructor. That way we could skip those catalogs if we already know the celestial coordinates.
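
Concretely, the sort of constructor call I have in mind is sketched below; the ra and dec keywords are a hypothetical proposal, not part of the current API, and the coordinate values are placeholders.

import triceratops.triceratops as tr

ra_deg, dec_deg = 290.123, 37.456     # placeholder coordinates of the target, in degrees
target = tr.target(ID=2569583, mission="Kepler",
                   sectors=[1],                    # placeholder sectors/quarters value
                   ra=ra_deg, dec=dec_deg)         # proposed keywords (hypothetical)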

Regards.

NaN FPP for Kepler target

Hi, I was trying the same Kepler target from the official example (using v1.0.9), but with a differently processed light curve and apertures. I'm attaching a compressed example. I'm binning the curve to only 100 bins, so that should not be the problem. The FPP is NaN and many scenarios lack probabilities.

example_kepler.zip

In addition, I was looking into the code and wanted to understand the change you made in this line (line 140) of marginal_likelihoods:
Z = np.mean(np.exp(lnL + lnprior_Mstar + lnprior_Porb + 709))

What does the 709 value mean?
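
For what it's worth, my own guess (which I'd like confirmed) is that 709 is roughly the largest argument np.exp can take before overflowing in float64, so the shift looks like a guard against the exponentials underflowing to zero when the log-likelihoods are very negative. A quick check of the constant:

import numpy as np

print(np.log(np.finfo(np.float64).max))   # ~709.78, the overflow threshold of np.exp
print(np.exp(709.0), np.exp(710.0))       # finite, then inf (with an overflow warning)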

Kind regards.

M_s >= 0.1 assertion interrupting execution

Hi, I was trying to run a validation of a given TESS target. When TRICERATOPS analyzes the nearby stars, an exception is thrown and the validation fails. The traceback is as follows:

Traceback (most recent call last):
  File "/home/martin/git_repositories/sherlockpipe/sherlockpipe/vet.py", line 328, in vetting
    result_dir, ra1, dec1 = self.vetting_validation(cpus, indir, tic, sectors, lc_file, transit_depth, period,
  File "/home/martin/git_repositories/sherlockpipe/sherlockpipe/vet.py", line 463, in vetting_validation
    validation_results = pool.map(validator.validate, input_n_times)
  File "/usr/lib/python3.8/multiprocessing/pool.py", line 364, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/usr/lib/python3.8/multiprocessing/pool.py", line 768, in get
    raise self._value
  File "/usr/lib/python3.8/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/usr/lib/python3.8/multiprocessing/pool.py", line 48, in mapstar
    return list(map(*args))
  File "/home/martin/git_repositories/sherlockpipe/sherlockpipe/vet.py", line 772, in validate
    input.target.calc_probs(time=input.time, flux_0=input.flux, flux_err_0=input.sigma, P_orb=input.period)
  File "/home/martin/.local/lib/python3.8/site-packages/triceratops/triceratops.py", line 916, in calc_probs
    res, res_twin = lnZ_TEB(
  File "/home/martin/.local/lib/python3.8/site-packages/triceratops/marginal_likelihoods.py", line 174, in lnZ_TEB
    qs = sample_q(np.random.rand(N), M_s)
  File "/home/martin/.local/lib/python3.8/site-packages/triceratops/priors.py", line 172, in sample_q
    assert M_s >= 0.1
AssertionError

Is this done by definition for any known reason? Could TRICERATOPS just ignore the problematic star and still perform a proper validation? Or should we not trust a validation that needs to ignore some of the closest nearby stars? Can a validation be done for such objects?

It also happened that we wanted to analyze a target with M_s < 0.1 and found the same error, so this is a fairly common issue for us.

Kind regards,
Martín.

[Question] Support for Kepler data

Hello. I wanted to ask about the complexity of adding support for Kepler and K2 data to triceratops. Is the implementation tightly coupled to the TESS data and APIs, or would it be more or less straightforward to support those older missions? What changes would be needed, if this is possible?

Kind regards.

Huge variations in FPP

I was running this code on TOI 1307, and for every iteration I got a different value of FPP, ranging from 0.99 down to the order of 10^-12; all the resulting values are listed below.

• TOI 1307
• Sector 47
• Contrast curve file: 562nm available on EXOFOP
• Band: Vis

FPP, NFPP
4.70E-11, 4.70E-11
7.84E-12, 3.87E-31
0.01158, 2.33E-16
4.14E-12, 9.93E-32
0.923719, 3.95E-21
0.99999, 6.44E-25
0.98819, 2.60E-16
0.00113, 1.08E-28
6.72E-08, 6.95E-22
0.99993, 5.30E-27

Is this due to an algorithm working incorrectly? What are the possible solutions?

*I'm using triceratops version 1.0.15.

Allow using different download_dir for lightkurve

Hi, since the recent addition of Lightkurve support for Kepler and K2, I've noticed that TRICERATOPS now stores the Lightkurve FITS-file caches in the default home directory. I'd like to propose a pull request in which the triceratops target constructor receives a lightkurve_cache_dir that is passed to the Lightkurve download_all methods. That way the place where the caches are loaded from and stored becomes customizable.
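
Concretely, the change I have in mind is roughly the sketch below. lightkurve_cache_dir is the proposed keyword and _download_tpfs is a hypothetical helper standing in for wherever TRICERATOPS calls Lightkurve; download_all itself already accepts a download_dir argument.

import lightkurve as lk

def _download_tpfs(kic_id, lightkurve_cache_dir=None):
    # Forward the user-supplied cache directory to Lightkurve instead of its default location.
    search = lk.search_targetpixelfile(f"KIC {kic_id}", mission="Kepler")
    return search.download_all(download_dir=lightkurve_cache_dir)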

Is there any other place in Triceratops where Lightkurve (or Eleanor, for instance) is used?

If this is OK with you, I will do it.

Regards.

FPP is NaN and NFPP is 0.0

Hi, I'm getting NaN values for FPP and empty probs for all the scenarios. I document my case in this file:
example.zip

Am I doing anything wrong? As far as I knew, I needed to bin the curve because too many points caused the algorithm to return these NaN and 0.0 values. However, now I'm binning the curve to only 250 points and still getting this result.

Thanks in advance for your help.

[Bug] Triceratops times out

Hi, I'm having trouble when using TRICERATOPS for a TESS target containing one or more sectors of data with 200 s cadence. This is consistently happening for several targets under similar conditions. Can you take a look at this?

2024-04-01 14:57:00 INFO     Acquiring triceratops target
Traceback (most recent call last):
  File "/home/martin/workspace/ph/SHERLOCK/sherlockpipe/validation/validator.py", line 73, in validate
    self.execute_triceratops(cpus, self.data_dir, object_id, sectors, lc_file, transit_depth,
  File "/home/martin/workspace/ph/SHERLOCK/sherlockpipe/validation/validator.py", line 200, in execute_triceratops
    target = tr.target(ID=id_int, mission=mission, sectors=sectors)
  File "/home/martin/anaconda3/envs/sherlockpipe/lib/python3.10/site-packages/triceratops/triceratops.py", line 136, in __init__
    cutout_hdu = Tesscut.get_cutouts(
  File "/home/martin/anaconda3/envs/sherlockpipe/lib/python3.10/site-packages/astroquery/mast/cutouts.py", line 399, in get_cutouts
    response = self._service_api_connection.service_request_async("astrocut", param_dict)
  File "/home/martin/anaconda3/envs/sherlockpipe/lib/python3.10/site-packages/astroquery/utils/class_or_instance.py", line 25, in f
    return self.fn(obj, *args, **kwds)
  File "/home/martin/anaconda3/envs/sherlockpipe/lib/python3.10/site-packages/astroquery/mast/services.py", line 293, in service_request_async
    response = self._request('POST', request_url, data=catalogs_request, headers=headers, use_json=use_json)
  File "/home/martin/anaconda3/envs/sherlockpipe/lib/python3.10/site-packages/astroquery/mast/services.py", line 192, in _request
    raise TimeoutError("Timeout limit of {} exceeded.".format(self.TIMEOUT))
astroquery.exceptions.TimeoutError: Timeout limit of 600 exceeded.

Kind regards.

progress bar?

Thoughts on adding a progress bar to the code (e.g., with tqdm)? It might be helpful when calculating scenario probabilities! I'd be happy to submit a PR if so.
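
For concreteness, the kind of thing I have in mind is sketched below; the scenario labels and evaluate_scenario are stand-ins for whatever loop calc_probs runs internally, not real triceratops names.

from tqdm import tqdm

def evaluate_scenario(name):
    pass   # stand-in for the real per-scenario marginal-likelihood calculation

scenarios = ["scenario_1", "scenario_2", "scenario_3"]   # placeholder labels
for scenario in tqdm(scenarios, desc="Calculating scenario probabilities"):
    evaluate_scenario(scenario)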
