hpparvi / ldtk

Python toolkit for calculating stellar limb darkening profiles and model-specific coefficients using the stellar atmosphere spectrum library by Husser et al. (2013). Described in Parviainen & Aigrain, MNRAS 453, 3821–3826 (2015).

License: GNU General Public License v2.0

Languages: Jupyter Notebook 91.63%, Python 8.37%
astronomy astrophysics exoplanet-transits exoplanets limb-darkening-models limb-darkening-profiles python

ldtk's People

Contributors

astronasko, haykirstin, hpparvi, iancrossfield, jpl-jengelke, tamimfatahi, tomlouden


ldtk's Issues

Exception when trying to reconcile cached files during LDPSetCreator init

File "c:\users\pablo\appdata\local\programs\python\python39\lib\site-packages\exotic\api\gael_ld.py", line 97, in createldgrid
    sc = LDPSetCreator(teff=(tstar, terr), logg=(loggstar, loggerr),
  File "c:\users\pablo\appdata\local\programs\python\python39\lib\site-packages\ldtk\ldtk.py", line 425, in __init__
    with pf.open(self.files[0]) as hdul:
IndexError: list index out of range

Client.__init__() (line 415) calls self.set_limits(), which in certain cases leaves client.files empty. The property client.local_filenames then returns an empty list, and the call to pf.open(self.files[0]) in LDPSetCreator.__init__() raises an IndexError because it indexes into that empty list.

I don't have much more information on why the set of files is empty in the first place, but perhaps we can get a use case to help troubleshoot this. It is currently being exposed by the EXOTIC product in certain very limited circumstances.
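
For illustration, a minimal sketch of the kind of guard that would surface the problem more clearly (a hypothetical helper, not the library's actual code):

def first_spectrum_file(files):
    """Return the first cached spectrum file, failing with a clear message
    when the client finds no files for the requested teff/logg/z limits."""
    if not files:
        raise ValueError(
            "No stellar spectrum files match the requested teff/logg/z limits; "
            "check the stellar parameters or clear the LDTk cache and retry."
        )
    return files[0]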

Inconsistent stellar limb-darkening profiles

Hi! After a recent update of the cached files, I get inconsistent results when retrieving the non-linear limb-darkening law (I removed the cache and downloaded it again). I'm comparing the intensity differences of the profiles for three temperatures (4250 K, 5800 K, 7500 K). A few months ago the cumulative difference was at most 2.5%, and now it is more than 10x larger. Curiously, for the third temperature nothing changed. I'm using Python 3.8 to run the following code:

import os
from ldtk import LDPSetCreator, BoxcarFilter

# parvar[k], sig_steff, slogg, sig_slogg, sz and sig_sz hold the stellar parameters
filters = [BoxcarFilter("w", 380, 800)]
cache_dir = os.path.join(os.path.abspath(os.path.join("yourpath", os.pardir)), "cache")
sc = LDPSetCreator(teff=(parvar[k], sig_steff), logg=(slogg, sig_slogg), z=(sz, sig_sz),
                   filters=filters, cache=cache_dir)
ps = sc.create_profiles(nsamples=1000)
ldcn, qe = ps.coeffs_qd(do_mc=True)
ldcl, qel = ps.coeffs_ln(do_mc=True)
ldcnl, qenl = ps.coeffs_nl(do_mc=True)

Limb-darkening coefficients:

4250 K
  linear:     [[0.9454834]]
  quadratic:  [[0.64804808 0.3428646 ]]
  non-linear: [[-0.70811569 5.80458496 -7.32818848 3.22719344]]

5800 K
  linear:     [[0.97699707]]
  quadratic:  [[0.31915583 0.68739933]]
  non-linear: [[-1.02702287 7.47210631 -9.45970715 3.98864554]]

7500 K
  linear:     [[0.58144531]]
  quadratic:  [[0.50178719 0.12041476]]
  non-linear: [[-0.11783424 1.00176812 -0.1979341 -0.0744706 ]]

Best regards,
Eduardo Cristo.

Working past 2 microns

ldtk appears to fail when trying to estimate limb-darkening parameters past 2-3 microns. I assumed this was actually an issue with the Husser models, but the version I have goes out to 5.5 microns, suggesting the issue is with ldtk.
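
A minimal reproduction sketch under stated assumptions (a roughly solar star; BoxcarFilter wavelengths in nanometres, as in the examples below):

from ldtk import LDPSetCreator, BoxcarFilter

# a passband beyond 2 microns (2000-3000 nm), where the failure is reported
flt = BoxcarFilter('ir', 2000, 3000)
sc = LDPSetCreator(teff=(5800, 100), logg=(4.4, 0.1), z=(0.0, 0.05), filters=[flt])
ps = sc.create_profiles(nsamples=500)
qc, qe = ps.coeffs_qd()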

"SVD did not converge" for lambda > 2.6 microns

I get the following error when I compute quadratic limb-darkening coefficients for lambda > 2.6 microns.
I used BoxcarFilter('k', 2600, 2800).

File "/Users/X/Programs/LDTK/limb.py", line 21, in
cq,eq = ps.coeffs_qd(do_mc=True) # Estimate quadratic law coefficients
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/X/Programs/miniconda3/lib/python3.11/site-packages/ldtk/ldtk.py", line 285, in _coeffs
pos_t = multivariate_normal(chain[i - 1], diag(s ** 2))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "mtrand.pyx", line 4219, in numpy.random.mtrand.RandomState.multivariate_normal
File "<array_function internals>", line 200, in svd
File "/Users/X/Programs/miniconda3/lib/python3.11/site-packages/numpy/linalg/linalg.py", line 1642, in svd
u, s, vh = gufunc(a, signature=signature, extobj=extobj)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/X/Programs/miniconda3/lib/python3.11/site-packages/numpy/linalg/linalg.py", line 98, in _raise_linalgerror_svd_nonconvergence
raise LinAlgError("SVD did not converge")
numpy.linalg.LinAlgError: SVD did not converge

Core.py not platform independent

Line 48 in core.py fails when running on Windows with an error as follows:

  File "c:\users\richa\pycharmprojects\exotic\venv\lib\site-packages\ldtk\__init__.py", line 21, in <module>
    from .ldtk import LDPSetCreator, LDPSet, load_ldpset
  File "c:\users\richa\pycharmprojects\exotic\venv\lib\site-packages\ldtk\ldtk.py", line 33, in <module>
    from .client import Client
  File "c:\users\richa\pycharmprojects\exotic\venv\lib\site-packages\ldtk\client.py", line 32, in <module>
    from .core import ldtk_root, FN_TEMPLATE, TEFF_POINTS, LOGG_POINTS, Z_POINTS, is_inside, SpecIntFile, message
  File "c:\users\richa\pycharmprojects\exotic\venv\lib\site-packages\ldtk\core.py", line 48, in <module>
    ldtk_root  = os.getenv('LDTK_ROOT') or join(os.getenv('HOME'),'.ldtk')
  File "C:\Users\richa\AppData\Local\Programs\Python\Python37\lib\ntpath.py", line 76, in join
    path = os.fspath(path)
TypeError: expected str, bytes or os.PathLike object, not NoneType

The relevant lines in core.py are:

48 ldtk_root  = os.getenv('LDTK_ROOT') or join(os.getenv('HOME'),'.ldtk')
49 if not exists(ldtk_root):
50    os.mkdir(ldtk_root)

I believe the immediate error on Windows is that the environment variable HOME is not defined on this platform. Since the os.getenv call has no default value, it returns None rather than a string, and None cannot be passed to os.path.join.
I suggest the answer may be to use os.path.expanduser('~') to derive the platform-independent home directory in line 48, or possibly Path.home() on Python 3.5+, and then to use os.path.join to build the path itself.
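
A minimal sketch of what the suggested change could look like (an assumption, not the project's actual patch):

import os
from os.path import join, exists
from pathlib import Path

# fall back to the user's home directory in a platform-independent way
ldtk_root = os.getenv('LDTK_ROOT') or join(str(Path.home()), '.ldtk')
if not exists(ldtk_root):
    os.mkdir(ldtk_root)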

EOFError: Python 3.5 compatibility?

Hi,

First of all, thank you for this limb-darkening coefficient prior calculator tool!

I am currently using Python 3.6 and I have run into what seems to be a compatibility problem. I tried running the example given here: https://github.com/hpparvi/ldtk, but ran into the following error, which I have not yet been able to resolve:


EOFError Traceback (most recent call last)
in ()
8 logg=(4.50, 0.20), # downloads the uncached stellar
9 z=(0.25, 0.05), # spectra from the Husser et al.
---> 10 filters=filters) # FTP server automatically.
11
12 ps = sc.create_profiles() # Create the limb darkening profiles

/Users/ldang/anaconda/envs/python3/lib/python3.6/site-packages/ldtk/ldtk.py in __init__(self, teff, logg, z, filters, qe, limits, offline_mode, force_download, verbose, cache)
297 print("Fe/H limits: " + str(metal_lims))
298
--> 299 self.client = c = Client(limits=[teff_lims, logg_lims, metal_lims], cache=cache)
300 self.files = self.client.local_filenames
301 self.filters = filters

/Users/ldang/anaconda/envs/python3/lib/python3.6/site-packages/ldtk/client.py in __init__(self, limits, verbosity, offline_mode, update_server_file_list, cache)
38 os.mkdir(self._cache)
39
---> 40 if exists(self._server_file_list) and not update_server_file_list:
41 with open(self._server_file_list, 'rb') as fin:
42 self.files_in_server = load(fin)

EOFError: Ran out of input

Any idea?
Thank you!
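
A hedged workaround sketch, assuming the EOFError comes from an empty or truncated pickled server file list in the LDTk cache (default location assumed to be $LDTK_ROOT or ~/.ldtk): removing zero-byte files from the cache forces them to be regenerated on the next run.

import os
from pathlib import Path

cache = Path(os.getenv('LDTK_ROOT', Path.home() / '.ldtk'))
for f in cache.glob('*'):
    if f.is_file() and f.stat().st_size == 0:
        print(f'removing empty cache file: {f}')
        f.unlink()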

ps.coeffs_qd() leads to numba error

Hi Hannu,

Following the basics notebook example, I run into an error when computing the limb darkening coefficients which I am not able to figure out.

numba v. 0.45.1

The snippet below produces the following error (with or without using MCMC):

from ldtk import LDPSetCreator
from ldtk.filters import tess

sc = LDPSetCreator(teff=(6500, 150), logg=(4.1, 0.09), z=(0,0.1), filters=[tess]) 

ps = sc.create_profiles(nsamples=2000) 
ps.resample_linear_z(300) 

qc, qe = ps.coeffs_qd()
---------------------------------------------------------------------------
TypingError                               Traceback (most recent call last)
<ipython-input-49-534425f12e49> in <module>
----> 1 qc, qe = ps.coeffs_qd()

~/Software/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in _coeffs(self, return_cm, do_mc, n_mc_samples, mc_thin, mc_burn, ldmodel, ngc)
    238         """
    239         npar = ldmodel.npar or ngc
--> 240         qcs = [fmin(lambda pv: -self._lnlike(pv, flt=iflt, ldmodel=ldmodel), 0.1 * ones(npar), disp=0) for iflt in range(self._nfilters)]
    241         covs = []
    242         for iflt, qc in enumerate(qcs):

~/Software/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in <listcomp>(.0)
    238         """
    239         npar = ldmodel.npar or ngc
--> 240         qcs = [fmin(lambda pv: -self._lnlike(pv, flt=iflt, ldmodel=ldmodel), 0.1 * ones(npar), disp=0) for iflt in range(self._nfilters)]
    241         covs = []
    242         for iflt, qc in enumerate(qcs):

~/Software/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py in fmin(func, x0, args, xtol, ftol, maxiter, maxfun, full_output, disp, retall, callback, initial_simplex)
    440             'initial_simplex': initial_simplex}
    441 
--> 442     res = _minimize_neldermead(func, x0, args, callback=callback, **opts)
    443     if full_output:
    444         retlist = res['x'], res['fun'], res['nit'], res['nfev'], res['status']

~/Software/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py in _minimize_neldermead(func, x0, args, callback, maxiter, maxfev, disp, return_all, initial_simplex, xatol, fatol, adaptive, **unknown_options)
    583 
    584     for k in range(N + 1):
--> 585         fsim[k] = func(sim[k])
    586 
    587     ind = numpy.argsort(fsim)

~/Software/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py in function_wrapper(*wrapper_args)
    324     def function_wrapper(*wrapper_args):
    325         ncalls[0] += 1
--> 326         return function(*(wrapper_args + args))
    327 
    328     return ncalls, function_wrapper

~/Software/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in <lambda>(pv)
    238         """
    239         npar = ldmodel.npar or ngc
--> 240         qcs = [fmin(lambda pv: -self._lnlike(pv, flt=iflt, ldmodel=ldmodel), 0.1 * ones(npar), disp=0) for iflt in range(self._nfilters)]
    241         covs = []
    242         for iflt, qc in enumerate(qcs):

~/Software/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in _lnlike(self, ldcs, joint, flt, ldmodel)
    289                 'Need to give the filter id `flt` if evaluating a single set of coefficients with multiple filters defined.')
    290 
--> 291         m = ldmodel.evaluate(self._mu, ldcs)
    292 
    293         if flt is not None:

~/Software/anaconda3/lib/python3.7/site-packages/ldtk/ldmodel.py in evaluate(cls, mu, pv)
    138     @classmethod
    139     def evaluate(cls, mu: ndarray, pv: ndarray) -> ndarray:
--> 140         return evaluate_ld(ld_quadratic, mu, pv)
    141 
    142 

~/Software/anaconda3/lib/python3.7/site-packages/numba/dispatcher.py in _compile_for_args(self, *args, **kws)
    374                 e.patch_message(msg)
    375 
--> 376             error_rewrite(e, 'typing')
    377         except errors.UnsupportedError as e:
    378             # Something unsupported is present in the user code, add help info

~/Software/anaconda3/lib/python3.7/site-packages/numba/dispatcher.py in error_rewrite(e, issue_type)
    341                 raise e
    342             else:
--> 343                 reraise(type(e), e, None)
    344 
    345         argtypes = []

~/Software/anaconda3/lib/python3.7/site-packages/numba/six.py in reraise(tp, value, tb)
    656             value = tp()
    657         if value.__traceback__ is not tb:
--> 658             raise value.with_traceback(tb)
    659         raise value
    660 

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
Internal error at <numba.typeinfer.StaticGetItemConstraint object at 0x134af19d0>.
tuple index out of range
[1] During: typing of static-get-item at /Users/vxh710/Software/anaconda3/lib/python3.7/site-packages/ldtk/ldmodel.py (33)
Enable logging at debug level for details.

File "../anaconda3/lib/python3.7/site-packages/ldtk/ldmodel.py", line 33:
def evaluate_ld(ldm, mu, pvo):
    <source elided>
    elif pvo.ndim == 2:
        pv = pvo.reshape((1, pvo.shape[0], pvo.shape[1]))
        ^

This is not usually a problem with Numba itself but instead often caused by
the use of unsupported features or an issue in resolving types.

To see Python/NumPy features supported by the latest release of Numba visit:
http://numba.pydata.org/numba-doc/latest/reference/pysupported.html
and
http://numba.pydata.org/numba-doc/latest/reference/numpysupported.html

For more information about typing errors and how to debug them visit:
http://numba.pydata.org/numba-doc/latest/user/troubleshoot.html#my-code-doesn-t-compile

If you think your code should work with Numba, please report the error message
and traceback, along with a minimal reproducer at:
https://github.com/numba/numba/issues/new

Do you have an idea what might cause this?

Pandas not listed in requirements

After installing ldtk in a fresh environment and trying to run the example from the readme:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[5], line 1
----> 1 from ldtk import LDPSetCreator, BoxcarFilter
      3 filters = [BoxcarFilter('a', 450, 550),  # Define your passbands
      4            BoxcarFilter('b', 650, 750),  # - Boxcar filters useful in
      5            BoxcarFilter('c', 850, 950)]  #   transmission spectroscopy
      7 sc = LDPSetCreator(teff=(6400,   50),    # Define your star, and the code
      8                    logg=(4.50, 0.20),    # downloads the uncached stellar
      9                       z=(0.25, 0.05),    # spectra from the Husser et al.
     10                      filters=filters)    # FTP server automatically.

File c:\Users\[user]\Anaconda3\envs\ldtk\lib\site-packages\ldtk\__init__.py:22
     20 from .version import __version__
     21 from .ldtk import LDPSetCreator, LDPSet, load_ldpset
---> 22 from .filters import BoxcarFilter, SVOFilter, TabulatedFilter, DeltaFilter, sdss_g, sdss_r, sdss_i, sdss_z, kepler, tess

File c:\Users\[user]\Anaconda3\envs\ldtk\lib\site-packages\ldtk\filters.py:21
      1 """
      2 Limb darkening toolkit
      3 Copyright (C) 2015  Hannu Parviainen <[email protected]>
   (...)
     17 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
     18 """
     19 from typing import Optional
...
---> 21 import pandas as pd
     23 from pathlib import Path
     25 from matplotlib.pyplot import subplots, setp

ModuleNotFoundError: No module named 'pandas'
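
A hedged workaround until pandas is listed as a dependency: install it manually into the same environment before importing ldtk, for example

import subprocess, sys

# install pandas into the environment that will import ldtk (a workaround, not a fix)
subprocess.check_call([sys.executable, "-m", "pip", "install", "pandas"])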

No module named 'semantic_version'

Hi Hannu.

I'm having a problem I'm hoping you could help me with. I installed ldtk v1.4.1 on macOS Mojave 10.14.3. My Python version is 3.7.4.

When I do import ldtk I get the error message

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-6-7b498706d820> in <module>
----> 1 import ldtk

~/Software/anaconda3/lib/python3.7/site-packages/ldtk/__init__.py in <module>
     18 """
     19 
---> 20 from .version import __version__
     21 from .ldtk import LDPSetCreator, LDPSet, load_ldpset
     22 from .filters import BoxcarFilter, TabulatedFilter, sdss_g, sdss_r, sdss_i, sdss_z, kepler, tess

~/Software/anaconda3/lib/python3.7/site-packages/ldtk/version.py in <module>
     18 """
     19 
---> 20 from semantic_version import Version
     21 
     22 __version__ = Version('1.4.1')

ModuleNotFoundError: No module named 'semantic_version'

Do you know what may cause this?

ValueError: array must not contain infs or NaNs

Hello, Hannu.
When I executed this script, I got a ValueError.

My environment:
Ubuntu 18.04
conda 4.8.3
Python 3.6
ldtk 1.0 (PyPI)

from IPython.display import display, Latex
from ldtk import LDPSetCreator, BoxcarFilter, TabulatedFilter
from ldtk.filters import sdss_g, sdss_r, sdss_i, sdss_z

# Input stellar information
sc = LDPSetCreator(teff=(5196, 24), logg=(4.45, 0.01), z=(0.31, 0.04),
                   filters=[sdss_g, sdss_r, sdss_i, sdss_z])

ps = sc.create_profiles(nsamples=2000)
ps.resample_linear_z(300)
qc, qe = ps.coeffs_qd(do_mc=True, n_mc_samples=10000)
for i, (c, e) in enumerate(zip(qc, qe)):
    display(Latex(r'u$_{i:d} = {c[0]:5.4f} \pm {e[0]:5.4f}\quad$'
                  r'v$_{i:d} = {c[1]:5.4f} \pm {e[1]:5.4f}$'.format(i=i+1, c=c, e=e)))

and I got the error below.

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-37-35e0573d3d0a> in <module>
     13 ps = sc.create_profiles(nsamples=2000)
     14 ps.resample_linear_z(300)
---> 15 qc,qe = ps.coeffs_qd(do_mc=True, n_mc_samples=10000)
     16 for i,(c,e) in enumerate(zip(qc,qe)):
     17     display(Latex('u$_{i:d} = {c[0]:5.4f} \pm {e[0]:5.4f}\quad$'

~/anaconda3/envs/py35/lib/python3.6/site-packages/LDTk-1.0-py3.6.egg/ldtk/ldtk.py in _coeffs(self, return_cm, do_mc, n_mc_samples, mc_thin, mc_burn, ldmodel, ngc)
    230 
    231                 for i in range(1,n_mc_samples):
--> 232                     pos_t  = multivariate_normal(chain[i-1], diag(s**2))
    233                     logl_t = self._lnlike(pos_t, flt=iflt, ldmodel=ldmodel)
    234                     if uniform() < exp(logl_t-logl[i-1]):

mtrand.pyx in mtrand.RandomState.multivariate_normal()

~/.local/lib/python3.6/site-packages/scipy/linalg/decomp_svd.py in svd(a, full_matrices, compute_uv, overwrite_a, check_finite, lapack_driver)
    107 
    108     """
--> 109     a1 = _asarray_validated(a, check_finite=check_finite)
    110     if len(a1.shape) != 2:
    111         raise ValueError('expected matrix')

~/.local/lib/python3.6/site-packages/scipy/_lib/_util.py in _asarray_validated(a, check_finite, sparse_ok, objects_ok, mask_ok, as_inexact)
    244             raise ValueError('masked arrays are not supported')
    245     toarray = np.asarray_chkfinite if check_finite else np.asarray
--> 246     a = toarray(a)
    247     if not objects_ok:
    248         if a.dtype is np.dtype('O'):

~/anaconda3/envs/py35/lib/python3.6/site-packages/numpy/lib/function_base.py in asarray_chkfinite(a, dtype, order)
    496     if a.dtype.char in typecodes['AllFloat'] and not np.isfinite(a).all():
    497         raise ValueError(
--> 498             "array must not contain infs or NaNs")
    499     return a
    500 

ValueError: array must not contain infs or NaNs

Do you have ideas to solve this?

create_profiles() throws "TypeError"

When I follow the examples at https://github.com/hpparvi/ldtk/blob/master/notebooks/01_Example_basics.ipynb (or the other sample notebooks), the code crashes at

ps = sc.create_profiles(nsamples=2000)

which results in TypeError: expected dtype object, got 'numpy.dtype[float64]'

This is PyLDTK v1.7.0, NumPy v1.21.2, and Python v3.7.6, running on Ubuntu 20.04 LTS.

The full error traceback is:

In [4]: ps = sc.create_profiles(nsamples=2000)
...:

TypeError Traceback (most recent call last)
in
----> 1 ps = sc.create_profiles(nsamples=2000)

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in create_profiles(self, nsamples, teff, logg, metal)
484 self.ldp_samples[iflt, :, :] = self.itpsiflt
485
--> 486 return LDPSet(self.filter_names, self.mu, self.ldp_samples)
487
488 @property

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in __init__(self, filters, mu, ldp_samples)
114 self._em = 1.0
115
--> 116 self.fit_limb()
117 self.resample_linear_mu()
118

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in fit_limb(self)
181 mu_new = linspace(self._mu_orig[0], 1, 1500)
182 flux_new = interp1d(self._mu_orig, self._mean_orig.mean(0), 'quadratic')(mu_new)
--> 183 res = minimize(minfun, array([0.05, 0.15, 0.5, 1.5]), (mu_new, flux_new), method='Nelder-Mead')
184 self._limb_minimization = res
185 self.set_limb_mu(res.x[1])

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/_minimize.py in minimize(fun, x0, args, method, jac, hess, hessp, bounds, constraints, tol, callback, options)
610 if meth == 'nelder-mead':
611 return _minimize_neldermead(fun, x0, args, callback, bounds=bounds,
--> 612 **options)
613 elif meth == 'powell':
614 return _minimize_powell(fun, x0, args, callback, bounds, **options)

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py in _minimize_neldermead(func, x0, args, callback, maxiter, maxfev, disp, return_all, initial_simplex, xatol, fatol, adaptive, bounds, **unknown_options)
748
749 for k in range(N + 1):
--> 750 fsim[k] = func(sim[k])
751
752 ind = np.argsort(fsim)

~/anaconda3/lib/python3.7/site-packages/scipy/optimize/optimize.py in function_wrapper(x, *wrapper_args)
462 def function_wrapper(x, *wrapper_args):
463 ncalls[0] += 1
--> 464 return function(np.copy(x), *(wrapper_args + args))
465
466 return ncalls, function_wrapper

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in minfun(x, mu, flux)
178 def fit_limb(self):
179 def minfun(x, mu, flux):
--> 180 return ((flux - ldm_with_edge(mu, x[0], x[1], x[2:])) ** 2).sum()
181 mu_new = linspace(self._mu_orig[0], 1, 1500)
182 flux_new = interp1d(self._mu_orig, self._mean_orig.mean(0), 'quadratic')(mu_new)

~/anaconda3/lib/python3.7/site-packages/ldtk/ldtk.py in ldm_with_edge(mu, e0, e1, ldc, ldm)
78 return full_like(mu, inf)
79 nmu = clip((mu-e1)/(1-e1), 0.0, 1.0)
---> 80 return smootherstep(mu, e0, e1) * ldm(nmu, ldc)
81
82

TypeError: expected dtype object, got 'numpy.dtype[float64]'
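
A hedged note, not a confirmed diagnosis: "expected dtype object, got 'numpy.dtype[float64]'" is the typical symptom of a numba release that predates the installed NumPy (NumPy 1.21.2 here), and upgrading numba in the same environment usually makes it go away. A quick check of the versions involved:

# print the numba and NumPy versions in the environment that raises the error
import numba
import numpy
print("numba", numba.__version__, "numpy", numpy.__version__)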

Windows Run Error

I installed Python using the Windows installation via the Microsoft Store, and I'm getting the following error:

C:\Users\artif\Downloads\EXOTIC-0.44.0\EXOTIC-0.44.0>python
Python 3.9.5 (tags/v3.9.5:0a7dcbd, May  3 2021, 17:27:52) [MSC v.1928 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import ldtk
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\artif\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\ldtk\__init__.py", line 21, in <module>
    from .ldtk import LDPSetCreator, LDPSet, load_ldpset
  File "C:\Users\artif\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\ldtk\ldtk.py", line 33, in <module>
    from .client import Client
  File "C:\Users\artif\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\ldtk\client.py", line 32, in <module>
    from .core import ldtk_root, TEFF_POINTS, LOGG_POINTS, Z_POINTS, is_inside, SpecIntFile, message
  File "C:\Users\artif\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\ldtk\core.py", line 48, in <module>
    ldtk_root  = os.getenv('LDTK_ROOT') or join(os.getenv('HOME'),'.ldtk')
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.9_3.9.1520.0_x64__qbz5n2kfra8p0\lib\ntpath.py", line 78, in join
    path = os.fspath(path)
TypeError: expected str, bytes or os.PathLike object, not NoneType
>>>

According to the interwebs, this might be the fix for this issue (but it will need to be tested on Mac/Unix to ensure compatibility): https://stackoverflow.com/questions/47400620/expected-str-bytes-or-os-pathlike-object-not-list

A2_redifining_stellar_edge.ipynb .core error

Hi @hpparvi - I am trying to run your ldtk example notebook in Python 3.7. When I do so, I receive the attached errors. Also below are the dependencies of my ldtk python install. Could you help me track down what is failing? Thanks in advance, Eric Agol

[Attached screenshots: ldtk_notebook_error, ldtk_dependencies_satisfied]

NotImplementedError integrate

I recently had to reinstall ldtk, so I installed the latest version from pip.
Running my old code, I was surprised to find that with the current version, LDPSetCreator can no longer be run with an SVOFilter. It returns an error saying that integrate is not implemented.

Is this intended behavior or a bug?
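
A hedged workaround sketch, assuming only an approximate passband is needed until SVOFilter integration works again: define a BoxcarFilter covering a similar wavelength range (the values below are illustrative, in nanometres), as in the other examples in this issue list.

from ldtk import LDPSetCreator, BoxcarFilter

# approximate stand-in for the SVO-defined passband
flt = BoxcarFilter('approx_band', 600, 700)
sc = LDPSetCreator(teff=(5500, 100), logg=(4.5, 0.1), z=(0.0, 0.05), filters=[flt])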
