- It might be good somewhere in the documentation -- in the README, the readthedocs, or both -- to give an explicit list of the dependencies the software has, so that an advanced user could pre-generate (or re-use) a conda environment and then simply clone your repository. Since this is a more "advanced" method (as opposed to the very clean `pip install` option you already provide), it could likely be a separate page/section in the readthedocs. This would also have the added bonus of more clearly highlighting the packages you build on. It's a little more complex than your case, but as an example the astropy installation page lists the simple `pip install` option and then has a "Requirements" section further down the page. I think I found `astropy`, `reproject`, `numpy`, `matplotlib`, `sep`, `photutils`, `pandas`, `astroquery`, `pyvo`, `ipywidgets`, `sfdmap`, `extinction`, and `sphinx_rtd_theme` as packages you import (plus Python as a whole), most of which (I think all but `matplotlib` and `sphinx`) are in `requirements.txt`, so the options could mirror those going forward as the project evolves.
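  For the conda route, something like the following `environment.yml` could serve as a starting point -- this is only a sketch using the import list above; the channel, the Python version pin, and the split between conda and pip packages are my assumptions, and I have not checked which of these are actually available on conda-forge:

  ```yaml
  # Hypothetical environment.yml -- names taken from the import list above;
  # versions and channels are placeholders, untested against hostphot.
  name: hostphot-env
  channels:
    - conda-forge
  dependencies:
    - python>=3.8
    - astropy
    - reproject
    - numpy
    - matplotlib
    - sep
    - photutils
    - pandas
    - astroquery
    - pyvo
    - ipywidgets
    - pip
    - pip:
        # possibly pip-only; move up if conda packages exist
        - sfdmap
        - extinction
  ```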
- Along a similar line, you cite some of these packages in your paper but leave others out. I noticed that at least `numpy`, `pandas`, and `matplotlib` have literature citations you could provide, but there may be others, and it would be good to get as many citations as possible into the paper.
- Also for the paper: the checklist mentions a comparison with the state of the field. Are there any other similar software packages that perform this kind of work, or does `hostphot` fill a unique niche? Is there anything you could compare to?
- `requirements.txt` does not list `pytest`, so `pip install hostphot` did not install it, and your README instructions therefore produce an error message quite quickly. Either add a quick line to the README telling readers that if they want to run the test suite they should `pip install pytest`, or add it to `requirements.txt` so it is picked up automatically.
- While your test suite covers one of each kind of process in the software, it only checks whether the code runs from start to finish. Is there any way you could verify the outputs of those tests, to check the consistency and correctness of the results? Otherwise you are at risk of -- as a simple example -- an upstream bug in a `photutils` calculation that, say, accidentally doubles all of the fluxes from `aperture_photometry`, and you would currently have no way of noticing this had happened. You could use either a plain `assert` (e.g. `assert some_calculated_flux == 1.5`, if we knew the galaxy in your test image had a flux of exactly 1.5, unchangingly) or, with `from numpy.testing import assert_allclose`, tests that check arrays agree to within specified relative and absolute tolerances.
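  As a sketch of what I mean -- the function name and the "known" flux values here are hypothetical placeholders, not hostphot's real outputs:

  ```python
  import numpy as np
  from numpy.testing import assert_allclose

  def test_aperture_fluxes():
      # Hypothetical values: in a real test these would come from running
      # hostphot's photometry on a fixed test image with a precomputed answer.
      calculated_fluxes = np.array([1.4999999, 3.0000002])
      expected_fluxes = np.array([1.5, 3.0])
      # Passes when every element agrees within the tolerance; fails with a
      # readable diff if an upstream change shifts the values.
      assert_allclose(calculated_fluxes, expected_fluxes, rtol=1e-5)

  test_aperture_fluxes()  # runs cleanly; pytest would collect it automatically
  ```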
- Again in the test suite, there is only one test example for each item, giving a low line coverage. To check this I ran `pip install pytest-cov` and `pytest -v --cov hostphot`, giving an extra pytest output of
  ```
  ---------- coverage: platform darwin, python 3.10.4-final-0 ----------
  Name                                   Stmts   Miss  Cover
  ----------------------------------------------------------
  src/hostphot/__init__.py                  24      8    67%
  src/hostphot/_constants.py                 3      1    67%
  src/hostphot/_version.py                   1      0   100%
  src/hostphot/coadd.py                     23      1    96%
  src/hostphot/cutouts.py                  170     90    47%
  src/hostphot/dust.py                      59     15    75%
  src/hostphot/global_photometry.py        149     51    66%
  src/hostphot/image_cleaning.py            39     14    64%
  src/hostphot/image_masking.py            105     18    83%
  src/hostphot/interactive_aperture.py     167     54    68%
  src/hostphot/local_photometry.py         114      8    93%
  src/hostphot/objects_detect.py            79     39    51%
  src/hostphot/rgb_images.py               159    142    11%
  src/hostphot/utils.py                     87     10    89%
  ----------------------------------------------------------
  TOTAL                                   1179    451    62%
  ```
- In the README, under "Contributing", there is a mention that users can contact you directly, but no contact details are provided. Would you be willing/able to add something like an email address (or another contact method, if you had something else in mind) to the README there, so it is a little easier for users to reach you with questions? Perhaps a more generic "[email protected]", so it can be shared with multiple maintainers in the future, and so you need not provide a personal contact if you'd rather not.
- For some of the larger, common-use functions (at least the ones an end-user is actually likely to call), it might be good to include your real-world usage examples (from the README/readthedocs) as part of the docstrings. These would then automatically be picked up by `sphinx` and appear in the "right" place in the API documentation as well. As an example, see `numpy`'s `savetxt` documentation and source code. Note that these examples do not need to run by themselves; they would just show what the inputs and outputs are expected to look like.
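  Concretely, a numpy-style `Examples` section looks like the following -- `aperture_sum` is a made-up toy helper to illustrate the docstring format, not hostphot's real photometry API:

  ```python
  import numpy as np

  def aperture_sum(image, radius):
      """Sum pixel values inside a circular aperture centred on the image.

      (Hypothetical helper, shown only to illustrate a numpy-style
      ``Examples`` section that sphinx renders into the API docs.)

      Parameters
      ----------
      image : ndarray
          2D image array.
      radius : float
          Aperture radius in pixels.

      Returns
      -------
      float
          Summed flux inside the aperture.

      Examples
      --------
      >>> import numpy as np
      >>> aperture_sum(np.ones((5, 5)), 1.0)
      5.0
      """
      # Centre of the image in pixel coordinates
      cy, cx = (np.asarray(image.shape) - 1) / 2
      yy, xx = np.indices(image.shape)
      # Boolean mask of pixels within `radius` of the centre
      inside = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
      return float(image[inside].sum())
  ```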
- I don't think it's necessary to duplicate the `_choose_workdir` function within each `.py` file. From a quick test, I think you can keep `__workdir__` solely within `_constants.py` and reference it from that file directly. I made a minor change to delete the version of the function in `__init__.py` and remove the leading underscore from the function in `_constants.py`:
  ```python
  __workdir__ = "images"


  def choose_workdir(workdir):
      """Updates the work directory.

      Parameters
      ----------
      workdir: str
          Path to the work directory.
      """
      global __workdir__
      __workdir__ = workdir
  ```
  A quick interactive check then gives:

  ```
  Python 3.10.4 (main, Mar 31 2022, 03:38:35) [Clang 12.0.0 ] on darwin
  Type "help", "copyright", "credits" or "license" for more information.
  >>> from hostphot._constants import choose_workdir, __workdir__
  >>> import hostphot
  >>> print(__workdir__, hostphot._constants.__workdir__)
  images images
  >>> choose_workdir('other_folder')
  >>> print(__workdir__, hostphot._constants.__workdir__)
  images other_folder
  ```

  (Note the name bound by `from ... import __workdir__` keeps its original value, as the last line shows, so other modules should read `hostphot._constants.__workdir__` rather than a from-imported copy.)