
dnppy's People

Contributors

djjensen, jwely, kwross, lancewatkins, mrb364, ritwikgupta, staplecamel, syntaf

dnppy's Issues

Develop unit tests for automated testing

Now that dnppy has integrated travis-ci documentation building, the environment for automated testing is already in place. All that's missing is testing for the functions.

If tests are developed for dnppy that don't rely on local files, it would be straightforward to run them remotely, so you would know immediately when a push breaks the code.
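As a sketch of what such a file-free test could look like (the function under test is a hypothetical stand-in, not an actual dnppy function):

```python
import unittest

# Hypothetical pure helper standing in for a dnppy function that needs
# no local files, so it can run on a remote travis-ci worker.
def datetime_stamp(year, month, day):
    """Format a date as the yyyymmdd stamp used in fetch filenames."""
    return "{0:04d}{1:02d}{2:02d}".format(year, month, day)

class TestDatetimeStamp(unittest.TestCase):
    def test_zero_padding(self):
        self.assertEqual(datetime_stamp(2015, 5, 29), "20150529")

if __name__ == "__main__":
    unittest.main(exit=False)
```

Tests written this way drop into the existing travis-ci environment with no extra setup beyond a `python -m unittest discover` step.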

Request: Some support for FIRMS data

The [FIRMS data product](https://earthdata.nasa.gov/earth-observation-data/near-real-time/firms) is a MODIS derivative. This data is of interest to at least one partner organization.

Support should include fetching and extraction to GeoTIFF, and perhaps mosaicking as well.

dnppy WIKI

Each module should have its own page in this wiki, with a description of the module's purpose, simple use cases, and syntax examples.

Example help post.

The issues tracker is the easiest place for DEVELOP participants or partners to request help when they hit a snag with DEVELOP-related code. Even if the issue turns out to have nothing to do with dnppy, anything with the "help!" tag will receive attention and get an answer.

extract_NetCDF not reading outdir

The extract_NetCDF tool in the precipitation module does not honor the specified output directory when the Copy Raster function is used to save the resulting tif file. It keeps saving files in the same folder as the original nc files.
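A sketch of the fix the issue implies: build the output path from the user-supplied outdir rather than from the source file's location (`build_output_path` is a hypothetical helper, not dnppy's actual code):

```python
import os

def build_output_path(nc_path, outdir):
    """Build the output tif path inside the requested directory,
    instead of defaulting to the folder holding the source .nc file."""
    base = os.path.splitext(os.path.basename(nc_path))[0]
    return os.path.join(outdir, base + ".tif")

# the tif lands in outdir, not next to the source netCDF
path = build_output_path("C:/data/nc/nws_precip_conus_20140101.nc", "C:/data/tifs")
```

Passing a path built this way to the Copy Raster call would direct output to the intended folder.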

master branch is NOT 100% deployable

Before the start of the summer term, existing code needs another round of code review to ensure everything works. First full release is scheduled for May 29th 2015. After this date, all development should be done on fresh branches (not the master) until stable.

fetch_MPE does not report failures and does not recognize zipped files

The function successfully downloads files with the "nws_precip_conus_yyyymmdd.nc" file name for a user-defined date range. However, it does not raise an error when requested files do not exist. For example, I requested the range Jan 1st 2002 through Jan 1st 2014. It downloaded files for all of these dates, and they appear to be netcdf files; however, the actual ftp only has data from 2005 onward. If I try to read in the downloaded files from before 2005, the file format isn't recognized as NetCDF.

Additionally, some of the "nws_precip_conus_yyyymmdd.nc" files on the ftp are double zipped as .tar.gz. When fetch_MPE downloads these, they appear as plain nc files but still can't be read. There doesn't appear to be any code in fetch_MPE that checks whether a file is zipped and, if so, unzips it.
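One way fetch_MPE could detect and unpack such files is to check the gzip magic bytes rather than trust the file extension. A stdlib-only sketch (function names are hypothetical, not dnppy's API):

```python
import gzip
import os
import tarfile

GZIP_MAGIC = b"\x1f\x8b"

def is_gzipped(filepath):
    """Check the first two bytes for the gzip magic number, since the
    server may hand back a .tar.gz even when the name says .nc"""
    with open(filepath, "rb") as f:
        return f.read(2) == GZIP_MAGIC

def unpack_if_needed(filepath, outdir):
    """Extract a downloaded file if it is actually a (tar.)gz archive;
    return the list of usable file paths."""
    if not is_gzipped(filepath):
        return [filepath]
    if tarfile.is_tarfile(filepath):           # handles .tar.gz directly
        with tarfile.open(filepath, "r:gz") as tar:
            tar.extractall(outdir)
            return [os.path.join(outdir, m.name) for m in tar.getmembers()]
    with gzip.open(filepath, "rb") as gz:      # plain .gz
        outpath = os.path.join(outdir, os.path.basename(filepath) + ".unpacked")
        with open(outpath, "wb") as out:
            out.write(gz.read())
    return [outpath]
```

Running every downloaded file through a check like this would also surface the bogus pre-2005 files, since they would be neither valid NetCDF nor valid gzip.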

download.download_url improper handling of ftp protocol

Several fetch functions use the ftp protocol to map directories, but then call download_url to grab the desired file, and download_url does not speak ftp. This causes ftp servers to reject the connection after repeated attempts.

download_url needs to handle http and ftp links differently.
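A minimal sketch of scheme-aware dispatch (Python 3 shown; the 2.7-era equivalents would be urllib2 and urlparse; this is not dnppy's actual download_url):

```python
import shutil
import urllib.request
from urllib.parse import urlparse

def download_url(url, outpath):
    """Dispatch on the URL scheme so ftp links are retrieved over ftp
    rather than through repeated, rejected http-style requests."""
    scheme = urlparse(url).scheme.lower()
    if scheme not in ("http", "https", "ftp"):
        raise ValueError("unsupported scheme: " + scheme)
    # urllib.request.urlopen speaks both HTTP(S) and FTP; a fuller
    # implementation might reuse one ftplib.FTP connection per server
    # to avoid hammering it with a fresh login for every file.
    with urllib.request.urlopen(url) as response, open(outpath, "wb") as out:
        shutil.copyfileobj(response, out)
```

Reusing a single ftp session per server, rather than one connection per file, is likely what stops the repeated rejections described above.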

Travis-CI needs access to all dependencies, and arcpy is proprietary.

Issue

We should have seen this issue coming, but we've now hit the stage where we'd like to run automodule on all dnppy functions and classes, and this process cannot run successfully on travis-ci without giving it some way to handle import arcpy.

Potential solutions

  • Do not use automodule on functions with arcpy dependencies, mixing manual and automatic documentation.
  • Somehow fake an arcpy import for the sake of travis-ci to get around that step.
  • Switch to a documentation update strategy that requires us to build locally and commit to gh-pages.
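The second option can be done by injecting a stand-in module before sphinx imports dnppy, for example at the top of conf.py. A sketch (assumes Python 3's unittest.mock; on the Python 2.7 of the era it would be the backported `mock` package):

```python
import sys
from unittest import mock

# Inject a stand-in module so that "import arcpy" succeeds on a CI
# worker with no ArcGIS license. A MagicMock absorbs any attribute
# access, which is enough for sphinx automodule to import the package
# and read its docstrings.
if "arcpy" not in sys.modules:
    sys.modules["arcpy"] = mock.MagicMock()

import arcpy  # resolves to the mock, not the proprietary package

spatial = arcpy.CheckOutExtension("spatial")  # a MagicMock, not an ImportError
```

More recent Sphinx releases expose an `autodoc_mock_imports` setting in conf.py that performs the same trick declaratively.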

need module for text_file IO for various weather formats.

We are quickly discovering that many people have specific types of files they are used to reading; many of these are fixed-width text files containing weather data. These specialized text files require their own reading/writing functions, but there is also a need to interchangeably convert between formats and combine measurements from several of them.

We need a "text file object" that has rows (a list of lists) and headers (a list). Each of our custom text readers can be written to output a text file object, which can then be easily handled by other modules such as the time_series module.
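A minimal sketch of such a container, following the structure described above (class and method names are hypothetical):

```python
import csv

class TextFileObject(object):
    """Hypothetical common container: headers (a list) plus rows
    (a list of lists) -- the interchange format each custom reader
    would emit and the time_series module could consume."""

    def __init__(self, headers, rows):
        self.headers = list(headers)
        self.rows = [list(r) for r in rows]

    def column(self, name):
        """Pull one column out by header name."""
        i = self.headers.index(name)
        return [row[i] for row in self.rows]

    def write_csv(self, outpath):
        """Serialize to csv; each format-specific writer works the same way."""
        with open(outpath, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(self.headers)
            writer.writerows(self.rows)

# a fixed-width weather reader would parse its own format, then hand
# back this same structure:
tfo = TextFileObject(["station", "temp_c"], [["KIAD", 21.3], ["KDCA", 22.1]])
```

Because every reader returns the same shape, combining measurements from multiple formats reduces to concatenating rows under a merged header list.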

time_series subsetting functions need to take multiples of "unit" arguments.

Presently, there isn't any way to group by a custom interval without representing each data point in multiple subsets.

For example, if I want 3 day summaries, I can use time_series.make_subsets("%d", overlap_width = 1), but this produces a subset centered on every day, whereas I may want only one subset every 3 days, with each data point represented a single time.
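A sketch of the requested behavior, consecutive non-overlapping bins (`chunk_subsets` is hypothetical; only make_subsets and its arguments come from the issue):

```python
from datetime import datetime, timedelta

def chunk_subsets(times, unit_width=3):
    """Group datetimes into consecutive, non-overlapping bins of
    unit_width days, so each point lands in exactly one subset --
    unlike make_subsets("%d", overlap_width=1), which centers a
    window on every single day."""
    start = min(times)
    bins = {}
    for t in times:
        index = (t - start).days // unit_width
        bins.setdefault(index, []).append(t)
    return [bins[i] for i in sorted(bins)]

days = [datetime(2015, 1, 1) + timedelta(days=i) for i in range(9)]
subsets = chunk_subsets(days, unit_width=3)
# nine daily points fall into three 3-day subsets, each point once
```

Exposing a multiplier like `unit_width` on the existing "unit" argument would cover 3 day, 10 day, or n day summaries with one code path.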

case sensitive renames not always logged

A tiny change in the name of a file from fetch_landsatWELD to fetch_LandsatWELD, where the L was made uppercase, was not captured by version tracking. As a result, the remote service at travis-ci is attempting to import a module that no longer exists and fails to build the doc pages.

fetch_MPE not listed in the download module's __init__

Yea, I noticed the fetch_MPE function isn't listed in the __init__ script of the download module. I can easily change this, but I wasn't sure if it was left out for a reason. Looking for confirmation that it's okay to add fetch_MPE to the __init__.

Upload releases to PyPi

Now that your setup.py script is working, it would be really useful to upload dnppy releases to PyPI so that the install process is simplified to pip install dnppy (see: uploading packages to PyPi).

It would also be useful in that release versions would be available on PyPI, where future work on the repository won't affect the published package.

Many-Stats NoData Issue

When averaging rasters with enf_rastlist and many_stats, the many_stats function converts NoData values within the extent to 0 when they should remain NoData. As a result, the final averaged raster contains many 0 values where there should be NoData, which throws off the raster properties. Is there a way to keep NoData values as NoData?
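A sketch of the nodata-preserving average the issue asks for, in plain Python for clarity (the real fix would live in many_stats, presumably via numpy masked arrays or arcpy's SetNull; the sentinel value here is an assumption):

```python
NODATA = -9999  # hypothetical sentinel; real rasters carry theirs in metadata

def cellwise_mean(rasters, nodata=NODATA):
    """Average a stack of equally shaped grids cell by cell, leaving a
    cell as nodata when no layer has a value there, instead of
    silently collapsing it to zero."""
    rows, cols = len(rasters[0]), len(rasters[0][0])
    out = [[nodata] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [g[r][c] for g in rasters if g[r][c] != nodata]
            if vals:  # only average the layers that actually have data
                out[r][c] = sum(vals) / float(len(vals))
    return out

a = [[1.0, NODATA], [4.0, NODATA]]
b = [[3.0, NODATA], [NODATA, NODATA]]
avg = cellwise_mean([a, b])
# cells that were nodata in every layer stay nodata in the average
```

With numpy, the same logic is `numpy.ma.masked_equal(stack, nodata).mean(axis=0)`, which keeps the mask intact through the reduction.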

testing module needs to retrieve common set of test data.

Presently used in testing

Raster data
Landsat 8 (fetching with AWS)
Landsat 7 (fetching is non-generalized)
Landsat 5 (fetching is non-generalized)
Landsat 4 (fetching is non-generalized)
MODIS - MOD09A1 (8 day surface reflectance)
MODIS - MYD09A1 (8 day surface reflectance)
MODIS - MOD10A1 (daily snow cover)
MODIS - MYD11A1 (daily land surface temperature)
MODIS - MOD13A1 (16 day vegetation indices)
MODIS - MYD13A1 (16 day vegetation indices)
SRTM DEM
GPM
TRMM 3B42

non-raster data
VA administrative boundaries polygons

Wishlist for automatic retrieval and testing.

raster data
Climate data record NetCDFs
ASTER DEM
VIIRS?

non-raster data
weather data?
vector data for metro areas?

create location class for easy conversion between lat/lon, modis tile, WRS2 pathrow, etc.

A user's experience could be dramatically improved if, for any given item with a lat/lon bounding box, they could view that object, select a few NASA data products of interest for certain time domain constraints, and click a button to have formatted (and possibly clipped) geotiffs deposited onto their hard drive. One of the key missing links is a simple translation from lat/lon on earth to the various tiling and naming conventions of the different products. This could be done with some kind of location class. Work on this started in commit 1e9801a on the dl-loc-class branch.

wishlist:
modis sinusoidal tile
modis polar tile
WRS2 (all landsat products)
degree arc (1 degree by 1 degree segment for SRTM)
Continental United States "CONUS" (true/false value)
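For the MODIS sinusoidal entry on the wishlist, the lat/lon-to-tile translation is closed-form. A sketch using the standard sinusoidal sphere radius and the 36 x 18 tile grid (the function name is hypothetical):

```python
import math

EARTH_RADIUS = 6371007.181            # radius of the MODIS sinusoidal sphere, meters
X_EXTENT = 20015109.354               # half-width of the projection, meters
TILE_WIDTH = X_EXTENT * 2 / 36.0      # 36 tiles across, 18 down

def modis_sin_tile(lat, lon):
    """Map a lat/lon pair (degrees) to its MODIS sinusoidal (h, v)
    tile indices, h in 0..35 and v in 0..17."""
    x = EARTH_RADIUS * math.radians(lon) * math.cos(math.radians(lat))
    y = EARTH_RADIUS * math.radians(lat)
    h = int((x + X_EXTENT) // TILE_WIDTH)
    v = int((X_EXTENT / 2 - y) // TILE_WIDTH)
    return h, v

# e.g. Washington, DC (38.9 N, 77.0 W) falls on tile h12v05
```

WRS2 path/row and the 1 degree SRTM segments would get analogous methods on the same location class.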

setup sometimes fails to download dependencies

When running setup for dnppy, wheel files sometimes fail to download from the dnppy release assets. The cause is unknown.

Users who reported the issue are running ArcMap 10.1 with a 32 bit installation of python 2.7.

a custom function similar to arcpy.RasterToNumPyArray

Having a custom function to emulate ArcMap's RasterToNumPyArray would be a big first step toward performing raster data analysis in a standalone python environment without commercial software.
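A full replacement needs a format library such as GDAL for parsing and georeferencing, but the core of the operation, turning a flat binary band into a 2-D array, can be sketched with the standard library alone. The raw, headerless row-major float32 layout below is an assumption for illustration:

```python
import struct

def read_raw_band(filepath, rows, cols):
    """Read a headerless, row-major band of 32-bit floats into a list
    of rows -- the essence of RasterToNumPyArray, minus the format
    parsing and georeferencing that GDAL (with numpy arrays in place
    of nested lists) would supply."""
    with open(filepath, "rb") as f:
        flat = struct.unpack("<{0}f".format(rows * cols),
                             f.read(4 * rows * cols))
    return [list(flat[r * cols:(r + 1) * cols]) for r in range(rows)]
```

In practice, `gdal.Open(path).ReadAsArray()` performs this job for real formats and is the likely backbone of an arcpy-free implementation.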

solar module improvements

Digital elevation model based inputs should be added to the solar module to assist in computing irradiance- and insolation-like parameters. This would complement the existing ability to perform matrix operations on lat/lon inputs corresponding to a gridded raster dataset. These values are pertinent to energy balance applications.

Some similar functionality already exists in the ET module under the METRIC model, though it is not as scalable as it could be.
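One building block any such irradiance computation needs is the solar declination. A sketch using a common approximation (Cooper's equation); this is illustrative, not the solar module's actual code:

```python
import math

def solar_declination(day_of_year):
    """Approximate solar declination in degrees via Cooper's equation,
    one ingredient of the irradiance terms the solar module computes."""
    return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

# near the March equinox (day ~81) the declination passes through zero,
# and near the June solstice (day ~172) it approaches +23.45 degrees
```

A DEM-aware version would combine this with per-cell slope and aspect grids to modulate the incidence angle across the raster.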

arcpy independence

This is a big one that must be worked toward slowly. Nearly all of what dnppy does can be done without arcpy, but much of our code was built with arcpy from the start. Now that gdal is easily set up during dnppy installation, effort should be made to keep future functions free of the arcpy module. Nearly all of the arcpy-dependent functions are within the raster and landsat modules.

make dnppy pip installable

It would be nice if dnppy could be installed with pip install dnppy. This may not be practical given its dependencies, and may get complicated if arcpy remains one of them.

ET module

The ET module currently uses a development clone of some dnppy functions in a "dnppy_limited" folder. This was used to hand the code off to the project partner on an extremely aggressive timeline, but it needs to be phased out.

Installation failure when upgrading numpy 1.7 to 1.9: permission denied upon programmatic deletion

Issue addresses the following error.

module compiled against API version 9 but this version of numpy is 7

Numpy does not always uninstall correctly. GDAL and other dependencies require numpy version 1.9.1, and they fail to import properly when numpy 1.7 is still present from ArcMap.

This failure seems to be caused by access restrictions on deleting the existing numpy files: some are still in use (typically by ArcMap background processes) and cannot be deleted, so they are left in place and the installation is left incomplete.

The only foolproof fix discovered so far is to manually delete the numpy folder from site-packages. Automating this is definitely on the to-do list.

If you experience a numpy-related issue during installation, navigate to your ArcMap python site-packages folder, manually delete the numpy folder, and try the dnppy installation again.

You can find the numpy folder in C:\Python27\ArcGIS10.3\Lib\site-packages or similar, depending on your system configuration and ArcMap version.
