
Open-Elevation - Remake

↓ Read install instructions below ↓

A free and open-source elevation API, originally by Jorl17. The original is available at https://github.com/Jorl17/open-elevation. Thanks for your work!

Open-Elevation is a free and open-source alternative to the Google Elevation API and similar offerings.

This service came out of the need for a hosted, easy-to-use and easy-to-set-up elevation API. While there are some alternatives out there, none of them work out of the box, and they seem to point to dead datasets. Open-Elevation is easy to set up, has its own Docker image and provides scripts so you can easily acquire whatever datasets you want.

See the Open-Elevation API Doc for details and usage.
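
As a quick illustration of the request/response shape (a minimal sketch; the host and port assume a locally running instance on the default 0.0.0.0:10000, and the helper names are mine):

```python
import json
import urllib.request

API_PATH = "/api/v1/lookup"

def build_lookup_url(lat, lon, host="http://localhost:10000"):
    """Build the GET URL for a single-point elevation lookup."""
    return f"{host}{API_PATH}?locations={lat},{lon}"

def parse_elevation(body):
    """Extract the elevation from a lookup response body (JSON string or bytes)."""
    return json.loads(body)["results"][0]["elevation"]

# Against a running instance:
# with urllib.request.urlopen(build_lookup_url(48.179138, 10.703618)) as resp:
#     print(parse_elevation(resp.read()))
```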


Changes to the original version:

  • changed the TIFF file locations in the download script to working ones for the SRTM 250m data
  • changed create-dataset.sh so it works again
  • fixed the download scripts

News:

Bugs:

  • TanDEM 90m data are 40 m too high. Issue: here

How to install:

I tested the install procedure on a fresh Ubuntu 18.10.

First of all, clone this repository to your favourite location (use a permanent place where it won't be deleted).

Default 250m files (ca. 20 GB after the procedure)

  1. Make sure your system is up-to-date

    sudo apt-get update
    sudo apt-get upgrade -y
    
  2. Install GDAL, used for the GeoTIFFs

    sudo apt update
    sudo apt install gdal-bin python-gdal
    
    # Add libgdal-dev
    sudo apt-get install libgdal-dev
    
    # Add unar
    sudo apt install unar
    
    # Set include paths for building the GDAL Python bindings
    export CPLUS_INCLUDE_PATH=/usr/include/gdal
    export C_INCLUDE_PATH=/usr/include/gdal
    
    sudo apt install python3-rtree
    
  3. Install pip dependencies

    pip install -r requirements.txt
    
  4. Download and process the GeoTIFFs

    # Open a terminal and cd to your open-elevation directory
    # Mark the scripts as executable
    sudo chmod +x download-srtm-data.sh create-tiles.sh create-dataset.sh
    
    # Execute
    ./create-dataset.sh
    

    The script should be downloading at this point. *This can take some time, up to 2 hours.*

  5. Optional: add the service to your system (e.g. for autostart)

    sudo mv <<PATH-TO-SERVICE-FILE>> /etc/systemd/system/open-elevation.service
    
    # Enable autostart
    sudo systemctl enable open-elevation
    
    # The following commands can then be used
    sudo service open-elevation start | stop | restart
    

    The service file (found in this repository as open-elevation.service) also contains settings that must be specified manually, such as the user and the various paths to the working directory.

  6. Your server is now running and reachable at 0.0.0.0:10000. Congratulations!

    To change the IP, edit the last line of **server.py**. You can choose whatever IP and port you want.
    
    Test it:
    ```
    http://0.0.0.0:10000/api/v1/lookup?locations=48.179138,10.703618
    ```
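
A quick way to verify from Python that the service from step 6 is actually listening before testing the API (a small sketch; host and port are the defaults from above):

```python
import socket

def server_listening(host="127.0.0.1", port=10000, timeout=2.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# print(server_listening())  # True once the open-elevation server is up
```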
    

TanDEM 90m, added by myself (ca. 100 GB)

Follow steps 1 to 3 from above.

Then:

  1. Install a Java runtime (JRE)

    sudo apt-get install default-jre

  2. Create an account at https://sso.eoc.dlr.de/cas/login

  3. Generate and download the list of your needed locations at https://download.geoservice.dlr.de/TDM90/

    You can zoom out and use CTRL to drag multiple rectangles.

  4. Start TanDEM90mDownloader.jar in your directory with these arguments:

    java -jar <<YOUR-JAR-FILE>> -i=<urllist.txt> -o=<outputDir (normally the data dir)> -u=<USERNAME (email)> -p=<PASSWD>
    *Optional: number of threads (default is 4)* -n=4
    
    Example:
    java -jar C:\TanDEM90mDownloader.jar -i=C:\urllist.txt -o=C:\data -u=[email protected] -p=xyz1234
    

    The program then automatically downloads the zips, extracts them, copies the DEM data to your output directory and deletes the zips afterwards to save storage.

  5. Optional: as step 5 above

  6. Your server is now running and reachable at 0.0.0.0:10000. Congratulations!

    To change the IP, edit the last line of **server.py**. You can choose whatever IP and port you want.
    Test it:
    Test it:
    ```
    http://0.0.0.0:10000/api/v1/lookup?locations=48.179138,10.703618
    ```
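
Until the offset bug listed under Bugs above is resolved, TanDEM 90m results can be corrected client-side (a sketch under the assumption that the reported 40 m offset is constant; the helper name is mine):

```python
TANDEM_OFFSET_M = 40  # reported offset of the TanDEM 90m data (see Bugs section)

def correct_tandem(results):
    """Subtract the reported TanDEM 90m offset from each lookup result."""
    return [dict(r, elevation=r["elevation"] - TANDEM_OFFSET_M) for r in results]

# corrected = correct_tandem(response_json["results"])
```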
    

! WARNING !

The 30M files from https://geoservice.dlr.de/ are a little larger than the original files.

Information from https://geoservice.dlr.de/web/dataguide/tdm90/ (as of 2019-02-01):

| Key | Value |
| --- | --- |
| Number of DEM products | 19389 |
| Size of the global data set, zipped (including all annotations) | 253 GB |
| Size of the global data set, unzipped (including all annotations) | 534 GB |
| Size of all DEM raster files (unzipped, without annotations or metadata) | 93.8 GB |

Install using Docker (original, not tested)

You can freely host your own instance of Open-Elevation. There are two main options: Docker or native. We recommend using Docker to ensure that your environment matches the development environment.

Clone the repository

First things first, clone the repository and cd into its directory:

git clone http://github.com/Jorl17/open-elevation
cd open-elevation

Using Docker

An image of Open-Elevation is available at DockerHub. You can use this image as the basis for your Open-Elevation installation.

The Docker image roots itself at /code/ and expects that all GeoTIFF datafiles be located at /code/data/, which you should mount using a volume.

Prerequisites: Getting the dataset

Open-Elevation doesn't come with any data of its own, but it offers a set of scripts to get the whole SRTM 250m dataset.

Whole World

If you wish to host the whole world, just run

mkdir data # Create the target folder for the dataset
docker run -t -i -v $(pwd)/data:/code/data openelevation/open-elevation /code/create-dataset.sh

The above command should have downloaded the entire SRTM dataset and split it into multiple smaller files in the data directory. Be aware that this directory may be over 20 GB in size after the process is completed!

Custom Data

If you don't want to use the whole world, you can provide your own dataset in GeoTIFF format, compatible with the SRTM dataset. Simply drop the files for the regions you desire into the data directory. You are advised to split these files into smaller chunks to make Open-Elevation less memory-hungry (the largest file has to fit in memory). The create-tiles.sh script can do this, and you can see it in action in create-dataset.sh. Since you are using Docker, you should always run the commands within the container. For example:

docker run -t -i -v $(pwd)/data:/code/data openelevation/open-elevation /code/create-tiles.sh  /code/data/SRTM_NE_250m.tif 10 10

The above example command splits SRTM_NE_250m.tif into 10 by 10 files inside the /code/data directory, which is mapped to $(pwd)/data.
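
What create-tiles.sh does can be sketched in Python: carve the raster's pixel space into an N-by-M grid and emit one gdal_translate -srcwin command per tile. This is an illustration rather than the actual script; the pixel dimensions and output names are made up:

```python
def tile_commands(src, width, height, nx, ny, out_dir="data"):
    """Yield one gdal_translate command per tile of an nx-by-ny grid
    covering a raster of width x height pixels."""
    xs, ys = width // nx, height // ny
    for i in range(nx):
        for j in range(ny):
            # the last row/column absorbs any remainder pixels
            w = width - i * xs if i == nx - 1 else xs
            h = height - j * ys if j == ny - 1 else ys
            yield (f"gdal_translate -srcwin {i * xs} {j * ys} {w} {h} "
                   f"{src} {out_dir}/tile_{i}_{j}.tif")

# for cmd in tile_commands("SRTM_NE_250m.tif", 24000, 18000, 10, 10):
#     print(cmd)
```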

Running the server in a Docker container

Now that you've got your data, you're ready to run Open-Elevation! Simply run

docker run -t -i -v $(pwd)/data:/code/data -p 8080:8080 openelevation/open-elevation

Build image only:

docker build . -f docker/Dockerfile

This command:

  1. Maps $(pwd)/data (your data directory) to /code/data within the container
  2. Maps host port 8080 to the container's port 8080
  3. Runs the default command, which is the server at port 8080

You should now be able to go to http://localhost:8080 for all your open-elevation needs.

Docker compose

You can use docker-compose.yml to build the image and to create and run the container:

docker-compose -f docker-compose.yml up -d

The server will be available on host port 8080:

curl http://0.0.0.0:8080/api/v1/lookup?locations=42.216667,27.416667
{"results": [{"latitude": 42.216667, "elevation": 262, "longitude": 27.416667}]}
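
Besides GET, the API accepts POST with a JSON body for batch lookups (the same shape the postLookup client example further down uses). A minimal sketch for building such a request from Python, assuming a local instance on port 8080:

```python
import json
import urllib.request

def post_lookup_request(points, host="http://localhost:8080"):
    """Build a POST /api/v1/lookup request; `points` is a list of (lat, lon)."""
    payload = {"locations": [{"latitude": la, "longitude": lo} for la, lo in points]}
    return urllib.request.Request(
        f"{host}/api/v1/lookup",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# req = post_lookup_request([(42.216667, 27.416667)])
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["results"])
```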

Generate API clients from swagger.json

React Client

We use swagger-js-codegen:

const fs = require('fs');
const {CodeGen} = require('swagger-js-codegen');

const swaggerFile = 'swagger/swagger.json';
const className = 'OpenElevationRestClient';
const swagger = JSON.parse(fs.readFileSync(swaggerFile, 'UTF-8'));

const elevationReactCode = CodeGen.getReactCode({
    moduleName: className,
    className,
    swagger,
    isES6: true,
});
save(elevationReactCode, className, '.js');

function save(code, fileName, ext = '.js') {
    const outputDir = 'src/swagger/generated/';
    const outputFile = outputDir + fileName;

    if (!fs.existsSync(outputDir)) {
        fs.mkdirSync(outputDir);
    }
    fs.writeFileSync(outputFile + ext, '/* eslint-disable */\n/* This file is generated! Do not edit, your changes will be overridden */\n');

    fs.appendFileSync(outputFile + ext, code);
}

Example usage:

new OpenElevationRestClient().getLookup({locations: '42.216667,27.416667'})
    .then((results) => {
        console.log(results);
        return true;
    })
    .catch((error) => {
        console.log('Error getLookup: ' + error);
    });

new OpenElevationRestClient().postLookup({
    json: {
        locations: [{
            latitude: 42.216667,
            longitude: 27.416667
        }]
    }
})
    .then((results) => {
        console.log(results);
        return true;
    })
    .catch((error) => {
        console.log('Error postLookup: ' + error);
    });

Problems

Have you found any problems? Open an issue or submit your own pull request!


open-elevation's Issues

open-elevation working but not reachable from anywhere but local

I tried for days with the original open-elevation and learned a lot, but never successfully got things running. I then found your fork and, after reimaging my server, managed to get everything installed and working pretty quickly, so THANK YOU. My Python server works, listens and returns correct information when I use curl locally (only over http, not https), i.e. after I have ssh'd into the box, but it will not work from outside, so I assume it must be my web server config? I will continue to try getting this to work over the web, but wonder if there is anything obvious I might be missing.
When I try from a terminal (not logged into the server) I get the following:

curl http://xx.xx.xx.xxx:10000/api/v1/lookup?locations=48.179138,10.703618
zsh: no matches found: http://xx.xx.xx.xxx:10000/api/v1/lookup?locations=48.179138,10.703618

Python process won't start

Sorry to open another issue on the same day... I am unable to get server.py to start. Here are my open-elevation status results:

Apr 10 14:20:05 nr systemd[1]: open-elevation.service: Unit entered failed state.
Apr 10 14:20:05 nr systemd[1]: open-elevation.service: Failed with result 'exit-code'.
Apr 10 14:20:06 nr systemd[1]: open-elevation.service: Service hold-off time over, scheduling restart.
Apr 10 14:20:06 nr systemd[1]: Stopped Open-Elevation Server.
Apr 10 14:20:06 nr systemd[1]: open-elevation.service: Start request repeated too quickly.
Apr 10 14:20:06 nr systemd[1]: Failed to start Open-Elevation Server.
Apr 10 14:20:06 nr systemd[1]: open-elevation.service: Unit entered failed state.
Apr 10 14:20:06 nr systemd[1]: open-elevation.service: Failed with result 'start-limit-hit'.

and here is my open-elevation.service file:

[Unit]
Description=Open-Elevation Server
After=network.target

[Service]
Type=simple
User=opene
WorkingDirectory=/home/opene
ExecStart=/usr/bin/env python /home/opene/open-elevation/server.py
Restart=always
RestartSec=1

[Install]
WantedBy=multi-user.target

I have made both of them executable (chmod +x), and that was not the issue...

Large requests (i.e. many data points) fail

I guess this is probably a server config issue. I have built our own DEM server using this branch and it works really well for any session where I have fewer than 1500 lat/longs to send to the server. Above this I get a message that the JSON is invalid. I am thinking maybe my server truncates the request, but I do not know where to start to find out or fix it? Any clues?
We are using Open-Elevation to validate the elevations we get when we record lat/longs, for example for a two-hour walk, i.e. 7200 separate lat/longs in one JSON.
We are using POST requests.

Unknown option in gdal_translate when running create-dataset.sh

Hey! Great initiative to renew the open-elevation API. I ran into an issue running ./create-dataset.sh

FAILURE: Unknown option name '-r'
create tiles: SRTM_SE_250m
Usage: gdal_translate [--help-general] [--long-usage]
       [-ot {Byte/Int16/UInt16/UInt32/Int32/Float32/Float64/
             CInt16/CInt32/CFloat32/CFloat64}] [-strict]
       [-of format] [-b band] [-mask band] [-expand {gray|rgb|rgba}]
       [-outsize xsize[%] ysize[%]]
       [-unscale] [-scale[_bn] [src_min src_max [dst_min dst_max]]]* [-exponent[_bn] exp_val]*
       [-srcwin xoff yoff xsize ysize] [-projwin ulx uly lrx lry] [-epo] [-eco]
       [-a_srs srs_def] [-a_ullr ulx uly lrx lry] [-a_nodata value]
       [-gcp pixel line easting northing [elevation]]*
       [-mo "META-TAG=VALUE"]* [-q] [-sds]
       [-co "NAME=VALUE"]* [-stats] [-norat]
       src_dataset dst_dataset

Any help would be greatly appreciated! Cheers :)

Returning elevation 0

Hi, I just set up the application with a Python virtualenv, then ran ./create-dataset.sh (which downloaded about 4.6 GB of data), then ran it with python server.py, but when using the API the elevation always returns 0.

{"results": [{"latitude": 10.0, "longitude": 10.0, "elevation": 0}, {"latitude": 20.0, "longitude": 20.0, "elevation": 0}, {"latitude": 54.161758, "longitude": -54.583933, "elevation": 0}]}
{"results": [{"latitude": 10.0, "longitude": 10.0, "elevation": 0}, {"latitude": 20.0, "longitude": 20.0, "elevation": 0}, {"latitude": 55.161758, "longitude": -55.583933, "elevation": 0}]}

After initializing and doing requests it is shown on the app console:

[2020-05-06 11:09:22 -0400] [4173] [INFO] Booting worker with pid: 4173                                                            
No module named '_gdal_array'                                           
No module named '_gdal_array'  

And also these messages when shutting down:

[2020-05-06 11:10:26 -0400] [4164] [INFO] Shutting down: Master
Exception ignored in: <bound method Handle.__del__ of <rtree.index.PropertyHandle object at 0x7f19a4445d30>>
Traceback (most recent call last):
  File "/home/testr/openelevation/open-elevation/venv/lib/python3.5/site-packages/rtree/index.py", line 875, in __del__
  File "/home/testr/openelevation/open-elevation/venv/lib/python3.5/site-packages/rtree/index.py", line 864, in destroy
  File "/home/testr/openelevation/open-elevation/venv/lib/python3.5/site-packages/rtree/core.py", line 34, in check_void_done
AttributeError: 'NoneType' object has no attribute 'Error_GetErrorCount'
Exception ignored in: <bound method Handle.__del__ of <rtree.index.IndexHandle object at 0x7f19a2b26da0>>
Traceback (most recent call last):
  File "/home/testr/openelevation/open-elevation/venv/lib/python3.5/site-packages/rtree/index.py", line 875, in __del__
  File "/home/testr/openelevation/open-elevation/venv/lib/python3.5/site-packages/rtree/index.py", line 864, in destroy
  File "/home/testr/openelevation/open-elevation/venv/lib/python3.5/site-packages/rtree/core.py", line 34, in check_void_done
AttributeError: 'NoneType' object has no attribute 'Error_GetErrorCount'

Is this because the data source changed? Thanks in advance :)

Use several datasets

Hi,
is it possible to use several datasets?
I have the global TanDEM 90m dataset, but for one country I also have a more accurate dataset in GeoTIFF format. How can I configure it to use the more accurate dataset where it exists for an area, and fall back to the global TanDEM 90m otherwise?

Error: 404 Not Found

I am new to Python.
After multiple attempts at this and that, I was able to build the dataset,
but trying to run python3 server.py gave me the error below. P.S. I have changed the IP from 0.0.0.0:10000 to 127.0.0.1:10000.

**Sorry, the requested URL 'http://127.0.0.1:10000/' caused an error:

Not found: '/'**

Could someone please shed some light on this?

Thank you

Extract data to database

First, great job updating this project.

Is it possible to read all the data from the tif file and import to a database?
I'm trying to import to create a more complex query.

Can't request server.py

The command "python server.py" works:

Bottle v0.12.13 server starting up (using GunicornServer(workers=4))...
Listening on http://0.0.0.0:9999/
Hit Ctrl-C to quit.

[2019-06-08 22:15:24 +0000] [121009] [INFO] Starting gunicorn 19.7.1
[2019-06-08 22:15:24 +0000] [121009] [INFO] Listening at: http://0.0.0.0:9999 (121009)
[2019-06-08 22:15:24 +0000] [121009] [INFO] Using worker: sync
[2019-06-08 22:15:24 +0000] [121014] [INFO] Booting worker with pid: 121014
[2019-06-08 22:15:24 +0000] [121015] [INFO] Booting worker with pid: 121015
[2019-06-08 22:15:24 +0000] [121016] [INFO] Booting worker with pid: 121016
[2019-06-08 22:15:24 +0000] [121018] [INFO] Booting worker with pid: 121018

But when it comes to making requests to the server from the browser, it is inaccessible. How can I solve that, please?

Ubuntu 20

Would it please be possible to provide instructions on how to use this with Ubuntu 20? I managed to get past the command
sudo apt install gdal-bin python-gdal by running sudo apt-get install python3-gdal,

but then I got stuck with pip install -r requirements.txt throwing a load of errors.

Layout of README.md

This is a great project and has really helped me out. Some discussion needs to be had about the format of the README.md, though. Readability and organization are important.

If you're encouraging folks to use Docker, I feel like that should come nearer the beginning of the document. Also, do you think it's worth looking into providing more Docker images too, perhaps?

Along with this, having the 250m, 90m and 30m instructions all in the same document leads to clutter; it may be better if each were separated into its own markdown file, with each file linked from this README.md.

Just ideas, would love to hear your thoughts!
