docker-compose -f docker-compose.yml up
Backend available at: http://localhost:51264
Frontend available at: http://localhost:51265
(See docker-compose.yml.)
docker-compose up
# attach to backend:
docker attach chouquette-back # sbt shell: run, test, ...
# attach to frontend:
docker attach chouquette-front # bash: npm run start, test, ...
[sbt] clean coverage test
[sbt] coverageReport
Reports are available at chouquette/target/scala-2.12/scoverage-report/index.html.
Request:
{
int // id of the hike
}
Response:
{
string // description of the hike
}
Request:
{
string // sentence from which to extract the semantic words
}
Response:
{
[string] // array containing the extracted semantic words
}
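As a client-side sketch, the request and response bodies above can be built and read as follows. The README does not name the JSON fields for these single-value payloads, so the keys "sentence" and "words" below are assumptions, not the backend's actual field names:

```python
import json

def build_extraction_request(sentence):
    # Documented request shape: a JSON object wrapping one string.
    # The key "sentence" is a hypothetical name, not documented here.
    return json.dumps({"sentence": sentence})

def parse_extraction_response(raw):
    # Documented response shape: a JSON object wrapping [string].
    # The key "words" is likewise a hypothetical name.
    return json.loads(raw)["words"]
```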
Request:
{
[string] // array containing the extracted semantic words
}
Response:
{
"places": [
{
"long": double // longitude of the first place
"lat": double // latitude of the first place
},
{
"long": double, // longitude of the second place
"lat": double // latitude of the second place
}
]
}
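A minimal sketch of reading the documented places response; the coordinate values below are illustrative placeholders, not real geocoding output:

```python
import json

# Example response following the documented shape: an object with a
# "places" array of {"long", "lat"} objects (values are made up).
raw = '{"places": [{"long": 1.44, "lat": 43.6}, {"long": 2.35, "lat": 48.85}]}'

# Extract (lat, long) pairs from the "places" array.
coords = [(p["lat"], p["long"]) for p in json.loads(raw)["places"]]
```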
Request:
[
{
"long": double, // longitude of the first place
"lat": double // latitude of the first place
},
{
"long": double, // longitude of the second place
"lat": double // latitude of the second place
}
]
Response:
{
string // string containing the major UTM zone
}
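Note that, unlike the other routes, the request body here is a bare JSON array: the "places" array from the previous route's response can be sent as-is. A sketch, with placeholder coordinates:

```python
import json

# The "places" array as returned by the previous route (values made up).
places = [
    {"long": 1.44, "lat": 43.6},
    {"long": 2.35, "lat": 48.85},
]

# The request body is the array itself, not an object wrapping it.
utm_request_body = json.dumps(places)
```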
By default, startDateVal is 2016-11-05 and completionDateVal is 2016-11-15.
Request:
{
[string] // array containing the UTM zones
}
Response:
{
"urls": [string] // array of image URLs
}
We can modify startDateVal and completionDateVal with the /recupereDate route.
Request:
{
"utm": string, // the utm zone
"startDateVal": string, // the PEPS search starts from this date
"completionDateVal": string // the PEPS search stops at this date
}
Response:
{
"urls": [string] // array of image URLs
}
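A sketch of a /recupereDate request body using the documented field names and default dates; the UTM zone value is an illustrative placeholder:

```python
import json

# Request body for /recupereDate (UTM zone value is a placeholder).
recupere_date_body = json.dumps({
    "utm": "31TCJ",                     # placeholder UTM zone
    "startDateVal": "2016-11-05",       # documented default start date
    "completionDateVal": "2016-11-15",  # documented default end date
})
```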
Request:
{
"imageUrl": string, // url to download image from
"pepsUser": string, // username for PEPS
"pepsPass": string, // password for PEPS
"hdfsHost": string, // host for HDFS server
"hdfsUser": string, // username for HDFS server
"hdfsPass": string, // password for HDFS server
"hdfsPath": string // path where tiles should be saved on HDFS server
}
Downloads imageUrl to the local file system, runs gdal2tiles.py on it, copies the image and tiles via scp to hdfsUser@hdfsHost (authenticating with hdfsPass), then puts them at hdfsPath on HDFS. Returns the route where the status can be checked.
Response:
Status: 202
{
"status": string // route where the status can be checked
}
- add options to pass to gdal2tiles.py