ropensci / stplanr
Sustainable transport planning with R
Home Page: https://docs.ropensci.org/stplanr
License: Other
If the start or end point of a line is far away from the route network (e.g. in a lake), CycleStreets fails to route the line. I believe this currently fails silently.
I am suggesting an improvement but would like feedback here first:
Make an API call to api.cyclestreets.net/route?start=xxx&end=yyy
Listen to the response:
a) The response is a route - continue as before.
b) The response is a straight line (or an error), suggesting that CycleStreets cannot route the line. Then make 2 calls to GraphHopper (or another API) to:
b1) snap xxx to the nearest point on the route network, xxx2
b2) snap yyy to the nearest point on the route network, yyy2
Make another API call to api.cyclestreets.net/route?start=xxx2&end=yyy2
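The two-step fallback above could be sketched as follows. `route_cyclestreet()` exists in stplanr, but `is_straight()` and the snapping helper are hypothetical placeholders for failure detection and a GraphHopper nearest-point call; the backends are injected as function arguments to keep the sketch self-contained.

```r
# Sketch of the proposed fallback routing logic. The router, snapper and
# straight-line test are passed in as functions; in stplanr these would be
# route_cyclestreet() plus hypothetical GraphHopper-based helpers.
route_with_fallback <- function(from, to, router, snapper, is_straight) {
  r <- router(from, to)
  # b) routing failed (straight line or error): snap both ends and retry
  if (is.null(r) || is_straight(r)) {
    from2 <- snapper(from)  # b1: snap origin to the nearest network point
    to2 <- snapper(to)      # b2: snap destination likewise
    r <- router(from2, to2)
  }
  r  # a) the original route, or the retried one
}
```

With the real APIs, `router` would wrap the CycleStreets call and `snapper` a GraphHopper nearest-point endpoint.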
Reported by @mem48 - reproducible example:
line2route(flowlines)
# in more detail
l = flowlines # create test lines (single lines)
FUN <- match.fun("route_cyclestreet")
ldf <- line2df(l)
r <- l
# test for the second od pair (the first often fails)
rc2 <- FUN(from = ldf[2,1:2], to = ldf[2, 3:4]) # this is the cause of the problem!
ldf[2,1:2] # That's why it fails: it's not giving the proper information
If line2route can't connect to CycleStreets it returns lots of meaningless errors. It would be helpful to return a warning such as "unable to connect to CycleStreets" to save lots of debugging pain.
Before I had an API key from cyclestreets, I got an error that cckey didn't exist from this line:
https://github.com/Robinlovelace/stplanr/blob/master/R/routes.R#L103
exists("cckey")
will do the job.
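A minimal sketch of such a guard, assuming the key is stored in a variable named cckey as in routes.R (the error message wording is an illustrative suggestion):

```r
# Fail early with an actionable message when the CycleStreets key is missing,
# instead of an opaque "object 'cckey' not found" error deep in routes.R.
check_cckey <- function() {
  if (!exists("cckey")) {
    stop("CycleStreets API key not found: please create a variable 'cckey', ",
         "e.g. cckey <- Sys.getenv('CYCLESTREET')", call. = FALSE)
  }
  invisible(TRUE)
}
```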
E.g.:
names(flow)[1] = "Area of workplace"
onewayid(flow, attrib = 3)
Error in parse(text = x) : <text>:1:6: unexpected symbol
1: Area of
^
This would be useful. Any ideas @richardellison ? Will take a look if so!
As suggested by @mpadge in #151 a fair amount of R code in stplanr could be made more efficient by being ported to Rcpp. This includes (in no particular order):
The loops referred to in #151 are obvious candidates for initial functions but there are many others (toptail, several OSRM functions, od_radiation, etc.).
@sckott any ideas how to fix this: https://travis-ci.org/ropensci/stplanr/builds/
Due to a missing [1] in this line, I believe:
https://github.com/ropensci/stplanr/blob/master/R/routes.R#L141
Here's an example of the numbers produced and why only the first in the vector should be returned:
> as.numeric(obj$marker$`@attributes`$time)
[1] 1497 9 16 90 418 109 9 69 117 17 22 443 107 24 5 27 15
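In other words, the fix is to index the first element, which holds the total route time (the segment values here are taken from the output above):

```r
# The API returns one time value per route segment; the first element is the
# total, so only it should be kept - hence the missing [1].
times <- c("1497", "9", "16", "90", "418", "109")  # per-segment values as text
total_time <- as.numeric(times)[1]                 # keep only the route total
```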
Wondering about the costs and benefits of doing this sooner rather than later.
Thoughts @edzer? One advantage I can think of: we could refactor lots of messy sp code and learn sf in depth.
With route_graphhopper
getting this
route_graphhopper("New York", "Oaxaca", vehicle = "bike")
#> Error in (function (classes, fdef, mtable) :
#> unable to find an inherited method for function ‘coordinates’ for signature ‘"NULL"’
And I do have my graphhopper PAT in my env vars, so I am guessing that's not the problem
session_info()
#> Session info -----
#> setting value
#> version R version 3.3.1 Patched (2016-08-17 r71112)
#> system x86_64, darwin13.4.0
#> ui RStudio (1.0.1)
#> language (EN)
#> collate en_US.UTF-8
#> tz America/Los_Angeles
#> date 2016-09-01
#>
#> Packages ---------
#> package * version date source
#> assertthat 0.1 2013-12-06 CRAN (R 3.3.0)
#> bitops 1.0-6 2013-08-17 CRAN (R 3.3.0)
#> curl 1.2 2016-08-13 CRAN (R 3.3.1)
#> data.table 1.9.7 2016-08-31 Github (Rdatatable/data.table@c6ed49c)
#> DBI 0.5 2016-08-12 Github (rstats-db/DBI@1e37697)
#> devtools * 1.12.0 2016-06-24 CRAN (R 3.3.0)
#> digest 0.6.10 2016-08-02 CRAN (R 3.3.1)
#> dplyr 0.5.0.9000 2016-08-30 Github (hadley/dplyr@167b503)
#> foreign 0.8-66 2015-08-19 CRAN (R 3.3.1)
#> geosphere 1.5-5 2016-06-15 CRAN (R 3.3.0)
#> htmltools 0.3.5 2016-03-21 CRAN (R 3.3.0)
#> htmlwidgets 0.7 2016-08-02 CRAN (R 3.3.1)
#> httr 1.2.1 2016-07-03 cran (@1.2.1)
#> igraph 1.0.1 2015-06-26 CRAN (R 3.3.0)
#> jsonlite 1.0 2016-07-01 CRAN (R 3.3.1)
#> lattice 0.20-33 2015-07-14 CRAN (R 3.3.1)
#> leaflet * 1.0.1 2016-02-27 CRAN (R 3.3.0)
#> lubridate 1.5.6 2016-04-06 CRAN (R 3.3.0)
#> magrittr 1.5 2014-11-22 CRAN (R 3.3.0)
#> maptools 0.8-39 2016-01-30 CRAN (R 3.3.0)
#> memoise 1.0.0 2016-01-29 CRAN (R 3.3.0)
#> openxlsx 3.0.0 2015-07-03 CRAN (R 3.3.0)
#> png 0.1-7 2013-12-03 CRAN (R 3.3.0)
#> R.methodsS3 1.7.1 2016-02-16 CRAN (R 3.3.0)
#> R.oo 1.20.0 2016-02-17 CRAN (R 3.3.0)
#> R.utils 2.3.0 2016-04-14 CRAN (R 3.3.0)
#> R6 2.1.3 2016-08-19 cran (@2.1.3)
#> raster 2.5-8 2016-06-02 CRAN (R 3.3.0)
#> Rcpp 0.12.6 2016-07-19 CRAN (R 3.3.1)
#> RCurl 1.95-4.8 2016-03-01 CRAN (R 3.3.0)
#> readr 1.0.0.9000 2016-08-10 local
#> rgdal 1.1-10 2016-05-12 CRAN (R 3.3.0)
#> rgeos 0.3-19 2016-04-04 CRAN (R 3.3.0)
#> RgoogleMaps 1.2.0.7 2015-01-21 CRAN (R 3.3.0)
#> RJSONIO 1.3-0 2014-07-28 CRAN (R 3.3.0)
#> sp * 1.2-3 2016-04-14 CRAN (R 3.3.0)
#> stplanr * 0.1.4 2016-08-29 CRAN (R 3.3.0)
#> stringi 1.1.1 2016-05-27 CRAN (R 3.3.0)
#> stringr 1.1.0 2016-08-19 CRAN (R 3.3.0)
#> tibble 1.2-12 2016-08-30 Github (hadley/tibble@6d2bb08)
#> withr 1.0.2 2016-06-21 Github (jimhester/withr@91279ae)
Based on this code created by @mem48
https://github.com/npct/pct-lsoa-test/blob/master/R/Rasterize_routes2.R
Why? It can help convert lines to raster and could be useful in other transport data applications also.
As reported by @AnnaGoodman1 and tested by @mem48
When testing the example code:
exroutes <- viaroute(viapoints=list(data.frame(x=c(-33.5,-33.6,-33.7),y=c(150,150.1,150.2))))
viaroute2sldf(exroutes)
I get this error message:
Error in data.frame(X1 = c("10", "Sodwalls Station Road", "488", "0", :
arguments imply differing number of rows: 11, 10
Update: this is also sending error messages:
exroutes <- viaroute(50, 0, 51, 1)
viaroute2sldf(exroutes)
Just a few thoughts on best practice for using httr, applies to 3 fxns in the pkg AFAICT
route_graphhopper
fxnargs <- list(
point = paste0(from[2:1], collapse = ","),
point = paste0(to[2:1], collapse = ","),
vehicle = vehicle,
locale = "en-US",
debug = 'true',
points_encoded = 'false',
key = pat
)
res <- httr::GET(paste0(base_url, "route"), query = fxnargs, httr::verbose())
httr does URL encoding for you, so you don't have to do it yourself.
txt <- httr::content(httr::GET(request), as = "text")
I'd recommend splitting that up into the HTTP request and the parsing, and in between checking that the request worked correctly, e.g.:
res <- httr::GET(request)
stop_for_status(res) # or warn_for_status(res)
txt <- httr::content(res, as = "text", encoding = "UTF-8")
obj <- jsonlite::fromJSON(txt)
I often check that the content-type is correct as well (e.g., stopifnot(res$headers$`content-type` == 'application/json')), because if that's not what you expect, then something is wrong. stop_for_status() is useful, but you can write a custom function that checks the HTTP status code and then parses the result, because e.g. stop_for_status() won't parse out error messages from the response body. And I threw in encoding = "UTF-8" in the httr::content call; you should explicitly state the encoding in that function now.
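Putting those recommendations together, a sketch of the pattern (the content-type guard is pulled into its own small helper; url and query are whatever the calling function builds):

```r
# Guard: error if the response content type is not JSON.
check_json_type <- function(ctype) {
  if (!grepl("application/json", ctype, fixed = TRUE)) {
    stop("Unexpected content type: ", ctype)
  }
  invisible(TRUE)
}

# Request, check status, check content type, then parse - as separate steps.
get_route_json <- function(url, query) {
  res <- httr::GET(url, query = query)
  httr::stop_for_status(res)  # or httr::warn_for_status(res)
  check_json_type(res$headers[["content-type"]])
  txt <- httr::content(res, as = "text", encoding = "UTF-8")
  jsonlite::fromJSON(txt)
}
```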
hey @Robinlovelace - We want all rOpenSci pkgs to consistently keep track of changes in a NEWS file, following https://github.com/ropensci/onboarding/blob/master/packaging_guide.md#-news - thanks!
Otherwise the function fails. Perhaps links have been changed?
This should be a fairly simple wrapper around functionality demonstrated here: https://journal.r-project.org/archive/2013-1/eugster-schlesinger.pdf
I've played with it and it definitely works in new contexts: https://github.com/Robinlovelace/Creating-maps-in-R/blob/master/vignettes/osmar-testing.Rmd
https://travis-ci.org/Robinlovelace/stplanr
@sckott can you assist with this?
Could be in documentation or some kind of interface depending on a number of factors.
Link to:
line2route used to return an ID column which could be used to match the results to the original data; it no longer returns this column. As the Start and Finish columns just contain 1, 1, 1, there is no way to match the results to the original data.
line_bearing(l, bidirectional = T) and line_bearing(l, bidirectional = F) return the same result:
bearingT = line_bearing(l, bidirectional= T)
bearingF = line_bearing(l, bidirectional = F)
bearingT == bearingF
[1] TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE TRUE
I'm getting some strange aggregation issues.
From a node (Warrington 015) there are 36 cyclists leaving, but when I run overline on this I'm getting 56 on the only line that leaves that node. Is the aggregation summing from other lines? Maybe at a vertex?
These are some of the questions I would like to ask of the GTFS data. I am preparing functions for these.
The library currently uses an early format version of the V2 API:
https://github.com/Robinlovelace/stplanr/blob/master/R/gFlow.R#L99
which was available to Robin Lovelace by special arrangement.
However, this format is not the final version which is being finalised, and so is liable to stop working at some point soon. (We don't yet have a date for the finalisation of this V2 API call due to other engineering currently going on - we'd strongly prefer migration to V1 at the moment, to ensure no users of the library rely on current V2 stuff.)
This ticket is a task to migrate to the supported V1 API format - should be fairly simple to change.
http://www.cyclestreets.net/api/v1/journey/
e.g. using the real key rather than registeredkey.
Sorry for the hassle on this.
They are essentially the same. It would be more maintainable and extensible to merge them.
Idea: allow users to return lines from one SpatialLines object (l2) which roughly match the origins and destinations of those in another (the 'left' one, l1). I envision the following function, which returns another SpatialLines object:
line_merge <- function(l1, l2, distance, match) {
...
}
where distance is the distance in metres, match is whether to match multiple lines, and crs_select_aeq() can be used to select a projected CRS with units of 1 m anywhere in the world.
And a method sketched below that could go into the function:
1: Transform the lines into a projected CRS.
2: Create 4 new columns for l and rds: the coordinates of the origin and the coordinates of the destination.
This can be done with line2df, e.g.:
line2df(routes_fast[5:6,]) # beginning and end of routes
# A tibble: 2 x 5
object fx fy tx ty
<dbl> <dbl> <dbl> <dbl> <dbl>
1 1 -1.516749 53.82868 -1.511701 53.81167
2 2 -1.516749 53.82868 -1.524028 53.80399
3: Round the values of each column to the nearest n metres
4: Create the new id as id = paste(fx, fy, tx, ty)
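Steps 3-4 could be sketched as follows (assuming a data frame with fx, fy, tx, ty columns in a projected CRS with metre units; the function name od_match_id is a suggestion):

```r
# Round OD coordinates to the nearest n metres and build a matching id,
# so lines whose ends lie within roughly n m of each other share an id.
od_match_id <- function(ldf, n = 100) {
  rounded <- lapply(ldf[c("fx", "fy", "tx", "ty")],
                    function(x) round(x / n) * n)
  paste(rounded$fx, rounded$fy, rounded$tx, rounded$ty)
}
```

Lines in l1 and l2 can then be matched by comparing their ids.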
I need this functionality to identify 'parallels' along a train line that have high cycling potential. The full methodology, which constitutes the context of this issue, is as follows:
"To identify parallels to linear features, a 5 stage methodology was developed. Although this is described below in relation to the Lewes-Uckfield train line, it could be applied to any linear feature of interest:
Prototype code:
pseg = spsample(lewes_uckfield_osgb, n = 5, type = "regular")
plot(pseg)
# try to create 1st line segment
ldf = ggplot2::fortify(lewes_uckfield_osgb)
ldfp = SpatialPointsDataFrame(coords = cbind(ldf$long, ldf$lat), data = ldf)
# find the closest point on the line to each sample point
# install.packages("nabor")
library(nabor)
knn_res = knn(data = coordinates(ldfp), query = coordinates(pseg), k = 1)
sel_nearest = c(knn_res$nn.idx)
points(ldfp[sel_nearest, ]) # check they are close by
# split the line at the nearest points
ids = c(1, sel_nearest, nrow(ldfp)) # segment break indices (constant, so outside the loop)
for(i in 1:(length(sel_nearest) + 1)){
  if(i == 1){
    l = points_to_line(ldf[ids[i]:ids[i + 1], ], "long", "lat")
    spChFIDs(l) = i
  } else {
    l_temp = points_to_line(ldf[ids[i]:ids[i + 1], ], "long", "lat")
    spChFIDs(l_temp) = i
    l = maptools::spRbind(l, l_temp)
  }
}
The CycleStreets V1 API had a new option reporterrors=1 added to it a few months ago:
https://www.cyclestreets.net/api/v1/journey/
I recommend adding support for this, to catch reportable problems, e.g. unroutable OSM data.
Forthcoming V2 does this automatically (this option is essentially a backport, but avoids existing client implementations getting unexpected data).
When trying the example from the vignette I get
library(stplanr); library(sp); library(leaflet)
data("flow", package = "stplanr")
data("cents", package = "stplanr")
l <- od2line(flow = flow, zones = cents)
l <- l[!l$Area.of.residence == l$Area.of.workplace,]
lgb <- spTransform(l, CRSobj = CRS("+init=epsg:27700"))
l$d_euclidean <- rgeos::gLength(lgb, byid = T)
l$d_fastroute <- routes_fast@data$length
routes_fast$All <- l$All
overline(routes_fast, "All", fun = sum)
#> Assertion failed: (!"should never be reached"), function itemsTree, file ../../../../src/geos-3.4.2/src/index/strtree/AbstractSTRtree.cpp, line 371.
same session info as #110 - and same error whether using CRAN version or current version on master here
An enhancement of spsample so the frequency of points is proportional to a vector of weights. @edzer - interested to know if spsample was ever planned to have a 'weight' field - that would solve the problem - could look into it if needs be.
There is a need to document this package and its potential applications in a vignette.
Currently the URL is hardcoded and cannot be overridden in settings, so custom setups cannot be used at present:
https://github.com/ropensci/stplanr/blob/master/R/routes.R#L94
The specified URL is sensible as a default.
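One way to allow an override while keeping the current default: read the base URL from an option falling back to the hardcoded value (the option name stplanr.cyclestreets.base_url is a hypothetical suggestion, and the default shown stands in for the URL currently hardcoded in routes.R):

```r
# Resolve the routing base URL: an option, if set, overrides the default,
# so users running their own CycleStreets instance can point at it.
routing_base_url <- function(default = "https://www.cyclestreets.net") {
  getOption("stplanr.cyclestreets.base_url", default)
}
```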
And decode the results.
E.g.
Taking a look at this now @richardellison using the BART data from SF:
download.file("http://transitfeeds.com/p/bart/58/latest/download", "bart.zip")
bart = gtfs2sldf("bart.zip")
This extracts 13 variables:
names(bart)
[1] "route_id" "shape_id" "route_short_name" "route_long_name" "route_desc"
[6] "route_type" "route_color" "route_text_color" "agency_id" "agency_name"
[11] "agency_url" "agency_timezone" "agency_lang"
Yet there are lots more variables in the various files extracted. Not sure of a good way to add these, particularly as GTFS data is so diverse, but think it worth flagging as an issue!
As agreed with @mpadge, this will bring a load of performant code for analysing bike share data into stplanr:
https://github.com/mpadge/bike-correlations
line_* and od_* are useful conventions for line and OD datasets respectively, I think. A final stage could be to write a vignette and possibly a paper, but will save that for another (and probably quite rainy) day.
Sound good @mpadge and @richardellison?
This is what I get when running the viaroute example:
exroutes <- viaroute(viapoints=list(data.frame(x=c(-33.5,-33.6,-33.7),y=c(150,150.1,150.2))))
r <- viaroute2sldf(exroutes)
Error in data.frame(routenum = i, routedesc = rep(x$routes$legs[[i]]$summary, :
arguments imply differing number of rows: 1, 16, 8, 0
The other example works fine though. Any ideas @richardellison?
This will be useful for the reproducible analysis of data collected by track-logging smartphone apps
I think you can check for the presence of mapshaper using Sys.which and avoid having to write anything out to disk:
mapshape_available <- function() {
res <- Sys.which("mapshaper") != ""
unname(res)
}
Or alternatively (prints the version to the console if true):
mapshape_available <- function() {
suppressWarnings(system("mapshaper --version")) != 127
}
I think a single and fairly generic function would be best here.
OSRM has recently been updated to use a new API and the demo server will stop supporting v4 at some point in the middle of this year. All OSRM functions need to be updated to support v5. Ideally they should retain compatibility with v4 for people running their own OSRM instances.
Reproducible example:
onewayid(flowlines, "All")
Error in UseMethod("mutate_") :
no applicable method for 'mutate_' applied to an object of class "c('SpatialLinesDataFrame', 'SpatialLines', 'Spatial', 'SpatialVector', 'SpatialLinesNULL')"
Suggested solution: make onewayid generic: http://adv-r.had.co.nz/S3.html
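A sketch of that suggestion: an S3 generic with a data.frame method holding the existing logic, and a SpatialLinesDataFrame method delegating to it (how the deduplicated attribute table gets re-attached to the subsetted geometry is glossed over here; the data.frame body is a placeholder):

```r
# S3 generic so onewayid() dispatches on the class of its input.
onewayid <- function(x, attrib, ...) UseMethod("onewayid")

# The existing data.frame logic would live here.
onewayid.data.frame <- function(x, attrib, ...) {
  x  # placeholder: deduplicate two-way OD pairs using x[[attrib]]
}

# Spatial method: run the data.frame logic on the attribute slot.
onewayid.SpatialLinesDataFrame <- function(x, attrib, ...) {
  x@data <- onewayid(x@data, attrib, ...)
  x
}
```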
Could involve downloading OSM data with osmdata by @mpadge. The thinking is to have a route (r) and then get statistics about conditions on the route by querying buffered data around it. This is probably best created as a series of modular, self-standing functions in the first instance, e.g. get_osmavspeed(r) would report the average speed along the route. This would call another function, e.g.:
get_osmspeeds <- function(r) {
  if(is(r, "Spatial")) { # but not actually OSM data - will need another if to test that
    # get average speeds of roads along the route, either by sampling or
    # summarising speeds of highways that correspond to the spatial object
  }
}
Encountered an error with this when using for some work.
Reproducible error from the example:
> nearest2spdf(
+ lat = c(50.3, 50.2),
+ lng = c(13.2, 13.1)
+ )
Error in .local(obj, ...) : NA values in coordinates
Hi,
Just came to inform you that suggested packages used in examples should be guarded, so that running the examples is not mandatory. The best way is to use if (requireNamespace(.)) {_actual_code_}. Otherwise you should move tmap (and potentially others; tmap was the first to fail) to Imports.
* checking examples ... ERROR
Running examples in ‘stplanr-Ex.R’ failed
The error most likely occurred in:
> ### Name: line2points
> ### Title: Convert a SpatialLinesDataFrame to points
> ### Aliases: line2points line2pointsn
>
> ### ** Examples
>
> data(routes_fast)
> lpoints <- line2pointsn(routes_fast[2,]) # for a single line
> lpoints2 = line2points(routes_fast[2,])
> plot(lpoints)
> plot(lpoints2)
> lpoints = line2pointsn(routes_fast) # for many lines
> plot(lpoints)
> data(flowlines) # load demo flowlines dataset
> lpoints <- line2points(flowlines) # for many lines
Error in loadNamespace(name) : there is no package called ‘tmap’
Calls: line2points ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
The best way to check is to remove all suggested dependencies and try R CMD check. More details in R-exts.
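For reference, a sketch of the suggested guard around an example that uses a Suggests-only package (tmap here; the plotting call is illustrative):

```r
# Wrap suggested-package usage so examples still pass R CMD check
# when the package is not installed.
plot_if_tmap <- function(l) {
  if (requireNamespace("tmap", quietly = TRUE)) {
    tmap::qtm(l)  # quick thematic map of the lines
  } else {
    message("tmap not installed; skipping plot")
  }
}
```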