I want to calculate the length of each polygon: around each polygon I create points (st_sample), from the combinations of those points I create all possible polylines, for the polylines which are inside the polygon I calculate the length, and the longest polyline is my result (the max length of the polygon).

I wrote code which gets me the results, but it is really slow. Do you have a solution for improving my code? I know that with two loops I cannot expect miracles regarding speed, but I do not know how to get the results another way.
If nothing else, maybe at least an alternative way of creating all the polylines from the combinations of points for one polygon in one step, without a loop (a sketch of that follows the first code block below)? :)
thank you
library(sf)
library(data.table)
poly=st_read(system.file("shape/nc.shp", package="sf"))
poly=poly[1:10,]
poly=st_cast(poly,"POLYGON")
poly$max_length=0
##Combination of 10 points, without repetition
aa=CJ(1:10,1:10)
aa=aa[!duplicated(t(apply(aa[,.(V1, V2)], 1, sort))),][V1!=V2]
##for each polygon: sample coordinates along its boundary, build polylines from them, and keep the length of the linestrings which are inside the polygon
for (ii in 1:nrow(poly)){
  ncl=st_cast(poly[ii,],"LINESTRING")
  ##sample points along the line
  ncp=st_cast(st_sample(ncl,10, type="regular", exact=TRUE),"POINT")
  ##create an empty sf object
  aaa=st_sf(st_sfc())
  st_crs(aaa)="NAD27"
  ##for each combination of points create a linestring; keep only the lengths of the polylines which are inside the polygon
  for (i in 1:nrow(aa)){
    aaa=rbind(aaa,st_sf(geometry=st_cast(st_union(ncp[t(aa[i])]),"LINESTRING")))
  }
  poly$max_length[ii]=as.numeric(max(st_length(aaa[unlist(st_contains(poly[ii,],aaa)),])))
}
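One possible way to avoid growing an sf object with rbind inside the inner loop is sketched below (a sketch only, reusing ncp, aa, poly and the loop index ii from the code above; lapply still iterates over the pairs, but all candidate segments are built and converted to a single sfc at once, which avoids the repeated rbind calls):
coords=st_coordinates(ncp)    # matrix with the X/Y coordinates of the sampled points
segs=st_sfc(lapply(seq_len(nrow(aa)),
                   function(i) st_linestring(coords[c(aa$V1[i], aa$V2[i]),])),
            crs=st_crs(ncp))
##keep only the segments contained in the polygon and take the longest one
as.numeric(max(st_length(segs[unlist(st_contains(poly[ii,], segs))])))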
Second attempt: running the function inside data.table. That is one loop less, but the problem is probably the remaining inner loop.
poly=st_read(system.file("shape/nc.shp", package="sf"))
poly=poly[1:10,]
poly=st_cast(poly,"POLYGON")
poly$max_length=0
##Combination of 10 points, without repetition
aa=CJ(1:10,1:10)
aa=aa[!duplicated(t(apply(aa[,.(V1, V2)], 1, sort))),][V1!=V2]
overFun <- function(x){
  ncl=st_cast(x[,geometry],"LINESTRING")
  ##sample points along the line
  ncp=st_cast(st_sample(ncl,40, type="regular", exact=TRUE),"POINT")
  ##create an empty sf object
  aaa=st_sf(st_sfc())
  st_crs(aaa)="NAD27"
  ##for each combination of points create a linestring and calculate its length
  for (i in 1:nrow(aa)){
    aaa=rbind(aaa,st_sf(geometry=st_cast(st_union(ncp[t(aa[i])]),"LINESTRING")))
  }
  as.numeric(max(st_length(aaa[unlist(st_contains(x[,geometry],aaa)),])))
}
setDT(poly)
##run function inside data.table
poly[,max_length:=overFun(poly), by=seq(nrow(poly))]
Edit: I found a solution for my problem which is fast enough for my needs: using the parallel library inside data.table with a function which also works on a data.table. There is still the question of why some polylines are excluded by st_contains (see the picture above). Maybe it is a precision problem?
library(sf)
library(data.table)
poly=st_read(system.file("shape/nc.shp", package="sf"))
poly=st_cast(poly,"POLYGON")
setDT(poly)
##Combination of 10 points, without repetition
aa=CJ(1:10,1:10)
aa=aa[!duplicated(t(apply(aa[,.(V1, V2)], 1, sort))),][V1!=V2]
overFun <- function(x){
  ## x is a single polygon geometry handed over by clusterMap
  ncl=st_cast(st_sfc(x, crs="NAD27"),"LINESTRING")
  ##sample points along the line
  ncp=st_cast(st_sample(ncl,10, type="regular", exact=TRUE),"POINT")
  df=data.table(ncp[aa[,V1]],ncp[aa[,V2]])
  df[,v3:=st_cast(st_union(st_as_sf(V1),st_as_sf(V2)),"LINESTRING"), by=seq(nrow(df))]
  as.numeric(max(st_length(df[unlist(st_contains(st_sfc(x, crs="NAD27"), df$v3)),]$v3)))
}
library(parallel)
cl <- makeCluster(detectCores() - 1)
clusterExport(cl, list("overFun","data.table","st_cast","CJ","poly","st_sample","st_sf","st_sfc","aa","st_length","st_union",
"st_as_sf","st_contains"))
system.time(poly[,c("max_length"):=.(clusterMap(cl, overFun, poly$geometry)),])
stopCluster(cl)
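One possible explanation for the excluded polylines (an assumption, not verified against this data) is that st_sample() can place a point a tiny floating-point distance outside the polygon boundary, so a chord starting at that point is no longer strictly contained. A sketch of a workaround inside overFun is to test containment against a very slightly buffered copy of the polygon; the buffer distance below is an arbitrary guess and would need tuning:
##sketch only: a more tolerant replacement for the st_contains() test inside overFun
gg=st_sfc(x, crs="NAD27")
keep=unlist(st_contains(st_buffer(gg, dist=0.001), df$v3))   # dist=0.001 is a guessed tolerance
as.numeric(max(st_length(df[keep,]$v3)))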
I encountered a similar problem and frankly have not found any ready-made solution.
I will use the same Ashe County from the sf package.
library(sf)
library(dplyr)
shape <- st_read(system.file("shape/nc.shp", package="sf")) %>%
  dplyr::filter(CNTY_ID == 1825) %>% # Keep only one polygon
  st_transform(32617)                # Reproject to WGS 84 / UTM zone 17N
What you can do with just dplyr, tidyr, and sf is to turn the polygon into points and calculate the distances between all the points. From these distances, take the maximum value. That corresponds to the green line in your example figure.
library(tidyr)
shape %>%
  st_cast("POINT") %>%        # turn polygon into points
  distinct() %>%              # remove duplicates
  st_distance() %>%           # calculate distance matrix
  as.data.frame() %>%
  gather(point_id, dist) %>%  # convert to long format
  pull(dist) %>%              # keep only distance column
  max()
#> Warning in st_cast.sf(., "POINT"): repeating attributes for all sub-geometries
#> for which they may not be constant
#> 45865.15 [m]
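As an aside, st_distance() already returns a matrix of distances with units, so the reshaping steps are optional; a shorter pipeline over the same shape object (a sketch of an equivalent computation) gives the same maximum:
shape %>%
  st_cast("POINT") %>%   # turn polygon into points
  distinct() %>%         # remove duplicates
  st_distance() %>%      # distance matrix
  max()                  # largest pairwise distance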
You can also use the Momocs package, which was created for 2D morphometric analysis. While it wasn't essential to reproject the shape to UTM in the first case (sf can handle geographic coordinates), the polygon has to be projected when using the Momocs package.
library(Momocs)
shape %>%
  st_cast("POINT") %>%     # Polygon to points
  distinct() %>%           # remove duplicates
  st_coordinates() %>%     # get coordinates matrix
  coo_calliper()           # calculate max length
#> Warning in st_cast.sf(., "POINT"): repeating attributes for all sub-geometries
#> for which they may not be constant
#> [1] 45865.15
There are several other functions in the Momocs package. For example, you can calculate the length of a shape based on its inertia axis, i.e. its alignment to the x-axis; coo_length will return 44432.02 [m].
One can apply several functions from the Momocs package to the coordinate matrix as follows:
point_matrix <- shape %>%
  st_cast("POINT") %>%
  distinct() %>%
  st_coordinates()
#> Warning in st_cast.sf(., "POINT"): repeating attributes for all sub-geometries
#> for which they may not be constant
funs <- list("length" = coo_length,
"width" = coo_width,
"elongation" = coo_elongation)
sapply(funs, function(fun, x) fun(x), x = point_matrix)
#> length width elongation
#> 4.443202e+04 3.921162e+04 1.174917e-01