I would like to generate shapefiles for H3 hexagons in a specific geographic area. Particularly, I'm interested in the Bay Area with resolutions = 6, 7 and 9. How can I create the shapefiles for the hexagons covering this area?
I'm new to shapefiles and other geographic data structures. I'm most comfortable with Python and R.
The basic steps here are:
1. Take the bounding box of your desired area as a GeoJSON-like polygon.
2. Use the polyfill method to fill the polygon with hexagons at the desired resolution.
3. Get the boundary of each hexagon with the h3ToGeoBoundary function.
4. Assemble the boundaries into GeoJSON and use ogr2ogr to convert to a shapefile.
The Python bindings have not been released, and I'm not familiar with the R bindings, but the JavaScript version might look like this:
var h3 = require('h3-js');

// Bounding box around the Bay Area, in GeoJSON [lng, lat] order
var bbox = [
    [-123.308821530582, 38.28055644998254],
    [-121.30037257250085, 38.28055644998254],
    [-121.30037257250085, 37.242722073589164],
    [-123.308821530582, 37.242722073589164]
];

// Fill the bounding box with resolution-6 hexagons
// (the third argument marks the coordinates as GeoJSON [lng, lat] order)
var hexagons = h3.polyfill(bbox, 6, true);

// Build a GeoJSON MultiPolygon from the hexagon boundaries
var geojson = {
    type: 'Feature',
    geometry: {
        type: 'MultiPolygon',
        coordinates: hexagons.map(function toBoundary(hex) {
            return [h3.h3ToGeoBoundary(hex, true)];
        })
    }
};

console.log(JSON.stringify(geojson));
and you'd use the script like this:
node bbox-geojson.js | ogr2ogr -f "ESRI Shapefile" bbox-hexagons.shp /vsistdin/
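The Python bindings have since been released as the h3 package, so the same bounding-box workflow can be done in Python as well. Here is a minimal sketch, assuming the v3 h3-py API and writing the shapefile with geopandas instead of piping through ogr2ogr (the output filename is just an example):

import h3                      # h3-py, v3 API
import geopandas as gpd
from shapely.geometry import Polygon

# Bounding box around the Bay Area as a GeoJSON-style geometry ([lng, lat] order)
bbox = {
    'type': 'Polygon',
    'coordinates': [[
        [-123.308821530582, 38.28055644998254],
        [-121.30037257250085, 38.28055644998254],
        [-121.30037257250085, 37.242722073589164],
        [-123.308821530582, 37.242722073589164],
        [-123.308821530582, 38.28055644998254]
    ]]
}

# Fill the box with resolution-6 hexagons; geo_json_conformant tells polyfill
# that the coordinates are in [lng, lat] order
hexagons = list(h3.polyfill(bbox, 6, geo_json_conformant=True))

# Turn each H3 index into a shapely Polygon and write the shapefile
gdf = gpd.GeoDataFrame(
    {'hex_id': hexagons},
    geometry=[Polygon(h3.h3_to_geo_boundary(h, geo_json=True)) for h in hexagons],
    crs='EPSG:4326')
gdf.to_file('bay_area_hex_res6.shp')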
If you're looking for a solution in R, the h3jsr package provides access to Uber's H3 library. The solution to your question can be done using the functions h3jsr::polyfill() and h3jsr::h3_to_polygon().
library(ggplot2)
library(h3jsr)
library(magrittr)  # for the %>% pipe used below
library(sf)
# read the shapefile of the polygon area you're interested in
nc <- st_read(system.file("shape/nc.shp", package="sf"), quiet = TRUE)
# projection
nc <- st_transform(nc, crs = 4326)
# get the unique h3 ids of the hexagons intersecting your polygon at a given resolution
nc_5 <- polyfill(nc, res = 5, simple = FALSE)
# pass the h3 ids to return the hexagonal grid
hex_grid5 <- unlist(nc_5$h3_polyfillers) %>% h3_to_polygon(simple = FALSE)
This will return the hexagon polygons covering the input shape.
Taking up John Stud's question here, because I've had the same 'problem'. In the following, I'll show how to read in a shapefile, hexagonize it with H3, and get a hexagon GeoDataFrame from it (and eventually save it as a shapefile).
Reproducible example
Let's get a shapefile for the US, e.g. from the US Census Bureau's cartographic boundary files (I use the "cb_2018_us_state_500k.zip" one).
# Imports
import h3
import geopandas as gpd
import matplotlib.pyplot as plt
import pandas as pd
import shapely
from shapely.ops import unary_union
from shapely.geometry import mapping, Polygon
# Read shapefile
gdf = gpd.read_file("data/cb_2018_us_state_500k.shp")
# Get US without territories / Alaska + Hawaii
us = gdf[~gdf.NAME.isin(["Hawaii", "Alaska", "American Samoa",
"United States Virgin Islands", "Guam",
"Commonwealth of the Northern Mariana Islands",
"Puerto Rico"])]
# Plot it
fig, ax = plt.subplots(1,1)
us.plot(ax=ax)
plt.show()
# Convert to EPSG 4326 for compatibility with H3 Hexagons
us = us.to_crs(epsg=4326)
# Get union of the shape (whole US)
union_poly = unary_union(us.geometry)
# Find the hexagons within the shape boundary using polyfill
# (iterate over the individual polygons that make up the MultiPolygon)
hex_list = []
polys = list(union_poly.geoms)
for n, g in enumerate(polys):
    if (n + 1) % 100 == 0:
        print(str(n + 1) + "/" + str(len(polys)))
    temp = mapping(g)
    # h3.polyfill expects [lat, lng] pairs by default, so swap the GeoJSON [lng, lat] order
    temp['coordinates'] = [[[j[1], j[0]] for j in i] for i in temp['coordinates']]
    hex_list.extend(h3.polyfill(temp, res=5))
# Create hexagon data frame
us_hex = pd.DataFrame(hex_list,columns=["hex_id"])
# Create hexagon geometry and GeoDataFrame
us_hex['geometry'] = [Polygon(h3.h3_to_geo_boundary(x, geo_json=True)) for x in us_hex["hex_id"]]
us_hex = gpd.GeoDataFrame(us_hex, crs="EPSG:4326")  # hexagon boundaries are WGS84 lng/lat
# Plot the thing
fig, ax = plt.subplots(1,1)
us_hex.plot(ax=ax, cmap="prism")
plt.show()
The above plot uses resolution 5 (https://h3geo.org/docs/core-library/restable/); I suggest you look at other resolutions too, such as 4. Of course, the right resolution depends on the "zoom level", i.e., whether you're looking at entire countries or just cities.
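To compare resolutions numerically before committing to one, a tiny sketch (assuming the v3 h3-py API) that prints the average hexagon area per resolution can help with that choice:

import h3

# Average hexagon area in km^2 for resolutions 4 through 9
for res in range(4, 10):
    print(res, round(h3.hex_area(res, unit='km^2'), 3))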
And, of course, to answer the original question: You can save the resulting shapefile using
us_hex.to_file("us_hex.shp")
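If you want the resolutions from the original question (6, 7 and 9), one shapefile per resolution, you could wrap the polyfill step from above in a loop. This is a sketch reusing union_poly and the imports from the example, with hypothetical output names like us_hex_res6.shp; note that high resolutions over a large area produce a very large number of hexagons:

for res in [6, 7, 9]:
    ids = set()
    for g in union_poly.geoms:
        temp = mapping(g)
        # swap GeoJSON [lng, lat] to the [lat, lng] order h3.polyfill expects by default
        temp['coordinates'] = [[[j[1], j[0]] for j in i] for i in temp['coordinates']]
        ids |= h3.polyfill(temp, res=res)
    ids = list(ids)
    res_hex = gpd.GeoDataFrame(
        {"hex_id": ids},
        geometry=[Polygon(h3.h3_to_geo_boundary(h, geo_json=True)) for h in ids],
        crs="EPSG:4326")
    res_hex.to_file("us_hex_res" + str(res) + ".shp")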