I have a shapefile of polygons with their geometry, a column of time and a column of values. The data looks like this:
     | TIME | AVG | geometry
-----+------+-----+--------------------------------
   0 | NaN  | NaN | POLYGON((-0.1599375 51.3977, ...
The NaN value means that this specific polygon has no value for that time step. I want to transform this into a 3D numpy array (one 2D grid per time step) where each cell is labelled with the AVG value of the polygon that covers it. I have tried the following, but I am getting 0 everywhere.
from osgeo import gdal, ogr, osr

source_ds = ogr.Open("C:/Users/Nathan/Desktop/pl.shp")
source_layer = source_ds.GetLayer()
pixelWidth = pixelHeight = 0.001
x_min, x_max, y_min, y_max = source_layer.GetExtent()
cols = int((x_max - x_min) / pixelHeight)
rows = int((y_max - y_min) / pixelWidth)
target_ds = gdal.GetDriverByName('GTiff').Create('temp.tif', cols, rows, 1, gdal.GDT_Byte)
target_ds.SetGeoTransform((x_min, pixelWidth, 0, y_min, 0, pixelHeight))
band = target_ds.GetRasterBand(1)
NoData_value = 0
band.SetNoDataValue(NoData_value)
band.FlushCache()
gdal.RasterizeLayer(target_ds, [1], source_layer, options = ["ATTRIBUTE=VALUE"])
target_dsSRS = osr.SpatialReference()
target_dsSRS.ImportFromEPSG(4326)
target_ds.SetProjection(target_dsSRS.ExportToWkt())
k=gdal.Open('temp.tif').ReadAsArray()
Any idea how to do it? Thank you.
Finally, I figured it out. The raster dataset has to be closed so that the rasterized values are flushed to disk before the file is read back:
from osgeo import gdal, ogr

source_ds = ogr.Open("C:/Users/Nathan/Desktop/pl.shp")
source_layer = source_ds.GetLayer()
pixelWidth = pixelHeight = 0.001
x_min, x_max, y_min, y_max = source_layer.GetExtent()
cols = int((x_max - x_min) / pixelHeight)
rows = int((y_max - y_min) / pixelWidth)
target_ds = gdal.GetDriverByName('GTiff').Create('temp.tif', cols, rows, 1, gdal.GDT_Byte)  # note: GDT_Byte only holds integers 0-255; use gdal.GDT_Float32 if AVG is a float
target_ds.SetGeoTransform((x_min, pixelWidth, 0, y_min, 0, pixelHeight))
band = target_ds.GetRasterBand(1)
NoData_value = 0
band.SetNoDataValue(NoData_value)
band.FlushCache()
gdal.RasterizeLayer(target_ds, [1], source_layer, options = ["ATTRIBUTE=AVG"])
target_ds = None #this is the line that makes the difference
gdal.Open('temp.tif').ReadAsArray()
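The code above produces a single 2D grid. To get the 3D array the question asks for (one slice per time step), one option is to filter the layer on each distinct TIME value, rasterize each subset, and stack the slices. This is only a minimal sketch, assuming the shapefile fields are literally named TIME and AVG, TIME holds integer values, and an in-memory raster is used instead of the temporary GeoTIFF:

import numpy as np
from osgeo import gdal, ogr

source_ds = ogr.Open("C:/Users/Nathan/Desktop/pl.shp")
source_layer = source_ds.GetLayer()
pixelWidth = pixelHeight = 0.001
x_min, x_max, y_min, y_max = source_layer.GetExtent()
cols = int((x_max - x_min) / pixelWidth)
rows = int((y_max - y_min) / pixelHeight)

# collect the distinct TIME values present in the layer
times = sorted({feature.GetField("TIME") for feature in source_layer})
source_layer.ResetReading()

slices = []
for t in times:
    # keep only the polygons that belong to this time step
    source_layer.SetAttributeFilter("TIME = {}".format(t))

    # rasterize into an in-memory dataset; Float32 so the AVG values and NaN survive
    target_ds = gdal.GetDriverByName('MEM').Create('', cols, rows, 1, gdal.GDT_Float32)
    target_ds.SetGeoTransform((x_min, pixelWidth, 0, y_min, 0, pixelHeight))
    band = target_ds.GetRasterBand(1)
    band.SetNoDataValue(float('nan'))
    band.Fill(float('nan'))          # uncovered cells stay NaN instead of 0
    gdal.RasterizeLayer(target_ds, [1], source_layer, options=["ATTRIBUTE=AVG"])

    slices.append(target_ds.ReadAsArray())
    target_ds = None

source_layer.SetAttributeFilter(None)
cube = np.stack(slices)              # shape: (time, rows, cols)

Because uncovered cells are pre-filled with NaN, each slice keeps the same "no value" semantics as the source table, and cube[i] is the grid for times[i].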