
Re: Reading from S3

Guillaume Lostis <g.lostis@...>
 

Hi,

I will add to the previous message that if you want to specify a non-default region, the environment variable you're looking for is probably AWS_REGION (or AWS_DEFAULT_REGION starting with GDAL 2.3), rather than AWS_S3_ENDPOINT (see https://gdal.org/user/virtual_file_systems.html#vsis3-aws-s3-files-random-reading).

Also, I have successfully used rasterio on private AWS S3 buckets without having to touch any environment variable, so unless I have misunderstood your case, no extra configuration should be necessary.

Best,

Guillaume Lostis


Re: Reading from S3

Sean Gillies
 

Hi,

The following log message catches my eye:
env: AWS_S3_ENDPOINT="us-west-1"
If that is set in your notebook's environment, it will override the value you pass to Env() in your program, and it looks to be incorrect.
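One way to clear the stray variable in the notebook before entering Env() is a sketch like this (the variable name is taken from the log above):

```python
import os

# Drop the misconfigured variable so it cannot leak into GDAL's
# configuration ("us-west-1" is a region name, not an endpoint hostname).
stale = os.environ.pop("AWS_S3_ENDPOINT", None)
if stale is not None:
    print(f"removed AWS_S3_ENDPOINT={stale!r}")
```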

On Thu, Sep 5, 2019 at 8:17 AM <hughes.lloyd@...> wrote:

I am trying to read a GeoTIFF from a private AWS S3 bucket. I have configured GDAL and the appropriate files ~/.aws/config and ~/.aws/credentials. I am using a non-standard AWS region as well, so I needed to set the AWS_S3_ENDPOINT environment variable.

I am able to read the GeoTIFF information using both gdalinfo and rio:

$ gdalinfo /vsis3/s1-image-dataset/test.tif
Driver: GTiff/GeoTIFF
Files: /vsis3/s1-image-dataset/test.tif
Size is 33959, 38507
Coordinate System is:
PROJCS["WGS 84 / UTM zone 17N",
....

and using rio:

$ rio info s3://s1-image-dataset/test.tif
{"bounds": [689299.5634174921, 2622862.3065700093, 1028889.5634174921, 3007932.3065700093], "colorinterp": ["gray"], "compress": "deflate", "count": 1, "crs": "EPSG:32617", "descriptions": [null], "driver": "GTiff" ....

However, when I try to read it in a script using the rasterio Python API, I receive the following error:

CPLE_OpenFailedError: '/vsis3/s1-image-dataset/test.tif' not recognized as a supported file format.

The code that produces the issue is:

import rasterio
path = "s3://s1-image-dataset/test.tif"
with rasterio.Env(AWS_S3_ENDPOINT='s3.<my region>.amazonaws.com'):
    with rasterio.open(path) as f:
        img = f.read()

This is using Python 3.7, rasterio 1.0.25, and GDAL 2.4.2.

The problem only occurs when running this in a Jupyter Notebook (Pangeo to be precise), and it appears that Rasterio exits the environment prematurely:

DEBUG:rasterio.env:Entering env context: <rasterio.env.Env object at 0x7f97fb41d898>
DEBUG:rasterio.env:Starting outermost env
DEBUG:rasterio.env:No GDAL environment exists
DEBUG:rasterio.env:New GDAL environment <rasterio._env.GDALEnv object at 0x7f97fb41d908> created
DEBUG:rasterio._env:GDAL_DATA found in environment: '/srv/conda/envs/notebook/share/gdal'.
DEBUG:rasterio._env:PROJ_LIB found in environment: '/srv/conda/envs/notebook/share/proj'.
DEBUG:rasterio._env:Started GDALEnv <rasterio._env.GDALEnv object at 0x7f97fb41d908>.
DEBUG:rasterio.env:Entered env context: <rasterio.env.Env object at 0x7f97fb41d898>
DEBUG:rasterio.env:Got a copy of environment <rasterio._env.GDALEnv object at 0x7f97fb41d908> options
DEBUG:rasterio.env:Entering env context: <rasterio.env.Env object at 0x7f97fb3c5898>
DEBUG:rasterio.env:Got a copy of environment <rasterio._env.GDALEnv object at 0x7f97fb41d908> options
DEBUG:rasterio.env:Entered env context: <rasterio.env.Env object at 0x7f97fb3c5898>
DEBUG:rasterio._base:Sharing flag: 32
DEBUG:rasterio.env:Exiting env context: <rasterio.env.Env object at 0x7f97fb3c5898>
DEBUG:rasterio.env:Cleared existing <rasterio._env.GDALEnv object at 0x7f97fb41d908> options
DEBUG:rasterio._env:Stopped GDALEnv <rasterio._env.GDALEnv object at 0x7f97fb41d908>.
DEBUG:rasterio.env:No GDAL environment exists
DEBUG:rasterio.env:New GDAL environment <rasterio._env.GDALEnv object at 0x7f97fb41d908> created
DEBUG:rasterio._env:GDAL_DATA found in environment: '/srv/conda/envs/notebook/share/gdal'.
DEBUG:rasterio._env:PROJ_LIB found in environment: '/srv/conda/envs/notebook/share/proj'.
DEBUG:rasterio._env:Started GDALEnv <rasterio._env.GDALEnv object at 0x7f97fb41d908>.
DEBUG:rasterio.env:Exited env context: <rasterio.env.Env object at 0x7f97fb3c5898>
DEBUG:rasterio.env:Exiting env context: <rasterio.env.Env object at 0x7f97fb41d898>
DEBUG:rasterio.env:Cleared existing <rasterio._env.GDALEnv object at 0x7f97fb41d908> options
DEBUG:rasterio._env:Stopped GDALEnv <rasterio._env.GDALEnv object at 0x7f97fb41d908>.
DEBUG:rasterio.env:Exiting outermost env
DEBUG:rasterio.env:Exited env context: <rasterio.env.Env object at 0x7f97fb41d898>
env: AWS_ACCESS_KEY_ID="XXXXXX"
env: AWS_SECRET_ACCESS_KEY="XXXXXXXX"
env: AWS_S3_ENDPOINT="us-west-1"
---------------------------------------------------------------------------
CPLE_OpenFailedError                      Traceback (most recent call last)
rasterio/_base.pyx in rasterio._base.DatasetBase.__init__()

rasterio/_shim.pyx in rasterio._shim.open_dataset()

rasterio/_err.pyx in rasterio._err.exc_wrap_pointer()

CPLE_OpenFailedError: '/vsis3/s1-image-dataset/test.tif' does not exist in the file system, and is not recognized as a supported dataset name.

--
Sean Gillies


Re: Reading from S3

hughes.lloyd@...
 

I am trying to read a GeoTIFF from a private AWS S3 bucket. I have configured GDAL and the appropriate files ~/.aws/config and ~/.aws/credentials. I am using a non-standard AWS region as well, so I needed to set the AWS_S3_ENDPOINT environment variable.

I am able to read the GeoTIFF information using both gdalinfo and rio:

$ gdalinfo /vsis3/s1-image-dataset/test.tif
Driver: GTiff/GeoTIFF
Files: /vsis3/s1-image-dataset/test.tif
Size is 33959, 38507
Coordinate System is:
PROJCS["WGS 84 / UTM zone 17N",
....

and using rio:

$ rio info s3://s1-image-dataset/test.tif
{"bounds": [689299.5634174921, 2622862.3065700093, 1028889.5634174921, 3007932.3065700093], "colorinterp": ["gray"], "compress": "deflate", "count": 1, "crs": "EPSG:32617", "descriptions": [null], "driver": "GTiff" ....

However, when I try to read it in a script using the rasterio Python API, I receive the following error:

CPLE_OpenFailedError: '/vsis3/s1-image-dataset/test.tif' not recognized as a supported file format.

The code that produces the issue is:

import rasterio
path = "s3://s1-image-dataset/test.tif"
with rasterio.Env(AWS_S3_ENDPOINT='s3.<my region>.amazonaws.com'):
    with rasterio.open(path) as f:
        img = f.read()

This is using Python 3.7, rasterio 1.0.25, and GDAL 2.4.2.

The problem only occurs when running this in a Jupyter Notebook (Pangeo to be precise), and it appears that Rasterio exits the environment prematurely:

---------------------------------------------------------------------------
CPLE_OpenFailedError                      Traceback (most recent call last)
rasterio/_base.pyx in rasterio._base.DatasetBase.__init__()

rasterio/_shim.pyx in rasterio._shim.open_dataset()

rasterio/_err.pyx in rasterio._err.exc_wrap_pointer()

CPLE_OpenFailedError: '/vsis3/s1-image-dataset/test.tif' does not exist in the file system, and is not recognized as a supported dataset name.






Reading from S3

hughes.lloyd@...
 

I am trying to read a GeoTIFF from my private S3 bucket (mapping), but am receiving the following error message:

CPLE_OpenFailedError: '/vsis3/mapping/bahamas/S1A_20190821T231100.tif' does not exist in the file system, and is not recognized as a supported dataset name

The code I am using to open the GeoTIFF is:

with rasterio.Env(session=AWSSession(aws_secret_access_key=S3_SECRET, aws_access_key_id=S3_KEY, region_name="us-west-1")) as env:
    rasterio.open("s3://mapping/bahamas/S1A_20190821T231100.tif")

I am using rasterio version 1.0.25 and the application is single-threaded. The file does exist and I can access it using s3fs and awscli. What am I missing?


Re: rasterio.windows.transform seems to not scale my windows correctly, am I using it wrong?

Sean Gillies
 

Hi Ryan,

I've been on vacation, just now getting the time to answer questions. Answers below.

On Wed, Aug 21, 2019 at 3:13 PM Ryan Avery <ravery@...> wrote:
I have a Landsat band that I have windowed so that I now have about 200 512x512 windows in a list called chip_list_full. I have plotted these on top of my AOI to confirm that the windowing worked, and found that it did not work as expected. Following this Stack Overflow example code, I computed a transform for each window using the original dataset transform; the result is that the windows are diagonally offset from the upper left corner of the original image.



Each window in the image above was plotted like so, with a unique transform called custom_window_transform:
chips_with_labels = []
fig, ax = plt.subplots()
gdf.plot(ax=ax)
for chip in chip_list_full:
    custom_window_transform = windows.transform(chip[0], band.transform)
    window_bbox = coords.BoundingBox(*windows.bounds(chip[0], custom_window_transform))
    window_poly = rio_bbox_to_polygon(window_bbox)
    gpd.GeoDataFrame(geometry=gpd.GeoSeries(window_poly), crs=band.meta['crs'].to_dict()).plot(ax = ax, cmap='cubehelix')

When I change the custom_window_transform to instead be the original band transform, the result is correct:



chips_with_labels = []
fig, ax = plt.subplots()
gdf.plot(ax=ax)
for chip in chip_list_full:
    window_bbox = coords.BoundingBox(*windows.bounds(chip[0], band.transform))
    window_poly = rio_bbox_to_polygon(window_bbox)
    gpd.GeoDataFrame(geometry=gpd.GeoSeries(window_poly), crs=band.meta['crs'].to_dict()).plot(ax = ax, cmap='cubehelix')

where "band.transform" is my original image transform, and the dataset I have windowed.

My question is, what is the purpose of windows.transform and is there some other way I should be using it in this case? Or is my use of the original dataset transform correct?

Thanks for the feedback!

Your use of windows.transform in the first case is correct, and your use of windows.bounds in the second case is correct. Where you went wrong, probably due to sketchy documentation, is in

    window_bbox = coords.BoundingBox(*windows.bounds(chip[0], custom_window_transform))

The second argument of the windows.bounds function (see https://rasterio.readthedocs.io/en/latest/api/rasterio.windows.html#rasterio.windows.bounds) must be the affine transformation for the "dataset" on which we're applying the given window, not any other affine transformation. I really must explain this better in the docs, sorry about that.

--
Sean Gillies


Re: multi-dimensional support

Sean Gillies
 

Hi Norman, Howard,

I'm going to move this discussion over to https://rasterio.groups.io/g/dev/messages and continue there.


On Fri, Aug 23, 2019 at 11:30 AM Norman Barker <norman.barker@...> wrote:
I was one of the stakeholders for subdataset support in GDAL with netCDF and it worked well with what we were trying to achieve back then, serving regularly gridded time series netcdf data through a WCS, I believe others have used subdataset support in the same way. It was possible to make this work by using external indexes and subdatasets. 

I also agree with your comment that Rasterio is a relatively small project and the code needs to have active users.

The main benefit is a common API for multi-dimensional data access within GDAL. Currently, using gdalinfo against HDF, netCDF or TileDB requires reading the output to understand the available data, or writing a parser for each of these format drivers' metadata. These drivers have no common way to advertise through an API the dimensions and attributes they support. Because implementing subdataset support has been a little ad hoc, the access patterns are slightly different across drivers; the new API enforces a convention.

Killer features? A couple come to mind: accessing data cubes with a common API to retrieve data along a z dimension, or sliced by time. These use cases would benefit from being supported in rasterio and using xarray/dask to process multi-dimensional data.

I will create a strawman for the API changes and if you and the community are interested then I can start on the code. 

Norman



On Fri, Aug 23, 2019 at 7:51 AM Howard Butler <howard@...> wrote:


On Aug 23, 2019, at 9:29 AM, Sean Gillies <sean.gillies@...> wrote:


I'm also a bit concerned about the small number of stakeholders for the new GDAL API. It appears to be only the HDF Group (yes?) with only three GDAL TC members voting to adopt it. The rest of the GDAL community seemed ambivalent.

Most folks are ambivalent about multi-dimensional support in GDAL, and they were ambivalent about subdatasets before that (which were a deficient implementation in a number of ways which precipitated the RFC). The RFC moved things forward in a positive direction, and it wasn't just about giving HDFLand a clean mapping to GDAL. It was about giving GDALLand the ability to more easily speak to an additional family of raster-like data. 

GDAL drivers that speak zarr, TileDB, Arrow, and HDF can now be adapted without the miserable compromises that subdatasets required in usability and data fidelity. That will allow people to bring the GDAL geo goodness to their data without reformatting simply to push it through the tool. I think these generic data structures are seeing much more action because they allow data-level interop without special purpose drivers across multiple software runtimes. The winds are blowing the same direction in point cloud land too.

Rasterio is a pretty small project and, in my opinion, can't afford to develop code that isn't going to be widely used.

A completely reasonable position.

Howard




--
Sean Gillies


Rasterio 1.0.26

Sean Gillies
 

Hi all,

Rasterio 1.0.26 wheels and source distribution are on PyPI today. There are eight bug fixes in this release.


Thank you for the reports and discussion, and big thanks to Alan Snow for the contributions around coordinate reference system interoperability.

I've finessed the Linux and OS X wheel builds so that for the first time we can include size- and speed-optimized GDAL shared libraries. An install of one of the Linux wheels now takes up "only" 52 MB, 39 MB of which are shared libraries. The wheels themselves are down to 15 MB.

Share and enjoy,

--
Sean Gillies


Re: multi-dimensional support

Norman Barker
 

I was one of the stakeholders for subdataset support in GDAL with netCDF and it worked well with what we were trying to achieve back then, serving regularly gridded time series netcdf data through a WCS, I believe others have used subdataset support in the same way. It was possible to make this work by using external indexes and subdatasets. 

I also agree with your comment that Rasterio is a relatively small project and the code needs to have active users.

The main benefit is a common API for multi-dimensional data access within GDAL. Currently, using gdalinfo against HDF, netCDF or TileDB requires reading the output to understand the available data, or writing a parser for each of these format drivers' metadata. These drivers have no common way to advertise through an API the dimensions and attributes they support. Because implementing subdataset support has been a little ad hoc, the access patterns are slightly different across drivers; the new API enforces a convention.

Killer features? A couple come to mind: accessing data cubes with a common API to retrieve data along a z dimension, or sliced by time. These use cases would benefit from being supported in rasterio and using xarray/dask to process multi-dimensional data.

I will create a strawman for the API changes and if you and the community are interested then I can start on the code. 

Norman



On Fri, Aug 23, 2019 at 7:51 AM Howard Butler <howard@...> wrote:


On Aug 23, 2019, at 9:29 AM, Sean Gillies <sean.gillies@...> wrote:


I'm also a bit concerned about the small number of stakeholders for the new GDAL API. It appears to be only the HDF Group (yes?) with only three GDAL TC members voting to adopt it. The rest of the GDAL community seemed ambivalent.

Most folks are ambivalent about multi-dimensional support in GDAL, and they were ambivalent about subdatasets before that (which were a deficient implementation in a number of ways which precipitated the RFC). The RFC moved things forward in a positive direction, and it wasn't just about giving HDFLand a clean mapping to GDAL. It was about giving GDALLand the ability to more easily speak to an additional family of raster-like data. 

GDAL drivers that speak zarr, TileDB, Arrow, and HDF can now be adapted without the miserable compromises that subdatasets required in usability and data fidelity. That will allow people to bring the GDAL geo goodness to their data without reformatting simply to push it through the tool. I think these generic data structures are seeing much more action because they allow data-level interop without special purpose drivers across multiple software runtimes. The winds are blowing the same direction in point cloud land too.

Rasterio is a pretty small project and, in my opinion, can't afford to develop code that isn't going to be widely used.

A completely reasonable position.

Howard



Re: multi-dimensional support

Howard Butler
 



On Aug 23, 2019, at 9:29 AM, Sean Gillies <sean.gillies@...> wrote:


I'm also a bit concerned about the small number of stakeholders for the new GDAL API. It appears to be only the HDF Group (yes?) with only three GDAL TC members voting to adopt it. The rest of the GDAL community seemed ambivalent.

Most folks are ambivalent about multi-dimensional support in GDAL, and they were ambivalent about subdatasets before that (which were a deficient implementation in a number of ways which precipitated the RFC). The RFC moved things forward in a positive direction, and it wasn't just about giving HDFLand a clean mapping to GDAL. It was about giving GDALLand the ability to more easily speak to an additional family of raster-like data. 

GDAL drivers that speak zarr, TileDB, Arrow, and HDF can now be adapted without the miserable compromises that subdatasets required in usability and data fidelity. That will allow people to bring the GDAL geo goodness to their data without reformatting simply to push it through the tool. I think these generic data structures are seeing much more action because they allow data-level interop without special purpose drivers across multiple software runtimes. The winds are blowing the same direction in point cloud land too.

Rasterio is a pretty small project and, in my opinion, can't afford to develop code that isn't going to be widely used.

A completely reasonable position.

Howard



Re: multi-dimensional support

Sean Gillies
 

Hi Norman,

I would need to see a strawman proposal of how rasterio's dataset open/read/write API would be extended before I could support the work.

I'm also a bit concerned about the small number of stakeholders for the new GDAL API. It appears to be only the HDF Group (yes?) with only three GDAL TC members voting to adopt it. The rest of the GDAL community seemed ambivalent. Rasterio is a pretty small project and, in my opinion, can't afford to develop code that isn't going to be widely used.

I find the Python GDAL example of using the new API to be underwhelming: https://github.com/rouault/gdal/blob/rfc75/gdal/doc/source/tutorials/multidimensional_api_tut.rst#in-python. I think that functionality already exists with subdatasets, no? If there are some killer new features, I'd like to see them.


On Thu, Aug 22, 2019 at 4:11 PM Norman Barker <norman.barker@...> wrote:
Hi,

I started discussing support for multiple dimensions in https://github.com/mapbox/rasterio/issues/1759 but am moving this to a wider audience.

GDAL has added an implementation of https://gdal.org/development/rfc/rfc75_multidimensional_arrays.html and I would like to add this to rasterio by extending the existing rasterio API.

Is this of interest?

Norman



--
Sean Gillies


Re: multi-dimensional support

Alan Snow
 

I think this would be quite useful.

I made a start at it using subdatasets in rioxarray here: https://github.com/corteva/rioxarray/pull/33
But it is missing 1-dimensional variables such as time at the moment (and probably other items as well).

What you mentioned would be a much nicer solution.


multi-dimensional support

Norman Barker
 

Hi,

I started discussing support for multiple dimensions in https://github.com/mapbox/rasterio/issues/1759 but am moving this to a wider audience.

GDAL has added an implementation of https://gdal.org/development/rfc/rfc75_multidimensional_arrays.html and I would like to add this to rasterio by extending the existing rasterio API.

Is this of interest?

Norman


rasterio.windows.transform seems to not scale my windows correctly, am I using it wrong?

Ryan Avery
 

I have a Landsat band that I have windowed so that I now have about 200 512x512 windows in a list called chip_list_full. I have plotted these on top of my AOI to confirm that the windowing worked, and found that it did not work as expected. Following this Stack Overflow example code, I computed a transform for each window using the original dataset transform; the result is that the windows are diagonally offset from the upper left corner of the original image.



Each window in the image above was plotted like so, with a unique transform called custom_window_transform:
chips_with_labels = []
fig, ax = plt.subplots()
gdf.plot(ax=ax)
for chip in chip_list_full:
    custom_window_transform = windows.transform(chip[0], band.transform)
    window_bbox = coords.BoundingBox(*windows.bounds(chip[0], custom_window_transform))
    window_poly = rio_bbox_to_polygon(window_bbox)
    gpd.GeoDataFrame(geometry=gpd.GeoSeries(window_poly), crs=band.meta['crs'].to_dict()).plot(ax = ax, cmap='cubehelix')

When I change the custom_window_transform to instead be the original band transform, the result is correct:



chips_with_labels = []
fig, ax = plt.subplots()
gdf.plot(ax=ax)
for chip in chip_list_full:
    window_bbox = coords.BoundingBox(*windows.bounds(chip[0], band.transform))
    window_poly = rio_bbox_to_polygon(window_bbox)
    gpd.GeoDataFrame(geometry=gpd.GeoSeries(window_poly), crs=band.meta['crs'].to_dict()).plot(ax = ax, cmap='cubehelix')

where "band.transform" is my original image transform, and the dataset I have windowed.

My question is, what is the purpose of windows.transform and is there some other way I should be using it in this case? Or is my use of the original dataset transform correct?

Thanks for the feedback!


Re: Geotiff max/min coordinates question

Amine Aboufirass <amine.aboufirass@...>
 

Couldn't you use the .bounds attribute on your dataset reader object?


On Fri, Aug 9, 2019 at 11:26 PM <j.carnes2553@...> wrote:
Hi everyone,

I am very new to rasterio and GIS in general. I really like the rasterio library so far, but I had a question about doing something similar to a GDAL function. I am trying to get the max/min coordinates for a GeoTIFF file in lat/long format so that I can add the PNG of the GeoTIFF files to a map in Python.

I found a way to do it in gdal but would rather use rasterio. I checked out the rasterio documentation and thought that maybe I should look into using the transform function but noticed the outputs weren't in lat/long. Any info on how to get the max/mins in lat/long format from a geotiff file using rasterio would be greatly appreciated. 

Thanks in advance for the help!


Re: Geotiff max/min coordinates question

Alan Snow
 

I am not entirely sure what you are referring to, but I am assuming you want the bounds of the raster in lat/lon?
If so, then `EPSG:4326` is a pretty standard projection for lat/lon coordinates.
So, you would need to transform your bounds to lat/lon or EPSG:4326.

If this is the case, then this should work:

```python
import rasterio
from rasterio.warp import transform_bounds

with rasterio.open("file.tif") as rds:
    wgs84_bounds = transform_bounds(rds.crs, "epsg:4326", *rds.bounds)
```

If not, are you able to share a snippet of the GDAL Python code you used?


Geotiff max/min coordinates question

j.carnes2553@...
 

Hi everyone,

I am very new to rasterio and GIS in general. I really like the rasterio library so far, but I had a question about doing something similar to a GDAL function. I am trying to get the max/min coordinates for a GeoTIFF file in lat/long format so that I can add the PNG of the GeoTIFF files to a map in Python.

I found a way to do it in gdal but would rather use rasterio. I checked out the rasterio documentation and thought that maybe I should look into using the transform function but noticed the outputs weren't in lat/long. Any info on how to get the max/mins in lat/long format from a geotiff file using rasterio would be greatly appreciated. 

Thanks in advance for the help!


Re: [feature] List tags namespaces

Guillaume Lostis <g.lostis@...>
 

Great, thank you very much Vincent!

Guillaume Lostis


Re: [feature] List tags namespaces

vincent.sarago@...
 

I started a PR over https://github.com/mapbox/rasterio/pull/1740


Re: [feature] List tags namespaces

vincent.sarago@...
 

Hi Guillaume,
I'm +1 on the idea of adding this feature.

I've had the same challenge while working on rio-cogeo; if there is a GDAL API for this, I think it could be useful to add a method in rasterio.


Vincent


[feature] List tags namespaces

Guillaume Lostis <g.lostis@...>
 

Hi all,

I would like to be able to list a file's namespaces for tags. I am handling files with several tag "namespaces" (default, IMAGE_STRUCTURE, and RPC for example). I want to create a "copy" of this file by copying its profile and tags, so I need to find a way to get "all" tags from the source file, by listing the namespaces available for that dataset for example.

I didn't find a way in rasterio to list a dataset's tag namespaces, but I found that there's a GDAL function that has done this since GDAL 1.11.0 (see GDAL RFC 43 and the GDAL documentation for that function). To use that function you can for example run:
gdalinfo -listmdd file.tiff

I would like to know if someone knows a way to list tag namespaces in rasterio, and if not, if you agree that this would be a nice feature to have with something along the lines of:
with rasterio.open("file.tiff") as f:
    ns = f.get_tags_ns()

    # or ns = f.tags_ns
>> ns = ["default", "IMAGE_STRUCTURE", "RPC"]

I would have liked to propose an implementation of this myself, but my knowledge of GDAL's API and of Cython is close to zero...

Thanks,


Guillaume Lostis
