I have a shapefile and I need to read it from my Java code. I used the code below to read the shapefile.

import java.io.File;
import java.util.HashMap;
import java.util.Map;

import org.geotools.data.DataStore;
import org.geotools.data.DataStoreFinder;
import org.geotools.data.simple.SimpleFeatureCollection;
import org.geotools.data.simple.SimpleFeatureSource;
import org.geotools.feature.FeatureIterator;
import org.geotools.util.URLs;
import org.locationtech.jts.geom.LineString;
import org.locationtech.jts.geom.MultiLineString;
import org.locationtech.jts.geom.MultiPolygon;
import org.locationtech.jts.geom.Polygon;
import org.opengis.feature.simple.SimpleFeature;
import org.opengis.feature.simple.SimpleFeatureType;

public class App {
    public static void main(String[] args) {
        File file = new File("C:\\Test\\sample.shp");
        Map<String, Object> map = new HashMap<>();
        try {
            map.put("url", URLs.fileToUrl(file));
            DataStore dataStore = DataStoreFinder.getDataStore(map);
            String typeName = dataStore.getTypeNames()[0];
            SimpleFeatureSource source = dataStore.getFeatureSource(typeName);
            SimpleFeatureCollection collection = source.getFeatures();

            try (FeatureIterator<SimpleFeature> features = collection.features()) {
                while (features.hasNext()) {
                    SimpleFeature feature = features.next();
                    SimpleFeatureType schema = feature.getFeatureType();
                    Class<?> geomType = schema.getGeometryDescriptor().getType().getBinding();

                    String type = "";
                    if (Polygon.class.isAssignableFrom(geomType) || MultiPolygon.class.isAssignableFrom(geomType)) {
                        MultiPolygon geom = (MultiPolygon) feature.getDefaultGeometry();
                        type = "Polygon";
                        if (geom.getNumGeometries() > 1) {
                            type = "MultiPolygon";
                        }
                    } else if (LineString.class.isAssignableFrom(geomType)
                            || MultiLineString.class.isAssignableFrom(geomType)) {
                        type = "Line";
                    } else {
                        type = "Point";
                    }
                    System.out.println(type + ": " + feature.getDefaultGeometryProperty().getValue());
                }
            }
            dataStore.dispose();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

I got the desired output. But my requirement is to write an AWS Lambda function that reads the shapefile. For this, I created a Lambda Java project with an S3 event and wrote the same code inside handleRequest. I uploaded the project as a Lambda function and added a trigger, so that when I upload a .shp file to an S3 bucket the Lambda function is invoked automatically. But I am getting an error like the one below:

java.lang.RuntimeException: java.io.FileNotFoundException: /sample.shp (No such file or directory)

I have the sample.shp file inside my S3 bucket. I went through the link below: How to write an S3 object to a file?

I am getting the same error there too. I tried changing my code to the following:

S3Object object = s3.getObject(new GetObjectRequest(bucket, key));
InputStream objectData = object.getObjectContent();
map.put("url", objectData);

instead of

File file = new File("C:\\Test\\sample.shp");
map.put("url", URLs.fileToUrl(file));

Now I am getting an error like below:

java.lang.NullPointerException

Also I tried the below code

DataStore dataStore = DataStoreFinder.getDataStore(objectData);

instead of

DataStore dataStore = DataStoreFinder.getDataStore(map);

the error was like below

java.lang.ClassCastException: com.amazonaws.services.s3.model.S3ObjectInputStream cannot be cast to java.util.Map

I also tried adding the key directly to the map, and also as a DataStore object. Everything went wrong.

Is there anyone who can help me? It would be very helpful.

3 Comments
  • Are you writing files in your Lambda code? You can't write to the / folder; you must use /tmp/ for temporary file writes. Commented Jan 11, 2018 at 13:53
  • Thanks for your reply. No, I am writing my data to a Kinesis stream using this Lambda function; before calling putRecord to Kinesis I need to read the .shp file. As a newbie to AWS, where should I change my code? Commented Jan 11, 2018 at 14:07
  • @TLPNull As you mentioned, I tried with the /tmp/ folder. I created a tmp folder inside my bucket and uploaded the shapefile to that /tmp folder, and I am getting the same error as before: "java.lang.RuntimeException: java.io.FileNotFoundException: /tmp/sample.shp (No such file or directory)". I am not sure whether I did it right. Commented Jan 11, 2018 at 14:27
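To clarify the comment thread above: /tmp here means the Lambda container's local filesystem, not a folder in the S3 bucket. A Lambda function's filesystem is read-only except for /tmp, so any download from S3 must land there. A minimal stdlib sketch (path and file name are illustrative):

```java
import java.io.File;
import java.io.IOException;

public class TmpTarget {
    public static void main(String[] args) throws IOException {
        // /tmp is the only writable location inside a Lambda container,
        // so create the download target there.
        File local = File.createTempFile("sample-", ".shp", new File("/tmp"));
        System.out.println(local.getAbsolutePath().startsWith("/tmp/"));
        local.delete(); // clean up the placeholder
    }
}
```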

1 Answer


The DataStoreFinder.getDataStore method in GeoTools requires you to provide a map containing a key/value pair with the key "url". The value associated with that "url" key needs to be a file URL like "file://host/path/my.shp".

You're trying to insert a Java input stream into the map. That won't work, because it's not a file URL.

The GeoTools library does not accept http/https URLs (see the GeoTools code here and here), so you need a file:// URL. That means you will need to download the file from S3 to the local Lambda filesystem and then provide a file:// URL pointing to that local file. To do that, here's Java code that should work:

// get the shape file from S3 to the local filesystem
File localshp = new File("/tmp/download.shp");
s3.getObject(new GetObjectRequest(bucket, key), localshp);

// now store a file:// URL in the map
map.put("url", localshp.toURI().toURL().toString());
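Note that java.io.File has no getURI()/getURL() methods; the standard route to a file: URL is File.toURI().toURL(). A self-contained check of that conversion:

```java
import java.io.File;
import java.net.URL;

public class FileUrlDemo {
    public static void main(String[] args) throws Exception {
        File localshp = new File("/tmp/download.shp");
        // toURI() handles character escaping; toURL() yields the
        // file: URL that the GeoTools "url" parameter expects
        URL url = localshp.toURI().toURL();
        System.out.println(url); // file:/tmp/download.shp on Unix
    }
}
```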

If the geotools library had accepted real URLs (not just file:// URLs) then you could have avoided the download and simply created a time-limited, pre-signed URL for the S3 object and put that URL into the map.

Here's an example of how to do that:

// get current time and add one hour
java.util.Date expiration = new java.util.Date();
long msec = expiration.getTime();
msec += 1000 * 60 * 60;
expiration.setTime(msec);

// request pre-signed URL that will allow bearer to GET the object
GeneratePresignedUrlRequest gpur = new GeneratePresignedUrlRequest(bucket, key);
gpur.setMethod(HttpMethod.GET);
gpur.setExpiration(expiration);

// get URL that will expire in one hour
URL url = s3.generatePresignedUrl(gpur);
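The same one-hour expiry can be computed more directly with java.time, avoiding the mutable Date arithmetic; Date.from bridges back to the java.util.Date that the AWS SDK v1 setter takes:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Date;

public class ExpiryDemo {
    public static void main(String[] args) {
        // one hour from now, as the java.util.Date the SDK expects
        Date expiration = Date.from(Instant.now().plus(Duration.ofHours(1)));

        // sanity check: roughly 3,600,000 ms in the future
        long deltaMs = expiration.getTime() - System.currentTimeMillis();
        System.out.println(deltaMs > 3590000 && deltaMs <= 3600000);
    }
}
```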

7 Comments

That problem is solved, but again I am getting java.lang.NullPointerException. It means the file is not processed by org.geotools.data.DataAccessFinder getDataStore. The errors are like below: "WARNING: Problem asking Shapefile if it can process request: java.lang.NullPointerException" "java.lang.NullPointerException" "at org.geotools.data.shapefile.ShapefileDataStoreFactory.canProcess(ShapefileDataStoreFactory.java:245)"
It looks like GeoTools may not support all forms of URLs, only file://, so I have updated the response.
This is actually what I did with my local file; it was working perfectly. But I need to read this shapefile from AWS Lambda using Java.
Yes, and that's why my updated code shows you how to download the file from S3 to the local filesystem in Lambda (download it to /tmp/). Once you have the file locally, the process is the same as it is outside of Lambda.
Hi, I have one more query regarding this question. As you mentioned, I wrote my code and it works fine. But I am getting false for while (features.hasNext()). In my local Java code it returns true and the shapefile is read, but in the AWS Lambda function it returns false and I am still stuck here. Can you help me with this?
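One possible cause of hasNext() returning false (not confirmed in this thread) is that a shapefile is really a set of files: the .shp holds the geometries, the .shx the index, and the .dbf the attributes. If only the .shp was downloaded to /tmp, GeoTools can open the store but find no usable features. Assuming the S3 keys follow the usual naming, the sidecar keys can be derived from the .shp key and each file downloaded next to it; a stdlib sketch of the key derivation (the key name is hypothetical):

```java
public class SidecarKeys {
    public static void main(String[] args) {
        String key = "sample.shp"; // hypothetical S3 key from the event

        // Derive the companion keys a shapefile needs (.shx index, .dbf attributes)
        for (String ext : new String[] {".shp", ".shx", ".dbf"}) {
            String sidecar = key.substring(0, key.length() - 4) + ext;
            // in the handler you would then call, for each sidecar:
            //   s3.getObject(new GetObjectRequest(bucket, sidecar),
            //                new File("/tmp/" + sidecar));
            System.out.println(sidecar);
        }
    }
}
```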
