Hi Team,
We are reading the vertex data from the graph and saving it into a DataFrame. With large data we are getting the issue below.
val sourceDf = gsrc.V().df
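For context, here is a minimal sketch of how gsrc and the vertex DataFrame are obtained in our job. This assumes gsrc is a DseGraphFrame created via spark.dseGraph; the graph name "my_graph" and the application name are placeholders, not our actual values:

import org.apache.spark.sql.SparkSession
import com.datastax.bdp.graph.spark.graphframe._   // assumed DSE GraphFrames import

val spark = SparkSession.builder().appName("vertex-export").getOrCreate()
val gsrc  = spark.dseGraph("my_graph")   // assumption: gsrc is the DseGraphFrame for our graph
val sourceDf = gsrc.V().df               // read all vertices into a DataFrame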
org.apache.spark.scheduler.TaskSetManager: Lost task 10169.0 in stage 0.0 (TID 10169, IPAddress, executor 2): java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
    at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:869)
    at
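As we understand it, this exception is thrown when Spark memory-maps a single block with FileChannelImpl.map, which cannot address more than Integer.MAX_VALUE bytes (about 2 GB), so at least one partition of the vertex data has grown past that limit. Below is a sketch of the repartitioning we are considering to keep each partition under 2 GB; the partition count 2000, the shuffle setting, and the output path are illustrative assumptions, not tested values:

spark.conf.set("spark.sql.shuffle.partitions", "2000")   // assumed value; raise shuffle parallelism
val sourceDf = gsrc.V().df.repartition(2000)             // spread vertex rows over more, smaller partitions
sourceDf.write.parquet("/tmp/vertices")                   // example sink; path is a placeholder

Does this look like the right direction, or is there a recommended way to read large vertex data without hitting the 2 GB block limit?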