Class DataClientHdfsContext

java.lang.Object
com.here.platform.data.client.hdfs.DataClientHdfsContext
All Implemented Interfaces:
DataClientContext

public class DataClientHdfsContext extends Object implements DataClientContext
Context holder with shared resources used by DataClient.

The context is not serializable (it contains threads and sockets) and must never be shared between the master and the workers.

  • Constructor Details

    • DataClientHdfsContext

      public DataClientHdfsContext(org.apache.pekko.actor.ActorSystem actorSystem)
  • Method Details

    • context

      public static DataClientHdfsContext context(com.typesafe.config.Config customConfig)
      Creates a context holder with shared resources used by DataClient, configured with customConfig.

      This context must always be stored in a local variable inside the node closure; otherwise, Spark will try to serialize the object.

      Calling

       Await.ready(CoordinatedShutdown(actorSystem).run(UnknownReason), 15.seconds) 
      is not required but recommended. It allows Pekko to finish pending jobs and release resources before the JVM hard-kills its daemon threads.
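      The pattern described above can be sketched as follows. This is a minimal, non-authoritative sketch: rdd and the body of the per-element loop are hypothetical placeholders, and only DataClientHdfsContext.context, actorSystem, and the CoordinatedShutdown call come from the documented API.

       import scala.concurrent.Await
       import scala.concurrent.duration._
       import com.typesafe.config.ConfigFactory
       import org.apache.pekko.actor.CoordinatedShutdown
       import org.apache.pekko.actor.CoordinatedShutdown.UnknownReason
       import com.here.platform.data.client.hdfs.DataClientHdfsContext

       rdd.foreachPartition { partition =>
         // Obtain the context as a LOCAL variable inside the closure, so Spark
         // never attempts to serialize it and ship it from the driver.
         val context = DataClientHdfsContext.context(ConfigFactory.empty())

         partition.foreach { element =>
           // ... use DataClient with `context` here (placeholder) ...
         }

         // Optional but recommended: let Pekko drain pending work and release
         // resources before the JVM hard-kills the daemon threads.
         Await.ready(CoordinatedShutdown(context.actorSystem()).run(UnknownReason), 15.seconds)
       }

      Note that the shutdown call terminates the actor system held by the context, so it should only run once the context is no longer needed within that closure.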
    • actorSystem

      public org.apache.pekko.actor.ActorSystem actorSystem()
      Specified by:
      actorSystem in interface DataClientContext
    • materializer

      public org.apache.pekko.stream.Materializer materializer()
      Specified by:
      materializer in interface DataClientContext
    • terminate

      public org.apache.pekko.Done terminate()
      Specified by:
      terminate in interface DataClientContext