Packages

package spark

Type Members

  1. class ClassLoaderExtractor extends Serializer

    The only purpose of this class is to extract the class loader of the given Spark serializer. It is implemented in Java to take advantage of Java's less restrictive interpretation of "protected".

  2. sealed abstract class KryoProtoSerializer[T <: AbstractMessage] extends Serializer[T]

    Serializer for the Kryo framework for any protobuf message, using the protobuf serialization internally. Individual message classes must still be registered in a class deriving from KryoRegistrator; see the registration sketch under KryoRegistrator below.

  3. class KryoProtoSerializerV2 extends KryoProtoSerializer[GeneratedMessage]
  4. class KryoProtoSerializerV3 extends KryoProtoSerializer[GeneratedMessageV3]
  5. class KryoRegistrator extends org.apache.spark.serializer.KryoRegistrator

    The extended version of org.apache.spark.serializer.KryoRegistrator provided by the processing library; it registers important internal classes.

    Note

    The class must be declared inside a package, like a normal top-level class. Defining the class inside another class, by contrast, may not work and can fail with errors like "Failed to register classes with Kryo" caused by a java.lang.NoSuchMethodException when Kryo tries to call the constructor.
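
    The following is a minimal sketch of wiring this up, under two assumptions about this library's API: that registerClasses can be overridden and chained via super, and that KryoProtoSerializerV3 has a no-argument constructor. MyEvent is a hypothetical protobuf-generated message class (a GeneratedMessageV3).

      package com.example.myapp

      import com.esotericsoftware.kryo.Kryo
      import _root_.spark.{KryoProtoSerializerV3, KryoRegistrator}

      // Declared at package level, as the note above requires.
      class MyKryoRegistrator extends KryoRegistrator {
        override def registerClasses(kryo: Kryo): Unit = {
          super.registerClasses(kryo) // keep the library's internal registrations
          // MyEvent is a hypothetical protobuf-generated message class;
          // each message class is registered with the protobuf-backed serializer.
          kryo.register(classOf[MyEvent], new KryoProtoSerializerV3)
        }
      }

    The registrator would then be enabled through the standard Spark settings spark.serializer=org.apache.spark.serializer.KryoSerializer and spark.kryo.registrator=com.example.myapp.MyKryoRegistrator.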

  6. class SparkSingleton[T <: AnyRef] extends Serializable

    Class to build instances that are unique within one Spark executor. If a Spark singleton is serialized multiple times as part of different closures, instance will nevertheless return the same reference within one JVM. Instances can only be built from the Spark driver.
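
    A hypothetical usage sketch follows. The factory-function constructor shown here is an assumption (the actual construction API may differ); the instance member and the driver-only construction come from the description above, and ConnectionPool is an invented stand-in resource.

      import org.apache.spark.rdd.RDD
      import _root_.spark.SparkSingleton

      // ConnectionPool stands in for any expensive, executor-local resource.
      class ConnectionPool { def lookup(id: Long): String = id.toString }

      object SingletonExample {
        // Runs on the driver, which is where Spark singletons must be built.
        def enrich(ids: RDD[Long]): RDD[String] = {
          val pool = new SparkSingleton(() => new ConnectionPool) // assumed constructor
          ids.mapPartitions { it =>
            // The same reference is returned for every task in this executor
            // JVM, even if `pool` was captured by several different closures.
            val p = pool.instance
            it.map(p.lookup)
          }
        }
      }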

  7. case class TaskInfo(stageId: Int, partitionId: Int, attemptNumber: Int, taskAttemptId: Long) extends Product with Serializable

    Information about the current Spark task. This is a read-only version of Spark's TaskContext.

    stageId

    The ID of the Spark stage the task is running in.

    partitionId

    The Spark partition ID.

    attemptNumber

    The attempt number of the task (0 based).

    taskAttemptId

    The unique ID of the task attempt.
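
    A minimal sketch of populating a TaskInfo from Spark's own TaskContext inside a running task; the companion object may provide a factory of its own, so the manual construction here is only an illustration.

      import org.apache.spark.TaskContext
      import _root_.spark.TaskInfo

      object TaskInfoExample {
        // Valid only inside a running task, where TaskContext.get() is non-null.
        def currentTaskInfo(): TaskInfo = {
          val ctx = TaskContext.get()
          TaskInfo(
            stageId = ctx.stageId(),
            partitionId = ctx.partitionId(),
            attemptNumber = ctx.attemptNumber(),
            taskAttemptId = ctx.taskAttemptId()
          )
        }
      }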

Value Members

  1. object Implicits
  2. object KryoRegistrator
  3. object TaskInfo extends Serializable
  4. object TaskInfoAwareLogContext
  5. object TaskInfoAwareThreadFactory
