package spark
Type Members
- class ClassLoaderExtractor extends Serializer
The only purpose of this class is to extract the class loader of the given Spark serializer. It is implemented in Java to use Java's less restrictive interpretation of "protected".
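Since the real class lives in Java, the following is only a reflection-based Scala sketch of the same idea; the field name defaultClassLoader is an assumption about the internals of org.apache.spark.serializer.Serializer:

  import org.apache.spark.serializer.Serializer

  // Reflection-based sketch, NOT the actual Java implementation: read the
  // protected `defaultClassLoader` member of a Spark serializer (the field
  // name is an assumption about Spark internals).
  object ClassLoaderExtractorSketch {
    def extract(serializer: Serializer): Option[ClassLoader] = {
      val field = classOf[Serializer].getDeclaredField("defaultClassLoader")
      field.setAccessible(true)
      field.get(serializer).asInstanceOf[Option[ClassLoader]]
    }
  }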
- sealed abstract class KryoProtoSerializer[T <: AbstractMessage] extends Serializer[T]
Serializer for the Kryo framework for any protobuf message, using the protobuf serialization internally. Individual message types must still be registered in a class deriving from KryoRegistrator; a sketch of the technique follows the concrete subclasses below.
- class KryoProtoSerializerV2 extends KryoProtoSerializer[GeneratedMessage]
- class KryoProtoSerializerV3 extends KryoProtoSerializer[GeneratedMessageV3]
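A hedged sketch of the technique named above, assuming Kryo's pre-5 Serializer signatures: the message is written as a length-prefixed protobuf byte array and read back through the generated static parseFrom method. The class name is illustrative, not the library's implementation:

  import com.esotericsoftware.kryo.{Kryo, Serializer}
  import com.esotericsoftware.kryo.io.{Input, Output}
  import com.google.protobuf.GeneratedMessageV3

  // Illustrative stand-in for KryoProtoSerializerV3: delegate to protobuf's
  // own wire format instead of Kryo's field-by-field serialization.
  class ProtoV3SerializerSketch extends Serializer[GeneratedMessageV3] {
    override def write(kryo: Kryo, output: Output, message: GeneratedMessageV3): Unit = {
      val bytes = message.toByteArray
      output.writeInt(bytes.length, true) // variable-length, positive-optimized prefix
      output.writeBytes(bytes)
    }

    override def read(kryo: Kryo, input: Input, clazz: Class[GeneratedMessageV3]): GeneratedMessageV3 = {
      val bytes = input.readBytes(input.readInt(true))
      // The concrete message type is only known at runtime, so look up its
      // generated static parseFrom(byte[]) reflectively.
      clazz.getMethod("parseFrom", classOf[Array[Byte]])
        .invoke(null, bytes)
        .asInstanceOf[GeneratedMessageV3]
    }
  }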
- class KryoRegistrator extends org.apache.spark.serializer.KryoRegistrator
The extended version of org.apache.spark.serializer.KryoRegistrator provided by the processing library that registers important internal classes.
- Note
The class must be declared inside a package, like a normal top-level class. Defining the class inside another class, by contrast, may not work and can throw errors like "Failed to register classes with Kryo" caused by a java.lang.NoSuchMethodException when Kryo tries to call the constructor.
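A short usage sketch, assuming the library's registrator exposes its registrations through Spark's standard registerClasses hook; MyProtoMessage is a hypothetical message class and the serializer's no-argument constructor is an assumption:

  import com.esotericsoftware.kryo.Kryo
  import org.apache.spark.SparkConf

  // Declared at package level, per the note above. Keeps the library's
  // internal registrations and adds a (hypothetical) protobuf message type.
  class MyKryoRegistrator extends spark.KryoRegistrator {
    override def registerClasses(kryo: Kryo): Unit = {
      super.registerClasses(kryo)
      kryo.register(classOf[MyProtoMessage], new KryoProtoSerializerV3)
    }
  }

  object KryoSetupSketch {
    val conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", classOf[MyKryoRegistrator].getName)
  }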
- class SparkSingleton[T <: AnyRef] extends Serializable
Class to build instances that are unique within one Spark executor. If a Spark singleton is serialized multiple times as part of different closures, instance will nevertheless return the same reference within one JVM. Instances can only be built from the Spark driver.
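A hedged usage sketch: the SparkSingleton(...) factory and SomeHeavyClient are assumptions made for illustration; only instance and the driver-only construction rule are documented above:

  import org.apache.spark.rdd.RDD

  // Hypothetical non-serializable resource that should exist at most once
  // per executor JVM.
  class SomeHeavyClient { def process(s: String): String = s.trim }

  def withSharedClient(rdd: RDD[String]): RDD[String] = {
    // Built on the driver; assumed: a lazily evaluated factory, so the
    // client itself never crosses the wire.
    val holder = SparkSingleton { new SomeHeavyClient }
    rdd.map { record =>
      // The holder is serialized as part of each closure, but `instance`
      // returns one shared reference within an executor JVM.
      holder.instance.process(record)
    }
  }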
- case class TaskInfo(stageId: Int, partitionId: Int, attemptNumber: Int, taskAttemptId: Long) extends Product with Serializable
Information about the current Spark task. This is a read-only version of Spark's TaskContext.
- stageId
The ID of the Spark stage the task is running in.
- partitionId
The Spark partition ID.
- attemptNumber
The attempt number of the task (0-based).
- taskAttemptId
The unique ID of the task attempt.
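A sketch of how such a read-only snapshot could be taken from Spark's live TaskContext; a helper like this is an assumption, while the TaskContext accessors are Spark's public API:

  import org.apache.spark.TaskContext

  // TaskContext.get() returns null on the driver, hence the Option wrapper.
  def currentTaskInfo(): Option[TaskInfo] =
    Option(TaskContext.get()).map { ctx =>
      TaskInfo(ctx.stageId(), ctx.partitionId(), ctx.attemptNumber(), ctx.taskAttemptId())
    }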
Value Members
- object Implicits
- object KryoRegistrator
- object TaskInfo extends Serializable
- object TaskInfoAwareLogContext
- object TaskInfoAwareThreadFactory