class JavaStatePartitionSerializer[K, V] extends StatePartitionSerializer[K, V]
Serializes state partitions using plain Java serialization: a partition's key/value pairs are written to and restored from a byte array.
- K: the key type
- V: the value type
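For illustration, a minimal usage sketch (hypothetical: String and Long stand in for K and V, and the import of the enclosing package is omitted):

  // Hypothetical usage; String/Long are example type parameters.
  val serializer = new JavaStatePartitionSerializer[String, Long]()

  // Round-trip a small state partition through Java serialization.
  val bytes: Array[Byte] = serializer.serialize(Iterator("a" -> 1L, "b" -> 2L))
  val restored: Iterator[(String, Long)] = serializer.deserialize(bytes)

  restored.foreach { case (k, v) => println(s"$k -> $v") }

Since the class relies on Java serialization, key and value types generally need to be java.io.Serializable.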
Linear Supertypes
StatePartitionSerializer[K, V], Serializable, Serializable, AnyRef, Any
Instance Constructors
- new JavaStatePartitionSerializer()(implicit arg0: ClassTag[K], arg1: ClassTag[V])
Value Members
- final def !=(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def ##(): Int
  Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  Definition Classes: Any
- def clone(): AnyRef
  Attributes: protected[lang]
  Definition Classes: AnyRef
  Annotations: @throws( ... ) @native()
- def deserialize(bytes: Array[Byte]): Iterator[(K, V)]
  Restores a state partition's key/value pairs from a byte array produced by serialize (see the implementation sketch after this member list).
  Definition Classes: JavaStatePartitionSerializer → StatePartitionSerializer
- final def eq(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- def finalize(): Unit
  Attributes: protected[lang]
  Definition Classes: AnyRef
  Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  Definition Classes: AnyRef → Any
  Annotations: @native()
- def hashCode(): Int
  Definition Classes: AnyRef → Any
  Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- final def notify(): Unit
  Definition Classes: AnyRef
  Annotations: @native()
- final def notifyAll(): Unit
  Definition Classes: AnyRef
  Annotations: @native()
- def serialize(values: Iterator[(K, V)]): Array[Byte]
  Writes a state partition's key/value pairs to a byte array using Java serialization (see the implementation sketch after this member list).
  Definition Classes: JavaStatePartitionSerializer → StatePartitionSerializer
- lazy val sparkClassLoader: ClassLoader
  Spark uses a custom class loader so that every class required for correct execution can be loaded on the executors. On each executor this class loader is taken from the Spark environment; if it is not available, the thread's default (context) class loader is used instead (see the sketch after this member list).
  Annotations: @transient()
- final def synchronized[T0](arg0: ⇒ T0): T0
  Definition Classes: AnyRef
- def toString(): String
  Definition Classes: AnyRef → Any
- final def wait(): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  Definition Classes: AnyRef
  Annotations: @throws( ... ) @native()
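The following is a minimal, hypothetical sketch of how serialize, deserialize, and sparkClassLoader could fit together using plain Java serialization; it is not the actual implementation. In particular, the SparkEnv-based class-loader lookup and the use of a List as the intermediate representation are assumptions made here for illustration.

  import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream, ObjectStreamClass}
  import scala.reflect.ClassTag
  import org.apache.spark.SparkEnv

  // Hypothetical sketch, not the actual implementation.
  class JavaStatePartitionSerializerSketch[K: ClassTag, V: ClassTag] {

    // Prefer a class loader obtained from the executor's Spark environment;
    // fall back to the current thread's context class loader otherwise.
    @transient lazy val sparkClassLoader: ClassLoader =
      Option(SparkEnv.get)
        .map(_.getClass.getClassLoader)
        .getOrElse(Thread.currentThread().getContextClassLoader)

    def serialize(values: Iterator[(K, V)]): Array[Byte] = {
      val buffer = new ByteArrayOutputStream()
      val out = new ObjectOutputStream(buffer)
      try out.writeObject(values.toList) finally out.close()
      buffer.toByteArray
    }

    def deserialize(bytes: Array[Byte]): Iterator[(K, V)] = {
      // Resolve classes against sparkClassLoader so that classes shipped to
      // the executor can be found during Java deserialization.
      val in = new ObjectInputStream(new ByteArrayInputStream(bytes)) {
        override def resolveClass(desc: ObjectStreamClass): Class[_] =
          Class.forName(desc.getName, false, sparkClassLoader)
      }
      try in.readObject().asInstanceOf[List[(K, V)]].iterator finally in.close()
    }
  }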