
case class AdapterPartitioner[K](p: org.apache.spark.Partitioner)(implicit evidence$1: ClassTag[K]) extends Partitioner[K] with Product with Serializable

Service class that adapts an existing org.apache.spark.Partitioner, which by definition can return the partition identifier for objects of scala.Any type, to the more restrictive interface of Partitioner of K, which works only with keys of type K.

K

the type of the keys to be partitioned
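
A minimal usage sketch, assuming the enclosing package of AdapterPartitioner and Partitioner is already imported and Spark is on the classpath; the HashPartitioner delegate and the String key type are only illustrative choices:

    import org.apache.spark.HashPartitioner

    // Wrap a stock Spark partitioner so that downstream code only sees the
    // type-safe Partitioner[String] interface.
    val delegate = new HashPartitioner(4)
    val typed: Partitioner[String] = AdapterPartitioner[String](delegate)

    // Type-safe call: keys that are not Strings are rejected at compile time.
    val idx: Int = typed.getPartitionForKey("some-key") // a value in 0 until 4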

Linear Supertypes
Product, Equals, Partitioner[K], org.apache.spark.Partitioner, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new AdapterPartitioner(p: org.apache.spark.Partitioner)(implicit arg0: ClassTag[K])

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  8. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  9. def getPartition(key: Any): Int

    Implements the Spark org.apache.spark.Partitioner interface by forwarding the calls to getPartitionForKey.

    If the object passed is not of type K or can't be converted to it (e.g. java.lang.Integer to Int), an IllegalArgumentException is thrown. This should be considered a bug and should not happen, because the processing library uses Partitioner of type K only for RDDs that it knows for certain have keys of type K.

    Essentially, this function is a no-op call that forwards to getPartitionForKey; the important point is to have a type-safe Partitioner in the processing library. A hedged sketch of this forwarding appears after the member list below.

    key

    the key for which the partition must be calculated

    returns

    the partition, identified by one scala.Int, in which the key should be located

    Definition Classes
    Partitioner → Partitioner
    Note

    This is called by Spark and should not be called by developer's code, as it may be unsafe.

  10. def getPartitionForKey(key: K): Int

    Gets the partition for a given key of type K. This is the function that must be implemented by children partitioners. A hypothetical subclass sketch appears after the member list below.

    key

    the key for which the partition must be calculated

    returns

    the partition, identified by one scala.Int, in which the key should be located

    Definition Classes
    AdapterPartitioner → Partitioner
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  15. def numPartitions: Int
    Definition Classes
    AdapterPartitioner → Partitioner
  16. val p: org.apache.spark.Partitioner
  17. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  18. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  20. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
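
The forwarding performed by getPartition (item 9 above) could look roughly like the following. This is a hedged sketch, not the library's actual source: TypedPartitioner is a hypothetical stand-in name, and it only illustrates how a ClassTag-backed runtime check can delegate to the type-safe method.

    import scala.reflect.ClassTag

    // Illustrative stand-in for the library's Partitioner[K] base:
    // the same forwarding idea that the documentation above describes.
    abstract class TypedPartitioner[K](implicit kTag: ClassTag[K])
        extends org.apache.spark.Partitioner {

      // The type-safe method that children partitioners implement.
      def getPartitionForKey(key: K): Int

      // Entry point that Spark calls with an untyped key.
      override def getPartition(key: Any): Int = key match {
        case k: K => getPartitionForKey(k) // ClassTag-backed runtime check
        case other =>
          throw new IllegalArgumentException(
            s"Key $other is not of type ${kTag.runtimeClass.getName}")
      }
    }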

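Children partitioners implement getPartitionForKey (item 10 above). To keep the example self-contained, the following hypothetical ModuloPartitioner extends the TypedPartitioner sketch just shown rather than the library's Partitioner of K; a real implementation would extend the library class analogously.

    // Hypothetical child partitioner: partitions Int keys with a modulo scheme.
    class ModuloPartitioner(override val numPartitions: Int)
        extends TypedPartitioner[Int] {

      override def getPartitionForKey(key: Int): Int =
        math.abs(key % numPartitions)
    }

    // Used through the type-safe method:
    new ModuloPartitioner(8).getPartitionForKey(42) // == 2
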