implicit final class LazyJoinWrapper[K, V] extends AnyVal
Adds lazy joins to key/value-based RDDs.
- K: The type of the key.
- V: The type of the value.
Instance Constructors
- new LazyJoinWrapper(rdd: RDD[(K, V)])
  - rdd: The key/value-based RDD. The transformations will be lazy if the rdd is not repartitioned (a usage sketch follows below).
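The snippet below is a hedged usage sketch, not taken from this page: the package that publishes the implicit class and the sample data are assumptions, and the import is shown only as a placeholder.

```scala
// Minimal usage sketch. The import that brings LazyJoinWrapper into implicit
// scope is assumed (its package is not stated on this page), as is the data.
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
// import some.pkg.LazyJoinWrapper  // hypothetical: wherever the implicit is published

def example(sc: SparkContext): RDD[(Int, (String, Option[Double]))] = {
  val left: RDD[(Int, String)]  = sc.parallelize(Seq(1 -> "a", 2 -> "b"))
  val right: RDD[(Int, Double)] = sc.parallelize(Seq(2 -> 2.0, 3 -> 3.0))

  // With the implicit class in scope, key/value RDDs gain the lazy* join methods.
  left.lazyLeftOuterJoin(right)
}
```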
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: Any
- final def ##(): Int
  - Definition Classes: Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def getClass(): Class[_ <: AnyVal]
  - Definition Classes: AnyVal → Any
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- def lazyFullOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner = defaultPartitioner)(implicit arg0: ClassTag[W], kt: ClassTag[K], vt: ClassTag[V]): RDD[(K, (Option[V], Option[W]))]
  Like org.apache.spark.rdd.PairRDDFunctions.fullOuterJoin, but the elements in the Spark partitions of the left operand are evaluated lazily if the RDD uses the same partitioner as the result.
  - W: The value type of the right operand.
  - other: The right operand. It will be fully evaluated.
  - partitioner: The partitioner used for the resulting RDD. If it is the same as the left operand's partitioner, the operation will not cause the left RDD to be fully evaluated. Defaults to the left operand's partitioner, if defined, or to a HashPartitioner otherwise.
  - returns: An RDD with the result of the full outer join. A call sketch follows below.
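A hedged sketch of calling lazyFullOuterJoin so that the left side stays lazy: the left RDD is partitioned once and the same partitioner is passed to the join, which is the condition the description above requires. The implicit wrapper is assumed to already be in scope; the types and partition count are illustrative.

```scala
import org.apache.spark.HashPartitioner
import org.apache.spark.rdd.RDD

def fullOuter(left: RDD[(Long, String)],
              right: RDD[(Long, Int)]): RDD[(Long, (Option[String], Option[Int]))] = {
  // Partition the left RDD once, then hand the same partitioner to the join;
  // per the contract above, the left partitions are then evaluated lazily.
  val partitioner     = new HashPartitioner(8)
  val partitionedLeft = left.partitionBy(partitioner)
  partitionedLeft.lazyFullOuterJoin(right, partitioner)
}
```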
- def lazyLeftOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner = defaultPartitioner)(implicit arg0: ClassTag[W], kt: ClassTag[K], vt: ClassTag[V]): RDD[(K, (V, Option[W]))]
  Like org.apache.spark.rdd.PairRDDFunctions.leftOuterJoin, but the elements in the Spark partitions of the left operand are evaluated lazily if the RDD uses the same partitioner as the result.
  - W: The value type of the right operand.
  - other: The right operand. It will be fully evaluated.
  - partitioner: The partitioner used for the resulting RDD. If it is the same as the left operand's partitioner, the operation will not cause the left RDD to be fully evaluated. Defaults to the left operand's partitioner, if defined, or to a HashPartitioner otherwise.
  - returns: An RDD with the result of the left outer join. A call sketch follows below.
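A shorter sketch for lazyLeftOuterJoin that relies on the default partitioner; since the default resolves to the left operand's partitioner when one is defined, an already-partitioned left RDD should keep its laziness. The names and element types are illustrative assumptions, and the implicit wrapper is assumed to be in scope.

```scala
import org.apache.spark.rdd.RDD

def leftOuter(users: RDD[(String, String)],
              scores: RDD[(String, Double)]): RDD[(String, (String, Option[Double]))] =
  // Every left value is kept; a missing right value shows up as None.
  users.lazyLeftOuterJoin(scores)
```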
- def lazyRightOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner = defaultPartitioner)(implicit arg0: ClassTag[W], kt: ClassTag[K], vt: ClassTag[V]): RDD[(K, (Option[V], W))]
  Like org.apache.spark.rdd.PairRDDFunctions.rightOuterJoin, but the elements in the Spark partitions of the left operand are evaluated lazily if the RDD uses the same partitioner as the result.
  - W: The value type of the right operand.
  - other: The right operand. It will be fully evaluated.
  - partitioner: The partitioner used for the resulting RDD. If it is the same as the left operand's partitioner, the operation will not cause the left RDD to be fully evaluated. Defaults to the left operand's partitioner, if defined, or to a HashPartitioner otherwise.
  - returns: An RDD with the result of the right outer join. A call sketch follows below.
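For completeness, a sketch of lazyRightOuterJoin highlighting the mirrored result type: the left value becomes an Option while the right value is always present. Again, the implicit wrapper is assumed to be in scope and the types are illustrative.

```scala
import org.apache.spark.rdd.RDD

def rightOuter(left: RDD[(Int, String)],
               right: RDD[(Int, Long)]): RDD[(Int, (Option[String], Long))] =
  // Missing left values appear as None; every right value is preserved.
  left.lazyRightOuterJoin(right)
```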
- def toString(): String
  - Definition Classes: Any