com.here.platform.data.processing.leveling
AdaptivePatternEstimator
Companion object AdaptivePatternEstimator
class AdaptivePatternEstimator extends AnyRef
Computes an AdaptivePattern given an AdaptivePatternEstimateFn and a threshold.
Instance Constructors

- new AdaptivePatternEstimator(estimateFn: AdaptivePatternEstimateFn, context: DriverContext)
  - estimateFn: The estimate function.
  - context: The com.here.platform.data.processing.driver.DriverContext object that the estimator is running in.
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def adaptivePattern(threshold: Long): Broadcast[AdaptivePattern]

  Estimates an AdaptivePattern given a threshold and returns it wrapped in an org.apache.spark.broadcast.Broadcast. The resulting pattern can be used to output tiles at a lower level in geographic areas where the content is sparse, or at a higher level in geographic areas where the content is dense.

  Changes to the estimated pattern *do prevent* the Driver from performing incremental compilation.

  - threshold: The maximum weight a leveling point may carry.
  - returns: The estimated adaptive leveling pattern, wrapped in an org.apache.spark.broadcast.Broadcast.
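To illustrate the idea behind the estimated pattern, here is a self-contained sketch, not the library's implementation: starting from a coarse tile, a tile stays at its current level while its weight is at or below the threshold, and is split into four children one level deeper where content is dense. The `Tile` type, the `weight` function, and all constants below are illustrative assumptions.

```scala
// Sketch of adaptive leveling: keep sparse tiles coarse, split dense
// tiles to deeper levels until each leaf's weight is <= threshold.
case class Tile(level: Int, x: Int, y: Int)

def adaptiveLevels(tile: Tile,
                   weight: Tile => Long,
                   threshold: Long,
                   maxLevel: Int): Seq[Tile] =
  if (weight(tile) <= threshold || tile.level >= maxLevel) Seq(tile)
  else
    for {
      dx <- 0 to 1
      dy <- 0 to 1
      child = Tile(tile.level + 1, tile.x * 2 + dx, tile.y * 2 + dy)
      leaf <- adaptiveLevels(child, weight, threshold, maxLevel)
    } yield leaf

// Toy weight model: total weight spreads evenly over the tiles of a
// level, with a 4x hot spot in the tile at the origin.
val weight: Tile => Long = t => {
  val span = 1 << t.level // tiles per axis at this level
  1024L / (span.toLong * span) * (if (t.x == 0 && t.y == 0) 4 else 1)
}

val tiles = adaptiveLevels(Tile(0, 0, 0), weight, threshold = 64L, maxLevel = 4)
// Sparse tiles stop at level 2; the dense origin tile descends to level 3.
```

The broadcast returned by `adaptivePattern` plays the role of `tiles` here: a precomputed level assignment that every Spark executor can consult cheaply.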
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- def partitioner(threshold: Long, fallbackPartitioner: Option[PartitionNamePartitioner] = None): AdaptiveLevelingPartitioner

  Estimates an AdaptivePattern given a threshold and uses it to construct a com.here.platform.data.processing.spark.partitioner.AdaptiveLevelingPartitioner.

  This partitioner can be used to balance the sizes of Spark partitions to obtain a more even, uniform distribution of content across them, avoiding cases where some partitions are too heavy to process or there are too many light partitions.

  Changes to the estimated pattern *do not prevent* the Driver from performing incremental compilation.

  - threshold: The maximum weight a leveling point may carry.
  - fallbackPartitioner: The optional partitioner used for non-aggregated keys. If undefined, non-aggregated keys are uniformly distributed over the existing partitions.
  - returns: A com.here.platform.data.processing.spark.partitioner.AdaptiveLevelingPartitioner.
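The balancing goal the partitioner pursues can be illustrated with a self-contained sketch, not the library's implementation: given per-key weights, assign each key to the currently lightest partition so that partition loads stay even. All names and numbers below are illustrative assumptions.

```scala
// Sketch of weight-balanced partitioning: heaviest keys first, each key
// goes to the partition with the smallest accumulated load.
def balancedAssignment(weights: Map[String, Long],
                       numPartitions: Int): Map[String, Int] = {
  val loads = Array.fill(numPartitions)(0L)
  weights.toSeq
    .sortBy(-_._2) // placing heavy keys first keeps the greedy result even
    .map { case (key, w) =>
      val p = loads.indexOf(loads.min)
      loads(p) += w
      key -> p
    }
    .toMap
}

val w = Map("heavy" -> 100L, "mid" -> 60L, "light1" -> 30L, "light2" -> 30L)
val assignment = balancedAssignment(w, numPartitions = 2)
// "heavy" gets its own partition; the three lighter keys share the other.
```

In the real partitioner the per-key weights come from the estimated AdaptivePattern, and keys that are not covered by the pattern fall back to `fallbackPartitioner` when one is supplied.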
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()