class AdaptivePatternEstimator extends Wrapper[leveling.AdaptivePatternEstimator]
Computes an AdaptivePattern given an AdaptivePatternEstimateFn and a threshold.
Instance Constructors
- new AdaptivePatternEstimator(estimateFn: AdaptivePatternEstimateFn, context: DriverContext)
- estimateFn
The estimate function.
- context
The com.here.platform.data.processing.java.driver.DriverContext object that the estimator is running in.
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def adaptivePattern(threshold: Long): Broadcast[AdaptivePattern]
Estimates an AdaptivePattern given a threshold and returns it wrapped in a org.apache.spark.broadcast.Broadcast. The resulting pattern can be used to output tiles at a lower level in geographic areas where the content is sparse, or at a higher level in geographic areas where the content is dense.
Changes to the estimated pattern *do prevent* the Driver from performing incremental compilation.
- threshold
The maximum weight a leveling point may carry.
- returns
The estimated adaptive leveling pattern, wrapped in a org.apache.spark.broadcast.Broadcast.
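To illustrate how the threshold drives the choice of output level, here is a minimal self-contained sketch that does not use the HERE SDK; the 4x-per-level weight growth and all names are simplifying assumptions, not the library's actual estimation algorithm:

```java
// Toy sketch (not the HERE SDK): a weight threshold drives the choice of
// output tile level. Names and the weight model are hypothetical.
public class AdaptiveLevelSketch {
    /** Pick the coarsest level whose aggregated weight stays within the threshold. */
    static int chooseLevel(long weightAtFinestLevel, int finestLevel, long threshold) {
        // Moving one level up (coarser) merges 4 child tiles, so in this
        // toy model the aggregated weight grows roughly 4x per level.
        int level = finestLevel;
        long weight = weightAtFinestLevel;
        while (level > 0 && weight * 4 <= threshold) {
            weight *= 4;
            level--; // coarser tiles carry more aggregated content
        }
        return level;
    }

    public static void main(String[] args) {
        // Sparse area: little content, so a much coarser level stays under the threshold.
        System.out.println(chooseLevel(1, 12, 1000));   // → 8
        // Dense area: the finest level is already near the threshold.
        System.out.println(chooseLevel(900, 12, 1000)); // → 12
    }
}
```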
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(o: Any): Boolean
- Definition Classes
- Wrapper → AnyRef → Any
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- def hashCode(): Int
- Definition Classes
- Wrapper → AnyRef → Any
- val impl: leveling.AdaptivePatternEstimator
- Definition Classes
- AdaptivePatternEstimator → Wrapper
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- def partitioner(threshold: Long, fallbackPartitioner: PartitionNamePartitioner): AdaptiveLevelingPartitioner
Estimates an AdaptivePattern given a threshold and uses it to construct a com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner.
This partitioner can be used to balance the sizes of Spark partitions and obtain a more even, uniform distribution of content inside them, avoiding cases where some partitions are too heavy to process or too many partitions are too light.
Changes to the estimated pattern *do not prevent* the Driver from performing incremental compilation.
- threshold
The maximum weight a leveling point may carry.
- fallbackPartitioner
The partitioner used for non-aggregated keys.
- returns
A com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner.
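The kind of load balancing the partitioner aims for can be sketched with a self-contained toy that does not use the HERE SDK or Spark; the greedy heaviest-first strategy here is an illustrative assumption, not the library's actual placement algorithm:

```java
import java.util.Arrays;

// Toy sketch (not the HERE SDK): spread weighted tiles over a fixed number
// of partitions so no single partition is much heavier than the rest.
public class BalanceSketch {
    /** Assign each weight to the currently lightest partition, heaviest first. */
    static long[] partitionLoads(long[] weights, int numPartitions) {
        long[] loads = new long[numPartitions];
        long[] sorted = weights.clone();
        Arrays.sort(sorted);
        for (int i = sorted.length - 1; i >= 0; i--) { // heaviest first
            int lightest = 0;
            for (int p = 1; p < numPartitions; p++) {
                if (loads[p] < loads[lightest]) lightest = p;
            }
            loads[lightest] += sorted[i];
        }
        return loads;
    }

    public static void main(String[] args) {
        long[] loads = partitionLoads(new long[] {9, 7, 6, 5, 4, 3}, 3);
        System.out.println(Arrays.toString(loads)); // roughly even loads
    }
}
```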
- def partitioner(threshold: Long): AdaptiveLevelingPartitioner
Estimates an AdaptivePattern given a threshold and uses it to construct a com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner.
This partitioner can be used to balance the sizes of Spark partitions and obtain a more even, uniform distribution of content inside them, avoiding cases where some partitions are too heavy to process or too many partitions are too light.
Changes to the estimated pattern *do not prevent* the Driver from performing incremental compilation.
- threshold
The maximum weight a leveling point may carry.
- returns
A com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner.
- Note
With this factory, non-aggregated keys are distributed over the existing Spark partitions of the pattern.
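One simple way non-aggregated keys could be spread over a fixed set of existing partitions is by hashing the key; this is a hypothetical sketch of the idea, not the library's actual distribution scheme:

```java
// Toy sketch (not the HERE SDK): hash-based placement of keys that fall
// outside the estimated pattern, over a fixed number of partitions.
public class FallbackSketch {
    static int partitionFor(String key, int numPartitions) {
        // floorMod keeps the result non-negative even for negative hash codes
        return Math.floorMod(key.hashCode(), numPartitions);
    }

    public static void main(String[] args) {
        System.out.println(partitionFor("tile-123", 8)); // always in [0, 8)
    }
}
```

The placement is deterministic: the same key always lands in the same partition, which is what a partitioner contract requires.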
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- Wrapper → AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated
- Deprecated
(Since version 9)