com.here.platform.data.processing.java.leveling

AdaptivePatternEstimator

class AdaptivePatternEstimator extends Wrapper[leveling.AdaptivePatternEstimator]

Computes an AdaptivePattern given an AdaptivePatternEstimateFn and a threshold.

Linear Supertypes

Wrapper[leveling.AdaptivePatternEstimator], AnyRef, Any

Instance Constructors

  1. new AdaptivePatternEstimator(estimateFn: AdaptivePatternEstimateFn, context: DriverContext)

    estimateFn

    The estimate function.

    context

    The com.here.platform.data.processing.java.driver.DriverContext object that the estimator is running in.
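
A minimal construction sketch in Java follows. The estimate function and DriverContext are assumed to be supplied by the surrounding compiler, the import path of AdaptivePatternEstimateFn is assumed to match this package, and the helper class itself is hypothetical; only the constructor call comes from the documentation above.

  import com.here.platform.data.processing.java.driver.DriverContext;
  import com.here.platform.data.processing.java.leveling.AdaptivePatternEstimateFn; // package path assumed
  import com.here.platform.data.processing.java.leveling.AdaptivePatternEstimator;

  // Hypothetical helper class, not part of the library; it only shows how the
  // documented constructor is called.
  final class EstimatorSetup {
    static AdaptivePatternEstimator newEstimator(AdaptivePatternEstimateFn estimateFn,
                                                 DriverContext context) {
      return new AdaptivePatternEstimator(estimateFn, context);
    }
  }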

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def adaptivePattern(threshold: Long): Broadcast[AdaptivePattern]

    Estimate an AdaptivePattern given a threshold and return it wrapped in an org.apache.spark.broadcast.Broadcast. The resulting pattern can be used to output tiles at a lower level in geographic areas where the content is sparse, or at a higher level in geographic areas where the content is dense. A usage sketch is provided after this member list.

    Changes to the estimated pattern *do prevent* the Driver from performing incremental compilation.

    threshold

    The maximum weight a leveling point may carry.

    returns

    The estimated adaptive leveling pattern, wrapped in an org.apache.spark.broadcast.Broadcast.

  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  7. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  8. def equals(o: Any): Boolean
    Definition Classes
    Wrapper → AnyRef → Any
  9. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  11. def hashCode(): Int
    Definition Classes
    Wrapper → AnyRef → Any
  12. val impl: leveling.AdaptivePatternEstimator
    Definition Classes
    AdaptivePatternEstimator → Wrapper
  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  16. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  17. def partitioner(threshold: Long, fallbackPartitioner: PartitionNamePartitioner): AdaptiveLevelingPartitioner

    Estimate an AdaptivePattern given a threshold and use it to construct a com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner. A usage sketch is provided after this member list.

    This partitioner can be used to balance the sizes of Spark partitions and obtain a more even, uniform distribution of content inside them, avoiding cases where some partitions are too heavy to process or many partitions are too light.

    Changes to the estimated pattern *do not prevent* the Driver from performing incremental compilation.

    threshold

    The maximum weight a leveling point may carry.

    fallbackPartitioner

    The partitioner used for non-aggregated keys.

    returns

    A com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner.

  18. def partitioner(threshold: Long): AdaptiveLevelingPartitioner

    Estimate an AdaptivePattern given a threshold and use it to construct a com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner.

    This partitioner can be used to balance the sizes of Spark partitions and obtain a more even, uniform distribution of content inside them, avoiding cases where some partitions are too heavy to process or many partitions are too light.

    Changes to the estimated pattern *do not prevent* the Driver from performing incremental compilation.

    threshold

    The maximum weight a leveling point may carry.

    returns

    A com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner.

    Note

    With this factory, non-aggregated keys are distributed over the existing Spark partitions of the pattern.

  19. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  20. def toString(): String
    Definition Classes
    Wrapper → AnyRef → Any
  21. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
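
A usage sketch for adaptivePattern and partitioner follows, assuming an already constructed estimator, an existing PartitionNamePartitioner to use as fallback, and an arbitrary threshold value. The package paths of AdaptivePattern and PartitionNamePartitioner are assumptions, as is the helper class itself; only the documented method calls are taken from this page.

  import org.apache.spark.broadcast.Broadcast;
  import com.here.platform.data.processing.java.leveling.AdaptivePattern;                   // package path assumed
  import com.here.platform.data.processing.java.leveling.AdaptivePatternEstimator;
  import com.here.platform.data.processing.java.spark.partitioner.AdaptiveLevelingPartitioner;
  import com.here.platform.data.processing.java.spark.partitioner.PartitionNamePartitioner; // package path assumed

  // Hypothetical helper class, not part of the library.
  final class LevelingExample {
    static void level(AdaptivePatternEstimator estimator,
                      PartitionNamePartitioner fallbackPartitioner) {
      long threshold = 100_000L; // maximum weight a leveling point may carry; value is arbitrary

      // Estimate the pattern once; the broadcast can be reused across Spark tasks.
      Broadcast<AdaptivePattern> pattern = estimator.adaptivePattern(threshold);

      // Derive a partitioner that balances Spark partition sizes; non-aggregated
      // keys are handled by the given fallback partitioner.
      AdaptiveLevelingPartitioner withFallback = estimator.partitioner(threshold, fallbackPartitioner);

      // Without a fallback, non-aggregated keys are distributed over the existing
      // Spark partitions of the pattern.
      AdaptiveLevelingPartitioner withDefaultFallback = estimator.partitioner(threshold);
    }
  }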
