Package org.apache.spark.status
Class KVUtils
Object
org.apache.spark.status.KVUtils
Constructor Summary
Constructors
KVUtils()
Method Summary
static <T> int count(org.apache.spark.util.kvstore.KVStoreView<T> view, scala.Function1<T, Object> countFunc)
Counts the number of elements in the KVStoreView which satisfy a predicate.

static org.apache.spark.util.kvstore.KVStore createKVStore(scala.Option<File> storePath, boolean live, SparkConf conf)

static <T> void foreach(org.apache.spark.util.kvstore.KVStoreView<T> view, scala.Function1<T, scala.runtime.BoxedUnit> foreachFunc)
Applies a function f to all values produced by KVStoreView.

static <T,B> scala.collection.Seq<B> mapToSeq(org.apache.spark.util.kvstore.KVStoreView<T> view, scala.Function1<T, B> mapFunc)
Maps all values of KVStoreView to new values using a transformation function.

static <M> org.apache.spark.util.kvstore.KVStore open(File path, M metadata, SparkConf conf, boolean live, scala.reflect.ClassTag<M> evidence$1)
Open or create a disk-based KVStore.

static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()

static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)

static org.apache.spark.status.KVUtils.KVStoreScalaSerializer serializerForHistoryServer(SparkConf conf)

static <T> int size(org.apache.spark.util.kvstore.KVStoreView<T> view)

static <T> scala.collection.Seq<T> viewToSeq(org.apache.spark.util.kvstore.KVStoreView<T> view)
Turns a KVStoreView into a Scala sequence.

static <T> scala.collection.Seq<T> viewToSeq(org.apache.spark.util.kvstore.KVStoreView<T> view, int from, int until, scala.Function1<T, Object> filter)
Turns an interval of KVStoreView into a Scala sequence, applying a filter.

static <T> scala.collection.Seq<T> viewToSeq(org.apache.spark.util.kvstore.KVStoreView<T> view, int max, scala.Function1<T, Object> filter)
Turns a KVStoreView into a Scala sequence, applying a filter.
Constructor Details
KVUtils
public KVUtils() 
Method Details
open
public static <M> org.apache.spark.util.kvstore.KVStore open(File path, M metadata, SparkConf conf, boolean live, scala.reflect.ClassTag<M> evidence$1)
Open or create a disk-based KVStore.
Parameters:
path - Location of the store.
metadata - Metadata value to compare to the data in the store. If the store does not contain any metadata (e.g. it's a new store), this value is written as the store's metadata.
conf - SparkConf used to get HYBRID_STORE_DISK_BACKEND.
live - (undocumented)
evidence$1 - (undocumented)
Returns:
(undocumented)
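For illustration, a minimal Scala sketch of calling open. The path, the metadata case class, and the live flag below are assumptions made only for this sketch, and KVUtils is Spark-internal, so code like this only compiles inside Spark's own packages.

import java.io.File
import org.apache.spark.SparkConf
import org.apache.spark.status.KVUtils

// Hypothetical metadata class used only for this sketch; open() compares it
// to whatever metadata the store already holds.
case class StoreMetadataSketch(version: Long)

object OpenStoreSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    // The implicit ClassTag (evidence$1 in the Java signature) is supplied
    // automatically by the Scala compiler.
    val store = KVUtils.open(
      new File("/tmp/spark-status-store"),  // assumed location
      StoreMetadataSketch(1L),
      conf,
      live = true)
    try {
      // ... read and write objects through the returned
      // org.apache.spark.util.kvstore.KVStore ...
    } finally {
      store.close()
    }
  }
}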
 
serializerForHistoryServer
public static org.apache.spark.status.KVUtils.KVStoreScalaSerializer serializerForHistoryServer(SparkConf conf)
createKVStore
public static org.apache.spark.util.kvstore.KVStore createKVStore(scala.Option<File> storePath, boolean live, SparkConf conf)
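A hedged sketch of this overload, reusing the conf from the sketch above; treating a None store path as a request for an in-memory store is an assumption here, not something this page documents.

// Disk-backed store rooted at an assumed path (live flag chosen arbitrarily).
val diskStore = KVUtils.createKVStore(Some(new File("/tmp/kvstore")), true, conf)

// With no path, the store is assumed to live only in memory.
val memStore = KVUtils.createKVStore(None, true, conf)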
viewToSeq
public static <T> scala.collection.Seq<T> viewToSeq(org.apache.spark.util.kvstore.KVStoreView<T> view, int max, scala.Function1<T, Object> filter)
Turns a KVStoreView into a Scala sequence, applying a filter.
viewToSeq
public static <T> scala.collection.Seq<T> viewToSeq(org.apache.spark.util.kvstore.KVStoreView<T> view, int from, int until, scala.Function1<T, Object> filter)
Turns an interval of KVStoreView into a Scala sequence, applying a filter.
viewToSeq
public static <T> scala.collection.Seq<T> viewToSeq(org.apache.spark.util.kvstore.KVStoreView<T> view)
Turns a KVStoreView into a Scala sequence.
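A sketch of the three viewToSeq overloads; store is assumed to be an open KVStore as in the earlier sketch, and JobDataWrapper merely stands in for one of the element classes Spark keeps in its status store (the predicates are placeholders).

import org.apache.spark.status.JobDataWrapper

val view = store.view(classOf[JobDataWrapper])

// Every element in the view.
val all = KVUtils.viewToSeq(view)

// At most 100 elements that satisfy the predicate.
val first100 = KVUtils.viewToSeq(view, 100, (_: JobDataWrapper) => true)

// Elements at positions [10, 20) of the view that satisfy the predicate.
val slice = KVUtils.viewToSeq(view, 10, 20, (_: JobDataWrapper) => true)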
count
public static <T> int count(org.apache.spark.util.kvstore.KVStoreView<T> view, scala.Function1<T, Object> countFunc)
Counts the number of elements in the KVStoreView which satisfy a predicate.
foreach
public static <T> void foreach(org.apache.spark.util.kvstore.KVStoreView<T> view, scala.Function1<T, scala.runtime.BoxedUnit> foreachFunc)
Applies a function f to all values produced by KVStoreView.
mapToSeq
public static <T,B> scala.collection.Seq<B> mapToSeq(org.apache.spark.util.kvstore.KVStoreView<T> view, scala.Function1<T, B> mapFunc)
Maps all values of KVStoreView to new values using a transformation function.
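Continuing with the same assumed view from the viewToSeq sketch, a sketch of count, foreach and mapToSeq:

// Number of elements matching a placeholder predicate.
val matching = KVUtils.count(view, (_: JobDataWrapper) => true)

// Apply a side effect to every element.
KVUtils.foreach(view, (j: JobDataWrapper) => println(j))

// Map every element to a new value (here, its string form).
val rendered = KVUtils.mapToSeq(view, (j: JobDataWrapper) => j.toString)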
size
public static <T> int size(org.apache.spark.util.kvstore.KVStoreView<T> view)
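And, with the same assumed view, the total element count:

// Number of elements in the view, without any predicate.
val total = KVUtils.size(view)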
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)  