OpenJDK / lambda / lambda / jdk
changeset 7691:90979767b8b6
More Collector specs and tweaks
author | briangoetz |
---|---|
date | Tue, 19 Mar 2013 16:08:56 -0400 |
parents | 04a196355eb0 |
children | 5186c91db849 |
files | src/share/classes/java/util/stream/Collector.java src/share/classes/java/util/stream/Collectors.java src/share/classes/java/util/stream/IntStatistics.java |
diffstat | 3 files changed, 296 insertions(+), 88 deletions(-) |
--- a/src/share/classes/java/util/stream/Collector.java Tue Mar 19 10:18:09 2013 +0100 +++ b/src/share/classes/java/util/stream/Collector.java Tue Mar 19 16:08:56 2013 -0400 @@ -26,25 +26,60 @@ import java.util.function.BiFunction; import java.util.function.BinaryOperator; -import java.util.function.ObjDoubleConsumer; -import java.util.function.ObjIntConsumer; -import java.util.function.ObjLongConsumer; import java.util.function.Supplier; /** - * A reduction operation that supports folding input elements into a mutable result container. Examples - * of such operations include accumulating input elements into a {@code Collection}; concatenating strings into + * A reduction operation that supports folding input elements into a cumulative result. The result may be a value + * or may be a mutable result container. Examples of operations accumulating results into a mutable result + * container include: accumulating input elements into a {@code Collection}; concatenating strings into * a {@code StringBuilder}; computing summary information about elements such as sum, min, max, or average; * computing "pivot table" summaries such as "maximum valued transaction by seller", etc. Reduction operations * can be performed either sequentially or in parallel. * - * A {@code Collector} has three functions that describe: how to create a new result container, how to - * incorporate a new data element into a result container, and how to combine two result containers into one. - * The last function -- combining two containers into one -- is used during parallel operations, where we - * collect subsets of the input in parallel, and the merge the subresults into a combined result. Additionally, + * <p>A {@code Collector} has three functions that perform: creation of an initial result, + * incorporation of a new data element into a result, and combination of two results into one. + * The last function -- combining two results into one -- is used during parallel operations, where we + * collect subsets of the input in parallel, and then merge the subresults into a combined result. + * The result may be a mutable container or a value. If the result is mutable, the accumulation and combination functions + * may either mutate their left argument and return that (such as adding elements to a collection), + * or return a new result (in which case they should not mutate anything).
+ * + * <p>Libraries that implement reduction based on {@code Collector}, such as {@link Stream#collect(Collector)}, + * must adhere to the following constraints: + * <ul> + * <li>The first argument passed to the accumulator function, and both arguments passed to the combiner + * function, must be the result of a previous invocation of {@link #resultSupplier()}, {@link #accumulator()}, or + * {@link #combiner()}.</li> + * <li>The implementation should not do anything with the result of any of the result supplier, accumulator, + * or combiner functions other than to pass them again to the accumulator or combiner functions, + * or return them to the caller of the reduction operation.</li> + * <li>If a result is passed to the accumulator or combiner function, and the same object is not returned + * from that function, it is never used again.</li> + * <li>Once a result is passed to the combiner function, it is never passed to the accumulator function again.</li> + * <li>For non-concurrent collectors, any result returned from the result supplier, accumulator, or combiner + * functions must be serially thread-confined.</li> + * </ul> + * + * <p>Additionally, * collectors have two properties, {@code isConcurrent} and {@code isStable}. A concurrent collector is one * for which it is safe to invoke the accumulator function on the same result container concurrently; a stable - * collector is one where the accumulator function always returns the result container that was passed in. + * collector is one where the accumulator function always returns the result container that was passed to it. + * Concurrent collectors are always stable. + * + * @apiNote + * <p>Performing a reduction operation with a {@code Collector} should produce a result equivalent to: + * <pre> + * BiFunction<R,T,R> accumulator = collector.accumulator(); + * R result = collector.resultSupplier().get(); + * for (T t : inputSource) + * result = accumulator.apply(result, t); + * return result; + * </pre> + * + * However, the library is free to partition the input, perform the reduction on the partitions, and then use the + * combiner function to combine the partial results to achieve a parallel reduction. Depending on the specific + * reduction operation and the relative cost of the accumulator and combiner functions, this may perform + * better or worse than a sequential reduction. * * <p>An example of an operation that can be easily modeled by {@code Collector} is accumulating elements into a * {@code TreeSet}. In this case, the {@code resultSupplier()} function is {@code new TreeSet<T>()}, the @@ -52,11 +87,8 @@ * function is {@code (left, right) -> { left.addAll(right); return left; }}. (This behavior is implemented by * the method {@code Collectors.toCollection(TreeSet::new)}). * - * @@@ Document behavior of collect() - * -- will only pass to combiner / accumulator what was returned from supplier - * -- will not modify container after combining - * -- for non-concurrent collectors, result containers are kept isolated - * -- interaction with isStable + * <p>The {@code Collector} + * * @@@ Document concurrent behavior and interaction with ordering * * @see Stream#collect(Collector) @@ -69,17 +101,17 @@ */ public interface Collector<T, R> { /** - * A function that creates and return a new, empty result container. - * @@@ Dual-moded -- either always return new empty non-null, or can return an identity, including null. - * @@@ Interaction with stability + * A function that creates and returns a new result that represents "no values".
If the accumulator or combiner + * functions may mutate their arguments, this must be a new, empty result container. * - * @return A function which, when invoked, returns a new, empty result container on each invocation + * @return A function which, when invoked, returns a result representing "no values" */ Supplier<R> resultSupplier(); /** * @@@ needs update @@@ * @@@ Interaction with stability + * @@@ associativity @@@ * A function that accepts a result container and a value and incorporates the value into the container. * @return A function which, when invoked with a result container and a value, modifies the state of the result * container to reflect incorporation of the new value
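
Note: the following is a usage sketch, not part of the changeset. It drives the @apiNote's sequential equivalence by hand against the draft Collector shape introduced above (resultSupplier()/accumulator()/combiner()); the collect() helper and the sample input are hypothetical, while Collectors.toCollection(TreeSet::new) is the collector the class javadoc itself cites.

```java
import java.util.Arrays;
import java.util.TreeSet;
import java.util.function.BiFunction;
import java.util.stream.Collector;
import java.util.stream.Collectors;

class CollectSequentially {
    // Hypothetical driver: performs a reduction exactly as the @apiNote above describes,
    // using only the three functions the draft Collector interface exposes.
    static <T, R> R collect(Iterable<T> input, Collector<T, R> collector) {
        BiFunction<R, T, R> accumulator = collector.accumulator();
        R result = collector.resultSupplier().get();   // the "no values" result
        for (T t : input)
            result = accumulator.apply(result, t);     // fold each element in
        return result;
    }

    public static void main(String[] args) {
        // The TreeSet example from the class javadoc, driven by the loop above.
        TreeSet<String> names = collect(Arrays.asList("b", "a", "c"),
                                        Collectors.toCollection(TreeSet::new));
        System.out.println(names);                     // prints [a, b, c]
    }
}
```

A parallel implementation may instead split the input, run this loop on each chunk, and merge the partial results with combiner(), subject to the constraints listed above.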
--- a/src/share/classes/java/util/stream/Collectors.java Tue Mar 19 10:18:09 2013 +0100 +++ b/src/share/classes/java/util/stream/Collectors.java Tue Mar 19 16:08:56 2013 -0400 @@ -43,7 +43,6 @@ import java.util.function.BiFunction; import java.util.function.BinaryOperator; import java.util.function.Function; -import java.util.function.IntBinaryOperator; import java.util.function.Predicate; import java.util.function.Supplier; import java.util.function.ToDoubleFunction; @@ -61,7 +60,7 @@ private Collectors() {} /** - * A merge function, suitable for use in {@link Map#merge(Object, Object, BiFunction)} or + * Return a merge function, suitable for use in {@link Map#merge(Object, Object, BiFunction)} or * {@link #toMap(Function, Supplier, BinaryOperator)}, which always throws {@code IllegalStateException}. * This can be used to enforce the assumption that the elements being collected are distinct. * @@ -72,23 +71,31 @@ return (u,v) -> { throw new IllegalStateException(String.format("Duplicate key %s", u)); }; } - static abstract class AbstractCollectorImpl<T, R> implements Collector<T, R> { + static final class CollectorImpl<T, R> implements Collector<T,R> { private final Supplier<R> resultSupplier; + private final BiFunction<R, T, R> accumulator; private final BinaryOperator<R> combiner; private final boolean isConcurrent; private final boolean isStable; - AbstractCollectorImpl(Supplier<R> resultSupplier, - BinaryOperator<R> combiner, - boolean isConcurrent, - boolean isStable) { + CollectorImpl(Supplier<R> resultSupplier, + BiFunction<R, T, R> accumulator, + BinaryOperator<R> combiner, + boolean isConcurrent, + boolean isStable) { this.resultSupplier = resultSupplier; + this.accumulator = accumulator; this.combiner = combiner; this.isConcurrent = isConcurrent; this.isStable = isStable; } @Override + public BiFunction<R, T, R> accumulator() { + return accumulator; + } + + @Override public Supplier<R> resultSupplier() { return resultSupplier; } @@ -109,29 +116,12 @@ } } - static final class CollectorImpl<T, R> extends AbstractCollectorImpl<T, R> implements Collector<T, R> { - private final BiFunction<R, T, R> accumulator; - - CollectorImpl(Supplier<R> resultSupplier, - BiFunction<R, T, R> accumulator, - BinaryOperator<R> combiner, - boolean isConcurrent, - boolean isStable) { - super(resultSupplier, combiner, isConcurrent, isStable); - this.accumulator = accumulator; - } - - @Override - public BiFunction<R, T, R> accumulator() { - return accumulator; - } - } - /** - * Accumulate elements into a new {@code Collection}, which is created by the provided factory. + * Return a {@code Collector} that accumulates the input elements into a new {@code Collection}, + * which is created by the provided factory. * - * @param collectionFactory A {@code Supplier} which returns a new {@code Collection} of the appropriate type - * each time it is called + * @param collectionFactory A {@code Supplier} which returns a new, empty {@code Collection} + * of the appropriate type each time it is called * @param <T> The type of the input elements * @param <C> The type of the resulting {@code Collection} * @return A {@code Collector} which collects elements into a {@code Collection} containing all the input elements @@ -146,7 +136,9 @@ } /** - * Accumulate elements into a {@code List}. + * Return a {@code Collector} that accumulates the input elements into a {@code List}. + * There are no guarantees on the type of the {@code List} + * returned, and the returned list is not guaranteed to be mutable. 
* * @param <T> The type of the input elements * @return A {@code Collector} which collects elements into a {@code List} containing all the input elements, @@ -186,7 +178,9 @@ } /** - * Accumulate elements into a {@code Set}. + * Return a {@code Collector} that accumulates the input elements into a {@code Set}. + * There are no guarantees on the type of the {@code Set} + * returned, and the returned set is not guaranteed to be mutable. * * @param <T> The type of the input elements * @return A {@code Collector} which collects elements into a {@code Set} containing all the input elements @@ -199,7 +193,7 @@ } /** - * Accumulate {@code String} elements into a {@link StringBuilder}. + * Return a {@code Collector} that concatenates the input elements into a new {@code StringBuilder}. * * @return A {@code Collector} which collects {@code String} elements into a {@code StringBuilder} containing * all of the input elements concatenated in encounter order @@ -211,7 +205,8 @@ } /** - * Accumulate {@code String} elements into a {@link StringJoiner}, using the specified separator. + * Return a {@code Collector} that concatenates the input elements into a new {@code StringJoiner}, + * using the specified separator. * * @return A {@code Collector} which collects String elements into a {@code StringJoiner} containing all of * the input elements concatenated in encounter order @@ -226,7 +221,7 @@ merger, false, true); } - static<K, V, M extends Map<K,V>> BinaryOperator<M> leftMapMerger(BinaryOperator<V> mergeFunction) { + private static<K, V, M extends Map<K,V>> BinaryOperator<M> mapMerger(BinaryOperator<V> mergeFunction) { return (m1, m2) -> { for (Map.Entry<K,V> e : m2.entrySet()) m1.merge(e.getKey(), e.getValue(), mergeFunction); @@ -246,16 +241,17 @@ * each city: * <pre> * Map<City, Set<String>> lastNamesByCity - * = people.stream().collect(groupingBy(Person::getCity) - * .then(mapping(Person::getLastName, toSet()))); + * = people.stream().collect(groupingBy(Person::getCity, + * mapping(Person::getLastName, toSet()))); * </pre> * - * @param <T> Type of values to be accepted - * @param <U> Type of values accepted by provided collector + * @param <T> The type of the input elements + * @param <U> Type of elements accepted by downstream collector * @param <R> Result type of collector - * @param mapper A function mapping {@code <T>} to {@code <U>} + * @param mapper A function mapping {@code T} to {@code U} * @param downstream collector which will accept mapped values - * @return A collector which applies a function to input values and then provides them to the downstream collector + * @return A collector which applies the mapper function to input elements and then provides + * them to the downstream collector */ public static <T, U, R> Collector<T, R> mapping(Function<? super T, ? extends U> mapper, Collector<U, R> downstream) { @@ -265,34 +261,158 @@ downstream.combiner(), downstream.isConcurrent(), downstream.isStable()); } + /** + * Given a {@code BinaryOperator<T>}, return a {@code Collector<T,T>} which calculates the reduction of + * its input elements under the specified {@code BinaryOperator}. + * + * @apiNote + * The {@code reducing()} collectors are most useful when used in a multi-level reduction, + * downstream of a {@code groupingBy} or {@code partitioningBy} collector; if you want to perform + * a simple reduction on a stream, use {@link Stream#reduce(BinaryOperator)}.
+ * For example, given a stream of {@code Person}, to calculate the tallest person in each city: + * <pre> + * Comparator<Person> byHeight = Comparators.comparing(Person::getHeight); + * BinaryOperator<Person> tallerOf = Comparators.greaterOf(byHeight); + * Map<City, Person> tallestByCity + * = people.stream().collect(groupingBy(Person::getCity, reducing(tallerOf))); + * </pre> + * @param op A {@code BinaryOperator<T>} used to reduce the input elements + * @param <T> The type of the input elements + * @return A {@code Collector} which implements the reduction operation + * @see #reducing(Function, BinaryOperator) + */ public static <T> Collector<T, T> reducing(BinaryOperator<T> op) { return new CollectorImpl<>(() -> null, (r, t) -> (r == null ? t : op.apply(r, t)), op, false, false); } - public static <T, U> Collector<T, U> - reducing(Function<? super T, ? extends U> mapper, BinaryOperator<U> op) { + /** + * Given a {@code BinaryOperator<U>} and a {@code Function<T,U>}, return a {@code Collector<T,U>} + * which calculates the reduction of the input elements after applying the mapping function. + * This is a generalization of {@link #reducing(BinaryOperator)} that allows a transformation of + * the elements before reduction. + * + * @apiNote + * The {@code reducing()} collectors are most useful when used in a multi-level reduction, + * downstream of a {@code groupingBy} or {@code partitioningBy} collector; if you want to perform + * a simple reduction on a stream, use {@link Stream#reduce(BinaryOperator)}. + * For example, given a stream of {@code Person}, to calculate the longest last name of residents + * in each city: + * <pre> + * Comparator<String> byLength = Comparators.comparing(String::length); + * BinaryOperator<String> longerOf = Comparators.greaterOf(byLength); + * Map<City, String> longestLastNameByCity + * = people.stream().collect(groupingBy(Person::getCity, + * reducing(Person::getLastName, longerOf))); + * </pre> + * + * @param mapper A mapping function to apply to each input value + * @param op A {@code BinaryOperator<U>} used to reduce the mapped values + * @param <T> The type of the input elements + * @param <U> The type of the mapped values + * @return A {@code Collector} implementing the map-reduce operation + * @see #reducing(BinaryOperator) + */ + public static <T, U> + Collector<T, U> reducing(Function<? super T, ? extends U> mapper, + BinaryOperator<U> op) { return new CollectorImpl<>(() -> null, (r, t) -> (r == null ? mapper.apply(t) : op.apply(r, mapper.apply(t))), op, false, false); } - public static<T, K> Collector<T, Map<K, List<T>>> groupingBy(Function<? super T, ? extends K> classifier) { + /** + * Returns a {@code Collector} that implements a "group by" operation on input elements of type {@code T}. + * + * <p>Accepts a classification function from {@code T} to {@code K}. The collector produces a {@code Map} whose keys + * are the set of values resulting from applying the classification function to the input elements, and whose + * corresponding values are {@code List}s containing the input elements which map to the associated key + * under the classification function. + * + * <p>No guarantees are made as to the type of the {@code Map} or the type of the + * {@code List} used for the map values.
+ * + * @param classifier The classifier function mapping input elements to keys + * @param <T> The type of the input elements + * @param <K> The type of the keys + * @return A {@code Collector} implementing the group-by operation + */ + public static<T, K> + Collector<T, Map<K, List<T>>> groupingBy(Function<? super T, ? extends K> classifier) { return groupingBy(classifier, HashMap::new); } + /** + * Returns a {@code Collector} that implements a "group by" operation on input elements of type {@code T}, + * resulting in a {@code Map} of a specific type. + * + * <p>Accepts a classification function from {@code T} to {@code K}, and a factory function which produces + * a {@code Map} of the desired type. The collector populates a {@code Map} produced by the factory function, + * whose keys are the set of values resulting from applying the classification function to the input elements, + * and whose corresponding values are {@code List}s containing the input elements which map to the associated key + * under the classification function. + * + * <p>No guarantees are made as to the type of the {@code List} used for the map values. + * + * @param classifier The classifier function mapping input elements to keys + * @param mapFactory A function which, when invoked, returns a new, empty instance + * of a {@code Map} of the desired type + * @param <M> The type of the resulting {@code Map} + * @param <T> The type of the input elements + * @param <K> The type of the keys + * @return A {@code Collector} implementing the group-by operation + */ public static<T, K, M extends Map<K, List<T>>> Collector<T, M> groupingBy(Function<? super T, ? extends K> classifier, Supplier<M> mapFactory) { return groupingBy(classifier, mapFactory, toList()); } + /** + * Returns a {@code Collector} that implements a cascaded "group by" operation on input elements + * of type {@code T}, resulting in a {@code Map} whose values are the result of another reduction. + * + * <p>Accepts a classification function from {@code T} to {@code K} and a {@code Collector} which implements + * another reduction on elements of type {@code T}. The collector populates a {@code Map} + * whose keys are the set of values resulting from applying the classification function to the input elements, + * and whose corresponding values are the result of reducing the input elements which map to the associated key + * under the classification function with the downstream reducer. + * + * <p>No guarantees are made as to the type of the resulting {@code Map}. + * + * @param classifier The classifier function mapping input elements to keys + * @param downstream A {@code Collector} implementing the downstream reduction + * @param <T> The type of the input elements + * @param <K> The type of the keys + * @param <D> The result type of the downstream reduction + * @return A {@code Collector} implementing the cascaded group-by operation + */ public static<T, K, D> Collector<T, Map<K, D>> groupingBy(Function<? super T, ? extends K> classifier, Collector<T, D> downstream) { return groupingBy(classifier, HashMap::new, downstream); } + /** + * Returns a {@code Collector} that implements a cascaded "group by" operation on input elements + * of type {@code T}, resulting in a {@code Map} of a specific type whose values are the result of another reduction.
+ * + * <p>Accepts a classification function from {@code T} to {@code K}, a factory function which produces + * a {@code Map} of the desired type, and a {@code Collector} which implements another reduction on elements + * of type {@code T}. The collector populates a {@code Map} produced by the factory function + * whose keys are the set of values resulting from applying the classification function to the input elements, + * and whose corresponding values are the result of reducing the input elements which map to the associated key + * under the classification function with the downstream reducer. + * + * @param classifier The classifier function mapping input elements to keys + * @param downstream A {@code Collector} implementing the downstream reduction + * @param <T> The type of the input elements + * @param <K> The type of the keys + * @param <D> The result type of the downstream reduction + * @return A {@code Collector} implementing the cascaded group-by operation + */ public static<T, K, D, M extends Map<K, D>> Collector<T, M> groupingBy(Function<? super T, ? extends K> classifier, Supplier<M> mapFactory, @@ -307,7 +427,7 @@ m.put(key, newContainer); return m; }; - return new CollectorImpl<>(mapFactory, accumulator, leftMapMerger(downstream.combiner()), false, true); + return new CollectorImpl<>(mapFactory, accumulator, mapMerger(downstream.combiner()), false, true); } public static<T, K> @@ -333,13 +453,14 @@ Collector<T, D> downstream) { Supplier<D> downstreamSupplier = downstream.resultSupplier(); BiFunction<D, T, D> downstreamAccumulator = downstream.accumulator(); + BinaryOperator<M> combiner = mapMerger(downstream.combiner()); if (downstream.isConcurrent()) { BiFunction<M, T, M> accumulator = (m, t) -> { K key = Objects.requireNonNull(classifier.apply(t), "element cannot be mapped to a null key"); downstreamAccumulator.apply(m.computeIfAbsent(key, k -> downstreamSupplier.get()), t); return m; }; - return new CollectorImpl<>(mapFactory, accumulator, leftMapMerger(downstream.combiner()), true, true); + return new CollectorImpl<>(mapFactory, accumulator, combiner, true, true); } else if (downstream.isStable()) { BiFunction<M, T, M> accumulator = (m, t) -> { @@ -350,7 +471,7 @@ } return m; }; - return new CollectorImpl<>(mapFactory, accumulator, leftMapMerger(downstream.combiner()), true, true); + return new CollectorImpl<>(mapFactory, accumulator, combiner, true, true); } else { BiFunction<M, T, M> accumulator = (m, t) -> { @@ -373,15 +494,38 @@ } } while (true); }; - return new CollectorImpl<>(mapFactory, accumulator, leftMapMerger(downstream.combiner()), true, true); + return new CollectorImpl<>(mapFactory, accumulator, combiner, true, true); } } + /** + * Return a {@code Collector} which partitions the input elements according to a {@code Predicate}, and + * organizes them into a {@code Map<Boolean, List<T>>}. + * + * <p>No guarantee is made as to the type of the returned {@code Map}, and it is not guaranteed to be mutable. + * + * @param predicate The predicate used for classifying input elements + * @param <T> The type of the input elements + * @return A {@code Collector} implementing the partitioning operation. + */ public static<T> Collector<T, Map<Boolean, List<T>>> partitioningBy(Predicate<?
super T> predicate) { return partitioningBy(predicate, toList()); } + /** + * Return a {@code Collector} which partitions the input elements according to a {@code Predicate}, + * computes another reduction on the partitioned elements, and organizes them into + * a {@code Map<Boolean, D>} whose values are the result of the downstream reduction. + * + * <p>No guarantee is made as to the type of the returned {@code Map}, and it is not guaranteed to be mutable. + * + * @param predicate The predicate used for classifying input elements + * @param downstream A {@code Collector} implementing the downstream reduction + * @param <T> The type of the input elements + * @param <D> The result type of the downstream reduction + * @return A {@code Collector} implementing the cascaded partitioning operation. + */ public static<T, D> Collector<T, Map<Boolean, D>> partitioningBy(Predicate<? super T> predicate, Collector<T, D> downstream) { @@ -402,7 +546,23 @@ }; return new CollectorImpl<>(() -> new Partition<>(downstream.resultSupplier().get(), downstream.resultSupplier().get()), - accumulator, leftPartitionMerger(downstream.combiner()), false, true); + accumulator, partitionMerger(downstream.combiner()), false, true); + } + + private static<D> BinaryOperator<Map<Boolean, D>> partitionMerger(BinaryOperator<D> op) { + return (m1, m2) -> { + Partition<D> left = (Partition<D>) m1; + Partition<D> right = (Partition<D>) m2; + if (left.forFalse == null) + left.forFalse = right.forFalse; + else if (right.forFalse != null) + left.forFalse = op.apply(left.forFalse, right.forFalse); + if (left.forTrue == null) + left.forTrue = right.forTrue; + else if (right.forTrue != null) + left.forTrue = op.apply(left.forTrue, right.forTrue); + return left; + }; } /** @@ -443,7 +603,7 @@ BinaryOperator<U> mergeFunction) { BiFunction<M, T, M> accumulator = (map, value) -> { map.merge(value, mapper.apply(value), mergeFunction); return map; }; - return new CollectorImpl<>(mapSupplier, accumulator, leftMapMerger(mergeFunction), false, true); + return new CollectorImpl<>(mapSupplier, accumulator, mapMerger(mergeFunction), false, true); } /** @@ -487,26 +647,52 @@ Collector<T, M> toConcurrentMap(Function<? super T, ? extends U> mapper, Supplier<M> mapSupplier, BinaryOperator<U> mergeFunction) { - BiFunction<M, T, M> accumulator = (map, value) -> { map.merge(value, mapper.apply(value), mergeFunction); return map; }; - return new CollectorImpl<>(mapSupplier, accumulator, leftMapMerger(mergeFunction), true, true); + BiFunction<M, T, M> accumulator = (map, value) -> { + map.merge(value, mapper.apply(value), mergeFunction); return map; + }; + return new CollectorImpl<>(mapSupplier, accumulator, mapMerger(mergeFunction), true, true); } + /** + * Return a {@code Collector} which applies an {@code int}-producing mapping function to each input + * element, and returns summary statistics for the resulting values. + * + * @param mapper The mapping function to apply to each element + * @param <T> The type of the input elements + * @return A {@code Collector} implementing the summary-statistics reduction + */ public static<T> Collector<T, IntStatistics> toIntStatistics(ToIntFunction<? super T> mapper) { return new CollectorImpl<>(IntStatistics::new, (r, t) -> { r.accept(mapper.applyAsInt(t)); return r; }, (l, r) -> { l.combine(r); return l; }, false, true); } + /** + * Return a {@code Collector} which applies an {@code long}-producing mapping function to each input + * element, and returns summary statistics for the resulting values. 
+ * + * @param mapper The mapping function to apply to each element + * @param <T> The type of the input elements + * @return A {@code Collector} implementing the summary-statistics reduction + */ public static<T> Collector<T, LongStatistics> toLongStatistics(ToLongFunction<? super T> mapper) { return new CollectorImpl<>(LongStatistics::new, (r, t) -> { r.accept(mapper.applyAsLong(t)); return r; }, (l, r) -> { l.combine(r); return l; }, false, true); } + /** + * Return a {@code Collector} which applies an {@code double}-producing mapping function to each input + * element, and returns summary statistics for the resulting values. + * + * @param mapper The mapping function to apply to each element + * @param <T> The type of the input elements + * @return A {@code Collector} implementing the summary-statistics reduction + */ public static<T> Collector<T, DoubleStatistics> toDoubleStatistics(ToDoubleFunction<? super T> mapper) { return new CollectorImpl<>(DoubleStatistics::new, (r, t) -> { r.accept(mapper.applyAsDouble(t)); return r; }, (l, r) -> { l.combine(r); return l; }, false, true); } - static final class Partition<T> extends AbstractMap<Boolean, T> implements Map<Boolean, T> { + private static final class Partition<T> extends AbstractMap<Boolean, T> implements Map<Boolean, T> { T forTrue; T forFalse; @@ -568,20 +754,4 @@ }; } } - - static<D> BinaryOperator<Map<Boolean, D>> leftPartitionMerger(BinaryOperator<D> op) { - return (m1, m2) -> { - Partition<D> left = (Partition<D>) m1; - Partition<D> right = (Partition<D>) m2; - if (left.forFalse == null) - left.forFalse = right.forFalse; - else if (right.forFalse != null) - left.forFalse = op.apply(left.forFalse, right.forFalse); - if (left.forTrue == null) - left.forTrue = right.forTrue; - else if (right.forTrue != null) - left.forTrue = op.apply(left.forTrue, right.forTrue); - return left; - }; - } }
--- a/src/share/classes/java/util/stream/IntStatistics.java Tue Mar 19 10:18:09 2013 +0100 +++ b/src/share/classes/java/util/stream/IntStatistics.java Tue Mar 19 16:08:56 2013 -0400 @@ -30,10 +30,12 @@ /** * A state object for collecting statistics such as count, min, max, sum, and average for {@code int} values. + * A {@code long} is used internally when calculating the sum, which improves the quality of the result of + * {@code getAverage}. */ public class IntStatistics implements IntConsumer { private long count; - private int sum; + private long sum; private int min = Integer.MIN_VALUE; private int max = Integer.MAX_VALUE; @@ -56,8 +58,12 @@ return count; } + public long getSumAsLong() { + return sum; + } + public int getSum() { - return sum; + return (int) sum; } public int getMin() {
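
Note: a small illustration (not from the changeset) of why the sum field in IntStatistics was widened to long: an int accumulator overflows once the running sum exceeds Integer.MAX_VALUE, which would also corrupt the average. The driver below is hypothetical and assumes the accessors named in the class javadoc.

```java
import java.util.stream.IntStatistics;

class IntStatisticsDemo {
    public static void main(String[] args) {
        IntStatistics stats = new IntStatistics();
        // Two values whose sum exceeds Integer.MAX_VALUE (2147483647).
        stats.accept(2_000_000_000);
        stats.accept(2_000_000_000);

        System.out.println(stats.getSumAsLong()); // 4000000000 -- exact, thanks to the long field
        System.out.println(stats.getSum());       // overflows when narrowed back to int
        System.out.println(stats.getAverage());   // 2.0E9 -- computed from the long sum
    }
}
```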