In Java, when we operate on the elements of a collection type such as an array or a Collection, we usually process them one by one in a loop, or in a Stream.

 For example, suppose we have the following requirement:


From a given sentence, return a list of the words longer than 5 characters, sorted by length in descending order, up to a maximum of 3 words.


In Java 7 and earlier, we would implement it like this:

public List<String> sortGetTop3LongWords(@NotNull String sentence) {
    String[] words = sentence.split(" ");
    List<String> wordList = new ArrayList<>();
    for (String word : words) {
        if (word.length() > 5) {
            wordList.add(word);
        }
    }
    wordList.sort((o1, o2) -> o2.length() - o1.length());
    if (wordList.size() > 3) {
        wordList = wordList.subList(0, 3);
    }
    return wordList;
}


In Java 8 and later, with the help of Stream, we can express the same logic far more elegantly:


public List<String> sortGetTop3LongWordsByStream(@NotNull String sentence) {
    return Arrays.stream(sentence.split(" "))
            .filter(word -> word.length() > 5)
            .sorted((o1, o2) -> o2.length() - o1.length())
            .limit(3)
            .collect(Collectors.toList());
}


The intuitive impression is that the Stream version is more concise, a breath of fresh air. Many developers use Stream in their code, but their knowledge of it is often limited to a few simple operations such as filter, map, and collect. The scenarios Stream can be applied to, and its capabilities, go far beyond these.


So the question is: what are the advantages of Stream over the traditional for-each approach?


Let's set that question aside for a moment, get a comprehensive overall picture of Stream, and then come back to it.


Drawing on issues encountered in code reviews with my team over the years, together with everyday project coding experience, I have compiled a detailed summary of Stream's core points, easily confused usages, and typical usage scenarios. I hope it helps you gain a more comprehensive understanding of Stream and apply it more effectively in your projects.

 A First Look at Stream

 To summarize, Stream operations can be divided into 3 types:

  •  Creating a Stream
  •  Intermediate processing of a Stream
  •  Terminating a Stream


Each type of Stream pipeline operation contains a number of API methods; the function of each is described below.

  •  Starting the Pipeline


This stage is responsible for creating a new Stream, either from scratch or from an existing array, List, Set, Map, or other collection-type object.

API               Functional Description
stream()          Creates a new serial stream object.
parallelStream()  Creates a stream object whose operations can execute in parallel.
Stream.of()       Creates a new serial stream object from a given set of elements.
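
To make the creation APIs concrete, here is a minimal sketch; these are all standard JDK methods, while the class name and sample data are just illustrative:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Stream;

public class StreamCreationDemo {
    public static void main(String[] args) {
        // From an existing collection
        List<String> list = Arrays.asList("a", "b", "c");
        Stream<String> fromList = list.stream();

        // A parallel stream from the same collection
        Stream<String> parallel = list.parallelStream();

        // Directly from a fixed set of elements
        Stream<String> ofStream = Stream.of("x", "y", "z");

        // From an array
        Stream<String> fromArray = Arrays.stream(new String[]{"p", "q"});

        System.out.println(fromList.count());   // 3
        System.out.println(parallel.count());   // 3
        System.out.println(ofStream.count());   // 3
        System.out.println(fromArray.count());  // 2
    }
}
```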
  •  Intermediate Pipeline


This stage is responsible for processing operations on the Stream, each returning a new Stream object; intermediate pipeline operations can be chained.

API         Functional Description
filter()    Keeps only the elements matching the given condition and returns a new stream.
map()       Converts each element to another object type, one-to-one, and returns a new stream.
flatMap()   Converts each element to another object type, one-to-many: each original element may become 1 or more elements of the new type; returns a new stream.
limit()     Keeps only the specified number of elements from the front of the stream and returns a new stream.
skip()      Skips the specified number of elements at the front of the stream and returns a new stream.
concat()    Merges the data of two streams into one and returns the new stream.
distinct()  De-duplicates all elements of the stream and returns a new stream.
sorted()    Sorts all elements of the stream according to the specified rule and returns a new stream.
peek()      Visits each element of the stream one by one and returns the processed stream.
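
Several of these intermediate operations can be chained in one pipeline. A minimal sketch (class name and sample data are illustrative); the comments trace the elements remaining after each step:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class IntermediateOpsDemo {
    public static void main(String[] args) {
        Stream<Integer> first = Stream.of(1, 2, 3, 3);
        Stream<Integer> second = Stream.of(3, 4, 5);

        List<Integer> result = Stream.concat(first, second) // 1,2,3,3,3,4,5
                .distinct()                                 // 1,2,3,4,5
                .skip(1)                                    // 2,3,4,5
                .limit(3)                                   // 2,3,4
                .collect(Collectors.toList());

        System.out.println(result); // [2, 3, 4]
    }
}
```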
  •  Terminating the Pipeline


As the name suggests, once a terminal operation runs, the Stream ends; the terminal operation may perform some final logic, or return result data as required.

API          Functional Description
count()      Returns the number of elements remaining after stream processing.
max()        Returns the maximum element after stream processing.
min()        Returns the minimum element after stream processing.
findFirst()  Terminates stream processing as soon as the first matching element is found.
findAny()    Exits stream processing as soon as any matching element is found; equivalent to findFirst() for serial streams, but more efficient for parallel streams, where a hit in any segment terminates the remaining computation.
anyMatch()   Returns a boolean, similar in spirit to List.contains(); determines whether any element matches the condition.
allMatch()   Returns a boolean; determines whether all elements match the condition.
noneMatch()  Returns a boolean; determines whether no element matches the condition.
collect()    Converts the stream to a specified type, configured via Collectors.
toArray()    Converts the stream to an array.
iterator()   Converts the stream to an Iterator object.
forEach()    No return value; visits the elements one by one and applies the given processing logic.

 Stream Method Usage

 map vs flatMap


map and flatMap are both used to convert existing elements into other elements. The difference is:


  • map is strictly one-to-one, i.e. each element is converted into exactly 1 new element

  • flatMap can be one-to-many, i.e. each element can be converted into 1 or more new elements.


For example, suppose there is a list of string IDs that needs to be converted into a list of User objects. This can be done with map:


public void stringToIntMap() {
    List<String> ids = Arrays.asList("205", "105", "308", "469", "627", "193", "111");
    List<User> results = ids.stream()
            .map(id -> {
                User user = new User();
                user.setId(id);
                return user;
            })
            .collect(Collectors.toList());
    System.out.println(results);
}


After execution, you will find that each element has been converted into a corresponding new element, and the total number of elements is the same before and after:


[User{id='205'}, 
 User{id='105'},
 User{id='308'}, 
 User{id='469'}, 
 User{id='627'}, 
 User{id='193'}, 
 User{id='111'}]


Now let’s say we have a list of sentences and need to extract every word to get a list of all the words. map cannot do this, so flatMap is needed:


public void stringToIntFlatmap() {
    List<String> sentences = Arrays.asList("hello world","Jia Gou Wu Dao");
    List<String> results = sentences.stream()
            .flatMap(sentence -> Arrays.stream(sentence.split(" ")))
            .collect(Collectors.toList());
    System.out.println(results);
}


The result is as follows; note that the result list contains more elements than the original list:


[hello, world, Jia, Gou, Wu, Dao]


It should be added that flatMap actually maps each element to a new Stream first, and then expands and merges those multiple Streams into a single complete new Stream.
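
Conceptually, the flatMap call above is equivalent to first mapping each sentence to its own sub-stream of words and then merging the sub-streams, as this sketch shows (the class name is illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class FlatMapExpansionDemo {
    public static void main(String[] args) {
        List<String> sentences = Arrays.asList("hello world", "Jia Gou Wu Dao");

        // Step 1: map each sentence to its own sub-stream of words
        Stream<Stream<String>> subStreams = sentences.stream()
                .map(sentence -> Arrays.stream(sentence.split(" ")));

        // Step 2: merge the sub-streams into one flat stream
        List<String> words = subStreams
                .flatMap(Function.identity())
                .collect(Collectors.toList());

        System.out.println(words); // [hello, world, Jia, Gou, Wu, Dao]
    }
}
```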

 The peek and forEach methods


Both peek and forEach can be used to visit the elements one by one and process them.


However, as introduced earlier, peek is an intermediate operation while forEach is a terminal one. This means peek can only serve as a processing step in the middle of a pipeline; it cannot be executed directly to produce a result, and runs only when followed by a terminal operation. forEach, as a terminal method with no return value, executes the operation directly.


public void testPeekAndforeach() {
    List<String> sentences = Arrays.asList("hello world","Jia Gou Wu Dao");
    System.out.println("----before peek----");
    sentences.stream().peek(sentence -> System.out.println(sentence));
    System.out.println("----after peek----");
    System.out.println("----before foreach----");
    sentences.stream().forEach(sentence -> System.out.println(sentence));
    System.out.println("----after foreach----");
    System.out.println("----before peek and count----");
    sentences.stream().peek(sentence -> System.out.println(sentence)).count();
    System.out.println("----after peek and count----");
}


The output shows that peek alone does not execute, but it does once it is followed by a terminal operation, while forEach executes directly:


----before peek----
----after peek----
----before foreach----
hello world
Jia Gou Wu Dao
----after foreach----
----before peek and count----
hello world
Jia Gou Wu Dao
----after peek and count----


 filter, sorted, distinct, limit


These are commonly used intermediate operations; their meaning is described in the table above. In practice you can pick one or more as needed, combine them freely, and even use the same method several times in one pipeline:


public void testGetTargetUsers() {
    List<String> ids = Arrays.asList("205","10","308","49","627","193","111", "193");
    List<Dept> results = ids.stream()
            .filter(s -> s.length() > 2)
            .distinct()
            .map(Integer::valueOf)
            .sorted(Comparator.comparingInt(o -> o))
            .limit(3)
            .map(id -> new Dept(id))
            .collect(Collectors.toList());
    System.out.println(results);
}

 The processing logic of the above code snippet is clear:

  1.  Use filter to drop the elements that do not meet the criteria
  2.  De-duplicate the remaining elements with distinct
  3.  Convert the strings to integers with the map operation
  4.  Sort the numbers in ascending order with sorted
  5.  Use limit to keep only the first 3 elements
  6.  Use map again to convert each id into a Dept object
  7.  Collect the final processed data into a List with the collect terminal operation

 Output results:

[Dept{id=111}, Dept{id=193}, Dept{id=205}]

 Simple-Result Terminal Methods


As introduced earlier, terminal methods such as count, max, min, findAny, findFirst, anyMatch, allMatch, and noneMatch belong to the simple-result terminal methods discussed here. By simple, we mean that the result is a number, a boolean, or an Optional value.


public void testSimpleStopOptions() {
    List<String> ids = Arrays.asList("205", "10", "308", "49", "627", "193", "111", "193");

    System.out.println(ids.stream().filter(s -> s.length() > 2).count());

    System.out.println(ids.stream().filter(s -> s.length() > 2).anyMatch("205"::equals));

    ids.stream().filter(s -> s.length() > 2)
            .findFirst()
            .ifPresent(s -> System.out.println("findFirst:" + s));
}

 The result after execution is:


6
true
findFirst:205

 A Pitfall to Avoid


A reminder is needed here: once a terminal operation has been executed on a Stream, the stream cannot be reused for further operations; otherwise an error is thrown. See the following example:


public void testHandleStreamAfterClosed() {
    List<String> ids = Arrays.asList("205", "10", "308", "49", "627", "193", "111", "193");
    Stream<String> stream = ids.stream().filter(s -> s.length() > 2);
    System.out.println(stream.count());
    try {
        System.out.println(stream.anyMatch("205"::equals));
    } catch (Exception e) {
        e.printStackTrace();
    }
}

 When executed, the result is as follows:


6

java.lang.IllegalStateException: stream has already been operated upon or closed
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:229)
	at java.util.stream.ReferencePipeline.anyMatch(ReferencePipeline.java:449)
	at com.veezean.skills.stream.StreamService.testHandleStreamAfterClosed(StreamService.java:153)
	at com.veezean.skills.stream.StreamService.main(StreamService.java:176)


Because the stream was already terminated by the count() method, calling anyMatch on it again throws "stream has already been operated upon or closed". This requires special attention in use.

 Collection-Result Terminal Methods


Because Stream is mainly used for collection-processing scenarios, beyond the simple-result terminal methods above, the more common need is to obtain a collection-type result object, such as a List, Set, or HashMap.


This is where the collect method comes into play; it supports generating result data of the following types:


  • A collection, such as a List, Set, or HashMap

  • A concatenated string, with support for processing multiple elements and joining the results

  • An object that records counts or computes sums

 Generating Collections


This is probably one of the most frequently used scenarios for collect:


public void testCollectStopOptions() {
    List<Dept> ids = Arrays.asList(new Dept(17), new Dept(22), new Dept(23));

    List<Dept> collectList = ids.stream().filter(dept -> dept.getId() > 20)
            .collect(Collectors.toList());
    System.out.println("collectList:" + collectList);

    Set<Dept> collectSet = ids.stream().filter(dept -> dept.getId() > 20)
            .collect(Collectors.toSet());
    System.out.println("collectSet:" + collectSet);
    Map<Integer, Dept> collectMap = ids.stream().filter(dept -> dept.getId() > 20)
            .collect(Collectors.toMap(Dept::getId, dept -> dept));
    System.out.println("collectMap:" + collectMap);
}

 The results are as follows:


collectList:[Dept{id=22}, Dept{id=23}]
collectSet:[Dept{id=23}, Dept{id=22}]
collectMap:{22=Dept{id=22}, 23=Dept{id=23}}
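
One caveat worth noting with Collectors.toMap, not shown in the example above: if two elements map to the same key, the two-argument overload throws an IllegalStateException. The three-argument overload takes a merge function to resolve the conflict. A minimal sketch (the class name and sample data are illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ToMapMergeDemo {
    public static void main(String[] args) {
        // "205" appears twice, so the keys would collide
        List<String> ids = Arrays.asList("205", "10", "308", "205");

        // The 3rd argument decides which value to keep on a key collision
        Map<String, Integer> lengths = ids.stream()
                .collect(Collectors.toMap(
                        id -> id,          // key
                        String::length,    // value
                        (a, b) -> a));     // merge: keep the first value
        System.out.println(lengths.size()); // 3
    }
}
```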

 Generating Joined Strings


I’m sure you’re all familiar with the scenario of joining values from a list or array into a comma-separated string.


If you do it with a for loop and a StringBuilder, you also have to think about what to do with the trailing comma, which is tedious:


public void testForJoinStrings() {
    List<String> ids = Arrays.asList("205", "10", "308", "49", "627", "193", "111", "193");
    StringBuilder builder = new StringBuilder();
    for (String id : ids) {
        builder.append(id).append(',');
    }
    builder.deleteCharAt(builder.length() - 1);
    System.out.println(builder.toString());
}


But with Stream, collect makes it a breeze:


public void testCollectJoinStrings() {
    List<String> ids = Arrays.asList("205", "10", "308", "49", "627", "193", "111", "193");
    String joinResult = ids.stream().collect(Collectors.joining(","));
    System.out.println( joinResult);
}


Both approaches give exactly the same result, but the Stream version is more elegant:

205,10,308,49,627,193,111,193

 📢 Key point:


Many readers have pointed out in the comments that this particular scenario can be handled with String.join() and does not require a stream at all. To be clear: the charm of Stream is that joining can be combined with other business logic in the same pipeline, making the code flow more naturally, all in one place. For a pure string-joining requirement there is indeed no need to use Stream; no need to use a cleaver to kill a chicken. But once filtering or mapping is involved as well, Stream-based string joining really shines.
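
To illustrate that point, Collectors.joining also has a three-argument overload taking a delimiter, prefix, and suffix, which combines naturally with filter and map, for example to build a SQL IN clause. A sketch (the class name and sample data are illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class JoiningWithLogicDemo {
    public static void main(String[] args) {
        List<Integer> ids = Arrays.asList(205, 10, 308, 49);

        // Filter, convert, and join in one pipeline
        String inClause = ids.stream()
                .filter(id -> id > 100)
                .map(String::valueOf)
                .collect(Collectors.joining(", ", "(", ")"));

        System.out.println(inClause); // (205, 308)
    }
}
```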

 Batch math operations on data


Another scenario, perhaps less common in practice, is using collect to produce aggregate numeric results; it is worth understanding how this works:


public void testNumberCalculate() {
    List<Integer> ids = Arrays.asList(10, 20, 30, 40, 50);

    Double average = ids.stream().collect(Collectors.averagingInt(value -> value));
    System.out.println( average);

    IntSummaryStatistics summary = ids.stream().collect(Collectors.summarizingInt(value -> value));
    System.out.println(summary);
}


In the above example, the collect method is used to perform mathematical operations on the element values in the list, with the following result:

30.0
IntSummaryStatistics{count=5, sum=150, min=10, average=30.000000, max=50}

 Parallel Stream

 Description of the mechanism


Using parallel streams can effectively utilize a computer’s multiple CPU cores to speed up execution. A parallel stream works by dividing the entire stream into multiple segments, executing the processing logic for each segment in parallel, and finally merging the results of the segments into an overall result.
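
A minimal sketch of a parallel stream in action (the class name and sample data are illustrative); printing the thread names shows the work being split across the common ForkJoinPool:

```java
import java.util.Arrays;
import java.util.List;

public class ParallelStreamDemo {
    public static void main(String[] args) {
        List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8);

        // The segments are processed in parallel and the results merged
        int sum = numbers.parallelStream()
                .mapToInt(Integer::intValue)
                .sum();
        System.out.println(sum); // 36

        // Thread names reveal which worker handled each element
        numbers.parallelStream()
                .forEach(n -> System.out.println(
                        Thread.currentThread().getName() + " -> " + n));
    }
}
```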

 Constraints and limitations


Parallel streams are similar to multi-threaded parallel processing, so the problems associated with multi-threaded scenarios, such as deadlocks, exist here as well. The function logic executed by a parallel stream's operations must therefore be thread-safe.
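
A common thread-safety mistake is mutating a shared, non-thread-safe collection from a parallel forEach. A sketch contrasting the unsafe pattern with the safe one (the class name is illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ParallelSafetyDemo {
    public static void main(String[] args) {
        // UNSAFE: ArrayList is not thread-safe, so adding to it from a
        // parallel forEach can lose elements or throw exceptions:
        List<Integer> unsafe = new ArrayList<>();
        // IntStream.range(0, 10_000).parallel().forEach(unsafe::add); // don't do this

        // SAFE: let collect() handle per-thread accumulation and merging
        List<Integer> safe = IntStream.range(0, 10_000)
                .parallel()
                .boxed()
                .collect(Collectors.toList());
        System.out.println(safe.size()); // 10000
    }
}
```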

 Answer to the original question


This basically concludes the introduction to the concepts and usage of Java Stream. Let's return to the question raised at the beginning of this article:


What are the advantages of Stream over the traditional for-each approach?

 Based on the previous introduction, we should be able to come up with the following answers:


  • A cleaner, more declarative coding style that better reflects the logical intent of the code

  • Decoupled logic: each intermediate processing step only implements its own logic according to the contract, without needing to care about what happens upstream or downstream
  •  In parallel-stream scenarios, it can be more efficient than an element-by-element loop

  • Functional interfaces and lazy execution: no matter how many intermediate pipeline steps there are, nothing executes until a terminal operation is encountered, which avoids unnecessary intermediate work
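
The lazy-execution point can be demonstrated directly, echoing the peek example from earlier (the class name and sample data are illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Stream;

public class LazyEvaluationDemo {
    public static void main(String[] args) {
        List<String> words = Arrays.asList("alpha", "beta", "gamma");

        // Building the pipeline prints nothing: filter is only recorded
        Stream<String> pipeline = words.stream()
                .filter(w -> {
                    System.out.println("filtering: " + w);
                    return w.length() > 4;
                });
        System.out.println("pipeline built, nothing executed yet");

        // Only the terminal operation triggers the filtering
        long count = pipeline.count();
        System.out.println("count = " + count); // count = 2
    }
}
```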


Of course, Stream isn’t all upside; in some respects it has drawbacks:

  •  Debugging stream code is less convenient

  • Programmers switching from the traditional style need some adaptation time


Well, that concludes this exposition of the main points and usage skills of Java Stream. After the introduction above, are you eager to try it? Go use it in a project and experience it for yourself! Of course, if you have questions, you are welcome to discuss them with me.

 Supplement 1:


Limited by space, this article gives only a brief introduction to the use of collect, but Stream's collection capabilities go far beyond what you might imagine. To cover this part properly, I have written a dedicated article on the usage and principles of Stream's collect; click 👉👉👉 “Speaking through the JAVA Stream’s use of collect and principles” to unlock more advanced ways to use Stream.


After the two Java Stream articles were published, they received enthusiastic support, gaining 10w+ reads, 2k+ likes, and 5k+ favorites, and this article was selected as a Juejin (掘金) 2022 annual hot article. Being recognized by so many readers is the happiest part of sharing technology. Many readers raised questions or shared their own insights in the comments, sparking lively discussions. After combing through them, I have summarized and elaborated on the most-discussed points as a supplement to the original two Stream articles; interested readers can click “again talk about some of the practical skills and attention of the Java Stream” to see for themselves.

 

By lzz
