2. Why Synchronization?
Let’s consider a typical race condition where we calculate the sum, and
multiple threads execute the calculate() method:
public class SynchronizedMethods {

    private int sum = 0;

    public void calculate() {
        setSum(getSum() + 1);
    }

    // standard setters and getters
}
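For completeness, the accessors referred to by the comment are just the usual ones; a minimal version might look like this:

public int getSum() {
    return sum;
}

public void setSum(int sum) {
    this.sum = sum;
}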
Then, let’s write a simple test:
@Test
public void givenMultiThread_whenNonSyncMethod() throws InterruptedException {
    ExecutorService service = Executors.newFixedThreadPool(3);
    SynchronizedMethods summation = new SynchronizedMethods();

    IntStream.range(0, 1000)
      .forEach(count -> service.submit(summation::calculate));
    service.awaitTermination(1000, TimeUnit.MILLISECONDS);

    assertEquals(1000, summation.getSum());
}
We’re using an ExecutorService with a 3-thread pool to execute calculate() 1000 times.
If we executed this serially, the expected output would be 1000, but because the read-increment-write sequence in calculate() is not atomic, our multi-threaded execution fails almost every time with an inconsistent actual result:
java.lang.AssertionError: expected:<1000> but was:<965>
  at org.junit.Assert.fail(Assert.java:88)
  at org.junit.Assert.failNotEquals(Assert.java:834)
  ...
5. Processing Results of Asynchronous Computations
The most generic way to process the result of a computation is to feed it to
a function. The thenApply method does exactly that: it accepts a Function
instance, uses it to process the result, and returns a Future that holds the
value returned by that function:
CompletableFuture<String> completableFuture
  = CompletableFuture.supplyAsync(() -> "Hello");

CompletableFuture<String> future = completableFuture
  .thenApply(s -> s + " World");

assertEquals("Hello World", future.get());
If we don’t need to return a value down the Future chain, we can use an
instance of the Consumer functional interface. Its single method takes a
parameter and returns void.
There’s a method for this use case in CompletableFuture. The
thenAccept method receives a Consumer and passes it the result of the
computation. Then the final future.get() call returns an instance of the Void
type:
CompletableFuture<String> completableFuture
  = CompletableFuture.supplyAsync(() -> "Hello");

CompletableFuture<Void> future = completableFuture
  .thenAccept(s -> System.out.println("Computation returned: " + s));

future.get();
Finally, if we neither need the value of the computation nor want to return
some value at the end of the chain, then we can pass a Runnable lambda to
the thenRun method. In the following example, we simply print a line in the
console after calling the future.get():
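A sketch of such a chain, following the pattern of the previous examples (the printed message is illustrative):

CompletableFuture<String> completableFuture
  = CompletableFuture.supplyAsync(() -> "Hello");

CompletableFuture<Void> future = completableFuture
  .thenRun(() -> System.out.println("Computation finished."));

future.get();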
1. Overview
Without the necessary synchronization, the compiler, runtime, or processors
may apply all sorts of optimizations. Even though these optimizations are
usually beneficial, they can sometimes cause subtle issues.
Caching and reordering are optimizations that may surprise us in concurrent
contexts. Java and the JVM provide many ways to control memory order,
and the volatile keyword is one of them.
In this chapter, we’ll focus on a foundational but often misunderstood Java
concept: the volatile keyword. First, we’ll start with some background on
how the underlying computer architecture works, and then we’ll familiarize
ourselves with memory order in Java. Finally, we’ll discuss the challenges of
concurrency in multiprocessor shared-memory architectures, and how volatile
helps fix them.
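As a quick preview of the kind of issue involved, here is a minimal sketch of a visibility problem (the class and field names are only illustrative): if the flag were not declared volatile, the reader thread could loop forever on a stale cached value.

public class TaskRunner {

    // Declaring the flag volatile guarantees that a write by one thread
    // becomes visible to subsequent reads by other threads.
    private static volatile boolean ready = false;

    public static void main(String[] args) throws InterruptedException {
        Thread reader = new Thread(() -> {
            while (!ready) {
                // busy-wait until the writer publishes the flag
            }
            System.out.println("Flag observed, proceeding.");
        });
        reader.start();

        Thread.sleep(100);
        ready = true; // publish the flag to the reader thread
    }
}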