Java concurrency reloaded: Project Loom
Our team’s software development expertise is one of our most valuable resources. Our colleague, Lucian O., Java Developer, wrote about how Project Loom will improve development work.
Lucian’s perspective will walk you through multithreading in Java, platform threads, and virtual threads. Continue reading to see the main advantages of virtual threads for Java concurrency and to spark your curiosity about Project Loom.
1. Introduction
Given the history of concurrency in Java, I find the new Project Loom very interesting. Even though it is only in its first preview in Java 19, I think we, as developers, could benefit a lot from the performance improvements the project brings. In the end, Project Loom is a simple API that introduces the possibility of creating “a million threads” at a low resource cost. Let’s look at the benefits of Project Loom in the following sections.
2. History of Multithreading in Java
Even from the beginning, with the release of 1.0, Java had support for the so-called green threads, or user threads, which made it possible to run several threads on a single OS thread. These were removed in Java 1.1 in favor of platform threads, because the implementation was not very efficient and native threads outperformed them.
Java 1.5 added many utilities that improved multithreading, and Java 1.8 introduced the long-awaited Streams API, together with CompletableFuture and a lot more asynchronous support.
But none of these releases will have as much impact as virtual threads, which are already available as a preview feature in Java 19. As we will see in the next sections, virtual threads greatly increase performance and throughput.
3. Good old platform threads
Currently, when we think about concurrency in Java, we think in terms of platform threads. Today, the platform thread is java.lang.Thread, the plain thread as we know it: we create one by extending the Thread class or by implementing the Runnable interface, overriding the run() method, and then calling start() to start it. Below is the Thread-based version; a Runnable-based variant is sketched right after it.
import java.util.stream.IntStream;

public class PlatformThread extends Thread {

    @Override
    public void run() {
        // Print the numbers 1 to 99 from this thread.
        IntStream.range(1, 100).forEach(System.out::println);
    }

    public static void main(String[] args) {
        new PlatformThread().start();
    }
}
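For completeness, here is a minimal sketch of the Runnable-based variant mentioned above; the class name RunnableCounter is just an illustrative choice.
import java.util.stream.IntStream;

public class RunnableCounter implements Runnable {

    @Override
    public void run() {
        IntStream.range(1, 100).forEach(System.out::println);
    }

    public static void main(String[] args) {
        // The Runnable is handed to a Thread instance, which is then started.
        new Thread(new RunnableCounter()).start();
    }
}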
As we saw, Java has a long history in multithreading, and right now the concurrency model is very mature and safe to use. But it has one significant drawback: threads are very expensive in terms of the resources they use.
Platform threads are mapped 1:1 to OS threads, so the number of threads we can create is limited by the OS and by the memory available on our machine. This means that every thread we create costs us noticeably in memory and performance, and scaling a multithreaded application usually means adding hardware resources to the machine.
Thread.start() is considered expensive because, at startup, each Java thread reserves around 1 MB of memory for its stack, and that memory lives outside of the heap. No matter how much heap we allocate, we also have to account for the memory consumed by the threads themselves, so the cost of creating a thread is quite high.
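As a side note, the stack size can be tuned, either globally with the -Xss JVM option or per thread. The snippet below is a minimal sketch of the per-thread approach; the 256 KB value is only illustrative, and the JVM is free to treat it as a mere suggestion.
public class StackSizeHint {

    public static void main(String[] args) {
        // Thread(ThreadGroup, Runnable, String, long stackSize):
        // the last argument is a stack-size hint in bytes; 256 KB is just an example value.
        Thread thread = new Thread(null,
                () -> System.out.println("running with a smaller stack"),
                "small-stack-thread",
                256 * 1024);
        thread.start();
    }
}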
4. Virtual threads
Project Loom aims to drastically reduce the effort of writing, maintaining, and observing high-throughput concurrent applications that make the best use of available hardware.
Ron Pressler (Tech lead, Project Loom)
Virtual threads are not threads that run faster; they are threads that offer much better throughput. Each individual task takes the same time, so it has the same latency it would have on a platform thread, but we can create and run far more virtual threads than platform threads. In the end, that is a big gain, because the system can process much more information in a given amount of time. We can create millions of virtual threads without having to worry about running out of memory, so raising the number of threads we can create essentially gives us great throughput for free.
A virtual thread is a regular thread that is scheduled by the JVM rather than by the OS. A virtual thread runs Java code on an OS thread, but it does not hold on to an OS thread while waiting for a blocking operation. Each blocking operation yields, so the virtual thread voluntarily gives up its OS thread. What actually happens is that the virtual thread is suspended while it waits, and the JVM wakes it up only when necessary, so in the meantime it consumes almost no resources. This means that many virtual threads can run on, and share, the same OS thread. Another important benefit is that virtual threads live on the heap, which means they are subject to garbage collection.
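Besides Thread.startVirtualThread(Runnable), which the examples below use, Java 19 also exposes a builder-style API (as a preview feature, so --enable-preview is required). Here is a minimal sketch; the thread name is just an example.
public class VirtualThreadBuilderDemo {

    public static void main(String[] args) throws InterruptedException {
        // Thread.ofVirtual() returns a builder for virtual threads;
        // start(Runnable) creates and starts the thread in one call.
        Thread vt = Thread.ofVirtual()
                .name("my-virtual-thread")
                .start(() -> System.out.println("Hello from " + Thread.currentThread()));
        vt.join();
    }
}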
4.1. How many threads can we create?
Let’s start our demo with a very simple use case, but one that is enough for our purpose.
The program below keeps creating platform threads and simply parks them.
public static void main(String[] args) {
    var counter = new AtomicInteger();
    while (true) {
        new Thread(() -> {
            var count = counter.incrementAndGet();
            System.out.printf("Count = %s%n", count);
            // Park the thread so it stays alive without doing any work.
            LockSupport.park();
        }).start();
    }
}
In the same way, let’s create virtual threads.
public static void main(String[] args) {
    var counter = new AtomicInteger();
    while (true) {
        Thread.startVirtualThread(() -> {
            var count = counter.incrementAndGet();
            System.out.printf("Count = %s%n", count);
            // Park the virtual thread; it releases its carrier OS thread while parked.
            LockSupport.park();
        });
    }
}
On my machine, I stopped the platform-thread program at around 100,000 threads because the machine was freezing badly. I could have waited for the OutOfMemoryError, but with everything frozen, that would have taken a long time. Creating those 100,000 platform threads was also slow, taking somewhere around 5 minutes.
When running the virtual-thread program, I had 1 million threads within a few seconds, and they kept going. Again, I stopped the program fairly quickly because it was enough for the scope of this example. There is no doubt that we can create far more virtual threads than platform threads, and that virtual threads are created much faster.
4.2. Comparing throughput
To compare throughput, we will run 100,000 tasks that each sleep for 1 second, and we will measure the processing time in each case.
Platform thread tasks:
public void processWithPlatformThread() {
    long startTime = System.currentTimeMillis();
    try (var executor = Executors
            .newThreadPerTaskExecutor(Executors.defaultThreadFactory())) {
        IntStream.range(0, 100_000).forEach(count -> executor.submit(() -> {
            Thread.sleep(1000);
            System.out.println(count);
            return count;
        }));
    }
    System.out.printf("Processing time: %s seconds",
            (System.currentTimeMillis() - startTime) / 1000.0f);
}
The result, when I run this code, is: Processing time: 65.472 seconds.
Virtual thread tasks:
public void processWithVirtualThread() {
    long startTime = System.currentTimeMillis();
    try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
        IntStream.range(0, 100_000).forEach(count -> executor.submit(() -> {
            Thread.sleep(1000);
            System.out.println(count);
            return count;
        }));
    }
    System.out.printf("Processing time: %s seconds",
            (System.currentTimeMillis() - startTime) / 1000.0f);
}
The result, when I run the virtual thread code, is: Processing time: 8.835 seconds.
As we can see, we get far better performance when using virtual threads. Of course, these use cases are very simple, but I have seen similar tests where the tasks were actually processing something, e.g. fetching data from a URL, and even in that case the virtual-thread version performed 25-30% better.
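To give an idea of such an I/O-bound test, here is a minimal sketch, assuming we fetch the same URL 1,000 times on virtual threads with the standard HttpClient; the URL and the task count are just placeholders.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.Executors;
import java.util.stream.IntStream;

public class UrlFetchDemo {

    public static void main(String[] args) {
        var client = HttpClient.newHttpClient();
        var request = HttpRequest.newBuilder(URI.create("https://example.com")).build();
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 1_000).forEach(i -> executor.submit(() -> {
                // Each call blocks on network I/O, so the virtual thread
                // yields its carrier OS thread while it waits.
                var response = client.send(request, HttpResponse.BodyHandlers.ofString());
                System.out.println(i + " -> " + response.statusCode());
                return response;
            }));
        }
    }
}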
As I mentioned earlier, virtual threads do not make our code run faster. What we can see from this test is that they improve our application’s throughput: rather than executing a single task faster, they allow us to process far more tasks in the same amount of time.
5. Conclusion
In this article, we have seen the advantages that virtual threads bring to the Java concurrency world. There is no doubt that the frameworks we use every day will make the greatest use of virtual threads, but in the end this will be very beneficial for us as developers too. And, of course, we can benefit directly in our own concurrent code if we make use of virtual threads.
The second chapter of Project Loom is Structured Concurrency, an incubator feature in Java 19. It is outside the scope of this article, but I encourage you to read about it; it’s great!
Of course, there is still some time to wait until Project Loom gets a final release in Java, but it will reach its second preview in Java 20 in March of this year, so it may not be that long. Start upgrading your projects to newer Java versions!