

Graal Performance


Project GraalVM: performance testing performed on four JVMs (Oracle, IBM, Zulu, GraalVM)

If you want to use the polyglot features of GraalVM, the other JVMs of course do not provide a suitable alternative. In this comparison, Zulu OpenJDK and GraalVM were more stable in their memory usage than Oracle JDK and OpenJDK, while GraalVM spent the most time on garbage collection and produced the slowest response times for the application. Java bytecode is compiled into native code during execution by a just-in-time (JIT) compiler such as Graal or HotSpot's C2 for faster performance. Ahead-of-time (AOT) compilation is another way of improving the performance of Java programs, and in particular the startup time of the JVM.

The JVM executes Java bytecode and compiles frequently executed code to native code. This is called just-in-time (JIT) compilation. The JVM decides which code to JIT-compile based on profiling information collected during execution. The compiler used for AOT compilation is Graal. GraalVM is an ecosystem for compiling and running applications written in multiple languages: it removes the isolation between programming languages and enables interoperability in a shared runtime. One example was a JavaScript app that ran faster on the GraalVM JavaScript engine than on V8. Support for GraalVM on Windows is currently under development.
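
To make that interoperability concrete, here is a minimal sketch using GraalVM's Polyglot API to evaluate a JavaScript expression from Java. The class name and the expression are illustrative only, and the snippet assumes you are running on a GraalVM distribution with the JavaScript engine installed.

```java
import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Value;

// Illustrative only: evaluate a JavaScript snippet from Java through the
// GraalVM Polyglot API (requires a GraalVM runtime with JavaScript available).
public class PolyglotDemo {
    public static void main(String[] args) {
        try (Context context = Context.create("js")) {
            Value result = context.eval("js", "[1, 2, 3].map(x => x * 2).join(',')");
            System.out.println("JavaScript returned: " + result.asString());
        }
    }
}
```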

Create a Native Image

Running your application inside a Java VM comes with startup and footprint costs. GraalVM has a feature to create native images for existing JVM-based applications: the native image builder generates a snapshot of an application after startup and bundles it in a binary executable. The image generation process employs static analysis to find all code reachable from the main Java method and then performs full ahead-of-time (AOT) compilation. The resulting native binary contains the whole program in machine-code form for immediate execution.
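
A minimal sketch of that workflow, assuming the standard GraalVM tooling is installed: the class below is an ordinary Java program, and the comments show the kind of commands you would run to compile it and then build a native executable (class and file names are examples, not prescribed).

```java
// HelloNative.java -- example program to turn into a native image.
// Typical workflow (names are illustrative):
//   javac HelloNative.java
//   native-image HelloNative     # static analysis + ahead-of-time compilation
//   ./hellonative                # runs as machine code, no JVM warm-up
public class HelloNative {
    public static void main(String[] args) {
        System.out.println("Hello from a native executable!");
    }
}
```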

For Microservices Frameworks

The Spring Native project provides beta support for compiling Spring applications to native executables using GraalVM Native Image. For these frameworks, GraalVM native images significantly reduce the runtime memory requirements compared to running on HotSpot.
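
As a rough illustration, the hypothetical service below is the kind of minimal Spring Boot application these frameworks compile to a native executable; the class name and endpoint are made up, and the native build itself is driven by the framework's build tooling rather than by anything in the code.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical minimal Spring Boot service; compiled as a GraalVM native image it
// starts in milliseconds and needs far less memory than the same app on HotSpot.
@SpringBootApplication
@RestController
public class DemoApplication {

    @GetMapping("/hello")
    public String hello() {
        return "Hello from a native Spring service";
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
```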

GraalVM Debugging and Monitoring Tools:

GraalVM provides a set of tools for developers to debug and monitor deployed applications, as well as the GraalVM platform itself. Traditional tools may not be ready to diagnose polyglot programs; GraalVM's tools, however, can inspect single- or multi-language applications through debugging via numerous clients, profiling, statement counting, dynamic analysis, and much more.

AOT vs. JIT:

What's the benefit of this? What's the benefit of AOT? The problem with a JIT compiler, in the JIT configuration, is that there's a lot of work going on when you start the application. First of all, while the JVM executable starts executing, you load the classes, the Java bytecode, from your disk, then you have to verify the bytecode, and then you start interpreting the bytecode, which is usually about 50 times slower than your final machine code would be. This interpretation of the bytecode and the static initializers during startup is about 50 times slower.

You run the static initializers as usual in interpreted mode, and then what you do on the Java HotSpot virtual machine is create a first-tier compilation. You use a fast compiler, C1, the client compiler of HotSpot, to create your first machine code. You do that because you want to get your application up to speed as quickly as possible; you don't want to wait for the final machine code to arrive, because there's such a big difference between the interpreter speed and the final machine-code speed that you want an intermediate solution. This intermediate solution is provided by the C1 client compiler, and it specifically has something in there that gathers profiling feedback, that gathers information about how the application is running the code.

Profiling feedback includes loop counts; it includes, for every branch, the counts of how often the branch went right or left; it includes information about the actual concrete classes that occur at certain places, meaning at instanceof checks or at virtual calls; and it includes some other minor information about the execution. Gathering this profiling feedback is not free, it slows down your code, because while you start up you need to do this extra bookkeeping of the profiling feedback.
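
You can watch this tiered behaviour yourself with the standard -XX:+PrintCompilation flag, which logs each method as it moves from the interpreter through C1 (with profiling) to C2 or Graal. The class below is just an illustrative hot loop to trigger that progression.

```java
// Run with:  java -XX:+PrintCompilation HotLoop
// The log lists methods as they are compiled; the tier column shows the move
// from C1 with profiling (tier 3) to the final C2/Graal compile (tier 4).
public class HotLoop {
    static long accumulate(long n) {
        long sum = 0;
        for (long i = 0; i < n; i++) {
            sum += i % 7;          // enough work to make the method hot
        }
        return sum;
    }

    public static void main(String[] args) {
        long total = 0;
        for (int round = 0; round < 10_000; round++) {
            total += accumulate(100_000);
        }
        System.out.println(total);
    }
}
```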

Then, when a method gets really hot, meaning you have the first-tier compile and you've gathered some profile, the method is scheduled for the second compiler. This is when the heavyweight compiler, either the C2 compiler or the GraalVM compiler, comes in and uses all the information gathered during the startup sequence of the application to create the final, hopefully very good, machine code, and then you execute faster. As you can see, there's a long sequence here, and this is the main reason why an application running on the Java Virtual Machine starts up slowly.

When we do ahead-of-time compilation we don't need to do a lot of these things, because we already did the compilation to machine code during the ahead-of-time compilation step, and when you start the app, everything is ready. You don't need to interpret, you don't need to gather profiling feedback, nothing of that sort. You immediately start with the best machine code, and if you have snapshots and configuration baked into the image, you might even avoid loading configuration files that you would otherwise load at runtime. Startup time is probably the area where ahead-of-time compilation beats JIT compilation by the largest margin.

We can see that in the numbers: we have measurements of a couple of popular web frameworks, where we compared the GraalVM JIT configuration against the GraalVM AOT configuration in startup time. Startup time here is effectively from the start of the application until the first request can be served by that server. The gains here are huge, about 50x on average, and that's not surprising, because I just showed you how much is going on in the one case versus how little is going on in the other. This opens a new way you can run a Java-based web server, and it changes the equation a little bit: "Well, if I can start up in 16 milliseconds, I don't need to keep the process around all the time." When the web application is idle, you just shut down the process. You can almost start a new process per request if you want, with this kind of speed.
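
If you want to see what "time until the first request can be served" looks like for your own application, here is a toy sketch built only on the JDK's built-in HttpServer; the class name, port, and endpoint are made up, and it simply prints the runtime's reported uptime when it starts listening and when it answers its first request.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.lang.management.ManagementFactory;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.atomic.AtomicBoolean;

// Toy measurement of "startup until first request served" (names are illustrative).
public class StartupTimer {
    public static void main(String[] args) throws Exception {
        AtomicBoolean firstRequest = new AtomicBoolean(true);
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/", exchange -> {
            if (firstRequest.compareAndSet(true, false)) {
                System.out.println("First request served after "
                        + ManagementFactory.getRuntimeMXBean().getUptime() + " ms");
            }
            byte[] body = "ok".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
        System.out.println("Listening after "
                + ManagementFactory.getRuntimeMXBean().getUptime() + " ms");
    }
}
```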

Whether the compilation time is relevant for your application or not, you can find out with the profiling tool called Java Flight Recorder. Who of you has been using Java Flight Recorder? About a dozen. Who of you has ever looked at the compilation times with Java Flight Recorder? One. I also didn't know how awesome this feature of Java Flight Recorder is, but I want to present it to you here.

In the Java Flight Recorder view, when you go to the Java application, you look at how the threads are doing, and you can actually select the compiler threads. In the Graal case these are called JVMCI native threads; on a normal OpenJDK they're called C2 compiler threads. When you select one of these threads, you will see a little diagram, and yellow here means that you have a compilation running, that the compiler is doing something.

This is an interesting workload: this is compiling Apache Spark by running the Scala compiler on the JVM. It runs for about three minutes, and what we can see is that we have three compiler threads here that are all busy all the time. After these three minutes, I'm done compiling my Apache Spark, and at the same time I've produced tons of machine code that is useless now, because I'm done after three minutes. This is a typical workload where the compilation speed of the JIT compiler is very important. You can figure out whether the compilation speed of your JIT compiler matters for you by selecting the compiler threads right here and seeing whether they are green, which means they're not doing anything, or yellow, which means they are compiling.
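
If you don't have a recording yet, one way to get one is to start Flight Recorder programmatically around a compilation-heavy piece of work and open the resulting file in JDK Mission Control, then select the compiler threads as described above. This is only a sketch: the class, file name, and workload below are examples, not the method used for the measurements above.

```java
import java.nio.file.Path;
import java.time.Duration;
import jdk.jfr.Configuration;
import jdk.jfr.Recording;

// Illustrative sketch: record a compilation-heavy workload with Flight Recorder,
// then open compile-activity.jfr in JDK Mission Control and inspect the compiler
// threads to see how busy they are. Names and durations are examples.
public class CompileTimeRecording {
    public static void main(String[] args) throws Exception {
        Configuration defaults = Configuration.getConfiguration("default");
        try (Recording recording = new Recording(defaults)) {
            recording.setMaxAge(Duration.ofMinutes(5));
            recording.start();

            runWorkload();   // stand-in for the real application's work

            recording.stop();
            recording.dump(Path.of("compile-activity.jfr"));
        }
    }

    private static void runWorkload() {
        long total = 0;
        for (int round = 0; round < 5_000; round++) {
            for (int i = 0; i < 100_000; i++) {
                total += (long) i * round % 13;
            }
        }
        System.out.println(total);
    }
}
```

Recordings can also be started from the command line with the standard -XX:StartFlightRecording option.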

Memory Management at Image Run Time

A native image, when executed, does not run on the Java HotSpot VM but on the runtime system provided with GraalVM. That runtime includes all necessary components, and one of them is memory management. Java objects that a native image allocates at run time reside in an area called the Java heap. The Java heap is created when the native image starts up, and may grow or shrink in size while the native image runs. When the heap becomes full, a garbage collection is triggered to reclaim the memory of objects that are no longer used.
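
A small way to observe that heap from inside the program: the sketch below simply reports the sizes the runtime sees through the standard Runtime API, which works both on HotSpot and inside a native image (the numbers reported will of course differ). The class name is illustrative.

```java
// Prints the heap sizes the runtime currently reports; the same code runs on
// HotSpot and in a native executable, although the values will differ.
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        System.out.println("max heap:   " + rt.maxMemory() / mb + " MB");
        System.out.println("total heap: " + rt.totalMemory() / mb + " MB");
        System.out.println("free heap:  " + rt.freeMemory() / mb + " MB");
    }
}
```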
