Vijay Vimalananda Guru

All about Android Runtime

In this post, we are going to learn about Dalvik and Android Runtime.

Most Android developers have heard the terms Dalvik, ART, JIT and AOT. If you have ever wondered what these are, this post will help you.

Introduction:

Whenever an APK is generated, part of it consists of .dex files. These files hold the compiled code of the app as well as of the libraries it uses, written in a low-level format designed for a software interpreter, known as bytecode. When we run the app, the bytecode in the .dex files is translated by the Android Runtime into machine code: instructions that the CPU can understand and execute directly.
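
To make the term "bytecode" a bit more concrete, here is a small sketch: a trivial Kotlin function and, in the comments, roughly what its Dalvik bytecode looks like in smali-style notation. The exact instructions depend on the compiler and its version, so treat the commented output as illustrative rather than exact.

```kotlin
// A trivial function as it appears in app source code.
fun add(a: Int, b: Int): Int = a + b

// After compilation into a .dex file, the method body becomes register-based
// Dalvik bytecode, roughly like this (smali-style notation, illustrative only):
//
//   .method public static add(II)I
//       .registers 3
//       add-int v0, p0, p1    # add the two int parameters
//       return v0             # return the result
//   .end method
//
// At install time or at runtime, ART translates these instructions into
// native machine code for the device's CPU.
```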

The compilation of bytecode into machine code can be done in various ways. To understand why the Android Runtime came into being, we have to go back in time and learn about Dalvik.

What is the Dalvik Virtual Machine (DVM)?

When Android phones first came to market, they were not as capable as they are now. It was not just that they had little RAM; they also used a runtime suited to those constraints, called Dalvik. Dalvik was implemented to optimize for one parameter in particular: RAM usage. So, instead of compiling the whole app to machine code, it used a strategy called Just-In-Time (JIT) compilation.

With JIT, the runtime works much like an interpreter: it compiles small chunks of code during the execution of the app. This helped Dalvik save a lot of RAM. But the strategy has a serious drawback: since all of this happens at runtime, it hurts runtime performance. Dalvik was gradually optimized and enhanced so that frequently run code was cached and not recompiled unnecessarily, but the limited RAM still put a ceiling on how far that could go.
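
The following Kotlin sketch is only a conceptual model of that idea, not Dalvik's or ART's actual implementation: every method starts out interpreted, an invocation counter tracks how "hot" it is, and only methods that cross a (hypothetical) threshold get compiled and cached.

```kotlin
// Conceptual model of JIT compilation (illustrative only, not real runtime code).
class JitRuntime(private val hotThreshold: Int = 10) {
    private val invocationCounts = mutableMapOf<String, Int>()
    private val compiledCache = mutableMapOf<String, () -> Unit>()

    fun invoke(methodName: String, bytecodeBody: () -> Unit) {
        // If this method was already compiled, run the cached "native" version.
        compiledCache[methodName]?.let { compiled ->
            compiled()
            return
        }

        // Otherwise interpret the bytecode: slower, but costs no extra memory.
        bytecodeBody()

        // Count invocations and compile the method once it becomes "hot".
        val count = (invocationCounts[methodName] ?: 0) + 1
        invocationCounts[methodName] = count
        if (count >= hotThreshold) {
            compiledCache[methodName] = compileToNative(methodName, bytecodeBody)
        }
    }

    private fun compileToNative(name: String, body: () -> Unit): () -> Unit {
        // Stand-in for real machine-code generation.
        println("JIT-compiling $name")
        return body
    }
}
```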

Over time, apps also grew bigger, which hurt performance and caused problems for Dalvik. So a new runtime, the Android Runtime (ART), was introduced in Android Lollipop.

What is the Android Runtime (ART)?

The way ART worked in Android Lollipop was completely different from what we saw in Dalvik. In place of Dalvik's JIT, ART (in Lollipop and Marshmallow) used a strategy called Ahead-of-Time (AOT) compilation. With this, instead of interpreting code at runtime, the code was compiled into machine code before the app ever ran. This approach greatly improved runtime performance and is more efficient than JIT at execution time. Unfortunately, it also came with drawbacks: it consumed a lot more storage than Dalvik, installing an app took longer because the whole app had to be compiled to machine code, and system updates took longer because every app had to be re-optimized. To overcome these downsides, JIT compilation was brought back into ART along with a feature called profile-guided compilation.
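
To contrast with the JIT sketch above, here is an equally rough Kotlin model of AOT compilation (again illustrative only, not ART's real dex2oat pipeline, and the method names are made up): every method is compiled once, at install time, before anything runs.

```kotlin
// Conceptual model of AOT compilation (illustrative only).
class AotCompiler {
    // At install time, every method in the .dex files is compiled up front.
    fun compileAtInstall(allMethodsInDex: List<String>): Map<String, String> =
        allMethodsInDex.associateWith { method ->
            // Stand-in for generating native machine code for one method.
            "native code for $method"
        }
}

fun main() {
    val compiled = AotCompiler()
        .compileAtInstall(listOf("MainActivity.onCreate", "ListAdapter.onBindViewHolder"))

    // Everything is machine code before the first launch: fast at runtime,
    // but installation takes longer and the compiled output occupies storage.
    println(compiled.keys)
}
```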

What is Profile-Guided Compilation?

Generally, most parts of an app are rarely used, so precompiling the whole app was an overhead. That's why, in Android Nougat, JIT was brought back into ART and combined with another strategy called profile-guided compilation.

Profile-guided compilation continuously improves the performance of an app as it runs. While the app runs, ART detects the methods that are used frequently and records them in a profile. These hot methods are precompiled and cached, giving them the best possible performance, while the other parts of the app stay uncompiled until they are actually used. This strategy provides excellent performance for the key parts of the app while reducing RAM usage.
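
Here is a rough Kotlin sketch of that idea. The threshold, the profile shape and the function names are all made up for illustration; ART's real profiles and background compiler work differently in detail. Running code records which methods are hot, and a background step later precompiles only those.

```kotlin
// Conceptual model of profile-guided compilation (illustrative only).
class MethodProfile(private val hotThreshold: Int = 10) {
    private val counts = mutableMapOf<String, Int>()

    // Called by the runtime each time a method executes under the interpreter/JIT.
    fun record(method: String) {
        counts[method] = (counts[method] ?: 0) + 1
    }

    // Methods that ran often enough to be worth precompiling.
    fun hotMethods(): Set<String> =
        counts.filterValues { it >= hotThreshold }.keys
}

// Runs later, while the device is idle and charging: only the hot methods are
// compiled ahead of time; everything else stays as bytecode until it is used.
fun backgroundCompile(profile: MethodProfile, compileToNative: (String) -> Unit) {
    profile.hotMethods().forEach(compileToNative)
}
```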

After this change, ART no longer slows down app installation or system updates. The best part is that the precompilation happens only while the device is idle or charging. The weak part of this approach is that the user has to actually use the app before there is any profile data and before the frequently used classes and methods can be precompiled. That means the first few runs of the app can be fairly slow, because JIT compilation is used until then. To improve this initial user experience, ART was upgraded in Android Pie with a new feature called Profiles in the Cloud.

What is Profiles in the Cloud?

As we all know, most users use an app in much the same way. So, in order to speed up performance right after installation, profile data from users who have already used the app is collected. This collected data is aggregated into a file called a common core profile. When a new user installs the app, this file is downloaded along with the application. With the help of this file, ART knows which classes and methods are used most and can precompile them straight away. This approach gives new users good startup and steady-state performance from the moment they install the app. Later, user-specific profile data is collected, and the code is recompiled for that particular user while the device is idle or charging. All of this is handled behind the scenes by ART, so we don't need to do anything manually to enable it.
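
As a mental model, the aggregation step might look something like the Kotlin sketch below. It is purely illustrative: the threshold and the function are made up, and the real aggregation happens in Google Play's infrastructure. Hot-method sets from existing users are merged, and methods that are hot for enough of them end up in the common core profile that ships to new installs.

```kotlin
// Conceptual model of building a "common core profile" from many users'
// profiles (illustrative only; the real work happens server-side in Google Play).
fun buildCommonCoreProfile(
    userProfiles: List<Set<String>>,   // each existing user's set of hot methods
    minShareOfUsers: Double = 0.5      // hypothetical threshold
): Set<String> {
    val userCount = userProfiles.size
    val methodFrequency = mutableMapOf<String, Int>()

    for (profile in userProfiles) {
        for (method in profile) {
            methodFrequency[method] = (methodFrequency[method] ?: 0) + 1
        }
    }

    // Keep the methods that are hot for at least the chosen share of users;
    // this set ships with new installs so ART can precompile it right away.
    return methodFrequency
        .filterValues { it.toDouble() / userCount >= minShareOfUsers }
        .keys
}
```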

Short Summary:

The Dalvik VM used JIT compilation up to KitKat. Because of its downsides, the Android Runtime (ART) was introduced in Lollipop to overcome the issues Dalvik had. At the beginning, ART used a technique called Ahead-of-Time (AOT) compilation. That also had drawbacks, such as higher storage usage and longer installation time, so ART was enhanced with a strategy called profile-guided compilation. Later, it was upgraded with Profiles in the Cloud.

Happy learning!!
