AWS’s New JavaScript Runtime: LLRT

Introduction

To start with, LLRT (Low Latency Runtime) is a lightweight JavaScript runtime created to meet the increasing demand for fast, efficient serverless applications. Compared to other JavaScript runtimes running on AWS Lambda, it offers over 10x faster startup and up to 2x lower overall cost.

LLRT is built in Rust and uses QuickJS as its JavaScript engine, which ensures quick startup and efficient memory usage.

Low Latency Runtime (LLRT) is an experimental package: it is subject to constant change and is intended for evaluation purposes only.

As you dive deeper into today’s blog post, you will see how fast and effective LLRT is as a JavaScript runtime, how it only partially supports certain Node.js APIs and modules, and why it still has a long way to go before it can replace Node.js for JavaScript developers.

LLRT: Test and Compatibility

If you want to ensure that your code is compatible with LLRT, write tests and execute them through the built-in test runner. Here is how.

The built-in test runner uses a lightweight, Jest-like API and relies on Node.js’s assert module for the actual assertions. Running the llrt test command makes LLRT scan the current directory and its subdirectories for files ending in .test.mjs or .test.js.

Further, you can point the runner at a particular test directory with the -d option, as in llrt test -d <directory>, and it will scan that directory and its subdirectories instead.

Plus, the test runner supports filtering, which is as simple as appending additional command-line arguments. For example, llrt test crypto runs only the tests whose filenames contain the word crypto.

Compatibility Matrix: A Quick Synopsis

Low Latency Runtime supports only a subset of the Node.js APIs and modules, so it cannot be considered a drop-in substitute for Node.js, even in the long run. The full compatibility matrix is available at the link below.

https://github.com/awslabs/llrt/blob/main/API.md

Usage of AWS SDK (v3) with LLRT

Low Latency Runtime ships many AWS SDK clients and utilities as part of the runtime, built directly into the executable. These bundled SDK clients have been modified to offer better performance without compromising on compatibility.

Recently, LLRT has also started replacing some of the JavaScript dependencies used by the native SDK clients; the most notable examples include XML parsing and hash calculations.

Apart from the bundled V3 SDK packages, any external packages your code depends on must be bundled with your source code.

Note that LLRT currently does not support returning streams from SDK responses. Instead, use response.Body.transformToByteArray(); or response.Body.transformToString();, as shown below.

const response = await client.send(command);
// or 'transformToByteArray()'
const str = await response.Body.transformToString();
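To show the pattern stand-alone, here is a hedged sketch in which the SDK response is stubbed; in real code, response would come from await client.send(command) (for example, an S3 GetObjectCommand), and Body would expose these same transform methods.

```javascript
// Stand-alone sketch of the transform pattern. The response object is
// stubbed here; in real code it comes from `await client.send(command)`.
const response = {
  Body: {
    // LLRT SDK responses expose these async transforms instead of a stream.
    transformToString: async () => "hello from the bucket",
    transformToByteArray: async () =>
      new TextEncoder().encode("hello from the bucket"),
  },
};

response.Body.transformToString().then((str) => {
  console.log(str); // the object body as a UTF-8 string
});

response.Body.transformToByteArray().then((bytes) => {
  console.log(bytes.length); // the body as a Uint8Array
});
```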

Usage of node_modules (dependencies) with LLRT

Low Latency Runtime (LLRT) is designed for performance-critical applications, so deploying node_modules as-is is not recommended; dependencies should always be bundled, minified, and tree-shaken first.

LLRT works with almost any bundler you select. Consider these configurations for a few popular ones.

ESBuild

esbuild index.js --platform=node --target=es2020 --format=esm --bundle --minify --external:@aws-sdk --external:@smithy --external:uuid

Webpack

import path from 'node:path';
import TerserPlugin from 'terser-webpack-plugin';
import nodeExternals from 'webpack-node-externals';

export default {
  entry: './index.js',
  output: {
    path: path.resolve('dist'), // webpack requires an absolute output path
    filename: 'bundle.js',
    libraryTarget: 'module',
  },
  experiments: {
    outputModule: true, // required for libraryTarget: 'module'
  },
  target: 'web',
  mode: 'production',
  resolve: {
    extensions: ['.js'],
  },
  externals: [nodeExternals(), '@aws-sdk', '@smithy', 'uuid'],
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        terserOptions: {
          ecma: 2020,
        },
      }),
    ],
  },
};


Rollup

import resolve from 'rollup-plugin-node-resolve';
import commonjs from 'rollup-plugin-commonjs';
import { terser } from 'rollup-plugin-terser';

export default {
  input: 'index.js',
  output: {
    file: 'dist/bundle.js',
    format: 'esm',
    sourcemap: true,
  },
  plugins: [resolve(), commonjs(), terser()],
  external: ['@aws-sdk', '@smithy', 'uuid'],
};

Running TypeScript with LLRT

When using TypeScript, the same approach applies to dependencies: the TypeScript must be bundled and transpiled into ES2020 JavaScript.

Important Note: LLRT does not support running TypeScript without transpilation. This is a deliberate design decision, since transpiling at runtime demands significant memory and CPU and would add considerable cost and latency to every execution. Transpile ahead of time, as part of your build, to keep that overhead out of the execution path.

Deno, Bun, Node.js and LLRT: Key Differences

Several well-optimized JavaScript runtimes already exist, including Deno, Bun, and Node.js. These are highly competent runtimes, chiefly designed for general-purpose application building.

Each of them depends on a Just-In-Time (JIT) compiler for effective code compilation and optimization. JIT compilation supports long-term performance gains, but it carries a substantial memory and computational overhead.

As a result, these runtimes are not the best fit for a serverless environment, where executions are mostly short-lived.

LLRT, on the contrary, does not incorporate a JIT compiler at all. This decision brings two notable advantages:

· LLRT conserves both CPU and memory resources by skipping the JIT overhead. Those resources go to executing the code itself, which reduces application startup time.

· A JIT compiler is a highly sophisticated technology that contributes significantly to the overall size of a runtime and adds considerable system complexity; leaving it out keeps LLRT small and simple.

In a nutshell, LLRT rethinks serverless application building with an emphasis on speed, performance, and efficiency. Deno, Bun, and Node.js are very good general-purpose runtimes, but none of them integrates the AWS SDK into the runtime itself the way LLRT does.

LLRT Limitations

Compared to JIT-enabled runtimes, LLRT shows weaker performance in many cases: tasks with huge numbers of iterations, Monte Carlo simulations, massive data processing, and so on.

So, make sure you use LLRT for smaller serverless tasks. These generally include AWS service integration, authorization and validation, real-time processing, data transformation, etc.
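As a concrete sketch of the kind of small glue task LLRT targets, here is a hypothetical Lambda-style handler that validates its input and performs a simple data transformation. The event shape and handler name are illustrative assumptions, not part of any published API.

```javascript
// Hypothetical Lambda-style handler: validate input, transform it, return JSON.
// The event shape here is an illustrative assumption.
const handler = async (event) => {
  if (typeof event.name !== "string" || event.name.length === 0) {
    return {
      statusCode: 400,
      body: JSON.stringify({ error: "'name' is required" }),
    };
  }
  return {
    statusCode: 200,
    body: JSON.stringify({ greeting: `Hello, ${event.name.trim()}!` }),
  };
};

// Example invocation:
handler({ name: "LLRT" }).then((res) => console.log(res.statusCode, res.body));
```

Tasks like this spend most of their time on I/O and simple transformations rather than hot loops, which is exactly where skipping the JIT pays off.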

Hence, Low Latency Runtime should not be used as a comprehensive runtime that serves all purposes. Rather, it is best employed alongside existing components to create powerful, robust serverless applications.

Further, the APIs that LLRT supports follow the Node.js specification, so you will need little to no code adjustment if you later move back to their Node.js alternatives.

Reasons LLRT must be your next choice

Some of the notable reasons that make LLRT a worthwhile JavaScript Runtime as compared to other options include:

LLRT points toward the future of serverless computing. It has proven to be a game-changer in the world of JavaScript runtimes, setting a benchmark for aspiring business owners, tech enthusiasts, and developers.

B2B and B2C companies can now win more satisfied customers by using LLRT smartly and effectively: the faster the apps, the better they perform, and the happier customers are. Best of all, brand storytelling becomes quicker and smoother when your applications run on LLRT.

Developers can now gain a foothold in the JavaScript runtime landscape, since building faster, more reliable, and more efficient applications is possible with LLRT. No more wrestling with cold-start delays: fast serverless launches are at your fingertips with Low Latency Runtime.

Conclusion

So, tech enthusiasts, developers, and business owners: get ready for a new experience with LLRT, which makes designing serverless applications possible in a snap.

LLRT applications not only offer users a great deal of speed and efficiency; the performance they deliver is worth acknowledging in its own right.

Thanks to AWS for bringing such a cutting-edge tool to the table. Hopefully, you too will reach for Low Latency Runtime the next time you want to build reliable, efficient, and scalable serverless applications.

Good Luck!

Frequently Asked Questions

Which is the best JavaScript Runtime?

For serverless applications specifically, LLRT compares favorably with Deno, Bun, and Node.js: it starts faster and ships AWS SDK clients as part of the runtime. Note, however, that it supports only a subset of Node.js APIs and modules, so it is not the best choice for every workload.

What is the maximum timeout in AWS?

AWS Lambda functions can run for up to 15 minutes per invocation. You are free to set the timeout value anywhere between 1 second and 15 minutes, depending on your requirements.
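For instance, the timeout can be adjusted with the AWS CLI; the function name below is a placeholder.

```shell
# Set the function timeout to the 15-minute maximum (900 seconds).
# "my-function" is a placeholder name.
aws lambda update-function-configuration \
  --function-name my-function \
  --timeout 900
```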

Why choose LLRT?

LLRT is an extremely lightweight JavaScript runtime designed for building fast, efficient serverless applications, whether you are a business owner, tech enthusiast, or developer. Benefits to expect include faster cold starts, lower cost, improved customer satisfaction, and cloud-friendly performance, to say the least.
