
OpenJS in Action: There’s Open Source in Your Credit Card with Neo Financial


We recently met with Ian Sutherland, engineering lead and Head of Developer Experience at Canadian fintech startup Neo Financial. Ian has been with Neo Financial from the very beginning and has seen the engineering team grow from one employee to over 150 people in the last three years. Ian is also a Collaborator on the Node.js project hosted at the OpenJS Foundation. Watch the full interview:

https://youtu.be/pCfM4_jxH0E

What is Neo Financial?

Neo is a financial technology company that is reimagining how Canadians bank. Their first product was a rewards credit card; they later introduced a top-rated high-interest savings account and recently launched Neo Invest, the first fully digital, actively managed investment experience. With Neo Invest, your portfolio is actively managed by experts and engages a greater range of asset classes and investment strategies than most competitor portfolios.

Developed with JavaScript First

The Neo Financial web banking portal provides seamless mobile-first interactions for users. The backend of the portal is built entirely using JavaScript and Node.js and powers all of the app’s microservices and transaction processing. Ian and the engineering team decided early on that they would use JavaScript for everything they possibly could in developing their product. Ian shared his opinion that “Node.js is the technology of choice for running JavaScript on the server, so from day one, a decision was made that the team would use Node.js.”

Other factors influenced their decision to work with JavaScript and related technologies like Node.js. JavaScript is currently the most widely used programming language, which they felt would make it easier to scale their team of developers quickly. The language also, put simply, works well for them. Their team finds it easy to containerize apps using Node.js. It’s also easy to build serverless functions written in JavaScript running on Node.js, with no compilation step required. It’s fair to say that Node.js provides excellent performance and scalability, keeping their team’s infrastructure costs low. 

Working with Other Open Source Technologies

Using Node.js and JavaScript for local development has worked well for the Neo Financial dev team. Ian shared that they have a swift development experience where a person can change some code and have the project reload instantly. He also cited the npm ecosystem and the “millions” of packages out there as a benefit, helping his team work very productively. They also use TypeScript and Fastify in all of their services, and webpack indirectly through other frameworks.

Open source packages and plugins help speed up their team’s development work. Chances are, someone has already dealt with a similar problem. These packages make it easier to solve whatever the needs are without reinventing the wheel.

A Note on Security

As a financial company, security is top of mind for Ian and his team. The Node.js project has also been focusing more on the security of Node.js itself. As a result, the devs at Neo feel very comfortable running it in production.

Contributing to Open Source

On a personal level, Ian has been involved in open source for several years. He started by making smaller contributions and later got involved in the React community. He eventually became a core maintainer of Create React App and has been working on that project for the last three or four years. Then, about four years ago, he got involved in Node.js itself, primarily as part of a working group called the Tooling Group. The focus of this group is on making Node.js the best tool it can be for building things like CLI tools, or other tools that might run in a CI or build environment, lambdas, etc.

As a team, the Neo Financial engineers try to do their part. They’ve open sourced developer tools and GitHub actions and try their best to give back wherever they possibly can. In a big thank you to the open source community, Ian said, “We have an awesome community. People are doing development on open source projects for free as volunteers when they contribute lines of code and fixes and documentation.”

We at the OpenJS Foundation feel the same way. We wouldn’t be anywhere without our contributors and our fantastic community. It was a pleasure speaking with Ian, and we’re grateful for his input as an individual and an engineering team leader. 

OpenJS In Action: Betting Your Product’s Developer Experience on Node.js and Open Source


OpenJS recently spoke with Yavor Georgiev, Co-Founder and Head of Product at Fusebit, to learn more about how his product leverages Node.js and other benefits of the open source ecosystem. Fusebit prides itself on being a “developer-first” product that takes the pain out of implementing SaaS integrations. Yavor and one of his co-founders at Fusebit had previously worked at Microsoft, specifically on bringing support for Node.js to the Azure Cloud.

We learned that the Fusebit product team strongly believes in and supports the Node.js ecosystem. The entire Fusebit service is based on a “JavaScript developer experience with Node.js and npm”, which delivers a best-in-class experience for their customers.

Programming Model Based on Node.js

The Fusebit service exposes a programming model based on Node.js, allowing any developer to create an integration. That’s key for a couple of reasons. First, since there’s already a massive community of developers familiar with Node.js and JavaScript, developers don’t have to learn anything new and can use their existing processes and DevOps techniques. Another key benefit to having their model based on Node.js is that due to the size of the npm ecosystem, there’s a module for virtually everything. One of the benefits of open source is that developers don’t have to write and implement everything from scratch. In this case, they can grab a module from npm and speed up their productivity. 

Security

We touched on the issue of security. Two of the co-founders of Fusebit were previously employees at Microsoft and later at Auth0, an identity and access management platform on which Fusebit’s security is based. Were it not for Node.js and companies like Auth0 being invested in securing the open source ecosystem, the Fusebit product itself wouldn’t be where it is today. They also leverage modules from npm where developers constantly update code and patch vulnerabilities.

Stripe for SaaS Integrations

The Fusebit service is like “Stripe for SaaS integrations.” So if you’re a developer working on a SaaS application and you need integrations to third-party SaaS products like Slack or JIRA, Fusebit provides the integrations in a turnkey way. Based on Node.js, there’s an infinite ability to customize solutions. As a result, Fusebit achieves great problem-solution fits for their customers, unlike some low-code and no-code solutions. Another reason their product is focused on a developer audience is that data fidelity is essential when connecting business software to something like Salesforce or other SaaS products. Someone has to have the right technical mindset to create that type of integration.

Open Source Contributions

The Fusebit team is also a proud contributor to open source development. Most of their source code is available on GitHub, so customers can go in and fork features, SaaS connectors, etc., and make them their own. 

We talked about everynode, a new project that Fusebit recently contributed to the open source ecosystem, which lets developers run any version of Node.js on AWS Lambda, including the most recent builds, since Lambda sometimes doesn’t have the latest versions available. The Fusebit team initially built it internally for integrations that required newer versions of Node.js and needed to run on AWS.

The Fusebit team routinely takes pieces of the Fusebit service and makes them available to the public, whether it’s npm packages, repositories, or other content. The team also contributes by filing issues and contributing fixes to OSS projects and Node.js itself when needed. On making parts of their code public, Yavor commented, “You know, selfishly, it’s actually better for more developers to be familiar with it instead of keeping it secret. The more people are familiar with aspects of Fusebit that we’ve made open source, the better for us.”

JavaScript FTW!

With so many other programming languages out there, Yavor believes JavaScript is still in the lead for many reasons. It’s amazingly versatile, giving devs the ability to build end-to-end solutions. The language itself continues to evolve, and there are some remarkable initiatives around the standardization of the module spec, for example. Now you can write a module and use it pretty much anywhere JavaScript runs, whether it’s Node.js or in a browser. This continuous innovation supports the JavaScript language and the community and encourages people to continue learning JavaScript. 

Fusebit thanks the Node.js community and everybody who’s contributing unpaid hours to make Node.js and the package ecosystem great. According to Yavor, the Node.js community has been a tremendous help to their product. They also give back to our community by hiring folks with Node.js in their skillset. 

We at the OpenJS Foundation appreciate Yavor sharing his thoughts and experience.

Watch the Interview

OpenJS In Action: How Wix Applied Multi-threading to Node.js and Cut Thousands of SSR Pods and Money

Author: Guy Treger, Sr. Software Engineer, Viewer-Server team, Wix

Background:

At Wix, as part of our sites’ Server-Side-Rendering architecture, we build and maintain the heavily used Server-Side-Rendering-Execution platform (aka SSRE).

SSRE is a Node.js-based multipurpose code execution platform used for executing React.js code written by front-end developers all across the company. Most often, these pieces of JS code perform CPU-intensive operations that simulate activities related to rendering sites in the browser.

Pain: 

SSRE has reached a total traffic of about 1 million RPM, at times requiring far more production Kubernetes pods than we considered acceptable to serve it properly.

This made us face an inherently painful mismatch:
On one side, the nature of Node.js – an environment best suited to running I/O-bound operations on its single-threaded event loop. On the other, the extremely high traffic of CPU-oriented tasks that we had to handle as part of rendering sites on the server.

The naive solution we started with clearly proved inefficient, causing ever-growing pains in Wix’s server and infrastructure, such as having to manage tens of thousands of production Kubernetes pods.

Solution: 

We had to change something. The way things were, all of our heavy CPU work was done by a single thread in Node.js. The straightforward idea that comes to mind is: offload the work to other compute units (processes/threads) that can run in parallel on hardware with multiple CPU cores.

Node.js already offers multi-processing capabilities, but for our needs this was overkill. We needed a lighter solution with less overhead, both in the resources required and in overall maintenance and orchestration.

Node.js has only recently introduced what it calls worker threads. This feature became stable in the v14 (LTS) release line, released in October 2020.

From the Node.js Worker-Threads documentation:

The worker_threads module enables the use of threads that execute JavaScript in parallel. To access it:

const worker = require('worker_threads');

Workers (threads) are useful for performing CPU-intensive JavaScript operations. They do not help much with I/O-intensive work. The Node.js built-in asynchronous I/O operations are more efficient than Workers can be.

Unlike child_process or cluster, worker_threads can share memory.

So Node.js offers native support for threading that we could use, but since it’s still fairly new, it lacks some maturity and isn’t entirely smooth to use in the production-grade code of a critical application.

What we were mainly missing was:

  1. Task-pool capabilities
    What does Node.js offer?
    One can spawn some Worker threads manually and manage their lifecycle, e.g.:
const { Worker } = require("worker_threads");

// Create a new worker
const worker = new Worker("./worker.js", { workerData: { /* … */ } });

worker.on("exit", (exitCode) => {
  console.log(exitCode);
});


We were reluctant to spawn our Workers manually, ensure there are enough of them at every given time, re-create them when they die, implement various timeouts around their usage, and build the other facilities generally available to a multithreaded application.

  2. RPC-like inter-thread communication
    What does Node.js offer?
    Out of the box, threads (e.g. the main thread and its spawned workers) can communicate using an async messaging technique:
// Send a message to the worker
aWorker.postMessage({ someData: data });

// Listen for a message from the worker
aWorker.once("message", (response) => {
  console.log(response);
});

Dealing with messaging can really make the code much harder to read and maintain. We were looking for something friendlier, where one thread could asynchronously “call a method” on another thread and just receive back the result.

We went on to explore and test various open source packages for thread management and communication.

Along the way we found some packages that solve both the thread-pool problem and elegant RPC-like task execution on threads. A popular example was piscina. It looked all nice and dandy, but there was one showstopper.


A critical requirement for us was to have a way for our worker threads to call some APIs exposed back on the main thread. One major use-case for that was reporting business metrics and logs from code running in the worker. Due to the way these things are widely done in Wix, we couldn’t directly do them from each of the workers, and had to go through the main thread.

So we dropped these good-looking packages and looked for different approaches. We realized that we couldn’t just take something off the shelf and plug it in.

Finally, we reached a setup we were happy with.

We mixed and wired the native Workers API together with two great open source packages:

  • generic-pool (npmjs) – a very solid and popular pooling API. This one helped us get our desired thread-pool feel.
  • comlink (npmjs) – a popular package mostly known for RPC-like communication in the browser (with the long-existing JS Web Workers). It recently added support for Node.js Workers. This package made inter-thread communication in our code look much more concise and elegant.

The result looks along the lines of the following:

import { Worker } from 'worker_threads';
import * as genericPool from 'generic-pool';
import * as Comlink from 'comlink';
import nodeEndpoint from 'comlink/dist/umd/node-adapter';

export const createThreadPool = ({
    workerPath,
    workerOptions,
    poolOptions,
}): genericPool.Pool<OurWorkerThread> => {
    return genericPool.createPool(
        {
            create: () => {
                const worker = new Worker(workerPath, workerOptions);
                // Expose main-thread APIs (e.g. metrics, logging) to the worker
                Comlink.expose({
                    ...diagnosticApis,
                }, nodeEndpoint(worker));

                return {
                    worker,
                    workerApi: Comlink.wrap<ModuleExecutionWorkerApi>(nodeEndpoint(worker)),
                };
            },
            destroy: ({ worker }: OurWorkerThread) => worker.terminate(),
        },
        poolOptions
    );
};


And the usage at the web-server level:

const workerResponse = await workerPool.use(({ workerApi }: OurWorkerThread) =>
    workerApi.executeModule({
        ...executionParamsFrom(incomingRequest)
    })
);

// … Do stuff with response

One major takeaway from the development journey was that imposing worker-threads on some existing code is by no means straightforward. Logic objects (i.e. JS functions) cannot be passed back and forth between threads, and so, sometimes considerable refactoring is needed. Clear concrete pure-data-based APIs for communication between the main thread and the workers must be defined and the code should be adjusted accordingly.

Results:

The results were amazing: we cut the number of pods by over 70% and made the entire system more stable and resilient. A direct consequence was cutting much of Wix’s infra costs accordingly.

Some numbers:

  • Our initial goal was achieved – total SSRE pod count dropped by ~70%.
    Correspondingly, RPM per pod improved by 153%.
  • Better SLA and a more stable application – 
    • Response time p50 : -11% 
    • Response time p95: -20%
    • Error rate decreased even further: 10x better
  • Big cut in total direct SSRE compute cost: -21%

Lessons learned and future planning:

We’ve learned that Node.js is indeed also suitable for CPU-bound, high-throughput services. We managed to reduce our infra management overhead, but this modest goal turned out to be overshadowed by much more substantial gains.

The introduction of multithreading into the SSRE platform has opened the way for follow-up research of optimizations and improvements:

  • Finding the optimal number of CPU cores per machine, possibly allowing for non-constant-size thread-pools.
  • Refactoring the application to make workers do work that’s as purely CPU-bound as possible.
  • Researching memory sharing between threads to avoid a lot of large-object cloning.
  • Applying this solution to other major Node.js-based applications in Wix.

Dressed to Impress: NET-A-PORTER, Mr Porter and JavaScript Frameworks


For this OpenJS In Action, Robin Glen, Principal Developer for YNAP, joined OpenJS Foundation Executive Director Robin Ginn to discuss their use of JavaScript in building a global brand. YNAP is the parent company of luxury retailer NET-A-PORTER. Glen works within the Luxury division team at NET-A-PORTER (NAP), working on NET-A-PORTER and Mr Porter. He has been with NAP for over 10 years. In addition to his work with NAP, Glen is also a member of the Chrome Advisory Board.

Glen has been leading the developer team at YNAP for almost a decade, and continues to test, iterate and implement cutting edge open source technologies. For example, he was an early adopter of the Fastify web framework for Node.js to help increase web performance, particularly with the demand spikes his company experiences during holidays and sales.

Topics ranged from ways to make the user experience feel more pleasant and secure, to issues around JavaScript bloat. Questions focused on the history of NAP, how NAP chose their current framework, and how that framework allows them to best serve customers on their e-commerce site.

The full interview is available here: OpenJS In Action: NET-A-PORTER, Mr Porter and JavaScript Frameworks 

Timestamps

0:00 Brief Introduction

2:09 Technology and NET-A-PORTER 

3:11 Defining Architectural Moment

4:50 Where YNAP is Today

6:50 Factors in Choosing Technologies? 

10:30 Fastify

14:00 YNAP and JS Foundation

15:10 Looking Forward: Engineering Roadmap  

18:58 What’s a “Good Day At Work” for you?

20:00 Wrap-Up

OpenJS In Action: ESRI powering COVID-19 response with open source


The OpenJS In Action series features companies that use OpenJS Foundation projects to develop efficient, effective web technologies. 

Esri, a geographic information systems company, is using predictive models and interactive maps with JavaScript technologies to help the world better understand and respond to the recent COVID-19 pandemic. Recently, they have built tools that visualize how social distancing precautions can help reduce cases and the burden on healthcare systems. They have also helped institutions like Johns Hopkins create their own informational maps by providing a template app and resources to extend functionality. 

Esri uses OpenJS Foundation projects such as Dojo Toolkit, Grunt, ESLint and Intern to increase developer productivity and deliver high-quality applications that help the world fight back against the pandemic. 

Esri’s contributions to the COVID response effort and an explanation of how they created the underlying technologies are available at this video: 

https://youtu.be/KLnht-1F3Ao

Robin Ginn, Executive Director of the OpenJS Foundation, spoke with Kristian Ekenes, Product Engineer at Esri, to highlight the work his company has been doing. Esri normally creates mapping software, databases and tools to help businesses manage spatial data. However, Ekenes started work on a tool called Capacity Analysis when the COVID-19 pandemic began to spread. 

Capacity Analysis is a configurable app that allows organizations to display and interact with results from two scenarios predicting a hospital’s ability to meet the demand of COVID-19 patients given configurable parameters, such as the percentage of people following social distancing guidelines. Health experts can create two hypothetical scenarios using one of two models: Penn Medicine’s COVID-19 Hospital Impact Model for Epidemics (CHIME) or the CDC’s COVID-19Surge model. Then they can deploy their own version of Capacity Analysis to view how demand for hospital beds, ICU beds, and ventilators varies by time and geography in each scenario. This tool is used by governments worldwide to better predict how the pandemic will challenge specific areas.

During the interview, Ekenes spoke on the challenges that come with taking on ambitious projects like Capacity Analysis. Esri has both a large developer team and a diverse ecosystem of applications. This makes it difficult to maintain consistency in the API and SDKs deployed across desktop and mobile platforms. To overcome these challenges, Esri utilizes several OpenJS Foundation projects including Dojo Toolkit, Grunt, ESLint and Intern.

Ekenes explained that Grunt and ESLint increase developer productivity by providing real-time feedback when writing code. The linter also standardizes work across developers by indicating when incorrect practices are being used. This reduces the number of pull requests between collaborators and saves time for the entire team. Intern allows developers to write testing modules and create high-quality apps by catching bugs early. In short, Esri helps ensure consistent and thoroughly tested applications by incorporating OpenJS Foundation projects into their work. 

Expedia Group: Building better testing pipelines with open source


The OpenJS In Action series features companies that use OpenJS Foundation projects to help develop efficient, effective web technologies. 

Software developers at global travel company Expedia Group are using JavaScript, ESLint and robust testing pipelines to reduce inconsistency and duplication in their code. Switching from Java and JSP to Node.js has streamlined development and design systems. Beyond that, Expedia developers are looking into creating a library of reusable design and data components for use across their many brands and pages. 

Robin Ginn, executive director of the OpenJS Foundation, interviewed Tiffany Le-Nguyen, Software Development Engineer at Expedia Group.

Expedia is an example of how adoption of new technologies and techniques can improve customer and developer experiences. 

A video featuring Expedia is available here: https://youtu.be/FDF6SgtEvYY

Le-Nguyen explained how accessibility and performance concerns led developers to modernize Expedia’s infrastructure. One of the choices they made was to integrate ESLint into their testing pipeline to catch bugs and format input before content was pushed live. ESLint also proved to be a huge time-saver — it enforced development standards and warned developers when incorrect practices were being used.
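A lint gate of this kind can be surprisingly small. As a rough illustration only (this is not Expedia’s actual configuration), a minimal .eslintrc.js that fails the pipeline on unused variables and unsafe equality might look like:

```javascript
// Minimal illustrative ESLint configuration: rules set to 'error'
// make the lint step exit non-zero, failing the CI pipeline.
module.exports = {
  env: { node: true, es2021: true },
  parserOptions: { ecmaVersion: 2021, sourceType: 'module' },
  rules: {
    'no-unused-vars': 'error',
    eqeqeq: 'error',
  },
};
```

Running the linter as an early pipeline stage gives the real-time-feedback effect described above: problems are reported before review, not during it.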

ESLint was especially useful for guiding new developers through JavaScript, Node.js and TypeScript. Expedia made the bold move to switch most of their applications from Java and JSP to Node.js and TypeScript. Le-Nguyen is now able to catch most errors and quickly push out new features by combining Node.js with Express and a robust testing pipeline. 

Expedia is used globally to book properties and dates for trips. Users reserve properties with different currencies across different time zones. This makes it difficult to track when a property was reserved and whether the correct amount was paid. Luckily, Expedia was able to utilize Globalize, an OpenJS project that provides number formatting and parsing, date and time formatting and currency formatting for languages across the world. Le-Nguyen was able to simplify currency tracking across continents by integrating Globalize into the project.

To end the talk, Le-Nguyen suggested that web developers should take another look into UI testing. Modern testing tools have simplified the previously clunky process. Proper implementation of a good testing pipeline improves the developer experience and leads to a better end product for the user.