
Building Modern Native Add-ons for Node.js in 2020

By Blog, Node.js

This post was contributed by Chengzhong Wu (@legendecas), Gabriel Schulhof (@gabrielschulhof), Jim Schlight (@jimschlight), Kevin Eady, Michael Dawson (@mhdawson1), and Nicola Del Gobbo (@NickNaso). It originally appeared on the Node.js Project Medium Page. Node.js is an Impact Project of the OpenJS Foundation.

Introduction

N-API provides an ABI-stable API that can be used to develop native add-ons for Node.js, simplifying the task of building and supporting such add-ons across Node.js versions.


With downloads of node-addon-api surpassing 2.5 million per week, all LTS versions of Node.js supporting N-API version 3 or higher, and Node.js 15.x being released with support for N-API 7, it is a good time to take a look at the progress on simplifying native add-on development for Node.js.

When we started working on N-API back in 2016 (the original proposal dates to 12 Dec 2016), we knew it was going to be a long journey. There are many native packages in the ecosystem, and we understood the transition would take quite some time.

The good news is that we have come a long way since the initial proposal. There has been a lot of work by the Node.js collaborators and the team focused on N-API, as well as by the package authors who have moved over. In that time, N-API has become the default recommendation for how to build native add-ons.

While the basic design has remained consistent (as planned), we’ve added incremental features in each new N-API version in order to address feedback from package authors as they adopted N-API and node-addon-api.

It’s also been great to see the positive feedback from package authors along the way. For example https://twitter.com/mafintosh/status/1256180505210433541


Having said that, let’s dive into some of the new features/functions that have been added over the last few years.

New features/functions

As people have been using N-API and node-addon-api, we've been adding the key features that were needed and generally improving the add-on experience.


The changes fall into three main categories, which are covered in the sections that follow.

Multi-Threaded and Asynchronous Programming

As Node.js becomes more prominent in the computing world, the need to interact with native OS-level asynchronous activities has grown. Node.js runs JavaScript on a single main thread, and only that thread may interact with JavaScript values.

Performing computationally intensive tasks on the main thread blocks program execution, queuing events and callbacks in the event loop. As we gained experience with real-world use, and in order to preserve program integrity across multiple threads, both N-API and its wrapper node-addon-api were updated to provide several mechanisms for calling into the Node.js thread from outside the main event loop, depending on the use case (a JavaScript-side usage sketch follows this list):

  • AsyncWorker: provides a mechanism to perform a one-shot action, and notify Node.js of its eventual completion or failure.
  • AsyncProgressWorker: similar to the above, adding the ability to provide progress updates for the asynchronous action.
  • Thread-safe functions: provides a mechanism to call into Node.js at any time from any number of threads.
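
From JavaScript, these mechanisms surface as ordinary asynchronous APIs. As a rough usage sketch (the add-on, its build path, and its method names here are hypothetical, assuming a package built with node-addon-api):

const addon = require('./build/Release/sharpen.node') // hypothetical add-on
// A one-shot background task backed by AsyncWorker surfaces as a plain async call:
addon.sharpenImage('input.png', (err, outputPath) => {
  if (err) throw err
  console.log('sharpened image written to', outputPath)
})
// A thread-safe function lets native threads call back into JavaScript at any time:
addon.watchSensor((reading) => {
  console.log('reading delivered from a native thread:', reading)
})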

Context-sensitivity

Another recent Node.js development is the arrival of workers. These are full-fledged Node.js environments running in threads parallel to the Node.js main thread. This means that native add-ons can now be loaded and unloaded multiple times as the main process creates and destroys worker threads.

Since worker threads share the same memory space as the main process, multiple instances of a native add-on must now be able to co-exist in a single process. On the other hand, the library containing a native add-on is only loaded once per process. Thus, global data that an add-on has so far kept in global variables must no longer be stored that way, because such global storage is not thread-safe.

Static data members of C++ classes are also stored in a thread-unsafe manner, so those must be avoided as well. It's also important to remember that a thread does not necessarily correspond one-to-one to an add-on instance, so thread-local storage of global variables should also be avoided.

In N-API version 6 we started providing a space for storing per-instance global data by introducing the concept of add-on instances, multiple of which can co-exist in a process, and by providing some tools for creating self-contained add-ons, such as:

  • the NAPI_MODULE_INIT() macro, which initializes an add-on in such a way that it can be loaded multiple times during the life cycle of the Node.js process.
  • napi_get_instance_data() and napi_set_instance_data() in order to provide a place for safely storing global data associated with a single instance of an add-on.
  • the node-addon-api Addon<T> class, which neatly combines the above tools into a class whose instances represent the instances of an add-on present in the various worker threads created by Node.js. Thus, add-on maintainers can store per-add-on-instance data as variables in an instance of the Addon<T> class, and Node.js will create an instance of the Addon<T> class whenever one is needed on a new thread (the sketch below shows the effect from the JavaScript side).
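
The practical effect is easiest to see from the JavaScript side. In this rough sketch (the add-on and its counter API are hypothetical), the same add-on loaded on the main thread and in a worker thread keeps independent per-instance state instead of sharing globals:

const { Worker, isMainThread } = require('worker_threads')
const addon = require('./build/Release/counter.node') // hypothetical add-on built around Addon<T>
// Each Node.js environment (main thread or worker) gets its own add-on instance,
// so this counter is per-instance data rather than shared global state.
addon.increment()
console.log(`${isMainThread ? 'main' : 'worker'} count:`, addon.getCount()) // each thread logs 1
if (isMainThread) {
  new Worker(__filename) // the worker loads a fresh instance of the same add-on
}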

Additional helper methods

As package maintainers used N-API we discovered a few additional APIs that were commonly needed. These included:

  • Date objects
  • BigInts
  • Retrieving property names from objects
  • Detaching ArrayBuffers

Building

One of the other main areas where the N-API team worked to fill in gaps and make it easier for maintainers to consume N-API was the build workflow, including additions to CMake.js, node-pre-gyp, and prebuild.

Historically, Node.js native add-ons have been built using node-gyp. For source code libraries that are already being built using CMake, the CMake.js build tool is an attractive alternative for building Node.js native add-ons. We have recently added an example of an add-on built using CMake.

Detailed information about using CMake.js with N-API add-ons can be found on the N-API Resource.

One of the realities of developing Node.js native add-ons is the fact that as part of installing the package using npm install the C or C++ code must be compiled and linked. This compilation step requires that a viable C/C++ toolchain be installed on the system doing the compilation. This can present a barrier to the adoption of native add-ons as the user of the add-on may not have the necessary tools installed. This can be addressed by creating prebuilt binaries that can be downloaded by the user of the native add-on.

A number of build tools can be used to create prebuilt binaries. node-pre-gyp builds binaries that are typically uploaded to AWS S3. prebuild is similar to node-pre-gyp but uploads the binaries to a GitHub release.

prebuildify is another option similar to the above that enables the native add-on developer to bundle the prebuilt binaries into the module uploaded to npm. The advantage of this approach is that the binaries are immediately available to the user when the package is downloaded. Although the downloaded npm package is larger in size, in practice the entire download process is faster for the user because secondary download requests to AWS S3 or a GitHub release are unnecessary.
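
As a sketch of how the prebuildify approach typically looks from the package's side (this is a general pattern assuming the commonly paired node-gyp-build loader, not code from the article):

// index.js of a native package shipping prebuilds created with prebuildify:
// node-gyp-build picks a bundled prebuilt binary for the current platform if one
// exists, and falls back to a locally compiled build otherwise.
const binding = require('node-gyp-build')(__dirname)
module.exports = binding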

Resources for getting started

One resource available to help get started is the node-addon-examples GitHub repository, containing samples of various Node.js native add-ons. The root of the repository contains folders for different functional aspects, from a simple Hello World add-on to a more complex multi-threaded add-on. Each example folder contains up to three subfolders: one for each Node.js add-on implementation (legacy NAN, N-API, and node-addon-api). To get started with the Hello World example using the node-addon-api implementation, simply run:

git clone https://github.com/nodejs/node-addon-examples.git
cd node-addon-examples/1_hello_world/node-addon-api/
npm i
node .

Another resource available is the N-API Resource. This website contains information and additional in-depth walkthroughs on building Node.js add-ons and other advanced topics, such as:

  • tools needed to get started
  • migration guide from NAN
  • differences between build systems (node-gyp, cmake, …)
  • context-sensitivity and thread-safety

Closing out and call to action

Since its earliest days, Node.js has supported the ability to add features written in native code (C/C++) and to expose them through a JavaScript interface. Over time we recognized that there were challenges in implementing, maintaining, and distributing the resulting add-ons. N-API was identified as one of the core areas of improvement requested by module owners in order to address those challenges. The whole team and the community began to contribute to the creation of this new API in core.

The resulting C API is now part of every Node.js distribution, and a C++ convenience wrapper called node-addon-api is distributed as an external package through npm. N-API was launched with the promise of guaranteeing API and ABI compatibility across different major versions of Node.js, and this has brought a series of benefits:

  • It has removed the need to recompile modules when migrating to newer major versions of Node.js.
  • It allows JavaScript engines other than V8 to implement N-API, which, in turn, allows add-on maintainers to target different runtimes (such as Babylon Native, IoT.js, and Electron) with the same code they use for supporting Node.js.
  • Since N-API is a C API it is possible to implement native add-ons using languages other than C / C++ (such as Go or Rust).

When N-API was released as an experimental API in Node.js v8.0.0, its adoption started to grow slowly, but many developers sent feedback and contributions, and this led us to add new features and create new tools to better support the whole native add-on ecosystem.

Today N-API is widely used for the development of native add-ons, and some of the most-used native add-ons have already been ported to it.

Over the last few years, many improvements to N-API and to native add-ons in general have brought the user and maintainer experience with native add-ons almost up to par with that of JavaScript modules.

Get Involved

We are constantly making progress on N-API and on the native add-on ecosystem in general, but we always need more help. You can help us and the whole community continue improving N-API in many ways:

  • Porting your own native module to use N-API
  • Porting a native module that your app depends on to N-API
  • Adding new features to N-API
  • Adding new features from N-API to node-addon-api
  • Fixing or adding test cases for node-addon-api
  • Fixing or adding examples to node-addon-examples

If you are interested in joining us, see details in https://github.com/nodejs/abi-stable-node#meeting on how to join our weekly meeting.

Node.js Certifications update: Node.js 10 to Node.js 14

By Blog, Certification, Node.js, Project Updates, Training

The OpenJS Node.js Application Developer (JSNAD) and the OpenJS Node.js Services Developer (JSNSD) Exams (Node.js Certifications) will be updated from Node.js version 10, which is now in maintenance, to Node.js version 14, which is the most current LTS (Long Term Support) line. Changes will come into effect on November 3, 2020. All tests taking place after 8:00 pm PT on that date will be based on Node.js version 14.

The updated exam will include the ability to use either native ECMAScript modules or CommonJS modules to answer questions, with CommonJS remaining the default and ECMAScript modules available as an opt-in.

For example, a given task on the examination may provide a folder containing an answer.js file and a package.json file. The package.json file does not contain a type field, as is the case when generating a package.json file with npm init. The answer.js file is therefore treated as a CommonJS module by default, so loading a module would be achieved like so:

const fs = require('fs')

To opt in to native ECMAScript modules, candidates may either set the type field of the package.json file to module or rename the answer.js file to answer.mjs. In either of those cases, a module would be loaded like so:

import fs from 'fs'

Candidates may also explicitly opt in to CommonJS by setting the type field to commonjs or by renaming answer.js to answer.cjs, but this is unnecessary, as the absence of a type field means the answer.js file is interpreted as CommonJS anyway.

This opt-in approach for ECMAScript modules is in keeping with Node's module determination algorithm; see https://nodejs.org/docs/latest-v14.x/api/packages.html#packages_determining_module_system. Industry standards and best practices will be tracked over the next year, and ECMAScript modules may become the default in future updates.

The JSNSD exam has also been updated to be more web-framework friendly: the npm start script is now the essential entry point for determining how a web server is started. This allows frameworks with their own initialization CLIs to be used more easily than before; for example, see https://www.fastify.io/docs/latest/Getting-Started/#run-your-server-from-cli.

While there are no changes to the current set of Domains and Competencies for the JSNSD and JSNAD Exams, candidates are advised to review the functionality of libraries and frameworks on Node.js version 14. For a full list of differences between Node.js version 10 and Node.js version 14, see https://nodejs.medium.com/node-js-version-14-available-now-8170d384567e.

To help prepare for the Node.js Certification exams, the Linux Foundation offers training courses for both the Applications and Services exams. The training courses were authored by David Clements, a principal architect, public speaker, author of the Node Cookbook, and open source creator specializing in Node.js and browser JavaScript.

These exams are evergreen: soon after a Node.js version becomes the latest LTS line, the certifications are updated to stay in lockstep with that LTS version. Now that Node.js version 10 has moved into maintenance, certifications will be based on Node.js version 14.

The OpenJS Node.js Certification program was developed in partnership with NearForm and NodeSource. The certifications are a good way to showcase your abilities in the job market and allow companies to find top talent.

Node.js v15.0.0 is here!

By Announcement, Blog, Node.js, Project Updates

This week, Node.js, an Impact project at the OpenJS Foundation, shipped Node.js v15, a major release for the JavaScript server-side runtime.

The new release includes:

  • Abort Controller
  • N-API Version 7
  • npm 7
  • Throw on unhandled rejections (see the sketch after this list)
  • QUIC (experimental)
  • V8 8.6
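
Two of these changes are easy to illustrate with a minimal sketch (an example written for this summary, not taken from the release notes): AbortController is now available as a global, and a promise rejection with no handler now terminates the process by default instead of only printing a warning.

// AbortController is available as a global in Node.js 15:
const ac = new AbortController()
ac.signal.addEventListener('abort', () => console.log('operation aborted'))
ac.abort()
// An unhandled rejection now throws and exits the process by default; the older
// warning-only behavior can be selected with: node --unhandled-rejections=warn
Promise.reject(new Error('nobody handled me'))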

Additional project news includes:

  • Completion of the Node.js Contributors Survey to gather feedback on the contribution process to determine target areas for improvement.
  • Big improvements to Node.js automation and tooling, including the ability to kick off CI runs and land commits just by adding a GitHub label, making it easier for collaborators to manage the constant flow of pull requests.
  • The beginning of Next 10 Years of Node.js effort. The goal of this effort is to reflect on what led to success in the first 10 years of Node.js and set the direction for success in the next 10. One of the outcomes so far is that we’ve created a Technical Values document to guide our efforts.

To read more about Node.js v15, please read the blog post written by Bethany Griggs and the Node.js TSC.

Podcast: iModernize – Always Bet on Node

By Blog, Node.js

Recently, Robin Ginn, OpenJS Foundation Executive Director, and Joe Sepi, OpenJS CPC Chair, sat down with Amanda Blackburn of OpenJS member company Profound Logic to discuss the Foundation and Node.js. The following was originally posted on the Profound Logic blog.

Today Amanda speaks with two members of the OpenJS Foundation.

Links mentioned:

Node.js Helps NASA Keep Astronauts Safe and Data Accessible 

OpenJS World Keynote: Reaching Your Dreams In Tech and Science – Christina H. Koch, NASA Astronaut

https://twitter.com/joe_sepi

https://LeastBestBeast.com

Amanda Blackburn:

Hello and welcome to the iModernize podcast: technology news, views and insights for businesses on the IBM i platform. I am your host Amanda Blackburn and I am the Director of Marketing at Profound Logic Software. Today we are discussing the OpenJS Foundation, the neutral home to grow and sustain the JavaScript and web ecosystems with over 30 projects that include Node.js, Electron, AMP, and jQuery. I am excited to be joined by Robin Ginn, Executive Director of the OpenJS Foundation, and Joe Sepi, who is an Open Source Engineer and an advocate at IBM. Robin just passed her one-year anniversary leading the organization and previously led major initiatives at Microsoft to advance Open Source technologies, community development and open standards.

Joe has been active in the Node.js project community, and the foundation, for a number of years and was part of the small group that merged the JS Foundation and the Node.js Foundation. He is now the chairperson of the Cross Project Council, the OpenJS Foundation's top technical advisory committee.

Welcome to you both and thanks for joining us.

Robin Ginn

Thanks for having us.

Joe Sepi

Yeah thank you Amanda.

Amanda:

Just to jump right in, Robin, can you tell us about the history of the OpenJS Foundation?

Robin

I have been a part of the Node.js Foundation since it started almost 11 years ago. As you mentioned in your intro, the Node.js Foundation merged with the OpenJS Foundation to create a new home. We are new, but a lot of us have been at it for a very long time and we have lots of new friends that are joining. What we do is offer a neutral place for Open Source technology development and collaboration to happen. Having that neutral place is really important. If you are a company taking a big bet on a piece of Open Source software, you want to know that it is being developed in a fair and open, and clear and transparent way.

We are super excited that Profound Logic is one of the members of the OpenJS Foundation. This membership helps us develop programs to support training and certification services, providing some IP support and giving people a place to develop. We just hope that you all benefit from having greater connections to the community and take advantage of some marketing and thought leadership opportunities, as you are important leaders in the Open Source community.

Amanda

Definitely. We have enjoyed being a part of that because it is such a vibrant community, especially on the Node side from our experience. It is so nice to be able to go out and see people so passionate about a language and technology, and to see what they are able to do with it.

Joe, you have been involved with the community side of this. Can you tell us a little more about what that has been like?

Joe Sepi:

Sure! As you mentioned, the Node community is really passionate. Not just for the platform and the technology, but also for the community and the governance of the project. I have been a part of the Node.js Community Committee for a number of years, which focuses on the aspects outside of the core technical platform development, like the community part. We have taken a lot of that to the OpenJS Foundation and are working hard on building out a great community. We have been making a lot of progress on the individual supporter program and generally just trying to engage with the community more.

Amanda

Yeah, it has been very impressive to see the level of commitment and involvement in the community. This is something that we enjoy sharing with the businesses that we talk to. There are always new ideas and new ways to support businesses on these languages. Today's businesses have more tech options than ever before, but also just as many tech challenges.

A question I would ask you both is: why is Open Source important to today's businesses?

Robin

Let's just look at Node.js and why it is important. I think we were having hackathons probably 12-13 years ago. I love to credit the foundation model for keeping Node.js modern and trustworthy for businesses today. You mentioned Netflix; NASA is using Node.js in space suit solutions as their astronauts spend time on the space station. I think most companies are using Open Source, but the Linux Foundation just released a white paper on the importance of Open Source for business, particularly vertical businesses. They found that businesses contributing to Open Source as they move towards digital transformation and modernization helps them innovate much more quickly, three times faster.

Amanda

I know for our customers, since we are in the legacy modernization space, we definitely try to get that message across. Open Source, including Node.js, can be a really great way to help them address those challenges.

Joe

Yeah! I just pulled up the GitHub repository for Node, and there are over 2800 contributors. I don't want to say that is free work, because you should always give back and support Open Source, but you are getting all of these people focusing on making the platform stable, secure and modern. It is like having a whole other team supporting the work and products that you are using.

Robin

That is what I love about GitHub. The support and feedback are instant and open for all. As you are building your own software solutions you have access to that developer feedback and documentation. 2800 developers are all working on it in real time.

Amanda

That is something that we have even taken advantage of for our own products and services. For those that don't know, npm [now a part of GitHub] is a great way to discover applications and code to repurpose in any number of ways, even for business technologies.

Something that we have noticed in the AS/400 market is that a lot of our developers are getting older, and soon they will be retiring. A lot of the businesses we work with are still running applications on RPGLE or older application languages and will lose that mindshare when their developers leave. How can Node or other Open Source languages help bridge that gap, especially when looking for new developers?

Joe

I don't think there has been a better time to utilize open source technologies to modernize these legacy applications. In my experience, when moving from legacy applications to more modern approaches (like microservices, for example), you can do it with a piecemeal approach. Take certain applications and start to think about things in isolation so you can maintain and update them without affecting the larger application. The more you can separate those types of things, the better.

Amanda

Definitely. Robin, you might see this as well. One of the great challenges for businesses, not just legacy businesses, is the accumulation of technical debt and how to address that. I would imagine most businesses struggle with this. Could you speak a little about how Open Source languages might be able to help with that?

Robin

If you look at a combination of open source and open standards, what you are really doing is driving interoperability. You can ease your transition to the cloud without having to rip and replace absolutely everything. You know that your systems will work better together. Node.js and other open source technologies give you that flexibility to build modern apps and new solutions. You also mentioned the ability to attract new talent and developers. Before lockdown I was at a developer conference and I talked to some recruiters. They said one of the top categories they were hiring for was Node.js developers. It is definitely at the top of the developer talent pool.

Amanda

Definitely. We see that languages such as RPG, or even COBOL, are not even being taught in colleges anymore, even though a huge portion of the world's businesses have them in their infrastructure. Node is really a great option because it is both client and server side, and businesses can look for JavaScript developers who could very easily learn Node and leverage that for their business applications.

Joe

I have been doing JavaScript for twenty years and with Node being created in 2009 it really made JavaScript such a prevalent technology in the space. To be able to hire engineers who can work on the front and back end is a huge asset for businesses in my experience.

Amanda

That is great to hear, and we are seeing the same.

Speaking of success, which you mentioned before, Robin, with NASA trusting Node to keep them safe in space, are there any other success stories with businesses using Node?

Robin

Oh gosh. I think we like to say Node is everywhere, and it really truly is once you start to talk to companies. We have actually been running some case studies on our blog if anyone is interested in taking a look. Companies like Netflix are using Node.js. Essry is doing some COVID tracking, as well as NearForm. There are a lot of really fascinating use cases.

But again, when you're talking mission-critical, it's about making sure your space suit doesn't leak. We had written up this really cool case study, so we invited a NASA astronaut who benefitted from our technology; her name is Christina H. Koch. She spoke at OpenJS World, so you might want to check that out. She has a really fascinating story on how NASA is using these technologies.

Joe

Yeah, that was a really great talk and I really enjoyed that. There is also a case study on the Weather Company. They have billions of locations, 60 languages and 230 locations.

Amanda

Yeah that is pretty amazing, and I would say that shows Node’s scalability.

Joe

Yeah absolutely. You had mentioned one phrase: JavaScript and Node.js are everywhere. Another one we always hear: always bet on Node. Before joining IBM I worked at a couple of other big-name companies, and at both of them we were doing a rewrite/greenfield application and I recommended we do it in Node. We had quite a few PHP developers, so they decided to go with PHP. In both instances, within a few years they had to rewrite in Node. So, always bet on Node.

Amanda

Yeah, that is something that we see as well. Robin, you had mentioned rip and replace, and that is an option that we are always opposed to because we have enough experience to know it is not the fastest, easiest, or most thorough way. It is always a huge mess, and very expensive and risky. The other option we see is rewriting in something like Java or .NET, but those run into similar limitations.

Node offers so much more flexibility, portability and stability that businesses can take advantage of. They can utilize that technology to help them do things like connect to the cloud or use AI. With things like npm you can just plug those right into your application, which is pretty cool.

Robin

Yeah and I think Node.js works in all the clouds. If you have a multi or single cloud strategy, it is going to work for you.

Joe

It is also, if I am not mistaken, the most utilized platform in everyone's cloud. The other thing that is great about Node: it is great to use at the core of your applications, but if you have something that is resource-intensive, you can spin out a worker thread or send that out to another service. Keeping Node at your core is a great option.

Amanda

You have both touched on my next question, which is: what advice and best practices would you give businesses considering leveraging Node for their enterprise applications?

Joe

One thing to keep in mind at the outset is to be very cloud-native and cloud-ready focused with your Node.js development. Think about how your Node applications will integrate with Kubernetes and how they will surface metrics, and things like that.

Amanda

Yeah definitely. Most of our solutions here at Profound are based in Node. Doing things like offering options for systems integration, API, portability to the cloud, modernizing legacy code… Node is very flexible for all of these options.

Joe

Yeah, and getting away from these monolithic applications and moving to a more microservice-oriented architecture is a good way to look at things too. And of course, serverless is a great option if that is the right use case; some kind of event-driven architecture is very cost efficient and versatile.

Amanda

That all sounds like really good advice. I know that businesses have a lot to think about when it comes to their technology and Node, and other Open Source languages, are mature, stable, secure and flexible enough to help businesses of all sizes and industries accomplish their goals.

I have one final question: How are you staying sane through quarantine? And have you developed any new hobbies?

Joe

I am staying sane by relying on old hobbies; I am a musician. I have been doing some socially distanced and responsible band practices and working on a new record. [Check out some of Joe's music!]

Robin

That is really cool. My big thing has always been exercise. That has always been my number one thing. Old hobby: I actually just bought a guitar and my son is teaching me to play. I have not played since I was a kid.

Amanda

Wow well maybe you and Joe can collaborate on some musical projects.

Joe

Amazing. I am into it.

Robin

Jamming on our weekly calls.

Amanda

I feel like if there is a silver lining to the quarantine at all, it is that it is definitely challenging the way we spend our time, and even how we work. Technology plays a part in that as well. It is definitely an interesting time, but it is really cool that you both have that in common.

Robin

Yeah and how about you Amanda? What are you doing?

Amanda

I have actually gotten into Youtubing and creating videos on different topics that are my interests like Sci-Fi and video game stuff. I like that you get to interact with people who are interested in the same topics.

Robin

Super cool!

Amanda

Well thanks so much for taking the time and being here today. It was really great to talk to you both and learn more about the foundation and all it has to offer.

Robin

Thanks Amanda and thanks to the Profound Logic team for hosting us.

Joe

Yeah thank you so much. Great to be here.

Amanda

I wanted to take the time to direct everyone to OpenJSF.org to take advantage of all the foundation has to offer. That includes Open Source training and certification, collaboration with the community, and learning more about the projects and how to be a part of that.

Thanks for taking the time to join us, and we will talk to you next time!

From streaming to studio: The evolution of Node.js at Netflix

By Blog, Case Study, Node.js, Project Update

As platforms grow, so do their needs. However, the core infrastructure is often not designed to handle these new challenges as it was optimized for a relatively simple task. Netflix, a member of the OpenJS Foundation, had to overcome this challenge as it evolved from a massive web streaming service to a content production platform. Guilherme Hermeto, Senior Platform Engineer at Netflix, spearheaded efforts to restructure the Netflix Node.js infrastructure to handle new functions while preserving the stability of the application. In his talk below, he walks through his work and provides resources and tips for developers encountering similar problems.

Check out the full presentation 

Netflix initially used Node.js to enable high volume web streaming to over 182 million subscribers. Their three goals with this early infrastructure were to provide observability (metrics), debuggability (diagnostic tools) and availability (service registration). The result was the NodeQuark infrastructure. An application gateway authenticates and routes requests to the NodeQuark service, which then communicates with APIs and formats responses that are sent back to the client. With NodeQuark, Netflix also created a managed experience — teams could create custom API experiences for specific devices. This allows the Netflix app to run seamlessly on different devices.

Beyond streaming

However, Netflix wanted to move beyond web streaming and into content production. This posed several challenges to the NodeQuark infrastructure and the development team. Web streaming requires relatively few applications, but serves a huge user base. On the other hand, a content production platform houses a large number of applications that serve a limited userbase. Furthermore, a content production app must have multiple levels of security for employees, partners and users. An additional issue is that development for content production is ideally fast paced while platform releases are slow, iterative processes intended to ensure application stability. Grouping these two processes together seems difficult, but the alternative is to spend unnecessary time and effort building a completely separate infrastructure. 

Hermeto decided that in order to solve Netflix’s problems, he would need to use self-contained modules. In other words, plugins! By transitioning to plugins, the Netflix team was able to separate the infrastructure’s functions while still retaining the ability to reuse code shared between web streaming and content production. Hermeto then took plugin architecture to the next step by creating application profiles. The application profile is simply a list of plugins required by an application. The profile reads in these specific plugins and then exports a loaded array. Therefore, the risk of a plugin built for content production breaking the streaming application was reduced. Additionally, by sectioning code out into smaller pieces, the Netflix team was able to remove moving parts from the core system, improving stability. 
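
A rough sketch of the idea (the plugin names and profile shape here are illustrative, not Netflix's actual code) shows how an application profile is just a declared list of plugins that the application loads at startup:

// streaming-profile.js: an application profile is simply a list of plugins
const plugins = [
  require('./plugins/metrics'),
  require('./plugins/service-registration'),
  require('./plugins/diagnostics')
]
// The application initializes only the plugins its profile declares, so a plugin
// added for content production cannot break the streaming application.
module.exports = async function loadProfile(app) {
  return Promise.all(plugins.map((plugin) => plugin.init(app)))
}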

Looking ahead

In the future, Hermeto wants to allow teams to create specific application profiles that they can give to customers. Additionally, Netflix may be able to switch from application versions to application profiles as the code breaks into smaller and smaller pieces. 

To finish his talk, Hermeto gave his personal recommendations for open source projects that are useful for observability and debuggability. Essentially, a starting point for building out your own production-level application!  

Personal recommendations for open source projects 

  • Metrics and alerts
  • Centralized Logging
  • Distributed tracing
  • Diagnostics
  • Exception Management

Node.js Package Maintenance: Bridging the gap between maintainers and consumers

By Blog, Node.js, Project Update

This blog was written by Michael Dawson with input from the Node.js Package Maintenance Working Group. It was originally posted on the Node.js blog. Node.js is an OpenJS Foundation Impact Project.


A while back I talked about the formation of the Node.js Package Maintenance Working Group and some of the initial steps that we had in mind in terms of helping to move the ecosystem forward. You can read up on that here if you'd like:
https://medium.com/@nodejs/call-to-action-accelerating-node-js-growth-e4862bee2919.

This blog is a call to action for package maintainers: to help one of our initiatives move forward, we need your help.

It's been almost two years, and we've been working on a number of initiatives, which you can learn more about through the issues in the package-maintenance repo. Things never move quite as fast as we'd like, but we are making progress in a number of different areas.

One area that we identified was:

Building and documenting guidance, tools and processes that businesses can use to identify packages on which they depend, and then to use this information to be able to build a business case that supports their organization and developers helping to maintain those packages.

We started by looking at how to close the gap between maintainers and consumers in terms of expectations. Mismatched expectations can often be a source of friction and by providing a way to communicate the level of support behind a package we believe we can:

  • help maintainers communicate the level of support they can/want to provide, which versions of Node.js they plan to support going forward, and the current level of backing in place to help keep development of the package moving forward.
  • reduce potential friction due to mismatched expectations between module maintainers and consumers.
  • help consumers better understand the packages they depend on so that they can improve their planning and manage risk.

In terms of managing risk we hope that by helping consumers better understand the key packages they depend on, it will encourage them to support these packages in one or more ways:

  • encouraging their employees to help with the ongoing maintenance
  • provide funding to the existing maintainers
  • support the Foundation the packages are part of (for example the OpenJS Foundation)

After discussion at one of the Node.js Collaborator Summits where there was good support for the concept, the team has worked to define some additional metadata in the package.json in order to allow maintainers to communicate this information.

The detailed specification for this data can be found in: https://github.com/nodejs/package-maintenance/blob/master/docs/PACKAGE-SUPPORT.md.

The TL/DR version is that it allows the maintainer to communicate:

  • target: the platform versions that the package maintainer aims to support. This is different from the existing engines field in that it expresses a higher-level intent, like current LTS version, for which the specific versions can change over time.
  • response: how quickly the maintainer chooses to, or is able to, respond to issues and contacts for that level of support
  • backing: how the project is supported, and how consumers can help support the project.

We completed the specification a while ago, but before asking maintainers to start adding the support information we wanted to provide some tooling to help validate that the information added was complete and valid. We've just finished the first version of that tool, which is called support.

The tool currently offers two commands:

  • show
  • validate

The show command displays a simple tree of the packages for an application and the raw support information for those packages. More sophisticated commands to help consumers review and understand the support info will make sense later, but at this point it's more important to start having the information filled in, as that is needed before more sophisticated analysis makes sense.

The validate command helps maintainers validate that they’ve added/defined the support information correctly. If there are errors or omissions it will let the maintainer know so that as support information is added it is high quality and complete.

Our call to action is for package maintainers to:

  • Review the support specification and give us your feedback if you have suggestions/comments.
  • Add support info to your package
  • Use the support tool in order to validate the support info and give us feedback on the tool
  • Let us know when you’ve added support info so that we can keep track of how well we are doing in terms of the ecosystem supporting the initiative, as well as knowing which real-world packages we can use/reference when building additional functionality into the support tool.

We hope to see the ecosystem start to provide this information and look forward to seeing what tooling people (including the package-maintenance working group and others) come up with to help achieve the goals outlined.

How Node.js saved the U.S. Government $100K

By Blog, Case Study, Node.js, OpenJS World

The following blog is based on a talk given at the OpenJS Foundation’s annual OpenJS World event and covers solutions created with Node.js.

When someone proposes a complicated, expensive solution, ask yourself: can it be done cheaper, better and/or faster? Last year, an external vendor wanted to charge $103,000 to create an interactive form and store the responses. Ryan Hillard, Systems Developer at the U.S. Small Business Administration, was brought in to create a less expensive, low-maintenance alternative to the vendor's proposal. Hillard was able to create a solution using ~320 lines of code and $3000. In the talk below, Hillard describes what the difficulties were and how his Node.js solution fixed the problem.

Last year, Hillard started work on a government case management system that received and processed feedback from external and internal users. Unfortunately, a recent upgrade and rigorous security measures prevented external users from leaving feedback. Hillard needed to create a secure interactive form and then store the data. However, the solution also needed to be cheap, easy to maintain and stable.

Hillard decided to use three common services: Amazon Simple Storage Service (S3), Amazon Web Services (AWS) Lambda and Node.js. Together, these pieces provided a simple and versatile way to capture and then store response data. Maintenance is low because the servers are maintained by Amazon. Additionally, future developers can easily alter and improve the process as all three services/languages are commonly used. 
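
The overall shape of such a solution might look like the following sketch (a hypothetical handler written for illustration, not Hillard's actual code; the bucket name and the API Gateway event shape are assumptions):

// A minimal Node.js AWS Lambda handler that stores a submitted form response in S3
const AWS = require('aws-sdk')
const s3 = new AWS.S3()
exports.handler = async (event) => {
  const submission = JSON.parse(event.body) // form fields posted through API Gateway
  await s3.putObject({
    Bucket: 'feedback-submissions', // hypothetical bucket name
    Key: `responses/${Date.now()}.json`, // one object per response
    Body: JSON.stringify(submission),
    ContentType: 'application/json'
  }).promise()
  return { statusCode: 200, body: JSON.stringify({ ok: true }) }
}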

To end his talk, Hillard discussed the design and workflow processes that led him to his solution. He compares JavaScript to a giant toolkit with hundreds of libraries and dependencies — a tool for every purpose. However, this variety can be counterproductive as the complexity – and thus the management time – increases.

Developers should ask themselves how they can solve their problems without introducing anything new. In other words, size does matter — the smallest, simplest toolkit is the best!

OpenJS Foundation AMA: Node.js Certifications

By AMA, Blog, Certification, Node.js

In this AMA, we discussed the benefits of the OpenJS Node.js certification program. The certification tests a developer’s knowledge of Node.js and allows them to quickly establish their credibility and value in the job market. Robin Ginn, OpenJS Foundation Executive Director, served as the moderator. David Clements, Technical Lead of OpenJS Certifications, and Adrian Estrada, VP of Engineering at NodeSource, answered questions posed by the community. The full AMA is available at the link below: 

The OpenJS Foundation offers two certifications: OpenJS Node.js Application Developer (JSNAD) and OpenJS Node.js Services Developer (JSNSD). The Application Developer certification tests general knowledge of Node.js (file systems, streams etc.). On the other hand, the Services Developer certification asks developers to create basic Node services that might be required by a startup or enterprise. Services might include server setup and developing security to protect against malicious user input. 

In the talk, Clements and Estrada discussed why they created the certifications. They wanted to create an absolute measure of practical skill to help developers stand out and ease the difficulties of hiring for the industry. To that end, OpenJS certifications are relatively cheap and applicable to real world problems encountered in startup and enterprise environments. 

A timestamped summary of the video is available below: 

Note: If you are not familiar with the basics of the two certifications offered by the OpenJS Foundation, jumping to the two bolded sections may be a good place to start.

AMA Topics

Introductions 0:20

How did the members start working together? 2:35

How did work on the certifications start? 5:07

Is it possible to have feedback on the exam? 9:50

Applications of psychometric analysis 12:26

What is the Node.js Application Developer certification + Services Developer certification? 14:54

How do you take the exam? What should you expect? 18:22

Will there be differential pricing between countries? 22:04

How is the criteria for new npm packages chosen? 24:55

Are test takers able to use Google or mdn? 31:52

What benefits do OpenJS certifications have for developers? 33:22

How to use the certification after completion 39:43

What are the exam principles? 40:56

How much experience is required for the exam? 44:12 

Course available in Chinese 49:09

How will new Node versions affect the certifications? 53:43 

Closing thoughts 56:35

Node.js announces new mentorship opportunity

By Blog, Node.js

This post was written by A.A. Sobaki and the Node.js Mentorship Initiative. It first appeared on the project’s blog. Node.js is an Impact Project at the OpenJS Foundation.

The Node.js Mentorship Initiative is excited to announce a new mentee opening! We’d like to invite experienced developers to apply to join the Node.js Examples Initiative.

If you’re not familiar, the Examples Initiative’s mission is to build and maintain a repository of runnable, tested examples that go beyond “hello, world!” This is an important place to find practical and real-world examples of how to use the runtime in production.

Being a part of the Examples Initiative is a big opportunity. As a mentee, you will work with and learn from industry leaders and world-class software engineers. You will receive personalized guidance as you write code that will serve as a template for countless developers as they begin to use Node.js in their projects.

To get started, complete the application and coding challenge linked below. The coding challenge is a chance to showcase your skills. It is estimated that the challenge will take two to four hours to complete.

Click the link to get started.

We look forward to receiving your application.

Node.js Promise reject use case survey

By Blog, Node.js, Survey

This post was contributed by the Node.js Technical Steering Committee.

The Node.js Project, an Impact Project of the OpenJS Foundation, currently handles unhandled rejections by emitting a deprecation warning to stderr. The warning shows the stack where the rejection happened and states that, in future Node.js versions, unhandled rejections will result in Node.js exiting with a non-zero status code. We intend to remove the deprecation warning, replacing it with a stable behavior which might be different from the one described in the deprecation warning. We're running a survey to better understand how Node.js users are using promises and how they are dealing with unhandled rejections today, so we can make an informed decision on how to move forward.
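
For context, here is a minimal example of the behavior in question (an illustration added here, not part of the original survey announcement): a rejection with no handler triggers the warning, while applications can already observe such cases themselves through the unhandledRejection event.

// On its own, a rejection with no handler causes Node.js to print the deprecation warning:
Promise.reject(new Error('boom'))
// Alternatively, an application can register its own handler; when a listener is
// installed, Node.js invokes it instead of printing the default warning:
process.on('unhandledRejection', (reason) => {
  console.error('unhandled rejection:', reason)
})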

To learn more about what unhandled rejections are and the potential issues with them, check out the original post. Those interested in helping the TSC solve this are encouraged to participate in the survey, which will close on August 24th.