OpenJS World – Featured Profile – Beth Griggs


Since 2016, Beth Griggs has been working as an Open Source Engineer at IBM, where she focuses on the Node.js runtime. Node.js is an impact project in the OpenJS Foundation. Beth is a Node.js Technical Steering Committee member and a member of the Node.js Release Working Group, where she is involved with auditing commits for the long-term support (LTS) release lines and the creation of releases.

What was your first experience of Node.js?

I joined the party a little late; my first experience of Node.js was while completing my final-year engineering project for my Bachelor’s degree in 2016. My engineering project was to create a ‘living meta-analysis’ tool that would enable researchers, specifically psychologists, to easily combine and update findings from related independent studies. I originally implemented the tool using a PHP framework, but after some time I realized I wasn’t enjoying the developer experience and was hitting limitations with the framework. Halfway through my final year of university, I heard some classmates raving about Node.js, so I decided to check it out. Within a few weeks, I had reimplemented my project from scratch using Node.js.

How did you start contributing to Node.js?

I rejoined IBM in 2016, having spent my gap year prior to university at IBM as a Java Test Engineer in their WebSphere organization. I joined the Node.js team in IBM Runtime Technologies, which at the time was responsible for building and testing the IBM SDK for Node.js. From running the Node.js test suite regularly internally, my team identified flaky tests that needed fixing out in the community – which turned into some of my first contributions to Node.js core.

Over the next few years, our team deprecated the IBM SDK for Node.js in favor of maintaining these platforms directly in the Node.js community. Around the same time, Myles Borins offered to mentor me to become involved with the Release Working Group, with a view to becoming a Node.js releaser (Thanks, Myles!). Since then, that’s the area of Node.js where most of my contributions have been focused.

What has changed since you first started contributing to Node.js?

One of the biggest changes is the emphasis on onboarding new contributors to major parts of the project: getting new names and faces into a position where they can actively contribute to Node.js. There has also been an increase in socializing how people can contribute in ways other than code.

Documentation of the internal contributor processes has improved a lot too, but there’s still room to improve.

What are you most excited about with the Node.js project at the moment?

I’m really enjoying the work that is happening in the pkgjs GitHub organization, where we’re building tools for package maintainers. I’m excited to see the tools that come out of the pkgjs organization and the Node.js Package Maintenance team.

What are you most looking forward to at OpenJS World?

There are so many great talks (although I’m a little biased, as I was on the content team). I’m really looking forward to the keynote with Christina H Koch, a NASA astronaut, and also the ‘Broken Promises’ workshop by James and Matteo from NearForm.

On the Cross Project Summit day, I’m looking forward to the Node.js Package Maintenance session. We’ve got a lot of momentum in that working group at the moment and it’ll be great to have input from the other OpenJS projects. I’m hoping my talk “Chronicles of the Node.js Ecosystem: The Consumer, The Author, and The Maintainer” is a good primer for the session. 

I’ll also be at the IBM virtual booth throughout the conference and catching my colleagues’ talks (https://developer.ibm.com/technologies/node-js/blogs/ibm-at-openjs-world-2020). 

What does your role at IBM include other than contributing to the Node.js community?

A wide variety of things really, no week is ever full of the same tasks. I’m often preparing talks and workshops for various conferences. Alongside that, I spend my time researching common methods and best practices for deploying Node.js applications to the cloud – specifically focusing on IBM Cloud and OpenShift. I often find myself assisting internal teams with their usage of Node.js, and analyzing various IBM offerings from a typical Node.js Developer’s point of view and providing feedback. I’m also scrum master for my team, so a portion of my time is taken up with those responsibilities too. 

What do you do outside of work?

Most often, hanging out with my dog, Laddie. I’m a DIY enthusiast – mainly painting or upcycling various pieces of second-hand furniture. Since the start of lockdown in the UK, I have also been writing a book, which is a convenient pastime. I’m a big fan of replaying my old PS1 games too.

Where should people go to get started contributing to the Node.js Project? 

Go to https://www.nodetodo.org/, a website that walks you through a path towards your first contribution to Node.js. As long as you’re a little bit familiar with Node.js, you can start here. The other option is to look for issues labeled ‘good first issue’ on repositories in the Node.js GitHub organization.

Alternatively, you can join one of our working group sessions on Zoom and start participating in discussions. The sessions are listed in the nodejs.org calendar. If you’re specifically interested in the Node.js Release Working Group, I run fortnightly mentoring/shadowing sessions that you’re welcome to join.

How The Weather Company uses Node.js in production


Using Node.js improved site speed, performance, and scalability

This piece was written by Noel Madali and originally appeared on the IBM Developer Blog. IBM is a member of the OpenJS Foundation.

The Weather Company uses Node.js to power their weather.com website, a multinational weather information and news website available in 230+ locales and localized in about 60 languages. As an industry leader in audience reach and accuracy, weather.com delivers weather data, forecasts, observations, historical data, news articles, and video.

Because weather.com offers a location-based service that is used throughout the world, its infrastructure must support consistent uptime, speed, and precise data delivery. Scaling the solution to billions of unique locations has created multiple technical challenges and opportunities for the technical team. In this blog post, we cover some of the unique challenges we had to overcome when building weather.com and discuss how we ended up using Node.js to power our internationalized weather application.

Drupal ‘n Angular (DNA): The early days

In 2015, we were a Drupal ‘n Angular (DNA) shop. We unofficially pioneered the industry by marrying Drupal and Angular together to build a modular, content-based website. We used Drupal as our CMS to control content and page configuration, and we used Angular to code front-end modules.

Front-end modules were small blocks of user interface that had data and some interactive elements. Content editors would move the modules around to visually create a page, and use Drupal to create articles about weather and publish them on the website.

DNA was successful in rapidly expanding the website’s content and giving editors the flexibility to create page content on the fly.

As our usage of DNA grew, we faced many technical issues which ultimately boiled down to three main themes:

  • Poor performance
  • Instability
  • Slower time for developers to fix, enhance, and deploy code (also known as velocity)

Poor performance

Our site suffered from poor performance, with sluggish load times and unreliable availability. This, in turn, directly impacted our ad revenue since a faster page translated into faster ad viewability and more revenue generation.

To address some of our performance concerns, we conducted different front-end experiments.

  • We analyzed and evaluated modules to determine what we could change. For example, we evaluated getting rid of some modules that were not used all the time or we rewrote modules so they wouldn’t use giant JS libraries.
  • We evaluated our usage of a tag manager in reference to ad serving performance.
  • We added lazy-loaded modules to remove the module on first load in order to reduce the amount of JavaScript served to the client.

Instability

Because of the fragile deployment process of using Drupal with Angular, our site suffered from too much downtime. The deployment process was a matter of taking the name of a git branch and entering it into a UI to get released into different environments. There was no real build process, but only version control.

Ultimately, this led to many bad practices that impacted developers, including a lack of version control methodology, non-reproducible builds, and the like.

Slower developer velocity

The majority of our developers had front-end experience, but very few of them were knowledgeable about the inner workings of Drupal and PHP. As such, features and bug fixes related to PHP were not addressed as quickly due to knowledge gaps.

Large deployments contributed to slower velocity as well as stability issues, where small changes could break the entire site. Since a deployment was the entire codebase (Drupal, Drupal plugins/modules, front-end code, PHP scripts, etc), small code changes in a release could easily get overlooked and not be properly tested, breaking the deployment.

Overall, while we had a few quick wins with DNA, the constant regressions due to the setup forced us to consider alternative paths for our architecture.

Rethinking our architecture to include Node.js

Our first foray into using Node.js was a one-off project for creating a lite experience for weather.com that was completely server-side rendered and had minimal JavaScript. The audience had limited bandwidth and minimal device capabilities (for example, low-end smartphones using Facebook’s Free Basics).

Stakeholders were happy with the lite experience, commenting on the nearly instantaneous page loads. Analyzing this proof-of-concept was important in determining our next steps in our architectural overhaul.

Differing from DNA, the lite experience:

  • Rendered pages server-side only
  • Kept the front-end footprint under 30KB (virtually no JavaScript, little CSS, few images)

We used what we learned from the lite experience to serve our website more performantly. This started with rethinking our DNA architecture.

Metrics to measure success

Before we worked on a new architecture, we had to show our business that a re-architecture was needed. The first thing we had to determine was what to measure to show success.

We consulted with the Google Ad team to understand how exactly a high-performing webpage impacts business results. Google showed us proof that improving page speed increases ad viewability which translates to revenue.

With that in hand, each day we conducted tests across a set of pages to measure:

  • Speed index
  • Time to first interaction
  • Bytes transferred
  • Time to first ad call

We used a variety of tools to collect our metrics: WebPageTest, Lighthouse, and sitespeed.io.

As we compiled a list of these metrics, we were able to judge whether certain experiments were beneficial or not. We used our analysis to determine what needed to change in our architecture to make the site more successful.

While we intended to completely rewrite our DNA website, we acknowledged that we needed to stair-step our approach to experimenting with a newer architecture. Using the above methodology, we created a beta page and A/B tested it to verify its success.

From Shark Tank to a beta of our architecture

Recognizing the performance of our original Node.js proof of concept, we held a “Shark Tank” session where we presented and defended different ideal architectures. We evaluated whole frameworks or combinations of libraries like Angular, React, Redux, Ember, lodash, and more.

From this experiment, we collectively agreed to move from our monolithic architecture to a Node.js backend and newer React frontend. Our timeline for this migration was between nine months to a year.

Ultimately, we decided to use a pattern of small JS libraries and tools, similar to that of a UNIX operating system’s tool chain of commands. This pattern gives us the flexibility to swap out one component from the whole application instead of having to refactor large amounts of code to include a new feature.

On the backend, we needed to decouple page creation and page serving. We kept Drupal as a CMS and created a way for documents to be published out to more scalable systems which can be read by other services. We followed the pattern of Backends for Frontends (BFF), which allowed us to decouple our page frontends and allow for more autonomy of our backend downstream systems. We use the documents published by the CMS to deliver pages with content (instead of the traditional method of the CMS monolith serving the pages).

Even though Drupal and PHP can render server-side, our developers were more familiar with JavaScript, so using Node.js to implement isomorphic (universal) rendering of the site increased our development velocity.


Developing with Node.js was an easy focus shift for our previous front-end oriented developers. Since the majority of our developers had a primarily JavaScript background, we stayed away from solutions that revolved around separate server-side languages for rendering.

Over time, we implemented and evolved our usage from our first project. After developing our first few pages, we decided to move away from ExpressJS to Koa to use newer JS standards like async/await. We started with pure React but switched to React-like Inferno.js.
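To illustrate the sort of code this enabled (a minimal sketch, not The Weather Company’s actual implementation), here is a Koa handler using async/await; `getForecast` and `renderPage` are hypothetical stand-ins for the real data-fetch and render steps:

```js
const Koa = require('koa');
const app = new Koa();

// Hypothetical stand-ins for the real data and rendering layers.
async function getForecast(location) {
  return { location, tempC: 21 };
}

function renderPage(forecast) {
  return `<h1>${forecast.location}: ${forecast.tempC}°C</h1>`;
}

app.use(async (ctx) => {
  // async/await keeps the handler's control flow linear -- one of the
  // main attractions of Koa over callback-style middleware.
  const forecast = await getForecast(ctx.query.location || 'unknown');
  ctx.type = 'html';
  ctx.body = renderPage(forecast);
});

app.listen(3000);
```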

After evaluating many different build systems (gulp, grunt, browserify, systemjs, etc), we decided to use Webpack to facilitate our build process. We saw Webpack’s growing maturity in a fast-paced ecosystem, as well as the pitfalls of its competitors (or lack thereof).

Webpack solved our core issue of DNA’s JS aggregation and minification. With a centralized build process, we could build JS code using a standardized module system, take advantage of the npm ecosystem, and minify the bundles (all during the build process and not during runtime).

Moving from client-side to server-side rendering of the application increased our speed index and got information to the user faster. React helped us in this aspect of universal rendering: being able to share code on both the frontend and backend was crucial to us for server-side rendering and code reuse.

Our first launch of our beta page was a Single Page App (SPA). Traditionally, we had to render each page and location as a hit back to the origin server. With the SPA, we were able to reduce our hits back to the origin server and improve the speed of rendering the next view thanks to universal rendering.

The following image shows how much faster the webpage response was after the SPA was introduced.


As our solution included more Node.js, we were able to take advantage of a lot of the tooling associated with a Node.js ecosystem, including ESLint for linting, Jest for testing, and eventually Yarn for package management.

Linting and testing, as well as a more refined CI/CD pipeline, helped reduce bugs in production. This led to a more mature and stable platform as a whole, higher engineering velocity, and increased developer happiness.

Changing deployment strategies

Recognizing our problems with our DNA deployments, we knew we needed a better solution for delivering code to infrastructure. With our DNA setup, we used a managed system to deploy Drupal. For our new solution, we decided to take advantage of newer, container-based deployment and infrastructure methodologies.

By moving to Docker and Kubernetes, we achieved many best practices:

  • Separating out disparate pages into different services reduces failures
  • Building stateless services allows for less complexity, ease of testing, and scalability
  • Builds are repeatable (Docker images ensure the right artifacts are deployed and consistent)

Our Kubernetes deployment allowed us to be truly distributed across four regions and seven clusters, with dozens of services scaled from 3 to 100+ replicas running on 400+ worker nodes, all on IBM Cloud.
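As a hypothetical illustration of the repeatable-build and stateless-service points above, a minimal Dockerfile for a Node.js service might look like this (base image, port, and file names are assumptions, not the actual setup):

```dockerfile
# Pin the runtime so every build produces the same artifact.
FROM node:12-alpine

WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY package*.json ./
RUN npm ci --only=production

# Copy the application code.
COPY . .

# Stateless service: all configuration arrives via the environment.
ENV NODE_ENV=production
EXPOSE 3000

CMD ["node", "server.js"]
```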

Addressing a familiar set of performance issues

After running a successful beta experiment, we continued down the path of migrating pages into our new architecture. Over time, some familiar issues cropped up:

  • Pages became heavier
  • Build times were slower
  • Developer velocity decreased

We had to evolve our architecture to address these issues.

Beta v2: Creating a more performant page

Our second evolution of the architecture was a renaissance (rebirth). We had to go back to basics, revisit our lite experience, and see why it was successful. We analyzed our performance issues and came to the conclusion that the SPA was becoming a performance bottleneck. Although a SPA benefits second page visits, we came to understand that the majority of our users visit the website and leave once they get their information.

We designed and built the solution without a SPA, but kept React hydration in order to keep code reuse across the server and client-side. We paid more attention to the tooling during development by ensuring that code coverage (the percentage of JS client code used vs delivered) was more efficient.
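As a rough sketch of that server-render-plus-hydrate pattern (shown with React APIs for illustration, though the team used the React-compatible Inferno; `App` is a hypothetical component shared by server and client):

```js
// server.js -- render the shared component to HTML on the server.
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App'); // hypothetical shared component

const html = renderToString(React.createElement(App, { locale: 'en-US' }));
// ...embed `html` into the page template sent to the browser.

// client.js -- hydrate: attach event handlers to the markup that
// already exists, instead of re-rendering the page from scratch.
const ReactDOM = require('react-dom');
ReactDOM.hydrate(
  React.createElement(App, { locale: 'en-US' }),
  document.getElementById('root')
);
```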

Removing the SPA overall was key to reducing build times as well. Since a page was no longer stitched together from a singular entry point, we split the Webpack builds so that individual pages can have their own set of JS and assets.
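A sketch of what that split can look like in a webpack configuration; the page names here are hypothetical:

```js
// webpack.config.js -- one entry point per page instead of a single
// SPA bundle, so each page ships only the JS and assets it needs.
const path = require('path');

module.exports = {
  entry: {
    home: './src/pages/home/index.js',
    today: './src/pages/today/index.js',
    hourly: './src/pages/hourly/index.js',
  },
  output: {
    filename: '[name].[contenthash].js',
    path: path.resolve(__dirname, 'dist'),
  },
};
```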

We were able to reduce our page weight even more compared to the Beta site. Reducing page weight had an overall impact on page load times. The graph below shows how speed index decreased.

Note: Some data was lost between January and October of 2019.

This architecture is now our foundation for any and all pages on weather.com.

Conclusion

weather.com was not transformed overnight and it took a lot of work to get where we are today. Adding Node.js to our ecosystem required some amount of trial and error.

We achieved success by understanding our issues, collecting metrics, and implementing and then reimplementing solutions. Our journey was an evolution. Not only was it a change to our back end, but we had to be smarter on the front end to achieve the best performance. Changing our deployment strategy and infrastructure allowed us to achieve multiple best practices, reduce downtimes, and improve overall system stability. JavaScript being used in both the back end and front end improved developer velocity.

As we continue to architect, evolve, and expand our solution, we are always looking for ways to improve. Check out weather.com on your desktop, or for our newer/more performant version, check out our mobile web version on your mobile device.

New Node.js Training Course Supports Developers in their Certification, Technical and Career Goals


Last October, the OpenJS Foundation, in partnership with The Linux Foundation, released two Node.js certification exams to better support Node.js developers by enabling them to showcase their skills with the JavaScript runtime. Today, we are thrilled to unveil the next phase of the OpenJS certification and training program with a new training course, LFW211 – Node.js Application Development.

LFW211 is a vendor-neutral training course geared toward developers who wish to master and demonstrate creating Node.js applications. The course trains developers in depth on a broad range of Node.js capabilities, equipping them with rigorous foundational skills and knowledge that will translate to building any kind of Node.js application or library.

By the end of the course, participants:

  • Understand foundational essentials for Node.js and JavaScript development
  • Become skillful with Node.js debugging practices and tools
  • Efficiently interact at a high level with I/O, binary data and system metadata
  • Attain proficiency in creating and consuming ecosystem/innersource libraries

The Node.js Application Development course will also help prepare those planning to take the OpenJS Node.js Application Developer certification exam. A bundled offering including access to both the training course and certification exam is also available.

Thank you to David Clements, who developed this key training. David is a Principal Architect, public speaker, author of the Node Cookbook, and open source creator specializing in Node.js and browser JavaScript. He is also one of the technical leads and authors of the official OpenJS Node.js Application Developer Certification and OpenJS Node.js Services Developer Certification.

Node.js is one of the most popular JavaScript runtimes in the world. It powers hundreds of thousands of websites, including some of the most popular, from companies like Google, IBM, Microsoft and Netflix. Individual developers and enterprises use Node.js to power many of their most important web applications, making it essential to maintain a stable pool of qualified talent.

Ready to take the training? The course is available now. The $299 course fee – or $499 for a bundled offering of both the course and related certification exam – provides unlimited access to the course for one year to all content and labs. This course and exam, in addition to all Linux Foundation training courses and certification exams, are discounted 30% through May 31 by using code ANYWHERE30. Interested individuals may enroll here.

What’s New With Node? Interview With Bethany Griggs, Node.js Technical Steering Committee


In a recent interview with DZone, Bethany Griggs, Node.js Technical Steering Committee member and Open Source Engineer at IBM, gave some insight into the recent Node.js v14 release as well as the latest in Node.js overall. Topics covered include changes with the project pertaining to contributor onboarding, getting started in Node.js, challenges, and highlights of Node.js v14. Node.js is an impact project of the OpenJS Foundation.

Bethany has been a Node Core Collaborator for over two years. She contributes to the open-source Node.js runtime and is a member of the Node.js Release Working Group where she is involved with auditing commits for the long-term support (LTS) release lines and the creation of releases. 

Bethany presents and runs workshops at international conferences, including at NodeSummit, London Node.js User Group, and NodeConfEU.

Read the full article here: https://dzone.com/articles/interview-with-bethany-griggs

Testing My Knowledge with OpenJS Certification, an interview with Daijiro Wachi


“Unlearning and Repetition”

An Interview with Daijiro Wachi, Node.js Core Collaborator

Daijiro is a Node.js Core Collaborator and can be found @watilde on Twitter.

Along with Node, he contributes to other repositories related to the JavaScript ecosystem such as npm and URL Standard. He helps organize JSConf JP in Tokyo and presents at conferences around the world including Tokyo, Amsterdam and San Francisco. He works as a Digital Consultant at a management consulting firm and accelerates “Digital Transformation” for new businesses, covering core business optimization for clients around planning, implementation and capability building.

In his spare time, Daijiro helps kids and beginners learn coding at CoderDojo and NodeSchool.

Why did you want to get certified?

I chose certification as a way to continually maintain the big picture of fast-growing software development in Node.js. To remain a strong problem solver, I need to continue to acquire both knowledge and experience, and my way of acquiring knowledge has mostly been through real experience. This approach gives me deep knowledge related to the project. However, that means the expertise I can earn depends on the scope of the project. As both the JSNAD and JSNSD exams require a wide range of intermediate-level knowledge and hands-on skill to answer the questions, I thought the exams would be the ideal way to validate and maintain my breadth of knowledge.

What's the value for you personally and professionally in getting certified?

Personally, passing the exam helped me to expand my network within the community. After taking the exam, I realized that the exam should provide translated versions to be open for everyone. Then I had a chance to ask the author of the exam, David Clements, to introduce me to Robin Ginn, and I got a chance to deliver the request. During the process, I asked how many people were interested in Japanese to investigate demand in Japan to support my request, and I received questions from people who were interested in the exam and warm words to support the request. It’s always good to connect with people who are interested in the same thing. I hope we can accelerate the translation together.

As a professional, I think there were two benefits. One is unlearning. I thought I knew most of the Node.js API and how to use it, but I couldn’t get 100% marks in the exam. Then I decided to reread documents and blogs and set up my own server to practice. That was something I couldn’t touch while working on a big framework. The second is repetition. As this exam is still new and many people are interested, I have had the chance to explain the certification exam both internally and externally. Through that communication, people realize that I am a Node.js professional and what that means.

Would you recommend other Node.js developers taking the certification?

Yes, I definitely recommend it. One of the pros of the exam is how comprehensively its scope covers intermediate-level Node.js knowledge. Therefore, I think it can be recommended to anyone at any level (beginner, intermediate, advanced). Beginners can use it as their learning goal, and can use the exam’s scope to order their studies, which is always difficult to design. Intermediates can use it to maintain and update their knowledge. Working in the same codebase for a long time often leaves them with fewer chances to learn something outside the scope of the project, so they tend to forget the things they learned in the past. Advanced developers can use it to understand the range to be covered when coaching beginner- and intermediate-level developers. They can see how things that are natural for them are challenging for others.

On the other hand, there are some cons. The big blocker would be the price.[1] I purchased it cheaply using a Cyber Monday sale. If you can negotiate with your employer, I recommend talking to your boss about the above benefits and having them become a supporter.

How did you prepare? What advice would you give someone considering taking the exam?

The preparation was challenging, as there was very little information about the exam on the web yet, and the official website only mentions Domains & Competencies. The online exam format was not even familiar to me, and the process seemed to be different from the other online exams that I had taken before. So I used the first attempt to experience how the exam works and find out what the actual scope is. Then I immediately retook it using the free retake, after understanding how to answer properly. The content of the questions themselves was fine for me.

What do you want to do next?

As Node.js has opened my door to the world, I want to give back the same to people in the community. Promoting this exam with problem-solving would be one of the ways to achieve that. Currently, I think that the value of this exam has not been promoted enough, at least in Japan, due to lack of awareness, language barrier, price, and so on. As a volunteer, I will try to contribute to The OpenJS Foundation to solve the problems from my perspective, and I hope this can help nurture more software engineers in the world. Don’t you think it is very exciting to be involved in a kind of “Digital Transformation” at this scale?

[1] Editor’s note: a 30% discount on Node.js Certification has been extended through May 31, 2020. Use promo code ANYWHERE30 to get your discount.

OpenJS Node.js Application Developer (JSNAD)

The OpenJS Node.js Application Developer certification is ideal for the Node.js developer with at least two years of experience working with Node.js.

Get more information and enroll »

OpenJS Node.js Services Developer (JSNSD)

The OpenJS Node.js Services Developer certification is for the Node.js developer with at least two years of experience creating RESTful servers and services with Node.js.

Get more information and enroll »

Project Update: Node.js version 14 available now


This blog was written by Michael Dawson and Bethany Griggs, with additional contributions from the Node.js Community Committee and the Node.js Technical Steering Committee. This post initially appeared on the Node.js Blog. Node.js is an Impact Project of the OpenJS Foundation.

We’re excited to announce that Node.js 14 was released today! The highlights in this release include improved diagnostics, an upgrade of V8, an experimental Async Local Storage API, hardening of the streams APIs, removal of the Experimental Modules warning, and the removal of some long deprecated APIs.

Node.js 14 replaces Node.js 13 as our current release line. As per the release schedule (https://github.com/nodejs/Release#release-schedule), Node.js 14 will be the `Current` release for the next 6 months, and then promoted to Long-term Support (LTS) in October 2020. As always, corporate users should wait to upgrade their production deployments until October when Node.js is promoted to LTS. However, now is the best time to start testing applications with Node.js 14, and try out new features.

As a reminder — both Node.js 12 and Node.js 10 will remain in long-term support until April 2022 and April 2021 respectively (more details on the LTS strategy here).

Get started now! Learn how to download the latest version here: https://nodejs.org/en/download/current/

Before we dive into the features highlighted for this release, it’s important to note that new features added to the master branch flow quickly into the current release. This means that significant features become available in minor releases without too much fanfare. We’d like to take this opportunity to highlight some of those in the Node.js 14 release, even though they may already have been backported to earlier releases.

Diagnostic Report goes Stable

The diagnostic report will be released as a stable feature in Node.js 14 (it was added as an experimental feature in Node.js 12). This is an important step in the ongoing work within the project to improve and build up the diagnostics available when using Node.js and the ease with which they can be used, with much of this work pushed forward by the Node.js Diagnostics Working Group.

The diagnostic report feature allows you to generate a report on demand or when certain events occur. This report contains information that can be useful in diagnosing problems in production, including crashes, slow performance, memory leaks, high CPU usage, unexpected errors and more. For more information about the diagnostic report feature, see https://medium.com/the-node-js-collection/easily-identify-problems-in-node-js-applications-with-diagnostic-report-dc82370d8029. As a stable feature, there is one less command-line option needed to enable diagnostic reports, and it should be easier for users to enable them in production environments.
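For example, a report can be generated programmatically or via command-line flags. A minimal sketch, assuming Node.js 14 where the feature is stable:

```js
// Write a diagnostic report on demand; the filename argument is
// optional (a timestamped name is generated if it's omitted).
process.report.writeReport('./report.json');
```

Reports can also be triggered automatically, for example with `node --report-uncaught-exception --report-on-signal app.js`.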

V8 upgraded to V8 8.1

As always a new version of the V8 JavaScript engine brings performance tweaks and improvements as well as keeping Node.js up with the ongoing improvements in the language and runtime. This time we also have some naming fun with it being version 8 of V8 (“V8 of V8”).

Highlights of the new JavaScript features include (a short sketch follows the list):

  • Optional Chaining — MDN
  • Nullish Coalescing — MDN
  • Intl.DisplayNames  — MDN
  • Enables calendar and numberingSystem options for Intl.DateTimeFormat — MDN
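Here’s that sketch; the sample values are hypothetical, but the syntax and APIs are as shipped in V8 8.1:

```js
const report = { location: { city: 'Bristol' } };
const config = { retries: 0 };

// Optional chaining: short-circuits to undefined instead of throwing
// when an intermediate property is missing.
const city = report?.location?.city; // 'Bristol'
const zip = report?.address?.zip;    // undefined, no TypeError

// Nullish coalescing: falls back only on null/undefined, so a
// legitimate 0 or '' is preserved (unlike ||).
const retries = config.retries ?? 3; // 0, not 3

// Intl.DisplayNames: localized display names for regions, languages, etc.
const regions = new Intl.DisplayNames(['en'], { type: 'region' });
console.log(regions.of('GB')); // 'United Kingdom'

// Intl.DateTimeFormat now accepts calendar and numberingSystem options.
const fmt = new Intl.DateTimeFormat('en-US', {
  calendar: 'gregory',
  numberingSystem: 'latn',
});
console.log(fmt.format(new Date()));
```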

For more information about the new features in V8, check out the V8 blog: https://v8.dev/blog.

Experimental Async Local Storage API

The project has been working on APIs to help manage context across asynchronous calls over a number of releases. The experimental Async Hooks API was introduced in earlier versions as part of this work. One of the key use cases for Async Hooks was Async Local Storage (also referred to as Continuation Local Storage). A number of npm modules have provided APIs to address this need; however, over the years these have been tricky to maintain outside of Node.js core, and the project reached a consensus that exploring having Node.js provide an API would make sense. The 14.x release brings an experimental AsyncLocalStorage API (which was also backported into 13.10): https://nodejs.org/api/async_hooks.html#async_hooks_class_asynclocalstorage. We are looking for the community to try out this API and give us feedback on the abstraction model, API interface, use-case coverage, functional stability, naming, documentation, etc. so that we can work on getting it out of experimental in later releases. The best way to provide feedback is to open an issue in the diagnostics repository (https://github.com/nodejs/diagnostics/issues) with a title along the lines of “Experience report with AsyncLocalStorage API”.
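To give a feel for the API, here’s a minimal sketch (assuming Node.js 14 or 13.10+) that tags every log line with the id of the HTTP request that produced it; the store shape and the `log` helper are illustrative, not part of the API:

```js
const http = require('http');
const { AsyncLocalStorage } = require('async_hooks');

const als = new AsyncLocalStorage();
let nextId = 0;

// Hypothetical logging helper: reads the current request's store,
// no matter how deep in the async call chain we are.
function log(message) {
  const store = als.getStore();
  console.log(`[request ${store ? store.id : '-'}] ${message}`);
}

http.createServer((req, res) => {
  // Everything (sync or async) run inside this callback sees this store.
  als.run({ id: ++nextId }, () => {
    log('request received');
    setImmediate(() => {
      log('still the same request context'); // same id as above
      res.end('ok');
    });
  });
}).listen(3000);
```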

Streams

This release includes a number of changes marked as SemVer major in the Node.js Streams implementation. These changes are intended to improve consistency across the Streams APIs to remove ambiguity and streamline behaviors across the various parts of Node.js core. As an example, http.OutgoingMessage is similar to stream.Writable and net.Socket behaves exactly like stream.Duplex. A notable change is that the `autoDestroy` option now defaults to true, making the stream always call `_destroy` after ending. While we don’t believe these SemVer major changes will affect most applications, as they only change edge cases, if you rely heavily on Streams it would be good to test while Node.js 14 is the current release so that your code is ready for when Node.js 14 becomes LTS in October 2020.
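As an illustrative sketch of the `autoDestroy` change (assuming Node.js 14 defaults), the `destroy` hook below now runs automatically once the stream finishes; on earlier release lines it would only run on an explicit `destroy()` call:

```js
const { Writable } = require('stream');

const sink = new Writable({
  write(chunk, encoding, callback) {
    // Pretend to persist the chunk, then signal completion.
    callback();
  },
  destroy(err, callback) {
    // With autoDestroy defaulting to true, this cleanup hook runs
    // automatically after end(), not only on explicit destroy() calls.
    console.log('stream destroyed, resources released');
    callback(err);
  },
});

sink.end('last chunk');
```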

Experimental Web Assembly System Interface

Packages written in WebAssembly for Node.js bring the opportunity for better performance and cross-platform support for certain use cases. The 14.x release includes an experimental implementation of the WebAssembly System Interface (WASI) to help support these use cases. While not new in Node.js 14, this is noteworthy as WASI has the potential to significantly simplify the native modules experience. You can read more about it in the API docs: https://nodejs.org/api/wasi.html.
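For a sense of the API’s shape, here’s a minimal sketch along the lines of the API docs; `demo.wasm` and the `./sandbox` directory are hypothetical, and the script must be run with the `--experimental-wasi-unstable-preview1` flag:

```js
'use strict';
const fs = require('fs');
const { WASI } = require('wasi');

const wasi = new WASI({
  args: process.argv,
  env: process.env,
  // Map a virtual path inside the sandbox to a real directory (hypothetical).
  preopens: { '/sandbox': './sandbox' },
});

// WASI modules import their system-interface functions from this namespace.
const importObject = { wasi_snapshot_preview1: wasi.wasiImport };

(async () => {
  const wasm = await WebAssembly.compile(fs.readFileSync('./demo.wasm'));
  const instance = await WebAssembly.instantiate(wasm, importObject);
  wasi.start(instance); // runs the module's _start() entry point
})();
```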

Removal of Experimental Modules Warning

In Node.js 13 we removed the need to include the `--experimental-modules` flag, but when running EcmaScript modules in Node.js, this would still result in the warning `ExperimentalWarning: The ESM module loader is experimental.`

As of Node.js 14 there is no longer this warning when using ESM in Node.js. However, the ESM implementation in Node.js remains experimental. As per our stability index: “The feature is not subject to Semantic Versioning rules. Non-backward compatible changes or removal may occur in any future release.” Users should be cautious when using the feature in production environments.

Please keep in mind that the implementation of ESM in Node.js differs from the developer experience you might be familiar with. Most transpilation workflows support features such as optional file extensions or JSON modules that the Node.js ESM implementation does not. It is highly likely that modules from transpiled environments will require a certain degree of refactoring to work in Node.js. It is worth mentioning that many of our design decisions were made with two primary goals: spec compliance and web compatibility. It is our belief that the current implementation offers a future-proof model for authoring ESM modules that paves the path to Universal JavaScript. Please read more in our documentation.
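As a small sketch of one such difference (assuming Node.js 14 with `"type": "module"` set in package.json): relative imports must spell out the file extension, unlike most transpiler workflows. `./utils.js` here is a hypothetical local module:

```js
// index.js -- treated as an ES module because package.json
// contains { "type": "module" } (a .mjs extension also works).
import { readFileSync } from 'fs';

// The Node.js ESM loader requires the extension: a bare './utils'
// would fail to resolve, even though transpilers usually accept it.
import { greet } from './utils.js'; // hypothetical local module

greet(readFileSync('./package.json', 'utf8'));
```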

The ESM implementation in Node.js is still experimental but we do believe that we are getting very close to being able to call ESM in Node.js “stable”. Removing the warning is a huge step in that direction.

New compiler and platform minimums

Node.js provides pre-built binaries for a number of different platforms. For each major release, the minimum toolchains are assessed and raised where appropriate.

This release coincides with us moving all of our macOS binaries to be compiled on macOS 10.15 (Catalina) with Xcode 11 to support package notarization. As binaries are still being compiled to support the respective compile targets for the release lines, we do not anticipate this having a negative impact on Node.js users on older versions of macOS. For Node.js 14, we’ve bumped the minimum macOS target version to macOS 10.13 (High Sierra).

On our Linux based platforms, for Node.js 14 the minimum GCC level remains at GCC 6, however, we plan to build/release the binaries for some of the platforms with GCC 8.

Node.js 14 will also not run on End-of-Life Windows distributions.

Further details are available in the Node.js BUILDING.md.

Call to action

For the next 6 months, while it is in the ‘Current’ phase, Node.js 14 will receive the most new features that are contributed to Node.js. That makes this release line perfect for trying out the latest features, testing the compatibility of your project with the latest Node.js updates, and giving us feedback so that the release is ready to transition to LTS in October.

To download, visit: https://nodejs.org/en/download/current/

Thank you!

We’d like to use this opportunity to say a big thank you to all the contributors and Node.js collaborators that made this release come together. We’d also like to thank the Node.js Build Working Group for ensuring we have the infrastructure to create and test releases and making the necessary upgrades to our toolchains for Node.js 14. 

Maintainers Should Consider Following Node.js’ Release Schedule


This blog was written by Benjamin Coe. Ben works on the open-source libraries yargs, nyc, and c8, and is a core collaborator on Node.js. He works on the client libraries team at Google. This piece originally appeared on the Node.js Collection. Node.js is an impact project of the OpenJS Foundation.

tl;dr: Node.js has a tried and true release schedule, supporting LTS versions for 30 months. It offers significant benefits to the community for library maintainers to follow this same schedule:

  • ensuring the ability to take security patches.
  • reducing the burden on maintainers.
  • allowing module authors to take advantage of new platform features sooner.

My opinion of what Node.js versions library maintainers should aim to support has evolved over the years. Let me explain why…

The JavaScript ecosystem in 2014

I joined npm, Inc in April 2014. During this period, releases of Node.js had stalled. Node.js v0.10.x was released in April 2013, and Node.js v0.12.x wouldn’t be released until February 2015.

At the same time, the npm package registry was going through growing pains (see: “Outage Postmortem”, “Four hours of partial outage”, etc.).

The state of Node.js and npm in 2014 had side effects on how folks thought about writing libraries: maintainers didn’t need to put mental overhead into deciding what Node.js versions they supported (for years, the answer was 0.10.x); partially owing to npm’s instability, and partially owing to frontend communities not having fully embraced npm for distribution, package dependency trees were smaller.

Small building blocks, like mkdirp, still represented a significant portion of the registry in 2014.

Things would change in the intervening six years…

The JavaScript ecosystem today

In February of 2015, motivated by the io.js fork, the Node.js Foundation was announced. In September of that same year, Node.js v4.0.0 was released. Node.js v4.0.0 merged the io.js and Node.js projects, unblocked the release logjam, and introduced the 30-month LTS cycle advocated in this article.

Since Node.js v4.0.0, maintainers have been able to count on a regular cadence of releases, pulling in new JavaScript language features (through V8 updates), additions to the standard library (like HTTP/2), and important bug and security fixes.

In parallel, during the period between 2014 and today, npm significantly improved stability, and the frontend community began to consolidate on npm for distribution. The side effect was that many more packages were being published to npm (numbers grew from 50,000 in 2014, to 700,000 in 2018). At the same time, dependency trees grew (the average number of dependencies in 2016 was 35.3, the average number of dependencies in 2018 was 86).

A library maintainer in 2020 has at least three versions of Node.js to think about (the Current, Active, and Maintenance versions). Also, on average, their libraries rely on an increasing number of dependencies… it can be a bit daunting!


A great way for maintainers to handle the increasing complexity of the JavaScript ecosystem is to consider adopting Node.js’ 30-month LTS schedule for their own libraries.

Here’s how adopting this schedule benefits both the library authors and the community…

Being able to take security patches

A security vulnerability was recently reported for the library minimist. minimist is a direct dependency of 14,000 libraries… it’s a transitive dependency of the universe.

The popular templating library Handlebars was bitten by this report through an indirect dependency (optimist). Handlebars was put in a position where it was difficult to silence this security warning without disrupting its users:

  • optimist, deprecated several years earlier, was pinned to an unpatched version (~0.0.1) of minimist.
  • Handlebars itself supported Node.js v0.4.7, making it a breaking change to update to yargs (optimist’s pirate-themed successor).

Although motivated by good intentions (“why not support as many environments as possible?”), when libraries support end-of-life versions of Node.js, it can ultimately end in disruptions for users. Maintainers find themselves bumping a major version as a fire drill, rather than as a scheduled update.

“Dropping support for old @nodejs release is a breaking change and it should be released in a major version.” — Matteo Collina

The wide adoption of Node.js’ LTS schedule for modules ensures that security patches can always be taken.

Reducing the burden on maintainers

Keeping dependencies up to date is a lot of work (my team at Google landed 1,483 pull requests updating dependencies last month), but it’s also important:

  • the closer to a dependency’s release you catch an unintended breakage, the more likely it will be quickly fixed or rolled back.
  • keeping dependencies fresh helps ensure that critical vulnerabilities and bug fixes are rolled out to your own users (this avoids the Handlebars/minimist issue discussed).

Tools like Dependabot and Renovate make sure updating dependencies isn’t a maintainer’s full-time job. However, if libraries don’t adhere to the same version support policy, it makes automation difficult. As an example, because of falling behind the scheduled deprecation of Node.js v8.x.x, the library yargs turned off automatic updates for decamelize (opening itself up to all the risks that go along with this).

A lot of open-source is made possible by the volunteer work of maintainers. I can’t think of many things less exciting than the constant auditing of the SemVer ranges advertised in the “engines” fields of dependencies.

The wide adoption of Node.js’ LTS schedule for modules creates consistency and reduces the maintainer burden around updating dependencies.

Helping to evolve the platform

For the last couple of years, I’ve been involved in the Node.js Tooling Group. We’ve advocated for a variety of API improvements for tooling authors, such as recursive directory creation, recursive directory removal, and Source Map support for stack traces.

In Node.js v8.4.0, http2 support was added. This addition is near and dear to my heart, since it allows Google’s client libraries (which rely on HTTP/2) to run natively on Node.js.

JavaScript itself is an evolving platform. Node.js regularly updates the V8 JavaScript engine, pulling in new language features, such as async iterators, async/await, and spread operators.

Keeping the Node.js core small will always be an architectural goal of the project. Node.js is, however, an evolving platform.

The wide adoption of Node.js’ LTS release schedule allows module authors to leverage exciting new features that are added to the platform.


What actions am I advocating that library maintainers take?

  1. When you release a new library, set the engines field to the oldest active LTS version of Node.js.
  2. When a Node.js version reaches end-of-life, even if your library is finished, bump a major version updating the engines field to the current oldest active LTS of Node.js.
  3. Consider throwing a helpful exception if your library is used on an unsupported Node.js version (see the sketch after this list).
  4. Consider documenting your version support policy (here’s an example of the one we wrote for my team).
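Here’s the sketch mentioned in point 3, covering points 1 and 3 together; the library name and supported version are hypothetical:

```js
// package.json (excerpt): advertise the oldest active LTS you support.
// {
//   "engines": { "node": ">=10.0.0" }
// }

// index.js: fail fast, with a helpful message, on unsupported versions.
const [major] = process.versions.node.split('.').map(Number);

if (major < 10) {
  throw new Error(
    `my-library requires Node.js 10 or later (running ${process.version}). ` +
      'See the version support policy in the README.'
  );
}
```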

The Node.js Package Maintenance Working Group is developing further recommendations related to library support policies, such as Cloud Native’s Long Term Support for Node.js Modules. This policy takes this article’s recommendations a step further and suggests that module maintainers support a major library version for the lifetime of the Node.js runtime it was released under. This means that, if your library releases v1.0.0 with support for Node v10.x.x, you would continue to backport security and bug fixes to v1.0.0 until Node v10.x.x reaches end-of-life. Committing to this level of support can be a great move for libraries targeting enterprise users (and, they even have a badge!).

Tracking Node.js’ release schedule saves you time, makes libraries more secure, allows you to use Node.js’ fancy new features, and benefits the community as a whole. Let’s get good about doing this together.

You Told Us: OpenJS Node Certification helps you stand out


The Node.js industry is mature and there is more demand for Node skills than there are qualified developers. OpenJS Node certifications create new opportunities for developers, and are an excellent way to improve your resume, and more quickly move to projects and jobs that are higher-paying and more fulfilling.

We asked a group of developers who took at least one of the certifications in the past three months about their experiences. Two major themes stand out.

  1. Yes, money is important, but effectively testing your own skills is important, too
  2. A vendor-neutral certification is better and the OpenJS format really challenges you

Nikita Galkin, Independent Contractor, JSFest Program Committee Member, Software Engineer, System Architect, Node.js Tech Speaker, GraphQL Advocate, talked about standing out:

Remote work in the global world has high competition between developers. With a certificate, you are more likely to receive an invitation to the job interview.

I received an offer for an interesting remote project with a good salary at the start of this year. At the end of the tech interview I was asked “what is this certification?” and “how complicated was it?”.

I do not think the certification was critical in deciding in my favour, but that was one of the things that made me different from other applicants.

Patrick Heneise, Software Consultant & CEO, Zentered.co, explained it was about testing his own skills:

I wanted to know, after almost 9 years of Node.js, where I stand. Having a Certified Node.js badge on my social profile is an easy sign for potential customers and clients that my knowledge has been tested.

I wasn’t looking for a new job or more money, so I can’t tell if it helped. But it definitely helped me to know my strengths and weaknesses, and I found out where I need to improve my own skills.

João Moura, Lead Technical Architect at Isobar Switzerland, likes how OpenJS certification tests differ from vendor-specific tests:

I think it’s a major benefit to have a certification in Node.js. These days, Node.js is becoming one of the major development infrastructures and I want to be part of that. The certification is one more step in that direction.

From the experience I have, the vendor-specific exams tend to have questions that are there just to show you how great their product is, for example:

“what can this product do?

A: something

B: another thing

C: awesome things

D: all of the above”

The answer is obviously D, and now you have a certificate on that product, congratulations :).

Since this is vendor-neutral, the exam is much more directed at seeing what you do to solve a specific problem. There is no selling material; the person taking the exam needs to really understand the problem and solve it in a good and quick way. And that, for me, is a lot more entertaining :).

Justin Dennison, Edutainer at ITProTV, says that completing tasks, instead of answering questions, was closer to a real development environment:

I enjoyed the test-taking experience as it was the first exam that I had taken that was simulated and practical in nature. Instead of answering multiple-choice questions (or any of the other types), I thought that completing the tasks was more akin to my experience in a development environment. The testing was thorough for Node.js as a whole. I feel that vendor-neutral testing allows for an alternative perspective to testing as well as a means to gather community driven requirements.

I took the certification to validate my own understanding and learning. I had been developing with Node.js and teaching Node.js for several years. However, as always, there are times when you question yourself: “Do I really know or understand what is going on?” Knowing that I was given tasks to complete and was able to complete those tasks using Node.js was a nice confirmation of my knowledge.

Amir Elemam, Independent Contractor, found the format of the test excellent; it resulted in more interesting relationships and projects at work:

The coding labs exam format was totally new for me. On the one hand, it was harder, because if I wasn’t pretty sure about something, to the point where I would know what to search for, there was no way to even attempt the question. But on the other hand, I was able to test the code I was writing, so in the end I had a very good sense of my performance. That’s good because the results don’t come out at the end of the exam, as they do with other certification exams I’ve taken.

The first thing was that it boosted my self-confidence; I no longer have any shred of doubt about my Node.js capabilities. Also, it improved how confident others were in my Node.js skills, which improved relationships, and I was given more challenges.

Find out more about the OpenJS certification programs, and sign up now!

30% off Node.js Certifications through April 30th


A Node.js Certification is a great way to showcase your abilities in the job market, and allow companies to find top developer talent — and now these exams are 30% off.

In October, the OpenJS Foundation announced the OpenJS Node.js Application Developer (JSNAD) and OpenJS Node.js Services Developer (JSNSD) certification programs, which are designed to demonstrate competence with Node.js.

Until April 30, 2020, these certification exams are 30% off the regular $300 per exam cost. Use coupon code ANYWHERE30 to save 30%.

You have up to a year to study and take the exam, but given that many in our community must stick close to home due to global health concerns, we wanted to lighten the load. Our exams are proctored virtually, so exam takers don’t have to travel to testing centers and can take exams from the comfort and safety of their own homes or workplaces, reducing the time and stress required.

About the Exams
OpenJS Node.js Application Developer (JSNAD)
The OpenJS Node.js Application Developer certification is ideal for the Node.js developer with at least two years of experience working with Node.js. For more information and how to enroll: https://training.linuxfoundation.org/certification/jsnad/

OpenJS Node.js Services Developer (JSNSD)
The OpenJS Node.js Services Developer certification is for the Node.js developer with at least two years of experience creating RESTful servers and services with Node.js. For more information and how to enroll: https://training.linuxfoundation.org/certification/jsnsd/

Both exams are two hours long, performance-based exams delivered via a browser-based terminal and each includes an automatic free retake (if needed). Exams are monitored by a live human proctor and are conducted online in English. Certification is valid for three years and includes a PDF Certificate and a digital badge. Corporate pricing for groups of five or more is available.

Register today to become a Node.js certified developer.


AMA Recap from the Node.js Technical Steering Committee


Members of the Technical Steering Committee (TSC) for Node.js gave an informative AMA, which you can watch below. Speakers include Michael Dawson (@mhdawson1), Matteo Collina (@matteocollina), Gireesh Punathil (@gireeshpunam), Gabriel Schulhof (@gabrielschulhof), Bethany Griggs (@BethGriggs_), Colin Ihrig (@cjihrig), and Myles Borins (@MylesBorins).

Full video here

In this AMA, the TSC took questions from the live chat and gave insight into how they got involved. Questions ranged from whether Node.js is good for image processing to thoughts on Deno. The TSC focused on a mix of preexisting and user-generated questions.

Beginning with suggestions on how to get involved with Node and ending on the same note, this AMA can inspire individuals to join Node.js.

Video by Section

Introductions (1:08)

How to Get Involved (4:48)

When To Update Your LTS? (13:45)

Is Node Good For Image Processing Applications? (34:45)

Upcoming 14.x (42:00)

What Do You Think About Deno? (44:20)

Yarn v2 Module (51:07)

Wrap Up (53:55)

Photo Credit: Myles Borins

Our next AMA will feature OpenJS project Node-RED! Submit your questions for the Node-RED team here!