Dojo turns 16, New Dojo 7 Delivers Suite of Reactive Material Widgets

By Announcement, Blog, Dojo

Dojo, an OpenJS Foundation Impact Project, just hit a new milestone. Dojo 7 is a progressive framework for modern web applications, built with TypeScript. The framework scales easily, supporting everything from simple static websites all the way up to enterprise-scale single-page reactive web applications.

Dojo 7 Widgets takes a step forward in out-of-the-box usability, adding more than 20 new widgets and a Material theme that developers can use to build feature-rich applications even faster. The new widgets are consistent, usable, and accessible, covering important website building blocks such as cards, passwords, forms, and more.

See the Dojo Widgets documentation and examples for more information. 

Dojo is no flavor-of-the-month JavaScript framework. The Dojo Toolkit was started in 2004, and a non-profit foundation was established to promote its adoption. In 2016, that foundation merged with the jQuery Foundation to become the JS Foundation. Then in March 2019, the JS Foundation merged with the Node.js Foundation to become the OpenJS Foundation. Dojo, therefore, gives the OpenJS Foundation organizational roots that predate the iPhone.

In 2018, modern Dojo arrived with Dojo 2, a complete rewrite and rethink of Dojo into its current form: a lean, modern, TypeScript-first, batteries-included progressive framework. Aligning with modern standards and best practices, the resulting distribution build of Dojo can include zero JavaScript code for statically built websites, or as little as 13KB of compressed JavaScript for full-featured web apps.

Dojo has been used widely over the years by companies such as Cisco, JP Morgan, Esri, Intuit, ADP, Fannie Mae, Daimler, and many more. Applications created with the Dojo Toolkit more than 10 years ago still work today with only minor adjustments and upgrades.

Modern Dojo is open source software available under the modified BSD license. Developers can try modern Dojo from CodeSandbox, or install Dojo via npm:

npm i @dojo/cli @dojo/cli-create-app -g

Create your first app:

dojo create app --name hello-world

Get started with widgets:

npm install @dojo/widgets 

Visit dojo.io for documentation, tutorials, cookbooks, and other materials. Read Dojo’s blog on this new release here.

How The Weather Company uses Node.js in production

By Announcement, Blog, ESLint, member blog, Node.js

Using Node.js improved site speed, performance, and scalability

This piece was written by Noel Madali and originally appeared on the IBM Developer Blog. IBM is a member of the OpenJS Foundation.

The Weather Company uses Node.js to power their weather.com website, a multinational weather information and news website available in 230+ locales and localized in about 60 languages. As an industry leader in audience reach and accuracy, weather.com delivers weather data, forecasts, observations, historical data, news articles, and video.

Because weather.com offers a location-based service that is used throughout the world, its infrastructure must support consistent uptime, speed, and precise data delivery. Scaling the solution to billions of unique locations has created multiple technical challenges and opportunities for the technical team. In this blog post, we cover some of the unique challenges we had to overcome when building weather.com and discuss how we ended up using Node.js to power our internationalized weather application.

Drupal ‘n Angular (DNA): The early days

In 2015, we were a Drupal ‘n Angular (DNA) shop. We unofficially pioneered this approach in the industry, marrying Drupal and Angular to build a modular, content-based website. We used Drupal as our CMS to control content and page configuration, and we used Angular to code front-end modules.

Front-end modules were small blocks of user interface that held data and some interactive elements. Content editors would arrange the modules to visually compose a page, and use Drupal to create and publish weather articles on the website.

DNA was successful in rapidly expanding the website’s content and giving editors the flexibility to create page content on the fly.

As our usage of DNA grew, we faced many technical issues, which ultimately boiled down to three main themes:

  • Poor performance
  • Instability
  • Slow developer velocity (the time it took developers to fix, enhance, and deploy code)

Poor performance

Our site suffered from poor performance, with sluggish load times and unreliable availability. This, in turn, directly impacted our ad revenue since a faster page translated into faster ad viewability and more revenue generation.

To address some of our performance concerns, we conducted different front-end experiments.

  • We analyzed and evaluated modules to determine what we could change. For example, we removed modules that were rarely used and rewrote others so they would not depend on giant JS libraries.
  • We evaluated our usage of a tag manager in reference to ad serving performance.
  • We lazy-loaded modules, excluding them from the initial page load to reduce the amount of JavaScript served to the client (see the sketch after this list).
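The article does not show how these modules were lazy-loaded, but a minimal sketch of the general technique, using a dynamic import to defer a hypothetical module until the user asks for it, looks like this:

// Hypothetical sketch: defer a heavy UI module until it is needed.
// 'radar-module.js' and renderRadar are illustrative names, not from the article.
const button = document.querySelector('#show-radar');

button.addEventListener('click', async () => {
  // The module's bundle is only fetched on first use, keeping it out of
  // the JavaScript served on the initial page load.
  const { renderRadar } = await import('./radar-module.js');
  renderRadar(document.querySelector('#radar-container'));
});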

Instability

Because of the fragile deployment process of using Drupal with Angular, our site suffered from too much downtime. Deploying meant taking the name of a git branch and entering it into a UI to release it into different environments. There was no real build process, only version control.

Ultimately, this led to many bad practices that impacted developers, including a lack of version control methodology, non-reproducible builds, and the like.

Slower developer velocity

The majority of our developers had front-end experience, but very few of them were knowledgeable about the inner workings of Drupal and PHP. As such, features and bug fixes related to PHP were not addressed as quickly due to knowledge gaps.

Large deployments contributed to slower velocity as well as stability issues, where small changes could break the entire site. Since a deployment covered the entire codebase (Drupal, Drupal plugins/modules, front-end code, PHP scripts, etc.), small code changes in a release could easily be overlooked and not properly tested, breaking the deployment.

Overall, while we had a few quick wins with DNA, the constant regressions due to the setup forced us to consider alternative paths for our architecture.

Rethinking our architecture to include Node.js

Our first foray into using Node.js was a one-off project for creating a lite experience for weather.com that was completely server-side rendered and had minimal JavaScript. The audience had limited bandwidth and minimal device capabilities (for example, low-end smartphones using Facebook’s Free Basics).

Stakeholders were happy with the lite experience, commenting on the nearly instantaneous page loads. Analyzing this proof-of-concept was important in determining our next steps in our architectural overhaul.

Differing from DNA, the lite experience:

  • Rendered pages entirely on the server
  • Kept the front-end footprint under 30KB (virtually no JavaScript, little CSS, few images)
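The article does not include the lite experience's code, but a minimal sketch of the idea, a Node.js server that returns a fully rendered HTML page with no client-side JavaScript, might look like the following (the location and forecast values are illustrative placeholders):

const http = require('http');

// Render a complete HTML page on the server; the client receives
// finished markup and needs no JavaScript to display it.
function renderPage(location, forecast) {
  return `<!doctype html>
<html>
  <head><title>Weather for ${location}</title></head>
  <body>
    <h1>${location}</h1>
    <p>Current conditions: ${forecast}</p>
  </body>
</html>`;
}

http.createServer((req, res) => {
  // In a real service, the forecast would come from a weather data API.
  res.writeHead(200, { 'Content-Type': 'text/html; charset=utf-8' });
  res.end(renderPage('Austin, TX', '72°F and sunny'));
}).listen(3000);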

We used what we learned from the lite experience to help us serve our website more performantly. This started with rethinking our DNA architecture.

Metrics to measure success

Before we worked on a new architecture, we had to show our business that a re-architecture was needed. The first thing we had to determine was what to measure to show success.

We consulted with the Google Ad team to understand how exactly a high-performing webpage impacts business results. Google showed us proof that improving page speed increases ad viewability which translates to revenue.

With that in hand, each day we conducted tests across a set of pages to measure:

  • Speed index
  • Time to first interaction
  • Bytes transferred
  • Time to first ad call

We used a variety of tools to collect our metrics: WebPageTest, Lighthouse, and sitespeed.io.
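The article does not show the test harness itself, but as a sketch, Lighthouse (one of the tools named above) can be driven programmatically from Node.js to collect metrics like speed index on a schedule; the URL and audit choices below are illustrative:

const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function measure(url) {
  // Launch headless Chrome and run Lighthouse's performance audits against it.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,
    onlyCategories: ['performance'],
  });
  await chrome.kill();

  // Pull out a couple of the metrics tracked in the article.
  const { audits } = result.lhr;
  return {
    speedIndex: audits['speed-index'].numericValue,
    timeToInteractive: audits['interactive'].numericValue,
  };
}

measure('https://weather.com').then(console.log);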

As we compiled a list of these metrics, we were able to judge whether certain experiments were beneficial or not. We used our analysis to determine what needed to change in our architecture to make the site more successful.

While we intended to completely rewrite our DNA website, we acknowledged that we needed to stair-step our approach for experimenting with a newer architecture. Using the above methodology, we created a beta page and A/B tested it to verify its success.

From Shark Tank to a beta of our architecture

Recognizing the performance of our original Node.js proof of concept, we held a “Shark Tank” session where we presented and defended different ideal architectures. We evaluated whole frameworks or combinations of libraries like Angular, React, Redux, Ember, lodash, and more.

From this experiment, we collectively agreed to move from our monolithic architecture to a Node.js backend and newer React frontend. Our timeline for this migration was nine months to a year.

Ultimately, we decided to use a pattern of small JS libraries and tools, similar to that of a UNIX operating system’s tool chain of commands. This pattern gives us the flexibility to swap out one component from the whole application instead of having to refactor large amounts of code to include a new feature.

On the backend, we needed to decouple page creation and page serving. We kept Drupal as a CMS and created a way for documents to be published out to more scalable systems which can be read by other services. We followed the pattern of Backends for Frontends (BFF), which allowed us to decouple our page frontends and allow for more autonomy of our backend downstream systems. We use the documents published by the CMS to deliver pages with content (instead of the traditional method of the CMS monolith serving the pages).
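The article doesn't publish weather.com's BFF code, but as a hedged illustration of the pattern, a small Backend for Frontend might read documents the CMS has published to a scalable store and shape them for one specific frontend. Every name below (the endpoint, fields, and getPublishedDocument helper) is hypothetical:

const express = require('express');
const app = express();

// Stand-in for a read from the document store the CMS publishes to;
// in the real system this would query a scalable downstream service.
async function getPublishedDocument(id) {
  return {
    id,
    title: "Today's Forecast",
    modules: ['current-conditions', 'hourly-forecast'],
  };
}

// The BFF exposes exactly the shape this page frontend needs,
// decoupling it from the CMS and other downstream systems.
app.get('/pages/:id', async (req, res) => {
  const doc = await getPublishedDocument(req.params.id);
  res.json({ title: doc.title, modules: doc.modules });
});

app.listen(3000);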

Even though Drupal and PHP can render server-side, our developers were more familiar with JavaScript, so using Node.js to implement isomorphic (universal) rendering of the site increased our development velocity.


Developing with Node.js was an easy focus shift for our previous front-end oriented developers. Since the majority of our developers had a primarily JavaScript background, we stayed away from solutions that revolved around separate server-side languages for rendering.

Over time, we implemented and evolved our usage from our first project. After developing our first few pages, we decided to move away from ExpressJS to Koa to use newer JS standards like async/await. We started with pure React but switched to React-like Inferno.js.
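The article doesn't include the migration code, but a minimal sketch shows why Koa suits async/await: Koa middleware is just an async function, so awaited work reads top to bottom. The fetchForecast helper here is hypothetical:

const Koa = require('koa');
const app = new Koa();

// Hypothetical stand-in for a call to a weather data service.
async function fetchForecast(location) {
  return { summary: `Forecast for ${location || 'your area'}` };
}

// Koa middleware is an async function: awaited calls read top-to-bottom,
// with ordinary try/catch error handling and no callback nesting.
app.use(async (ctx) => {
  const forecast = await fetchForecast(ctx.query.location);
  ctx.type = 'html';
  ctx.body = `<h1>${forecast.summary}</h1>`;
});

app.listen(3000);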

After evaluating many different build systems (gulp, grunt, browserify, systemjs, etc), we decided to use Webpack to facilitate our build process. We saw Webpack’s growing maturity in a fast-paced ecosystem, as well as the pitfalls of its competitors (or lack thereof).

Webpack solved our core issue of DNA’s JS aggregation and minification. With a centralized build process, we could build JS code using a standardized module system, take advantage of the npm ecosystem, and minify the bundles (all during the build process and not during runtime).
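As an illustration rather than weather.com's actual configuration, a centralized Webpack build that consumes standardized modules and emits a minified bundle at build time can be this small (paths and names are illustrative):

// webpack.config.js
const path = require('path');

module.exports = {
  mode: 'production',            // minifies output at build time, not runtime
  entry: './src/index.js',       // standardized module entry point
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'app.[contenthash].js',
  },
};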

Moving from client-side to server-side rendering of the application improved our speed index and got information to the user faster. React helped us in this aspect of universal rendering: being able to share code on both the frontend and backend was crucial to us for server-side rendering and code reuse.
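A minimal sketch of that code sharing, with an illustrative component, shows the server half of universal rendering:

const React = require('react');
const { renderToString } = require('react-dom/server');

// A shared component: usable on the server for HTML generation
// and on the client for interactivity.
function Forecast({ location, summary }) {
  return React.createElement('div', null, `${location}: ${summary}`);
}

// On the server, produce markup ready to embed in the HTTP response.
const html = renderToString(
  React.createElement(Forecast, { location: 'Austin, TX', summary: 'Sunny' })
);
console.log(html);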

Our first launch of our beta page was a Single Page App (SPA). Traditionally, we had to render each page and location as a hit back to the origin server. With the SPA, we were able to reduce our hits back to the origin server and improve the speed of rendering the next view thanks to universal rendering.

The following image shows how much faster the webpage response was after the SPA was introduced.

[Image: webpage response times before and after the SPA was introduced]

As our solution included more Node.js, we were able to take advantage of a lot of the tooling associated with a Node.js ecosystem, including ESLint for linting, Jest for testing, and eventually Yarn for package management.

Linting and testing, as well as a more refined CI/CD pipeline, helped reduce bugs in production. This led to a more mature and stable platform as a whole, higher engineering velocity, and increased developer happiness.

Changing deployment strategies

Recognizing our problems with our DNA deployments, we knew we needed a better solution for delivering code to infrastructure. With our DNA setup, we used a managed system to deploy Drupal. For our new solution, we decided to take advantage of newer, container-based deployment and infrastructure methodologies.

By moving to Docker and Kubernetes, we achieved many best practices:

  • Separating out disparate pages into different services reduces failures
  • Building stateless services allows for less complexity, ease of testing, and scalability
  • Builds are repeatable (Docker images ensure the right artifacts are deployed and consistent)

Our Kubernetes deployment allowed us to be truly distributed across four regions and seven clusters, with dozens of services scaled from 3 to 100+ replicas running on 400+ worker nodes, all on IBM Cloud.

Addressing a familiar set of performance issues

After running a successful beta experiment, we continued down the path of migrating pages into our new architecture. Over time, some familiar issues cropped up:

  • Pages became heavier
  • Build times were slower
  • Developer velocity decreased

We had to evolve our architecture to address these issues.

Beta v2: Creating a more performant page

Our second evolution of the architecture was a renaissance (rebirth). We had to go back to basics and revisit our lite experience to see why it was successful. We analyzed our performance issues and concluded that the SPA had become a performance bottleneck. Although a SPA benefits second page visits, we came to understand that the majority of our users visit the website and leave once they get their information.

We designed and built the solution without a SPA, but kept React hydration in order to preserve code reuse across the server and client side. We paid more attention to tooling during development, ensuring that code coverage (the percentage of client JS delivered that is actually used) became more efficient.
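As a sketch of the client half (the component, element id, and window.__INITIAL_PROPS__ convention are all illustrative), hydration attaches React to the markup the server already sent instead of re-rendering it:

const React = require('react');
const ReactDOM = require('react-dom');

// The same shared component the server rendered to HTML.
function Forecast({ location, summary }) {
  return React.createElement('div', null, `${location}: ${summary}`);
}

// Hydrate: reuse the server-rendered markup in #root and attach event
// handlers, rather than rebuilding the DOM from scratch.
// window.__INITIAL_PROPS__ is an illustrative convention for passing
// the server's props to the client.
ReactDOM.hydrate(
  React.createElement(Forecast, window.__INITIAL_PROPS__),
  document.getElementById('root')
);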

Removing the SPA was also key to reducing build times. Since a page was no longer stitched together from a singular entry point, we split the Webpack builds so that individual pages could have their own set of JS and assets (see the sketch below).
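As an illustration of that split (the page names are hypothetical), the single SPA entry becomes a map of per-page entries, so Webpack emits an independent bundle for each page:

// webpack.config.js -- per-page entries instead of one SPA entry point
const path = require('path');

module.exports = {
  mode: 'production',
  entry: {
    today: './src/pages/today.js',
    hourly: './src/pages/hourly.js',
    radar: './src/pages/radar.js',
  },
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].[contenthash].js', // one bundle per page
  },
};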

We were able to reduce our page weight even more compared to the Beta site. Reducing page weight had an overall impact on page load times. The graph below shows how speed index decreased.

[Graph: speed index over time] Note: Some data was lost between January and October of 2019.

This architecture is now our foundation for any and all pages on weather.com.

Conclusion

weather.com was not transformed overnight and it took a lot of work to get where we are today. Adding Node.js to our ecosystem required some amount of trial and error.

We achieved success by understanding our issues, collecting metrics, and implementing and then reimplementing solutions. Our journey was an evolution. Not only was it a change to our back end, but we had to be smarter on the front end to achieve the best performance. Changing our deployment strategy and infrastructure allowed us to achieve multiple best practices, reduce downtimes, and improve overall system stability. JavaScript being used in both the back end and front end improved developer velocity.

As we continue to architect, evolve, and expand our solution, we are always looking for ways to improve. Check out weather.com on your desktop, or for our newer/more performant version, check out our mobile web version on your mobile device.

Introducing OpenJS Foundation Open Office Hours

By Announcement, Blog, Office Hours

This piece was written by Joe Sepi, OpenJS Foundation Cross Project Council Chair

Kai Cataldo from ESLint during a recent Office Hours.

Earlier this year, to help our community better understand ways to participate, as well as to provide hosted projects ways to showcase what they are working on, I started hosting bi-weekly Open Office Hours.

The goal of Office Hours is to give members of our community a place to ask questions, get guidance on onboarding, and learn more about other projects in the Foundation. It has also served as a place for current projects to get connected to the wider OpenJS Foundation community and share key learnings. 

So far, we’ve had Wes Todd from the Express project, Alexandr Tovmach from Node.js i18n, Saulo Nunes talking through ways to contribute to Node.js, and Kai Cataldo from ESLint. You can find all the previously recorded sessions and the upcoming schedule at github.com/openjs-foundation/open-office-hours.

Everyone is invited to attend.

How Can I Join?
These meetings take place every other Thursday at 10 am PT / 12 pm CT / 1 pm ET and are scheduled on the OpenJS Public Calendar. Here’s the Zoom link to join these sessions.

While everyone is encouraged to join the call and the initiative, if you are unable to attend a session but would like to get more involved or have more questions, please open an issue in the repo.

Let’s do more in open source together!

Project News: Electron releases a new version

By Announcement, Blog, Electron, Project Update

Congrats to the Electron team on their latest version release, Electron 9.0!

It includes upgrades to Chromium 83, V8 8.3, and Node.js 12.14. They’ve added several new API integrations for their spellchecker feature, enabled the PDF viewer, and much more!

Read about all the details on the Electron blog here.

Learn more about Electron and why it has joined the Foundation as an incubation project.

New Node.js Training Course Supports Developers in their Certification, Technical and Career Goals

By Announcement, Blog, Certification, Node.js

Last October, the OpenJS Foundation, in partnership with The Linux Foundation, released two Node.js certification exams to better support Node.js developers by letting them showcase their skills with the JavaScript runtime. Today, we are thrilled to unveil the next phase of the OpenJS certification and training program with a new training course, LFW211 – Node.js Application Development.

LFW211 is a vendor-neutral training course geared toward developers who wish to master and demonstrate creating Node.js applications. The course trains developers on a broad range of Node.js capabilities in depth, equipping them with rigorous foundational skills and knowledge that will translate to building any kind of Node.js application or library.

By the end of the course, participants:

  • Understand foundational essentials for Node.js and JavaScript development
  • Become skillful with Node.js debugging practices and tools
  • Efficiently interact at a high level with I/O, binary data and system metadata
  • Attain proficiency in creating and consuming ecosystem/innersource libraries

Node.js Application Development also will help prepare those planning to take the OpenJS Node.js Application Developer certification exam. A bundled offering including access to both the training course and certification exam is also available.

Thank you to David Clements, who developed this key training. David is a Principal Architect, public speaker, author of the Node Cookbook, and open source creator specializing in Node.js and browser JavaScript. He is also one of the technical leads and authors of the official OpenJS Node.js Application Developer Certification and OpenJS Node.js Services Developer Certification.

Node.js is one of the most popular JavaScript runtimes in the world. It powers hundreds of thousands of websites, including some of the most popular, from companies like Google, IBM, Microsoft, and Netflix. Individual developers and enterprises use Node.js to power many of their most important web applications, making it essential to maintain a stable pool of qualified talent.

Ready to take the training? The course is available now. The $299 course fee – or $499 for a bundled offering of both the course and the related certification exam – provides unlimited access to all course content and labs for one year. This course and exam, along with all other Linux Foundation training courses and certification exams, are discounted 30% through May 31 by using code ANYWHERE30. Interested individuals may enroll here.

Node-RED Creators AMA Recap

By AMA, Blog, Node-RED
Node-RED AMA participants answer community questions live.

The creators of Node-RED recently gave an informative Ask Me Anything (AMA) which you can watch below. Node-RED is a Growth Project at the OpenJS Foundation. Speakers include Nick O’Leary (@knolleary), Dave Conway-Jones (@ceejay), and John Walicki (@johnwalicki).

This AMA can help individuals interested in Node-RED get a better understanding of the flow-based programming tool. Using a combination of user-generated and preexisting questions, the discussion focuses heavily on the processes employed by the creators of Node-RED to optimize the tool.

The creators of Node-RED answered questions from the live chat, giving insight into how Node-RED is iterated and improved. Questions ranged from where Node-RED has gone in the last 7 years to whether or not Node-RED is a prototyping tool. 

Full Video Here

Video by Section 

  1. Introductions (0:00)
  2. What’s been going on the last 7 years? (2:21)
  3. Did you have use cases in mind? (4:46)
  4. What’s it been like to work with open source? (7:10)
  5. Why is Node-RED so popular in the IoT space? (9:30)
  6. Where else is Node-RED popular? (12:25)
  7. How do you answer the question “is it a prototyping tool?” (15:00)
  8. Where does Node-RED fit in the low-code programming world? (17:20)
  9. 2020 recap, what’s next? (20:00)
  10. New features in 1.1? (23:10)
  11. Flow change for nodes? (26:00)
  12. Thoughts about encryption? (28:40)
  13. Do you see Node-RED scaling? (31:50)
  14. Best practices for sharing readable flows (34:15)
  15. Do you have large applications and flows being created now? (37:15)
  16. What would you say to a developer considering Node-RED? (40:00)
  17. How can developers help? (41:25)
  18. Open is a mindset; how do you wade through forums and open source? (43:30)
  19. YouTube question: modeling in edge-constrained environments? (45:40)
  20. Node-RED + AI? (48:50)
  21. POV on containerization (50:30)
  22. Refreshing the Node-RED dashboard; thoughts on replacing the framework? (53:40)
  23. Node-RED conference? (57:45)
  24. Last thoughts? (59:25)

Our next AMA is with the Node.js Security Working Group on June 3 at 9 am PT. Submit your questions here.

What’s New With Node? Interview With Bethany Griggs, Node.js Technical Steering Committee

By Blog, In The News, Node.js
Bethany Griggs

In a recent interview with DZone, Bethany Griggs, Node.js Technical Steering Committee member and Open Source Engineer at IBM, gave some insight into the recent Node.js v14 release as well as the latest in Node.js overall. Topics covered include changes with the project pertaining to contributor onboarding, getting started in Node.js, challenges, and highlights of Node.js v14. Node.js is an Impact Project of the OpenJS Foundation.

Bethany has been a Node Core Collaborator for over two years. She contributes to the open-source Node.js runtime and is a member of the Node.js Release Working Group where she is involved with auditing commits for the long-term support (LTS) release lines and the creation of releases. 

Bethany presents and runs workshops at international conferences, including at NodeSummit, London Node.js User Group, and NodeConfEU.

Read the full article here: https://dzone.com/articles/interview-with-bethany-griggs

Project News: What’s new in ESLint v7.0.0

By Blog, ESLint, Project Update

ESLint is an open source JavaScript linting utility and an At-Large project at the OpenJS Foundation. 

Congrats to the ESLint team on their most recent release, v7.0.0! This new release brings updates including an improved developer experience, core rule changes, a new ESLint class, and much more.

ESLint v7.0.0 Highlights:

Dropping support for Node.js v8

Node.js 8 reached EOL in December 2019, and we are officially dropping support for it in this release.

Core rule changes

  1. The ten Node.js/CommonJS rules in core have been deprecated and moved to the eslint-plugin-node plugin.
  2. Several rules have been updated to recognize bigint literals and warn on more cases by default.
  3. eslint:recommended has been updated with a few new rules: no-dupe-else-if, no-import-assign, and no-setter-return.

Improved developer experience

  1. The default ignore patterns have been updated. ESLint will no longer ignore .eslintrc.js and bower_components/* by default. Additionally, it will now ignore nested node_modules directories by default.
  2. ESLint will now lint files with extensions other than .js if they are explicitly defined in overrides[].files – no need to use the --ext flag!
  3. ESLint now supports descriptions in directive comments, so things like disable comments can now be clearly documented (see the example after this list)!
  4. Additional validation has been added to the RuleTester class to improve testing custom rules in plugins.
  5. ESLint will now resolve plugins relative to the entry configuration file. This means that shared configuration files that are located outside the project can now be colocated with the plugins they require.
  6. Starting in ESLint v7, configuration files and ignore files passed to ESLint using the --config path/to/a-config and --ignore-path path/to/a-ignore CLI flags, respectively, will resolve from the current working directory rather than the file location. This allows users to utilize shared plugins without having to install them directly in their project.
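For example, a directive comment's description follows a "--" separator after the rule list; the rule choice and wording below are illustrative:

/* eslint-disable no-console -- console output is this CLI script's entire job */
console.log('build finished');
/* eslint-enable no-console */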

New ESLint class

  1. The CLIEngine class provides a synchronous API that is blocking the implementation of features such as parallel linting, supporting ES modules in shareable configs/parsers/plugins/formatters, and adding the ability to visually display the progress of linting runs. The new ESLint class provides an asynchronous API that ESLint core will use going forward (a minimal usage sketch follows). CLIEngine will remain in core for the foreseeable future but may be removed in a future major version.
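As a small sketch of the new class (the file pattern is illustrative), linting is now awaited rather than called synchronously:

const { ESLint } = require('eslint');

(async function main() {
  // The asynchronous ESLint class replaces the synchronous CLIEngine.
  const eslint = new ESLint();
  const results = await eslint.lintFiles(['src/**/*.js']);

  // Formatters are loaded asynchronously too.
  const formatter = await eslint.loadFormatter('stylish');
  console.log(formatter.format(results));
})();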

Check out the release notes for all updates here and the migration guide here.

For more information on ESLint and how to get involved go to https://eslint.org/.

Testing My Knowledge with OpenJS Certification, an interview with Daijiro Wachi

By Blog, Certification, Node.js

“Unlearning and Repetition”

An Interview with Daijiro Wachi, Node.js Core Collaborator

Daijiro is a Node.js Core Collaborator.

Along with Node, he contributes to other repositories in the JavaScript ecosystem, such as npm and the URL Standard. He helps organize JSConf JP in Tokyo and presents at conferences around the world, including in Tokyo, Amsterdam, and San Francisco. He works as a Digital Consultant at a management consulting firm, accelerating “Digital Transformation” for new businesses and covering core business optimization for clients around planning, implementation, and capability building.

In his spare time, Daijiro helps kids and beginners learn coding at CoderDojo and NodeSchool.

Why did you want to get certified?

I chose certification as a way to continually maintain the big picture of fast-growing software development in Node.js. To remain a strong problem solver, I need to keep acquiring both knowledge and experience, and my way of acquiring knowledge has mostly been through real experience. This approach gives me deep knowledge related to the project. However, that means the expertise I can earn depends on the scope of the project. As both the JSNAD and JSNSD exams require a wide range of intermediate-level knowledge and hands-on skill to answer the questions, I thought the exams would be an ideal way to validate and maintain my breadth of knowledge.

What's the value for you personally and professionally in getting certified?

Personally, passing the exam helped me expand my network within the community. After taking the exam, I realized that it should provide translated versions to be open to everyone. I then asked the exam’s author, David Clements, to introduce me to Robin Ginn, and I was able to deliver that request. Along the way, I asked how many people were interested in a Japanese version to gauge demand in Japan, and I received questions from people interested in the exam and warm words of support. It’s always good to connect with people who are interested in the same thing. I hope we can accelerate the translation together.

As a professional, I think there were two benefits. One is unlearning. I thought I knew most of the Node.js API and how to use it, but I couldn’t get 100% marks in the exam. So I decided to reread documents and blogs and set up my own server to practice. That was something I could not touch while working on a big framework. The second is repetition. As this exam is still new and many people are interested, I had the chance to explain the certification exam both internally and externally. Through that communication, people come to realize that I am a Node.js professional and what that means.

Would you recommend other Node.js developers taking the certification?

Yes, I definitely recommend it. One of the pros of the exam is its comprehensive scope for effectively learning intermediate-level Node.js knowledge. Therefore, I think it can be recommended to anyone at any level (beginner, intermediate, advanced). Beginners can use it as their learning goal; the exam’s scope gives an order to their studies, which is always difficult to design on your own. Intermediates can use it to maintain and update their knowledge. Working in the same codebase for a long time often leaves them with fewer chances to learn something outside the scope of the project, so they tend to forget things they learned in the past. Advanced developers can use it to understand the range to be covered when coaching beginner- and intermediate-level developers. They can see how things that are natural for them are challenging for others.

On the other hand, there are some cons. The big blocker would be the price.[1] I purchased it cheaply during a Cyber Monday sale. If you can negotiate with your employer, I recommend talking to your boss about the above benefits and having them become supporters.

How did you prepare? What advice would you give someone considering taking the exam?

The preparation was challenging, as there was still very little information about the exam on the web, and the official website only mentions Domains & Competencies. The online exam format was not familiar to me, and the process seemed different from the other online exams I had taken before. So I used the first attempt to experience how the exam works and find out what the actual scope is. Then I immediately retook it using the free retake, after understanding how to answer properly. The content of the questions themselves was fine for me.

What do you want to do next?

As Node.js has opened my door to the world, I want to give back the same to people in the community. Promoting this exam, and solving the problems around it, would be one way to achieve that. Currently, I think the value of this exam has not been promoted enough, at least in Japan, due to a lack of awareness, the language barrier, the price, and so on. As a volunteer, I will try to contribute to the OpenJS Foundation to solve these problems from my perspective, and I hope this can help nurture more software engineers in the world. Don’t you think it is very exciting to be involved in a kind of “Digital Transformation” at this scale?

[1] Editor’s note: a 30% discount on Node.js Certification has been extended through May 31, 2020. Use promo code ANYWHERE30 to get your discount.

OpenJS Node.js Application Developer (JSNAD)

The OpenJS Node.js Application Developer certification is ideal for the Node.js developer with at least two years of experience working with Node.js.

Get more information and enroll »

OpenJS Node.js Services Developer (JSNSD)

The OpenJS Node.js Services Developer certification is for the Node.js developer with at least two years of experience creating RESTful servers and services with Node.js.

Get more information and enroll »

OpenJS World Announces Full Schedule

By Announcement, Blog, OpenJS World

Join the open source JavaScript community at OpenJS Foundation’s free virtual global conference

The OpenJS Foundation is excited to announce the full schedule of keynote speakers, sessions and workshops for OpenJS World, the Foundation’s annual global conference. From June 23 to 24, developers, software architects, engineers, and other community members from OpenJS Foundation hosted projects such as AMP, Dojo, Electron, and Node.js will tune in to network, learn and collaborate. 

We will also use this time to celebrate the 25th anniversary of JavaScript. OpenJS World will showcase several key JavaScript contributors, many of whom will be leading JavaScript into the next 25 years.

Due to continuing COVID-19 safety concerns, OpenJS World 2020 will now take place as a free virtual experience on the same dates: June 23 – June 24, US Central Time. If you have already registered and paid, we will be in touch with you about your refund.

The conference will include inspiring keynotes, informative presentations, and hands-on workshops that are aimed to help the OpenJS community better understand the latest and greatest of JavaScript technologies. 

Today we are excited to announce the keynote speakers, sessions, and hands-on workshops for OpenJS World!

Keynote speakers

Session Highlights Include

  • Chronicles of the Node.js Ecosystem: The Consumer, The Author, and The Maintainer – Bethany Griggs, Open Source Engineer and Node.js TSC Member, IBM
  • Deno, a Secure Runtime for JavaScript and TypeScript – Ryan Dahl, Engineer, Deno Land
  • Fighting Impostor Syndrome with the Internet of Things – Tilde Thurium, Developer Evangelist, Twilio
  • From Streaming to Studio – The Evolution of Node.js at Netflix – Guilherme Hermeto, Senior Platform Engineer, Netflix
  • Hint, Hint!: Best Practices for Web Developers with webhint – Rachel Simone Weil, Edge DevTools Program Manager, Microsoft
  • Machine Learning for JavaScript Developers 101 – Jason Mayes, Senior Developer Advocate for TensorFlow.js, Google
  • User-Centric Testing for 2020: How Expedia Modernized its Testing for the Web – Tiffany Le-Nguyen, Software Development Engineer, Expedia Group

The conference covers a range of topics for developers and end-users alike including frameworks, security, serverless, diagnostics, education, IoT, AI, front-end engineering, and much more. 

Interested in participating online in OpenJS World? Register now

Also, sponsorships for this year’s event are available now. If you are interested in sponsoring, check out the event prospectus for details and benefits. 

For new and current contributors, maintainers, and collaborators to the Foundation, we are hosting the OpenJS Foundation Collaborator Summit on June 22, 25, and 26. This event is an ideal time for people interested in or working on projects to share, learn, and get to know each other. Learn more about registering for the OpenJS Collaborator Summit.

Thank you to the OpenJS World program committee for their tireless efforts in bringing in and selecting top tier keynote speakers and interesting and informative sessions. We are honored to work with such a dedicated and supportive community!