Dojo turns 16, New Dojo 7 Delivers Suite of Reactive Material Widgets

By Announcement, Blog, Dojo

Dojo, an OpenJS Foundation Impact Project, just hit a new milestone. Dojo 7 is a progressive framework for building modern web applications with TypeScript. The framework scales easily, supporting everything from simple static websites up to enterprise-scale single-page reactive web applications.

Dojo 7 Widgets takes a step forward in out-of-the-box usability, adding 20+ new widgets and a Material theme that developers can use to build feature-rich applications even faster. The new widgets are consistent, usable, and accessible, covering important website building blocks like cards, passwords, forms, and more.

See the Dojo Widgets documentation and examples for more information. 

Dojo is no flavor-of-the-month JavaScript framework. The Dojo Toolkit was started in 2004, with a non-profit foundation established to promote its adoption. In 2016, that foundation merged with the jQuery Foundation to become the JS Foundation. Then in March 2019 the JS Foundation merged with the Node.js Foundation to become the OpenJS Foundation. Dojo, therefore, gives the OpenJS Foundation organizational roots that predate the iPhone.

In 2018, modern Dojo arrived with Dojo 2, a complete rewrite and rethink of Dojo into its current form: a lean, modern, TypeScript-first, batteries-included progressive framework. Aligning with modern standards and best practices, the resulting distribution build of Dojo can include zero JavaScript code for statically built websites, or as little as 13KB of compressed JavaScript for full-featured web apps.

Dojo has been used widely over the years by companies such as Cisco, JP Morgan, Esri, Intuit, ADP, Fannie Mae, Daimler, and many more.  Applications created with the Dojo Toolkit more than 10 years ago still work today with only minor adjustments and upgrades.

Modern Dojo is open source software available under the modified BSD license. Developers can try modern Dojo from Code Sandbox, or install Dojo via npm:

npm i @dojo/cli @dojo/cli-create-app -g

Create your first app

dojo create app --name hello-world

Get started with widgets

npm install @dojo/widgets 

Visit dojo.io for documentation, tutorials, cookbooks, and other materials. Read Dojo’s blog on this new release here.

How The Weather Company uses Node.js in production

By Announcement, Blog, ESLint, member blog, Node.js

Using Node.js improved site speed, performance, and scalability

This piece was written by Noel Madali and originally appeared on the IBM Developer Blog. IBM is a member of the OpenJS Foundation.

The Weather Company uses Node.js to power their weather.com website, a multinational weather information and news website available in 230+ locales and localized in about 60 languages. As an industry leader in audience reach and accuracy, weather.com delivers weather data, forecasts, observations, historical data, news articles, and video.

Because weather.com offers a location-based service that is used throughout the world, its infrastructure must support consistent uptime, speed, and precise data delivery. Scaling the solution to billions of unique locations has created multiple technical challenges and opportunities for the technical team. In this blog post, we cover some of the unique challenges we had to overcome when building weather.com and discuss how we ended up using Node.js to power our internationalized weather application.

Drupal ‘n Angular (DNA): The early days

In 2015, we were a Drupal ‘n Angular (DNA) shop. We unofficially pioneered the industry by marrying Drupal and Angular together to build a modular, content-based website. We used Drupal as our CMS to control content and page configuration, and we used Angular to code front-end modules.

Front-end modules were small blocks of user interface with data and some interactive elements. Content editors would move modules around to visually create a page, and use Drupal to create articles about weather and publish them on the website.

DNA was successful in rapidly expanding the website’s content and giving editors the flexibility to create page content on the fly.

As our usage of DNA grew, we faced many technical issues which ultimately boiled down to three main themes:

  • Poor performance
  • Instability
  • Slower time for developers to fix, enhance, and deploy code (also known as velocity)

Poor performance

Our site suffered from poor performance, with sluggish load times and unreliable availability. This, in turn, directly impacted our ad revenue since a faster page translated into faster ad viewability and more revenue generation.

To address some of our performance concerns, we conducted different front-end experiments.

  • We analyzed and evaluated modules to determine what we could change. For example, we evaluated getting rid of some modules that were not used all the time or we rewrote modules so they wouldn’t use giant JS libraries.
  • We evaluated our usage of a tag manager in reference to ad serving performance.
  • We added lazy-loaded modules to remove the module on first load in order to reduce the amount of JavaScript served to the client.
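The lazy-loading experiment can be sketched in plain JavaScript. This is a hypothetical, stdlib-only illustration (not weather.com's code): a registry defers constructing a module until the page first asks for it, so the first load does less work.

```javascript
// Hypothetical sketch of the lazy-module idea: modules are registered
// up front as factories, but only constructed on first use.
const moduleRegistry = new Map();

function registerModule(name, factory) {
  moduleRegistry.set(name, { factory, instance: null });
}

function loadModule(name) {
  const entry = moduleRegistry.get(name);
  if (!entry) throw new Error(`unknown module: ${name}`);
  if (!entry.instance) entry.instance = entry.factory(); // created on demand
  return entry.instance;
}

// A heavy module is registered cheaply at startup...
registerModule('radar-map', () => ({ render: () => '<div>radar</div>' }));

// ...and only built when the user actually needs it (e.g. on scroll).
const radar = loadModule('radar-map');
```

Repeat calls return the cached instance, so deferral costs nothing after the first use.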

Instability

Because of the fragile deployment process of using Drupal with Angular, our site suffered from too much downtime. The deployment process was a matter of taking the name of a git branch and entering it into a UI to get released into different environments. There was no real build process, but only version control.

Ultimately, this led to many bad practices that impacted developers, including a lack of version control methodology, non-reproducible builds, and the like.

Slower developer velocity

The majority of our developers had front-end experience, but very few of them were knowledgeable about the inner workings of Drupal and PHP. As such, features and bug fixes related to PHP were not addressed as quickly due to knowledge gaps.

Large deployments contributed to slower velocity as well as stability issues, where small changes could break the entire site. Since a deployment was the entire codebase (Drupal, Drupal plugins/modules, front-end code, PHP scripts, etc), small code changes in a release could easily get overlooked and not be properly tested, breaking the deployment.

Overall, while we had a few quick wins with DNA, the constant regressions due to the setup forced us to consider alternative paths for our architecture.

Rethinking our architecture to include Node.js

Our first foray into using Node.js was a one-off project for creating a lite experience for weather.com that was completely server-side rendered and had minimal JavaScript. The audience had limited bandwidth and minimal device capabilities (for example, low-end smartphones using Facebook’s Free Basics).

Stakeholders were happy with the lite experience, commenting on the nearly instantaneous page loads. Analyzing this proof-of-concept was important in determining our next steps in our architectural overhaul.

Differing from DNA, the lite experience:

  • Rendered pages as server side only
  • Kept the front-end footprint under 30KB (virtually no JavaScript, little CSS, few images).

We used what we learned from the lite experience to rethink how we could serve our website more performantly. This started with rethinking our DNA architecture.

Metrics to measure success

Before we worked on a new architecture, we had to show our business that a re-architecture was needed. The first thing we had to determine was what to measure to show success.

We consulted with the Google Ad team to understand how exactly a high-performing webpage impacts business results. Google showed us proof that improving page speed increases ad viewability which translates to revenue.

With that in hand, each day we conducted tests across a set of pages to measure:

  • Speed index
  • Time to first interaction
  • Bytes transferred
  • Time to first ad call

We used a variety of tools to collect our metrics: WebPageTest, Lighthouse, and sitespeed.io.

As we compiled a list of these metrics, we were able to judge whether certain experiments were beneficial or not. We used our analysis to determine what needed to change in our architecture to make the site more successful.

While we intended to completely rewrite our DNA website, we acknowledged that we needed to stair step our approach for experimenting with a newer architecture. Using the above methodology, we created a beta page and A/B tested it to verify its success.

From Shark Tank to a beta of our architecture

Recognizing the performance of our original Node.js proof of concept, we held a “Shark Tank” session where we presented and defended different ideal architectures. We evaluated whole frameworks or combinations of libraries like Angular, React, Redux, Ember, lodash, and more.

From this experiment, we collectively agreed to move from our monolithic architecture to a Node.js backend and a newer React frontend. Our timeline for this migration was between nine months and a year.

Ultimately, we decided to use a pattern of small JS libraries and tools, similar to that of a UNIX operating system’s tool chain of commands. This pattern gives us the flexibility to swap out one component from the whole application instead of having to refactor large amounts of code to include a new feature.

On the backend, we needed to decouple page creation and page serving. We kept Drupal as a CMS and created a way for documents to be published out to more scalable systems which can be read by other services. We followed the pattern of Backends for Frontends (BFF), which allowed us to decouple our page frontends and allow for more autonomy of our backend downstream systems. We use the documents published by the CMS to deliver pages with content (instead of the traditional method of the CMS monolith serving the pages).
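The decoupling described above can be sketched as a small backend-for-frontend step. The shapes and names below are hypothetical, stdlib-only illustrations, not weather.com's actual code: the CMS publishes page documents, and a BFF assembles a document plus downstream data into a render-ready page model instead of the CMS serving pages itself.

```javascript
// Hypothetical BFF sketch: combine a published CMS document with data
// from downstream services into a model the frontend can render.
function buildPageModel(cmsDocument, downstreamData) {
  return {
    title: cmsDocument.title,
    modules: cmsDocument.modules.map((mod) => ({
      id: mod.id,
      type: mod.type,
      // each module receives only the slice of data it declares
      data: downstreamData[mod.dataKey] || null,
    })),
  };
}

// A published document and some downstream weather data (illustrative):
const doc = {
  title: "Today's Forecast",
  modules: [{ id: 'now', type: 'current-conditions', dataKey: 'current' }],
};
const page = buildPageModel(doc, { current: { tempF: 72 } });
```

Because the page model is assembled outside the CMS, the frontends and the downstream systems can evolve independently.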

Even though Drupal and PHP can render server-side, our developers were more familiar with JavaScript, so using Node.js to implement isomorphic (universal) rendering of the site increased our development velocity.


Developing with Node.js was an easy focus shift for our previous front-end oriented developers. Since the majority of our developers had a primarily JavaScript background, we stayed away from solutions that revolved around separate server-side languages for rendering.

Over time, we implemented and evolved our usage from our first project. After developing our first few pages, we decided to move away from ExpressJS to Koa to use newer JS standards like async/await. We started with pure React but switched to React-like Inferno.js.
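The draw of Koa was its async/await middleware model. The composition pattern at its core can be sketched in plain JavaScript; this is a stdlib-only illustration, not Koa's actual source:

```javascript
// Sketch of async middleware composition: each middleware can await
// the rest of the chain via next(), so cross-cutting concerns like
// timing wrap the handler naturally.
function compose(middleware) {
  return function run(ctx) {
    function dispatch(i) {
      const fn = middleware[i];
      if (!fn) return Promise.resolve();
      return Promise.resolve(fn(ctx, () => dispatch(i + 1)));
    }
    return dispatch(0);
  };
}

const app = compose([
  async (ctx, next) => {
    const start = Date.now();
    await next();                    // wait for downstream middleware
    ctx.responseTime = Date.now() - start;
  },
  async (ctx) => {
    ctx.body = `Weather for ${ctx.locale}`;
  },
]);

const ctx = { locale: 'en-US' };
app(ctx).then(() => console.log(ctx.body)); // prints "Weather for en-US"
```

Compared to callback-style middleware, errors and post-processing flow through ordinary try/catch and await, which is what made the newer JS standards attractive.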

After evaluating many different build systems (gulp, grunt, browserify, systemjs, etc), we decided to use Webpack to facilitate our build process. We saw Webpack’s growing maturity in a fast-paced ecosystem, as well as the pitfalls of its competitors (or lack thereof).

Webpack solved our core issue of DNA’s JS aggregation and minification. With a centralized build process, we could build JS code using a standardized module system, take advantage of the npm ecosystem, and minify the bundles (all during the build process and not during runtime).

Moving from client-side to server-side rendering of the application increased our speed index and got information to the user faster. React helped us in this aspect of universal rendering–being able to share code on both the frontend and backend was crucial to us for server-side rendering and code reuse.
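The code-sharing benefit can be illustrated with a stdlib-only sketch (component and data shapes are hypothetical, not weather.com's components): one pure render function produces markup on the server, and the same function re-renders on the client when data updates.

```javascript
// Hypothetical universal-rendering sketch: a pure function from data
// to markup. On the server its output is embedded in the HTML response;
// on the client the identical function re-renders on data updates.
function renderCurrentConditions({ city, tempF }) {
  return `<section class="current"><h2>${city}</h2><p>${tempF}&deg;F</p></section>`;
}

// Server side: embed the rendered markup in the page shell so the
// user sees content before any client JavaScript runs.
const serverHtml = `<body>${renderCurrentConditions({ city: 'Atlanta', tempF: 72 })}</body>`;
```

Keeping the render function pure is what lets it run unchanged in both environments.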

Our first launch of our beta page was a Single Page App (SPA). Traditionally, we had to render each page and location as a hit back to the origin server. With the SPA, we were able to reduce our hits back to the origin server and improve the speed of rendering the next view thanks to universal rendering.

The following image shows how much faster the webpage response was after the SPA was introduced.


As our solution included more Node.js, we were able to take advantage of a lot of the tooling associated with a Node.js ecosystem, including ESLint for linting, Jest for testing, and eventually Yarn for package management.

Linting and testing, as well as a more refined CI/CD pipeline, helped reduce bugs in production. This led to a more mature and stable platform as a whole, higher engineering velocity, and increased developer happiness.

Changing deployment strategies

Recognizing our problems with our DNA deployments, we knew we needed a better solution for delivering code to infrastructure. With our DNA setup, we used a managed system to deploy Drupal. For our new solution, we decided to take advantage of newer, container-based deployment and infrastructure methodologies.

By moving to Docker and Kubernetes, we achieved many best practices:

  • Separating out disparate pages into different services reduces failures
  • Building stateless services allows for less complexity, ease of testing, and scalability
  • Builds are repeatable (Docker images ensure the right artifacts are deployed and consistent)

Our Kubernetes deployment allowed us to be truly distributed across four regions and seven clusters, with dozens of services scaled from 3 to 100+ replicas running on 400+ worker nodes, all on IBM Cloud.

Addressing a familiar set of performance issues

After running a successful beta experiment, we continued down the path of migrating pages into our new architecture. Over time, some familiar issues cropped up:

  • Pages became heavier
  • Build times were slower
  • Developer velocity decreased

We had to evolve our architecture to address these issues.

Beta v2: Creating a more performant page

Our second evolution of the architecture was a renaissance. We went back to basics and revisited our lite experience to see why it was successful. We analyzed our performance issues and concluded that the SPA had become a performance bottleneck. Although a SPA benefits second-page visits, we came to understand that the majority of our users visit the website and leave once they get their information.

We designed and built the solution without a SPA, but kept React hydration in order to keep code reuse across the server and client-side. We paid more attention to the tooling during development by ensuring that code coverage (the percentage of JS client code used vs delivered) was more efficient.

Removing the SPA overall was key to reducing build times as well. Since a page was no longer stitched together from a singular entry point, we split the Webpack builds so that individual pages can have their own set of JS and assets.

We were able to reduce our page weight even more compared to the Beta site. Reducing page weight had an overall impact on page load times. The graph below shows how speed index decreased.

Note: Some data was lost between January and October of 2019.

This architecture is now our foundation for any and all pages on weather.com.

Conclusion

weather.com was not transformed overnight and it took a lot of work to get where we are today. Adding Node.js to our ecosystem required some amount of trial and error.

We achieved success by understanding our issues, collecting metrics, and implementing and then reimplementing solutions. Our journey was an evolution. Not only was it a change to our back end, but we had to be smarter on the front end to achieve the best performance. Changing our deployment strategy and infrastructure allowed us to achieve multiple best practices, reduce downtime, and improve overall system stability. Using JavaScript on both the back end and the front end improved developer velocity.

As we continue to architect, evolve, and expand our solution, we are always looking for ways to improve. Check out weather.com on your desktop, or for our newer/more performant version, check out our mobile web version on your mobile device.

Introducing OpenJS Foundation Open Office Hours

By Announcement, Blog, Office Hours

This piece was written by Joe Sepi, OpenJS Foundation Cross Project Council Chair

Kai Cataldo from ESLint during a recent Office Hours.

Earlier this year, to help our community better understand ways to participate, as well as to give hosted projects ways to showcase what they are working on, I started hosting bi-weekly Open Office Hours.

The goal of Office Hours is to give members of our community a place to ask questions, get guidance on onboarding, and learn more about other projects in the Foundation. It has also served as a place for current projects to get connected to the wider OpenJS Foundation community and share key learnings. 

So far, we’ve had Wes Todd from the Express project, Alexandr Tovmach from Node.js i18n, Saulo Nunes talking through ways to contribute to Node.js, and Kai Cataldo from ESLint. You can find all the previously recorded sessions and the upcoming schedule at github.com/openjs-foundation/open-office-hours

Everyone is invited to attend.

How Can I Join?
These meetings will take place every other Thursday at 10 am PT, 12 pm CT, 1 pm ET and are scheduled on the OpenJS Public Calendar. Here’s the zoom link to join these sessions. 

While everyone is encouraged to join the call and the initiative, if you are unable to attend a session but would like to get more involved or have more questions, please open an issue in the repo.

Let’s do more in open source together!

Project News: Electron releases a new version

By Announcement, Blog, Electron, Project Update

Congrats to the Electron team on their latest version release, Electron 9.0!

It includes upgrades to Chromium 83, V8 8.3, and Node.js 12.14. They’ve added several new API integrations for their spellchecker feature, enabled the built-in PDF viewer, and much more!

Read about all the details on the Electron blog here.

Learn more about Electron and why it has joined the Foundation as an incubation project.

New Node.js Training Course Supports Developers in their Certification, Technical and Career Goals

By Announcement, Blog, Certification, Node.js

Last October, the OpenJS Foundation, in partnership with The Linux Foundation, released two Node.js certification exams to better support Node.js developers by showcasing their skills with the JavaScript runtime. Today, we are thrilled to unveil the next phase of the OpenJS certification and training program with a new training course, LFW211 – Node.js Application Development.

LFW211 is a vendor-neutral training geared toward developers who wish to master and demonstrate creating Node.js applications. The course trains developers on a broad range of Node.js capabilities at depth, equipping them with rigorous foundational skills and knowledge that will translate to building any kind of Node.js application or library.

By the end of the course, participants:

  • Understand foundational essentials for Node.js and JavaScript development
  • Become skillful with Node.js debugging practices and tools
  • Efficiently interact at a high level with I/O, binary data and system metadata
  • Attain proficiency in creating and consuming ecosystem/innersource libraries

Node.js Application Development also will help prepare those planning to take the OpenJS Node.js Application Developer certification exam. A bundled offering including access to both the training course and certification exam is also available.

Thank you to David Clements, who developed this key training. David is a Principal Architect, public speaker, author of the Node Cookbook, and open source creator specializing in Node.js and browser JavaScript. He is also one of the technical leads and authors of the official OpenJS Node.js Application Developer Certification and OpenJS Node.js Services Developer Certification.

Node.js is one of the most popular JavaScript runtimes in the world. It powers hundreds of thousands of websites, including some of the most popular like Google, IBM, Microsoft, and Netflix. Individual developers and enterprises use Node.js to power many of their most important web applications, making it essential to maintain a stable pool of qualified talent.

Ready to take the training? The course is available now. The $299 course fee – or $499 for a bundled offering of both the course and related certification exam – provides unlimited access for one year to all course content and labs. This course and exam, in addition to all Linux Foundation training courses and certification exams, are discounted 30% through May 31 by using code ANYWHERE30. Interested individuals may enroll here.

OpenJS World Announces Full Schedule

By Announcement, Blog, OpenJS World

Join the open source JavaScript community at OpenJS Foundation’s free virtual global conference

The OpenJS Foundation is excited to announce the full schedule of keynote speakers, sessions and workshops for OpenJS World, the Foundation’s annual global conference. From June 23 to 24, developers, software architects, engineers, and other community members from OpenJS Foundation hosted projects such as AMP, Dojo, Electron, and Node.js will tune in to network, learn and collaborate. 

We will also use this time to celebrate the 25th anniversary of JavaScript. OpenJS World will showcase several key JavaScript contributors, many of whom will be leading JavaScript into the next 25 years.

Due to continuing COVID-19 safety concerns, OpenJS World 2020 will now take place as a free virtual experience on the same dates: June 23 – 24, US Central Time. If you have already registered and paid, we will be in touch with you about your refund.

The conference will include inspiring keynotes, informative presentations, and hands-on workshops that are aimed to help the OpenJS community better understand the latest and greatest of JavaScript technologies. 

Today we are excited to announce keynote speakers, sessions and hands-on workshops that will be joining us at OpenJS World! 

Keynote speakers

Session Highlights Include

  • Chronicles of the Node.js Ecosystem: The Consumer, The Author, and The Maintainer – Bethany Griggs, Open Source Engineer and Node.js TSC Member, IBM
  • Deno, a Secure Runtime for JavaScript and TypeScript – Ryan Dahl, Engineer, Deno Land
  • Fighting Impostor Syndrome with the Internet of Things – Tilde Thurium, Developer Evangelist, Twilio
  • From Streaming to Studio – The Evolution of Node.js at Netflix – Guilherme Hermeto, Senior Platform Engineer at Netflix
  • Hint, Hint!: Best Practices for Web Developers with webhint – Rachel Simone Weil, Edge DevTools Program Manager, Microsoft
  • Machine Learning for JavaScript Developers 101 –  Jason Mayes, Senior Developer Advocate for TensorFlow.js, Google
  • User-Centric Testing for 2020: How Expedia Modernized its Testing for the Web – Tiffany Le-Nguyen, Software Development Engineer, Expedia Group

The conference covers a range of topics for developers and end-users alike including frameworks, security, serverless, diagnostics, education, IoT, AI, front-end engineering, and much more. 

Interested in participating online in OpenJS World? Register now

Also, sponsorships for this year’s event are available now. If you are interested in sponsoring, check out the event prospectus for details and benefits. 

For new and current contributors, maintainers, and collaborators to the Foundation, we are hosting the OpenJS Foundation Collaborator Summit on June 22, 25 and 26th. This event is an ideal time for people interested or working on projects to share, learn, and get to know each other. Learn more about registering for the OpenJS Collaborator Summit. 

Thank you to the OpenJS World program committee for their tireless efforts in bringing in and selecting top tier keynote speakers and interesting and informative sessions. We are honored to work with such a dedicated and supportive community!

30% off Node.js Certifications through April 30th

By Announcement, Blog, Certification, Node.js

A Node.js Certification is a great way to showcase your abilities in the job market and allow companies to find top developer talent — and now these exams are 30% off.

In October, the OpenJS Foundation announced the OpenJS Node.js Application Developer (JSNAD) and OpenJS Node.js Services Developer (JSNSD) certification programs, which are designed to demonstrate competence with Node.js.

Until April 30, 2020, these certification exams are 30% off the regular $300 per exam cost. Use coupon code ANYWHERE30 to save 30%.

You have up to a year to study and take the exam, yet given that many of our community must stick close to home due to global health concerns, we wanted to lighten the load. Our exams are proctored virtually and exam takers don’t have to travel to testing centers, and can take exams from the comfort and safety of their own homes or workplaces, reducing the time and stress required.

About the Exams
OpenJS Node.js Application Developer (JSNAD)
The OpenJS Node.js Application Developer certification is ideal for the Node.js developer with at least two years of experience working with Node.js. For more information and how to enroll: https://training.linuxfoundation.org/certification/jsnad/

OpenJS Node.js Services Developer (JSNSD)
The OpenJS Node.js Services Developer certification is for the Node.js developer with at least two years of experience creating RESTful servers and services with Node.js. For more information and how to enroll: https://training.linuxfoundation.org/certification/jsnsd/

Both are two-hour, performance-based exams delivered via a browser-based terminal, and each includes an automatic free retake (if needed). Exams are monitored by a live human proctor and are conducted online in English. Certification is valid for three years and includes a PDF certificate and a digital badge. Corporate pricing for groups of five or more is available.

Register today to become a Node.js certified developer.


Project News: WebdriverIO ships v6

By Announcement, Blog, Project Updates, WebdriverIO

Kudos to the WebdriverIO team for their recent v6 release. WebdriverIO, a hosted project at the OpenJS Foundation, is a next-gen browser automation test framework for Node.js.

Big updates include:

Drop Node v8 Support
WebdriverIO has dropped support for Node v8, which was deprecated by the Node.js team at the start of 2020. It is not recommended to run any systems using that version anymore. It is strongly advised to switch to Node v12, which will be supported until April 2022.

Automation Protocol is now Default
Because of the great success of automation tools like Puppeteer and Cypress.io, it became obvious that the WebDriver protocol in its current shape and form doesn’t meet the requirements of today’s developers and automation engineers. Members of the WebdriverIO project are part of the W3C Working Group that defines the WebDriver specification, and they work together with browser vendors on solutions to improve the current state of the art. Thanks to folks from Microsoft, there are already proposals about a new bidirectional connection similar to other automation protocols like Chrome DevTools.

Performance Improvements
A big goal with the new release was to make WebdriverIO more performant and faster. Running tests on Puppeteer can already speed up local execution. Additionally, v6 has removed the heavy dependency on request, which was fully deprecated as of February 11th, 2020. With that, the bundle size of the webdriver and webdriverio packages has decreased by 4x.

These are only a few things that the v6 release brings. Read the full blog on the WebdriverIO site

30% off Node.js Certifications through April 7

By Announcement, Blog, Certification, Node.js

A Node.js Certification is a great way to showcase your abilities in the job market and allow companies to find top developer talent — and now these exams are 30% off.

In October, the OpenJS Foundation announced the OpenJS Node.js Application Developer (JSNAD) and OpenJS Node.js Services Developer (JSNSD) certification programs, which are designed to demonstrate competence with Node.js.

From now until April 7, 2020, these certification exams are 30% off the regular $300 per exam cost. Use coupon code ANYWHERE30 to save 30%.

You have up to a year to study and take the exam, yet given that many of our community must stick close to home due to global health concerns, we wanted to lighten the load. Our exams are proctored virtually and exam takers don’t have to travel to testing centers, and can take exams from the comfort and safety of their own homes or workplaces, reducing the time and stress required.

About the Exams
OpenJS Node.js Application Developer (JSNAD)
The OpenJS Node.js Application Developer certification is ideal for the Node.js developer with at least two years of experience working with Node.js. For more information and how to enroll: https://training.linuxfoundation.org/certification/jsnad/

OpenJS Node.js Services Developer (JSNSD)
The OpenJS Node.js Services Developer certification is for the Node.js developer with at least two years of experience creating RESTful servers and services with Node.js. For more information and how to enroll: https://training.linuxfoundation.org/certification/jsnsd/

Both are two-hour, performance-based exams delivered via a browser-based terminal, and each includes an automatic free retake (if needed). Exams are monitored by a live human proctor and are conducted online in English. Certification is valid for three years and includes a PDF certificate and a digital badge. Corporate pricing for groups of five or more is available.

Register today to become a Node.js certified developer.


Tutorial: Use The Weather Company’s APIs to build a Node-RED weather dashboard

By Announcement, Blog, Node-RED, tutorial

Build a hyper-local weather dashboard

This blog post was written by John Walicki, CTO for Edge/IoT Advocacy in the Developer Ecosystem Group of IBM Cognitive Applications Group and originally published on IBM Developer.

Learn how to build a weather dashboard using a personal weather station, Node-RED, Weather Underground, and The Weather Company APIs and the node-red-contrib-twc-weather nodes. This tutorial demonstrates how to display hyper-local weather information from a residential or farming weather station.

Learning objectives

In this tutorial, you will:

  • Learn the basics of personal weather stations (PWS)
  • Connect your PWS to Weather Underground (WU) and view PWS data on WU
  • Register for a The Weather Company (TWC) API key
  • Get started with the TWC API documentation
  • Learn about Node-RED (local and on IBM Cloud)
  • Explore the node-red-contrib-twc-weather Node-RED PWS node examples
  • Import / Deploy the Weather Dashboard example
  • Display PWS data in your Weather Dashboard
  • Build a Severe Weather Alert Map Node-RED Dashboard using TWC APIs
  • Build a Call for Code Water Sustainability solution

Prerequisites

  • Send your PWS data to http://www.wunderground.com and retrieve your PWS API key
  • If you don’t have a PWS, you can still get a time-restricted TWC API key by joining Call for Code (which gives you access to most of the TWC PWS APIs)

Estimated time

Completing this tutorial should take about 30 minutes.

Steps

Introduction to personal weather stations

Wikipedia defines a personal weather station as a set of weather measuring instruments operated by a private individual, club, association, or business (where obtaining and distributing weather data is not a part of the entity’s business operation). Personal weather stations have become more advanced and can include many different sensors to measure weather conditions. These sensors can vary between models but most measure wind speed, wind direction, outdoor and indoor temperatures, outdoor and indoor humidity, barometric pressure, rainfall, and UV or solar radiation. Other available sensors can measure soil moisture, soil temperature, and leaf wetness.

At less than $200 USD, a sufficiently accurate personal weather station is affordable for citizen scientists and weather buffs.

Connect your PWS to Weather Underground

Weather Underground PWS device

Many PWS brands offer the ability to connect and send weather data to cloud-based services. Weather Underground, a part of The Weather Company, an IBM Business, encourages members to register their PWS and send data to http://www.wunderground.com.

Weather Underground PWS data

Members can view their personal weather station data on Weather Underground.

Get a TWC API key and get started with the TWC API documentation

In addition to the wunderground.com dashboard, your PWS data is available through your API key and a set of robust TWC RESTful APIs. Copy your API key and click the View API Documentation button.
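As a sketch of what those RESTful calls look like, the helper below builds the URL for the PWS current-conditions endpoint. The endpoint path and query parameters follow the TWC API documentation, but treat the helper itself (`pwsObservationsUrl` and its arguments) as a hypothetical convenience, not part of any library:

```javascript
// Build the TWC PWS current-conditions request URL.
// stationId and apiKey come from your Weather Underground account.
function pwsObservationsUrl(stationId, apiKey) {
  const base = "https://api.weather.com/v2/pws/observations/current";
  const params = new URLSearchParams({
    stationId: stationId,
    format: "json",
    units: "e", // imperial units; use "m" for metric
    apiKey: apiKey,
  });
  return `${base}?${params}`;
}
```

You could paste a URL built this way into a browser, or into the URL field of a Node-RED http request node, to see the raw JSON your station reports.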

Weather Underground API Key

Register for a TWC API key

If you don’t have a personal weather station, you can still register for a time-restricted TWC API key by joining Call for Code 2020. The API key is valid from March 1 to October 15, 2020, and gives you access to most of the TWC Personal Weather Station APIs. You can complete this tutorial using this API key.

Learn about Node-RED

Node-RED is an open source programming tool for wiring together hardware devices, APIs, and online services in new and interesting ways. It provides a browser-based editor that makes it easy to wire together flows using the wide range of nodes in the palette, which can be deployed to its runtime in a single click.
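Within a flow, much of the custom logic lives in function nodes, whose bodies are plain JavaScript that receives a `msg` object and returns a transformed one. The sketch below shows the idea: a hypothetical function-node body that converts a Fahrenheit reading to Celsius, assuming the reading sits at `msg.payload.imperial.temp` (the shape used by the PWS observation responses later in this tutorial):

```javascript
// Sketch of a Node-RED function-node body (assumed message shape):
// read a Fahrenheit temperature and attach a Celsius value.
function tempToCelsius(msg) {
  const f = msg.payload.imperial.temp;
  // Round to one decimal place for display on a dashboard gauge.
  msg.payload.tempC = Math.round(((f - 32) * 5) / 9 * 10) / 10;
  return msg;
}
```

In the Node-RED editor you would paste only the body of this function into a function node; the editor supplies the surrounding `msg` plumbing for you.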

Follow these instructions to install Node-RED locally, or create a Node-RED Starter application in the IBM Cloud.

Install node-red-contrib-twc-weather nodes

Once Node-RED is installed, add the dependencies for this tutorial:

npm install node-red-contrib-twc-weather node-red-dashboard node-red-node-ui-table node-red-contrib-web-worldmap

Explore node-red-contrib-twc-weather Node-RED PWS node examples

The node-red-contrib-twc-weather GitHub repository includes an example flow that exercises each of the Node-RED PWS APIs. You can learn about the nodes and their configuration options by clicking on each node and reading its comprehensive node information tab. Import this PWS-Examples.json flow into your Node-RED editor and deploy the flow. Don’t forget to paste in your TWC PWS API key. If you want to explore personal weather station data but don’t have your own PWS, you can query the weather station at the Ridgewood Fire Headquarters using the StationID KNJRIDGE9.

PWS Example Flow

Import / Deploy the weather dashboard example

Now that the node-red-contrib-twc-weather nodes are able to query weather data, let’s build an example Weather Node-RED Dashboard that displays personal weather station current and historical data on a map, in a table, in a gauge, and on a chart. The PWS API key includes access to the TWC 5 Day Forecast, which is displayed with weather-lite icons. This flow requires node-red-dashboard, node-red-node-ui-table, and node-red-contrib-web-worldmap. Import this PWS-Dashboard.json flow and deploy the flow.

Display PWS data in your weather dashboard

Launch the Node-RED Dashboard and experiment with the current conditions, forecast, and map. The Call for Code TWC API key might not have access to private PWS historical data.

PWS Dashboard

Build a Severe Weather Alert Map Node-RED Dashboard using TWC APIs

In addition to the node-red-contrib-twc-weather Node-RED nodes, you can review the TWC Severe Weather API Documentation and use the http request node and your API Key to make calls directly.

The Weather Company APIs include an API to query all of the current Severe Weather Alerts issued by the National Weather Service. This next example plots those Severe Weather Alerts on a Node-RED Dashboard.
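To plot alerts on the worldmap node, each alert must be reduced to a point with `name`, `lat`, `lon`, and `layer` properties, which is the message shape node-red-contrib-web-worldmap expects. The sketch below shows one way to do that in a function node; the response field names (`alerts`, `eventDescription`, `latitude`, `longitude`) are assumptions based on the v3 alerts documentation, so check the Severe Weather API reference before relying on them:

```javascript
// Sketch: convert a TWC severe-weather alerts response into
// worldmap points. Field names are assumed from the v3 alerts docs.
function alertsToMapPoints(payload) {
  return (payload.alerts || []).map((a) => ({
    name: a.eventDescription, // e.g. "Flood Warning"
    lat: a.latitude,
    lon: a.longitude,
    layer: "Severe Weather Alerts", // groups the markers on the map
  }));
}
```

Wiring an http request node (calling the alerts endpoint with your API key) into a function node with this logic, and then into the worldmap node, reproduces the map shown below.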

This example flow and Node-RED Dashboard might be useful as part of a Call for Code solution.

Display Severe Weather Alerts on a map

Severe Weather Alert Dashboard

Get the Code: Node-RED flow for Severe Weather Alerts

Summary

Build a Call for Code Water Sustainability solution!

Now that you have completed this tutorial, you are ready to modify these example flows and Node-RED Dashboard to build a Call for Code Water Sustainability solution.