RESPONSIBLE JAVASCRIPT JEREMY WAGNER — HTTPS://JEREMY.CODES/ NEJS — OMAHA, NEBRASKA — AUGUST 2019

Hi, I'm Jeremy. Thanks for coming to my talk, and thanks to NEJS for inviting me.

This talk is based on a series of articles for A List Apart called Responsible JavaScript, which is sort of a collection of ideas and techniques about getting JavaScript performance under control. So if you like this talk, you might like these articles. Links to resources will be in the slide deck, which I’ll post to noti.st later on.

SPHEXISHNESS

To start, I want to talk about a word I stumbled on years ago: “Sphexishness”, which is an unusual word that has some relevance to our work.

SPHEXISH (of animal behavior) deterministic; preprogrammed

To be “sphexish” means to exhibit deterministic and preprogrammed behaviors.

SPHEX Sphex pensylvanicus

The root word "sphex" is a name for a genus of solitary wasps. [REVEAL WASP IMAGE]

I promise you that this is not an entomology talk. These wasps don’t just act in a preprogrammed fashion, they can also be easily manipulated.

This wasp provisions its larvae with paralyzed crickets. When it brings prey back to the nest, it begins a routine: before dragging the cricket inside, the wasp leaves it at the entrance and inspects the nest. This behavior seems thoughtful, but it isn't, really. If an observer moves the cricket while the wasp is inside, the wasp will drag it back to where it was and then inspect the nest all over again. This cycle can go on endlessly without the wasp ever catching on.

npm install react          6.5 KB
npm install react-dom      103.7 KB
npm install react-router   21.6 KB
npm install react-redux    14.4 KB

Now, I didn't come to Omaha to be a big jerk and imply that you're mindless. Yet, there are some decisions involved in our work that we make without question. For example, when we begin a new project, we open a terminal and install a familiar framework… [SHOW REACT, REACT-DOM]

…and then possibly a client-side router for the framework. [SHOW REACT-ROUTER]

…and then possibly a state management library… [SHOW REACT-REDUX] - …and all the while, we’re unaware of—or have even made peace with—the overhead these conveniences bring.

This matters because the amount of JavaScript we serve has steadily increased over the years, to the point where it has become a major performance concern. Half the sites you visit send 375 KB or less of JavaScript, the 75th percentile sends at least 650 KB, and the 90th percentile sends at least one megabyte. These graphs are generated from HTTP Archive data, which, among other things, tracks the transfer size of JavaScript, which is often compressed. While compression is essential to loading performance, it doesn't change the fact that when a megabyte of compressed JavaScript is downloaded, it decompresses to a significantly larger amount that browsers must then parse, compile, and execute.

If you're using a high-end device on a fast network, you probably won't feel how slow this can be…

…but on less capable hardware such as this affordable, but much slower, Moto G4 Android phone, chewing through tons of JavaScript is a slog.

SCRIPT RENDER PAINT NET OTHER

That's worth paying attention to, because when devices, networks—or both—are slow, using the web becomes more difficult. [START ANIMATION]

At the bottom of this WebPageTest timeline is the main thread activity indicator. When it's green, the browser has bandwidth for more work. When it's red, the browser can't do anything else until it's done with whatever is blocking the main thread. Pair that with a slow network, and you can imagine how tiresome the web can be to use for many people.

Understanding constraints is key to writing good software. Some of the best video games ever made fit in a megabyte, sometimes far less. The developers of that era not only had vision, they also understood the constraints on their work. But their constraints were fixed by the hardware they shipped games on. Our constraints are not fixed; they change significantly from person to person. In some ways, that makes our job much more difficult. But that doesn't mean we can't make great experiences on the web that work for everyone, everywhere.

ANTI-SPHEXISHNESS

So let’s talk about how we can turn that sphexishness… [REVEAL HEADING]

…into anti-sphexishness, for the good of the web, and for all who use it.

PAINT THE PICTURE NOT THE FRAME

There’s a phrase I came across recently. It goes… [REVEAL PHRASE]

Paint the picture, not the frame.

It comes from an article by Eric Bailey about accessibility and UX, and it’s a clever way of saying we shouldn’t reinvent things the browser already does well.

Eric advises us that we should not subvert a person’s expectations by changing externally consistent behaviors. Examples of external consistency might be the default behaviors of HTML elements, or the appearance of a scrollbar. When we disrupt external consistency, we may impede people in unexpected ways. One way we do this is when we fail to use semantic HTML, and instead rely on JavaScript to reimplement or approximate those behaviors. This can result in websites which are harder to use for those who rely on assistive technology.

import React, { Component } from "react";
import { validateEmail } from "helpers/validation";

class SignupForm extends Component {
  constructor (props) {
    super(props);

    this.handleSubmit = this.handleSubmit.bind(this);
    this.updateEmail = this.updateEmail.bind(this);
    this.state = { email: "" };
  }

  updateEmail (event) {
    this.setState({
      email: event.target.value
    });
  }

  handleSubmit () {
    // If the email checks out, submit
    if (validateEmail(this.state.email)) {
      // …
    }
  }
}

render () {
  return (
    <div>
      <span class="email-label">Enter your email:</span>
      <input type="text" id="email" onChange={this.updateEmail} />
      <div class="submit-button" onClick={this.handleSubmit}>Sign Up</div>
    </div>
  );
}

Let's take this example React component, which is a newsletter subscription form. This component has an input field, a corresponding label, and a submit button. All in a single <div>. You may have opinions on what's wrong here, but the solution doesn't require more JavaScript. It requires less. Let's dive into the form JSX.

render () {
  return (
    <div>
      <span class="email-label">Enter your email:</span>
      <input type="text" id="email" onChange={this.updateEmail} />
      <div class="submit-button" onClick={this.handleSubmit}>Sign Up</div>
    </div>
  );
}

There are three things wrong:
- One: a form isn't a form unless it uses a <form> tag. <div>s aren't intrinsically flawed; they lack semantic meaning by design. But this is a form, and a form should always use a <form> tag, because that has meaning to assistive technologies.
- Two: when we label inputs, a <label> element should be used with a for attribute that corresponds to an id on the input. This lets assistive technologies know that a given input has an associated label.
- Three: while <div>s can be coded to behave and look like buttons, doing so robs a button of any semantic meaning it would otherwise have if it were just a <button> element. Plus, a <button> element's default behavior within a form is to submit that form. This makes for a more resilient solution for when—not if—JavaScript fails to run.

render () {
  return (
    <form method="POST" action="/signup" onSubmit={handleSubmit}>
      <label for="email" class="email-label">Enter your email:</label>
      <input type="email" id="email" required />
      <button>Sign Up</button>
    </form>
  );
}

Here's the refactored markup, every part of which now has semantic meaning that assistive technologies can use. Assuming the component is server-rendered, it will also still work if scripts fail to run. Note that the submit handler has been moved from the <button>'s onClick event to the <form>'s onSubmit event. This is helpful when we want to intercept the form's submit event to enhance its behavior with client-side scripts.

import React from "react";

const SignupForm = function (props) {
  const handleSubmit = function (event) {
    // Needed in case we're sending data to the server XHR-style
    // (but will still work if server-rendered with JS disabled).
    event.preventDefault();

    // Carry on…
  };

  return (
    <form method="POST" action="/signup" onSubmit={handleSubmit}>
      <label for="email" class="email-label">Enter your email:</label>
      <input type="email" id="email" required />
      <button>Sign Up</button>
    </form>
  );
};

Here's the final component code. Additionally, because email validation is now handled through HTML, we can remove the email validation script entirely. Of course, we should always sanitize our inputs on the server, but any opportunity to remove some client-side script and make things a bit lighter should be a welcome change.

External consistency isn't limited to HTML, CSS, and JavaScript. We expect browsers themselves to behave in a predictable fashion. One of the most common subversions of this predictability is the SPA, or Single Page Application. I don't hate SPAs, but the navigation behavior they replace is something browsers already do well.

CLIENT-SIDE RENDERING (timeline comparison, ticks at 1 ms, 2.07 s, 5.24 s)

When we embrace client-side routing, we take on a whole host of new responsibilities the browser once managed for us:
- History must be managed…
- …tabindex and scrolling position must be accounted for…
- Navigation cancelling can fail…
- …and so on. [SHOW SLIDE CONTENT]
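To make that concrete, here's a minimal, framework-agnostic sketch of handling just one of those responsibilities by hand. afterRouteChange is a hypothetical function a client-side router might call after swapping in new content; none of this comes for free once we take over navigation.

function afterRouteChange (newTitle) {
  document.title = newTitle;

  // Move keyboard and screen reader focus to the new content…
  const main = document.querySelector("main");
  main.setAttribute("tabindex", "-1");
  main.focus();

  // …and reset the scroll position, which the browser would otherwise
  // handle for us on a full page load.
  window.scrollTo(0, 0);
}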

Even if we get client-side routing perfect, performance is affected if that content is not server-rendered. Furthermore, when we fail to send contentful markup from the server, the page’s contents are inaccessible if JavaScript fails.

SERVER-SIDE RENDERING WITH CLIENT-SIDE HYDRATION (timeline comparison, ticks at 1 ms, 2.07 s, 5.24 s)

When we rely on standard synchronous navigation behavior, we do lose a degree of snappiness, but we retain that coveted external consistency. That’s not to say client-side routers are always bad, filthy things, but using them requires extra care on your part. For example, you’ll need to provide server-side equivalents for all your client-side routes so people have a way to reliably access any part of your site from any context. [SHOW CLIENT-SIDE HYDRATION NOTE]

And then, if components are attached to server-side markup through client-side hydration, people get a progressively enhanced experience.
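As a rough sketch of what that hydration step can look like with React, assuming the server has already rendered the SignupForm markup into an element with the id signup-root (both the element id and the import path are hypothetical):

import React from "react";
import ReactDOM from "react-dom";
import SignupForm from "./components/SignupForm"; // hypothetical path

// hydrate() attaches event handlers to the markup the server already sent
// instead of rendering it again from scratch.
ReactDOM.hydrate(<SignupForm />, document.getElementById("signup-root"));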

<link rel="prefetch" href="/products/snes-console">

If you want to avoid SPAs but still want to make navigations snappier, link prefetching may fit the bill. It can seriously boost loading performance by fetching page HTML in advance of a user requesting it. It's not perfect, though: it can waste data if not done carefully.

To address these potential shortcomings, the Google Chrome team offers a very small link prefetching script. It will only prefetch links as they appear in the viewport, when the main thread is idle, and if the network isn’t slow.
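I won't walk through that script here, but a stripped-down sketch of the general idea (not the actual library) might look like this, using IntersectionObserver for the in-viewport check, requestIdleCallback for idle time, and the Network Information API for the connection check:

// Inject a <link rel="prefetch"> for a URL.
function prefetch (url) {
  const link = document.createElement("link");
  link.rel = "prefetch";
  link.href = url;
  document.head.appendChild(link);
}

// Treat Save-Data and 2G-ish connections as "slow".
const connection = navigator.connection;
const connectionIsSlow = connection && (connection.saveData || /2g/.test(connection.effectiveType));

// Prefetch links only once they scroll into the viewport, and only when the
// main thread has a moment to spare.
const observer = new IntersectionObserver((entries) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting && !connectionIsSlow) {
      observer.unobserve(entry.target);
      requestIdleCallback(() => prefetch(entry.target.href));
    }
  });
});

document.querySelectorAll("a[href]").forEach((anchor) => observer.observe(anchor));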

BROWSERS GIVE US A LOT OF FREE STUFF

Now, I know I’m prattling on about all the free stuff the browser gives us, but the point remains: the browser gives us a lot for free. Let’s use that free stuff whenever possible, so we can focus instead on more challenging problems.

THE TOOLS ARE NOT INFALLIBLE

Another tenet of my Responsible JavaScript philosophy consists of a fundamental truth: [REVEAL TITLE CARD]

The tools are not infallible. A hammer can help you build something, or it can break your fingers. Understanding how the tools work is a part of creating fast and accessible websites.

PHOTO CREDIT: JOHN HOEY

One tool many of us use when we need the JavaScript we write to work everywhere is Babel. Babel is valuable, but we tend not to see how it can harm performance. We would all benefit if we could transpile less, because the way Babel transforms our code can add a lot of extra weight to our production bundles. It helps to know how Babel transforms the code we write, so we can compensate for its inefficiencies.

// Untransformed code:
function logger (message, level = "log") {
  console[level](message);
}

Here’s an example console logging wrapper function, which accepts message and level parameters. The second parameter is the log level, with a default of "log".

// Babel-transformed code:
function logger (message) {
  var level = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : "log";
  console[level](message);
}

Default parameters are nice, but Babel transforms them inefficiently, and repeats that inefficient transform every time default parameters are used. If we can’t avoid Babel altogether, we should try to compensate for this stuff.

// Code that Babel won't touch:
function logger (message, level) {
  console[level || "log"](message);
  //      ^^^^^^^^^^^^^^
}

We can avoid this specific transform by replacing the default parameter with an OR check. When we want to assign a default to an “optional” parameter, we perform a check where the left side of the OR is the parameter itself, and the right side is the default. If the level parameter is omitted, the right side of the OR condition is used.

export class User {
  constructor (id, name, email) {
    this.id = id;
    this.name = name;
    this.email = email;
  }

  getId () {
    return this.id;
  }

  getName () {
    return this.name;
  }

  getEmail () {
    return this.email;
  }
}

Default parameters are only one feature that Babel transforms. Let's take ES6 classes as another example.

"use strict";

Object.defineProperty(exports, "__esModule", { value: true });
exports.User = void 0;

function _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError("Cannot call a class as a function"); } }

function _defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if ("value" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } }

function _createClass(Constructor, protoProps, staticProps) { if (protoProps) _defineProperties(Constructor.prototype, protoProps); if (staticProps) _defineProperties(Constructor, staticProps); return Constructor; }

var User = /*#__PURE__*/function () {
  function User(id, name, email) {
    _classCallCheck(this, User);

    this.id = id;
    this.name = name;
    this.email = email;
  }

  _createClass(User, [{
    key: "getId",
    value: function getId() {
      return this.id;
    }
  }, {
    key: "getName",
    value: function getName() {
      return this.name;
    }
  }, {
    key: "getEmail",
    value: function getEmail() {
      return this.email;
    }
  }]);

  return User;
}();

exports.User = User;

The way Babel transforms them is expensive. Babel adds a lot to ensure ES6 classes work everywhere. You can mitigate this cost in one of a few ways:
- One, you could use the prototype pattern and avoid ES6 classes altogether.
- Two, you could use @babel/plugin-transform-runtime to deduplicate the helpers Babel adds, reducing their impact across an entire project.
- Or three, if you only need to support modern browsers, you could drop Babel altogether. If you can do this, it's your best bet.
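For the first of those options, here's a sketch of the same User object written with the prototype pattern. There's nothing here for @babel/preset-env to transform (the export statement is left to the bundler), so it ships essentially as written:

// The same User, as a constructor function with prototype methods.
export function User (id, name, email) {
  this.id = id;
  this.name = name;
  this.email = email;
}

User.prototype.getId = function () {
  return this.id;
};

User.prototype.getName = function () {
  return this.name;
};

User.prototype.getEmail = function () {
  return this.email;
};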

PHOTO CREDIT: MINTAREN

How we write JavaScript isn't the only thing to consider when using Babel; we also need to know how to configure it.

TOTAL BUNDLE SIZE: ~117 KB - Here's a webpack bundle analysis for an example app that uses a Babel configuration which isn't finely tuned. It sits at roughly 117 KB, most of which is polyfills.

presets: [
  [
    "@babel/preset-env", {
      modules: false,
      useBuiltIns: "entry",
      corejs: 3,
      targets: "> 0.25%, IE > 10, Firefox ESR, not dead"
    }
  ]
]

Polyfilling is something Babel is used a lot for. If you’re familiar with @babel/preset-env, this code may look familiar. However, it’s worth taking a second look at the useBuiltIns option, which uses core-js to polyfill features. When useBuiltIns is set to "entry", core-js itself must be added as an entry point, which adds more polyfills than we might need.
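Concretely, "entry" mode means your application's entry point imports core-js directly, and Babel rewrites that import into every polyfill your browserslist targets might need, whether your code uses those features or not. A sketch of such an entry file with core-js 3 (the app import is hypothetical):

// index.js — application entry point
// With useBuiltIns: "entry", Babel expands these imports into a long list of
// individual core-js modules based on the browserslist targets.
import "core-js/stable";
import "regenerator-runtime/runtime";

import { init } from "./app"; // hypothetical application code

init();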

presets: [
  [
    "@babel/preset-env", {
      modules: false,
      useBuiltIns: "usage",
      corejs: 3,
      targets: "> 0.25%, IE > 10, Firefox ESR, not dead"
    }
  ]
]

But, if we change the value of useBuiltIns from "entry" to "usage", we can remove core-js as an entry point and Babel will only polyfill features that are actually used. This can seriously reduce how many polyfills get used.
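As a rough illustration of the difference (the module names are illustrative and vary by core-js version), a file that only uses Array.from gets just the relevant polyfills prepended to it:

// What we write:
const letters = Array.from("abc");

// Roughly what Babel prepends with useBuiltIns: "usage":
//
//   require("core-js/modules/es.array.from");
//   require("core-js/modules/es.string.iterator");
//
// …rather than every polyfill the browserslist targets could ever need.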

presets: [
  [
    "@babel/preset-env", {
      modules: false,
      useBuiltIns: "usage",
      loose: true,
      corejs: 3,
      targets: "> 0.25%, IE > 10, Firefox ESR, not dead"
    }
  ]
]

While we're here, there's another config option worth paying attention to: one that toggles "loose mode", in which Babel transforms your code "loosely". This means Babel's output adheres less strictly to the ECMAScript standard. Loose transforms are a bit smaller and still work in many cases, and they can be enabled by setting the loose option to true. Loose mode isn't bulletproof, though: you could run into issues if you move from transpiled ES6 to untranspiled ES6 later on. But if the savings are worth it, you can always address that problem if it comes up.
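To give a feel for the difference, here's roughly what loose-mode output for the earlier User class looks like (abridged, and a sketch rather than exact Babel output): methods are assigned straight onto the prototype, and the defineProperty-based helpers and _classCallCheck guard go away.

var User = /*#__PURE__*/function () {
  function User(id, name, email) {
    this.id = id;
    this.name = name;
    this.email = email;
  }

  // Loose mode attaches methods directly to the prototype.
  var _proto = User.prototype;

  _proto.getId = function getId() {
    return this.id;
  };

  _proto.getName = function getName() {
    return this.name;
  };

  _proto.getEmail = function getEmail() {
    return this.email;
  };

  return User;
}();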

TOTAL BUNDLE SIZE: ~56.09 KB - After making these two quick configuration changes, we’ve reduced the size of our bundle by 52%. That’s a big deal. With half as much code, this app will be faster, especially for devices with limited processing power and memory.

DIFFERENTIAL SERVING

In addition, a novel way of serving less JavaScript has recently emerged called “differential serving”, which involves serving one of two bundles to users based on their browser’s capabilities. Legacy browsers get bundles with more transforms and polyfills, while modern browsers get smaller bundles with little to none of those things. The outcome is that an app functions identically in either case, but with substantially less code for those using modern browsers.

<!-- The way we've always done it: -->
<script defer src="/js/app.js"></script>

Of course, we need a way to load these bundles properly. What you see here is how we’ve always loaded JavaScript.

<!-- Modern browsers get this: -->
<script type="module" src="/js/app.mjs"></script>

<!-- Legacy browsers get this: -->
<script nomodule defer src="/js/app.js"></script>

The pattern shown here is how we can differentially serve scripts.
- The first <script> element loads a bundle for modern browsers. Adding type="module" ensures it only gets picked up by browsers that understand ES modules.
- The second <script> element loads a bundle for legacy browsers. nomodule ensures modern browsers will decline to download the affected script, while legacy browsers don't understand nomodule, so they download it anyway.

// Config for legacy browsers
presets: [
  [
    "@babel/preset-env", {
      modules: false,
      useBuiltIns: "usage",
      targets: "> 0.25%, IE > 10, Firefox ESR"
    }
  ]
]

Configuring your toolchain to generate these bundles is involved, but doable. First, you need to create two separate Babel configurations: one for legacy bundles, and one for modern bundles. This configuration is typical of what you'd see in a lot of projects that transform code so it's compatible with all browsers.

// Config for modern browsers
presets: [
  [
    "@babel/preset-env", {
      modules: false,
      targets: {
        esmodules: true
      }
    }
  ]
]

Now this is a configuration for generating bundles for modern browsers. You'll notice that useBuiltIns is gone. That's because this configuration is for a project that needs no polyfills in modern browsers. Depending on the language features you use, you may need to retain useBuiltIns, but you probably won't. Instead of a browserslist query, we've supplied an option named esmodules set to true, which translates to a browserslist query for browsers that support ES6 modules. This works because browsers that support ES6 modules also support other modern features, such as async/await, arrow functions, and so on.

// babel.config.js
module.exports = {
  env: {
    clientLegacy: {
      presets: [
        [
          "@babel/preset-env", {
            modules: false,
            targets: "> 0.25%, IE > 10, Firefox ESR"
          }
        ]
      ]
    },
    clientModern: {
      presets: [
        [
          "@babel/preset-env", {
            modules: false,
            targets: {
              esmodules: true
            }
          }
        ]
      ]
    }
  }
};

We can group these configs together under an env object in our Babel config. clientLegacy is a config for legacy browsers while clientModern is a config for modern ones. Then, in our bundler config, we can point to these separate Babel configs.

// Legacy config
// …
module: {
  rules: [
    {
      test: /\.m?js$/i,
      exclude: /node_modules/i,
      use: [
        {
          loader: "babel-loader",
          options: {
            envName: "clientLegacy"
          }
        }
      ]
    }
  ]
},
// …

In webpack, this is a typical example of how babel-loader ensures that scripts get processed by Babel. Note the envName option, which points to a configuration in the env object in the Babel config from the previous slide.

// Modern config
// …
module: {
  rules: [
    {
      test: /\.m?js$/i,
      exclude: /node_modules/i,
      use: [
        {
          loader: "babel-loader",
          options: {
            envName: "clientModern"
          }
        }
      ]
    }
  ]
},
// …

By creating a separate webpack config and pointing to the clientModern Babel config, you can generate a smaller bundle of your code for modern browsers with identical functionality.
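One way to wire this up (a sketch, assuming the two rule sets above live in their own files with distinct output filenames, such as app.js and app.mjs) is to lean on the fact that webpack accepts an array of configurations and will build both bundles in a single pass:

// webpack.config.js
// The filenames below are hypothetical; each file exports one of the configs
// shown on the previous slides.
const legacyConfig = require("./webpack.legacy.config");
const modernConfig = require("./webpack.modern.config");

module.exports = [legacyConfig, modernConfig];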

LEGACY BROWSERS: 68.48 KB - The size reduction you get between these two bundles depends on the project. Sometimes you might only get 5 to 10 percent, but some projects could see more. This is a bundle analysis of an example app's legacy bundle. It's already small at around 68 KB.

MODERN BROWSERS: 26.75 KB - But with differential serving we can go from small to nano, and deliver this app to modern browsers in 40% of the size of its legacy counterpart.

But beware: some browsers may have issues with the platform-provided pattern for differentially serving scripts. If you want to know more, you can check out this article I wrote about some of the pitfalls, as well as learn how you can circumvent them.

BE ACCOMMODATING

Finally, this leads us into a discussion about what it means to be accommodating, because when we deploy something to the web, we have to be a steward of that thing.

In the U.S., many people live in large cities that are typically well served by fast broadband and mobile internet connections. Yet this article by the MIT Technology Review revealed that 58 percent of households in the Cleveland metro area with yearly incomes under $20,000 had no broadband internet access. These are people who rely on mobile internet connections, often with data caps, to access the web.

More striking is this passage, in which Pew Research found that one third of Americans don’t have an internet connection in their homes faster than dial-up. I doubt this has improved significantly since the article was written. The economic and infrastructure challenges haven’t been sufficiently addressed to broaden broadband access.

PHOTO CREDIT: BAS VAN SCHAIK

If you're serving lots of assets, high latency or low bandwidth can make your site functionally inaccessible to some. Thankfully, a technology called Client Hints, supported in Chromium-based browsers, can help us bridge the divide.

RTT Approximate round trip time (ms)

Client Hints help developers understand the characteristics of both a person’s device and the network it’s connected to. There are lots of client hints, but here are the three I feel are most useful… [SHOW RTT]

The first is RTT—or Round Trip Time—which is the approximate latency of a user’s connection in milliseconds.

Downlink Approximate download speed (kbps)

Downlink is the approximate downstream bandwidth in kilobits per second.

ECT Effective connection type (“4g”, “3g”, “2g”, “slow-2g”)

The next is ECT—or Effective Connection Type—which is an enumerated string that categorizes the user’s connection based on both the RTT and Downlink hints.

Accept-CH: RTT, Downlink, ECT
Accept-CH-Lifetime: 86400

These hints help us tailor experiences so that we send less stuff to those on slow connections. We can opt into them by sending the Accept-CH HTTP response header. [SHOW Accept-CH]

And we can tell the client how long we want those hints to persist with the Accept-CH-Lifetime header. [SHOW Accept-CH-Lifetime]

In the above example, the RTT, Downlink, and ECT hints will persist on the client for a day.

<?php
$ect = "4g";

if (isset($_SERVER["HTTP_ECT"])) {
  $ect = $_SERVER["HTTP_ECT"];
}
?>

Then, you can access these hints as request headers via a server-side language. Here, for example, we initialize a variable with a default effective connection type of "4g". We do this for browsers that don't support client hints, for which we'll assume a fast connection by default. Then, we check if the ECT hint has been sent as a request header. If it has, we overwrite the variable with that header's value.

<?php if ($ect === "4g" || $ect === "3g") { ?>
  <div class="carousel">
    <!-- Carousel content… -->
  </div>
  <script defer src="/js/carousel.js"></script>
<?php } ?>

With that information, we can create lighter experiences for those who need it most. For example, we can decide a person will only see a carousel if they’re on a fast connection. Otherwise, we compensate by sending them only what they really need.

ADAPTIVE PERFORMANCE

I call this “Adaptive Performance”, and it’s a way to create experiences that are more inclusive by being aware of shifting network conditions.

22 REQUESTS, 740 KB 91.26 SECONDS OVER 2G

5 REQUESTS, 12 KB 5.17 SECONDS OVER 2G And it works! Here are two versions of the same site: The version on the left has web fonts, a carousel, accordions, and JavaScript to run it all… [SHOW STATS]

…which is functionally inaccessible on 2G. But with Client Hints, we can boil this experience down to its core when networks are slow. [SHOW STATS]

For our trouble, affected users will have something they can access more quickly than the ideal experience.

If you want to learn more about Client Hints, you can check out this guide I wrote for Google Web Fundamentals.

FIGURE OUT WHAT PEOPLE WANT AND WORK BACKWARD FROM THERE

I’d like to close this talk on what I think is an important point. [SHOW TOP LINE]

Which is that we need to first figure out what people want from what we build for the web. By which I mean, what purpose are we serving? [SHOW BOTTOM LINE]

We then need to work backward from there and build something which serves that purpose with care.

Regardless of profession, craftspeople love their tools. As developers, we're no different. We take pride in building great things with the tools we have. But unlike, say, the tools of the mechanic who fixes your car, the tools we use can have a direct and felt impact on the people who use what we build. We don't need to burden them with the entire toolbox—or toolshed.

Sometimes it makes more sense to use smaller tools that are more focused on the actual work. Your experience as a developer is important, but it is never more important than the user's experience. If your excitement for a certain set of tools causes you to build things that no longer efficiently serve the purpose you set out to fulfill, it's time to re-evaluate them.

And it’s my hope that eventually, we all can come to find our own ways of serving our collective purpose with utilitarian precision for the benefit of all who use the web. [FADE OUT IMAGE]

Even if that sometimes means that to get there, we don’t always need JavaScript.

THANK YOU JEREMY WAGNER — @MALCHATA — JEREMY.CODES