Robert Brodrecht

A Rambling Discussion of Performance and The App Versus Web Debate

Introduction

Several people I follow (and genuinely respect) have been subtly arguing that the web is in danger of going away in favor of native apps. I don't buy it.

The most recent round of this banter has been related to Facebook’s new Instant Articles. While some of the discussion has been around the actually relevant topic of Facebook wresting control of content away from publishers, a few people are using this as a soapbox to talk trash about the web platform. Mainly, it’s the continued drumbeat from John Gruber as far back as 2013 and Marco Arment on ATP and, framed somewhat differently, on his blog.

The basic argument — and I hope this is not too much of a strawman — is that native is faster and the web is slower, so native will win out over the web.

But what is “fast?” And why is native “faster?”

My recollection of the original argument against the web platform for poor performance, from 2012 or so, was related to UI performance: lack of 60fps scrolling and animation, poor response to taps, and the like. I can’t find the particular articles that convinced me of this, but I think web developers have since figured out a lot about improving UI performance: use the right CSS transforms so rendering happens on the GPU, avoid jank-inducing designs, make sure you don’t cause layout thrashing, make adjustments to account for browsers’ 300ms tap delay, etc.
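To make a couple of those fixes concrete, here is a minimal CSS sketch (the selectors are made up for illustration; the properties are standard, though support for `will-change` and `touch-action` varied across 2015-era browsers):

```css
/* Animate with transform (and opacity) so the work stays on the
   compositor/GPU, instead of animating top/left, which forces
   layout and paint on every frame. */
.slide-in {
  transform: translateX(-100%);
  transition: transform 0.3s ease-out;
  will-change: transform; /* hint: promote this element to its own layer */
}
.slide-in.active {
  transform: translateX(0);
}

/* Remove the 300ms tap delay on touch browsers that honor touch-action. */
a, button {
  touch-action: manipulation;
}
```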

In reality, the 2012 web backlash also started with Facebook when they moved from HTML5 to native because of load performance. But, even then, if Sencha is right, Facebook may have just had a poorly optimized app.

Whatever the case, it’s definitely load time this go-around. Facebook’s Instant Articles, at least it is claimed, is about dealing with slow third-party websites. That’s as legit an excuse as it is a cover-up for Facebook wanting to build its nouveau AOL. Google has been pushing performance for a very long time, even making it a ranking factor back in 2010. It’s a long-tail game, but web developers are figuring out this performance thing.

The main issue, I think, is that most people, developers and ordinary users alike, see a web page as a single transaction. You ask for a page, we give you a page. Much like any transaction, that takes time. On the web, we call it load time.

So, what contributes to load time, and how are these things being addressed?

  • Connecting to the server. HTTP/2 is going to help here. It’s a more efficient protocol for loading modern websites that have grown beyond a single HTML file. Future protocols notwithstanding, developers have been doing unnatural things to optimize for HTTP/1.1 for a long time: spriting images, concatenating barely related files, using CDNs to spread requests out, and a lot of other techniques.
  • Flaky connections. Offline support via AppCache can help mitigate this, but AppCache is a Douchebag and Service Workers seem a long way off.
  • Loading the CMS. Loading a CMS to render a page is a big performance hit. Database and page caching helps and there are solutions for most popular CMSs. Shared hosting hurts, too, but we could all get a VPS if we wanted to get a modicum of speed for a small site. If you run anything big, the solutions get complicated.
  • Rendering. Rendering takes a little time, especially accounting for blocking CSS and JavaScript load times that people may not be optimizing for (you know, ignoring best practices like putting JavaScript at the end of the document instead of in the head). Recently, there has been a lot of research around critical-path optimization. Once that becomes more mainstream, it will go a long way toward making things feel better.
  • Loading extraneous libraries, ads, plugins, etc. Critical-path optimization could help here, but getting developers off of the framework addiction (both JavaScript and CSS frameworks) would go further.
  • Extraneous images. With the proliferation of retina devices and larger screens, designers keep stretching their legs to fill the available real estate. That often means bigger images. Responsive images are the answer to this problem. They are taking off, but CMSs need to catch up before they’ll make a big impact.
  • Web fonts. Now that we have them, we are finding out they hurt a lot. Caching fonts to localStorage helps, and the W3C is working on the CSS Font Loading API.
  • The interpreted-code penalty. The penalty for all that interpreted code is shrinking thanks to heroic efforts like LLVM-backed JavaScript compilation in Safari.
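To make the responsive-images point above concrete, here is a minimal sketch using the `picture` element and `srcset` (the file names and breakpoint are made up for illustration):

```html
<!-- The browser picks the smallest candidate that satisfies the current
     viewport and device pixel ratio, so a phone never downloads the
     huge hero image meant for a retina desktop. -->
<picture>
  <source media="(min-width: 60em)"
          srcset="hero-large.jpg 1x, hero-large@2x.jpg 2x">
  <img src="hero-small.jpg"
       srcset="hero-small@2x.jpg 2x"
       alt="Hero image">
</picture>
```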

Browser vendors and web developers are working to get more performant. It took a long time for web standards to get full traction, and there are still people who aren’t doing responsive design. A sea change doesn’t happen overnight. The entire web community isn’t going to embrace everything instantaneously, nor should it, since that kind of mindset is what led to this framework-a-day trend. If you look at leaders in the industry, you’ll see companies that are doing a really good job with performance. Google is really fast. Facebook and Amazon seem pretty fast for the type of site they are. The web can be fast.

So, why is Facebook’s Instant Articles fast? Because caching. The same thing that made Instapaper load articles fast is what makes Instant Articles fast. This isn’t about native. It’s about pre-fetching data and using technologies that would work on the web platform today if developers were so inclined.
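The caching pattern at work can be sketched in a few lines of JavaScript. This is an illustration, not Facebook’s actual implementation: the `Map` stands in for a browser-side cache (a Service Worker Cache, say), and `fetchFn` stands in for a real network request.

```javascript
// Sketch of "fast because caching": serve a cached copy instantly,
// refresh it in the background (stale-while-revalidate), and optionally
// pre-fetch content before the user ever asks for it.
const cache = new Map(); // stands in for a Service Worker Cache or app database

async function cachedFetch(url, fetchFn) {
  // Kick off a background refresh either way.
  const refresh = fetchFn(url).then((body) => {
    cache.set(url, body);
    return body;
  });
  // If we have a cached copy, answer instantly; the refresh lands later.
  return cache.has(url) ? cache.get(url) : refresh;
}

// Pre-fetching is just warming the cache while the user isn't looking.
function prefetch(url, fetchFn) {
  return fetchFn(url).then((body) => cache.set(url, body));
}
```

The "instant" feel comes entirely from the first branch of `cachedFetch`: the slow network round trip still happens, just never while the user is waiting on it.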

Unlike web pages, people see apps as an ongoing relationship rather than a single transaction. It’s not that apps perform better as a blanket statement (try opening Mail.app to get your e-mail versus going to gmail.com in an already-open browser on an older system, for example). It’s that the app platform lets the developer more easily take advantage of caching and other performance enhancements (e.g. compiled code, which is arguably a form of caching code for the processor, whereas web pages are “compiled” after load).

Brent Simmons lamented that single web pages are approaching 2MB when his entire app is 5MB. I tweeted at him (Part 1 and Part 2) that the chrome of a website isn’t 2MB. It’s the content. Just like an app, the wrapper for the content may be lightweight, but the actual content might be enormous. The difference is, as Brent mentions, that the app is cached locally. Taking that a step further, an app can download content without the user initiating the transaction, so the app feels faster because the slow part happened while the user wasn’t paying attention. Web pages may be able to cache some things locally, but they can’t request data in the background, since web pages get suspended when they aren’t open.

Apps would be slow, too, if not for these platform optimizations. If you looked at an app as a single transaction (find it on the App Store, purchase, download and install, open, find the single bit of content you want, close the app, and delete it), apps would be slow as hell compared to a website.

At this point, I fear I’m going to endlessly ramble, so I’m just going to wrap up before I end up devoting a third night to this (something about not letting perfection be the enemy of the good).

A lot of really smart people at really important companies, including the people making native app platforms, are working hard to make the web platform better. If you talk to those people, they see the web as an imperative. They aren’t just pulling another Internet Explorer like in the ’90s. It’s about interoperability and furthering the web platform, a cross-platform effort that keeps the web bigger than any app platform. While a lot of users are doing a lot of stuff in apps, that likely has more to do with the proliferation of apps on wildly popular mobile phones than with a “decline in relevance” of the web. The mission of the web reaches much further than any app platform. Apps may be getting a lot of use, and they may be faster (for now, anyway), but the web isn’t going away just because apps are doing well.