Velocity Conf: Building a Faster & Stronger Web

Velocity Conf knocked my socks off. This was my first O’Reilly conference and I can really see what the hubbub is all about. Velocity was host to many top industry pioneers, like the dudes from Etsy who created StatsD, Mitchell Hashimoto, the creator of Vagrant, and reps from Opera, Mozilla, and Google, among other big names.

The conference was split into a Venn diagram of operations, development, and devops, so it was easy to experience talks on the fringes of most attendees’ skill sets. Since I’m mostly into development and UX, the web performance track was my home turf, but I did learn some operations material that will help me level up beyond merely being able to scale my meager home NAS server. Many of the pure operations talks dealt with visualization of systems, and it was nice to hear the discussion draw on many of the HCI principles we use on the Intridea UX team on a day-to-day basis.

There was so much material on the web performance side of things that I could go on for days, but I’ll just share a few of my favorite tips from Velocity in this debriefing. I often see front-end developers and engineers struggle with exactly how to measure and address web performance issues, and many of the Velocity presenters covered ways to effectively optimize page load (and yes, image compression was one of the things mentioned).

DOMinate the Document Object Model

Ok, so let’s think about DOM rendering: it happens serially, right? That means we have to make sure we don’t structure our markup in a way that severely blocks the “thread” while loading. Modern browsers have implemented workarounds like “speculative loading” to download resources while still parsing the rest of the DOM. This is all well and good, but speculative loading can still fail if we have inline script tags that use document.write() to append markup to the document. That is a sure-fire way to block the DOM. Not all document.write() is entirely evil, but one should definitely be wary of it.
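
To make that concrete, here’s a quick sketch (the widget URL is made up) contrasting a parser-blocking document.write() with the dynamic-insertion pattern that keeps the parser moving:

    <!-- Blocks: document.write() can invalidate the speculative parse, and the
         injected script must download and run before parsing continues. -->
    <script>
      document.write('<script src="http://example.com/widget.js"><\/script>');
    </script>

    <!-- Friendlier: create the script element via the DOM (or use the async
         attribute) so the download happens in parallel with parsing. -->
    <script>
      var s = document.createElement('script');
      s.src = 'http://example.com/widget.js';
      s.async = true;
      document.getElementsByTagName('head')[0].appendChild(s);
    </script>

The second version is essentially the pattern that async analytics and ad snippets have moved toward, for exactly this reason.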

Something cool that Chrome for Android is doing is spinning up multiple processes when loading a document, so true concurrent DOM rendering is probably coming in the near future. The sooner a user sees the browser paint elements on the screen, the faster they will perceive the page to be loading. You never want to give them the “white screen of death”.

Optimization for Mobile

With responsive design all the rage (and with good reason), there are special considerations to make when optimizing for multiple devices. Jason Grigsby drilled down into this at Velocity in his talk “Performance Implications of Responsive Design”. We obviously want to limit the size of any asset delivered to a mobile device, but the W3C spec still needs to catch up with an image tag that allows multiple sources for multiple breakpoints. Until then, we have this:

Picturefill, a JS lib that lets us specify multiple image sources with data attributes. In my opinion, the current landscape of responsive design feels very much like the days when CSS and semantic markup first came into vogue. Browsers and the W3C spec will need to catch up, and until then we will have to put some hacks in place to improve the UX.
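
For reference, the markup looks roughly like this (a sketch based on the Picturefill README at the time; attribute names may differ in other versions, and the file names are placeholders):

    <div data-picture data-alt="Sunset over the bay">
      <div data-src="photo-small.jpg"></div>
      <div data-src="photo-medium.jpg" data-media="(min-width: 400px)"></div>
      <div data-src="photo-large.jpg" data-media="(min-width: 800px)"></div>
      <!-- Fallback for browsers without JavaScript -->
      <noscript>
        <img src="photo-small.jpg" alt="Sunset over the bay">
      </noscript>
    </div>

The script evaluates the data-media queries and injects an img for whichever source matches, so smaller screens only pull down the smaller files.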

Tools

Now for the tools…

The W3C now has a couple of recommendations in the works for Timing APIs that measure a slew of attributes surrounding page speed. They are super easy to use, too; all you need to do to leverage them is:

window.performance

…and BAMMO, you’ve got yourself an interface from which you can piece together just about any metric for page load, memory allocation, etc. that you want.
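
As a minimal sketch (assuming a browser that implements the Navigation Timing API), here’s how you could log time-to-first-byte and total page load time once the page finishes loading:

    // Read the Navigation Timing numbers after onload has finished.
    // (loadEventEnd isn't populated until the load handler returns,
    // hence the setTimeout.)
    window.addEventListener('load', function () {
      setTimeout(function () {
        if (!window.performance || !window.performance.timing) { return; }
        var t = window.performance.timing;
        var ttfb = t.responseStart - t.navigationStart;    // time to first byte
        var pageLoad = t.loadEventEnd - t.navigationStart; // full page load
        console.log('TTFB: ' + ttfb + 'ms, page load: ' + pageLoad + 'ms');
      }, 0);
    });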

If you just want to get a good rundown of these metrics, but don’t want to build it yourself, then use the PageSpeed Critical Path tool, a project headed by Bryan McQuade at Google. Bryan, Patrick Meenan, Dallas Marlow, and Steven Souders went over the tool in depth at Velocity, and you can see their presentation here.

A Stronger, Faster Web

Velocity’s theme centered on “Building a Faster and Stronger Web.” What amazes me is that after leaving the conference I already feel more confident in my ability to begin building a faster, stronger web.

Velocity was a conference that didn’t disappoint. It wasn’t a dull rehash of overdone presentation topics and speakers; it offered interesting panels and presentations on a variety of really engaging topics, all centered around that single theme. I’m looking forward to heading back next year and learning what it will be like building the web in 2013!