Sunday, December 18, 2011

SVN Commit Hooks For a Better Codebase

This is a cross post from the Wayfair Engineering Blog

As we have mentioned before, the main source control system we use at Wayfair is SVN, with TortoiseSVN as our client. One of the things we love about SVN is the ability to add commit hooks, or checks that run when someone tries to commit a file to source control. By having a few key checks we can prevent bugs, ensure consistent coding practices, and generally have a cleaner codebase.

The first commit hook we added was a PHP lint check. This means that when you try to commit a PHP file we run php -l on it from the command line, ensuring that the file has no syntax errors:

Rejected commit due to a syntax error in a PHP file

This might seem like a paranoid check, and you might assume that this kind of thing never happens, but with our ASP code (where we have no syntax check) we saw people committing files with invalid syntax on a semi-regular basis. Sometimes people make the same change in a number of files and fail to test each one; sometimes mistakes happen when people are rushing. This commit hook is a quick and easy way to prevent a silly (but extremely dangerous) problem.
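For anyone curious what a hook like this can look like, here is a minimal sketch in Python (a hypothetical version for illustration; our actual hook may differ, and the svnlook/php invocations assume both are on the server's PATH). SVN invokes pre-commit with the repository path and a transaction ID, and anything the script writes to stderr before exiting non-zero is shown to the committer:

```python
#!/usr/bin/env python
# Hypothetical SVN pre-commit hook: reject commits containing PHP syntax errors.
# SVN invokes this as: pre-commit REPOS TXN
import subprocess
import sys

def is_php_file(path):
    """Only lint files with a .php extension."""
    return path.lower().endswith(".php")

def changed_files(repos, txn):
    """List paths added or updated in this transaction, via `svnlook changed`."""
    out = subprocess.check_output(
        ["svnlook", "changed", "-t", txn, repos]).decode()
    files = []
    for line in out.splitlines():
        parts = line.split(None, 1)           # e.g. "U   trunk/cart.php"
        if len(parts) == 2 and parts[0][0] in ("A", "U"):
            files.append(parts[1])            # skip deletes ("D")
    return files

def lints_clean(repos, txn, path):
    """Pipe the file's new contents through `php -l`; True if syntax is OK."""
    contents = subprocess.check_output(
        ["svnlook", "cat", "-t", txn, repos, path])
    result = subprocess.run(["php", "-l"], input=contents,
                            capture_output=True)
    return result.returncode == 0

if __name__ == "__main__" and len(sys.argv) == 3:
    repos, txn = sys.argv[1], sys.argv[2]
    failed = [p for p in changed_files(repos, txn)
              if is_php_file(p) and not lints_clean(repos, txn, p)]
    if failed:
        # stderr is relayed back to the committing client (e.g. TortoiseSVN).
        sys.stderr.write("PHP syntax errors in: %s\n" % ", ".join(failed))
        sys.exit(1)
```

The key design point is that the hook lints the transaction's pending contents (`svnlook cat -t`), not whatever happens to be on disk, so it judges exactly what would land in the repository.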

Sunday, December 4, 2011

Performance Review of the New "Read It Later"

Read it Later is an online tool and app that allows you to save the "one read wonders" that you find on the web for another time.  I use this service frequently, since I'm often checking Twitter or Google+ on my phone and seeing posts that I want to read but don't have time for right then.  It's easy to add a URL to Read it Later and pull it up on my desktop at home.  Overall it's a great service, and free.

When I started using Read it Later it had a really simple web interface, with no frills or fancy animations.  This made for a fast, but not particularly sophisticated product.

Read it Later is currently trying out a new and improved web interface that gives larger previews of your articles (including thumbnails), easier navigation and action menus, better use of the available screen real estate, nice animations when you mark things as read, and generally a cleaner and more modern look (among other improvements). 

Tuesday, October 25, 2011

Progressive Enhancement For a Faster Site

This is a cross post from the Wayfair Engineering Blog

Progressive Enhancement is often described as an alternate approach to “Graceful Degradation” – it encourages focusing on the most basic functionality first and then building out from there.  It also forms the core of the Yahoo! Graded Browser Support model, which we use as a guide for our own rules around browser support.  This is an important topic, but it has been covered fairly extensively in other articles, so I’m not going to dive into it too much here.  Instead I am going to talk about specific progressive enhancement techniques we use at Wayfair to improve site performance.

As you may know, there are some amazing new features in HTML5/CSS3 that make web development easier – rounded corners, drop shadows, gradients, placeholders in text fields, in-browser form validation, etc. – all of which reduce dependency on background images and JavaScript.  These are great features to have, but what about your IE 6, 7, and 8 users?  How about older versions of Safari and Firefox?  IE 6, 7, and 8 users make up over 35% of our customers, and we need to make sure that we give them a decent experience.

Tuesday, October 11, 2011

Switching from Classic ASP to PHP

This is a cross post from the Wayfair Engineering Blog

One of the big changes at Wayfair recently was moving all of our storefront code (well, almost all…we’re still working on our sessioned code) from Classic ASP (VBScript) to PHP.  The company was started in 2002 and at that time ASP was a common technology on the web, and one that our founders were familiar with.  After 8 years of working with it, we had pushed it to the limits and decided we’d get more benefit out of moving to a new technology.

Motivation for Switching

While ASP is still in extended support, as a language it hasn’t been actively developed for a number of years (I tried to find out exactly how many, but my Google searches were fruitless, which might tell you something).  It’s also a proprietary Microsoft language, so we were unable to make modifications ourselves, and any bugs we found were never going to be fixed.  ASP’s age also means that very few companies are still using it, so the community is small and there are basically no open source projects written in it that we can use.  It was also getting harder and harder to hire developers with Classic ASP experience. While training people isn’t hard to do, we would rather hire experts who are going to help us squeeze every ounce of performance and functionality out of a language.

Tuesday, September 20, 2011

September 2011 Site Performance Report

This is a cross post from the Wayfair Engineering Blog.

I recently saw this post on Etsy’s blog and found it inspiring – you don’t typically see retailers (or any companies for that matter) sharing performance data.  Even more impressive is the fact that they posted this on their primary blog as opposed to their engineering blog.  To me this says that Etsy really cares about performance, and wants all of their customers to be aware that it is a priority.  After reading that post I immediately wanted to share some Wayfair data – and this is my attempt to do that (maybe I can get it cross-posted to our main blog :-) ).

Here at Wayfair we care a lot about performance, and the business case for having a faster site is well documented.  To keep tabs on our site performance we have synthetic monitoring tools like Catchpoint, real user monitoring tools like Coradiant (now a part of BMC Software), and our own home-brew monitoring instrumented in our ASP and PHP code.  We have also had great success using StatsD in conjunction with Graphite.  Though we use the Ruby client, StatsD was started as an open source Node.js project at Etsy (I promise we aren’t stalking you guys).  The numbers that I am sharing are from Coradiant, and measure “host time”, which is defined as the time between the last packet of the request and the first packet of the response, as measured by a network tap.  Without further ado, here are the numbers for our highest traffic pages.  These numbers are all in milliseconds, and were measured between 9 AM and 5 PM Eastern Time on a Wednesday, so they should be indicative of real-world performance.
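As an aside, part of StatsD's appeal is how simple its wire protocol is: each metric is a plain-text UDP datagram like "name:value|type".  Here's a minimal sketch in Python (the metric name is hypothetical; real clients such as the Ruby gem add sample rates and batching on top of this):

```python
import socket

def statsd_payload(name, value, metric_type):
    """Format a metric in StatsD's plain-text protocol: "name:value|type"."""
    return ("%s:%s|%s" % (name, value, metric_type)).encode()

def send_timing(name, ms, host="localhost", port=8125):
    """Fire-and-forget a timing metric over UDP (8125 is StatsD's default port)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(statsd_payload(name, ms, "ms"), (host, port))
    sock.close()

# e.g. record that a page render took 320 ms of host time:
# send_timing("storefront.product_page.host_time", 320)
```

Because it's UDP and fire-and-forget, the instrumented page never blocks waiting on the metrics server – which is exactly what you want from monitoring code on a hot path.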

Monday, September 5, 2011

Sharing Performance Data

I am very impressed with Etsy's decision to share some of their performance data.  This is rarely done in the WPO world: companies regularly share relative data like "improving landing page performance by 2.2 seconds increased download conversions by 15.4%", but you almost never see absolute numbers.

Why don't companies share hard numbers?

The obvious answer is that they want to keep whatever competitive advantage they have in the speed department.  If a site starts sharing its performance data then competing sites have a benchmark to shoot for.  Other companies could also use that data as an advertising ploy, saying things like "don't shop at Etsy, our site is twice as fast!".  On top of that, performance numbers are somewhat complicated to share and compare.  As I talk about in my post on SLAs, the numbers can change dramatically depending on how you measure them.  Etsy has done an okay job of explaining their methodology: they specified that they are talking about server-side performance, and they gave both the average and the 90th percentile.  Since server-side performance is location and network independent, there isn't a ton of variability based on measuring tools.  If Etsy starts sharing full page load times, I hope they will give a bit more information about the tools they are using, to prevent people from making apples-to-oranges comparisons.

How fast is your website? Setting (and keeping) Web performance SLAs

This is a cross post from the Yottaa Performance Blog

I’m sure many of you have heard people at the office ask the question “How fast is our site?”  or some variation of it.  Many of you have probably also realized that this question is largely meaningless.  Anybody who can respond to this question with a single number is trying to sell you something.  There are a huge number of variables that go into measuring the load time of a website:
  • What kind of monitoring are you using, synthetic or real user?
  • If synthetic, where was the test run from, close to your datacenter or across the world?
  • What percentile are you looking at in the data?
  • How many tests (or people for RUM data) are you aggregating to get this number?
  • Which pages are you monitoring?
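To make the percentile point concrete, here's a quick sketch with made-up load times showing how much the "single number" moves depending on which percentile you report:

```python
def percentile(samples, pct):
    """Nearest-rank percentile: the value at or above pct percent of samples."""
    ordered = sorted(samples)
    # Nearest-rank method: ceiling of pct/100 * n, clamped to at least 1.
    rank = max(1, -(-pct * len(ordered) // 100))
    return ordered[rank - 1]

# Hypothetical page load times in milliseconds for ten requests:
load_times = [210, 250, 260, 280, 300, 340, 420, 560, 900, 2400]

median = percentile(load_times, 50)  # 300 ms -- "our site is fast!"
p90 = percentile(load_times, 90)     # 900 ms
p99 = percentile(load_times, 99)     # 2400 ms -- same data, very different story
```

Same ten requests, and the answer to "how fast is our site?" ranges from 300 ms to 2.4 seconds depending on which percentile you pick – which is why any single-number answer should make you suspicious.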


Welcome!  I figured it was time to start a blog, and I've owned this domain for months without using it for anything.  I've been holding out for, but sometimes you have to settle for what's available.  Most of the posts on this blog will be about Web Performance Optimization (WPO), Astronomy, or other things that interest me.  Let me know if you have any feedback - it's a work in progress.