Saturday, September 26, 2009

Browserscope & Chrome Frame

I imagine many front-end developers out there are, like me, pretty thrilled about this week's announcement of Google Chrome Frame. I've always been an admirer of Alex Russell's writing and work. To be fair though, I was once also a very excited Flash developer who thought I'd end up writing code for a single engine/consumer. I've felt that burn, but I'm still really just a kid at heart - so I'm psyched.

Chrome Frame presented a pretty interesting challenge for Browserscope - I only wish we'd solved it fully before this morning. The problem is in detecting the user agent. When Chrome Frame first came out, I thought Browserscope wouldn't need any quick changes, because we wouldn't be serving the <meta http-equiv="X-UA-Compatible" content="chrome=1"> tag. Then Steve read about the "cf:" way to run Chrome Frame the next morning - which, unfortunately, meant we had a problem.

Browserscope originally did its user agent classification based on the HTTP_USER_AGENT header sent from the browser when you share your test results. Chrome Frame adds a "chromeframe" token within the IE user agent string, so at first we thought we could detect Chrome Frame based on the presence of that token. As it turns out, though, IE includes the "chromeframe" token whether or not Chrome Frame has been invoked on the page - it's still an installed plug-in either way. And we wanted to be able to distinguish the IE and Chrome Frame results, which someone could submit by using the cf: prefix.
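Our first instinct can be sketched roughly like this (a hypothetical illustration - the function name and UA strings are mine, not Browserscope's actual code):

```python
# Hypothetical sketch: the naive server-side check, using only the
# HTTP_USER_AGENT header - and why it is not sufficient on its own.

def looks_like_chrome_frame(http_user_agent):
    """True if the header contains the "chromeframe" token."""
    return "chromeframe" in http_user_agent.lower()

# IE with the plug-in installed sends the token whether or not Chrome
# Frame was actually invoked for the page, so the header alone cannot
# distinguish an IE-proper result from a Chrome Frame result.
ua = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; chromeframe)"
assert looks_like_chrome_frame(ua)  # True in both cases
```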

So then we thought we could just detect whether the client-side rendering engine was WebKit, which is doable - it turns out that if you ask the browser for window.navigator.userAgent, you get a typical Chrome user agent string (type cf:about:version into IE with Chrome Frame installed). It's kind of a bummer that at the client level you can't tell that you're in Chrome Frame versus Chrome proper, but the Chrome Frame team may tweak the Chrome Frame user agent string if enough people run into this issue. Functionally, it should behave a lot like Chrome proper; Browserscope itself should be able to report if and when that isn't the case.

While we'd anticipated new user agents on Browserscope, we hadn't imagined a hybrid sort of user agent. We had to make some changes under the hood to support this in the back end (thanks slamm!). Now you should be able to go to Browserscope and compare your test results for IE and Chrome Frame (if you have it installed). Parsing-wise, we're grouping Chrome Frame along with the browser family (IE at this point) and major version (6|7|8) in the UI. We'd love your feedback on that decision.
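In rough terms, the hybrid classification combines what the server sees with what the client reports. Here's a minimal sketch, assuming illustrative UA strings; `classify` is a hypothetical name, not Browserscope's actual parser:

```python
import re

# Hypothetical sketch of the hybrid approach: combine the server-side
# HTTP_USER_AGENT header with the client-reported
# window.navigator.userAgent value.

def classify(http_user_agent, client_user_agent):
    ie = re.search(r"MSIE (\d+)", http_user_agent)
    if ie and "chromeframe" in http_user_agent.lower():
        # The plug-in is installed; it was *active* only if the page
        # actually rendered in WebKit, which shows up client-side.
        if "AppleWebKit" in client_user_agent:
            return "IE %s / Chrome Frame" % ie.group(1)
        return "IE %s" % ie.group(1)
    return "Other"

ie_header = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; chromeframe)"
chrome_client = "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/532.0 Chrome/4.0"

assert classify(ie_header, chrome_client) == "IE 8 / Chrome Frame"
assert classify(ie_header, ie_header) == "IE 8"
```

The point is that neither string alone is enough: the header tells you the plug-in exists, and only the client-side value tells you it actually rendered the page.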

Now, not only can you run the tests on Browserscope in Chrome Frame, but once you do, you'll experience Browserscope itself in WebKit (check out all the rounded corners and text-shadows). And how awesome is WebKit?! Of course, you can always toggle the checkbox on the homepage to switch back to IE proper.

This experience reminded us on the Browserscope team how crucial it is to parse user agent strings "correctly" - which in this case requires a combination of client and server information! So I'll make a plea for folks to check out and add comments to a design doc for a user agent string parsing project we'd love to see take shape. Want to build it?

Sunday, September 13, 2009

Announcing Browserscope

I'm excited today to announce the release of Browserscope.

Browserscope is an open-source project for profiling web browsers and storing and aggregating crowd-sourced data about browser performance.

The goals are to foster innovation by tracking browser functionality and to be a resource for web developers.

Browserscope is based on Steve Souders' UA Profiler, and his original tests have been preserved here as the Network test category. Other test categories include Ian Hickson's Acid3 test (ported by Jacob Moon into Browserscope), Annie Sullivan's Rich Text Edit Mode tests, and John Resig's Selectors API Test Suite (ported by Lindsey Simon into Browserscope).

The Advantages of Crowdsourcing
The ability for users to contribute results is the key for Browserscope's longevity, accuracy, and currency.
  • No dedicated test resources are required, enabling the project to run in perpetuity
  • Tests are run under a wide variety of real world test conditions
  • Aggregating results reduces selection bias
  • New browsers show up immediately due to developer testing
And this is where you come in! Click the button below to run your browser through the tests on Browserscope.

Below are some ideas culled from the project issue tracker on Google Code. Do you have some ideas? Add them, or feel free to check out and work on the code. We'd love to get your patches (as directory diffs) and review them for inclusion in Browserscope!
  • Visualize test result trends over time
  • Wall of fame, up-and-comers, Billboard top 50
  • More test categories - cookies, security, reflow
  • More contributors
  • Tagged/personalized test results
  • Normalize time-based results across platforms
  • User agent parsing library


I want to extend my sincere and deep gratitude to everyone who's offered feedback, worked on the codebase, or otherwise helped Browserscope along. Specifically I want to thank:
Steve Souders: you've been a great mentor to work with, and your ability to think BIG and beyond is something I hope continues to rub off on us all a little ;) 
Steve Lamm: I've never worked with anyone who writes code that is of such high quality and readability. Browserscope's backend would not be the same without you.
Annie Sullivan: You're quick, wise, and totally fun to work with - I also really hope that Rich Text Edit mode development will get easier thanks to your dedication.
John Skidgel: Design master, cohort, and all-around tolerator - you love problems, you have solutions.
Brett Slatkin and the App Engine team: Your insight into the overall system design was critical to our being able to deploy this on App Engine, and the Task Queue API rocks.
Jacob Moon: We sure got lucky that you provided the first outside-the-project developer experience! Thanks for bringing the Acid3 tests (and others) into Browserscope and for showing us how doable it is to port new tests into the system.
Christian Stockwell and the IE team: Your feedback was essential to uncovering issues regarding selection bias; hopefully the reflow tests will make their appearance again soon, once we've done more to address that issue.
Mike Belshe and the Chrome team: Your advice and feedback re: benchmarks versus compatibility testing was crucial. Y'all run the tightest ship on the planet and it shows - i <3 Chrome.
David Baron, Dion Almaer, Ben Galbraith, John Resig and the Mozilla team: Your ideas and suggestions are present in many parts of the codebase - and additionally I couldn't have made the site look nearly as decent without using Firefox.
Robert Bowdidge: Thanks for inspiring me to work on this.
Lastly a big thanks to Google for giving me 20% time to work on this project for the last year.