You Can Test Everywhere

Reid Burke

BayJax 2013

But not everything

How deep are your unit tests?

Kent Beck on Stack Overflow

I get paid for code that works, not for tests, so my philosophy is to test as little as possible to reach a given level of confidence.
If I don't typically make a kind of mistake (like setting the wrong variables in a constructor), I don't test for it.
When coding on a team, I modify my strategy to carefully test code that we, collectively, tend to get wrong.

Kent Beck, known for advancing Test-Driven Development

Write code that works

Testing shows the presence, not the absence of bugs

— Edsger W. Dijkstra, NATO Software Engineering Conference 1969 (p. 16)

Testing is a tool

Another tool

Continuous Integration

Clean environment for your tests

Keeps you & your contributors honest

Joy of a single CI job

Broken Travis build email

Broken Travis build

Joy of a single environment

Node.js 0.10

Node.js 0.8

Multiple environments, multiple builds

Way more noise

More builds to check

More emails from CI
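
A minimal `.travis.yml` sketch of what multiplies environments into builds — each Node.js version listed becomes its own job, its own result, and its own email:

```yaml
# Minimal Travis CI config: two Node.js versions = two builds per commit
language: node_js
node_js:
  - "0.8"
  - "0.10"
script: npm test
```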

You make web apps

How many environments do you have?

PhantomJS

Joy of a single environment

PhantomJS

Do you visit websites with PhantomJS? Neither do your users.

Setting up automated browsers is hard

Handling test results from multiple environments is harder

YUI testing in 2012

Build box

Node.js

Selenium

IE 7, Firefox

YUI uses Sauce Labs

They love open source projects

YUI testing in 2013

Build box

PhantomJS, Node.js

Selenium

IE 6, IE 7, IE 8, IE 9, IE 10, Chrome, Firefox

Sauce Labs

IE 11, Safari, iOS

Yeti

yeti.cx

Automates tests written with QUnit, Jasmine, Mocha, Dojo, or YUI Test

No Selenium required on your computer

Works with Jenkins & Selenium

Open source
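
Getting started with Yeti looks roughly like this (commands from the `yeti` npm package; exact flags and output may differ between versions):

```shell
# Install the Yeti CLI, then run your HTML test pages.
# Yeti starts a hub and prints a URL; point any browser at it
# to enlist that browser, and results stream back to your terminal.
npm install -g yeti
yeti test/unit/*.html
```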

More speed

Yeti supports parallel testing

Per build, we do...

8 parallel instances of IE, Firefox & Chrome

3 parallel instances of Safari & iOS

Parallel builds, too

Check out Kochiku

Testing powered by Yeti

AJAX testing with echoecho

Browser launching

Parallel testing

Functional testing
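
The echo idea behind AJAX testing can be sketched as a pure handler: whatever the test sends comes straight back, so assertions never depend on a live backend. This is a hypothetical sketch, not echoecho's actual API:

```javascript
// Hypothetical echo handler: builds the response an echo server would send.
// AJAX tests can assert on round-tripped data without a real backend.
function echoHandler(request) {
  return {
    status: 200,
    headers: {
      'content-type': request.headers['content-type'] || 'text/plain'
    },
    body: request.body // echo the payload straight back
  };
}

// Wired into a real server, this would sit inside an
// http.createServer(function (req, res) { ... }) callback.
```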

11 of 15 tested on every commit

yuilibrary.com/yui/environments/

4 of 15 tested before each release

Automated but tested outside of CI

100% automated

100K+ tests per commit

Over 3 hours of testing in 90 minutes

More complex

21 jobs x 3 branches = 63 jobs

~3 VMs x 63 jobs = 189 browser VMs

~189 browser VMs + 63 slaves = ~252 machines

Over 252 moving parts
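
The machine count above is just multiplication stacking up; the same arithmetic as a quick sketch:

```javascript
// How 21 Jenkins jobs become ~252 moving parts.
var jobsPerBranch = 21;
var branches = 3;    // three branches under test
var vmsPerJob = 3;   // approximate browser VMs per job

var jobs = jobsPerBranch * branches;  // 63 jobs
var browserVMs = vmsPerJob * jobs;    // ~189 browser VMs
var machines = browserVMs + jobs;     // + 63 slaves = ~252 machines
```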

Too many jobs

Too many test results

Impossible workflow

Failing build email

Failing build email console log

Unstable build email

Unstable Jenkins build

Unstable Jenkins tests

Unstable Jenkins unit tests

Unstable Jenkins unit test detail

Repeat for a dozen builds

Use the dashboard?

Did my latest commit pass? Did code break the build? Or is it flaky?

Confusing CI dashboard

Flaky infrastructure

Sauce Labs, Jenkins, build slaves, Selenium, Yeti server, what will fail first?

Flaky tests

98% of the time, it works every time

Build numbers instead of commits

Build #581 was for which commit again?

yui-dev_3_x-selleck-selastic-ie-576

yui-dev_3_x-selleck-selastic-ie-archaic-581

Too many results

Jenkins becomes slow

Nobody responds to build failures

Too hard to understand what needs fixing

Our solution: yo/tests

Classify flaky tests

yo/tests understands what tests are flaky

Hide flaky test results

Do not alert developers to flaky test failures

Prevents panic from bad tests
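
One way to classify flakiness — a hypothetical sketch, not yo/tests internals — is to re-run failures: a test that both fails and passes across retries of the same commit is flaky, and flaky failures are hidden rather than alerted:

```javascript
// Classify a test from its pass/fail results across retries.
// results: array of booleans, true = pass, false = fail.
function classify(results) {
  var passed = results.indexOf(true) !== -1;
  var failed = results.indexOf(false) !== -1;
  if (passed && failed) return 'flaky';
  return passed ? 'passing' : 'failing';
}

// Only consistent failures reach developers; flaky noise is hidden.
function shouldAlert(results) {
  return classify(results) === 'failing';
}
```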

Hide flaky infrastructure

Engineers do not care if a CI slave is down

Prevents panic from bad infrastructure

Commits, not build numbers

As builds complete, they are organized around commits

Developers know exactly what code broke
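
Organizing results around commits instead of build numbers can be sketched as a simple re-keying — hypothetical, not yo/tests' actual data model:

```javascript
// Re-key completed builds by commit SHA instead of CI build number,
// so one commit's status aggregates every job that ran against it.
// builds: array of { number, sha, passed }
function groupByCommit(builds) {
  var byCommit = {};
  builds.forEach(function (build) {
    var entry = byCommit[build.sha] ||
      (byCommit[build.sha] = { builds: [], passed: true });
    entry.builds.push(build.number);
    entry.passed = entry.passed && build.passed;
  });
  return byCommit;
}
```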

Unstable build email

Red means code broke? Bad infra? What commit was this?

Confusing CI dashboard

Now, with yo/tests

yo/tests branch overview

yo/tests commit detail

yo/tests recent builds

yo/tests commit unstable tests

Shipping quality YUI releases

release-3.13.0

yo/tests release overview

yo/tests release rows

yo/tests builds

Highlight to take action

IE: 60 failing tests

4 failing tests

0 failing tests

Did my commit pass?

yo/tests branch overview

yo/tests commit detail

Wins

Prevent panic from bad tests

Prevent panic from bad infrastructure

Developers know exactly what code broke

How To Win

Classify flaky tests

Hide flaky test or infra failures

Commits, not build numbers

We are not done

yeti.cx

Works with Sauce Labs