Monthly Archives: July 2012

Popcorn and Popcorn Maker Testing – This is How We Do It


No jokes. I literally had that song pop into my head when trying to think of a title. Enjoy.

My primary focus over this summer has been, and will be, the testing we have in place for Popcorn Maker. It’s been severely lacking and we need to change that. However, I wanted to take the time to explain everything we do in both projects: what we use and why. First, let’s talk about testing in general.

So, Why is it Important?
This may seem like a dumb thing to talk about, but I’m sure someone will want to know, and for completeness’ sake I’m going to cover it briefly. No matter what kind of software you are writing – library or full-fledged application – what you are working on will always have expected functionality. You want robust testing because it allows you to easily check against this expected functionality and know whether you have broken any of it before you bring your changes out of development and into production.
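To make that concrete, here’s a tiny example in Python of pinning down expected functionality with unit tests. The `clamp` function is made up for illustration; the point is that once its expected behaviour is written down as tests, any later change that breaks it fails loudly instead of slipping into production:

```python
import unittest

# Hypothetical library function whose behaviour we want to lock down.
def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(value, high))

class ClampTests(unittest.TestCase):
    def test_value_inside_range_is_unchanged(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_value_below_range_is_raised_to_low(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_value_above_range_is_lowered_to_high(self):
        self.assertEqual(clamp(42, 0, 10), 10)

# Run with: python -m unittest <this_module>
```

If a refactor later changes `clamp` to use an exclusive range, these three assertions catch it immediately.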

Popcorn Testing
Unlike Popcorn Maker, Popcorn already has a great, robust testing environment. Its test suite is rather massive (the number of tests is in the thousands) and it’s all QUnit based. For Popcorn it’s a perfect match because we can easily test ANYTHING in our library without breaking a sweat. It works perfectly for any JavaScript library that isn’t reliant on a UI of any sort. The main selling point is that it allows you to easily break tests down into groups and report back useful data about any passing/failing tests. For us it really gets the job done well and lets us integrate nicely with other services such as Test Swarm. More on that later.

Popcorn Maker Testing
Popcorn Maker is a bit different here. Yes, it is all 100% JavaScript, so from far away it would seem that QUnit would cover the job very well here too. Unfortunately it can only go so far, because a lot of things need to be tested on the UI elements themselves, not just the core libraries/modules that run everything in the background. We have many QUnit-based tests for our core modules, many of which I recently finished polishing up to give them the attention they deserved, along with writing many new ones. I also wrote a nice test harness that runs all of our tests together, making it really easy to check over everything at once.

However, in the end this doesn’t solve the problem of the UI testing we need. In the past, Chris and I played with the Selenium IDE to create some tests written in Python based around our testing template. These actually worked, kindofish (totally a word, by the way), until we implemented our own DragNDrop, which broke them. On top of that, once we used iframes for our editor and dialogs, it was difficult to interact with them using the older Selenium RC. Recently I took a look at another option they have, the Selenium WebDriver API, to see if it – along with all the changes we have been making – would change the results we were getting in the past.

Sure enough, it did! I was able to get some basic tests working for our Export Dialog boxes, and some basic drag and drop action going on track events. Here’s the code I wrote for the export dialog tests:

import unittest
import time
from selenium import webdriver
from selenium.webdriver import ActionChains
from selenium.webdriver.common.keys import Keys
from selenium.common.exceptions import NoSuchElementException

class PM_HeaderTests(unittest.TestCase):

  def setUp(self):
    self.driver = webdriver.Firefox()
    self.action = ActionChains( self.driver )

  def test_export_dialog(self):
    driver = self.driver
    templateTitle = "Basic template"
    driver.get( "http://localhost:8888/templates/test.html" )
    driver.maximize_window()

    self.assertEqual( templateTitle, driver.title )

    # Ensure the button is present on the page, then click it
    sourceButton = driver.find_element_by_id( "butter-header-source" )
    self.assertTrue( sourceButton )
    sourceButton.click()

    # Ensure an element with the class_name is present
    # Only ever have this class on the page at one time
    sourceDialog = driver.find_element_by_class_name( "butter-dialog" )
    self.assertTrue( sourceDialog )

    # Get JSON, HTML button in export dialog and their text areas
    jsonButton = driver.find_element_by_class_name( "json-button" )
    htmlButton = driver.find_element_by_class_name( "html-button" )
    jsonTextArea = driver.find_element_by_class_name( "json-export" )
    htmlTextArea = driver.find_element_by_class_name( "html-export" )
    # Ensure the elements were on the page and retrieved
    self.assertTrue( jsonButton )
    self.assertTrue( htmlButton )
    self.assertTrue( jsonTextArea )
    self.assertTrue( htmlTextArea )

    # Check that the appropriate display properties were set when switching to the JSON
    jsonButton.click()
    self.assertEqual( jsonTextArea.value_of_css_property( "display" ), "block" )
    self.assertEqual( htmlTextArea.value_of_css_property( "display" ), "none" )

    # Check that the appropriate display properties were set when switching to the HTML
    htmlButton.click()
    self.assertEqual( jsonTextArea.value_of_css_property( "display" ), "none" )
    self.assertEqual( htmlTextArea.value_of_css_property( "display" ), "block" )

    # Check that pressing the ESCAPE key closes the dialog
    sourceDialog.send_keys( Keys.ESCAPE )
    self.assertRaises( NoSuchElementException,
                       driver.find_element_by_class_name, "butter-dialog" )

    # Reopen the dialog, then check that the explicit close button closes it too
    sourceButton.click()
    dialogCloseButton = driver.find_element_by_class_name( "close-button" )
    dialogCloseButton.click()
    self.assertRaises( NoSuchElementException,
                       driver.find_element_by_class_name, "butter-dialog" )

    # Brief pause so the final state is visible before the browser closes
    time.sleep( 2 )

  def tearDown(self):
    self.driver.quit()

if __name__ == "__main__":
    unittest.main()

One of my first few stabs at Python, so don’t judge me. To summarize, when run this will:

1. Open a Firefox Browser.
2. Navigate to our test page.
3. Maximize the window.
4. Press the View Source button in our header (Export Dialog).
5. Press the View JSON button.
6. Check appropriate display styles for both JSON and HTML text areas.
7. Press the View HTML button.
8. Recheck display styles for both text areas.
9. Check if closing with the ESCAPE key works as expected.
10. Press the View Source button again and then try closing with the explicit close button.

What I’m really hoping to find is some better, callback-like system so I know when things are done. Right now things seem fine, but I’m half worried I will run into asynchronous issues in the future.
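One pattern that approximates a callback here is an explicit polling wait: keep evaluating a condition until it returns something truthy or a timeout expires. This is essentially what Selenium’s own WebDriverWait does; below is a minimal, framework-free sketch of the idea (the `wait_until` helper and its parameters are my own, not part of Selenium):

```python
import time

def wait_until(condition, timeout=10, poll_interval=0.5):
    """Poll condition() until it returns a truthy value or timeout expires.

    Returns the truthy value, or raises RuntimeError on timeout. With
    Selenium you might pass something like:
        lambda: driver.find_element_by_class_name("butter-dialog")
    """
    end_time = time.time() + timeout
    while time.time() < end_time:
        try:
            result = condition()
            if result:
                return result
        except Exception:
            # e.g. NoSuchElementException while the dialog is still opening
            pass
        time.sleep(poll_interval)
    raise RuntimeError("Timed out after %s seconds" % timeout)
```

In the export dialog test above, something like this could replace the `time.sleep( 2 )` guesswork: instead of sleeping a fixed amount and hoping the UI has settled, the test blocks only as long as the condition actually takes to become true.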

Continuous Integration – Test Swarm
This is actually something we have sort of had for a while now; however, we never used it much. I would say it boiled down to two factors:

1. We haven’t spent nearly as much time on Popcorn over the last few months, and it was the only project Test Swarm worked with at the time.
2. It wasn’t very convenient to use. The way we had it set up to listen for and run jobs was clunky and cumbersome.

Luckily, this has changed thanks to more work done by Jon bringing Botio into the project, and by Chris, who not only updated our test swarm setup but also added integration into our botio system. We use a pull-request-based system for every change we make to the project, meaning for every ticket where code changes are being made, a pull request is opened to help review the code. With botio listening for various commands, we can comment on the pull request with a particular command and then have the results sent back to the pull request. It’s helped make things very efficient for us, and once we add in Chris’ work for /botio try to run a job on our test swarm instance, we will be golden.
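To illustrate the idea (this is a hypothetical sketch, not botio’s actual code), this kind of bot boils down to scanning pull-request comments for a command prefix and dispatching on the command name:

```python
import re

# Hypothetical command pattern; real botio command names and syntax may differ.
COMMAND_RE = re.compile(r"^/botio\s+(\w+)(?:\s+(.*))?$")

def parse_botio_command(comment):
    """Return (command, argument) if the comment is a /botio command, else None."""
    match = COMMAND_RE.match(comment.strip())
    if not match:
        return None
    return match.group(1), match.group(2) or ""
```

So a comment like `/botio try` parses to the `try` command with no argument, while ordinary review comments are ignored; the bot would then run the corresponding job and post results back to the pull request.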

Hope you enjoyed my mini novel on how we go about testing for Popcorn and Popcorn Maker. There will definitely be more to come later on our advances with Selenium. If you want to learn more about what Chris has been doing, check out some of his blog posts, such as Time to Automate Butter Unit Testing and Popcorn and Butter Testing.

Popcorn – Focus, Take Aim and Fire!

Big things are coming. Big decisions have been made.

The Popcorn Maker project has been scrapped.

Just kidding.

Anyway, a lot has gone down for us over the last 12 days. For starters, we have some final work to finish off on our 0.6 release of Popcorn Maker and our 1.3 release of Popcorn. Both are mostly polishing releases for the work we have done up until now. Popcorn itself is pretty mature and doesn’t need many big-ticket features to come in, but Popcorn Maker is in the middle of some big decisions about what we want to ship in our 1.0 for the Mozilla Festival in November. So the need for big things to happen right now isn’t very high; we just need to get more minor fixes done.

Popcorn Maker has so many big, lofty goals. It really has a bright future with all the cool things we want to do. Unfortunately, to make our deadline we are forced to cut some of these features from landing in 1.0. Easily one of the coolest and most promising things we were considering was mobile support. Thanks to David Serfried and Robert Stanica we had a promising glimpse at getting some of this working. It’s disappointing to see it axed, but we have more important things that need to be working by then.

One of those is developing some good templates. While all the new ones we have are really cool, we also realize that they are kind of hard to understand and don’t function very well beyond the things they were originally provided with. Kate Hudson is steering the ship behind this, with some great designs and ideas on not only templates but how we can improve the overall Popcorn Maker ecosystem to simply be friendlier and easier to pick up.

TESTS!
I’m going to finish these basic unit tests this week. Period. I feel really terrible for letting them hang this long, and I’m going to make sure I put a stop to that by finishing them up this week. Part of the reason I feel terrible about leaving them hanging for so long is that I know someone else could easily have done them much, much quicker than me. If someone else had either the interest in doing them (wait… sorry, had to stop myself from choking there) or simply the time, they could have done them in probably a day or two, and even that I think is being generous. It’s time for me to bang these out because I want to move on to something else. I want to own something else in our project, or be a key part of some other area of it. Right now I’ve had my hands dipped into many corners of the project, but nothing that was specifically mine, started from scratch. Maybe that’s working with Kate on templates, maybe that’s working on cross-browser support. We still need to sort out who is doing what for these new features coming out of the big meeting last week.

Combine those tests with the work Chris is doing for test swarm and we should be pretty golden. I just need to get my act in gear.