Quality Assurance
Group: Operations, Features, Platform, Mobile, Special projects
Start: 2012-05-01
End:
Team: Mostly Wikimedia Release Engineering Team
Lead: Chris McMahon
Status: See updates
- Not to be confused with the Wikimedia Release Engineering Team.
- Getting started with Quality Assurance? Take a look at pages that are in both the QA and New contributors categories.
Quality Assurance at the Wikimedia Foundation (WMF) is about answering two questions:
- What should the software do? (Are we building the right thing?)
- How should the software function? (Are we building the thing right?)
The biggest issue facing Wikipedia today is that the number of editors has been declining steadily for some time. So the answer to the first question is: the software should increase the number of editors for Wikipedia.
QA testing works with the software development projects at WMF to answer the second question, "How should the software function?". The software development effort at WMF is divided into several projects:
- Core features is devoted to building significant features for Wikipedia that encourage and support editors.
- Flow, a modern discussion and collaboration system, is currently the Core features team's most important project.
- Growth is devoted to projects that bring in new editors. This team does a lot of A/B testing.
- We are building VisualEditor because having to learn wikitext is a barrier to editing Wikipedia, and sometimes a burden even to those who know it well.
- We are ramping up a project for better Multimedia support.
- Find out about Mobile software projects at the Mobile Gateway.
- Find out about Language software projects at the Language portal.
Contributing to testing: the most important thing any potential tester can do is to understand how the software to be tested should function. After that, contributing is just details.
Software testing
QA at WMF practices two main approaches to software testing: exploratory testing and automated browser testing.
Exploratory testing is "simultaneous learning, test design and test execution": in other words, test design and test execution happen at the same time. It is a powerful approach that every tester should know.
Our automated browser tests use Cucumber to define test scenarios and implement the Page Object design pattern. Most of the browser test code is in the repositories of the extensions being tested, but we manage some tests independently.
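For illustration only, here is a minimal sketch of the Page Object pattern in Python with Selenium; our actual tests are written with Cucumber and live in each extension's repository, and the class name, URL, and element locators below are assumptions rather than real test code.

```python
# Minimal Page Object sketch (illustrative only; names, URL, and locators are assumptions).
from selenium import webdriver
from selenium.webdriver.common.by import By


class SearchPage:
    """Encapsulates one page so tests call methods instead of raw selectors."""

    URL = "https://en.wikipedia.beta.wmflabs.org/wiki/Special:Search"  # assumed URL

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def search_for(self, term):
        box = self.driver.find_element(By.ID, "searchInput")  # assumed element id
        box.clear()
        box.send_keys(term)
        box.submit()
        return self

    def result_heading(self):
        return self.driver.find_element(By.ID, "firstHeading").text  # assumed element id


# A test step (as a Cucumber step definition would) uses only the page's methods:
def test_search_finds_article():
    driver = webdriver.Firefox()
    try:
        page = SearchPage(driver).open().search_for("Software testing")
        assert "Software testing" in page.result_heading()
    finally:
        driver.quit()
```

The point of the pattern is that test scenarios stay readable and stable: when a page's markup changes, only the page class needs updating, not every scenario that touches that page.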
Contributing to testing: an automated browser test that fails is an opportunity for exploratory testing. Exploratory testing that discovers a bug is an opportunity to write an automated test.
We have pages devoted to exploratory testing and automated browser testing.
Testing environments
We have two main test environments.
One test environment is known as "beta labs" or "the beta cluster". Here we run the latest version of the master branch of the wiki software and all extensions. The code on beta labs is updated automatically every few minutes, and the databases about every hour. On the beta cluster we test the most recent software features that are assumed to be viable. We do not host wild experiments or unsupported features on beta labs, only the latest version of the master branch of features to be deployed to production. At the moment, we do the majority of testing using the desktop and mobile versions of English Wikipedia and the desktop version of Wikimedia Commons.
As of January 2014, we have implemented a script that checks for fatal errors on beta labs every 12 hours and sends mail to the QA mail list if it finds any (a minimal sketch of such a check appears after the steps below). Anyone with developer access to Labs can read this log. The script itself resides here.
- Get Developer access
- ssh -A bastion.wmflabs.org (i.e. ssh -A <username>@bastion.wmflabs.org)
- from there, ssh deployment-bastion (or any deployment-foo host)
- cd /data/project/logs
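For illustration, here is a minimal sketch of what such a periodic check could look like when run from cron every 12 hours; the log file name, mail addresses, and local SMTP relay are assumptions, and the real script and its configuration may differ.

```python
# Minimal sketch of a fatal-error check (illustrative; path, addresses, and host are assumptions).
import smtplib
from email.message import EmailMessage

LOG_FILE = "/data/project/logs/fatal.log"   # assumed log file name under the logs directory
MAIL_TO = "qa@lists.wikimedia.org"          # assumed QA mail list address
MAIL_FROM = "beta-check@wmflabs.org"        # assumed sender address
SMTP_HOST = "localhost"                     # assumed local mail relay


def find_fatals(path):
    """Return log lines that look like PHP fatal errors."""
    with open(path, encoding="utf-8", errors="replace") as log:
        return [line for line in log if "Fatal error" in line]


def report(fatals):
    """Mail the matching lines to the QA list."""
    msg = EmailMessage()
    msg["Subject"] = "Fatal errors found on beta labs"
    msg["From"] = MAIL_FROM
    msg["To"] = MAIL_TO
    msg.set_content("".join(fatals[:100]))  # cap the body at 100 lines
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)


if __name__ == "__main__":
    fatals = find_fatals(LOG_FILE)
    if fatals:
        report(fatals)
```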
The other test environment is "test2wiki" or "test2". This environment is a node on the production wiki cluster, a peer wiki to English Wikipedia, Commons, etc. It is the first target for a potentially deployable branch of all the code, and is updated weekly, one week ahead of all the production wikis. At the moment, we do the majority of testing using the desktop and mobile versions of the site. The deployment schedule for test2wiki (which is in "group0") and the production wikis is always available on the Deployment wikitech page.
Contributing to testing: maintaining our test environments is a big job. If you are interested in contributing, Labs is the place to start.
Resources
The QA mail list is a great resource not only for testing Wikipedia software but also for general discussion of QA and testing practice.
We do issue tracking in Bugzilla.
Our source code is in Gerrit and is mirrored at GitHub.
We are on IRC in #wikimedia-qa on freenode.
Contributing to testing: there is a conversation about QA and testing going on in these channels all the time. Feel free to join, ask questions, and find out more.
Step-by-step instructions to get started: Quality assurance/gettingstarted
More information
Because our QA effort is spread across Wikimedia Engineering, we are not always 100% engaged with every project. We have a guide on when to use QA services.
We also collaborate with Bug management, Continuous integration, Wikimedia Labs and the testing plans of other Wikimedia Engineering teams.
Status
See also
- Bugs in MediaWiki Bugzilla, product Wikimedia, component Quality Assurance: all bugs, easy bugs
- The first week of training that QA staff receive
- Bug management & How to report a bug
- Continuous integration
- Wikimedia Labs