NattyServerQA
Please check the status of this specification in Launchpad before editing it. If it is Approved, contact the Assignee or another knowledgeable person before making changes.
Launchpad entry: Natty Server QA
Summary
Fine-tune the current efforts on server testing (including regression testing), SRU tracking, bug management, EC2 image refresh tests, and daily upstream builds.
Scope
Server Testing
* Standardise test output (should be done in sync with desktop and other Platform areas)
* test output should have a summary view (test area, test name, run date, number of passes/failures, etc.)
* review current tests; augment/improve as needed
* upgrade testing (from Maverick to Natty)
* prepare (with James Page) an as-soon-as-possible deployment of Hudson for QA; evaluate CloudBees as an interim option
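As a concrete illustration of the summary view mentioned above, here is a minimal sketch; the field names (`area`, `name`, `run_date`) and the one-line output format are assumptions, not an agreed Platform-wide standard:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TestSummary:
    """One row of the proposed summary view (field names are illustrative)."""
    area: str        # Platform area, e.g. "server" or "desktop"
    name: str        # test suite name
    run_date: date   # date the suite was run
    passed: int      # number of successes
    failed: int      # number of failures

    def line(self) -> str:
        """Render the summary as a single human-readable line."""
        return (f"{self.area}/{self.name} {self.run_date.isoformat()} "
                f"pass={self.passed} fail={self.failed}")

summary = TestSummary("server", "upgrade-maverick", date(2010, 11, 18), 42, 3)
print(summary.line())  # server/upgrade-maverick 2010-11-18 pass=42 fail=3
```

Agreeing on such a record format with the desktop team first would let every area emit the same summary lines, which a dashboard (e.g. Hudson) could then aggregate.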
Server Regression Testing
* whenever possible, a small test should be written for each fix; over time this will build up a collection of tests, which should be included in the QA Regression Testing project. Where applicable, tests should also be upstreamed.
* set up & run the applicable regression test whenever a new build of a package is accepted.
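A per-fix regression test can be very small, as in the following sketch; the bug number, the `parse_config_line` function, and the fixed behaviour are all hypothetical, only the shape of the test matters:

```python
import unittest

def parse_config_line(line):
    """Toy stand-in for a fixed code path: tolerate empty values (hypothetical fix)."""
    key, _, value = line.partition("=")
    return {key.strip(): value.strip()}

class TestBug123456Regression(unittest.TestCase):
    """Regression test tied to a hypothetical fix (the bug number is illustrative)."""

    def test_empty_value_does_not_crash(self):
        # Before the (hypothetical) fix, a line like "key=" raised an exception.
        self.assertEqual(parse_config_line("key="), {"key": ""})
```

Collected under the QA Regression Testing project, such tests can then be run in one pass (e.g. with `python -m unittest discover`) whenever a new build of a package is accepted.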
SRU Tracking
* discuss with the QA and Server Teams which SRU tasks can be taken over by QA (Chuck Short currently does all of this work)
* QA takes over the appropriate pieces
Bug Management
* to be discussed
EC2 Images Refresh Tests
* take over (from Scott Moser) the EC2 image tests
* run the tests whenever a new image is provided; provide feedback to the server team
Although it was proposed that QA should take over the actual image refresh, I do not think this (publishing new images) should be performed by QA.
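The feedback to the server team could be a short per-image report along these lines; the AMI ID, region, and check names below are made up for illustration:

```python
def format_image_report(ami_id, region, results):
    """Summarise smoke-test results for one EC2 image as a single line.

    results maps a check name (e.g. "boots", "ssh") to True/False.
    """
    failed = [name for name, ok in results.items() if not ok]
    status = "OK" if not failed else "FAILED: " + ", ".join(failed)
    return f"{ami_id} ({region}): {status}"

# Illustrative values only -- not a real image.
print(format_image_report("ami-12345678", "us-east-1",
                          {"boots": True, "ssh": True, "cloud-init": False}))
# ami-12345678 (us-east-1): FAILED: cloud-init
```

One such line per image and region keeps the go/no-go signal easy to scan while still naming the failing checks.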
Daily Build (from upstream VCS) for Server Packages
* currently maintained by Chuck Short
* discuss usage/maintenance with Server team
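For discussion purposes, the daily-build step for one package might reduce to a command sequence like this sketch; the package name, branch URL, and build flags are assumptions, not a description of the existing setup:

```python
def build_commands(package, branch):
    """Return the command sequence for one daily source build (sketch only).

    A cron job would run these in order: fetch the upstream VCS branch,
    then build an unsigned source package from it.
    """
    return [
        ["bzr", "branch", branch, package],          # fetch upstream branch
        ["dpkg-buildpackage", "-S", "-us", "-uc"],   # unsigned source build
    ]

# Hypothetical example package and branch.
for cmd in build_commands("eucalyptus", "lp:eucalyptus"):
    print(" ".join(cmd))
```

Writing down the exact commands per package would make the handover discussion with the Server team concrete, whoever ends up maintaining the builds.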
NattyServerQA (last edited 2010-11-18 16:08:51 by pool-173-64-203-126)