PerformanceTracking
Launchpad Entry: performance-tracking
Created: 2008-12-18
Contributors: bryce, Keybuk, cr3, heno
Summary
Use the test automation infrastructure to conduct performance testing and track progress over time.
Rationale
We should aim for better performance in key components and monitor for regressions.
Use Cases
- Boot time
- X.org
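For the boot-time use case, a test needs to turn a boot into a single number that can be logged. As a minimal sketch, assuming a systemd-based system where `systemd-analyze time` is available (the function name and the exact output format handled here are assumptions, not part of this spec; older tooling such as bootchart reports boot time differently):

```python
import re

def parse_boot_total(output):
    """Extract the total boot time in seconds from `systemd-analyze time`
    output such as:
        "Startup finished in 2.5s (kernel) + 6.1s (userspace) = 8.6s"
    Returns None if the text does not end with a "= <seconds>s" total."""
    match = re.search(r"=\s*([\d.]+)s\s*$", output.strip())
    return float(match.group(1)) if match else None
```

The number returned here is what would be logged by the tracking system; the raw tool output need not be stored.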
Assumptions
Design
- Scheduling and performing testing
- Log the results and plot them over time
Tests
- Which tests do we run?
Reporting
The hardware testing system, which uses Checkbox and the certification website, currently returns a Pass, Fail, or Skip result for each test. For performance testing this is not an appropriate result, but the current system can be used as follows:
Fail: unused - we currently have no failure criteria defined for these tests
Skip: the test could not run because of a missing dependency
Pass: the test ran and returned a numerical result, which we log and analyse in a separate process
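The mapping above can be sketched as a small helper that separates the reported status from the measured value (the function and parameter names here are illustrative, not part of the Checkbox API):

```python
def classify_run(dependency_present, numeric_result):
    """Map one performance-test run onto the certification site's
    Pass/Fail/Skip vocabulary. The numeric value itself is logged and
    analysed in a separate process; only the status is reported."""
    if not dependency_present:
        return ("Skip", None)            # test could not run
    if numeric_result is not None:
        return ("Pass", numeric_result)  # value goes to the separate log
    return ("Fail", None)                # unused today: no failure criteria
```

For example, a boot-time run that measured 8.6 seconds would report Pass, and the 8.6 would be appended to the tracking log rather than shown on the certification site.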
Implementation
Code Changes
Unresolved issues
Discussion
QATeam/phillw/Specs/PerformanceTracking (last edited 2014-07-22 21:56:10 by host-80-41-221-66)