mobile-automated-tests

Test/Demo Plan

Ubuntu Mobile Testing Plan Outline Draft 0.1

The testing plan for Ubuntu Mobile covers user documentation, functionality, reliability, usability, portability, maintainability, performance, and Asian character input.

Overview

There are four phases to testing the Ubuntu Mobile platform:

  1. Using a test version of UME for normal work.
  2. Running the 'testing plan' using special automated test suites such as LTP, Autotest and QMTest. Use the main desktop testing tool where appropriate. Autotest currently (Dec 07) has a web front end being written in Django to allow scheduling and inserting jobs. Autotest can run LTP, LDTP, QMTest and a range of kernel benchmark tests. Running Dogtail tests in this web framework would be ideal.

  3. Doing unusual things with the system installed.
  4. Measuring the system performance with UME installed.

Note: test candidate tools – Manual, QMTest, Dogtail and Autotest – are shown in italics.

(Autotest includes the LTP test suite and can run QMTest also)

Virus Scan Undertaken before Testing

Phase 1

The first phase of mobile testing is simple: we try to boot UME and use it for normal work.

  • Before starting the system in a fully functional configuration, boot the kernel with the init=/bin/bash command-line argument (make this possible in Xephyr), which makes it start only a single bash process. From there we can check whether the filesystems are mounted and unmounted properly, and we can test some more complex functionality, such as suspend to disk or to RAM, in this minimal configuration.
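
As an illustration of the kind of check that can be made at this stage, the sketch below (assuming a Python interpreter is present on the image) compares /etc/fstab against /proc/mounts and reports any filesystem that should be mounted but is not:

  #!/usr/bin/env python
  # Minimal sketch: report fstab filesystems that are not currently mounted.
  # Entries marked noauto may legitimately be absent; treat results accordingly.

  def mount_points(path):
      """Return the set of mount points listed in an fstab/mounts-style file."""
      points = set()
      for line in open(path):
          line = line.strip()
          if not line or line.startswith('#'):
              continue
          fields = line.split()
          if len(fields) >= 2 and fields[1] not in ('none', 'swap'):
              points.add(fields[1])
      return points

  missing = mount_points('/etc/fstab') - mount_points('/proc/mounts')
  if missing:
      print('Not mounted: %s' % ', '.join(sorted(missing)))
  else:
      print('All fstab filesystems are mounted.')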

Phase 2

This Phase can benefit from Automation

User documentation

Do the user manuals provide detailed information that is correct, complete and consistent with the software? Are they easy to browse and understand? (Manual)

Functionality

Image Creation

  • Install from USB: UME implements the functionality of installing from USB
  • Live USB: UME implements the functionality of a Live USB session
  • Live R/W USB: UME implements the functionality of a Live R/W USB session (Manual)

Startup and shutdown

The system implements the functions of startup and shutdown correctly. (Manual)

Kernel

Compile a test-enabled kernel and check whether specific kernel subsystems work correctly. Also carry out regression and performance tests of the kernel. (Autotest)
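
For illustration, an Autotest client control file is just a short Python script; a minimal sketch (assuming the ltp and kernbench test wrappers that ship with Autotest are available on the client) could look like this:

  # Sketch of an Autotest client control file (control files are plain Python
  # executed with a 'job' object in scope). Assumes the ltp and kernbench
  # wrappers bundled with Autotest are present.
  job.run_test('ltp')        # Linux Test Project: kernel subsystem tests
  job.run_test('kernbench')  # kernel-compile benchmark for regression/performance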

Control Panel

About Me, Date and Time, Keybindings, Menus and Toolbars, Network Proxy, System Monitor, Background Properties, Font, Keyboard Properties, Network Configuration, Screen Resolution, Touch Screen, Brightness, Volume Control (listing current Control Panel @ 24/10/2007) (Dogtail through HAIL, Autotest)

Core Applications

MID Browser (xnee)

Notepad (Dogtail)

Media Player (Dogtail)

Instant Messenger (Manual)

Email Client (Dogtail) (listing current @ 24/10/2007)

(Dogtail will be implemented through HAIL, a part of Hildon Desktop)
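
As a rough sketch of what such a Dogtail script could look like once HAIL exposes the widgets over AT-SPI (the command, the application name 'notepad' and the widget labels below are placeholders, not confirmed names):

  #!/usr/bin/env python
  # Hypothetical Dogtail sketch: drive a Notepad-style editor through AT-SPI.
  # The command, application name, and widget labels are placeholders and
  # depend on what HAIL/Hildon Desktop actually exposes.
  from dogtail import tree
  from dogtail.utils import run

  run('notepad')                          # launch the application under test
  app = tree.root.application('notepad')  # attach to it via AT-SPI
  app.child(roleName='text').typeText('hello from dogtail')
  app.child('Save').click()               # assumes a button labelled 'Save'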

Software Integration

The system correctly implements the integration between applications and system fonts, and correct application startup; it also implements basic integration between applications and among applications.

Reliability

During the testing process, the system runs smoothly. It has a degree of fault tolerance and the ability to recover. The system provides feedback on invalid inputs. It has a data backup mechanism. The system is secure: the default setting does not permit logging in as the root user, thus avoiding system damage from accidental operations, and user verification is required for critical system settings.

Usability

It is easy to browse and understand information in the system. The system fully supports the operations of touchscreen and keyboard, and is fairly easy to use.

Maintainability

The system can report runtime errors and defects, along with their likely causes, and provide analysis of such errors. The system provides convenient methods for changing the configuration files. Plug-in interfaces are available in the system.

Portability

The system accepts and recognizes external cameras, flash disks, and GPS devices from a wide range of vendors.

Asian Character Input

The system can display and input Asian characters correctly.

Phase 3

The third phase can be started, for example, by unplugging and replugging USB devices. While in theory replugging a USB device should not change anything, at least from the user’s point of view, doing it many times in a row may cause the kernel to crash if there is a bug in the USB subsystem. Note, however, that this is also stressful to the hardware, so such experiments should be undertaken with this in mind. Next, we can write a script that reads the contents of files from the /proc directory in a loop, or some such. In short, in the third phase you should do things that are never done by normal users (or that are done very rarely: why would anyone mount and unmount a filesystem in an infinite loop, for example? :)).
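
A throwaway stress script of that kind might look like the following sketch (the iteration count, read limit, and skip list are arbitrary choices):

  #!/usr/bin/env python
  # Sketch of a Phase 3 stress script: repeatedly read files under /proc to
  # exercise the kernel's /proc handlers. Unreadable or vanishing entries are
  # skipped; kcore and kmsg are excluded because they are huge or block.
  import os

  ITERATIONS = 1000                 # arbitrary; raise for a longer soak
  SKIP = ('kcore', 'kmsg')

  for i in range(ITERATIONS):
      for dirpath, dirnames, filenames in os.walk('/proc'):
          for name in filenames:
              if name in SKIP:
                  continue
              try:
                  f = open(os.path.join(dirpath, name))
                  f.read(65536)     # cap the read; some entries are very large
                  f.close()
              except (IOError, OSError):
                  pass              # many /proc entries are transient or restricted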

Phase 4

Performance Testing (10 runs per test)

  • carry out the tests regularly
  • ensure the stability of the test environment
  • compare things that are directly comparable
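
Because every timing item below calls for 10 runs, a small reusable harness helps keep the runs comparable. The sketch below times an arbitrary shell command; the script name and the file paths in the usage example are placeholders.

  #!/usr/bin/env python
  # Sketch of a timing harness: run the given shell command RUNS times and
  # report the individual and mean wall-clock times.
  import subprocess
  import sys
  import time

  RUNS = 10   # matches "10 runs per test"

  def time_command(cmd, runs=RUNS):
      times = []
      for _ in range(runs):
          start = time.time()
          subprocess.call(cmd, shell=True)
          times.append(time.time() - start)
      return times

  if __name__ == '__main__':
      cmd = ' '.join(sys.argv[1:])
      times = time_command(cmd)
      for i, t in enumerate(times):
          print('run %2d: %.3f s' % (i + 1, t))
      print('mean:   %.3f s over %d runs' % (sum(times) / len(times), len(times)))

For example, Test Item 3 below could be invoked as "python time_runs.py cp /home/test-500MB.img /home/Documents/" (script and file names are placeholders). For I/O tests it may also be worth dropping the page cache between runs (e.g. via /proc/sys/vm/drop_caches) so that successive runs remain directly comparable.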

Timing

Test Item 1: Time used to load all default services during startup

Procedure: Start the system, record the time used to load all default services.

Result: XXXX

Test Item 2: GUI shutdown time

Procedure: Shut down the system from the GUI and record the time used.

Result: XXXX

Test Item 3: Time used to copy a 500MB file

Procedure: Start Terminal, run a command to copy a 500MB file from "/home" to "/home/Documents".

Result: XXXX

Test Item 4: Time used to copy 2000 small files of 250KB

Procedure: Copy the files between "/home" and "/home/Documents".

Result: XXXX

Test Item 5: Time consumed to delete a 500MB file

Procedure: rm <file_name>

Result: XXXX

Test Item 6: Time to open <insert_core_application>

Procedure: Click the desktop icon to start the application.

Result: XXXX

Test Item 7: Find a file

Procedure: Start Terminal, run a command to find a specific file ("menu.lst") and record the time consumed.

Result: XXXX

Resource Performance

Test Item 1: Installation of UME

Procedure: Install Ubuntu Mobile and check the disk space used after installation is finished.

Result: XXXX

Test Item 2: CPU usage when opening File Manager

Procedure: Start Terminal and run "top" to monitor CPU usage; start another Terminal and run a command to open File Manager. Record the CPU usage of the corresponding process.

Result: XXXX
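
Rather than reading the figures off top's screen, the sampling for Test Items 2–5 can be scripted; a sketch (the 'nautilus' command stands in for File Manager and is a placeholder):

  #!/usr/bin/env python
  # Sketch for Test Items 2-5: launch an application and sample its %CPU and
  # %MEM with ps once a second. 'nautilus' is a placeholder for the actual
  # File Manager / core application binary on the image.
  import subprocess
  import time

  APP_CMD = ['nautilus']   # placeholder; substitute the application under test
  SAMPLES = 30             # sample for 30 seconds

  proc = subprocess.Popen(APP_CMD)
  for i in range(SAMPLES):
      out = subprocess.check_output(
          ['ps', '-o', '%cpu=,%mem=', '-p', str(proc.pid)]).decode()
      cpu, mem = out.split()
      print('t=%2ds  cpu=%s%%  mem=%s%%' % (i, cpu, mem))
      time.sleep(1)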

Test Item 3: CPU usage when opening <insert_core_application>

Procedure: Start Terminal and run "top" to monitor CPU usage; start another Terminal and run a command to open the application. Record the CPU usage of the corresponding process.

Result: XXXX

Test Item 4: Memory usage when opening File Manager

Procedure: Start Terminal and run "top" to monitor memory usage; start another Terminal and run a command to open File Manager. Record the memory usage of the corresponding process.

Result: XXXX

Test Item 5: Memory usage when opening <insert_core_application>

Procedure: Start Terminal and run "top" to monitor memory usage; start another Terminal and run a command to open the application. Record the memory usage of the corresponding process.

Result: XXXX

Test Item 6: System memory usage

Procedure: Start the system and wait until all default services are loaded, then run "top" to check the system memory usage.

Result: XXXX
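
The same figure can also be taken directly from /proc/meminfo rather than from top; a sketch:

  #!/usr/bin/env python
  # Sketch for Test Item 6: report system memory usage straight from
  # /proc/meminfo once all default services have been loaded.
  def meminfo_kb(field):
      for line in open('/proc/meminfo'):
          if line.startswith(field + ':'):
              return int(line.split()[1])   # values are reported in kB
      raise KeyError(field)

  total = meminfo_kb('MemTotal')
  free = meminfo_kb('MemFree')
  buffers = meminfo_kb('Buffers')
  cached = meminfo_kb('Cached')
  print('MemTotal: %d kB, used excluding buffers/cache: %d kB'
        % (total, total - free - buffers - cached))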

Networking Performance

Test Item 1: Time used to open www.ubuntu.com homepage.

Procedure: Start Terminal, run a command to open the www.ubuntu.com homepage in the MID browser, and record the time consumed.

Result: XXXX
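
As a rough cross-check (it times the raw HTTP fetch with urllib, not the MID browser's rendering), the retrieval time can also be measured with a short script:

  #!/usr/bin/env python
  # Sketch: time the raw HTTP fetch of the www.ubuntu.com homepage. This only
  # approximates Test Item 1; it excludes the MID browser's rendering time.
  import time
  try:
      from urllib.request import urlopen   # Python 3
  except ImportError:
      from urllib2 import urlopen          # Python 2

  start = time.time()
  data = urlopen('http://www.ubuntu.com/').read()
  print('fetched %d bytes in %.3f s' % (len(data), time.time() - start))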

Test Item 2: Time used to change/obtain a wifi connection

Procedure: Change/obtain the wifi provider using the control panel.

Result: XXXX

Outstanding Issues

Accerciser (http://live.gnome.org/Accerciser), which is used to inspect AT objects, does not work, as no apps are found on the desktop. This is due to differences in Hildon Desktop. Accerciser is used to write the actual Dogtail UI testing scripts.

Integrate with the Ubuntu desktop test automation as much as possible.


CategorySpec
