Setup LTSI Testing/Validation infrastructure

Summary: Setup LTSI Testing/Validation infrastructure
Proposer: Artemi Ivanov / Hisao Munakata
Status: Selected to be sponsored by the CE Workgroup

Description

There has recently been discussion about how the LTSI project could attract new contributors, what the key factors are for product architects and decision makers when selecting an LTSI kernel as the baseline for their development, and what the potential risks are when productizing a selected LTSI release.

It would be valuable to start bringing quality and test coverage metrics to the LTSI project. This could help with LTSI productization/adaptation efforts by ODMs/OEMs and semiconductor vendors. Building specifications and test coverage for LTSI is not a trivial task; it would require a few requirements-gathering/discussion/prototyping cycles before finalizing "what needs to be tested", "how it needs to be tested", the infrastructure, and the maintenance model. The ultimate goals would be: an LTSI quality specification (definition of feature/component/configuration/performance/stability requirements), LTSI test coverage (definition of tests, parameters, and thresholds), an LTSI testing infrastructure (test automation framework and tracking system), and a well-defined maintenance and contribution process.
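
To make the "tests, parameters, thresholds" part more concrete, below is a minimal sketch of how one machine-readable coverage entry could be expressed and checked. The suite names, parameters, and threshold values are illustrative assumptions, not agreed LTSI requirements.

 # Illustrative sketch only: one way a test-coverage entry (test, parameters,
 # pass/fail thresholds) could be expressed. Names and numbers are examples,
 # not agreed LTSI requirements.
 COVERAGE = [
     {
         "test": "ltp-syscalls",            # functional suite (LTP)
         "params": {"runtime": "2h"},
         "threshold": {"max_failures": 0},  # any failure fails the run
     },
     {
         "test": "iperf-tcp",               # networking benchmark
         "params": {"duration_s": 60, "streams": 4},
         "threshold": {"min_throughput_mbps": 900},  # example value
     },
 ]

 def evaluate(entry, result):
     """Compare a raw result dict against the entry's thresholds (sketch)."""
     th = entry["threshold"]
     if "max_failures" in th and result.get("failures", 0) > th["max_failures"]:
         return False
     if ("min_throughput_mbps" in th and
             result.get("throughput_mbps", 0) < th["min_throughput_mbps"]):
         return False
     return True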

While the LTSI quality/testing discussion is ongoing, it would be great to have an LTSI Testing/Validation infrastructure prototype: set up test automation frameworks (Linaro LAVA and Jenkins), integrate a few popular and useful open source test suites (networking, graphics, and storage benchmarks; LTP/LTP-DDT; LAVA tests), and make the setup publicly available. LTSI test results, performance numbers, and quality metrics should be available online (at least for a few selected hardware platforms/configurations). It is also expected that LTSI contributors could install and adapt the same infrastructure for their own environments (ideally, test results and metrics would also be reproducible).
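
As a rough illustration of what automated job submission could look like in such a setup, the sketch below submits a job definition to a LAVA server over its XML-RPC interface and polls until the job finishes. The server hostname, credentials, job file name, and the scheduler.submit_job / scheduler.job_status method names (as exposed by classic LAVA releases) are assumptions and would need adjusting to the deployed LAVA version.

 import time
 import xmlrpc.client

 # Assumptions (not from the proposal): a LAVA instance at lava.example.org,
 # a user "ltsi-ci" with an API token, and the classic scheduler.submit_job /
 # scheduler.job_status XML-RPC methods.
 LAVA_HOST = "lava.example.org"
 USER, TOKEN = "ltsi-ci", "<api-token>"

 server = xmlrpc.client.ServerProxy(
     "https://%s:%s@%s/RPC2" % (USER, TOKEN, LAVA_HOST))

 # The job definition (device type, kernel image, LTP/benchmark test actions)
 # is prepared separately; the filename here is a placeholder.
 with open("ltsi-ltp-job.yaml") as f:
     job_id = server.scheduler.submit_job(f.read())
 print("Submitted LAVA job", job_id)

 # Poll until the job leaves the Submitted/Running states. The exact shape of
 # the job_status() return value differs between LAVA releases.
 while True:
     status = server.scheduler.job_status(job_id)
     if status.get("job_status") not in ("Submitted", "Running"):
         break
     time.sleep(60)
 print("Job finished with status:", status.get("job_status"))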

Related work

Scope

  • Hardware infrastructure setup: set up a few selected hardware platforms, including automated recovery/power cycling (a sketch of one possible power-cycling approach follows this list)
  • Software infrastructure setup: set up test frameworks, integrate and tune tests so they can be run unattended
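
As a rough illustration of the automated recovery item above, the sketch below power-cycles the outlet that a hung board is plugged into. The PDU hostname, telnet port, and its "off <n>" / "on <n>" command syntax are purely hypothetical; real switched PDUs expose vendor-specific telnet/SNMP/HTTP interfaces, so this only sketches the idea.

 import socket
 import time

 # Hypothetical switched PDU and command syntax; adjust to the real device.
 PDU_HOST = "pdu1.lab.example.org"
 PDU_PORT = 23
 OUTLET = 3                          # outlet the target board is plugged into

 def pdu_command(cmd):
     """Send one command line to the PDU over its (hypothetical) telnet port."""
     with socket.create_connection((PDU_HOST, PDU_PORT), timeout=10) as s:
         s.sendall((cmd + "\r\n").encode("ascii"))

 def power_cycle(outlet, off_time=5):
     """Hard-reset a hung board: switch its outlet off, wait, switch it back on."""
     pdu_command("off %d" % outlet)
     time.sleep(off_time)
     pdu_command("on %d" % outlet)

 if __name__ == "__main__":
     power_cycle(OUTLET)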

Effort: 5 weeks (rough estimate)

Contractor Candidates

Artemi Ivanov (Cogent Embedded)

Status

This work was completed, and the results are at:

Code for the test framework can be downloaded from:

Comments

Fujitsu used the completed test framework, and reported their findings here: