Automated Testing Summit 2018

Revision as of 14:01, 8 November 2018 by Tim Bird (Action items from meeting)

This is a public planning page for the Automated Testing Summit.

2018 meeting

Coordinated by: Tim Bird and Kevin Hilman

Mailing list

Discussions on this topic have started on the "Automated Testing" mailing list of the Yocto Project.

Action items from meeting

Action items:

  • Tim: refine glossary with feedback from summit
    • Tim: Create glossary wiki page (separate from survey)
  • modify CI reference diagram with discussed changes (Kevin?)
  • Collect and organize survey results (still in progress, mostly done, see Test_Stack_Survey#Responses)
  • Start working on pdudaemon as a central location for the power control abstraction in the DUT control driver interface (who will do this?)
  • Tim Bird: Create a Test Definition survey (to collect Test Definition fields)
  • Tim Bird: Send test phases, as a way of starting discussion on Test Execution API
  • Tim Orling: create pdudaemon debian package?
  • "result format" survey needed soon
  • (done) create a page for test suite links - see Test Systems
    • page created, but not fully populated yet
  • (done) create a page for automated testing overview - see Automated Testing
  • create a reference for experimenting with a test framework: hello_test on a beaglebone
    • document describing how something works on each framework
  • Kevin: create an automated test project in the Linux Foundation
  • collect Run Artifact fields (for possible RA standard) (unassigned)
  • Tim Bird: arrange for sessions and meetings at ELCE 2019
  • Chris Fiege: create "Design for Testing" document aimed at board hardware designers
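One action item above proposes pdudaemon as a central power-control abstraction for DUTs. As a rough sketch only (not the project's settled interface), a client could drive PDU outlets through pdudaemon's HTTP listener; the base URL, port number, and endpoint layout below are assumptions that depend on the local pdudaemon configuration and should be checked against the installed version.

```python
from urllib.parse import urlencode

def pdu_command_url(base, command, hostname, port):
    """Build a power-control URL for a pdudaemon HTTP listener.

    command  -- "on", "off", or "reboot"
    hostname -- the PDU name as configured in pdudaemon
    port     -- the outlet number on that PDU

    The /power/control/<command> path is an assumption based on a
    typical pdudaemon setup; verify it against the local deployment.
    """
    if command not in ("on", "off", "reboot"):
        raise ValueError("unknown power command: %s" % command)
    query = urlencode({"hostname": hostname, "port": port})
    return "%s/power/control/%s?%s" % (base.rstrip("/"), command, query)

# Example: power on outlet 3 of the PDU named "lab-pdu-1".
url = pdu_command_url("http://localhost:16421", "on", "lab-pdu-1", 3)
```

A DUT control driver would then issue an HTTP GET on the returned URL; abstracting the PDU details behind one daemon is what lets multiple test frameworks share the same lab hardware.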

Date and Venue

  • Location: Edinburgh, Scotland
  • Venue: Edinburgh International Conference Centre
  • Date: October 25, 2018

Lunch will be included. Attendance is by invitation and free of charge.

Sponsorship was provided by:

Summit Artifacts

See ATS 2018 Minutes

Here are Tim's slides from the event: PDF

A video link will be provided shortly.

This was a closed, invitation-only event.

For this event, Kevin and I decided to keep it to a small, focused group. Linaro sponsored recording of the presentations and discussion, and the videos will be made available after the summit. I apologize if you would like to attend but were not invited. We are hopeful that we can put together a public automated testing event in the future.

Invitations were sent out previously, and representatives from the following projects have agreed to come to the event.

  • 0-day
  • Fuego
  • Gentoo CI system
  • Buildbot
  • Jenkins
  • KernelCI
  • Kselftest
  • ktest
  • LAVA
  • Labgrid
  • LKFT
  • LTP
  • Opentest
  • Phoronix Test Suite
  • ptest
  • R4D
  • SLAV
  • syzkaller/syzbot
  • tbot
  • Xilinx testing
  • Yocto project (oeqa and ptest)

Pre-meeting work

Tim and Kevin worked on a glossary, a survey, and a CI loop diagram for discussion.

See Test Stack Survey for the work in progress.

Eventually, we'd like to fill out the information on: Test Stack Layers


ATS 2018 Schedule
Time        | Topic                                                                                | Presenter or discussion leader | Slides
9:00-9:10   | Welcome and Introduction                                                             | Tim and Kevin                  | no slides
9:10-9:40   | Vision and problem definition                                                        | Tim                            | ?
9:40-10:40  | Glossary and Diagram discussion                                                      | Tim and Kevin                  | .
10:40-11:00 | BREAK                                                                                | n/a                            | .
11:00-12:30 | Test definition, build artifacts, execution API (E)                                  | .                              | .
12:30-2:00  | LUNCH BREAK                                                                          | n/a                            | .
1:00-2:00   | Brainstorming embedded Linux Requirements with CELP members                          | Tim (discussion leader)        | .
2:00-2:50   | Run artifacts, standardized results format, parsers                                  | .                              | .
2:50-3:10   | BREAK                                                                                | n/a                            | .
3:10-4:30   | Farm standards - DUT control drivers, board definitions, discoverability, protocols  | .                              | .
4:30-5:00   | Wrap-up                                                                              | Tim and Kevin                  | .

Items not yet fitted into the schedule:


Agenda (brainstorming)

Here is some brainstorming on an agenda...

  • board farm survey (lightning talks)
    • what are people using?
    • what works?
    • what's missing?
      • board discovery/lab introspection?
  • layers and interfaces
    • what layers are supported, needed?
    • any way to leverage/separate/isolate existing software?
  • what tests need to be supported?
    • boot-time
    • run-time
    • package-based (package unit tests)
    • driver (hardware specific?)
      • requiring specialized hardware external to board (e.g. canbus simulator, hdmi frame-grabber)
    • multinode
      • how to allocate/schedule multiple pieces of equipment for a test (e.g. 2 or more nodes for a network test)
  • results reporting
    • centralized server and API to it (kernelCI json?)
  • sharing tests
  • how to define standards
    • de-facto only? (dominant project? (cough, LAVA))
    • documents?
  • survey of existing projects, and what pieces they focus on (or don't)
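The "results reporting" bullet above asks what a centralized results server and its API might accept. Purely as a strawman for that discussion (this is not kernelCI's actual JSON schema, and every field name here is invented), a minimal run-result record could be serialized like this:

```python
import json

def make_result_record(test_name, board, status, duration_s, measurements=None):
    """Build a strawman result record for a results-server API.

    All field names are hypothetical placeholders for discussion; a
    real standard would need to reconcile the schemas that existing
    frameworks (kernelCI, Fuego, LAVA, ...) already use.
    """
    if status not in ("pass", "fail", "skip", "error"):
        raise ValueError("unexpected status: %s" % status)
    return {
        "test_name": test_name,              # which test definition ran
        "board": board,                      # DUT identifier
        "status": status,                    # normalized verdict
        "duration_s": duration_s,            # wall-clock run time in seconds
        "measurements": measurements or [],  # optional numeric results
    }

# Example record for the hello_test-on-a-beaglebone reference mentioned
# in the action items above.
record = make_result_record("hello_test", "beaglebone-black", "pass", 12.4)
payload = json.dumps(record)  # what a client might POST to a results server
```

Even a toy schema like this surfaces the real questions: which verdicts are allowed, how boards are identified across labs, and where free-form measurements fit.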