Automated Testing Summit 2018
Revision as of 08:14, 25 October 2018

This is a public planning page for the Automated Testing Summit 2018 meeting.

Coordinated by: Tim Bird and Kevin Hilman

mailing list

Discussions on this topic have started on the "Automated Testing" mailing list of the Yocto Project

  • See https://lists.yoctoproject.org/listinfo/automated-testing

action items

Action items:

  • Coordinate registration, signage and badging with ELCE/OSSE
  • Collect and organize survey results (still in progress, mostly done, see Test_Stack_Survey#Responses)
  • Finalize sessions, presentations, and discussion topics
  • start working on pdudaemon as central location for power control abstraction for DUT control driver interface
  • Tim to work on test definition survey
  • Tim Orling - working on pdudaemon Debian package?
  • result format survey needed soon
  • create a page for test suite links
  • create a page for automated testing overview
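One action item above is a central power-control abstraction for DUT control (the pdudaemon idea). As a purely hypothetical sketch of what such a driver interface could look like (not pdudaemon's actual API; all class and method names here are invented for illustration):

```python
from abc import ABC, abstractmethod


class PowerDriver(ABC):
    """Hypothetical DUT power-control driver interface (illustration only)."""

    @abstractmethod
    def on(self, port: int) -> None:
        """Apply power to the given PDU port."""

    @abstractmethod
    def off(self, port: int) -> None:
        """Cut power to the given PDU port."""

    def reboot(self, port: int) -> None:
        # Default reboot: a hard power cycle built from the primitives,
        # so every driver gets it for free.
        self.off(port)
        self.on(port)


class FakePDU(PowerDriver):
    """In-memory stand-in for a real PDU, useful for testing the interface."""

    def __init__(self, ports: int) -> None:
        self.state = {p: False for p in range(1, ports + 1)}

    def on(self, port: int) -> None:
        self.state[port] = True

    def off(self, port: int) -> None:
        self.state[port] = False


pdu = FakePDU(ports=4)
pdu.reboot(2)
print(pdu.state[2])  # True: port 2 is powered after the cycle
```

The point of an abstraction like this is that a lab scheduler or test framework only talks to the `PowerDriver` interface, while per-vendor PDU quirks stay inside concrete drivers.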

Date and Venue

  • Location: Edinburgh, Scotland
  • Venue: the Edinburgh International Conference Centre
  • Date: October 25, 2018

Lunch will be included. Attendance is by invitation and is free-of-charge.

Sponsorship is provided by the Core Embedded Linux Project of the Linux Foundation.

Attendees

Please note that this is a closed, invitation-only event.

For this event, Kevin and I decided to keep it to a small focused group. Linaro will be recording the presentations and discussion, and the videos will be made available after the summit. I apologize if you would like to attend but have not been invited. We are hopeful that we can put together a public automated testing event in the future.

Invitations were sent out previously, and representatives from the following projects have agreed to come to the event.

  • 0-day
  • Fuego
  • Gentoo CI system
  • Buildbot
  • Jenkins
  • KernelCI
  • kerneltests.org
  • Kselftest
  • ktest
  • LAVA
  • Labgrid
  • LKFT
  • LTP
  • Opentest
  • Phoronix Test Suite
  • ptest
  • R4D
  • SLAV
  • syzkaller/syzbot
  • tbot
  • Xilinx testing
  • Yocto project (oeqa and ptest)

Pre-meeting work

Tim and Kevin worked on a glossary, a survey, and a CI loop diagram for discussion.

See Test Stack Survey for the work in progress.

Eventually, we'd like to fill out the information on: Test Stack Layers

Schedule

ATS 2018 Schedule

Time        | Topic                                                                                | Presenter or discussion leader | Slides
9:00-9:10   | Welcome and Introduction                                                             | Tim and Kevin                  | no slides
9:10-9:40   | Vision and problem definition                                                        | Tim                            | ?
9:40-10:40  | Glossary and Diagram discussion                                                      | Tim and Kevin                  |
10:40-11:00 | BREAK                                                                                | n/a                            |
11:00-12:30 | test definition, build artifacts, execution API (E)                                  |                                |
12:30-1:00  | LUNCH BREAK                                                                          | n/a                            |
1:00-2:00   | Brainstorming embedded Linux Requirements with CELP members                          | Tim (discussion leader)        |
2:00-2:50   | run artifacts, standardized results format, parsers                                  |                                |
2:50-3:10   | BREAK                                                                                | n/a                            |
3:10-4:30   | Farm standards - DUT control drivers, board definitions, discoverability, protocols  |                                |
4:30-5:00   | wrap-up                                                                              | Tim and Kevin                  |

Stuff not fit into the schedule yet:

Agenda (brainstorming)

Here is some brainstorming on an agenda...

  • board farm survey (lightning talks)
    • what are people using?
    • what works?
    • what's missing?
      • board discovery/lab introspection?
  • layers and interfaces
    • what layers are supported, needed?
    • any way to leverage/separate/isolate existing software?
  • what tests need to be supported?
    • boot-time
    • run-time
    • package-based (package unit tests)
    • driver (hardware specific?)
      • requiring specialized hardware external to the board (e.g. CAN bus simulator, HDMI frame-grabber)
    • multinode
      • how to allocate/schedule multiple pieces of equipment for a test (e.g. 2 or more nodes for a network test)
  • results reporting
    • centralized server and API to it (KernelCI JSON?)
  • sharing tests
  • how to define standards
    • de-facto only? (dominant project? (cough, LAVA))
    • documents?
  • survey of existing projects, and what pieces they focus on (or don't)
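The results-reporting items above hinge on a standardized results format that different frameworks could emit and a central server could ingest. As a hypothetical illustration only (the field names below are invented for this sketch, not the actual KernelCI schema), a minimal run-artifact record and a parser for it might look like:

```python
import json

# Hypothetical minimal results record; every field name here is
# invented for illustration, not taken from any real project's schema.
raw = json.dumps({
    "test_suite": "LTP",
    "board": "beaglebone-black",
    "kernel": "4.19.0",
    "results": [
        {"test_case": "syscalls.open01", "status": "PASS"},
        {"test_case": "syscalls.open02", "status": "FAIL"},
    ],
})


def summarize(payload: str) -> dict:
    """Count results by status in a serialized results payload."""
    record = json.loads(payload)
    counts: dict = {}
    for result in record["results"]:
        counts[result["status"]] = counts.get(result["status"], 0) + 1
    return counts


print(summarize(raw))  # {'PASS': 1, 'FAIL': 1}
```

A shared record shape like this is what would let a results server aggregate runs from LAVA, Fuego, KernelCI, and friends without a per-framework parser for each.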