Automated Testing Summit 2018
Revision as of 17:17, 16 October 2018

This is a public planning page for the Automated Testing Summit

2018 meeting

Coordinated by: Tim Bird and Kevin Hilman

Action items:

  • Coordinate registration, signage and badging with ELCE/OSSE
  • Collect and organize survey results (still in progress, mostly done)
  • Finalize sessions, presentations, and discussion topics

Date and Venue

  • Location: Edinburgh, Scotland
  • Venue: Edinburgh International Conference Centre
  • Date: October 25, 2018
  • Room: to be announced
  • Time: 9:00 am to 5:00 pm

Lunch will be included. Attendance is by invitation and is free of charge.

Sponsorship is provided by the Core Embedded Linux Project of the Linux Foundation.

Attendees

Please note that this is a closed, invitation-only event.

Invitations were sent out previously, and representatives from the following projects have agreed to come to the event.

  • 0-day
  • Fuego
  • Gentoo CI system
  • Jenkins
  • KernelCI
  • kerneltests.org
  • Kselftest
  • ktest
  • LAVA
  • Labgrid
  • LKFT
  • LTP
  • Mentor testing
  • Opentest
  • Phoronix Test Suite
  • ptest
  • R4D
  • SLAV
  • syzkaller/syzbot
  • tbot
  • Xilinx testing
  • Yocto project (ptest?)

Pre-meeting work

Tim and Kevin worked on a glossary, a survey, and a CI loop diagram for discussion.

See Test Stack Survey for the work in progress.

Eventually, we'd like to fill out the information on: Test Stack Layers

Agenda (brainstorming)

Here is some brainstorming on an agenda...

  • board farm survey (lightning talks)
    • what are people using?
    • what works?
    • what's missing?
      • board discovery/lab introspection?
  • layers and interfaces
    • what layers are supported, needed?
    • any way to leverage/separate/isolate existing software?
  • what tests need to be supported?
    • boot-time
    • run-time
    • package-based (package unit tests)
    • driver (hardware specific?)
      • requiring specialized hardware external to board (e.g. canbus simulator, hdmi frame-grabber)
    • multinode
      • how to allocate/schedule multiple pieces of equipment for a test (e.g. 2 or more nodes for a network test)
  • results reporting
    • centralized server and API to it (KernelCI JSON?)
  • sharing tests
  • how to define standards
    • de-facto only? (dominant project? (cough, LAVA))
    • documents?
  • survey of existing projects, and what pieces they focus on (or don't)
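As a concrete strawman for the results-reporting item above, here is a minimal sketch of what a shared result record and a validation helper might look like. The field names and the `validate_result` function are hypothetical, loosely inspired by the idea of a KernelCI-style JSON report; nothing here is an agreed standard.

```python
import json

# Hypothetical minimum set of fields a lab would include when
# reporting a result to a shared server. Illustrative only --
# not an agreed schema.
REQUIRED_FIELDS = {"lab", "board", "kernel", "test_suite", "test_case", "status"}

def validate_result(record):
    """Check that a result record carries the minimum shared fields
    and a recognized status value."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    if record["status"] not in {"pass", "fail", "skip"}:
        raise ValueError(f"unknown status: {record['status']}")
    return True

# Example record a board farm might POST to a central results API.
example = {
    "lab": "lab-example",          # which board farm produced this
    "board": "beaglebone-black",
    "kernel": "v4.19-rc7",
    "test_suite": "LTP",
    "test_case": "syscalls.open01",
    "status": "pass",
}

if validate_result(example):
    print(json.dumps(example, indent=2))
```

A shared validator like this is one possible way to pin down a de-facto interface between labs without first agreeing on a full document-based standard.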