Automated Testing microsummit Linaro Connect BKK19

The meeting was organized by Dan Rue, and Linaro graciously sponsored some of the participants' attendance.

Linaro Connect was held last week in Bangkok. There were a lot of sessions (listed below) around Linux testing, LAVA, and Fuego.

We also held an impromptu micro-summit (minutes below), where we all locked ourselves in a room together for the day and discussed LAVA, Fuego, test definitions, and yes, pdudaemon.

Finally, Maria Högberg (cc'd) has graciously arranged for a monthly meeting that we can use to coordinate across projects throughout the year - please send her an email if you'd like to be added. The first one is scheduled for Thursday, May 9th at 13:00 UTC.

Selected Sessions (videos should be available within a week or two at the referenced URLs):
 * EAS Unit Testing
 * LAVA Users Forum
 * KEYNOTE: Open Source QA - what will it take to get to the next level
 * Experiences and lessons we learned using kselftest and potential improvements
 * LAVA community enabled testing
 * Harmonizing open source test definitions
 * KernelCI New Generation
 * What is this Fuego thing and where is it going?
 * Automating test results analysis using neural networks
 * Scheduling in CI/CD systems
 * Bootloader testing in LAVA
 * How to integrate Fuego automated testing tool in your CI loop

= Micro-Summit Minutes =

Date: Wednesday, April 4, 2019

Attendees
 * Daniel Sangorrin (Toshiba)
 * Tim Bird (Sony)
 * Carlos Hernandez (TI)
 * Kat Cosgrove (JFrog)
 * Naresh Kamboju
 * Antonio Terceiro
 * Daniel Diaz
 * Charles Oliveira
 * Chase Qi
 * Milosz Wasilewski
 * Stevan Radakovic
 * Dave Pigott
 * Maria Högberg
 * Luca Di Stefano
 * Steve McIntyre
 * Matt Hart
 * Remi Duraffort
 * Fathi Boudra
 * Anders Roxell
 * Dan Rue

PDU API

 * Wider community agreement on standardizing on pdudaemon
 * Linaro lab doesn’t actually use pdudaemon
 * Running pdudaemon @ Linaro
 * [Dave] Queueing vs. running immediately
 * [Matt] we can just add it to pdudaemon if this is a problem
 * [Matt] added in a PR
 * Dave will write a proposal for power control API
 * There will be an abstraction layer for power control to turn a device on or off.
 * [Dave] What about pressing buttons, turning on/off usb ports, etc
 * [Tim] let’s not deal with composition right now, only components
 * [Carlos] we do need to solve the problem… once we have an interface for power, we need a similar interface for relays.
 * Combinations of things may still make it complicated. For example, hold a button while pressing power.
 * Core PDU API actions: on, off, status
 * Core relay API actions: on, off, ???
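The proposed abstraction layer could be sketched roughly as follows. This is a hypothetical illustration only: the class names (`PowerController`, `RelayController`) and the in-memory state are assumptions for the sketch, not part of pdudaemon's actual API, and a real backend would drive pdudaemon (or other hardware) instead of a local flag.

```python
# Hypothetical sketch of the proposed power-control abstraction layer.
# Not pdudaemon code; a real implementation would talk to the PDU hardware.

class PowerController:
    """Core PDU actions discussed in the minutes: on, off, status."""

    def __init__(self, hostname, port):
        self.hostname = hostname  # e.g. the PDU's network name
        self.port = port          # outlet number on the PDU
        self._on = False

    def on(self):
        # A real backend would issue the pdudaemon "on" command here.
        self._on = True

    def off(self):
        self._on = False

    def status(self):
        return "on" if self._on else "off"


class RelayController(PowerController):
    """Relays reuse on/off; further relay actions were left open ('???')."""
```

Keeping relays behind the same on/off interface as PDUs would let compositions (e.g. hold a button while pressing power) be built on top later, as discussed.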

USB control

 * LAVA lab uses Cambrionix USB hubs
 * Also uses good shielded USB cables
 * Sony uses an open hardware board per DUT

Fuego Architecture

 * Tim gave a subset of his Thursday talk; see his Thursday talk for the full details.
 * Jenkins based
 * Test execution system
 * Steps broken into discrete phases like build, deploy, run, process, etc.
 * Board and platform management are abstracted
 * Not serial console based; typically uses SSH on a board that’s already provisioned (imaged)
 * Contains functional and benchmark tests
 * Fuego is a self-contained Linux distribution, distributed as a container
 * Testing driven from host, using ssh connection to DUT
 * Is container privileged?
 * [tim] yes
 * [dave] Dynamic device detection is possible without privileged mode
 * Designed for embedded linux testing
 * IoT possible
 * By default, builds tests, though LTP has a way of checking to see if it’s on disk and skipping the build
 * Can also just skip the build phase so long as the build is in the location that is expected on target
 * [long discussion about skip lists, known issues, and (lack of) test documentation]
 * Fuego-core has skip logic inside the test folder (ltp for example)
 * Test-definitions has similar
 * Additionally, lkft uses known issues in SQUAD to annotate known failures
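The phase-based execution model described above can be sketched as a simple pipeline. The phase names match the minutes, but the runner itself is a hypothetical illustration, not Fuego code; the `skip_build` flag mirrors the option to skip the build phase when the binaries are already on target.

```python
# Illustrative only: a phase pipeline in the spirit of Fuego's discrete
# build / deploy / run / process steps. Not actual Fuego code.

PHASES = ["build", "deploy", "run", "process"]

def execute_test(test, skip_build=False):
    """Run each phase in order; skip_build mirrors skipping the build
    phase when the test is already in the expected location on target."""
    log = []
    for phase in PHASES:
        if phase == "build" and skip_build:
            log.append("build: skipped")
            continue
        getattr(test, phase)()  # dispatch to the per-phase handler
        log.append(f"{phase}: ok")
    return log

class DummyTest:
    """Stand-in test with a no-op handler for every phase."""
    def build(self): pass
    def deploy(self): pass
    def run(self): pass
    def process(self): pass
```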

LAVA Architecture

 * Remi gave a brief LAVA architecture overview
 * Components:
   * lava-server: Apache2, lava-server-gunicorn, PostgreSQL, lava-logs, lava-master
   * lava-dispatcher: lava-slave, lava-run, devices under test
 * One lava-server, multiple dispatchers, multiple DUTs per dispatcher
 * Lava-run is an ephemeral process that communicates with the DUT during a job run
 * A device-type defines, e.g., a Raspberry Pi
 * A device defines an instance of a Raspberry Pi, with specifics about which port it’s plugged into, which serial port it uses, etc.
 * [daniel s] can I skip the deploy?
 * Yes, it’s supported
 * [daniel s] parsing in the dispatcher, where is that done?
 * [steve] we recommend parsing gets done on DUT
 * [remi] interested in JUnit and TAP13 parsing in LAVA so that a parser on the DUT isn’t necessary
 * [remi] goal is to not have to parse on DUT and do as much as we can on dispatcher
 * Multinode jobs, used for networking, controlling lab hardware using lxc, etc
 * Remi mentioned a feature to run a container as a part of a test run, which could help with jobs that are more complex, and eliminate a lot of existing multi-node jobs (making it easier)
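The dispatcher-side parsing idea (avoiding a parser on the DUT) can be illustrated with a minimal TAP-style parser. This is a sketch only, not LAVA's implementation; the regex and the result shape are assumptions.

```python
import re

# Minimal TAP13-style line parser: a sketch of dispatcher-side parsing,
# not LAVA code. Recognizes "ok N - name" / "not ok N - name" lines.
TAP_LINE = re.compile(r"^(not )?ok\s+(\d+)(?:\s*-\s*(.*))?$")

def parse_tap(text):
    results = []
    for line in text.splitlines():
        m = TAP_LINE.match(line.strip())
        if m:
            results.append({
                "number": int(m.group(2)),
                "name": (m.group(3) or "").strip(),
                "result": "fail" if m.group(1) else "pass",
            })
    return results
```

With parsing like this on the dispatcher, the DUT only has to emit plain TAP output over the console or SSH.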

Running Fuego tests in Lava

 * Fuego operates a lot like an "interactive" LAVA test
 * There are several solutions here; one that seems good is to use the new feature to run a container as part of a test.

Running Linaro tests in Fuego

 * Fuego already has Functional.linaro, which runs a Linaro test using testrunner.

Test Definitions
 * Tim Bird proposing `run-test.sh` as a naming convention for the name of the test program for each package
 * In ptest, you can build a test package, which you can install and execute.
 * More controversially, why not rename all the .sh scripts to run-test.sh?
 * Tim Bird proposing a standard variable name for the location of the kernel config path
 * LTP implements ‘KCONFIG_PATH’
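If both conventions were adopted, a harness could discover and run tests generically. The sketch below is hypothetical: it assumes a directory of package subdirectories, each containing a `run-test.sh`, and exports `KCONFIG_PATH` as LTP already does; none of this is existing Fuego or test-definitions code.

```python
import os
import subprocess

def run_all_tests(packages_dir, kconfig_path):
    """Hypothetical harness: run each package's run-test.sh,
    exporting KCONFIG_PATH in the environment (as LTP implements)."""
    env = dict(os.environ, KCONFIG_PATH=kconfig_path)
    results = {}
    for pkg in sorted(os.listdir(packages_dir)):
        script = os.path.join(packages_dir, pkg, "run-test.sh")
        if os.path.isfile(script):
            rc = subprocess.call(["sh", script], env=env)
            results[pkg] = "pass" if rc == 0 else "fail"
    return results
```

The point of the naming convention is exactly this: a runner needs no per-package knowledge to find the test entry point.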

How do we decide such standards?
 * Proposal @ Automated testing list
 * TAP13 is just a website; maybe we need something similar to publish these standards on

Tim Bird on Test Names
 * Standardizing on test names would be nice so that we can compare across projects
 * Related to test case definition discussion
 * A good first step is to investigate how tests are named by various projects, starting with LTP.
 * How different are test case names for LTP?

Carlos Hernandez - Test case definition standards (proposal)

Goal: provide everything needed to run any test from any test framework. Proposed fields:
 * TGUID
 * Test Name
 * Description
 * Test Execution Engine (e.g. LAVA, Fuego, VATF)
 * Test script path
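The proposed record could be expressed as a simple schema. The field names follow the minutes; the class name, types, and example values are assumptions for illustration.

```python
from dataclasses import dataclass

# Sketch of the proposed test case definition record.
# Field names follow the minutes; types are an assumption.

@dataclass
class TestCaseDefinition:
    tguid: str             # TGUID, as listed in the proposal
    name: str              # Test Name
    description: str
    execution_engine: str  # e.g. "LAVA", "Fuego", "VATF"
    script_path: str       # Test script path
```

A shared schema like this is what would let one framework dispatch a test case defined by another.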

Next Steps

 * Agreement to start a public monthly meeting. Maria will organize.