= SLAV survey response =

Survey response provided by Pawel Wieczorek.

Survey Questions

 * What is the name of your test framework? SLAV

Which of the aspects of the CI loop below does your test framework perform?

 * Lab / Board Farm
 * Test Scheduler
 * DUT Control

Does your test framework:

source code access

 * access source code repositories for the software under test? no
 * access source code repositories for the test software? no
 * include the source for the test software? no
 * provide interfaces for developers to perform code reviews? no
 * detect that the software under test has a new version? yes
 * if so, how? by polling the artifact (image) server
 * detect that the test software has a new version? no

test definitions
Does your test system:
 * have a test definition repository? yes
 * if so, what data format or language is used? YAML

Does your test definition include:
 * source code (or source code location)? no
 * dependency information? no
 * execution instructions? yes
 * command line variants? no
 * environment variants? yes
 * setup instructions? yes (if specified in job description)
 * cleanup instructions? yes (if specified in job description)
 * if anything else, please describe:

Does your test system:
 * provide a set of existing tests? no, but LAVA YAML files can be reused
 * if so, how many?
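
Since the test definitions are LAVA-like YAML, a job description might look roughly like the sketch below. This is a hypothetical illustration only: the field names, the image URL, and the `odroid-u3` device type are assumptions, not the exact Weles schema.

```yaml
# Hypothetical LAVA-like job description; all keys and values are
# illustrative assumptions, not the actual Weles format.
device_type: odroid-u3          # matched against device capabilities
job_name: example-smoke-test

actions:
  - deploy:                     # install the software under test
      images:
        boot:
          url: https://example.org/images/boot.img
  - boot:
      method: u-boot
  - test:
      definitions:
        - name: smoke
          run:
            steps:              # setup, execution and cleanup instructions
              - mkdir -p /tmp/test
              - sh /tmp/test/smoke.sh
              - rm -rf /tmp/test
```

This matches the answers above: execution, setup and cleanup instructions live in the job description, while dependency information and source code locations do not.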

build management
Does your test system:
 * build the software under test (e.g. the kernel)? no
 * build the test software? no
 * build other software (such as the distro, libraries, firmware)? no
 * support cross-compilation? no
 * require a toolchain or build system for the SUT? no
 * require a toolchain or build system for the test software? no
 * come with pre-built toolchains? no
 * store the build artifacts for generated software? no
 * in what format is the build metadata stored (e.g. json)?
 * are the build artifacts stored as raw files or in a database?
 * if a database, what database?

Test scheduling/management
Does your test system:
 * check that dependencies are met before a test is run? yes (device capabilities specified in job description)
 * schedule the test for the DUT? yes
 * select an appropriate individual DUT based on SUT or test attributes? yes
 * reserve the DUT? yes
 * release the DUT? yes
 * install the software under test to the DUT? yes
 * install required packages before a test is run? no (unless specified in job description)
 * require particular bootloader on the DUT? no
 * deploy the test program to the DUT? no (unless specified in job description)
 * prepare the test environment on the DUT? no (unless specified in job description)
 * start a monitor (another process to collect data) on the DUT? no
 * start a monitor on external equipment? no
 * initiate the test on the DUT? yes
 * clean up the test environment on the DUT? no (unless specified in job description)

DUT control
Does your test system:
 * store board configuration data? yes
 * in what format? TOML (see https://github.com/toml-lang/toml)
 * store external equipment configuration data? yes
 * in what format? TOML
 * power cycle the DUT? yes
 * monitor the power usage during a run? yes
 * gather a kernel trace during a run? no
 * claim other hardware resources or machines (other than the DUT) for use during a test? no (except for DUT-Supervisor - described in glossary)
 * reserve a board for interactive use (ie remove it from automated testing)? yes
 * provide a web-based control interface for the lab? yes
 * provide a CLI control interface for the lab? yes
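
Board and external-equipment configuration is stored in TOML. The entry below is a hypothetical sketch only; the table and key names are assumptions made for illustration and are not the actual SLAV/Boruta schema.

```toml
# Hypothetical board entry; key names are illustrative assumptions.
[board]
name = "odroid-u3-01"
device_type = "odroid-u3"

[board.capabilities]            # matched against job requirements
architecture = "armv7l"
ram_mb = 2048

[supervisor]                    # DUT-Supervisor (NanoPi) reachable over the network
address = "192.168.1.10"
port = 22
```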

Run artifact handling
Does your test system:
 * store run artifacts? yes
 * in what format? controlled by the user in job description
 * put the run meta-data in a database? no
 * if so, which database?
 * parse the test logs for results? no
 * convert data from test logs into a unified format? no
 * if so, what is the format?
 * evaluate pass criteria for a test? no
 * do you have a common set of result names (e.g. pass, fail, skip, etc.)? no
 * if so, what are they?


 * How is run data collected from the DUT? controlled by the user in job description
 * e.g. by pushing from the DUT, or pulling from a server?
 * How is run data collected from external equipment? controlled by the user (API)
 * Is external equipment data parsed? no

User interface
Does your test system:
 * have a visualization system? yes (API client)
 * show build artifacts to users? yes (API)
 * show run artifacts to users? yes (API)
 * do you have a common set of result colors? no
 * if so, what are they?
 * generate reports for test runs? no
 * notify users of test results by e-mail? no


 * can you query (aggregate and filter) the build meta-data? yes
 * can you query (aggregate and filter) the run meta-data? yes

 * what language or data format is used for online results presentation? (e.g. HTML, Javascript, xml, etc.) N/A
 * what language or data format is used for reports? (e.g. PDF, excel, etc.) N/A


 * does your test system have a CLI control tool? yes (API client)
 * what is it called? leszy

Languages:
Examples: json, python, yaml, C, javascript, etc.
 * what is the base language of your test framework core? Go

What languages or data formats is the user required to learn? LAVA-like YAML

Can a user do the following with your test framework:

 * manually request that a test be executed? yes
 * see the results of recent tests? yes
 * set the pass criteria for a test? no
 * set the threshold value for a benchmark test? no
 * set the list of testcase results to ignore? no
 * provide a rating for a test? (e.g. give it 4 stars out of 5) no
 * customize a test?
 * alter the command line for the test program? yes
 * alter the environment of the test program? yes
 * specify to skip a testcase? yes
 * set a new expected value for a test? no
 * edit the test program source? yes
 * customize the notification criteria? no
 * customize the notification mechanism (eg. e-mail, text) no
 * generate a custom report for a set of runs? no
 * save the report parameters to generate the same report in the future? no

Requirements
Does your test framework:
 * require minimum software on the DUT? no
 * require minimum hardware on the DUT (e.g. memory)? no
 * If so, what?
 * require agent software on the DUT? no; the agent runs on the DUT-Supervisor (described in glossary)
 * If so, what agent? an agent for communication with the board farm manager (Boruta)
 * is there optional agent software or libraries for the DUT? no
 * require external hardware in your labs? no (although extra hardware is used for convenience)

APIs
Does your test framework:
 * use existing APIs or data formats to interact within itself, or with 3rd-party modules? yes
 * have a published API for any of its sub-module interactions (any of the lines in the diagram)? yes
 * Please provide a link or links to the APIs?
 * https://git.tizen.org/cgit/tools/boruta/plain/boruta.go
 * https://git.tizen.org/cgit/tools/weles/plain/swagger.yml

What is the nature of the APIs you currently use? Are they:
 * RPCs? yes (for machine-to-machine communication: Boruta - Dryad)
 * Unix-style? no
 * compiled libraries? no
 * interpreter modules or libraries? no
 * web-based APIs? yes
 * something else?

Relationship to other software:

 * what major components does your test framework use? none
 * does your test framework interoperate with other test frameworks or software? no
 * which ones?

Overview
Please list the major components of your test system.


 * Dryad: hardware layer consisting of DUT, DUT-Controller and DUT-Supervisor (described in glossary)
 * Boruta: board farm manager (access request scheduler)
 * Weles: test manager (converts YAML job descriptions into actions to perform; acts as a Boruta client to request access to DUTs)

Glossary
Here is a glossary of terms. Please indicate if your system uses different terms for these concepts. Also, please suggest any terms or concepts that are missing.

 * DUT controller - program and hardware for controlling a DUT (reboot, provision, etc.)

In SLAV this responsibility is divided between the DUT-Controller and the DUT-Supervisor:

 * DUT Controller - control hardware (MuxPi) that takes care of the physical aspects of DUT management (switching the power supply, jumpers/buttons, etc.)
 * DUT Supervisor - additional software on the NanoPi that provides the connection to the DUT and an abstraction for DUT management actions (e.g. the dut_boot, dut_login, dut_exec, dut_copyfrom and dut_copyto commands)
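
A typical interaction through these abstractions might look like the pseudocode session below. Only the command names come from the glossary; the argument syntax and ordering are assumptions made for illustration.

```
# Hypothetical DUT-Supervisor session (syntax is assumed, not documented here)
dut_boot                           # power-cycle and boot the DUT
dut_login root                     # open a login session on the DUT
dut_copyto ./smoke.sh /tmp/        # push a test script to the DUT
dut_exec /tmp/smoke.sh             # run it on the DUT
dut_copyfrom /tmp/results.log ./   # pull results back to the supervisor
```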


 * DUT scheduler - program for managing access to a DUT (take online/offline, make available for interactive use)
 * This is not shown in the CI Loop diagram - it could be the same as the Test Scheduler

The above is true of SLAV (Boruta).

= Additional Data =