Iperf test definition comparison

This page compares the test definitions from Fuego and Linaro for the iperf test.

= Differences =
 * Fuego only runs ...
 * Linaro runs ...

= High-level assumptions =

 * Fuego does not disturb the system
 * if something is installed, it is removed by default
 * if something is started, it is stopped
 * Fuego assumes another test can be run upon completion of one test


 * Linaro assumes a clean install that will be replaced on the next test
 * things can be modified (packages installed, and forgotten about)


 * Fuego treats the system like a final product that is immutable
 * Linaro treats the system like a development system that is mutable
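The immutable-target discipline above can be sketched as a cleanup-on-exit pattern. This is a minimal illustration only; the function names and the artifact path are hypothetical, not Fuego APIs:

```shell
#!/bin/sh
# Sketch of "leave the system as you found it": whatever setup
# installs, cleanup removes again, even if the test exits early.
TMP_BIN=/tmp/example_test_binary   # hypothetical artifact path

setup() {
    cp /bin/true "$TMP_BIN"        # "install" something for the test
}

cleanup() {
    rm -f "$TMP_BIN"               # remove it by default on exit
}
trap cleanup EXIT

setup
[ -x "$TMP_BIN" ] && echo "artifact present during the test run"
```

Registering the cleanup with `trap ... EXIT` is what makes the removal the default: it runs whether the test passes, fails, or aborts.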

= Building =

 * Fuego cross-builds the test software
 * Linaro does not build the software

= Prerequisites =

 * Fuego checks for cross-compiler variables
 * Linaro checks for root account
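The two styles of prerequisite check could look roughly like this. This is a sketch; the variable name, its value, and the messages are illustrative, not the actual Fuego or Linaro code:

```shell
#!/bin/sh
# Fuego-style: fail early if a cross-compile variable is not defined.
CROSS_COMPILE=arm-linux-gnueabihf-   # illustrative value
if [ -z "$CROSS_COMPILE" ]; then
    echo "ERROR: CROSS_COMPILE is not defined" >&2
    exit 1
fi
echo "toolchain prefix: $CROSS_COMPILE"

# Linaro-style: check whether the test is running as root.
if [ "$(id -u)" -eq 0 ]; then
    echo "running as root"
else
    echo "WARNING: not running as root" >&2
fi
```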

= Alterations =

 * Linaro can install packages required by the test on the board
 * Fuego deploys the test software to the board

= Execution =

 * Linaro runs the test for each test case separately
 * Fuego runs all test cases together in a single invocation


 * the factorization of the test is different
 * for Linaro, the dependency check, alterations, test execution, and parsing are all done on the board
 * for Fuego, the dependency check, test execution, and parsing are done on the host
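As a hedged sketch of the Fuego-style factorization, the host wraps each board command and keeps the log on the host side for later parsing. Here a plain local shell stands in for the board transport, and `board_cmd` is a hypothetical stand-in for Fuego's `cmd`/`report` helpers:

```shell
#!/bin/sh
# board_cmd runs a command "on the board" and returns its output to
# the host. A real implementation might be: ssh "$BOARD" "$@"
board_cmd() {
    sh -c "$*"
}

# The host composes the command, runs it, and stores the log locally,
# where the parser will later read it.
board_cmd "echo running on board" > /tmp/board_run.log
cat /tmp/board_run.log
```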

= Parsing =

 * Linaro parses the output for each test case on the target, using awk
 * Fuego parses the combined output on the host, using Python (parser.py)
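To illustrate the on-target awk style, a one-liner like the following could pull the rate out of an iperf log line. The sample line below is fabricated for the example; the field layout of real iperf output may differ:

```shell
#!/bin/sh
# Write a sample iperf-style client log line (illustrative only).
cat > /tmp/iperf_sample.log <<'EOF'
[  3]  0.0-15.0 sec   105 MBytes  58.7 Mbits/sec
EOF

# Print the last two fields (value and unit) of any line that
# reports a rate in bits/sec.
awk '/bits\/sec/ { print $(NF-1), $NF }' /tmp/iperf_sample.log
```

On the sample line this prints `58.7 Mbits/sec`.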

= Results =

 * the output format of the two tests differs

= Presentation =

 * Linaro doesn't include presentation control for the test results in the test

= Metadata =

 * Fuego specifies the author, license, and gitrepo for the test program
 * Linaro specifies the devices for the test to run on
 * Linaro specifies distros where test can run

= Questions =
 * Linaro install_deps: does this also install the package itself (with the openssl binary)?
 * Linaro: what does send-to-lava.sh do?

= Field comparisons =

= Fuego source =

fuego_test.sh
tarball=iperf-2.0.5.tar.gz

function test_build {
    # get updated config.sub and config.guess files, so configure
    # doesn't reject new toolchains
    cp /usr/share/misc/config.{sub,guess} .
    ./configure --host=$HOST --build=`./config.guess`
    sed -i -e "s|#define bool int|//#define bool int|g" config.h
    make config.h
    sed -i -e "s/#define HAVE_MALLOC 0/#define HAVE_MALLOC 1/g" -e "s/#define malloc rpl_malloc/\/\* #undef malloc \*\//g" config.h
    sed -i -e '/HEADERS\(\)/ a\#include "gnu_getopt.h"' src/Settings.cpp
    make
}

function test_deploy {
    put src/iperf $BOARD_TESTDIR/fuego.$TESTDIR/
}

function test_run {
    cmd "killall -SIGKILL iperf 2>/dev/null; exit 0"

    # Start iperf server on Jenkins host
    iperf_exec=`which iperf`
    if [ -z "$iperf_exec" ]; then
        echo "ERROR: Cannot find iperf"
        false
    else
        $iperf_exec -s &
    fi

    assert_define BENCHMARK_IPERF_SRV
    if [ "$BENCHMARK_IPERF_SRV" = "default" ]; then
        srv=$SRV_IP
    else
        srv=$BENCHMARK_IPERF_SRV
    fi

    report "cd $BOARD_TESTDIR/fuego.$TESTDIR; ./iperf -c $srv -t 15; ./iperf -c $srv -d -t 15" $BOARD_TESTDIR/fuego.$TESTDIR/${TESTDIR}.log
}

function test_cleanup {
    kill_procs iperf
}

test.yaml
= Linaro source =