OpenSSL test definition comparison


This page has a comparison between the test definitions from Fuego and Linaro for the OpenSSL test.

Differences

  • Fuego has two tests: Benchmark and Functional
    • the Fuego Benchmark test runs the openssl speed benchmark
    • the Fuego Functional test performs multiple functional tests
  • Linaro has one test: openssl-speed

High Level Assumptions

  • Fuego does not disturb the system
    • if something is installed, it is removed by default
    • if something is started, it is stopped
  • Fuego assumes you can run another test upon completion of one test
  • Linaro assumes a clean install that will be replaced on the next test
    • things can be modified (packages installed and forgotten about)
  • Fuego treats the system like a final product that is immutable
  • Linaro treats the system like a development system that is mutable

Preparation

Building

  • Fuego cross-builds the test software
  • Linaro does not build the software

Pre-requisites

  • Fuego checks for cross-compiler variables
  • Linaro checks for root account

Alterations

  • Linaro can install packages required by openssl on the board
  • Fuego deploys the test software to the board
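
Linaro's install_deps helper comes from sh-test-lib; roughly, it installs the listed packages with the board's native package manager unless it is asked to skip. The stand-in below is hypothetical and only sketches that behavior (the real helper detects the distribution's package manager, e.g. apt or dnf):

```shell
# Hypothetical stand-in for Linaro's install_deps from sh-test-lib
# (the real helper detects the board's package manager):
install_deps() {
    pkgs="$1"
    skip="${2:-false}"
    case "$skip" in
        true|True)
            echo "skipping package installation"
            return 0
            ;;
    esac
    # stand-in for something like: apt-get install -y ${pkgs}
    echo "would install: ${pkgs}"
}

install_deps "openssl" "True"
```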

Execution

  • Linaro runs the test for each crypto algorithm separately
  • Fuego runs the test for all crypto algorithms together
  • The factorization of the test steps differs
    • for Linaro, the dependency check, alterations, test execution, and parsing are all done on the board
    • for Fuego, the dependency check, test execution, and parsing are done on the host
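
The difference in invocation granularity can be sketched as a dry run (the commands are echoed rather than executed, and the algorithm list is abbreviated from the Linaro script):

```shell
# Linaro style: one `openssl speed` invocation per algorithm (echoed only)
for alg in md5 sha1 sha256 sha512 aes-128-cbc; do
    echo openssl speed "$alg"
done

# Fuego style: a single combined invocation that covers every algorithm
echo openssl speed
```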

Parsing

  • Linaro parses the output for each crypto test on the target using awk
  • Fuego parses the combined output on the host using python (parser.py)

Results

  • The output formats differ: Linaro writes one result line per test case to result.txt on the board, while Fuego computes its results on the host from the combined test log

Presentation

  • Linaro does not include presentation control for the test results in its test definition; Fuego's chart_config.json selects which results to chart

Metadata

  • Fuego specifies tags, the upstream git repository, and a packaging manifest for the test
  • Linaro specifies the OSes and devices for the test to run on

Questions

  • Linaro install_deps: does this also install the package itself (with the openssl binary)?
  • Linaro: what does send-to-lava.sh do?


Field comparisons

Field items

Fuego item | Fuego use | Linaro item | Linaro use | Notes
OpenSSL.sh:test_pre_check | check required test pre-requisites | openssl-speed.sh:! check_root && error_msg | check root pre-requisite | Linaro is free-form code in the script; Fuego is free-form code in a defined function
OpenSSL.sh:test_build | cross-build the test program | - | - | Linaro has no build instructions for this test
- | - | openssl-speed.sh:install_deps | install required packages for the test | Fuego has no notion of installing packages on the board
fuego_test.sh:test_deploy | put the test program on the board | - | - | Linaro does not put the test program on the board?
fuego_test.sh:test_run | instructions to execute the test program on the board | openssl-speed.yaml:run:steps | instructions to execute the test program on the board | -
parser.py | code to parse the test program log | openssl-speed.sh awk lines | code to parse the test program log | Linaro parsing is inline with the test code
spec.json | indicates values for test variables | - | - | -
test.yaml:fuego_package_version | indicates type/format of test | openssl-speed.yaml:metadata:format | indicates type/format of test | -
test.yaml:name | name of test | openssl-speed.yaml:metadata:name | name of test | similar
test.yaml:description | description of test | openssl-speed.yaml:metadata:description | description of test | similar
test.yaml:license/author/version | test program information | - | - | -
test.yaml:maintainer | maintainer of this Fuego test | openssl-speed.yaml:metadata:maintainer | maintainer of this Linaro test | similar
test.yaml:fuego_release | Fuego revision of this test | - | - | -
test.yaml:type | type of test | openssl-speed.yaml:metadata:scope | type of test? | -
- | - | openssl-speed.yaml:metadata:os | OSes that this test can run on | Linaro only?
- | - | openssl-speed.yaml:metadata:devices | devices that this test can run on | Linaro only? (Fuego board selection is done by the user when creating jobs for boards?)
test.yaml:tags | tags for this test | - | - | Fuego only?
test.yaml:params | test variable values (note: none in this test) | openssl-speed.yaml:params | test variable values | -
test.yaml:gitrepo | upstream git repository for the test program | - | - | Fuego only?
test.yaml:data_files | manifest used for packaging the test | - | - | Fuego only?
fuego_test.sh:test_run:report? | add data to the test log | add_metric | add data to the result file | some differences, but both can result in additional data in the results
chart_config.json | indicate the format and items from testcases to show in Jenkins | - | - | -

Fuego source

OpenSSL/OpenSSL.sh

function test_pre_check {
    assert_define "SDKROOT"
    assert_define "CC"
    assert_define "AR"
    assert_define "RANLIB"
}
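
assert_define is a Fuego core helper that aborts the test if a required environment variable is not set. The snippet below is a hypothetical minimal stand-in to show the idea, not the actual Fuego implementation (which lives in Fuego's core scripts and reports through Fuego's own logging):

```shell
# Hypothetical minimal stand-in for Fuego's assert_define helper:
# abort with an error if the named variable is unset or empty.
assert_define() {
    eval "value=\${$1:-}"
    if [ -z "$value" ]; then
        echo "ERROR: required variable $1 is not defined" >&2
        exit 1
    fi
}

CC=arm-linux-gnueabi-gcc
assert_define CC
echo "CC is defined"
```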

function test_build {
    SUFFIX=" --sysroot=${SDKROOT}"

    # ignore the none-linux-gnueabi - it's just a dummy placeholder
    ./Configure --cross-compile-prefix=${CROSS_COMPILE} shared zlib-dynamic os/compiler:none-linux-gnueabi

    # adjust Makefiles to use our tools and our sysroot
    # some of these may already have been adjusted by Configure
    sed -i -e "s#CC= cc#CC= ${CC}#g" -e "s#^AR= ar#AR= ${AR}#g" Makefile
    sed -i -e "s#CFLAG= #CFLAG= ${SUFFIX} #g"  Makefile
    sed -i -e "s#RANLIB= ranlib#RANLIB= ${RANLIB}#g" Makefile

    sed -i -e "s#CC=		cc#CC= ${CC}#g" -e "s#CFLAG=		#CFLAG= ${SUFFIX} #g"  apps/Makefile
    sed -i -e "s#CC=		cc#CC= ${CC}#g" -e "s#AR=		ar#AR= ${AR}#g" -e "s#CFLAG=		#CFLAG= ${SUFFIX} #g" crypto/Makefile
    sed -i -e "s#CC=cc#CC= ${CC} #g"  Makefile.shared

    # Configure puts $(CROSS_COMPILE) and value of $CROSS_COMPILE in CC
    # definition - fix that by removing the literal value
    sed -i -e "s#CC= \([$](CROSS_COMPILE)\)${CROSS_COMPILE}#CC= \1#" Makefile

    make
    echo '#!/bin/bash
    cd test
    ../util/opensslwrap.sh version -a
    ../util/shlib_wrap.sh ./bftest
    ../util/shlib_wrap.sh ./bntest
    ../util/shlib_wrap.sh ./casttest
    ../util/shlib_wrap.sh ./destest
    ../util/shlib_wrap.sh ./dhtest
    ../util/shlib_wrap.sh ./dsatest
    ../util/shlib_wrap.sh ./dummytest
    ../util/shlib_wrap.sh ./ecdhtest
    ../util/shlib_wrap.sh ./ecdsatest
    ../util/shlib_wrap.sh ./ectest
    ../util/shlib_wrap.sh ./enginetest
    ../util/shlib_wrap.sh ./evp_test evptests.txt
    ../util/shlib_wrap.sh ./exptest
    ../util/shlib_wrap.sh ./fips_dssvs
    ../util/shlib_wrap.sh ./fips_test_suite
    ../util/shlib_wrap.sh ./hmactest
    ../util/shlib_wrap.sh ./ideatest
    ../util/shlib_wrap.sh ./igetest
    ../util/shlib_wrap.sh ./md2test
    ../util/shlib_wrap.sh ./md4test
    ../util/shlib_wrap.sh ./md5test
    ../util/shlib_wrap.sh ./randtest
    ../util/shlib_wrap.sh ./rc2test
    ../util/shlib_wrap.sh ./rc4test
    ../util/shlib_wrap.sh ./rmdtest
    ../util/shlib_wrap.sh ./rsa_test
    ../util/shlib_wrap.sh ./sha1test
    ../util/shlib_wrap.sh ./sha256t
    ../util/shlib_wrap.sh ./sha512t
    ../util/shlib_wrap.sh ./shatest
    ../util/shlib_wrap.sh ./ssltest' > run-tests.sh
}

Benchmark.OpenSSL/fuego_test.sh

# some functions are shared between Benchmark.OpenSSL and Functional.OpenSSL
tarball=../OpenSSL/openssl-1.0.0t.tar.gz
source $FUEGO_CORE/tests/OpenSSL/openssl.sh

function test_deploy {
    put apps $BOARD_TESTDIR/fuego.$TESTDIR/
}

function test_run {
    report "cd $BOARD_TESTDIR/fuego.$TESTDIR; apps/openssl speed"
}

Benchmark.OpenSSL/parser.py

#!/usr/bin/python

import os, re, sys
import common as plib

ref_section_pat = "^\[[\d\w_.-]+.[gle]{2}\]"
cur_search_pat = re.compile("^(md5|aes-192 cbc|sha512)(\s*)([\d.k]{6,10})(\s*)([\d.k]{6,10})(\s*)([\d.k]{6,10})(\s*)([\d.k]{6,10})(\s*)([\d.k]{6,10})",re.MULTILINE)

cur_dict = {}

pat_result = plib.parse(cur_search_pat)
if pat_result:
	for i in range(3):
		testname = pat_result[i][0]
		if " " in testname:
			testname = pat_result[i][0].split(" ")[0] + "_" + pat_result[i][0].split(" ")[1]

		cur_dict[testname+".16bytes"] = pat_result[i][2].rstrip("k")
		cur_dict[testname+".64bytes"] = pat_result[i][4].rstrip("k")
		cur_dict[testname+".256bytes"] = pat_result[i][6].rstrip("k")
		cur_dict[testname+".1024bytes"] = pat_result[i][8].rstrip("k")
		cur_dict[testname+".8192bytes"] = pat_result[i][10].rstrip("k")

sys.exit(plib.process_data(ref_section_pat, cur_dict, 'm', 'Rate,1000s of bytes/second'))

Benchmark.OpenSSL/spec.json

{
    "testName": "Benchmark.OpenSSL",
    "specs": {
        "default": {}
    }
}

Benchmark.OpenSSL/test.yaml

fuego_package_version: 1
name: Benchmark.OpenSSL
description: |
    Measure the performance of OpenSSL libraries installed on the system.
license: OpenSSL license and SSLEAY
author: The OpenSSL Project
maintainer: Tim Bird <tim.bird@sony.com>
version: 1.0.0t
fuego_release: 1
type: Benchmark
tags: ['network', 'security', 'performance']
gitrepo: https://github.com/openssl/openssl
data_files:
    - chart_config.json
    - fuego_test.sh
    - parser.py
    - reference.log
    - spec.json
    - test.yaml
    - ../OpenSSL/openssl-1.0.0t.tar.gz
    - ../OpenSSL/openssl.sh

Benchmark.OpenSSL/chart_config.json

{
    "OpenSSL": ["aes-192_cbc","md5","sha512"]
}

Functional.OpenSSL/fuego_test.sh

# some functions are shared between Benchmark.OpenSSL and Functional.OpenSSL
tarball=../OpenSSL/openssl-1.0.0t.tar.gz
source $FUEGO_CORE/tests/OpenSSL/openssl.sh

function test_deploy {
    put apps util test run-tests.sh  $BOARD_TESTDIR/fuego.$TESTDIR/
}

function test_processing {
    P_CRIT="passed|ok"

    log_compare "$TESTDIR" "169" "${P_CRIT}" "p"
}

function test_run {
    report "cd $BOARD_TESTDIR/fuego.$TESTDIR; bash run-tests.sh"
}
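
log_compare is another Fuego core helper; roughly, it counts the log lines matching the pass-criteria pattern and compares that count against the expected value (169 for this test). A sketch of the idea using a fabricated three-line log (not the real helper, which also handles negative criteria and result reporting):

```shell
# Rough sketch of what log_compare "$TESTDIR" "169" "passed|ok" "p" checks:
# count lines matching the positive pattern, compare with the expected count.
printf 'bftest passed\nbntest ok\ncasttest failed\n' > sample.log
expected=2
actual=$(grep -E -c 'passed|ok' sample.log)
if [ "$actual" -eq "$expected" ]; then
    echo "PASS: $actual matches"
else
    echo "FAIL: expected $expected, got $actual"
fi
rm -f sample.log
```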

Functional.OpenSSL/spec.json

{
    "testName": "Functional.OpenSSL",
    "specs": {
        "default": {}
    }
}

LAVA source

openssl-speed.sh

#!/bin/sh
# shellcheck disable=SC1004
# shellcheck disable=SC1091

. ../../lib/sh-test-lib
OUTPUT="$(pwd)/output"
RESULT_FILE="${OUTPUT}/result.txt"

usage() {
    echo "Usage: $0 [-s <true|false>]" 1>&2
    exit 1
}

while getopts "s:" o; do
  case "$o" in
    s) SKIP_INSTALL="${OPTARG}" ;;
    *) usage ;;
  esac
done

! check_root && error_msg "You need to be root to run this script."
create_out_dir "${OUTPUT}"

pkgs="openssl"
install_deps "${pkgs}" "${SKIP_INSTALL}"

# Record openssl version as it has a big impact on test result.
openssl_version="$(openssl version | awk '{print $2}')"
add_metric "openssl-version" "pass" "${openssl_version}" "version"

# Test run.
cipher_commands="md5 sha1 sha256 sha512 des des-ede3 aes-128-cbc aes-192-cbc \
                aes-256-cbc rsa2048 dsa2048"
for test in ${cipher_commands}; do
    echo
    info_msg "Running openssl speed ${test} test"
    openssl speed "${test}" 2>&1 | tee "${OUTPUT}/${test}-output.txt"

    case "${test}" in
      # Parse asymmetric encryption output.
      rsa2048|dsa2048)
        awk -v test_case_id="${test}" 'match($1$2, test_case_id) \
            {printf("%s-sign pass %s sign/s\n", test_case_id, $(NF-1)); \
            printf("%s-verify pass %s verify/s\n", test_case_id, $NF)}' \
            "${OUTPUT}/${test}-output.txt" | tee -a "${RESULT_FILE}"
        ;;
      # Parse symmetric encryption output.
      des|des-ede3|aes-128-cbc|aes-192-cbc|aes-256-cbc)
        awk -v test_case_id="${test}" \
            '/^Doing/ {printf("%s-%s pass %d bytes/s\n", test_case_id, $7, $7*$10/3)}' \
            "${OUTPUT}/${test}-output.txt" | tee -a "${RESULT_FILE}"
        ;;
      *)
        awk -v test_case_id="${test}" \
            '/^Doing/ {printf("%s-%s pass %d bytes/s\n", test_case_id, $6, $6*$9/3)}' \
            "${OUTPUT}/${test}-output.txt" | tee -a "${RESULT_FILE}"
        ;;
    esac
done
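
The symmetric-cipher awk branches above derive bytes/s from the "Doing ..." progress lines that openssl speed prints: block size times the number of operations completed, divided by the 3-second run. A sketch with a fabricated sample line (the exact line format is assumed from typical openssl speed output):

```shell
# Fabricated "Doing" line fed through the same awk as the md5/sha* branch.
# Fields in this line: $6 = block size in bytes, $9 = operations in 3 seconds.
printf "Doing md5 for 3s on 16 size blocks: 3000 md5's in 3.00s\n" |
awk -v test_case_id="md5" \
    '/^Doing/ {printf("%s-%s pass %d bytes/s\n", test_case_id, $6, $6*$9/3)}'
# -> md5-16 pass 16000 bytes/s
```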

openssl-speed.yaml

metadata:
    format: Lava-Test Test Definition 1.0
    name: openssl-speed
    description: "Use openssl speed command to test the performance of
                  cryptographic algorithms"
    maintainer:
        - chase.qi@linaro.org
    os:
        - debian
        - ubuntu
        - fedora
        - centos
    scope:
        - performance
    devices:
        - juno
        - hi6220-hikey
        - apq8016-sbc
        - mustang
        - d03
        - d05

params:
    SKIP_INSTALL: "False"

run:
    steps:
        - cd ./automated/linux/openssl/
        - ./openssl-speed.sh -s "${SKIP_INSTALL}"
        - ../../utils/send-to-lava.sh ./output/result.txt