Running tests

The arvados git repository has a run-tests.sh script which tests (nearly) all of the components in the source tree. Jenkins at https://ci.arvados.org uses this exact script, so running it before pushing a new master is a good way to predict whether Jenkins will fail your build and start pestering you on IRC.

Background

These instructions assume you have the arvados source tree at ~/arvados, possibly with local modifications.

Install prerequisites

Follow instructions at Hacking prerequisites: "Install dev environment", etc. Don't miss creating the Postgres database, docker groups, etc.
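
For example, adding your user to the docker group (one of the steps covered there) typically looks like the sketch below; treat it as illustrative only and follow the Hacking prerequisites page for the authoritative steps.

sudo adduser $USER docker    # allow running docker without sudo (standard Docker setup, not Arvados-specific)
newgrp docker                # pick up the new group membership in the current shell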

Environment

Your locale must use UTF-8. Set the environment variable LANG=C.UTF-8 if necessary to ensure the "locale" command reports UTF-8.
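
A quick way to check, and to fix it for the current shell if needed:

locale                 # each LC_* value should mention UTF-8
export LANG=C.UTF-8    # set it for the current shell if it does not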

If you're Jenkins

Run all tests from a clean slate (slow, but more immune to leaks)

~/arvados/build/run-tests.sh WORKSPACE=~/arvados

If you're a developer

Cache everything needed to run test suites:

mkdir ~/.cache/arvados-build
~/arvados/build/run-tests.sh WORKSPACE=~/arvados --temp ~/.cache/arvados-build --only install

Start interactive mode:

$ ~/arvados/build/run-tests.sh WORKSPACE=~/arvados --temp ~/.cache/arvados-build --interactive

Start interactive mode with debug output enabled:

$ ~/arvados/build/run-tests.sh WORKSPACE=~/arvados ARVADOS_DEBUG=1 --temp ~/.cache/arvados-build --interactive

When prompted, choose a test suite to run:

== Interactive commands:
test TARGET
test TARGET:py3        (test with python3)
test TARGET -check.vv  (pass arguments to test)
install TARGET
install env            (go/python libs)
install deps           (go/python libs + arvados components needed for integration tests)
reset                  (...services used by integration tests)
exit
== Test targets:
cmd/arvados-client              lib/dispatchcloud/container     sdk/go/auth                     sdk/pam:py3                     services/fuse                   tools/crunchstat-summary
cmd/arvados-server              lib/dispatchcloud/scheduler     sdk/go/blockdigest              sdk/python                      services/fuse:py3               tools/crunchstat-summary:py3
lib/cli                         lib/dispatchcloud/ssh_executor  sdk/go/crunchrunner             sdk/python:py3                  services/health                 tools/keep-block-check
lib/cloud                       lib/dispatchcloud/worker        sdk/go/dispatch                 services/arv-git-httpd          services/keep-balance           tools/keep-exercise
lib/cloud/azure                 lib/service                     sdk/go/health                   services/crunch-dispatch-local  services/keepproxy              tools/keep-rsync
lib/cloud/ec2                   sdk/cwl                         sdk/go/httpserver               services/crunch-dispatch-slurm  services/keepstore              tools/sync-groups
lib/cmd                         sdk/cwl:py3                     sdk/go/keepclient               services/crunch-run             services/keep-web
lib/controller                  sdk/go/arvados                  sdk/go/manifest                 services/crunchstat             services/nodemanager
lib/crunchstat                  sdk/go/arvadosclient            sdk/go/stats                    services/dockercleaner          services/nodemanager:py3
lib/dispatchcloud               sdk/go/asyncbuf                 sdk/pam                         services/dockercleaner:py3      services/ws
What next? 

Example: testing lib/dispatchcloud/container, showing verbose/debug logs:

What next? test lib/dispatchcloud/container/ -check.vv
======= test lib/dispatchcloud/container
START: queue_test.go:99: IntegrationSuite.TestCancelIfNoInstanceType
WARN[0000] cancel container with no suitable instance type  ContainerUUID=zzzzz-dz642-queuedcontainer error="no suitable instance type" 
WARN[0000] cancel container with no suitable instance type  ContainerUUID=zzzzz-dz642-queuedcontainer error="no suitable instance type" 
START: queue_test.go:37: IntegrationSuite.TearDownTest
PASS: queue_test.go:37: IntegrationSuite.TearDownTest   0.846s

PASS: queue_test.go:99: IntegrationSuite.TestCancelIfNoInstanceType     0.223s

START: queue_test.go:42: IntegrationSuite.TestGetLockUnlockCancel
INFO[0001] adding container to queue                     ContainerUUID=zzzzz-dz642-queuedcontainer InstanceType=testType Priority=1 State=Queued
START: queue_test.go:37: IntegrationSuite.TearDownTest
PASS: queue_test.go:37: IntegrationSuite.TearDownTest   0.901s

PASS: queue_test.go:42: IntegrationSuite.TestGetLockUnlockCancel        0.177s

OK: 2 passed
PASS
ok      git.arvados.org/arvados.git/lib/dispatchcloud/container       2.150s
======= test lib/dispatchcloud/container -- 3s
Pass: lib/dispatchcloud/container tests (3s)
All test suites passed.

Running individual test cases

Most Go packages use gocheck. Pass gocheck command-line arguments like -check.f to select individual test cases.

What next? test lib/dispatchcloud/container -check.vv -check.f=LockUnlock
======= test lib/dispatchcloud/container
START: queue_test.go:42: IntegrationSuite.TestGetLockUnlockCancel
INFO[0000] adding container to queue                     ContainerUUID=zzzzz-dz642-queuedcontainer InstanceType=testType Priority=1 State=Queued
START: queue_test.go:37: IntegrationSuite.TearDownTest
PASS: queue_test.go:37: IntegrationSuite.TearDownTest   0.812s

PASS: queue_test.go:42: IntegrationSuite.TestGetLockUnlockCancel        0.184s

OK: 1 passed
PASS
ok      git.arvados.org/arvados.git/lib/dispatchcloud/container       1.000s
======= test lib/dispatchcloud/container -- 2s

Python

What next? test sdk/python --test-suite=tests.test_collections.TextModes.test_read_sailboat_across_block_boundary
======= test sdk/python
running test
running egg_info
writing requirements to arvados_python_client.egg-info/requires.txt
writing arvados_python_client.egg-info/PKG-INFO
writing top-level names to arvados_python_client.egg-info/top_level.txt
writing dependency_links to arvados_python_client.egg-info/dependency_links.txt
writing pbr to arvados_python_client.egg-info/pbr.json
reading manifest file 'arvados_python_client.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'arvados_python_client.egg-info/SOURCES.txt'
running build_ext
test_read_sailboat_across_block_boundary (tests.test_collections.TextModes) ... ok

----------------------------------------------------------------------
Ran 1 test in 0.014s

OK
======= test sdk/python -- 1s

RailsAPI

What next? test services/api TESTOPTS=--name=/.*signed.locators.*/
[...]
# Running:

....

Finished in 1.080084s, 3.7034 runs/s, 461.0751 assertions/s.

Workbench

What next? test apps/workbench_integration TESTOPTS="-v -n=/.*non-empty.*/" 
======= test apps/workbench_integration
[...]
# Running:

Using port 43855 for selenium
.

Finished in 11.831848s, 0.0845 runs/s, 0.1690 assertions/s.

Restarting services for integration tests

If you have changed services/api code, and you want to check whether it breaks the lib/dispatchcloud/container integration tests:

What next? reset                                # teardown the integration-testing environment
What next? install services/api                 # (only needed if you've updated dependencies)
What next? test lib/dispatchcloud/container     # bring up the integration-testing environment and run tests
What next? test lib/dispatchcloud/container     # leave the integration-testing environment up and run tests

Updating cache after pulling master

Always quit interactive mode and restart it after modifying run-tests.sh (via git pull, git checkout, editing, etc.).

When you start, run "install all" to get the latest gem/python dependencies, install updated versions of Arvados services used by integration tests, etc.

Then you can resume your cycle of "test lib/controller", etc.
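
Putting those steps together, a typical refresh after pulling master might look like this (using the cache directory and targets from the examples above):

What next? exit
$ cd ~/arvados && git pull
$ ~/arvados/build/run-tests.sh WORKSPACE=~/arvados --temp ~/.cache/arvados-build --interactive
What next? install all
What next? test lib/controller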

Controlling test order (Rails)

Rails tests start with a line like this:

Run options: -v -d --seed 57089

The seed value determines the order in which tests run. To reproduce an order-dependent test failure, specify the same seed as the previous failed run:

What next? test services/api TESTOPTS="-v -d --seed 57089" 

Other options

For more usage info, try:

~/arvados/build/run-tests.sh --help

Running workbench diagnostics tests

You can run workbench diagnostics tests against any production server.

Update your workbench application.yml to add a "diagnostics" section with the login token and pipeline details. The example configuration below runs the "qr1hi-p5p6p-ftcb0o61u4yd2zr" pipeline in the "qr1hi" environment.

diagnostics:
  secret_token: useanicelongrandomlygeneratedsecrettokenstring
  arvados_workbench_url: https://workbench.qr1hi.arvadosapi.com
  user_tokens:
    active: yourcurrenttokenintheenvironmenttowhichyouarepointing
  pipelines_to_test:
    pipeline_1:
      template_uuid: qr1hi-p5p6p-ftcb0o61u4yd2zr
      input_paths: []
      max_wait_seconds: 300

You can now run the "qr1hi" diagnostics tests using the following commands:

  cd $ARVADOS_HOME
  RAILS_ENV=diagnostics bundle exec rake TEST=test/diagnostics/pipeline_test.rb

Running workbench2 tests

React uses a lot of filesystem watchers (via inotify). The default limit on watched files is relatively low at 8192. Increase it with:

echo fs.inotify.max_user_watches=524288 | sudo tee -a /etc/sysctl.conf && sudo sysctl -p

Docker

The integration tests can be run on non-Debian-based systems using Docker. The workbench2 repository includes a Dockerfile that preinstalls the dependencies needed to run Cypress.

Build the Docker image using this command from within the arvados repository:

docker build . -f services/workbench2/docker/Dockerfile -t workbench2-build

Then, start the container using the following arguments:

xhost +local:root
ARVADOS_DIR=/path/to/arvados
docker run -ti -v$ARVADOS_DIR:/usr/src/arvados -w /usr/src/arvados --env="DISPLAY" --volume="/tmp/.X11-unix:/tmp/.X11-unix:rw" workbench2-build:latest bash

Docker Troubleshooting

No version of Cypress is installed

Solution: manually install the required Cypress version by running ./node_modules/.bin/cypress install inside the container.
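
For example, from inside the running container (the workbench2 path here is assumed from the docker run example above):

cd /usr/src/arvados/services/workbench2   # adjust if your workbench2 checkout lives elsewhere
./node_modules/.bin/cypress install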

Debian Host System

These instructions assume a Debian 10 (buster) host system.

Install the Arvados test dependencies:

echo "deb http://deb.debian.org/debian buster-backports main" > /etc/apt/sources.list.d/backports.list
apt-get update
apt-get install -y --no-install-recommends golang -t buster-backports
apt-get install -y --no-install-recommends build-essential ca-certificates git libpam0g-dev

Install a few more dependencies for workbench2:

apt-get update
apt-get install -y --no-install-recommends gnupg2 sudo curl
curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | sudo apt-key add -
echo "deb https://dl.yarnpkg.com/debian/ stable main" | sudo tee /etc/apt/sources.list.d/yarn.list
apt-get update
apt-get install -y --no-install-recommends yarn libgbm-dev
# we need the correct version of node to install cypress
# use arvados-server install to get it (and all other dependencies)
# this also means ./tools/run-integration-tests.sh will not have to
# repeat that work later
cd /usr/src/arvados
go mod download
cd cmd/arvados-server
go install
~/go/bin/arvados-server install -type test
cd <your-wb2-directory>
yarn run cypress install

Make sure you have both the arvados and arvados-workbench2 source trees available, then use the following commands from your workbench2 source tree (adjusting the path to the arvados source tree if necessary).

Running Tests

Run the unit tests with:

make unit-tests
# or
yarn test

Run the cypress integration tests with:

./tools/run-integration-tests.sh -i -a /usr/src/arvados
