
Add pytest-based acceptance and style tests #58

Merged: 1 commit merged into master on Dec 3, 2015

Conversation

@nchammas (Owner)

This is a work in progress.

Run py.test from Flintrock's root directory and you'll kick off a full set of acceptance tests. This replaces lint.sh and test-integration.sh.

I gather this is not a typical way to test software (e.g. the acceptance tests launch real clusters), but I think it's what makes the most sense for Flintrock.

Any input here would be great.

Fixes #49.

@nchammas (Owner, Author)

@shivaram and @broxtronix - I'm curious whether, from your experience with spark-ec2 and spark-gce, you have any thoughts on how to test Flintrock.

@shivaram

Is there a high-level description of what these tests are doing? I never found a good way to test spark-ec2, but I do see some mock implementations of boto (like https://github.com/spulec/moto) that might be useful for getting realistic tests -- @nchammas, have you had any experience with any of these?

@nchammas (Owner, Author)

> Is there a high-level description of what these tests are doing?

@shivaram - At a high level, these tests just invoke Flintrock's various commands the way a user would and confirm that the return codes and other output are as expected. Each test's name describes what it's testing for.

For example:

import subprocess

def test_try_launching_duplicate_cluster(running_cluster):
    # Try to launch a cluster whose name is already taken.
    p = subprocess.run(
        ['./flintrock', 'launch', running_cluster],
        stderr=subprocess.PIPE)
    assert p.returncode == 1
    assert p.stderr.startswith(b"Cluster already exists: ")

Given an already running cluster, this test confirms that Flintrock exits with a return code of 1 and a specific error message when asked to launch a new cluster with the same name.
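
For reference, running_cluster is a pytest fixture that hands each test the name of a live cluster. Here's a minimal sketch of how such a fixture might be defined, assuming a module-scoped launch and teardown; the cluster name and destroy flags are illustrative, not necessarily what this PR uses:

import subprocess

import pytest

@pytest.fixture(scope='module')
def running_cluster():
    """Launch a real cluster once for this module's tests, then destroy it."""
    cluster_name = 'flintrock-test-cluster'  # illustrative name
    subprocess.run(['./flintrock', 'launch', cluster_name], check=True)
    yield cluster_name
    subprocess.run(
        ['./flintrock', 'destroy', '--assume-yes', cluster_name], check=True)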

Other tests confirm that HDFS and Spark are functioning, or that files can be uploaded to the cluster, and so on.
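
As an illustration, a file-upload test in the same style might look like this (the copy-file invocation and paths are a sketch, not the exact test code in this PR):

import subprocess
import tempfile

def test_copy_file(running_cluster):
    # Copy a local temp file up to every node of the running cluster.
    with tempfile.NamedTemporaryFile() as f:
        p = subprocess.run([
            './flintrock', 'copy-file', running_cluster,
            f.name, '/tmp/copied-file'])
        assert p.returncode == 0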

> I never found a good way to test spark-ec2, but I do see some mock implementations of boto (like https://github.com/spulec/moto) that might be useful for getting realistic tests -- @nchammas, have you had any experience with any of these?

I've heard of Moto and similar libraries that mock out AWS, but I'm skeptical of their utility for Flintrock. As far as I can tell, they have two main problems:

  1. They don't help us test critical Flintrock features like whether Spark is functioning after a launch, or whether a file can be copied up to a cluster. We need a real cluster, with real hosts we can SSH into and install software on, to test these features.

    This alone is a critical issue that Moto cannot help with. Only full acceptance tests can confirm that Flintrock is behaving as expected, and it seems impractical to mock out subcomponents like a host for more granular tests. (For contrast, a sketch of the kind of API-level test Moto does enable follows this list.)

  2. Moto will always be playing catch-up with AWS's real behavior. The same goes for any other library that mocks a critical endpoint Flintrock interacts with, like an Amazon Linux host.

    Instead of doing all this work to mock out AWS, Linux hosts, storage, etc., it seems better to just test against the real thing. Using t2.small instances, which is what the tests in this PR default to, a full batch of acceptance tests costs roughly 4 t2.small instance-hours, which is about $0.104. Even as we add more tests, the cost is going to be less than a dollar.

    These end-to-end tests are slow, but they exercise critical features in the only way I can see working.
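
For contrast, here is the kind of API-level test Moto does enable: a sketch assuming boto3 and Moto's mock_ec2 decorator. It can verify bookkeeping against a faked EC2 API, but there is no real host behind the mocked instances to SSH into:

import boto3
from moto import mock_ec2

@mock_ec2
def test_launch_reserves_instances():
    # Moto intercepts the EC2 API calls; no real hosts come up.
    ec2 = boto3.client('ec2', region_name='us-east-1')
    ec2.run_instances(ImageId='ami-12345678', MinCount=2, MaxCount=2)
    reservations = ec2.describe_instances()['Reservations']
    assert sum(len(r['Instances']) for r in reservations) == 2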

@shivaram

Using a real EC2 instance sounds good to me. There are scenarios, say where you want to test whether SSDs are coming up right or whether all of the disks are getting formatted, that will be hard to cover with t2.small. But capturing the most essential features with t2.small sounds like a good idea.

@nchammas (Owner, Author)

I've been using m3.xlarge and sometimes even d2.xlarge instances to test the storage work in #51, but yeah for most things t2.small does the job.

Perhaps I can add storage tests that fail if a cluster with ephemeral storage (e.g. d2.xlarge instances) is launched but the storage doesn't come up correctly. On instances with no ephemeral storage, those tests would simply pass.
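
One way such a check might look, assuming a run-command subcommand that runs a command on every node and propagates a failing exit code; the mount point and flags here are illustrative:

import subprocess

def test_ephemeral_storage_mounted(running_cluster):
    # Fail if the ephemeral volumes were not formatted and mounted.
    p = subprocess.run([
        './flintrock', 'run-command', running_cluster, '--',
        'grep', '/media/ephemeral0', '/proc/mounts'])
    assert p.returncode == 0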

I'll look into that.

@nchammas (Owner, Author)

I've converted the lint tests from shell to Python, revamped the structure of the pytest-based tests, and added documentation.

A few items remain:

  • Testing operations against stopped clusters.
  • Running tests in parallel.

@nchammas changed the title from "[WIP] Add pytest-based acceptance tests" to "Add pytest-based acceptance and style tests" on Dec 1, 2015
@nchammas (Owner, Author) commented on Dec 1, 2015

I think getting these tests to run in parallel in a sensible way via pytest and its xdist plugin will depend on some resolution to this issue.

Right now, using py.test -n 4 will spin up multiple instances of a given fixture (e.g. a running t2.small cluster), which is wasteful and slow. What we want is for each fixture to be spun up exactly once, but in parallel (e.g. the t2.small and m3.medium clusters each get spun up once, at the same time). That's currently not possible.
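
To illustrate, with a parametrized fixture like the sketch below (names and flags are illustrative), the goal would be for xdist to build the t2.small and m3.medium clusters once each, in parallel, and route tests to the matching workers; today each worker instead rebuilds whatever fixtures its tests touch:

import subprocess

import pytest

@pytest.fixture(scope='session', params=['t2.small', 'm3.medium'])
def running_cluster(request):
    # Ideally built once per instance type, in parallel across xdist workers.
    cluster_name = 'flintrock-test-' + request.param.replace('.', '-')
    subprocess.run([
        './flintrock', 'launch', cluster_name,
        '--ec2-instance-type', request.param], check=True)
    yield cluster_name
    subprocess.run(
        ['./flintrock', 'destroy', '--assume-yes', cluster_name], check=True)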

@nchammas (Owner, Author) commented on Dec 3, 2015

I've created a proposal to improve how xdist distributes tests to workers. If implemented, it would give us a sane way to run Flintrock's tests in parallel using pytest.

pytest-dev/pytest-xdist#18

That'll be a while in the making, though, so I'm gonna merge this in since everything else is working.

nchammas added a commit that referenced this pull request on Dec 3, 2015: Add pytest-based acceptance and style tests
@nchammas merged commit ee27fc2 into master on Dec 3, 2015
@nchammas deleted the python-acceptance-tests branch on Dec 3, 2015
@nchammas mentioned this pull request on Dec 15, 2015