# Test specs
Test specs allow any test result, coming either from the `results` command directly or from sync tests with the `--poll` or `--wait` options, to be asserted by comparing the actual result with expected results defined by a spec JSON string or file. Specs follow the Test Results JSON output: the `response.data` tree structure returned from the `results` command is traversed, and all leaves matching the test specs definition are compared. E.g.:
`testspecs.json`:

```json
{
  "median": {
    "firstView": {
      "requests": 20,
      "render": 400,
      "loadTime": 3000,
      "score_gzip": {
        "min": 90
      }
    }
  }
}
```
By running:

```sh
$ webpagetest test http://staging.example.com --first --poll --specs testspecs.json
```
After the test runs and returns the following test results:
```json
{
  "response": {
    ...
    "data": {
      ...
      "median": {
        "firstView": {
          ...
          "requests": 15,
          "render": 500,
          "loadTime": 2500,
          "score_gzip": 70
          ...
        }
      }
    }
    ...
  }
}
```
It is compared to `testspecs.json`, and the output would be:
```
  WebPageTest
    ✓ median.firstView.requests: 15 should be less than 20
    1) median.firstView.render: 500 should be less than 400
    ✓ median.firstView.loadTime: 2500 should be less than 3000
    2) median.firstView.score_gzip: 70 should be greater than 90

  2 passing (3 ms)
  2 failing
```
With exit status:

```sh
$ echo $?
2
```
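The traversal-and-compare behavior above can be pictured as a small recursive walk over the spec tree. The sketch below is a simplified, hypothetical model of the comparator (the real wrapper builds a Mocha suite; the `runSpecs` helper and its return convention are illustrative only): plain numbers default to a lower-than check, `{min}`/`{max}` objects become bound checks, and the failure count doubles as the exit status.

```javascript
// Sketch of spec assertion under simplified, assumed semantics:
// each leaf of the spec tree is compared against the same path in the
// results tree. Plain numbers use the default "lower than" operation;
// objects with min/max use greater-than/lower-than bounds.
function runSpecs(spec, actual) {
  let failures = 0;
  for (const [key, expected] of Object.entries(spec)) {
    const value = (actual || {})[key];
    const isBound = expected !== null && typeof expected === 'object' &&
      ('min' in expected || 'max' in expected);
    if (expected !== null && typeof expected === 'object' && !isBound) {
      failures += runSpecs(expected, value); // branch node: keep walking
      continue;
    }
    const pass = isBound
      ? (!('min' in expected) || value > expected.min) &&
        (!('max' in expected) || value < expected.max)
      : value < expected; // default operation: lower than
    if (!pass) failures += 1;
  }
  return failures; // mirrors the exit status in the `$ echo $?` example
}
```

Running it with the spec and results from this page yields two failures (`render` and `score_gzip`), matching the output and exit status shown above.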
By default, all comparison operations are < (lower than), except when an object with `min` and/or `max` values is given; in that case, the operations used for comparison are > (greater than) for `min` and < (lower than) for `max`. When both `min` and `max` are given, a range comparison is used.
### Lower than comparison

```json
{
  "median": {
    "firstView": {
      "render": 400
    }
  }
}
```
or
```json
{
  "median": {
    "firstView": {
      "render": {
        "max": 400
      }
    }
  }
}
```
### Greater than comparison

```json
{
  "median": {
    "firstView": {
      "score_gzip": {
        "min": 75
      }
    }
  }
}
```
### Range comparison

```json
{
  "median": {
    "firstView": {
      "requests": {
        "min": 10,
        "max": 30
      }
    }
  }
}
```
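Under these rules, the operation applied to a leaf can be derived from its shape alone. A minimal sketch, assuming a hypothetical helper name (the symbols match the operation list later on this page):

```javascript
// Sketch: derive the comparison operation from the shape of a spec leaf.
// A plain number defaults to "<"; {min} means ">", {max} means "<",
// and both together mean a range check ("<>").
function pickOperation(expected) {
  if (expected === null || typeof expected !== 'object') return '<';
  if ('min' in expected && 'max' in expected) return '<>';
  if ('min' in expected) return '>';
  return '<';
}
```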
Default operations and label templates can optionally be defined inside a `defaults` property in the specs JSON file:
```json
{
  "defaults": {
    "suiteName": "Performance Test Suite for example.com",
    "text": ": {actual} should be {operation} {expected} for {metric}",
    "operation": ">"
  },
  "median": {
    "firstView": {
      "score_gzip": 80,
      "score_keep-alive": 80
    }
  }
}
```
The test suite name and spec texts will be used in the test output, and both scores should be greater than 80, as follows:

```
  Performance Test Suite for example.com
    1) 70 should be greater than 80 for median.firstView.score_gzip
    ✓ 100 should be greater than 80 for median.firstView.score_keep-alive

  1 passing (3 ms)
  1 failing
```
If the `defaults` property is omitted, the following properties are used:

```json
"defaults": {
  "suiteName": "WebPageTest",
  "text": "{metric}: {actual} should be {operation} {expected}",
  "operation": "<"
}
```
You might want to keep a budget on individual external resources such as `app.js`, `app.css`, or a particular font file. You can do so with regex matching; here is an example:
```json
{
  "median": {
    "firstView": {
      "requests": {
        "find": [
          {
            "key": "url",
            "pattern": "^/assets/.*/.*/_next/static/.*/pages/_app.js$",
            "spec": {
              "objectSizeUncompressed": {
                "max": 295400
              }
            }
          },
          {
            "key": "url",
            "pattern": "^/assets/.*/.*/_next/static/.*/pages/index.js$",
            "spec": {
              "objectSizeUncompressed": {
                "max": 742400
              }
            }
          },
          {
            "key": "url",
            "pattern": "^/assets/.*/.*/_next/static/chunks/commons.*.js$",
            "spec": {
              "objectSizeUncompressed": {
                "max": 1153533
              }
            }
          },
          {
            "key": "url",
            "pattern": "^/assets/.*/.*/_next/static/runtime/main-.*.js$",
            "spec": {
              "objectSizeUncompressed": {
                "max": 17010
              }
            }
          },
          {
            "key": "url",
            "pattern": "^/assets/.*/.*/_next/static/chunks/commons.*.chunk.css$",
            "spec": {
              "objectSizeUncompressed": {
                "max": 43200
              }
            }
          }
        ]
      }
    }
  }
}
```
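The matching step of a `find` spec can be pictured as a regex filter over the requests array, with the nested spec then checked on every match. A minimal sketch, assuming a hypothetical helper name and failure-message format, and handling only `max` bounds:

```javascript
// Sketch: apply "find" entries to a list of request objects. Each entry
// filters requests whose `key` field matches `pattern`, then checks the
// nested spec (only {max} bounds here) on every matched request.
function checkFindSpecs(requests, findSpecs) {
  const failures = [];
  for (const { key, pattern, spec } of findSpecs) {
    const re = new RegExp(pattern);
    for (const req of requests.filter((r) => re.test(r[key]))) {
      for (const [metric, bound] of Object.entries(spec)) {
        if ('max' in bound && !(req[metric] < bound.max)) {
          failures.push(
            `${req[key]} ${metric}: ${req[metric]} should be less than ${bound.max}`
          );
        }
      }
    }
  }
  return failures;
}
```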
The following placeholders are available in label templates:

- `{metric}`: metric name, e.g.: `median.firstView.loadTime`
- `{actual}`: the value returned from the actual test results, e.g.: 300
- `{operation}`: the long operation name, e.g.: lower than
- `{expected}`: the defined expected value, e.g.: 200
The long operation names are:

- `<` → lower than
- `>` → greater than
- `<>` → greater than and lower than (range)
- `=` → equal to
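Rendering a label from a template and these placeholders can be sketched as a simple substitution. The helper name below is hypothetical; the mapping mirrors the two lists above:

```javascript
// Sketch: substitute {metric}, {actual}, {operation}, {expected} into a
// label template, expanding the operation symbol to its long name.
const OPERATION_NAMES = {
  '<': 'lower than',
  '>': 'greater than',
  '<>': 'greater than and lower than',
  '=': 'equal to'
};

function renderLabel(template, values) {
  return template.replace(/\{(metric|actual|operation|expected)\}/g,
    (match, name) => name === 'operation'
      ? OPERATION_NAMES[values.operation]
      : String(values[name]));
}
```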
Overriding individual spec labels is also possible by providing `text` in the spec object:
```json
{
  "median": {
    "firstView": {
      "loadTime": {
        "text": "page load time took {actual}ms and should be no more than {expected}ms",
        "max": 3000
      }
    }
  }
}
```
Which outputs:
```
  WebPageTest
    ✓ page load time took 2500ms and should be no more than 3000ms

  1 passing (2 ms)
```
WebPageTest API Wrapper Test Specs use Mocha to build and run test suites. The following reporters are available:
- `dot` (default)
- `spec`
- `tap`
- `xunit`
- `list`
- `progress`
- `min`
- `nyan`
- `landing`
- `json`
- `doc`
- `markdown`
- `teamcity`
By either sync testing or just fetching results with the `--breakdown` option, it is possible to test by MIME type:
```json
{
  "median": {
    "firstView": {
      "breakdown": {
        "js": {
          "requests": 6,
          "bytes": 200000
        },
        "css": {
          "requests": 1,
          "bytes": 50000
        },
        "image": {
          "requests": 10,
          "bytes": 300000
        }
      }
    }
  }
}
```
The spec above only allows up to 6 JS requests summing up to 200 KB, 1 CSS request up to 50 KB, and no more than 10 images up to 300 KB total.
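Checking such a budget amounts to the same leaf-by-leaf lower-than comparison, this time over per-MIME-type tallies. A minimal sketch, assuming a hypothetical helper name and failure-message format:

```javascript
// Sketch: compare a breakdown budget spec against per-MIME-type tallies
// using the default "lower than" operation on each metric.
function checkBreakdown(breakdown, budget) {
  const failures = [];
  for (const [type, limits] of Object.entries(budget)) {
    for (const [metric, limit] of Object.entries(limits)) {
      const actual = ((breakdown || {})[type] || {})[metric] || 0;
      if (!(actual < limit)) {
        failures.push(
          `breakdown.${type}.${metric}: ${actual} should be lower than ${limit}`
        );
      }
    }
  }
  return failures;
}
```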
When sync testing in Chrome with the `--timeline` option, it is possible to test by processing breakdown:
```json
{
  "run": {
    "firstView": {
      "processing": {
        "RecalculateStyles": 1300,
        "Layout": 2000,
        "Paint": 800
      }
    }
  }
}
```
The spec above only allows up to 1300 ms of Recalculate Styles, 2000 ms of Layout, and 800 ms of Paint processing time. This helps avoid rendering regressions once these metrics are known from measuring previous tests multiple times.
Integration with Jenkins and other CI tools is seamless: use a sync test command with either `--poll` or `--wait` (if the Jenkins server is reachable from the private instance of the WebPageTest server), and specify a `--specs` file or JSON string with either `tap`* or `xunit` as the `--reporter`.

\* TAP plugin installed from the Jenkins Plugin Manager
Similarly to the Jenkins integration, Travis-CI also requires a sync test command via the `--poll` option, since it is very unlikely that Travis-CI workers are reachable from private or public instances of WebPageTest servers. `--specs` is obviously required to test the results, but `--reporter` is less important because Travis-CI relies on the exit status rather than on the output format, as Jenkins does.
The following is an example of a WebPageTest performance test for a contrived Node project in a GitHub public repo. Add a test script to the `package.json` file:
```json
{
  "name": "example",
  "version": "0.0.1",
  "dependencies": {
    "webpagetest": ""
  },
  "scripts": {
    "test": "./node_modules/webpagetest/bin/webpagetest
      test http://staging.example.com
      --server http://webpagetest.example.com
      --key $WPT_API_KEY
      --first
      --location MYVM:Chrome
      --poll
      --timeout 60
      --specs specs.json
      --reporter spec"
  }
}
```
\* The test script's line breaks were added for code clarity; it should be a single line.
The test script above will:

- schedule a test on a private instance of WebPageTest hosted at http://webpagetest.example.com, which must be publicly reachable from Travis-CI workers
- use the WebPageTest API key from `WPT_API_KEY` (environment variable; see Security below)
- test http://staging.example.com, which must be publicly reachable from WebPageTest agents
- run the test for first view only
- run from location MYVM on the Chrome browser
- poll results every 5 seconds (the default)
- time out in 60 seconds if no results are available
- test the results against the `specs.json` spec file
- output using the spec reporter
Since tests will be scheduled from public instances of Travis-CI workers, WebPageTest API keys (`--key` or `-k`) should be used to prevent abuse, but do not put unencrypted API keys in public files. Fortunately, Travis-CI provides an easy way to do this via secure environment variables, avoiding explicitly passing `$WPT_API_KEY` in the public `.travis.yml` file.
Install travis and go to the repo directory:

```sh
gem install travis
cd repo_dir
```
Next, encrypt the WebPageTest API key as a secure environment variable:

```sh
travis encrypt WPT_API_KEY=super_secret_api_key_here --add
```
Note that this must be run from the repo directory, or use `-r` or `--repo` to specify the repo name in the format `user/repo`, e.g.: `marcelduran/webpagetest-api`.
By default, the `--add` flag above will append the encrypted string to the `.travis.yml` file as:

```yaml
env:
  global:
    - secure: <encrypted WPT_API_KEY=super_secret_api_key_here string>
```
Similar to, but a bit simpler than, the Travis-CI integration, Drone.io has a user-friendly interface that allows straightforward and secure configuration.