Added logic for creating and managing DAG invocations
Signed-off-by: Kway Yi Shen <[email protected]>
NotAnAddictz committed Nov 26, 2024
1 parent 4fd007a commit 734925b
Showing 18 changed files with 611 additions and 66 deletions.
3 changes: 3 additions & 0 deletions .github/configs/wordlist.txt
@@ -23,6 +23,7 @@ workerNodeNum
durations
dur
ACM
Acyclic
addr
adservice
AdService
@@ -153,6 +154,7 @@ Daglis
DAGMode
datacenter
Datacenter
DAGs
dataflows
dataset
david
@@ -710,6 +712,7 @@ cgroups
noop
YAMLs
cgo
EnableDAGDataset
EnableMetricsScrapping
EnableZipkinTracing
EndpointPort
5 changes: 4 additions & 1 deletion cmd/config_knative_trace.json
@@ -24,5 +24,8 @@

"GRPCConnectionTimeoutSeconds": 15,
"GRPCFunctionTimeoutSeconds": 900,
"DAGMode": false
"DAGMode": false,
"EnableDAGDataset": true,
"Width": 2,
"Depth": 2
}
3 changes: 2 additions & 1 deletion cmd/loader.go
@@ -31,11 +31,12 @@ import (
"strings"
"time"

"golang.org/x/exp/slices"

"github.com/vhive-serverless/loader/pkg/common"
"github.com/vhive-serverless/loader/pkg/config"
"github.com/vhive-serverless/loader/pkg/driver"
"github.com/vhive-serverless/loader/pkg/trace"
"golang.org/x/exp/slices"

log "github.com/sirupsen/logrus"
tracer "github.com/vhive-serverless/vSwarm/utils/tracing/go"
6 changes: 6 additions & 0 deletions data/traces/example/dag_structure.csv
@@ -0,0 +1,6 @@
Width,Width - Percentile,Depth,Depth - Percentile,Total Nodes,Total Nodes - Percentile
1,0.00%,1,0.00%,2,0.00%
1,78.66%,1,12.51%,2,55.98%
2,92.13%,2,67.96%,3,79.20%
3,95.24%,3,86.84%,4,86.63%
4,100.00%,4,100.00%,5,100.00%
1 change: 1 addition & 0 deletions data/traces/reference/.gitattributes
@@ -1 +1,2 @@
*.tar.gz filter=lfs diff=lfs merge=lfs -text
dag_structure.csv filter=lfs diff=lfs merge=lfs -text
3 changes: 3 additions & 0 deletions data/traces/reference/dag_structure.csv
Git LFS file not shown
9 changes: 8 additions & 1 deletion docs/configuration.md
@@ -33,7 +33,10 @@
| MetricScrapingPeriodSeconds | int | > 0 | 15 | Period of Prometheus metrics scrapping |
| GRPCConnectionTimeoutSeconds | int | > 0 | 60 | Timeout for establishing a gRPC connection |
| GRPCFunctionTimeoutSeconds | int | > 0 | 90 | Maximum time given to function to execute[^5] |
| DAGMode | bool | true/false | false | Sequential invocation of all functions one after another |
| DAGMode | bool | true/false | false | Generates DAG workflows iteratively from the functions in TracePath[^7]. The Frequency and IAT of each DAG follow those of its entry function, while the Duration and Memory of each function follow their respective values in TracePath. |
| EnableDAGDataset | bool | true/false | true | Generate DAG width and depth from `data/traces/example/dag_structure.csv`[^8] |
| Width | int | > 0 | 2 | Default width of DAG |
| Depth | int | > 0 | 2 | Default depth of DAG |

[^1]: To run RPS experiments add suffix `-RPS`.

@@ -55,6 +58,10 @@ Lambda; https://aws.amazon.com/about-aws/whats-new/2018/10/aws-lambda-supports-f

[^6]: Dirigent specific

[^7]: The generated DAGs consist of unique functions. The shape of each DAG is determined either by `Width` and `Depth` or calculated based on `EnableDAGDataset`.

[^8]: A [data sample](https://github.com/icanforce/Orion-OSDI22/blob/main/Public_Dataset/dag_structure.xlsx) of DAG structures has been created from past Microsoft Azure traces. Width and Depth are determined according to the probabilities in this sample.

---

InVitro can cause failure on cluster manager components. To do so, please configure the `cmd/failure.json`. Make sure
48 changes: 47 additions & 1 deletion docs/loader.md
@@ -179,6 +179,53 @@ For more options, please see the `Makefile`.

For instructions on how to use the loader with OpenWhisk go to `openwhisk_setup/README.md`.

## Workflow Invocation
Generation of Directed Acyclic Graph (DAG) workflows is enabled by setting `"DAGMode": true` in `cmd/config_knative_trace.json` (as specified in [`docs/configuration.md`](../docs/configuration.md)).

Before invocation, DAGs are generated iteratively from the functions in `TracePath`, based on the parameters `Width`, `Depth`, and `EnableDAGDataset`, until the remaining functions are insufficient to form another DAG of the desired shape. Any leftover functions remain unused for the rest of the experiment.
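
The actual generator is part of the loader's Go code and is not shown in this diff; the snippet below is only a minimal, hypothetical sketch of the partitioning step just described. It assumes, for simplicity, that every DAG is filled to a maximum size of `1 + Width*(Depth-1)` functions; the real generator may vary the branch lengths, as the example that follows shows.

```go
package main

import "fmt"

// maxDAGSize is a hypothetical helper: with one entry function and up to
// `width` branches of length `depth-1`, a DAG needs at most this many functions.
func maxDAGSize(width, depth int) int {
    return 1 + width*(depth-1)
}

// partitionIntoDAGs greedily slices the function list into consecutive groups,
// one group per DAG, until too few functions remain to fill another DAG.
func partitionIntoDAGs(functions []string, width, depth int) [][]string {
    size := maxDAGSize(width, depth)
    var dags [][]string
    for len(functions) >= size {
        dags = append(dags, functions[:size])
        functions = functions[size:] // leftover functions stay unused
    }
    return dags
}

func main() {
    funcs := make([]string, 20)
    for i := range funcs {
        funcs[i] = fmt.Sprintf("f(%d)", i)
    }
    for i, dag := range partitionIntoDAGs(funcs, 3, 4) {
        fmt.Printf("DAG %d: %v\n", i+1, dag)
    }
}
```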

An example of the generated workflow can be seen here:

```bash
Functions available: 20
Width: 3
Depth: 4
EnableDAGDataset: false

DAG 1: f(0) -> f(1) -> f(3) -> f(5)
          \
           \ -> f(2) -> f(4) -> f(6)
            \
             \ -> f(7)

DAG 2: f(8) -> f(9) -> f(12) -> f(15)
          \
           \ -> f(10) -> f(13) -> f(16)
            \
             \ -> f(11) -> f(14) -> f(17)
```
In this example, a single invocation of DAG 1 results in 8 function invocations in total, with the branches invoked in parallel. The invocation frequency and IAT of DAGs 1 and 2 follow those of their entry functions, f(0) and f(8) respectively.
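
The driver code that performs the invocations is not shown in this diff; the sketch below only illustrates the parallel-per-branch semantics described above, with a hypothetical `invoke` stand-in for the real gRPC call.

```go
package main

import (
    "fmt"
    "sync"
)

// invoke stands in for the real gRPC invocation of a single function.
func invoke(name string) {
    fmt.Println("invoking", name)
}

// invokeBranches runs each branch concurrently; within a branch the functions
// are invoked sequentially, so f(1) completes before f(3), and so on.
func invokeBranches(branches [][]string) {
    var wg sync.WaitGroup
    for _, branch := range branches {
        wg.Add(1)
        go func(b []string) {
            defer wg.Done()
            for _, fn := range b {
                invoke(fn)
            }
        }(branch)
    }
    wg.Wait()
}

func main() {
    // DAG 1 from the example above: entry f(0), then three parallel branches.
    invoke("f(0)")
    invokeBranches([][]string{
        {"f(1)", "f(3)", "f(5)"},
        {"f(2)", "f(4)", "f(6)"},
        {"f(7)"},
    })
}
```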

To obtain [reference traces](https://github.com/vhive-serverless/invitro/blob/main/docs/sampler.md#reference-traces) for DAG execution, use the following commands:
```bash
git lfs pull
tar -xzf data/traces/reference/sampled_150.tar.gz -C data/traces
```
Microsoft has publicly released Azure traces of function invocations from 10/18/2021 to 10/31/2021. From this trace, a [data sample](https://github.com/icanforce/Orion-OSDI22/blob/main/Public_Dataset) of DAG structures was generated, representing the cumulative distribution of the width and depth of DAGs during that period. The shape of each generated DAG is sampled from this distribution. The file `data/traces/example/dag_structure.csv` provides a simplified sample of the publicly released traces.
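
As an illustration only, and assuming each percentile column of `dag_structure.csv` is interpreted as a cumulative distribution, a width (or depth) value could be sampled along these lines; the loader's actual sampling code may differ.

```go
package main

import (
    "fmt"
    "math/rand"
)

// cdfEntry pairs a value (e.g. a DAG width) with its cumulative probability.
type cdfEntry struct {
    value      int
    cumulative float64 // e.g. 0.7866 for 78.66%
}

// sample draws a value from the cumulative distribution: the first entry
// whose cumulative probability covers the random draw is returned.
func sample(cdf []cdfEntry, r *rand.Rand) int {
    p := r.Float64()
    for _, e := range cdf {
        if p <= e.cumulative {
            return e.value
        }
    }
    return cdf[len(cdf)-1].value
}

func main() {
    // Width column of data/traces/example/dag_structure.csv.
    widthCDF := []cdfEntry{{1, 0.7866}, {2, 0.9213}, {3, 0.9524}, {4, 1.0}}
    r := rand.New(rand.NewSource(42))
    fmt.Println("sampled width:", sample(widthCDF, r))
}
```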

By default, the shape of each DAG is calculated automatically at every iteration using the above-mentioned cumulative distribution.
To set the shape of the DAGs manually, change the following parameters in `cmd/config_knative_trace.json`. Note that the number of functions in `TracePath` must be large enough to support the maximum size of a DAG; this ensures that all generated DAGs have the same width and depth.
```bash
"EnableDAGDataset": false,
"Width": <width>,
"Depth": <depth>
```

Lastly, start the experiment. This invokes all the generated DAGs at their respective frequencies.
```bash
go run cmd/loader.go --config cmd/config_knative_trace.json
```

## Running on Cloud Using Serverless Framework

**Currently supported vendors:** AWS
@@ -204,7 +251,6 @@ For instructions on how to use the loader with OpenWhisk go to `openwhisk_setup/
```bash
go run cmd/loader.go --config cmd/config_knative_trace.json
```

---
Note:
- Current deployment is via container image.
9 changes: 9 additions & 0 deletions pkg/common/trace_types.go
@@ -24,6 +24,8 @@

package common

import "container/list"

type FunctionInvocationStats struct {
HashOwner string
HashApp string
@@ -103,3 +105,10 @@ type Function struct {

Specification *FunctionSpecification
}

// Node represents a single function vertex in a generated DAG workflow.
type Node struct {
    Function *Function    // function invoked at this node
    Branches []*list.List // outgoing branches, each a list of downstream nodes
    Depth    int          // depth of this node within the DAG
    DAG      string       // name of the DAG this node belongs to
}
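
For illustration only (this is not code from the commit), a small DAG could be assembled from `Node` values roughly as follows; the function names and the assumption that each branch list holds `*Node` elements are inferred from the field names, not taken from the source.

```go
package main

import (
    "container/list"
    "fmt"

    "github.com/vhive-serverless/loader/pkg/common"
)

func main() {
    // Entry node of a hypothetical DAG; function names are illustrative.
    entry := &common.Node{
        Function: &common.Function{Name: "trace-func-0"},
        Depth:    0,
        DAG:      "dag-0",
    }

    // One branch: downstream nodes invoked sequentially after the entry.
    branch := list.New()
    branch.PushBack(&common.Node{Function: &common.Function{Name: "trace-func-1"}, Depth: 1, DAG: "dag-0"})
    branch.PushBack(&common.Node{Function: &common.Function{Name: "trace-func-2"}, Depth: 2, DAG: "dag-0"})
    entry.Branches = append(entry.Branches, branch)

    for e := branch.Front(); e != nil; e = e.Next() {
        fmt.Println(e.Value.(*common.Node).Function.Name)
    }
}
```
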
12 changes: 12 additions & 0 deletions pkg/common/utilities.go
@@ -135,3 +135,15 @@ func SumNumberOfInvocations(withWarmup bool, totalDuration int, functions []*Fun

return result
}

// GetName parses the numeric function ID from the third dash-separated
// segment of the function's name; names beginning with "test" map to ID 0.
func GetName(function *Function) int {
    parts := strings.Split(function.Name, "-")
    if parts[0] == "test" {
        return 0
    }
    functionId, err := strconv.Atoi(parts[2])
    if err != nil {
        log.Fatal(err)
    }
    return functionId
}
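
A hypothetical usage example (not part of the commit), assuming function names carry their numeric ID in the third dash-separated segment:

```go
package main

import (
    "fmt"

    "github.com/vhive-serverless/loader/pkg/common"
)

func main() {
    // Hypothetical names used only for this sketch.
    fmt.Println(common.GetName(&common.Function{Name: "trace-func-7-2024"})) // 7
    fmt.Println(common.GetName(&common.Function{Name: "test-function"}))     // 0
}
```
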
3 changes: 3 additions & 0 deletions pkg/config/parser.go
@@ -80,6 +80,9 @@ type LoaderConfiguration struct {
GRPCConnectionTimeoutSeconds int `json:"GRPCConnectionTimeoutSeconds"`
GRPCFunctionTimeoutSeconds int `json:"GRPCFunctionTimeoutSeconds"`
DAGMode bool `json:"DAGMode"`
EnableDAGDataset bool `json:"EnableDAGDataset"`
Width int `json:"Width"`
Depth int `json:"Depth"`
}

func ReadConfigurationFile(path string) LoaderConfiguration {
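
A hedged sketch of how these new fields might be consumed when choosing a DAG's shape; the `dagShape` helper and the stub sampler are hypothetical, and only `ReadConfigurationFile` and the configuration fields come from this commit.

```go
package main

import (
    "fmt"

    "github.com/vhive-serverless/loader/pkg/config"
)

// dagShape picks the DAG width and depth: the fixed configuration values when
// EnableDAGDataset is off, otherwise values produced by a (hypothetical)
// dataset sampler supplied by the caller.
func dagShape(cfg config.LoaderConfiguration, sampleFromDataset func() (int, int)) (int, int) {
    if cfg.EnableDAGDataset {
        return sampleFromDataset()
    }
    return cfg.Width, cfg.Depth
}

func main() {
    cfg := config.ReadConfigurationFile("cmd/config_knative_trace.json")
    // Stub sampler used only for this sketch.
    w, d := dagShape(cfg, func() (int, int) { return 2, 3 })
    fmt.Printf("DAG shape: width=%d, depth=%d\n", w, d)
}
```
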
6 changes: 5 additions & 1 deletion pkg/config/parser_test.go
@@ -61,7 +61,11 @@ func TestConfigParser(t *testing.T) {
config.MetricScrapingPeriodSeconds != 15 ||
config.AutoscalingMetric != "concurrency" ||
config.GRPCConnectionTimeoutSeconds != 15 ||
config.GRPCFunctionTimeoutSeconds != 900 {
config.GRPCFunctionTimeoutSeconds != 900 ||
config.DAGMode != false ||
config.EnableDAGDataset != true ||
config.Width != 2 ||
config.Depth != 2 {

t.Error("Unexpected configuration read.")
}
6 changes: 5 additions & 1 deletion pkg/config/test_config.json
@@ -22,5 +22,9 @@

"GRPCConnectionTimeoutSeconds": 15,
"GRPCFunctionTimeoutSeconds": 900,
"DAGMode": false
"DAGMode": false,
"EnableDAGDataset": true,
"DAGEntryFunction": 0,
"Width": 2,
"Depth": 2
}
8 changes: 6 additions & 2 deletions pkg/config/test_config_aws.json
@@ -21,5 +21,9 @@

"GRPCConnectionTimeoutSeconds": 15,
"GRPCFunctionTimeoutSeconds": 900,
"DAGMode": false
}
"DAGMode": false,
"EnableDAGDataset": true,
"DAGEntryFunction": 0,
"Width": 2,
"Depth": 2
}