Feature: Add creation of Directed Acyclic Graphs (DAGs) to the existing DAG Driver #433
base: main
Conversation
Force-pushed d9e9ef0 to 9161a77
@leokondrashov: Is this still relevant, or should we close this PR?
I apologize for the late reply. I will review this pull request!
Force-pushed d75890a to fca5d95
Force-pushed a8f5730 to 3357247
Force-pushed cbc9203 to e35a93e
Dear @cvetkovic, The current version looks good to me. However, due to my limited experience in reviewing pull requests, I would greatly appreciate it if you could provide us with some feedback when you have a moment. Thank you very much!
cmd/loader.go
Outdated
func getTrace() bool {
	_, err := exec.Command("git", "lfs", "pull").CombinedOutput()
	if err != nil {
		log.Warnf("Failed to get traces")
		return false
	}
	_, err = exec.Command("tar", "-xzf", "data/traces/reference/sampled_150.tar.gz", "-C", "data/traces").CombinedOutput()
	if err != nil {
		log.Warnf("Failed to extract sample")
		return false
	}
	return true
}
I don't think the git and tar commands should be part of the loader. The user should be responsible for making sure these files exist and are in the proper format.
Please no proprietary formats. Use CSV.
pkg/config/parser.go
Outdated
	DAGTracePath string `json:"DAGTracePath"`
	EnableDAGDataset bool `json:"EnableDAGDataset"`
	Width int `json:"Width"`
	Depth int `json:"Depth"`
Indent properly.
Signed-off-by: Kway Yi Shen <[email protected]>
Signed-off-by: Kway Yi Shen <SamKway@localhost>
Signed-off-by: Kway Yi Shen <[email protected]>
LGTM. @leokondrashov If good, you can proceed with merging.
@wanghanchengchn Just fix the errors the linter reports.
Thank you! Dear @NotAnAddictz, could you please address the failed checks? Additionally, this branch is out-of-date with the base branch. Kindly rebase on the main branch and verify that all checks are passing. Thank you!
Looks good to me code-wise, but I'd like to have documentation for the feature in the repo, not only in the PR description.
I think the linter problem is not caused by you, but it's easy to fix by changing Fatalf to Fatal.
@@ -21,5 +21,9 @@
     "GRPCConnectionTimeoutSeconds": 15,
     "GRPCFunctionTimeoutSeconds": 900,
-    "DAGMode": false
+    "DAGMode": false,
+    "DAGTracePath": "data/traces/sampled_150/20",
Does this file exist by default?
@@ -19,7 +19,12 @@
  | MetricScrapingPeriodSeconds | int | > 0 | 15 | Period of Prometheus metrics scrapping |
  | GRPCConnectionTimeoutSeconds | int | > 0 | 60 | Timeout for establishing a gRPC connection |
  | GRPCFunctionTimeoutSeconds | int | > 0 | 90 | Maximum time given to function to execute[^4] |
- | DAGMode | bool | true/false | false | Sequential invocation of all functions one after another |
+ | DAGMode | bool | true/false | false | Parallel invocation of DAG with each function acting as entry function |
This is okay for the description of the config option, but we need proper documentation of this feature in the docs. Please consider writing down the generation and invocation processes of the DAG so the user can understand the output results without looking at the code.
@@ -19,7 +19,12 @@
  | MetricScrapingPeriodSeconds | int | > 0 | 15 | Period of Prometheus metrics scrapping |
  | GRPCConnectionTimeoutSeconds | int | > 0 | 60 | Timeout for establishing a gRPC connection |
  | GRPCFunctionTimeoutSeconds | int | > 0 | 90 | Maximum time given to function to execute[^4] |
- | DAGMode | bool | true/false | false | Sequential invocation of all functions one after another |
+ | DAGMode | bool | true/false | false | Parallel invocation of DAG with each function acting as entry function |
+ | DAGTracePath | string | string | data/traces/sampled_150/20 | Folder with Azure trace dimensions used for DAG invocation |
Please add a reference to the trace. It can be located with the rest of the docs (see other comment).
Summary
Previously, in PR #383, we added functionality for sequential invocation of each function. This commit builds on that feature: every function now acts as an entry point, a DAG structure is created for each entry function based on the width and depth distributions in data/traces/example/dag_structure.xlsx, and each DAG is invoked according to the frequency of its entry function.

Implementation Notes ⚒️
- Added the sampled_150 folder containing the folders for each group of functions, if required.
- Added Nodes to facilitate DAG generation.
- Added entriesWritten to functionsDriver to ensure all invocations are written in the output file.

External Dependencies 🍀
N/A

Breaking API Changes ⚠️
N/A