DPLT-980 Tests for wildcard matching #86
Merged
Changes from all commits (34 commits):
1c8d90c (gabehamilton) Utility to fetch blocks for tests. Wildcard tests for Coordinator ind…
3268dfa (gabehamilton) Allow snake case blocks to be returned from block server
4dd1935 (roshaans) feat: Stream blocks while executing (#87)
4296b11 (roshaans) [DPLT-1019] feat: store debug list in LocalStorage (#90)
7485e5b (gabehamilton) DPLT-929 historical filtering (#81)
956d41b (morgsmccauley) ci: Fix rust CI check workflow
1342f23 (roshaans) [DPLT-1009] feat: contract validation with wild cards (#93)
0c62837 (roshaans) fix: contract name
311cd92 (roshaans) [DPLT-1007] feat: Add link to QueryApi Docs (#95)
0e162fb (morgsmccauley) DPLT-1002 Publish current block height of indexer functions to CloudW…
3d995e8 (gabehamilton) DPLT-1020 Fetch indexing metadata file to determine last_indexed_bloc…
60ac30f (gabehamilton) Fixed timestamp field for lag log.
4ffa492 (roshaans) Refactor: Introduce Editor context refactoring (#98)
4d4905f (gabehamilton) Updated indexed historical data folder
b8dc5bb (gabehamilton) Limit unindexed block processing to two hours of blocks.
3e064b3 (roshaans) [DPLT-1021] feat: enable graphiql explorer plugin (#94)
27d134d (roshaans) DPLT-1028 feat: Fork Your Own indexer modal (#99)
fd08139 (gabehamilton) DPLT-936 message per block (#101)
7c33b12 (gabehamilton) Shell script for test blocks, updated test for block level matching r…
d61bc9e (gabehamilton) cargo fmt
3eb0009 (gabehamilton) Updated readme with test block info.
f0ab72d (gabehamilton) Merge branch 'main' into DPLT-980_wildcard_tests
01f0ddf (gabehamilton) Fixed merge error
b6f305a (gabehamilton) Refactor S3 and SQS operations into their own modules.
b80f90f (gabehamilton) S3 list operation now handles continuation tokens and directory listi…
d1e3e57 (gabehamilton) Small adjustments from PR feedback
0ed136e (gabehamilton) S3 list methods for wildcards and comma separated contracts.
851979c (gabehamilton) Full support for wildcard and CSV contract matching.
421f07d (gabehamilton) Merge pull request #111 from near/DPLT-980_wildcard_and_csv
4bec534 (gabehamilton) Merge pull request #110 from near/DPLT-980_s3_list_operation
f226866 (gabehamilton) Merge pull request #108 from near/DPLT-980_aws_refactor
da4e5a3 (gabehamilton) Merge branch 'main' into DPLT-980_wildcard_tests
38fb20e (gabehamilton) Clippy recommended fixes
05bcc3a (gabehamilton) The fmt of the clippy
.gitignore:

@@ -2,3 +2,4 @@
 .env*
 redis/
 *.log
+/indexer/blocks/
New file (shell script to fetch test blocks):

@@ -0,0 +1,9 @@
#!/bin/bash

mkdir -p ./blocks

# Iterate over all script arguments
for block_id in "$@"
do
  curl -o "./blocks/${block_id}.json" "https://70jshyr5cb.execute-api.eu-central-1.amazonaws.com/block/${block_id}?snake_case=true"
done
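The script's file name and location are not shown in this extract; presumably it is run from the indexer directory with the block heights the tests need as arguments (for example 93085141), so that its ./blocks output lines up with the ../blocks/ path the test helpers below read from.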
New tests (Rust), appended after build_indexer_rule_match_payload:

Review comment (on the new tests module): Can we also test for a wildcard which doesn't match anything to avoid false positives here?

@@ -115,3 +115,160 @@ fn build_indexer_rule_match_payload(
        }
    }
}

#[cfg(test)]
mod tests {
    use crate::outcomes_reducer::reduce_indexer_rule_matches_from_outcomes;
    use crate::types::indexer_rule_match::{ChainId, IndexerRuleMatch};
    use indexer_rule_type::indexer_rule::{IndexerRule, IndexerRuleKind, MatchingRule, Status};
    use near_lake_framework::near_indexer_primitives::StreamerMessage;

    fn read_local_file(path: &str) -> String {
        std::fs::read_to_string(path).unwrap()
    }

    fn read_local_streamer_message(block_height: u64) -> StreamerMessage {
        let path = format!("../blocks/{}.json", block_height);
        let json = serde_json::from_str(&read_local_file(&path)).unwrap();
        serde_json::from_value(json).unwrap()
    }

    #[tokio::test]
    async fn match_wildcard_no_match() {
        let wildcard_rule = IndexerRule {
            indexer_rule_kind: IndexerRuleKind::Action,
            matching_rule: MatchingRule::ActionAny {
                affected_account_id: "*.nearcrow.near".to_string(),
                status: Status::Success,
            },
            id: None,
            name: None,
        };

        let streamer_message = read_local_streamer_message(93085141);
        let result: Vec<IndexerRuleMatch> = reduce_indexer_rule_matches_from_outcomes(
            &wildcard_rule,
            &streamer_message,
            ChainId::Testnet,
        )
        .await
        .unwrap();

        assert_eq!(result.len(), 0);
    }

    #[tokio::test]
    async fn match_wildcard_contract_subaccount_name() {
        let wildcard_rule = IndexerRule {
            indexer_rule_kind: IndexerRuleKind::Action,
            matching_rule: MatchingRule::ActionAny {
                affected_account_id: "*.nearcrowd.near".to_string(),
                status: Status::Success,
            },
            id: None,
            name: None,
        };

        let streamer_message = read_local_streamer_message(93085141);
        let result: Vec<IndexerRuleMatch> = reduce_indexer_rule_matches_from_outcomes(
            &wildcard_rule,
            &streamer_message,
            ChainId::Testnet,
        )
        .await
        .unwrap();

        assert_eq!(result.len(), 1); // There are two matches, until we add Extraction we are just matching the first one (block matching)
    }

    #[tokio::test]
    async fn match_wildcard_mid_contract_name() {
        let wildcard_rule = IndexerRule {
            indexer_rule_kind: IndexerRuleKind::Action,
            matching_rule: MatchingRule::ActionAny {
                affected_account_id: "*crowd.near".to_string(),
                status: Status::Success,
            },
            id: None,
            name: None,
        };

        let streamer_message = read_local_streamer_message(93085141);
        let result: Vec<IndexerRuleMatch> = reduce_indexer_rule_matches_from_outcomes(
            &wildcard_rule,
            &streamer_message,
            ChainId::Testnet,
        )
        .await
        .unwrap();

        assert_eq!(result.len(), 1); // see Extraction note in previous test

        let wildcard_rule = IndexerRule {
            indexer_rule_kind: IndexerRuleKind::Action,
            matching_rule: MatchingRule::ActionAny {
                affected_account_id: "app.nea*owd.near".to_string(),
                status: Status::Success,
            },
            id: None,
            name: None,
        };

        let result: Vec<IndexerRuleMatch> = reduce_indexer_rule_matches_from_outcomes(
            &wildcard_rule,
            &streamer_message,
            ChainId::Testnet,
        )
        .await
        .unwrap();

        assert_eq!(result.len(), 1); // see Extraction note in previous test
    }

    #[tokio::test]
    async fn match_csv_account() {
        let wildcard_rule = IndexerRule {
            indexer_rule_kind: IndexerRuleKind::Action,
            matching_rule: MatchingRule::ActionAny {
                affected_account_id: "notintheblockaccount.near, app.nearcrowd.near".to_string(),
                status: Status::Success,
            },
            id: None,
            name: None,
        };

        let streamer_message = read_local_streamer_message(93085141);
        let result: Vec<IndexerRuleMatch> = reduce_indexer_rule_matches_from_outcomes(
            &wildcard_rule,
            &streamer_message,
            ChainId::Testnet,
        )
        .await
        .unwrap();

        assert_eq!(result.len(), 1); // There are two matches, until we add Extraction we are just matching the first one (block matching)
    }

    #[tokio::test]
    async fn match_csv_wildcard_account() {
        let wildcard_rule = IndexerRule {
            indexer_rule_kind: IndexerRuleKind::Action,
            matching_rule: MatchingRule::ActionAny {
                affected_account_id: "notintheblockaccount.near, *.nearcrowd.near".to_string(),
                status: Status::Success,
            },
            id: None,
            name: None,
        };

        let streamer_message = read_local_streamer_message(93085141);
        let result: Vec<IndexerRuleMatch> = reduce_indexer_rule_matches_from_outcomes(
            &wildcard_rule,
            &streamer_message,
            ChainId::Testnet,
        )
        .await
        .unwrap();

        assert_eq!(result.len(), 1); // There are two matches, until we add Extraction we are just matching the first one (block matching)
    }
}
Review comment (on the downloaded block data): The tests explicitly depend on this data, should we just commit it to the repo?
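Purely as an illustration of the alternative raised by that question (nothing in this PR implements it), the test helper could fetch a missing block from the same block-server endpoint on first use, so the JSON would not need to be committed or fetched manually beforehand. The helper name ensure_local_block is hypothetical.

use std::path::Path;
use std::process::Command;

// Hypothetical helper, not part of this PR: download the block JSON on first use
// (same endpoint as the shell script above) if the local file does not exist yet.
fn ensure_local_block(block_height: u64) -> std::io::Result<String> {
    let path = format!("../blocks/{}.json", block_height);
    if !Path::new(&path).exists() {
        std::fs::create_dir_all("../blocks")?;
        let url = format!(
            "https://70jshyr5cb.execute-api.eu-central-1.amazonaws.com/block/{}?snake_case=true",
            block_height
        );
        // Shell out to curl to avoid adding an HTTP client dependency to the test crate.
        let status = Command::new("curl")
            .args(["-sS", "-o", path.as_str(), url.as_str()])
            .status()?;
        if !status.success() {
            return Err(std::io::Error::new(
                std::io::ErrorKind::Other,
                format!("failed to download block {}", block_height),
            ));
        }
    }
    std::fs::read_to_string(&path)
}

fn main() -> std::io::Result<()> {
    let json = ensure_local_block(93085141)?;
    println!("fetched {} bytes", json.len());
    Ok(())
}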