diff --git a/README.md b/README.md
index 7e8f6c5e0..cd069149b 100644
--- a/README.md
+++ b/README.md
@@ -6,47 +6,71 @@ Requires Python 3.6

## It is a who that does what now? What is STIX Patterning? What are STIX Observations?

-[Structured Threat Information Expression (STIX™)](https://oasis-open.github.io/cti-documentation/) is a language and serialization format used to exchange cyber threat intelligence (CTI). STIX 2 Patterning is a part of STIX that deals with the "matching things" part of STIX, which is an integral component of STIX Indicators.
+[Structured Threat Information Expression (STIX™)](https://oasis-open.github.io/cti-documentation/) is a language and serialization format used to exchange cyber threat intelligence (CTI). STIX 2 Patterning is a part of STIX that deals with the "matching things" part of STIX, which is an integral component of STIX Indicators. This library takes in STIX 2 Patterns as input, and "finds" data that matches the patterns inside various products that house repositories of cybersecurity data. Examples of such products include SIEM systems, endpoint management systems, threat intelligence platforms, orchestration platforms, network control points, data lakes, and more.

-In addition to "finding" the data using these patterns, STIX-Shifter uniquely also *transforms the output* into STIX 2 Observations. Why would we do that you ask? To put it simply - so that all of the security data, regardless of the source, mostly looks and behaves the same. As anyone with experience in data science will tell you, the cleansing and normalizing of the data accross domains, is one of the largest hurdles to overcome with attempting to build cross-platform security analytics. This is one of the barriers we are attempting to break down with STIX Shifter.
+In addition to "finding" the data using these patterns, STIX-Shifter uniquely also _transforms the output_ into STIX 2 Observations. Why would we do that, you ask? To put it simply: so that all of the security data, regardless of the source, mostly looks and behaves the same. As anyone with experience in data science will tell you, cleansing and normalizing data across domains is one of the largest hurdles to overcome when attempting to build cross-platform security analytics. This is one of the barriers we are attempting to break down with STIX Shifter.
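For a concrete sense of the syntax, a minimal STIX 2 pattern looks like the following (illustrative observable values only):

```
[ipv4-addr:value = '198.51.100.1'] OR [url:value = 'http://example.com/malware']
```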
## This sounds like Sigma, I already have that

-[Sigma](https://github.com/Neo23x0/sigma) and STIX Patterning have goals that are related, but at the end of the day have slightly different scopes. While Sigma seeks to be "for log files what Snort is for network traffic and YARA is for files", STIX Patterning's goal is to encompass *all three* fundamental security data source types - network, file, and log - and do so simultaneously, allowing you to create complex queries and analytics that span domains. As such, so does STIX Shifter. We feel it is critical to be able to create search patterns that span SIEM, Endpoint, Network, and File levels, in order to detect the complex patterns used in modern campaigns.
+[Sigma](https://github.com/Neo23x0/sigma) and STIX Patterning have goals that are related, but at the end of the day have slightly different scopes.
+While Sigma seeks to be "for log files what Snort is for network traffic and YARA is for files", STIX Patterning's goal is to encompass _all three_ fundamental security data source types - network, file, and log - and do so simultaneously, allowing you to create complex queries and analytics that span domains. STIX Shifter follows suit. We feel it is critical to be able to create search patterns that span the SIEM, Endpoint, Network, and File levels in order to detect the complex patterns used in modern campaigns.

## Why would I want to use this?

You may want to use this library and/or contribute to development, if any of the following are true:

-* You are a vendor or project owner who wants to add some form of query or enrichment functionality to your product capabilities
-* You are an end user and want to have a way to script searches and/or queries as part of your orchestrsation flow
-* You are a vendor or project owner who has data that could be made available, and you want to contribute an adapter
-* You just want to help make the world a safer place!
+- You are a vendor or project owner who wants to add some form of query or enrichment functionality to your product capabilities
+- You are an end user and want a way to script searches and/or queries as part of your orchestration flow
+- You are a vendor or project owner who has data that could be made available, and you want to contribute an adapter
+- You just want to help make the world a safer place!

# How to use

-## Converting from STIX Patterns to data source queries
+## Converting from STIX Patterns to data source queries (query) or from data source results to STIX cyber observables (results)

### Call the stix_shifter in the format of

```
usage: stix_shifter.py translate [-h]
-                                 {qradar,dummy}
-                                 {results,query} data
+                                 {qradar, dummy, splunk}
+                                 {results, query} data

positional arguments:
-  {qradar,dummy}            What translation module to use
-  {results,query}           What translation action to perform
-  data source               A STIX identity object
-  data                      The data to be translated
+  {qradar, dummy, splunk}   What translation module to use
+  {results, query}          What translation action to perform
+  data source               A STIX identity object
+  data                      STIX pattern or data to be translated

optional arguments:
  -h, --help  show this help message and exit
  -x          run STIX validation on each observable as it's written to the output JSON
```

+## Connecting to a data source
+
+### Call the stix_shifter in the format of
+
+```
+usage: stix_shifter.py transmit [-h]
+                                {async_dummy, synchronous_dummy, qradar, splunk, bigfix}
+
+positional arguments:
+  {<module>}                                        Transmission module to use
+  {"host": <host>, "port": <port>, "cert": <cert>}  Data source connection
+  {"auth": <auth>}                                  Data source authentication
+  {
+    "type": <type>,               Transmission operation to be performed
+    "search_id": <search_id>      (for results and status),
+    "query": <query>              (for query),
+    "offset": <offset>            (for results),
+    "length": <length>            (for results)
+  }
+
+optional arguments:
+  -h, --help  show this help message and exit
+```

### Example of converting a STIX pattern to an IBM QRadar AQL query:

[See the QRadar module documentation](stix_shifter/src/modules/qradar/README.md)
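For example, a transmission call might look like the following sketch (hypothetical host and token values; the argument shapes follow the transmit parser in the `main.py` hunk below):

```
python main.py transmit qradar '{"host": "qradar.example.com", "port": 443}' '{"auth": {"SEC": "<api-token>"}}' status "<search_id>"
```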
diff --git a/main.py b/main.py
index b1475fd37..1bdea3749 100644
--- a/main.py
+++ b/main.py
@@ -4,30 +4,48 @@
 import json

+TRANSLATE = 'translate'
+TRANSMIT = 'transmit'
+
+
 def __main__():
     """
-    In the case of converting a stix pattern to datasource query, arguments will take the form of...
-
-    The module and translate_type will determine what module and method gets called
-    Options argument comes in as:
+    Stix-shifter can be called to either translate or transmit.
+    In the case of translation, stix-shifter either translates a stix pattern to a datasource query,
+    or converts data source query results into JSON of STIX observations.
+    Arguments will take the form of...
+    "translate"
+    The module and translate_type will determine what module and method gets executed.
+    Option arguments come in as:
     "{
-        "select_fields": <select fields>,
-        "mapping": <mapping>,
+        "select_fields": <select fields> (In the case of QRadar),
+        "mapping": <mapping>,
         "result_limit": <result limit>,
-        "timerange": <time range>
+        "timerange": <time range>
     }"
+    In the case of transmission, stix-shifter connects to a datasource to execute queries, status updates, and result retrieval.
+    Arguments will take the form of...
+    "transmit" '{"host": <host>, "port": <port>, "cert": <cert>}', '{"auth": <auth>}',
+    <
+        query <query>,
+        status <search_id>,
+        results <search_id> <offset> <length>,
+        ping,
+        is_async
+    >
     """
     # process arguments
-    parser = argparse.ArgumentParser(description='stix_shifter')
-    subparsers = parser.add_subparsers(dest='command')
+    parent_parser = argparse.ArgumentParser(description='stix_shifter')
+    parent_subparsers = parent_parser.add_subparsers(dest='command')

     # translate parser
-    translate_parser = subparsers.add_parser(
-        'translate', help='Translate a query or result set using a specific translation module')
+    translate_parser = parent_subparsers.add_parser(
+        TRANSLATE, help='Translate a query or result set using a specific translation module')
+    # positional arguments
     translate_parser.add_argument(
-        'module', choices=stix_shifter.MODULES, help='what translation module to use')
+        'module', choices=stix_shifter.TRANSLATION_MODULES, help='what translation module to use')
     translate_parser.add_argument('translate_type', choices=[
         stix_shifter.RESULTS, stix_shifter.QUERY], help='what translation action to perform')
     translate_parser.add_argument(
@@ -35,27 +53,64 @@ def __main__():
     translate_parser.add_argument(
         'data', type=str, help='the data to be translated')
     translate_parser.add_argument('options', nargs='?', help='options that can be passed in')
+    # optional arguments
     translate_parser.add_argument('-x', '--stix-validator', action='store_true',
                                   help='run stix2 validator against the converted results')
     translate_parser.add_argument('-m', '--data-mapper',
                                   help='module to use for the data mapper')
-    args = parser.parse_args()

+    # transmit parser
+    transmit_parser = parent_subparsers.add_parser(
+        TRANSMIT, help='Connect to a datasource and execute a query...')
+
+    # positional arguments
+    transmit_parser.add_argument(
+        'module', choices=stix_shifter.TRANSMISSION_MODULES,
+        help='choose which connection module to use'
+    )
+    transmit_parser.add_argument(
+        'connection',
+        type=str,
+        help='Data source connection with host, port, and certificate'
+    )
+    transmit_parser.add_argument(
+        'configuration',
+        type=str,
+        help='Data source authentication'
+    )
+
+    # operation subparser
+    operation_subparser = transmit_parser.add_subparsers(title="operation", dest="operation_command")
+    operation_subparser.add_parser(stix_shifter.PING, help="Pings the data source")
+    query_operation_parser = operation_subparser.add_parser(stix_shifter.QUERY, help="Executes a query on the data source")
+    query_operation_parser.add_argument('query_string', help='native datasource query string')
+    results_operation_parser = operation_subparser.add_parser(stix_shifter.RESULTS, help="Fetches the results of the data source query")
+    results_operation_parser.add_argument('search_id', help='uuid of executed query')
+    results_operation_parser.add_argument('offset', help='offset of results')
+    results_operation_parser.add_argument('length', help='length of results')
+    status_operation_parser = operation_subparser.add_parser(stix_shifter.STATUS, help="Gets the current status of the query")
+    status_operation_parser.add_argument('search_id', help='uuid of executed query')
+    operation_subparser.add_parser(stix_shifter.IS_ASYNC, help='Checks if the query operation is asynchronous')
+
+    args = parent_parser.parse_args()

     if args.command is None:
-        parser.print_help(sys.stderr)
+        parent_parser.print_help(sys.stderr)
         sys.exit(1)

-    options = json.loads(args.options) if bool(args.options) else {}
-    if args.stix_validator:
-        options['stix_validator'] = args.stix_validator
-    if args.data_mapper:
-        options['data_mapper'] = args.data_mapper
-    shifter = stix_shifter.StixShifter()
-    result = shifter.translate(
-        args.module, args.translate_type, args.data_source, args.data, options=options)
+
+    shifter = stix_shifter.StixShifter()
+
+    if args.command == TRANSLATE:
+        options = json.loads(args.options) if bool(args.options) else {}
+        if args.stix_validator:
+            options['stix_validator'] = args.stix_validator
+        if args.data_mapper:
+            options['data_mapper'] = args.data_mapper
+        result = shifter.translate(
+            args.module, args.translate_type, args.data_source, args.data, options=options)
+    elif args.command == TRANSMIT:
+        result = shifter.transmit(args)

     print(result)
     exit(0)
diff --git a/setup.py b/setup.py
index 97888c8b2..852a98e9a 100644
--- a/setup.py
+++ b/setup.py
@@ -45,7 +45,7 @@
     # This is a one-line description or tagline of what your project does. This
     # corresponds to the "Summary" metadata field:
     # https://packaging.python.org/specifications/core-metadata/#summary
-    description='Tools and interface to translate STIX formatted results and queries to different data source formats',  # Required
+    description='Tools and interface to translate STIX formatted results and queries to different data source formats and to set up appropriate connection strings for invoking and triggering actions in OpenWhisk',  # Required

     # This is an optional longer description of your project that represents
     # the body of text which users will see when they visit PyPI.
@@ -179,6 +179,11 @@
     #     'sample=sample:main',
     #     ],
     # },
+    entry_points={
+        'console_scripts': [
+            'stix-transmission=stix_transmission.stix_transmission:main',
+        ],
+    },

     # List additional URLs that are relevant to your project as a dict.
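With the `console_scripts` entry point above, installing the package generates a `stix-transmission` command on the PATH. Conceptually it resolves to the following (a sketch; the `main` function itself is not shown in this diff):

```python
# What the stix-transmission console script executes after installation:
from stix_transmission.stix_transmission import main  # module path per the entry_points declaration

main()
```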
# diff --git a/stix_shifter/src/modules/qradar/stix_to_aql.py b/stix_shifter/src/modules/qradar/stix_to_aql.py index d75504a5f..f6f8211db 100644 --- a/stix_shifter/src/modules/qradar/stix_to_aql.py +++ b/stix_shifter/src/modules/qradar/stix_to_aql.py @@ -7,6 +7,9 @@ logger = logging.getLogger(__name__) +DEFAULT_LIMIT = 10000 +DEFAULT_TIMERANGE = 5 + class StixToAQL(BaseQueryTranslator): @@ -25,8 +28,8 @@ def transform_query(self, data, options, mapping=None): query_object = generate_query(data) data_model_mapper = qradar_data_mapping.QRadarDataMapper(options) - result_limit = options['result_limit'] if 'result_limit' in options else 10000 - timerange = options['timerange'] if 'timerange' in options else 5 + result_limit = options['result_limit'] if 'result_limit' in options else DEFAULT_LIMIT + timerange = options['timerange'] if 'timerange' in options else DEFAULT_TIMERANGE query_string = aql_query_constructor.translate_pattern( query_object, data_model_mapper, result_limit, timerange) return query_string diff --git a/stix_shifter/stix_shifter.py b/stix_shifter/stix_shifter.py index b38fefd39..6cae31237 100644 --- a/stix_shifter/stix_shifter.py +++ b/stix_shifter/stix_shifter.py @@ -1,14 +1,20 @@ -import sys import importlib from stix_shifter.src.patterns.parser import generate_query from stix2patterns.validator import run_validator from stix_shifter.src.stix_pattern_parser import stix_pattern_parser import re +from stix_transmission import stix_transmission +import json -MODULES = ['qradar', 'dummy', 'car', 'cim', 'splunk', 'elastic'] +TRANSLATION_MODULES = ['qradar', 'dummy', 'car', 'cim', 'splunk', 'elastic'] +TRANSMISSION_MODULES = ['async_dummy', 'synchronous_dummy', 'qradar', 'splunk', 'bigfix'] RESULTS = 'results' QUERY = 'query' +DELETE = 'delete' +STATUS = 'status' +PING = 'ping' +IS_ASYNC = 'is_async' class StixValidationException(Exception): @@ -27,7 +33,7 @@ def translate(self, module, translate_type, data_source, data, options={}): """ Translated queries to a specified format :param module: What module to use - :type module: one of MODULES 'qradar', 'dummy' + :type module: one of TRANSLATION_MODULES 'qradar', 'dummy' :param translate_type: translation of a query or result set must be either 'results' or 'query' :type translate_type: str :param data: the data to translate @@ -38,7 +44,7 @@ def translate(self, module, translate_type, data_source, data, options={}): :rtype: str """ - if module not in MODULES: + if module not in TRANSLATION_MODULES: raise NotImplementedError translator_module = importlib.import_module( @@ -70,3 +76,44 @@ def translate(self, module, translate_type, data_source, data, options={}): return interface.translate_results(data_source, data, options) else: raise NotImplementedError + + def transmit(self, args): + """ + Connects to datasource and executes a query, grabs status update or query results + :param args: + args: '{"host": , "port": , "cert": }', '{"auth": }', + < + query , + status , + results , + ping, + is_async + > + """ + connection_dict = json.loads(args.connection) + configuration_dict = json.loads(args.configuration) + operation_command = args.operation_command + + connector = stix_transmission.StixTransmission(args.module, connection_dict, configuration_dict) + + if operation_command == QUERY: + query = args.query_string + result = connector.query(query) + elif operation_command == STATUS: + search_id = args.search_id + result = connector.status(search_id) + elif operation_command == RESULTS: + search_id = args.search_id + offset = 
args.offset + length = args.length + result = connector.results(search_id, offset, length) + elif operation_command == DELETE: + search_id = args.search_id + result = connector.delete(search_id) + elif operation_command == PING: + result = connector.ping() + elif operation_command == IS_ASYNC: + result = connector.is_async() + else: + raise NotImplementedError + return result diff --git a/tests/car_json_to_stix/__init__.py b/stix_transmission/__init__.py similarity index 100% rename from tests/car_json_to_stix/__init__.py rename to stix_transmission/__init__.py diff --git a/tests/patterns/__init__.py b/stix_transmission/src/__init__.py similarity index 100% rename from tests/patterns/__init__.py rename to stix_transmission/src/__init__.py diff --git a/tests/qradar_json_to_stix/__init__.py b/stix_transmission/src/modules/__init__.py similarity index 100% rename from tests/qradar_json_to_stix/__init__.py rename to stix_transmission/src/modules/__init__.py diff --git a/tests/qradar_stix_to_aql/__init__.py b/stix_transmission/src/modules/async_dummy/__init__.py similarity index 100% rename from tests/qradar_stix_to_aql/__init__.py rename to stix_transmission/src/modules/async_dummy/__init__.py diff --git a/stix_transmission/src/modules/async_dummy/async_dummy_connector.py b/stix_transmission/src/modules/async_dummy/async_dummy_connector.py new file mode 100644 index 000000000..79f199d40 --- /dev/null +++ b/stix_transmission/src/modules/async_dummy/async_dummy_connector.py @@ -0,0 +1,17 @@ +from ..base.base_connector import BaseConnector +from .async_dummy_ping import AsyncDummyPing +from .async_dummy_query_connector import AsyncDummyQueryConnector +from .async_dummy_status_connector import AsyncDummyStatusConnector +from .async_dummy_results_connector import AsyncDummyResultsConnector + + +class Connector(BaseConnector): + def __init__(self, connection, configuration): + host = connection.get('host') + port = connection.get('port') + path = connection.get('path') + self.query_connector = AsyncDummyQueryConnector(host, port, path) + self.status_connector = AsyncDummyStatusConnector(host, port, path) + self.results_connector = AsyncDummyResultsConnector(host, port, path) + self.is_async = True + self.ping_connector = AsyncDummyPing(host, port, path) diff --git a/stix_transmission/src/modules/async_dummy/async_dummy_ping.py b/stix_transmission/src/modules/async_dummy/async_dummy_ping.py new file mode 100644 index 000000000..8fe0c57b4 --- /dev/null +++ b/stix_transmission/src/modules/async_dummy/async_dummy_ping.py @@ -0,0 +1,11 @@ +from ..base.base_ping import BasePing + + +class AsyncDummyPing(BasePing): + def __init__(self, host, port, path): + self.host = host + self.port = port + self.path = path + + def ping(self): + return 'async ping' diff --git a/stix_transmission/src/modules/async_dummy/async_dummy_query_connector.py b/stix_transmission/src/modules/async_dummy/async_dummy_query_connector.py new file mode 100644 index 000000000..6324bbfeb --- /dev/null +++ b/stix_transmission/src/modules/async_dummy/async_dummy_query_connector.py @@ -0,0 +1,32 @@ +from ..base.base_query_connector import BaseQueryConnector + + +class AsyncDummyQueryConnector(BaseQueryConnector): + def __init__(self, host, port, path): + self.host = host + self.port = port + self.path = path + + def create_query_connection(self, query): + # set headers + headers = { + "Content-Type": "application/json", + "Accept": "application/json" + } + + # construct request object, purely for visual purposes in dummy implementation + 
request = { + "host": self.host, + "path": self.path + query, + "port": self.port, + "headers": headers, + "method": "POST" + } + + # return a mocked request + print(request) + + return { + "response_code": 200, + "query_id": "uuid_1234567890" + } diff --git a/stix_transmission/src/modules/async_dummy/async_dummy_results_connector.py b/stix_transmission/src/modules/async_dummy/async_dummy_results_connector.py new file mode 100644 index 000000000..e142fb6b9 --- /dev/null +++ b/stix_transmission/src/modules/async_dummy/async_dummy_results_connector.py @@ -0,0 +1,52 @@ +from ..base.base_results_connector import BaseResultsConnector +from ..base.base_status_connector import Status +import time + +QUERY_ID_TABLE = { + "uuid_1234567890": Status.COMPLETED.value, + "uuid_not_done": Status.RUNNING.value, + "uuid_should_error": Status.ERROR.value +} + +RETURN_DUMMY_DATA = { + "uuid_1234567890": "some data" +} + + +class AsyncDummyResultsConnector(BaseResultsConnector): + def __init__(self, host, port, path): + self.host = host + self.port = port + self.path = path + + def create_results_connection(self, query_id, offset, length): + # set headers + headers = { + "Content-Type": "application/json", + "Accept": "application/json" + } + + # construct request object, purely for visual purposes in dummy implementation + request = { + "host": self.host, + "path": self.path + query_id, + "port": self.port, + "headers": headers, + "method": "GET" + } + + print(request) + time.sleep(3) + return_obj = {} + + if QUERY_ID_TABLE[query_id] == Status.COMPLETED.value and RETURN_DUMMY_DATA[query_id]: + return_obj["success"] = True + return_obj["data"] = RETURN_DUMMY_DATA[query_id] + elif QUERY_ID_TABLE[query_id] == Status.RUNNING.value: + return_obj["success"] = False + return_obj["error"] = "Query is not finished processing" + else: + return_obj["success"] = False + return_obj["error"] = "Error: query results not found" + + return return_obj diff --git a/stix_transmission/src/modules/async_dummy/async_dummy_status_connector.py b/stix_transmission/src/modules/async_dummy/async_dummy_status_connector.py new file mode 100644 index 000000000..8a3a893b9 --- /dev/null +++ b/stix_transmission/src/modules/async_dummy/async_dummy_status_connector.py @@ -0,0 +1,42 @@ +from ..base.base_status_connector import BaseStatusConnector +from ..base.base_status_connector import Status + +QUERY_ID_TABLE = { + "uuid_1234567890": Status.COMPLETED.value, + "uuid_not_done": Status.RUNNING.value, +} + + +class AsyncDummyStatusConnector(BaseStatusConnector): + def __init__(self, host, port, path): + self.host = host + self.port = port + self.path = path + + def create_status_connection(self, query_id): + # set headers + headers = { + "Content-Type": "application/json", + "Accept": "application/json" + } + + # construct request object, purely for visual purposes in dummy implementation + request = { + "host": self.host, + "path": self.path + query_id, + "port": self.port, + "headers": headers, + "method": "GET" + } + + print(request) + return_obj = {} + + if query_id not in QUERY_ID_TABLE: + return_obj["success"] = False + return_obj["error"] = "query id does not exist" + else: + return_obj["success"] = True + return_obj["status"] = QUERY_ID_TABLE[query_id] + + return return_obj diff --git a/tests/splunk_json_to_stix/__init__.py b/stix_transmission/src/modules/base/__init__.py similarity index 100% rename from tests/splunk_json_to_stix/__init__.py rename to stix_transmission/src/modules/base/__init__.py diff --git 
a/stix_transmission/src/modules/base/base_connector.py b/stix_transmission/src/modules/base/base_connector.py new file mode 100644 index 000000000..b79866180 --- /dev/null +++ b/stix_transmission/src/modules/base/base_connector.py @@ -0,0 +1,100 @@ +from .base_ping import BasePing +from .base_query_connector import BaseQueryConnector +from .base_status_connector import BaseStatusConnector +from .base_delete_connector import BaseDeleteConnector +from .base_results_connector import BaseResultsConnector + + +class BaseConnector: + def __init__(self, connection, configuration): + """ + Args: + connection (dict): The datasource connection info. + configuration (dict): The datasource configuration info. + """ + self.query_connector = BaseQueryConnector() + self.status_connector = BaseStatusConnector() + self.delete_connector = BaseDeleteConnector() + self.results_connector = BaseResultsConnector() + self.ping = BasePing() + + def create_query_connection(self, query): + """ + Creates a connection to the specified datasource to send a query + + Args: + query (str): The datasource query. + + Returns: + dict: The return value. + keys: + success (bool): True or False + search_id (str): query ID + error (str): error message (when success=False) + """ + return self.query_connector.create_query_connection(query) + + def create_status_connection(self, search_id): + """ + Creates a connection to the specified datasource to determine the status of a given query + + Args: + search_id (str): The datasource query ID. + + Returns: + dict: The return value. + keys: + success (bool): True or False + status (enum 'Status'): + progress (int): percentage of progress (0-100) + error (str): error message (when success=False) + """ + return self.status_connector.create_status_connection(search_id) + + def create_results_connection(self, search_id, offset, length): + """ + Creates a connection to the specified datasource to retrieve query results + + Args: + search_id (str): The datasource query ID. + offset: data offset to start fetch from. + length: data length to fetch + + Returns: + dict: The return value. + keys: + success (bool): True or False + data (str): The query result data + error (str): error message (when success=False) + """ + return self.results_connector.create_results_connection(search_id, offset, length) + + def delete_query_connection(self, search_id): + """ + Deletes a query from the specified datasource + + Args: + search_id (str): The datasource query ID. + + Returns: + dict: The return value. + keys: + success (bool): True or False + error (str): error message (when success=False) + """ + return self.delete_connector.delete_query_connection(search_id) + + def ping(self): + """ + Sends a basic request to the datasource to confirm we are connected and authenticated + + Args: + search_id (str): The datasource query ID. + + Returns: + dict: The return value. 
+ keys: + success (bool): True or False + error (str): error message (when success=False) + """ + return self.ping_connector.ping() diff --git a/stix_transmission/src/modules/base/base_delete_connector.py b/stix_transmission/src/modules/base/base_delete_connector.py new file mode 100644 index 000000000..b9a4a8159 --- /dev/null +++ b/stix_transmission/src/modules/base/base_delete_connector.py @@ -0,0 +1,19 @@ +from abc import ABCMeta, abstractmethod + + +class BaseDeleteConnector(object, metaclass=ABCMeta): + @abstractmethod + def delete_query_connection(self, search_id): + """ + Deletes a query from the specified datasource + + Args: + search_id (str): The datasource query ID. + + Returns: + dict: The return value. + keys: + success (bool): True or False + error (str): error message (when success=False) + """ + pass diff --git a/stix_transmission/src/modules/base/base_ping.py b/stix_transmission/src/modules/base/base_ping.py new file mode 100644 index 000000000..319fd8bf5 --- /dev/null +++ b/stix_transmission/src/modules/base/base_ping.py @@ -0,0 +1,19 @@ +from abc import ABCMeta, abstractmethod + + +class BasePing(object, metaclass=ABCMeta): + @abstractmethod + def ping(self): + """ + Sends a basic request to the datasource to confirm we are connected and authenticated + + Args: + search_id (str): The datasource query ID. + + Returns: + dict: The return value. + keys: + success (bool): True or False + error (str): error message (when success=False) + """ + pass diff --git a/stix_transmission/src/modules/base/base_query_connector.py b/stix_transmission/src/modules/base/base_query_connector.py new file mode 100644 index 000000000..626702b84 --- /dev/null +++ b/stix_transmission/src/modules/base/base_query_connector.py @@ -0,0 +1,20 @@ +from abc import ABCMeta, abstractmethod + + +class BaseQueryConnector(object, metaclass=ABCMeta): + @abstractmethod + def create_query_connection(self, query): + """ + Creates a connection to the specified datasource to send a query + + Args: + query (str): The datasource query. + + Returns: + dict: The return value. + keys: + success (bool): True or False + search_id (str): query ID + error (str): error message (when success=False) + """ + pass diff --git a/stix_transmission/src/modules/base/base_results_connector.py b/stix_transmission/src/modules/base/base_results_connector.py new file mode 100644 index 000000000..6a4892118 --- /dev/null +++ b/stix_transmission/src/modules/base/base_results_connector.py @@ -0,0 +1,22 @@ +from abc import ABCMeta, abstractmethod + + +class BaseResultsConnector(object, metaclass=ABCMeta): + @abstractmethod + def create_results_connection(self, search_id, offset, length): + """ + Creates a connection to the specified datasource to retrieve query results + + Args: + search_id (str): The datasource query ID. + offset: data offset to start fetch from. + length: data length to fetch + + Returns: + dict: The return value. 
+ keys: + success (bool): True or False + data (str): The query result data + error (str): error message (when success=False) + """ + pass diff --git a/stix_transmission/src/modules/base/base_status_connector.py b/stix_transmission/src/modules/base/base_status_connector.py new file mode 100644 index 000000000..aab65f61e --- /dev/null +++ b/stix_transmission/src/modules/base/base_status_connector.py @@ -0,0 +1,30 @@ +from abc import ABCMeta, abstractmethod +from enum import Enum + + +class Status(Enum): + COMPLETED = 'COMPLETED' + ERROR = 'ERROR' + CANCELED = 'CANCELED' + TIMEOUT = 'TIMEOUT' + RUNNING = 'RUNNING' + + +class BaseStatusConnector(object, metaclass=ABCMeta): + @abstractmethod + def create_status_connection(self, search_id): + """ + Creates a connection to the specified datasource to determine the status of a given query + + Args: + search_id (str): The datasource query ID. + + Returns: + dict: The return value. + keys: + success (bool): True or False + status (enum 'Status'): + progress (int): percentage of progress (0-100) + error (str): error message (when success=False) + """ + pass diff --git a/stix_transmission/src/modules/bigfix/RestApiClient.py b/stix_transmission/src/modules/bigfix/RestApiClient.py new file mode 100644 index 000000000..c31f76313 --- /dev/null +++ b/stix_transmission/src/modules/bigfix/RestApiClient.py @@ -0,0 +1,138 @@ +from urllib.error import HTTPError +from urllib.error import URLError +from urllib.parse import quote +from urllib.request import Request +from urllib.request import urlopen +from urllib.request import install_opener +from urllib.request import build_opener +from urllib.request import HTTPSHandler + +import ssl +import sys +import base64 + + +class RestApiClient: + + def __init__(self, server_ip, user_name, password, cert): + self.headers = {} + + if user_name and password: + self.headers['Authorization'] = b"Basic " + base64.b64encode( + (user_name + ':' + password).encode('ascii')) + else: + raise Exception('No valid credentials found in configuration.') + + self.server_ip = server_ip + self.base_uri = '/api/' + + # Create a secure SSLContext + # PROTOCOL_SSLv23 is misleading. PROTOCOL_SSLv23 will use the highest + # version of SSL or TLS that both the client and server supports. + context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + + # SSL version 2 and SSL version 3 are insecure. The insecure versions + # are disabled. + context.verify_mode = ssl.CERT_REQUIRED + if sys.version_info >= (3, 4): + context.check_hostname = True + + check_hostname = True + if cert is not None: + # Load the certificate if the user has specified a certificate + # file in config.ini. + + # The default QRadar certificate does not have a valid hostname, + # so me must disable hostname checking. + if sys.version_info >= (3, 4): + context.check_hostname = False + check_hostname = False + + # Instead of loading the default certificates load only the + # certificates specified by the user. + context.load_verify_locations(cadata=cert) + else: + if sys.version_info >= (3, 4): + # Python 3.4 and above has the improved load_default_certs() + # function. + context.load_default_certs(ssl.Purpose.CLIENT_AUTH) + else: + # Versions of Python before 3.4 do not have the + # load_default_certs method. set_default_verify_paths will + # work on some, but not all systems. It fails silently. If + # this call fails the certificate will fail to validate. 
+ context.set_default_verify_paths() + + install_opener(build_opener( + HTTPSHandler(context=context, check_hostname=check_hostname))) + + # This method is used to set up an HTTP request and send it to the server + def call_api(self, endpoint, method, headers=None, params=[], data=None): + + path = self.parse_path(endpoint, params) + + # If the caller specified customer headers merge them with the default + # headers. + actual_headers = self.headers.copy() + if headers is not None: + for header_key in headers: + actual_headers[header_key] = headers[header_key] + + # Send the request and receive the response + request = Request( + 'https://' + self.server_ip + self.base_uri + path, + headers=actual_headers) + request.get_method = lambda: method + + try: + response = urlopen(request, data) + + response_info = response.info() + if 'Deprecated' in response_info: + + # This version of the API is Deprecated. Print a warning to + # stderr. + print("WARNING: " + response_info['Deprecated'], + file=sys.stderr) + + # returns response object for opening url. + return response + except HTTPError as e: + # an object which contains information similar to a request object + return e + except URLError as e: + if (isinstance(e.reason, ssl.SSLError) and + e.reason.reason == "CERTIFICATE_VERIFY_FAILED"): + print("Certificate verification failed.") + sys.exit(3) + else: + raise e + + # This method constructs the query string + def parse_path(self, endpoint, params): + + path = endpoint + '?' + + if isinstance(params, list): + + for kv in params: + if kv[1]: + path += kv[0]+'='+quote(kv[1])+'&' + + else: + for k, v in params.items(): + if params[k]: + path += k+'='+quote(v)+'&' + + # removes last '&' or hanging '?' if no params. + return path[:len(path)-1] + + # Simple getters that can be used to inspect the state of this client. + def get_headers(self): + return self.headers.copy() + + def get_server_ip(self): + return self.server_ip + + def get_base_uri(self): + return self.base_uri \ No newline at end of file diff --git a/stix_transmission/src/modules/bigfix/Utilities.py b/stix_transmission/src/modules/bigfix/Utilities.py new file mode 100644 index 000000000..93391756c --- /dev/null +++ b/stix_transmission/src/modules/bigfix/Utilities.py @@ -0,0 +1,41 @@ +import sys +import json + + +# This function prints out the response from an endpoint in a consistent way. +def pretty_print_response(response): + print(response.code) + parsed_response = json.loads(response.read().decode('utf-8')) + print(json.dumps(parsed_response, indent=4)) + return + + +# this function prints out information about a request that will be made +# to the API. +def pretty_print_request(client, path, method, headers=None): + ip = client.get_server_ip() + base_uri = client.get_base_uri() + + header_copy = client.get_headers().copy() + if headers is not None: + header_copy.update(headers) + + url = 'https://' + ip + base_uri + path + print('Sending a ' + method + ' request to:') + print(url) + print('with these headers:') + print(header_copy) + print() + + +# this function sets up data to be used by a sample. If the data already exists +# it prefers to use the existing data. 
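+# The raw HTTP response is returned so callers can decide how to handle the outcome.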
+def data_setup(client, path, method, params=[]): + response = client.call_api(path, method, params=params) + if (response.code == 409): + print("Data already exists, using existing data") + elif(response.code >= 400): + print("An error occurred setting up sample data:") + pretty_print_response(response) + sys.exit(1) + return response \ No newline at end of file diff --git a/tests/splunk_stix_to_spl/__init__.py b/stix_transmission/src/modules/bigfix/__init__.py similarity index 100% rename from tests/splunk_stix_to_spl/__init__.py rename to stix_transmission/src/modules/bigfix/__init__.py diff --git a/stix_transmission/src/modules/bigfix/bigfix_api_client.py b/stix_transmission/src/modules/bigfix/bigfix_api_client.py new file mode 100644 index 000000000..709607013 --- /dev/null +++ b/stix_transmission/src/modules/bigfix/bigfix_api_client.py @@ -0,0 +1,45 @@ +from .RestApiClient import RestApiClient + + +class APIClient(RestApiClient): + + PING_ENDPOINT = 'help/clientquery' + QUERY_ENDPOINT = 'clientquery' + RESULT_ENDPOINT = 'clientqueryresults/' + SYNC_QUERY_ENDPOINT = 'query' + + def __init__(self, server_ip, user_name, password, cert): + + super(APIClient, self).__init__(server_ip, user_name, password, cert) + + def ping_box(self): + endpoint = self.PING_ENDPOINT + return self.call_api(endpoint, 'GET', self.headers) + + def create_search(self, query_expression): + headers = self.headers.copy() + self.headers['Content-type'] = 'application/xml' + endpoint = self.QUERY_ENDPOINT + data = query_expression + data = data.encode('utf-8') + return self.call_api(endpoint, 'POST', self.headers, data=data) + + def get_search_results(self, search_id, offset, length): + headers = self.headers.copy() + headers['Accept'] = 'application/json' + endpoint = self.RESULT_ENDPOINT + search_id + + params = dict() + params['output'] = 'json' + params['stats'] = '1' + params['start'] = offset + params['count'] = length + + return self.call_api(endpoint, 'GET', headers, params=params) + + def get_sync_query_results(self, relevance): + headers = self.headers.copy() + endpoint = self.SYNC_QUERY_ENDPOINT + params = dict() + params['relevance'] = relevance + return self.call_api(endpoint, 'GET', headers, params=params) diff --git a/stix_transmission/src/modules/bigfix/bigfix_connector.py b/stix_transmission/src/modules/bigfix/bigfix_connector.py new file mode 100644 index 000000000..42132e86b --- /dev/null +++ b/stix_transmission/src/modules/bigfix/bigfix_connector.py @@ -0,0 +1,28 @@ +from ..base.base_connector import BaseConnector +from .bigfix_ping import BigFixPing +from .bigfix_query_connector import BigFixQueryConnector +from .bigfix_status_connector import BigFixStatusConnector +from .bigfix_delete_connector import BigFixDeleteConnector +from .bigfix_results_connector import BigFixResultsConnector +from .bigfix_api_client import APIClient + + +class Connector(BaseConnector): + # TODO: config params passed into constructor instance + def __init__(self, connection, configuration): + auth = configuration.get("auth") + user_name = auth.get('user_name') + password = auth.get('password') + host = connection.get('host') + port = connection.get('port') + cert = connection.get('cert', None) + host = str(host) + port = str(port) + self.api_client = APIClient(host + ':' + port, user_name, password, cert) + self.results_connector = BigFixResultsConnector(self.api_client) + self.status_connector = BigFixStatusConnector(self.api_client) + self.delete_connector = BigFixDeleteConnector(self.api_client) + 
self.query_connector = BigFixQueryConnector(self.api_client) + self.ping_connector = BigFixPing(self.api_client) + self.is_async = True + diff --git a/stix_transmission/src/modules/bigfix/bigfix_delete_connector.py b/stix_transmission/src/modules/bigfix/bigfix_delete_connector.py new file mode 100644 index 000000000..caae78c60 --- /dev/null +++ b/stix_transmission/src/modules/bigfix/bigfix_delete_connector.py @@ -0,0 +1,12 @@ +from ..base.base_delete_connector import BaseDeleteConnector + + +class BigFixDeleteConnector(BaseDeleteConnector): + + def __init__(self, api_client): + self.api_client = api_client + + def delete_query_connection(self, search_id): + return_obj = dict() + return_obj['success'] = True + return return_obj diff --git a/stix_transmission/src/modules/bigfix/bigfix_ping.py b/stix_transmission/src/modules/bigfix/bigfix_ping.py new file mode 100644 index 000000000..8693438d8 --- /dev/null +++ b/stix_transmission/src/modules/bigfix/bigfix_ping.py @@ -0,0 +1,28 @@ +from ..base.base_ping import BasePing + + +class BigFixPing(BasePing): + + ENDPOINT = '/api/clientquery' + + def __init__(self, api_client): + self.api_client = api_client + + def ping(self): + try: + response = self.api_client.ping_box() + response_code = response.code + result = response.read().decode('utf-8') + return_obj = dict() + + if self.ENDPOINT in result and 199 < response_code < 300: + return_obj['success'] = True + else: + return_obj['success'] = False + return_obj['error'] = 'error when pinging data source' + return return_obj + except Exception as err: + return_obj = dict() + return_obj['success'] = False + return_obj['error'] = 'error when pinging data source: {}'.format(err) + return return_obj diff --git a/stix_transmission/src/modules/bigfix/bigfix_query_connector.py b/stix_transmission/src/modules/bigfix/bigfix_query_connector.py new file mode 100644 index 000000000..edad45d7b --- /dev/null +++ b/stix_transmission/src/modules/bigfix/bigfix_query_connector.py @@ -0,0 +1,41 @@ +from ..base.base_query_connector import BaseQueryConnector +import re + + +class BigFixQueryConnector(BaseQueryConnector): + + PATTERN = '(.*)' + DEFAULT_ID = 'UNKNOWN' + + def __init__(self, api_client): + self.api_client = api_client + + def create_query_connection(self, query): + try: + response = self.api_client.create_search(query) + response_code = response.code + result = response.read().decode('utf-8') + search_id = self.DEFAULT_ID + search = re.search(self.PATTERN, result, re.IGNORECASE) + + if search: + search_id = search.group(1) + + return_obj = dict() + + if 199 < response_code < 300: + return_obj['success'] = True + return_obj['search_id'] = search_id + else: + return_obj['success'] = False + return_obj['error'] = 'error when creating search' + + if search_id == self.DEFAULT_ID: + return_obj['success'] = False + return_obj['error'] = 'error when creating search' + return return_obj + except Exception as err: + return_obj = dict() + return_obj['success'] = False + return_obj['error'] = 'error when creating search: {}'.format(err) + return return_obj diff --git a/stix_transmission/src/modules/bigfix/bigfix_results_connector.py b/stix_transmission/src/modules/bigfix/bigfix_results_connector.py new file mode 100644 index 000000000..bd1973ec9 --- /dev/null +++ b/stix_transmission/src/modules/bigfix/bigfix_results_connector.py @@ -0,0 +1,27 @@ +from ..base.base_results_connector import BaseResultsConnector +import json + + +class BigFixResultsConnector(BaseResultsConnector): + def __init__(self, api_client): + 
self.api_client = api_client + + def create_results_connection(self, search_id, offset, length): + try: + response = self.api_client.get_search_results(search_id, offset, length) + response_code = response.code + return_obj = dict() + if 199 < response_code < 300: + response_results = response.read() + response_json = json.loads(response_results) + return_obj['success'] = True + return_obj['data'] = response_json['results'] + else: + return_obj['success'] = False + return_obj['error'] = 'error when getting results' + return return_obj + except Exception as err: + return_obj = dict() + return_obj['success'] = False + return_obj['error'] = 'error when getting results: {}'.format(err) + return return_obj diff --git a/stix_transmission/src/modules/bigfix/bigfix_status_connector.py b/stix_transmission/src/modules/bigfix/bigfix_status_connector.py new file mode 100644 index 000000000..0b5f5a429 --- /dev/null +++ b/stix_transmission/src/modules/bigfix/bigfix_status_connector.py @@ -0,0 +1,63 @@ +from ..base.base_status_connector import BaseStatusConnector +from ..base.base_status_connector import Status +import math +import json +import re +import sys + + +class BigFixStatusConnector(BaseStatusConnector): + + RELEVANCE = 'number of bes computers whose (last report time of it > (now - 60 * minute))' + PATTERN = '(.*)' + DEFAULT_CLIENT_COUNT = sys.maxsize + + def __init__(self, api_client): + self.api_client = api_client + + def create_status_connection(self, search_id): + + try: + response = self.api_client.get_sync_query_results(self.RELEVANCE) + result = response.read().decode('utf-8') + client_count = self.DEFAULT_CLIENT_COUNT + search = re.search(self.PATTERN, result, re.IGNORECASE) + + if search: + client_count = search.group(1) + + client_count = int(client_count) + + response = self.api_client.get_search_results(search_id, '0', '1') + response_code = response.code + response_json = json.loads(response.read()) + return_obj = dict() + + if 199 < response_code < 300: + return_obj['success'] = True + return_obj['status'] = Status.RUNNING.value + + reporting_agents = int(response_json.get('reportingAgents', '0')) + total_results = int(response_json.get('totalResults', '0')) + + if self.DEFAULT_CLIENT_COUNT == client_count: + return_obj['progress'] = 0 + else: + progress = (reporting_agents / client_count) * 100 + progress_floor = math.floor(progress) + return_obj['progress'] = progress_floor + + if client_count <= reporting_agents: + return_obj['status'] = Status.COMPLETED.value + if total_results <= 0: + return_obj['status'] = Status.ERROR.value + else: + return_obj['success'] = False + return_obj['error'] = 'error when getting search status' + + return return_obj + except Exception as err: + return_obj = dict() + return_obj['success'] = False + return_obj['error'] = 'error when getting search status: {}'.format(err) + return return_obj diff --git a/stix_transmission/src/modules/qradar/RestApiClient.py b/stix_transmission/src/modules/qradar/RestApiClient.py new file mode 100644 index 000000000..a3a68c198 --- /dev/null +++ b/stix_transmission/src/modules/qradar/RestApiClient.py @@ -0,0 +1,155 @@ +from urllib.error import HTTPError +from urllib.error import URLError +from urllib.parse import quote +from urllib.request import Request +from urllib.request import urlopen +from urllib.request import install_opener +from urllib.request import build_opener +from urllib.request import HTTPSHandler +from .Utilities import * + +import ssl +import sys +import base64 + + +# This is a simple HTTP client that 
can be used to access the REST API +class RestApiClient: + + # Constructor for the RestApiClient Class + def __init__(self, server_ip, auth_token, cert=None, proxy=None, version=None): + self.headers = {'Accept': 'application/json'} + if version is not None: + self.headers['Version'] = version + if auth_token: + self.headers['SEC'] = auth_token + else: + raise Exception('No valid credentials found in configuration.') + if proxy is not None: + proxy_url = proxy.get('url') + proxy_auth = proxy.get('auth') + + if (proxy_url is not None and + proxy_auth is not None): + self.headers['proxy'] = proxy_url + self.headers['Proxy-Authorization'] = 'Basic ' + proxy_auth + + self.server_ip = server_ip + self.base_uri = '/api/' + + # Create a secure SSLContext + # PROTOCOL_SSLv23 is misleading. PROTOCOL_SSLv23 will use the highest + # version of SSL or TLS that both the client and server supports. + context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + + # SSL version 2 and SSL version 3 are insecure. The insecure versions + # are disabled. + context.verify_mode = ssl.CERT_REQUIRED + if sys.version_info >= (3, 4): + context.check_hostname = True + + check_hostname = True + if cert is not None: + # Load the certificate if the user has specified a certificate + # file in config.ini. + + # The default QRadar certificate does not have a valid hostname, + # so me must disable hostname checking. + if sys.version_info >= (3, 4): + context.check_hostname = False + check_hostname = False + + # Instead of loading the default certificates load only the + # certificates specified by the user. + context.load_verify_locations(cadata=cert) + else: + if sys.version_info >= (3, 4): + # Python 3.4 and above has the improved load_default_certs() + # function. + context.load_default_certs(ssl.Purpose.CLIENT_AUTH) + else: + # Versions of Python before 3.4 do not have the + # load_default_certs method. set_default_verify_paths will + # work on some, but not all systems. It fails silently. If + # this call fails the certificate will fail to validate. + context.set_default_verify_paths() + + install_opener(build_opener( + HTTPSHandler(context=context, check_hostname=check_hostname))) + + # This method is used to set up an HTTP request and send it to the server + def call_api(self, endpoint, method, headers=None, params=[], data=None, + print_request=False): + + path = self.parse_path(endpoint, params) + + # If the caller specified customer headers merge them with the default + # headers. + actual_headers = self.headers.copy() + if headers is not None: + for header_key in headers: + actual_headers[header_key] = headers[header_key] + + # Send the request and receive the response + request = Request( + 'https://' + self.server_ip + self.base_uri + path, + headers=actual_headers) + request.get_method = lambda: method + + # Print the request if print_request is True. + if print_request: + SampleUtilities.pretty_print_request(self, path, method, + headers=actual_headers) + + try: + response = urlopen(request, data) + + response_info = response.info() + if 'Deprecated' in response_info: + + # This version of the API is Deprecated. Print a warning to + # stderr. + print("WARNING: " + response_info['Deprecated'], + file=sys.stderr) + + # returns response object for opening url. 
+ return response + except HTTPError as e: + # an object which contains information similar to a request object + return e + except URLError as e: + if (isinstance(e.reason, ssl.SSLError) and + e.reason.reason == "CERTIFICATE_VERIFY_FAILED"): + print("Certificate verification failed.") + sys.exit(3) + else: + raise e + + # This method constructs the query string + def parse_path(self, endpoint, params): + + path = endpoint + '?' + + if isinstance(params, list): + + for kv in params: + if kv[1]: + path += kv[0]+'='+quote(kv[1])+'&' + + else: + for k, v in params.items(): + if params[k]: + path += k+'='+quote(v)+'&' + + # removes last '&' or hanging '?' if no params. + return path[:len(path)-1] + + # Simple getters that can be used to inspect the state of this client. + def get_headers(self): + return self.headers.copy() + + def get_server_ip(self): + return self.server_ip + + def get_base_uri(self): + return self.base_uri diff --git a/stix_transmission/src/modules/qradar/Utilities.py b/stix_transmission/src/modules/qradar/Utilities.py new file mode 100644 index 000000000..93391756c --- /dev/null +++ b/stix_transmission/src/modules/qradar/Utilities.py @@ -0,0 +1,41 @@ +import sys +import json + + +# This function prints out the response from an endpoint in a consistent way. +def pretty_print_response(response): + print(response.code) + parsed_response = json.loads(response.read().decode('utf-8')) + print(json.dumps(parsed_response, indent=4)) + return + + +# this function prints out information about a request that will be made +# to the API. +def pretty_print_request(client, path, method, headers=None): + ip = client.get_server_ip() + base_uri = client.get_base_uri() + + header_copy = client.get_headers().copy() + if headers is not None: + header_copy.update(headers) + + url = 'https://' + ip + base_uri + path + print('Sending a ' + method + ' request to:') + print(url) + print('with these headers:') + print(header_copy) + print() + + +# this function sets up data to be used by a sample. If the data already exists +# it prefers to use the existing data. +def data_setup(client, path, method, params=[]): + response = client.call_api(path, method, params=params) + if (response.code == 409): + print("Data already exists, using existing data") + elif(response.code >= 400): + print("An error occurred setting up sample data:") + pretty_print_response(response) + sys.exit(1) + return response \ No newline at end of file diff --git a/stix_transmission/src/modules/qradar/__init__.py b/stix_transmission/src/modules/qradar/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/stix_transmission/src/modules/qradar/arielapiclient.py b/stix_transmission/src/modules/qradar/arielapiclient.py new file mode 100644 index 000000000..6a773c458 --- /dev/null +++ b/stix_transmission/src/modules/qradar/arielapiclient.py @@ -0,0 +1,110 @@ +from .RestApiClient import RestApiClient +import urllib.parse + + +# Inherits methods from RestApiClient +class APIClient(RestApiClient): + + # API METHODS + + # These methods are used to call Ariel's API methods through http requests. + # Each method makes use of the http methods below to perform the requests. + + # This class will encode any data or query parameters which will then be + # sent to the call_api() method of its inherited class. + def __init__(self, server_ip, auth_token, proxy, cert): + + # This version of the ariel APIClient is designed to function with + # version 6.0 of the ariel API. 
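+        # Note: the constructor call below sends Version 8.0 in the request headers.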
+ self.endpoint_start = 'ariel/' + super(APIClient, self).__init__(server_ip, auth_token, cert, proxy, + version='8.0') + + def ping_box(self): + endpoint = 'help/resources' + return self.call_api(endpoint, 'GET', self.headers) + + def get_databases(self): + + endpoint = self.endpoint_start + 'databases' + # Sends a GET request to + # https:///rest/api/ariel/databases + return self.call_api(endpoint, 'GET', self.headers) + + def get_database(self, database_name): + + endpoint = self.endpoint_start + 'databases' + '/' + database_name + # Sends a GET request to + # https:///rest/api/ariel/databases/ + return self.call_api(endpoint, 'GET', self.headers) + + def get_searches(self): + + endpoint = self.endpoint_start + "searches" + # sends a GET request to https:///rest/api/ariel/searches + return self.call_api(endpoint, 'GET', self.headers) + + def create_search(self, query_expression): + + endpoint = self.endpoint_start + "searches" + + # sends a POST request to https:///rest/api/ariel/searches + + data = {'query_expression': query_expression} + + data = urllib.parse.urlencode(data) + data = data.encode('utf-8') + + return self.call_api(endpoint, 'POST', self.headers, data=data) + + def get_search(self, search_id): + + # Sends a GET request to + # https:///rest/api/ariel/searches/ + endpoint = self.endpoint_start + "searches/" + search_id + + return self.call_api(endpoint, 'GET', self.headers) + + def get_search_results(self, search_id, + response_type, range_start=None, range_end=None): + + headers = self.headers.copy() + headers['Accept'] = response_type + + if ((range_start is not None) and (range_end is not None)): + headers['Range'] = ('items=' + + str(range_start) + '-' + str(range_end)) + + # sends a GET request to + # https:///rest/api/ariel/searches/ + endpoint = self.endpoint_start + "searches/" + search_id + '/results' + + # response object body should contain information pertaining to search. + return self.call_api(endpoint, 'GET', headers) + + def update_search(self, search_id, save_results=None, status=None): + + # sends a POST request to + # https:///rest/api/ariel/searches/ + # posts search result to site + endpoint = self.endpoint_start + "searches/" + search_id + + data = {} + if save_results: + data['save_results'] = save_results + if status: + data['status'] = status + + data = urllib.parse.urlencode(data) + data = data.encode('utf-8') + + return self.call_api(endpoint, 'POST', self.headers, data=data) + + def delete_search(self, search_id): + + # sends a DELETE request to + # https:///rest/api/ariel/searches/ + # deletes search created earlier. 
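+        # Typical Ariel search lifecycle: create_search -> get_search (poll) -> get_search_results -> delete_search.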
+ endpoint = self.endpoint_start + "searches" + '/' + search_id + + return self.call_api(endpoint, 'DELETE', self.headers) diff --git a/stix_transmission/src/modules/qradar/qradar_connector.py b/stix_transmission/src/modules/qradar/qradar_connector.py new file mode 100644 index 000000000..8c818983a --- /dev/null +++ b/stix_transmission/src/modules/qradar/qradar_connector.py @@ -0,0 +1,24 @@ +from ..base.base_connector import BaseConnector +from .qradar_ping import QRadarPing +from .qradar_query_connector import QRadarQueryConnector +from .qradar_status_connector import QRadarStatusConnector +from .qradar_delete_connector import QRadarDeleteConnector +from .qradar_results_connector import QRadarResultsConnector +from .arielapiclient import APIClient + + +class Connector(BaseConnector): + # TODO: config params passed into constructor instance + def __init__(self, connection, configuration): + auth = configuration.get('auth') + host = connection.get('host') + port = connection.get('port') + cert = connection.get('cert', None) + proxy = connection.get('proxy') + self.api_client = APIClient(host + ':' + str(port), auth['SEC'], proxy, cert) + self.results_connector = QRadarResultsConnector(self.api_client) + self.status_connector = QRadarStatusConnector(self.api_client) + self.delete_connector = QRadarDeleteConnector(self.api_client) + self.query_connector = QRadarQueryConnector(self.api_client) + self.ping_connector = QRadarPing(self.api_client) + self.is_async = True diff --git a/stix_transmission/src/modules/qradar/qradar_delete_connector.py b/stix_transmission/src/modules/qradar/qradar_delete_connector.py new file mode 100644 index 000000000..a61d0eb4b --- /dev/null +++ b/stix_transmission/src/modules/qradar/qradar_delete_connector.py @@ -0,0 +1,26 @@ +from ..base.base_delete_connector import BaseDeleteConnector +import json + + +class QRadarDeleteConnector(BaseDeleteConnector): + def __init__(self, api_client): + self.api_client = api_client + + def delete_query_connection(self, search_id): + try: + response = self.api_client.delete_search(search_id) + response_code = response.code + response_json = json.loads(response.read()) + + # Construct a response object + return_obj = dict() + if response_code == 200: + return_obj['success'] = True + else: + return_obj['success'] = False + return_obj['error'] = response_json['message'] + + return return_obj + except Exception as err: + print('error when deleting search {}:'.format(err)) + raise diff --git a/stix_transmission/src/modules/qradar/qradar_ping.py b/stix_transmission/src/modules/qradar/qradar_ping.py new file mode 100644 index 000000000..47588e39e --- /dev/null +++ b/stix_transmission/src/modules/qradar/qradar_ping.py @@ -0,0 +1,27 @@ +from ..base.base_ping import BasePing +import json + + +class QRadarPing(BasePing): + def __init__(self, api_client): + self.api_client = api_client + + def ping(self): + try: + response = self.api_client.ping_box() + response_code = response.code + + response_json = json.loads(response.read()) + + return_obj = dict() + return_obj['success'] = False + + if len(response_json) > 0 and response_code == 200: + return_obj['success'] = True + else: + return_obj['error'] = response_json['message'] + + return return_obj + except Exception as err: + print('error when pinging datasource {}:'.format(err)) + raise diff --git a/stix_transmission/src/modules/qradar/qradar_query_connector.py b/stix_transmission/src/modules/qradar/qradar_query_connector.py new file mode 100644 index 000000000..5f83da236 --- /dev/null +++ 
b/stix_transmission/src/modules/qradar/qradar_query_connector.py @@ -0,0 +1,28 @@ +from ..base.base_query_connector import BaseQueryConnector +import json + + +class QRadarQueryConnector(BaseQueryConnector): + def __init__(self, api_client): + self.api_client = api_client + + def create_query_connection(self, query): + # Grab the response, extract the response code, and convert it to readable json + try: + response = self.api_client.create_search(query) + response_code = response.code + response_json = json.loads(response.read()) + + # Construct a response object + return_obj = dict() + + if response_code == 201: + return_obj['success'] = True + return_obj['search_id'] = response_json['search_id'] + else: + return_obj['success'] = False + return_obj['error'] = response_json['message'] + return return_obj + except Exception as err: + print('error when creating search: {}'.format(err)) + raise diff --git a/stix_transmission/src/modules/qradar/qradar_results_connector.py b/stix_transmission/src/modules/qradar/qradar_results_connector.py new file mode 100644 index 000000000..46c3f7dbc --- /dev/null +++ b/stix_transmission/src/modules/qradar/qradar_results_connector.py @@ -0,0 +1,29 @@ +from ..base.base_results_connector import BaseResultsConnector +import json + + +class QRadarResultsConnector(BaseResultsConnector): + def __init__(self, api_client): + self.api_client = api_client + + def create_results_connection(self, search_id, offset, length): + min_range = offset + max_range = offset + length + # Grab the response, extract the response code, and convert it to readable json + try: + response = self.api_client.get_search_results(search_id, 'application/json', min_range, max_range) + response_code = response.code + # Parse the body before branching so the error branch can read it too + response_json = json.loads(response.read()) + + # Construct a response object + return_obj = dict() + if response_code == 200: + return_obj['success'] = True + return_obj['data'] = response_json['events'] + else: + return_obj['success'] = False + return_obj['error'] = response_json['message'] + return return_obj + except Exception as err: + print('error when getting search results: {}'.format(err)) + raise diff --git a/stix_transmission/src/modules/qradar/qradar_status_connector.py b/stix_transmission/src/modules/qradar/qradar_status_connector.py new file mode 100644 index 000000000..aab6c6448 --- /dev/null +++ b/stix_transmission/src/modules/qradar/qradar_status_connector.py @@ -0,0 +1,52 @@ +from ..base.base_status_connector import BaseStatusConnector +from ..base.base_status_connector import Status +from enum import Enum +import json + + +class QRadarStatus(Enum): + # WAIT, EXECUTE, SORTING, COMPLETED, CANCELED, ERROR + WAIT = 'WAIT' + EXECUTE = 'EXECUTE' + SORTING = 'SORTING' + COMPLETED = 'COMPLETED' + CANCELED = 'CANCELED' + ERROR = 'ERROR' + + +class QRadarStatusConnector(BaseStatusConnector): + def __init__(self, api_client): + self.api_client = api_client + + def __getStatus(self, qradar_status): + switcher = { + QRadarStatus.WAIT.value: Status.RUNNING, + QRadarStatus.EXECUTE.value: Status.RUNNING, + QRadarStatus.SORTING.value: Status.RUNNING, + QRadarStatus.COMPLETED.value: Status.COMPLETED, + QRadarStatus.CANCELED.value: Status.CANCELED, + QRadarStatus.ERROR.value: Status.ERROR + } + return switcher.get(qradar_status).value + + def create_status_connection(self, search_id): + # Grab the response, extract the response code, and convert it to readable json + try: + response = self.api_client.get_search(search_id) + response_code = response.code + response_json =
json.loads(response.read()) + + # Construct a response object + return_obj = dict() + + if response_code == 200: + return_obj['success'] = True + return_obj['status'] = self.__getStatus(response_json['status']) + return_obj['progress'] = response_json['progress'] + else: + return_obj['success'] = False + return_obj['error'] = response_json['message'] + return return_obj + except Exception as err: + print('error when getting search status: {}'.format(err)) + raise diff --git a/stix_transmission/src/modules/splunk/RestApiClient.py b/stix_transmission/src/modules/splunk/RestApiClient.py new file mode 100644 index 000000000..28122f2f3 --- /dev/null +++ b/stix_transmission/src/modules/splunk/RestApiClient.py @@ -0,0 +1,140 @@ +from urllib.error import HTTPError +from urllib.error import URLError +from urllib.parse import quote +from urllib.request import Request +from urllib.request import urlopen +from urllib.parse import urlencode +from urllib.request import install_opener +from urllib.request import build_opener +from urllib.request import HTTPSHandler +from .Utilities import * + +import ssl +import sys +import base64 +import json + + +# This is a simple HTTP client that can be used to access the REST API +class RestApiClient: + + # Constructor for the RestApiClient Class + def __init__(self, server_ip, auth=None, output_mode='json', version=None): + self.headers = {'Content-Type': 'application/x-www-form-urlencoded', + 'Accept': 'application/json'} + # TODO: check version functionality + if version is not None: + self.headers['Version'] = version + if auth is None: + raise Exception('No valid credentials found in configuration.') + + self.server_ip = server_ip + self.base_uri = server_ip + self.output_mode = output_mode + self.auth = auth + + # Create a secure SSLContext + # PROTOCOL_SSLv23 is misleading. PROTOCOL_SSLv23 will use the highest + # version of SSL or TLS that both the client and server support. + context = ssl.SSLContext(ssl.PROTOCOL_SSLv23) + + # Disable ssl check + context.verify_mode = ssl.CERT_NONE + context.check_hostname = False + check_hostname = False + + install_opener(build_opener( + HTTPSHandler(context=context, check_hostname=check_hostname))) + + self.set_auth_token() # set authorization token in header + + # This method is used to set up an HTTP request and send it to the server + def call_api(self, endpoint, method, headers=None, params=[], data=None, + print_request=False): + + path = self.parse_path(endpoint, params) + + # If the caller specified custom headers, merge them with the default + # headers. + actual_headers = self.headers.copy() + if headers is not None: + for header_key in headers: + actual_headers[header_key] = headers[header_key] + + # Send the request and receive the response + request = Request( + 'https://' + self.server_ip + '/' + path, + headers=actual_headers) + request.get_method = lambda: method + + # Print the request if print_request is True. + if print_request: + pretty_print_request(self, path, method, + headers=actual_headers) + + try: + response = urlopen(request, data) + response_info = response.info() + if 'Deprecated' in response_info: + + # This version of the API is Deprecated. Print a warning to + # stderr. + print("WARNING: " + response_info['Deprecated'], + file=sys.stderr) + + # returns response object for opening url.
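+ # Note: HTTP error statuses are not raised to the caller; the except clause + # below returns the HTTPError object itself, which exposes .code and .read() + # like a normal response, so the connectors can branch on the status code.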
+ return response + except HTTPError as e: + # an object which contains information similar to a request object + return e + except URLError as e: + if (isinstance(e.reason, ssl.SSLError) and + e.reason.reason == "CERTIFICATE_VERIFY_FAILED"): + print("Certificate verification failed.") + sys.exit(3) + else: + raise e + + # This method constructs the query string + def parse_path(self, endpoint, params): + + path = endpoint + '?' + + if isinstance(params, list): + + for kv in params: + if kv[1]: + path += kv[0]+'='+quote(kv[1])+'&' + + else: + for k, v in params.items(): + if params[k]: + path += k+'='+quote(v)+'&' + + # removes last '&' or hanging '?' if no params. + return path[:len(path)-1] + + # Simple getters that can be used to inspect the state of this client. + def get_headers(self): + return self.headers.copy() + + def get_server_ip(self): + return self.server_ip + + def get_base_uri(self): + return self.base_uri + + def set_auth_token(self): + + data = {'username': self.auth['username'], 'password': self.auth['password'], 'output_mode': self.output_mode} + endpoint = self.endpoint_start + 'auth/login' + + try: + data = urlencode(data) + data = data.encode('utf-8') + response_json = json.load(self.call_api(endpoint, 'POST', None, data=data)) + self.headers['Authorization'] = "Splunk " + response_json['sessionKey'] + except Exception as e: + print('Authentication error occurred while getting auth token.') + raise e + diff --git a/stix_transmission/src/modules/splunk/Utilities.py b/stix_transmission/src/modules/splunk/Utilities.py new file mode 100644 index 000000000..6a31c5c92 --- /dev/null +++ b/stix_transmission/src/modules/splunk/Utilities.py @@ -0,0 +1,41 @@ +import sys +import json + + +# This function prints out the response from an endpoint in a consistent way. +def pretty_print_response(response): + print(response.code) + parsed_response = json.loads(response.read().decode('utf-8')) + print(json.dumps(parsed_response, indent=4)) + return + + +# This function prints out information about a request that will be made +# to the API. +def pretty_print_request(client, path, method, headers=None): + ip = client.get_server_ip() + base_uri = client.get_base_uri() + + header_copy = client.get_headers().copy() + if headers is not None: + header_copy.update(headers) + + url = 'https://' + ip + '/' + path + print('Sending a ' + method + ' request to:') + print(url) + print('with these headers:') + print(header_copy) + print() + + +# This function sets up data to be used by a sample. If the data already exists +# it prefers to use the existing data.
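+# A 409 (Conflict) response is treated as "data already exists" and reused; any +# other 4xx/5xx response aborts the sample.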
+def data_setup(client, path, method, params=[]): + response = client.call_api(path, method, params=params) + if (response.code == 409): + print("Data already exists, using existing data") + elif(response.code >= 400): + print("An error occurred setting up sample data:") + pretty_print_response(response) + sys.exit(1) + return response \ No newline at end of file diff --git a/stix_transmission/src/modules/splunk/__init__.py b/stix_transmission/src/modules/splunk/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/stix_transmission/src/modules/splunk/spl_api_client.py b/stix_transmission/src/modules/splunk/spl_api_client.py new file mode 100644 index 000000000..481476237 --- /dev/null +++ b/stix_transmission/src/modules/splunk/spl_api_client.py @@ -0,0 +1,68 @@ +from .RestApiClient import RestApiClient +import urllib.parse +import json + + +# Inherits methods from RestApiClient +class APIClient(RestApiClient): + + # API METHODS + + # These methods are used to call Splunk's API methods through http requests. + # Each method makes use of the http methods below to perform the requests. + + # This class will encode any data or query parameters which will then be + # sent to the call_api() method of its inherited class. + def __init__(self, server_ip, auth): + + # This version of the Splunk APIClient is designed to function with + # Splunk Enterprise version >= 6.5.0 and <= 7.1.2 + # http://docs.splunk.com/Documentation/Splunk/7.1.2/RESTREF/RESTprolog + self.endpoint_start = 'services/' + super(APIClient, self).__init__(server_ip, auth) + + def ping_box(self): + endpoint = self.endpoint_start + 'server/status' + params = {'output_mode': self.output_mode} + return self.call_api(endpoint, 'GET', self.headers, params=params) + + def create_search(self, query_expression): + endpoint = self.endpoint_start + "search/jobs" + # sends a POST request to + # https://:/services/search/jobs + data = {'search': query_expression, 'output_mode': self.output_mode} + data = urllib.parse.urlencode(data) + data = data.encode('utf-8') + return self.call_api(endpoint, 'POST', self.headers, data=data) + + def get_search(self, search_id): + endpoint = self.endpoint_start + 'search/jobs/' + search_id + # sends a GET request to + # https://:/services/search/jobs/ + # returns information about the search job and its properties. + params = {'output_mode': self.output_mode} + return self.call_api(endpoint, 'GET', self.headers, params=params) + + def get_search_results(self, search_id, offset, count): + headers = self.headers.copy() + # sends a GET request to + # https://:/services/search/jobs//results + # returns results associated with the search job. + endpoint = self.endpoint_start + "search/jobs/" + search_id + '/results' + params = {'output_mode': self.output_mode} + if ((offset is not None) and (count is not None)): + params['offset'] = str(offset) + params['count'] = str(count) + + # response object body should contain information pertaining to search. + return self.call_api(endpoint, 'GET', headers, params=params) + + def delete_search(self, search_id): + endpoint = self.endpoint_start + 'search/jobs/' + search_id + data = {'output_mode': self.output_mode} + data = urllib.parse.urlencode(data) + data = data.encode('utf-8') + # sends a DELETE request to + # https://:/services/search/jobs/ + # cancels and deletes search created earlier. 
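+ # The request body carries output_mode so Splunk replies in JSON; the delete + # connector parses the "messages" array out of that body on failure.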
+ return self.call_api(endpoint, 'DELETE', self.headers, data=data) diff --git a/stix_transmission/src/modules/splunk/splunk_auth.py b/stix_transmission/src/modules/splunk/splunk_auth.py new file mode 100644 index 000000000..a5f16af4e --- /dev/null +++ b/stix_transmission/src/modules/splunk/splunk_auth.py @@ -0,0 +1,10 @@ +import json + + +class SplunkAuth(): + def __init__(self, api_client): + self.api_client = api_client + + def get_auth_token(self, auth): + response = self.api_client.get_auth_token(auth) + return response diff --git a/stix_transmission/src/modules/splunk/splunk_connector.py b/stix_transmission/src/modules/splunk/splunk_connector.py new file mode 100644 index 000000000..1693045bb --- /dev/null +++ b/stix_transmission/src/modules/splunk/splunk_connector.py @@ -0,0 +1,27 @@ +from ..base.base_connector import BaseConnector +from .splunk_query_connector import SplunkQueryConnector +from .splunk_status_connector import SplunkStatusConnector +from .splunk_results_connector import SplunkResultsConnector +from .splunk_delete_connector import SplunkDeleteConnector +from .spl_api_client import APIClient +from .splunk_ping import SplunkPing +from .splunk_auth import SplunkAuth + +import json + + +class Connector(BaseConnector): + # TODO: config params passed into constructor instance + def __init__(self, connection, configuration): + auth = configuration.get("auth") + host = connection.get("host") + port = connection.get("port") + url = host + ':' + str(port) + + self.api_client = APIClient(url, auth) + self.delete_connector = SplunkDeleteConnector(self.api_client) + self.results_connector = SplunkResultsConnector(self.api_client) + self.status_connector = SplunkStatusConnector(self.api_client) + self.query_connector = SplunkQueryConnector(self.api_client) + self.ping_connector = SplunkPing(self.api_client) + self.is_async = True diff --git a/stix_transmission/src/modules/splunk/splunk_delete_connector.py b/stix_transmission/src/modules/splunk/splunk_delete_connector.py new file mode 100644 index 000000000..bbb567e10 --- /dev/null +++ b/stix_transmission/src/modules/splunk/splunk_delete_connector.py @@ -0,0 +1,33 @@ +from ..base.base_delete_connector import BaseDeleteConnector +import json + + +class SplunkDeleteConnector(BaseDeleteConnector): + def __init__(self, api_client): + self.api_client = api_client + + def delete_query_connection(self, search_id): + # Grab the response, extract the response code, and convert it to readable json + try: + response = self.api_client.delete_search(search_id) + response_code = response.code + response_json = json.load(response) + + # Construct a response object + return_obj = dict() + if response_code == 200: + return_obj['success'] = True + else: + # extract message + if len(response_json['messages']) > 0: + message = response_json['messages'][0]['text'] + else: + message = "Unknown sid." 
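+ # Fall back to a generic message when Splunk returns no error detail for the sid.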
+ + return_obj['success'] = False + return_obj['error'] = message + + return return_obj + except Exception as err: + print('error deleting search: {}'.format(err)) + raise \ No newline at end of file diff --git a/stix_transmission/src/modules/splunk/splunk_ping.py b/stix_transmission/src/modules/splunk/splunk_ping.py new file mode 100644 index 000000000..c5f4b2366 --- /dev/null +++ b/stix_transmission/src/modules/splunk/splunk_ping.py @@ -0,0 +1,27 @@ +from ..base.base_ping import BasePing +import json + + +class SplunkPing(BasePing): + def __init__(self, api_client): + self.api_client = api_client + + def ping(self): + try: + response = self.api_client.ping_box() + response_code = response.code + + response_json = json.loads(response.read()) + + return_obj = dict() + return_obj['success'] = False + + if len(response_json) > 0 and response_code == 200: + return_obj['success'] = True + else: + return_obj['error'] = response_json['messages'] + + return return_obj + except Exception as err: + print('error when pinging datasource: {}'.format(err)) + raise \ No newline at end of file diff --git a/stix_transmission/src/modules/splunk/splunk_query_connector.py b/stix_transmission/src/modules/splunk/splunk_query_connector.py new file mode 100644 index 000000000..d6ec4c80c --- /dev/null +++ b/stix_transmission/src/modules/splunk/splunk_query_connector.py @@ -0,0 +1,28 @@ +from ..base.base_query_connector import BaseQueryConnector +import json + + +class SplunkQueryConnector(BaseQueryConnector): + def __init__(self, api_client): + self.api_client = api_client + + def create_query_connection(self, query): + # Grab the response, extract the response code, and convert it to readable json + try: + response = self.api_client.create_search(query) + response_code = response.code + response_json = json.loads(response.read()) + + # Construct a response object + return_obj = dict() + + if response_code == 201: + return_obj['success'] = True + return_obj['search_id'] = response_json['sid'] + else: + return_obj['success'] = False + return_obj['error'] = response_json['messages'][0]['text'] + return return_obj + except Exception as err: + print('error when creating search: {}'.format(err)) + raise diff --git a/stix_transmission/src/modules/splunk/splunk_results_connector.py b/stix_transmission/src/modules/splunk/splunk_results_connector.py new file mode 100644 index 000000000..c6a5ab00d --- /dev/null +++ b/stix_transmission/src/modules/splunk/splunk_results_connector.py @@ -0,0 +1,33 @@ +from ..base.base_results_connector import BaseResultsConnector +from .spl_api_client import APIClient +import json + + +class SplunkResultsConnector(BaseResultsConnector): + def __init__(self, api_client): + self.api_client = api_client + + def create_results_connection(self, search_id, offset, length): + # Grab the response, extract the response code, and convert it to readable json + try: + response = self.api_client.get_search_results(search_id, offset, length) + response_code = response.code + response_json = json.load(response) + if "results" in response_json: + results = [{}] if (response_json['results'] == []) else response_json['results'] + else: + results = [{}] + + # Construct a response object + return_obj = dict() + if response_code == 200: + return_obj['success'] = True + return_obj['data'] = results + else: + return_obj['success'] = False + return_obj['error'] = response_json['messages'][0]['text'] + return return_obj + + except Exception as err: + print('error when getting search results: {}'.format(err)) + raise diff --git
a/stix_transmission/src/modules/splunk/splunk_status_connector.py b/stix_transmission/src/modules/splunk/splunk_status_connector.py new file mode 100644 index 000000000..b130be840 --- /dev/null +++ b/stix_transmission/src/modules/splunk/splunk_status_connector.py @@ -0,0 +1,48 @@ +from ..base.base_status_connector import BaseStatusConnector, Status +from .spl_api_client import APIClient +import json +import math + + +class SplunkStatusConnector(BaseStatusConnector): + def __init__(self, api_client): + self.api_client = api_client + + def create_status_connection(self, search_id): + # Grab the response, extract the response code, and convert it to readable json + try: + response = self.api_client.get_search(search_id) + response_code = response.code + response_json = json.load(response) + + status, progress = '', '' + + if 'entry' in response_json and isinstance(response_json['entry'], list): + content = response_json['entry'][0]['content'] + progress = math.ceil(content['doneProgress'] * 100) # convert 0-1.0 scale to 0-100 + + if content['isDone'] is True: + status = Status.COMPLETED.value + elif content['isFailed'] is True: + status = Status.ERROR.value + elif content['isFinalized'] is True: + status = Status.CANCELED.value + elif progress < 100: + status = Status.RUNNING.value + else: + status = 'NA' + + # Construct a response object + return_obj = dict() + if response_code == 200: + return_obj['success'] = True + return_obj['status'] = status + return_obj['progress'] = progress + else: + return_obj['success'] = False + return_obj['error'] = response_json['messages'][0]['text'] + + return return_obj + except Exception as err: + print('error when getting search status: {}'.format(err)) + raise diff --git a/stix_transmission/src/modules/synchronous_dummy/__init__.py b/stix_transmission/src/modules/synchronous_dummy/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_connector.py b/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_connector.py new file mode 100644 index 000000000..d9ebef1e0 --- /dev/null +++ b/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_connector.py @@ -0,0 +1,10 @@ +from ..base.base_connector import BaseConnector +from .synchronous_dummy_results_connector import SynchronousDummyResultsConnector +from .synchronous_dummy_ping import SynchronousDummyPing + + +class Connector(BaseConnector): + def __init__(self): + self.results_connector = SynchronousDummyResultsConnector() + self.is_async = False + self.ping = SynchronousDummyPing() diff --git a/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_ping.py b/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_ping.py new file mode 100644 index 000000000..8fa400b01 --- /dev/null +++ b/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_ping.py @@ -0,0 +1,6 @@ +from ..base.base_ping import BasePing + + +class SynchronousDummyPing(BasePing): + def ping(self): + return "synchronous ping" diff --git a/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_results_connector.py b/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_results_connector.py new file mode 100644 index 000000000..1cf4cd488 --- /dev/null +++ b/stix_transmission/src/modules/synchronous_dummy/synchronous_dummy_results_connector.py @@ -0,0 +1,48 @@ +from ..base.base_results_connector import BaseResultsConnector +import time + +RETURN_DUMMY_DATA = { + "obj_1": {}, + "obj_2": 
{}, + "obj_3": {}, + "obj_4": {}, + "obj_5": {}, +} + + +class SynchronousDummyResultsConnector(BaseResultsConnector): + def create_results_connection(self, params, options): + """ + Creates a connection to the specified datasource to send a query + + :param params: the parameters for the query + :param options: CLI options passed in + + :return: in dummy connectors, just returns passed in parameters + """ + config = params['config'] + query = params['query'] + # set headers + + headers = { + "Content-Type": "application/json", + "Accept": "application/json" + } + + # construct request object, purely for visual purposes in dummy implementation + request = { + "host": config['host'], + "path": config['path'] + query, + "port": config['port'], + "headers": headers, + "method": "GET" + } + + print(request) + time.sleep(3) + return_obj = { + "response_code": 200, + "query_results": RETURN_DUMMY_DATA + } + + return return_obj diff --git a/stix_transmission/stix_transmission.py b/stix_transmission/stix_transmission.py new file mode 100644 index 000000000..649866c3d --- /dev/null +++ b/stix_transmission/stix_transmission.py @@ -0,0 +1,33 @@ +import importlib + + +class StixTransmission: + def __init__(self, module, connection, configuration): + + self.connector_module = importlib.import_module("stix_transmission.src.modules." + module + + "." + module + "_connector") + self.interface = self.connector_module.Connector(connection, configuration) + + def query(self, query): + # Creates and sends a query to the correct datasource + return self.interface.create_query_connection(query) + + def status(self, search_id): + # Creates and sends a status query to the correct datasource asking for the status of the specific query + return self.interface.create_status_connection(search_id) + + def results(self, search_id, offset, length): + # Creates and sends a query to the correct datasource asking for results of the specific query + return self.interface.create_results_connection(search_id, offset, length) + + def delete(self, search_id): + # Sends a request to the correct datasource, asking to terminate a specific query + return self.interface.delete_query_connection(search_id) + + def ping(self): + # Creates and sends a ping request to confirm we are connected and authenticated + return self.interface.ping() + + def is_async(self): + # Check if the module is async/sync + return self.interface.is_async diff --git a/tests/stix_shifter/__init__.py b/tests/stix_shifter/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/tests/stix_shifter/patterns/__init__.py b/tests/stix_shifter/patterns/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/tests/patterns/generate_test_case.py b/tests/stix_shifter/patterns/generate_test_case.py similarity index 100% rename from tests/patterns/generate_test_case.py rename to tests/stix_shifter/patterns/generate_test_case.py diff --git a/tests/patterns/generated_tests.py b/tests/stix_shifter/patterns/generated_tests.py similarity index 100% rename from tests/patterns/generated_tests.py rename to tests/stix_shifter/patterns/generated_tests.py diff --git a/tests/patterns/helpers/connectors.py b/tests/stix_shifter/patterns/helpers/connectors.py similarity index 100% rename from tests/patterns/helpers/connectors.py rename to tests/stix_shifter/patterns/helpers/connectors.py diff --git a/tests/patterns/helpers/input_file_helpers.py b/tests/stix_shifter/patterns/helpers/input_file_helpers.py similarity index 100% rename from 
tests/patterns/helpers/input_file_helpers.py rename to tests/stix_shifter/patterns/helpers/input_file_helpers.py diff --git a/tests/patterns/input_files/and_not_in_set.json b/tests/stix_shifter/patterns/input_files/and_not_in_set.json similarity index 100% rename from tests/patterns/input_files/and_not_in_set.json rename to tests/stix_shifter/patterns/input_files/and_not_in_set.json diff --git a/tests/patterns/input_files/and_not_like.json b/tests/stix_shifter/patterns/input_files/and_not_like.json similarity index 100% rename from tests/patterns/input_files/and_not_like.json rename to tests/stix_shifter/patterns/input_files/and_not_like.json diff --git a/tests/patterns/input_files/anded_obs_expression.json b/tests/stix_shifter/patterns/input_files/anded_obs_expression.json similarity index 100% rename from tests/patterns/input_files/anded_obs_expression.json rename to tests/stix_shifter/patterns/input_files/anded_obs_expression.json diff --git a/tests/patterns/input_files/anded_one_regex.json b/tests/stix_shifter/patterns/input_files/anded_one_regex.json similarity index 100% rename from tests/patterns/input_files/anded_one_regex.json rename to tests/stix_shifter/patterns/input_files/anded_one_regex.json diff --git a/tests/patterns/input_files/anded_two_regex.json b/tests/stix_shifter/patterns/input_files/anded_two_regex.json similarity index 100% rename from tests/patterns/input_files/anded_two_regex.json rename to tests/stix_shifter/patterns/input_files/anded_two_regex.json diff --git a/tests/patterns/input_files/car_2013_03_001.json b/tests/stix_shifter/patterns/input_files/car_2013_03_001.json similarity index 100% rename from tests/patterns/input_files/car_2013_03_001.json rename to tests/stix_shifter/patterns/input_files/car_2013_03_001.json diff --git a/tests/patterns/input_files/car_2013_05_002.json b/tests/stix_shifter/patterns/input_files/car_2013_05_002.json similarity index 100% rename from tests/patterns/input_files/car_2013_05_002.json rename to tests/stix_shifter/patterns/input_files/car_2013_05_002.json diff --git a/tests/patterns/input_files/car_2014_11_004.json b/tests/stix_shifter/patterns/input_files/car_2014_11_004.json similarity index 100% rename from tests/patterns/input_files/car_2014_11_004.json rename to tests/stix_shifter/patterns/input_files/car_2014_11_004.json diff --git a/tests/patterns/input_files/followedby_obs_expression.json b/tests/stix_shifter/patterns/input_files/followedby_obs_expression.json similarity index 100% rename from tests/patterns/input_files/followedby_obs_expression.json rename to tests/stix_shifter/patterns/input_files/followedby_obs_expression.json diff --git a/tests/patterns/input_files/gt.json b/tests/stix_shifter/patterns/input_files/gt.json similarity index 100% rename from tests/patterns/input_files/gt.json rename to tests/stix_shifter/patterns/input_files/gt.json diff --git a/tests/patterns/input_files/gt_and.json b/tests/stix_shifter/patterns/input_files/gt_and.json similarity index 100% rename from tests/patterns/input_files/gt_and.json rename to tests/stix_shifter/patterns/input_files/gt_and.json diff --git a/tests/patterns/input_files/gt_and_gte.json b/tests/stix_shifter/patterns/input_files/gt_and_gte.json similarity index 100% rename from tests/patterns/input_files/gt_and_gte.json rename to tests/stix_shifter/patterns/input_files/gt_and_gte.json diff --git a/tests/patterns/input_files/gt_and_is_equal.json b/tests/stix_shifter/patterns/input_files/gt_and_is_equal.json similarity index 100% rename from 
tests/patterns/input_files/gt_and_is_equal.json rename to tests/stix_shifter/patterns/input_files/gt_and_is_equal.json diff --git a/tests/patterns/input_files/gte.json b/tests/stix_shifter/patterns/input_files/gte.json similarity index 100% rename from tests/patterns/input_files/gte.json rename to tests/stix_shifter/patterns/input_files/gte.json diff --git a/tests/patterns/input_files/in_set.json b/tests/stix_shifter/patterns/input_files/in_set.json similarity index 100% rename from tests/patterns/input_files/in_set.json rename to tests/stix_shifter/patterns/input_files/in_set.json diff --git a/tests/patterns/input_files/like.json b/tests/stix_shifter/patterns/input_files/like.json similarity index 100% rename from tests/patterns/input_files/like.json rename to tests/stix_shifter/patterns/input_files/like.json diff --git a/tests/patterns/input_files/like_single_char.json b/tests/stix_shifter/patterns/input_files/like_single_char.json similarity index 100% rename from tests/patterns/input_files/like_single_char.json rename to tests/stix_shifter/patterns/input_files/like_single_char.json diff --git a/tests/patterns/input_files/lt.json b/tests/stix_shifter/patterns/input_files/lt.json similarity index 100% rename from tests/patterns/input_files/lt.json rename to tests/stix_shifter/patterns/input_files/lt.json diff --git a/tests/patterns/input_files/lte.json b/tests/stix_shifter/patterns/input_files/lte.json similarity index 100% rename from tests/patterns/input_files/lte.json rename to tests/stix_shifter/patterns/input_files/lte.json diff --git a/tests/patterns/input_files/md5_hash.json b/tests/stix_shifter/patterns/input_files/md5_hash.json similarity index 100% rename from tests/patterns/input_files/md5_hash.json rename to tests/stix_shifter/patterns/input_files/md5_hash.json diff --git a/tests/patterns/input_files/negated_comparison.json b/tests/stix_shifter/patterns/input_files/negated_comparison.json similarity index 100% rename from tests/patterns/input_files/negated_comparison.json rename to tests/stix_shifter/patterns/input_files/negated_comparison.json diff --git a/tests/patterns/input_files/neq.json b/tests/stix_shifter/patterns/input_files/neq.json similarity index 100% rename from tests/patterns/input_files/neq.json rename to tests/stix_shifter/patterns/input_files/neq.json diff --git a/tests/patterns/input_files/not_in_set.json b/tests/stix_shifter/patterns/input_files/not_in_set.json similarity index 100% rename from tests/patterns/input_files/not_in_set.json rename to tests/stix_shifter/patterns/input_files/not_in_set.json diff --git a/tests/patterns/input_files/not_like.json b/tests/stix_shifter/patterns/input_files/not_like.json similarity index 100% rename from tests/patterns/input_files/not_like.json rename to tests/stix_shifter/patterns/input_files/not_like.json diff --git a/tests/patterns/input_files/ored_obs_expression.json b/tests/stix_shifter/patterns/input_files/ored_obs_expression.json similarity index 100% rename from tests/patterns/input_files/ored_obs_expression.json rename to tests/stix_shifter/patterns/input_files/ored_obs_expression.json diff --git a/tests/patterns/input_files/regex.json b/tests/stix_shifter/patterns/input_files/regex.json similarity index 100% rename from tests/patterns/input_files/regex.json rename to tests/stix_shifter/patterns/input_files/regex.json diff --git a/tests/patterns/input_files/regex_anchors.json b/tests/stix_shifter/patterns/input_files/regex_anchors.json similarity index 100% rename from 
tests/patterns/input_files/regex_anchors.json rename to tests/stix_shifter/patterns/input_files/regex_anchors.json diff --git a/tests/patterns/input_files/regex_back_anchor.json b/tests/stix_shifter/patterns/input_files/regex_back_anchor.json similarity index 100% rename from tests/patterns/input_files/regex_back_anchor.json rename to tests/stix_shifter/patterns/input_files/regex_back_anchor.json diff --git a/tests/patterns/input_files/regex_front_anchor.json b/tests/stix_shifter/patterns/input_files/regex_front_anchor.json similarity index 100% rename from tests/patterns/input_files/regex_front_anchor.json rename to tests/stix_shifter/patterns/input_files/regex_front_anchor.json diff --git a/tests/patterns/input_files/regex_no_anchors.json b/tests/stix_shifter/patterns/input_files/regex_no_anchors.json similarity index 100% rename from tests/patterns/input_files/regex_no_anchors.json rename to tests/stix_shifter/patterns/input_files/regex_no_anchors.json diff --git a/tests/patterns/input_files/timestamp.json b/tests/stix_shifter/patterns/input_files/timestamp.json similarity index 100% rename from tests/patterns/input_files/timestamp.json rename to tests/stix_shifter/patterns/input_files/timestamp.json diff --git a/tests/patterns/integration_tests.py b/tests/stix_shifter/patterns/integration_tests.py similarity index 100% rename from tests/patterns/integration_tests.py rename to tests/stix_shifter/patterns/integration_tests.py diff --git a/tests/patterns/test_analytic_translator.py b/tests/stix_shifter/patterns/test_analytic_translator.py similarity index 100% rename from tests/patterns/test_analytic_translator.py rename to tests/stix_shifter/patterns/test_analytic_translator.py diff --git a/tests/patterns/test_miscellaneous_tests.py b/tests/stix_shifter/patterns/test_miscellaneous_tests.py similarity index 100% rename from tests/patterns/test_miscellaneous_tests.py rename to tests/stix_shifter/patterns/test_miscellaneous_tests.py diff --git a/tests/patterns/test_web_api.py b/tests/stix_shifter/patterns/test_web_api.py similarity index 100% rename from tests/patterns/test_web_api.py rename to tests/stix_shifter/patterns/test_web_api.py diff --git a/tests/car_json_to_stix/test_class.py b/tests/stix_shifter/test_car_json_to_stix.py similarity index 100% rename from tests/car_json_to_stix/test_class.py rename to tests/stix_shifter/test_car_json_to_stix.py diff --git a/tests/qradar_json_to_stix/test_class.py b/tests/stix_shifter/test_qradar_json_to_stix.py similarity index 100% rename from tests/qradar_json_to_stix/test_class.py rename to tests/stix_shifter/test_qradar_json_to_stix.py diff --git a/tests/qradar_stix_to_aql/test_class.py b/tests/stix_shifter/test_qradar_stix_to_aql.py similarity index 100% rename from tests/qradar_stix_to_aql/test_class.py rename to tests/stix_shifter/test_qradar_stix_to_aql.py diff --git a/tests/splunk_json_to_stix/test_class.py b/tests/stix_shifter/test_splunk_json_to_stix.py similarity index 100% rename from tests/splunk_json_to_stix/test_class.py rename to tests/stix_shifter/test_splunk_json_to_stix.py diff --git a/tests/splunk_stix_to_spl/test_class.py b/tests/stix_shifter/test_splunk_stix_to_spl.py similarity index 100% rename from tests/splunk_stix_to_spl/test_class.py rename to tests/stix_shifter/test_splunk_stix_to_spl.py diff --git a/tests/stix_transmission/__init__.py b/tests/stix_transmission/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/tests/stix_transmission/splunk/__init__.py 
b/tests/stix_transmission/splunk/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/tests/stix_transmission/splunk/api_response/all_results.json b/tests/stix_transmission/splunk/api_response/all_results.json new file mode 100644 index 000000000..776a4cfed --- /dev/null +++ b/tests/stix_transmission/splunk/api_response/all_results.json @@ -0,0 +1 @@ +{"links":{},"origin":"https://9.99.999.99:0123/services/search/jobs","updated":"2018-09-12T10:15:40+00:00","generator":{"build":"2b5b15c4ee89","version":"7.0.1"},"entry":[{"name":"search eventtype=network_traffic | fields tag| spath","id":"https://9.99.999.99:0123/services/search/jobs/1536747204.4164","updated":"2018-09-12T10:14:22.538+00:00","links":{"alternate":"/services/search/jobs/1536747204.4164","search.log":"/services/search/jobs/1536747204.4164/search.log","events":"/services/search/jobs/1536747204.4164/events","results":"/services/search/jobs/1536747204.4164/results","results_preview":"/services/search/jobs/1536747204.4164/results_preview","timeline":"/services/search/jobs/1536747204.4164/timeline","summary":"/services/search/jobs/1536747204.4164/summary","control":"/services/search/jobs/1536747204.4164/control"},"published":"2018-09-12T10:13:25.000+00:00","author":"bhavesh","content":{"canSummarize":false,"cursorTime":"1970-01-01T00:00:00.000+00:00","defaultSaveTTL":"604800","defaultTTL":"600","delegate":"","diskUsage":73728,"dispatchState":"DONE","doneProgress":1,"dropCount":0,"earliestTime":"2018-04-20T12:36:17.000+00:00","eventAvailableCount":1,"eventCount":1,"eventFieldCount":10,"eventIsStreaming":true,"eventIsTruncated":false,"eventSearch":"search eventtype=network_traffic | fields tag | spath ","eventSorting":"desc","indexEarliestTime":1535991010,"indexLatestTime":1535991010,"isBatchModeSearch":false,"isDone":true,"isEventsPreviewEnabled":false,"isFailed":false,"isFinalized":false,"isPaused":false,"isPreviewEnabled":false,"isRealTimeSearch":false,"isRemoteTimeline":false,"isSaved":false,"isSavedSearch":false,"isTimeCursored":true,"isZombie":false,"keywords":"eventtype::network_traffic","label":"","normalizedSearch":"litsearch (index=shifter log_type=\"network\") | fields tag | spath | fields keepcolorder=t \"_bkt\" \"_cd\" \"_si\" \"host\" \"index\" \"linecount\" \"source\" \"sourcetype\" \"splunk_server\"","numPreviews":0,"optimizedSearch":"| search eventtype=network_traffic | fields tag | spath","pid":"17529","priority":5,"provenance":"","remoteSearch":"litsearch (index=shifter log_type=\"network\") | fields tag | spath | fields keepcolorder=t \"_bkt\" \"_cd\" \"_si\" \"host\" \"index\" \"linecount\" \"source\" \"sourcetype\" 
\"splunk_server\"","reportSearch":"","resultCount":1,"resultIsStreaming":true,"resultPreviewCount":1,"runDuration":0.250501694,"sampleRatio":"1","sampleSeed":"0","scanCount":2,"searchCanBeEventType":false,"searchTotalBucketsCount":4,"searchTotalEliminatedBucketsCount":2,"sid":"1536747204.4164","statusBuckets":0,"ttl":522,"performance":{"command.fields":{"duration_secs":0.002,"invocations":2,"input_count":2,"output_count":2},"command.search":{"duration_secs":0.004,"invocations":1,"input_count":0,"output_count":1},"command.search.calcfields":{"duration_secs":0.001,"invocations":1,"input_count":2,"output_count":2},"command.search.expand_search":{"duration_secs":0.025,"invocations":1},"command.search.fieldalias":{"duration_secs":0.001,"invocations":1,"input_count":2,"output_count":2},"command.search.filter":{"duration_secs":0.001,"invocations":1},"command.search.index":{"duration_secs":0.002,"invocations":2},"command.search.index.usec_1_8":{"invocations":85},"command.search.kv":{"duration_secs":0.001,"invocations":1},"command.search.lookups":{"duration_secs":0.001,"invocations":1,"input_count":2,"output_count":2},"command.search.parse_directives":{"duration_secs":0.001,"invocations":1},"command.search.rawdata":{"duration_secs":0.002,"invocations":1},"command.search.summary":{"duration_secs":0.001,"invocations":1},"command.search.tags":{"duration_secs":0.001,"invocations":1,"input_count":1,"output_count":1},"command.search.typer":{"duration_secs":0.001,"invocations":1,"input_count":1,"output_count":1},"command.spath":{"duration_secs":0.001,"invocations":1,"input_count":1,"output_count":1},"dispatch.check_disk_usage":{"duration_secs":0.001,"invocations":1},"dispatch.createdSearchResultInfrastructure":{"duration_secs":0.001,"invocations":1},"dispatch.evaluate":{"duration_secs":0.04,"invocations":1},"dispatch.evaluate.fields":{"duration_secs":0.001,"invocations":1},"dispatch.evaluate.noop":{"invocations":1},"dispatch.evaluate.search":{"duration_secs":0.039,"invocations":1},"dispatch.evaluate.spath":{"duration_secs":0.001,"invocations":1},"dispatch.fetch":{"duration_secs":0.051,"invocations":2},"dispatch.localSearch":{"duration_secs":0.004,"invocations":1},"dispatch.optimize.FinalEval":{"duration_secs":0.041,"invocations":1},"dispatch.optimize.matchReportAcceleration":{"duration_secs":0.151,"invocations":1},"dispatch.optimize.optimization":{"duration_secs":0.001,"invocations":1},"dispatch.optimize.reparse":{"duration_secs":0.001,"invocations":1},"dispatch.optimize.toJson":{"duration_secs":0.001,"invocations":1},"dispatch.optimize.toSpl":{"duration_secs":0.001,"invocations":1},"dispatch.readEventsInResults":{"duration_secs":0.001,"invocations":1},"dispatch.stream.local":{"duration_secs":0.004,"invocations":1},"dispatch.timeline":{"duration_secs":0.003,"invocations":2},"dispatch.writeStatus":{"duration_secs":0.007,"invocations":7},"startup.configuration":{"duration_secs":0.017,"invocations":1},"startup.handoff":{"duration_secs":0.06,"invocations":1}},"messages":[],"request":{"search":"search eventtype=network_traffic | fields tag| spath"},"runtime":{"auto_cancel":"0","auto_pause":"0"},"searchProviders":["splunk3-01.internal.resilientsystems.com"]},"acl":{"perms":{"read":["bhavesh"],"write":["bhavesh"]},"owner":"bhavesh","modifiable":true,"sharing":"global","app":"search","can_write":true,"ttl":"600"}},{"name":"| 
archivebuckets","id":"https://9.99.999.99:0123/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550","updated":"2018-09-12T09:17:01.565+00:00","links":{"alternate":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550","search.log":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550/search.log","events":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550/events","results":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550/results","results_preview":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550/results_preview","timeline":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550/timeline","summary":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550/summary","control":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550/control"},"published":"2018-09-12T09:17:00.000+00:00","author":"splunk-system-user","content":{"canSummarize":false,"cursorTime":"1970-01-01T00:00:00.000+00:00","defaultSaveTTL":"604800","defaultTTL":"600","delegate":"scheduler","diskUsage":57344,"dispatchState":"DONE","doneProgress":1,"dropCount":0,"earliestTime":"1970-01-01T00:00:00.000+00:00","eventAvailableCount":1,"eventCount":1,"eventFieldCount":0,"eventIsStreaming":true,"eventIsTruncated":true,"eventSearch":"archivebuckets ","eventSorting":"none","isBatchModeSearch":false,"isDone":true,"isEventsPreviewEnabled":false,"isFailed":false,"isFinalized":false,"isPaused":false,"isPreviewEnabled":false,"isRealTimeSearch":false,"isRemoteTimeline":false,"isSaved":false,"isSavedSearch":true,"isTimeCursored":false,"isZombie":false,"keywords":"","label":"Bucket Copy Trigger","latestTime":"2018-09-12T09:17:00.000+00:00","normalizedSearch":"","numPreviews":0,"optimizedSearch":"| 
archivebuckets","pid":"16571","priority":5,"provenance":"scheduler","remoteSearch":"","reportSearch":"","resultCount":1,"resultIsStreaming":true,"resultPreviewCount":1,"runDuration":0.878849033,"sampleRatio":"1","sampleSeed":"0","savedSearchLabel":"{\"owner\":\"nobody\",\"app\":\"splunk_archiver\",\"sharing\":\"app\"}","scanCount":0,"searchCanBeEventType":false,"searchLatestTime":1536743820,"searchTotalBucketsCount":0,"searchTotalEliminatedBucketsCount":0,"sid":"scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536743820_550","statusBuckets":0,"ttl":3681,"performance":{"command.archivebuckets":{"duration_secs":0.8,"invocations":2,"input_count":0,"output_count":1},"dispatch.check_disk_usage":{"duration_secs":0.001,"invocations":1},"dispatch.createdSearchResultInfrastructure":{"duration_secs":0.001,"invocations":1},"dispatch.evaluate":{"duration_secs":0.001,"invocations":1},"dispatch.evaluate.archivebuckets":{"duration_secs":0.001,"invocations":1},"dispatch.evaluate.noop":{"invocations":1},"dispatch.fetch":{"duration_secs":0.002,"invocations":2},"dispatch.optimize.FinalEval":{"duration_secs":0.002,"invocations":1},"dispatch.optimize.matchReportAcceleration":{"duration_secs":0.003,"invocations":1},"dispatch.optimize.optimization":{"duration_secs":0.011,"invocations":1},"dispatch.optimize.reparse":{"duration_secs":0.001,"invocations":1},"dispatch.optimize.toJson":{"duration_secs":0.001,"invocations":1},"dispatch.optimize.toSpl":{"duration_secs":0.001,"invocations":1},"dispatch.readEventsInResults":{"duration_secs":0.001,"invocations":1},"dispatch.results_combiner":{"duration_secs":0.002,"invocations":2,"input_count":0,"output_count":0},"dispatch.timeline":{"duration_secs":0.002,"invocations":2},"dispatch.writeStatus":{"duration_secs":0.007,"invocations":7},"startup.configuration":{"duration_secs":0.017,"invocations":1},"startup.handoff":{"duration_secs":0.072,"invocations":1}},"messages":[],"request":{"auto_cancel":"0","auto_pause":"0","buckets":"0","earliest_time":"","index_earliest":"","index_latest":"","indexedRealtime":"","indexedRealtimeMinSpan":"","indexedRealtimeOffset":"","latest_time":"now","lookups":"1","max_count":"500000","max_time":"0","reduce_freq":"10","rt_backfill":"0","rt_maximum_span":"","sample_ratio":"1","spawn_process":"1","time_format":"%FT%T.%Q%:z","ui_dispatch_app":"","ui_dispatch_view":""},"searchProviders":[]},"acl":{"perms":{"read":["*","splunk-system-user"],"write":["*","splunk-system-user"]},"owner":"splunk-system-user","modifiable":true,"sharing":"global","app":"splunk_archiver","can_write":true,"ttl":"7200"}},{"name":"| 
archivebuckets","id":"https://9.99.999.99:0123/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549","updated":"2018-09-12T08:17:01.707+00:00","links":{"alternate":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549","search.log":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549/search.log","events":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549/events","results":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549/results","results_preview":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549/results_preview","timeline":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549/timeline","summary":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549/summary","control":"/services/search/jobs/scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549/control"},"published":"2018-09-12T08:17:00.000+00:00","author":"splunk-system-user","content":{"canSummarize":false,"cursorTime":"1970-01-01T00:00:00.000+00:00","defaultSaveTTL":"604800","defaultTTL":"600","delegate":"scheduler","diskUsage":57344,"dispatchState":"DONE","doneProgress":1,"dropCount":0,"earliestTime":"1970-01-01T00:00:00.000+00:00","eventAvailableCount":1,"eventCount":1,"eventFieldCount":0,"eventIsStreaming":true,"eventIsTruncated":true,"eventSearch":"archivebuckets ","eventSorting":"none","isBatchModeSearch":false,"isDone":true,"isEventsPreviewEnabled":false,"isFailed":false,"isFinalized":false,"isPaused":false,"isPreviewEnabled":false,"isRealTimeSearch":false,"isRemoteTi100 15139 100 15123 100 16 23592 24 --:--:-- --:--:-- --:--:-- 23617true,"isTimeCursored":false,"isZombie":false,"keywords":"","label":"Bucket Copy Trigger","latestTime":"2018-09-12T08:17:00.000+00:00","normalizedSearch":"","numPreviews":0,"optimizedSearch":"| 
archivebuckets","pid":"15570","priority":5,"provenance":"scheduler","remoteSearch":"","reportSearch":"","resultCount":1,"resultIsStreaming":true,"resultPreviewCount":1,"runDuration":0.841277813,"sampleRatio":"1","sampleSeed":"0","savedSearchLabel":"{\"owner\":\"nobody\",\"app\":\"splunk_archiver\",\"sharing\":\"app\"}","scanCount":0,"searchCanBeEventType":false,"searchLatestTime":1536740220,"searchTotalBucketsCount":0,"searchTotalEliminatedBucketsCount":0,"sid":"scheduler__nobody_c3BsdW5rX2FyY2hpdmVy__RMD5473cbac83d6c9db7_at_1536740220_549","statusBuckets":0,"ttl":81,"performance":{"command.archivebuckets":{"duration_secs":0.769,"invocations":2,"input_count":0,"output_count":1},"dispatch.check_disk_usage":{"duration_secs":0.001,"invocations":1},"dispatch.createdSearchResultInfrastructure":{"duration_secs":0.001,"invocations":1},"dispatch.evaluate":{"duration_secs":0.001,"invocations":1},"dispatch.evaluate.archivebuckets":{"duration_secs":0.001,"invocations":1},"dispatch.evaluate.noop":{"invocations":1},"dispatch.fetch":{"duration_secs":0.002,"invocations":2},"dispatch.optimize.FinalEval":{"duration_secs":0.002,"invocations":1},"dispatch.optimize.matchReportAcceleration":{"duration_secs":0.002,"invocations":1},"dispatch.optimize.optimization":{"duration_secs":0.009,"invocations":1},"dispatch.optimize.reparse":{"duration_secs":0.001,"invocations":1},"dispatch.optimize.toJson":{"duration_secs":0.001,"invocations":1},"dispatch.optimize.toSpl":{"duration_secs":0.001,"invocations":1},"dispatch.readEventsInResults":{"duration_secs":0.001,"invocations":1},"dispatch.results_combiner":{"duration_secs":0.002,"invocations":2,"input_count":0,"output_count":0},"dispatch.timeline":{"duration_secs":0.002,"invocations":2},"dispatch.writeStatus":{"duration_secs":0.007,"invocations":7},"startup.configuration":{"duration_secs":0.016,"invocations":1},"startup.handoff":{"duration_secs":0.059,"invocations":1}},"messages":[],"request":{"auto_cancel":"0","auto_pause":"0","buckets":"0","earliest_time":"","index_earliest":"","index_latest":"","indexedRealtime":"","indexedRealtimeMinSpan":"","indexedRealtimeOffset":"","latest_time":"now","lookups":"1","max_count":"500000","max_time":"0","reduce_freq":"10","rt_backfill":"0","rt_maximum_span":"","sample_ratio":"1","spawn_process":"1","time_format":"%FT%T.%Q%:z","ui_dispatch_app":"","ui_dispatch_view":""},"searchProviders":[]},"acl":{"perms":{"read":["*","splunk-system-user"],"write":["*","splunk-system-user"]},"owner":"splunk-system-user","modifiable":true,"sharing":"global","app":"splunk_archiver","can_write":true,"ttl":"7200"}}],"paging":{"total":3,"perPage":0,"offset":0}} \ No newline at end of file diff --git a/tests/stix_transmission/splunk/api_response/result_by_sid.json b/tests/stix_transmission/splunk/api_response/result_by_sid.json new file mode 100644 index 000000000..2d6046b9f --- /dev/null +++ b/tests/stix_transmission/splunk/api_response/result_by_sid.json @@ -0,0 +1,91 @@ +{ + "preview": false, + "init_offset": 0, + "messages": [], + "fields": [ + { + "name": "tag" + }, + { + "name": "_bkt" + }, + { + "name": "_cd" + }, + { + "name": "_eventtype_color" + }, + { + "name": "_indextime" + }, + { + "name": "_raw" + }, + { + "name": "_serial" + }, + { + "name": "_si" + }, + { + "name": "_sourcetype" + }, + { + "name": "_time" + }, + { + "name": "bytes" + }, + { + "name": "dest_ip" + }, + { + "name": "dest_port" + }, + { + "name": "event_count" + }, + { + "name": "log_type" + }, + { + "name": "src_ip" + }, + { + "name": "src_port" + }, + { + "name": "transport" 
+ }, + { + "name": "user" + } + ], + "results": [ + { + "tag": "network", + "_bkt": "shifter~3~6D3E49A0-31FE-44C3-8373-C3AC6B1ABF06", + "_cd": "3:101", + "_eventtype_color": "none", + "_indextime": "1535991010", + "_raw": "{\n\t\"log_type\": \"network\", \n\t\"bytes\": \"300\", \n\t\"dest_ip\": \"127.0.0.1\", \n\t\"dest_port\": \"80\", \n\t\"src_ip\": \"2001:0db8:85a3:0000:0000:8a2e:0370:7334\", \n\t\"src_port\": \"80\", \n\t\"transport\": \"http\", \n\t\"user\": \"IBM\", \n\t\"event_count\": 1\n}", + "_serial": "0", + "_si": [ + "splunk3-01.internal.resilientsystems.com", + "shifter" + ], + "_sourcetype": "_json", + "_time": "2018-09-03T16:10:10.000+00:00", + "bytes": "300", + "dest_ip": "127.0.0.1", + "dest_port": "80", + "event_count": "1", + "log_type": "network", + "src_ip": "2001:0db8:85a3:0000:0000:8a2e:0370:7334", + "src_port": "80", + "transport": "http", + "user": "IBM" + } + ], + "highlighted": {} +} \ No newline at end of file diff --git a/tests/stix_transmission/splunk/api_response/status_by_sid.json b/tests/stix_transmission/splunk/api_response/status_by_sid.json new file mode 100644 index 000000000..f0f963055 --- /dev/null +++ b/tests/stix_transmission/splunk/api_response/status_by_sid.json @@ -0,0 +1,270 @@ +{ + "links": {}, + "origin": "https://9.99.999.99:0123/services/search/jobs", + "updated": "2018-09-13T09:49:40+00:00", + "generator": { + "build": "2b5b15c4ee89", + "version": "7.0.1" + }, + "entry": [ + { + "name": "search eventtype=network_traffic | fields tag| spath", + "id": "https://9.99.999.99:0123/services/search/jobs/1536832140.4293", + "updated": "2018-09-13T09:49:40.645+00:00", + "links": { + "alternate": "/services/search/jobs/1536832140.4293", + "search.log": "/services/search/jobs/1536832140.4293/search.log", + "events": "/services/search/jobs/1536832140.4293/events", + "results": "/services/search/jobs/1536832140.4293/results", + "results_preview": "/services/search/jobs/1536832140.4293/results_preview", + "timeline": "/services/search/jobs/1536832140.4293/timeline", + "summary": "/services/search/jobs/1536832140.4293/summary", + "control": "/services/search/jobs/1536832140.4293/control" + }, + "published": "2018-09-13T09:49:00.000+00:00", + "author": "bhavesh", + "content": { + "canSummarize": false, + "cursorTime": "1970-01-01T00:00:00.000+00:00", + "defaultSaveTTL": "604800", + "defaultTTL": "600", + "delegate": "", + "diskUsage": 73728, + "dispatchState": "DONE", + "doneProgress": 1, + "dropCount": 0, + "earliestTime": "2018-04-20T12:36:17.000+00:00", + "eventAvailableCount": 1, + "eventCount": 1, + "eventFieldCount": 10, + "eventIsStreaming": true, + "eventIsTruncated": false, + "eventSearch": "search eventtype=network_traffic | fields tag | spath ", + "eventSorting": "desc", + "indexEarliestTime": 1535991010, + "indexLatestTime": 1535991010, + "isBatchModeSearch": false, + "isDone": true, + "isEventsPreviewEnabled": false, + "isFailed": false, + "isFinalized": false, + "isPaused": false, + "isPreviewEnabled": false, + "isRealTimeSearch": false, + "isRemoteTimeline": false, + "isSaved": false, + "isSavedSearch": false, + "isTimeCursored": true, + "isZombie": false, + "keywords": "eventtype::network_traffic", + "label": "", + "normalizedSearch": "litsearch (index=shifter log_type=\"network\") | fields tag | spath | fields keepcolorder=t \"_bkt\" \"_cd\" \"_si\" \"host\" \"index\" \"linecount\" \"source\" \"sourcetype\" \"splunk_server\"", + "numPreviews": 0, + "optimizedSearch": "| search eventtype=network_traffic | fields tag | spath", + "pid": 
"9663", + "priority": 5, + "provenance": "", + "remoteSearch": "litsearch (index=shifter log_type=\"network\") | fields tag | spath | fields keepcolorder=t \"_bkt\" \"_cd\" \"_si\" \"host\" \"index\" \"linecount\" \"source\" \"sourcetype\" \"splunk_server\"", + "reportSearch": "", + "resultCount": 1, + "resultIsStreaming": true, + "resultPreviewCount": 1, + "runDuration": 0.242112944, + "sampleRatio": "1", + "sampleSeed": "0", + "scanCount": 2, + "searchCanBeEventType": false, + "searchTotalBucketsCount": 4, + "searchTotalEliminatedBucketsCount": 2, + "sid": "1536832140.4293", + "statusBuckets": 0, + "ttl": 600, + "performance": { + "command.fields": { + "duration_secs": 0.001, + "invocations": 2, + "input_count": 2, + "output_count": 2 + }, + "command.search": { + "duration_secs": 0.004, + "invocations": 1, + "input_count": 0, + "output_count": 1 + }, + "command.search.calcfields": { + "duration_secs": 0.001, + "invocations": 1, + "input_count": 2, + "output_count": 2 + }, + "command.search.expand_search": { + "duration_secs": 0.026, + "invocations": 1 + }, + "command.search.fieldalias": { + "duration_secs": 0.001, + "invocations": 1, + "input_count": 2, + "output_count": 2 + }, + "command.search.filter": { + "duration_secs": 0.001, + "invocations": 1 + }, + "command.search.index": { + "duration_secs": 0.002, + "invocations": 2 + }, + "command.search.index.usec_1_8": { + "invocations": 85 + }, + "command.search.kv": { + "duration_secs": 0.001, + "invocations": 1 + }, + "command.search.lookups": { + "duration_secs": 0.001, + "invocations": 1, + "input_count": 2, + "output_count": 2 + }, + "command.search.parse_directives": { + "duration_secs": 0.001, + "invocations": 1 + }, + "command.search.rawdata": { + "duration_secs": 0.002, + "invocations": 1 + }, + "command.search.summary": { + "invocations": 1 + }, + "command.search.tags": { + "duration_secs": 0.001, + "invocations": 1, + "input_count": 1, + "output_count": 1 + }, + "command.search.typer": { + "duration_secs": 0.001, + "invocations": 1, + "input_count": 1, + "output_count": 1 + }, + "command.spath": { + "duration_secs": 0.001, + "invocations": 1, + "input_count": 1, + "output_count": 1 + }, + "dispatch.check_disk_usage": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispatch.createdSearchResultInfrastructure": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispa100 5684 100 5668 100 16 10361 29 --:--:-- --:--:-- --:--:-- 10704patch.evaluate.fields": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispatch.evaluate.noop": { + "invocations": 1 + }, + "dispatch.evaluate.search": { + "duration_secs": 0.04, + "invocations": 1 + }, + "dispatch.evaluate.spath": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispatch.fetch": { + "duration_secs": 0.052, + "invocations": 2 + }, + "dispatch.localSearch": { + "duration_secs": 0.004, + "invocations": 1 + }, + "dispatch.optimize.FinalEval": { + "duration_secs": 0.041, + "invocations": 1 + }, + "dispatch.optimize.matchReportAcceleration": { + "duration_secs": 0.142, + "invocations": 1 + }, + "dispatch.optimize.optimization": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispatch.optimize.reparse": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispatch.optimize.toJson": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispatch.optimize.toSpl": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispatch.readEventsInResults": { + "duration_secs": 0.001, + "invocations": 1 + }, + "dispatch.stream.local": { + "duration_secs": 0.004, + 
"invocations": 1 + }, + "dispatch.timeline": { + "duration_secs": 0.002, + "invocations": 2 + }, + "dispatch.writeStatus": { + "duration_secs": 0.007, + "invocations": 7 + }, + "startup.configuration": { + "duration_secs": 0.015, + "invocations": 1 + }, + "startup.handoff": { + "duration_secs": 0.055, + "invocations": 1 + } + }, + "messages": [], + "request": { + "search": "search eventtype=network_traffic | fields tag| spath" + }, + "runtime": { + "auto_cancel": "0", + "auto_pause": "0" + }, + "searchProviders": ["splunk3-01.internal.resilientsystems.com"] + }, + "acl": { + "perms": { + "read": ["bhavesh"], + "write": ["bhavesh"] + }, + "owner": "bhavesh", + "modifiable": true, + "sharing": "global", + "app": "search", + "can_write": true, + "ttl": "600" + } + } + ], + "paging": { + "total": 1, + "perPage": 0, + "offset": 0 + } +} diff --git a/tests/stix_transmission/splunk/test_class.py b/tests/stix_transmission/splunk/test_class.py new file mode 100644 index 000000000..e7b02d824 --- /dev/null +++ b/tests/stix_transmission/splunk/test_class.py @@ -0,0 +1,235 @@ +from stix_transmission.src.modules.splunk import splunk_connector +from unittest.mock import patch +import unittest +import json +import os + + +class SplunkMockResponse: + def __init__(self, response_code, object): + self.code = response_code + self.object = object + + def read(self): + return self.object + + +@patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.__init__') +class TestSplunkConnection(unittest.TestCase, object): + def test_is_async(self, mock_api_client): + mock_api_client.return_value = None + module = splunk_connector + + config = { + "auth": { + "username": "", + "password": "" + } + } + connection = { + "host": "host", + "port": "8080" + } + + check_async = module.Connector(connection, config).is_async + + assert check_async + + @patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.ping_box') + def test_ping_endpoint(self, mock_ping_response, mock_api_client): + mock_api_client.return_value = None + mocked_return_value = '["mock", "placeholder"]' + mock_ping_response.return_value = SplunkMockResponse(200, mocked_return_value) + + module = splunk_connector + config = { + "auth": { + "username": "", + "password": "" + } + } + connection = { + "host": "host", + "port": "8080" + } + + ping_response = module.Connector(connection, config).ping() + + assert ping_response is not None + assert ping_response['success'] + + @patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.create_search') + def test_query_response(self, mock_query_response, mock_api_client): + mock_api_client.return_value = None + mocked_return_value = '{"sid":"1536672851.4012"}' + mock_query_response.return_value = SplunkMockResponse(201, mocked_return_value) + + module = splunk_connector + config = { + "auth": { + "username": "", + "password": "" + } + } + connection = { + "host": "host", + "port": "8080" + } + + query = 'search eventtype=network_traffic | fields + tag| spath' + query_response = module.Connector(connection, config).create_query_connection(query) + + assert query_response is not None + assert query_response['success'] is True + assert 'search_id' in query_response + assert query_response['search_id'] == "1536672851.4012" + + @patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.get_search', autospec=True) + def test_status_response(self, mock_status_response, mock_api_client): + mock_api_client.return_value = None + + dir_path = 
os.path.dirname(os.path.realpath(__file__)) + file_path = os.path.join(dir_path, 'api_response', 'status_by_sid.json') + mocked_return_value = open(file_path, 'r').read() + + mock_status_response.return_value = SplunkMockResponse(200, mocked_return_value) + + config = { + "auth": { + "username": "", + "password": "" + } + } + connection = { + "host": "host", + "port": "8080" + } + + search_id = "1536832140.4293" + module = splunk_connector + status_response = module.Connector(connection, config).create_status_connection(search_id) + + assert status_response is not None + assert 'status' in status_response + assert status_response['status'] == 'COMPLETED' + assert 'progress' in status_response + assert status_response['progress'] == 100 + assert 'success' in status_response + assert status_response['success'] is True + + @patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.get_search_results', autospec=True) + def test_results_response(self, mock_results_response, mock_api_client): + mock_api_client.return_value = None + + dir_path = os.path.dirname(os.path.realpath(__file__)) + file_path = os.path.join(dir_path, 'api_response', 'result_by_sid.json') + mocked_return_value = open(file_path, 'r').read() + + mock_results_response.return_value = SplunkMockResponse(200, mocked_return_value) + + module = splunk_connector + config = { + "auth": { + "username": "", + "password": "" + } + } + connection = { + "host": "host", + "port": "8080" + } + + search_id = "1536832140.4293" + offset = 0 + length = 1 + results_response = module.Connector(connection, config).create_results_connection(search_id, offset, length) + + assert 'success' in results_response + assert results_response['success'] is True + assert 'data' in results_response + assert len(results_response['data']) > 0 + + @patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.create_search', autospec=True) + @patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.get_search', autospec=True) + @patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.get_search_results', autospec=True) + def test_query_flow(self, mock_results_response, mock_status_response, mock_query_response, mock_api_client): + mock_api_client.return_value = None + + config = { + "auth": { + "username": "", + "password": "" + } + } + connection = { + "host": "host", + "port": "8080" + } + + query_mock = '{"sid":"1536832140.4293"}' + mock_query_response.return_value = SplunkMockResponse(201, query_mock) + + dir_path = os.path.dirname(os.path.realpath(__file__)) + file_path = os.path.join(dir_path, 'api_response', 'result_by_sid.json') + results_mock = open(file_path, 'r').read() + mock_results_response.return_value = SplunkMockResponse(200, results_mock) + + status_file_path = os.path.join(dir_path, 'api_response', 'status_by_sid.json') + status_mock = open(status_file_path, 'r').read() + mock_status_response.return_value = SplunkMockResponse(200, status_mock) + + module = splunk_connector + + query = 'search eventtype=network_traffic | fields + tag| spath' + query_response = module.Connector(connection, config).create_query_connection(query) + + assert query_response is not None + assert query_response['success'] is True + assert 'search_id' in query_response + assert query_response['search_id'] == "1536832140.4293" + + search_id = "1536832140.4293" + status_response = module.Connector(connection, config).create_status_connection(search_id) + + assert status_response is not None + assert 'status' in status_response + 
assert status_response['status'] == 'COMPLETED' + assert 'progress' in status_response + assert status_response['progress'] == 100 + assert 'success' in status_response + assert status_response['success'] is True + + search_id = "1536832140.4293" + offset = 0 + length = 1 + results_response = module.Connector(connection, config).create_results_connection(search_id, offset, length) + + assert 'success' in results_response + assert results_response['success'] is True + assert 'data' in results_response + assert len(results_response['data']) > 0 + + @patch('stix_transmission.src.modules.splunk.spl_api_client.APIClient.delete_search', autospec=True) + def test_delete_search(self, mock_results_response, mock_api_client): + mock_api_client.return_value = None + + config = { + "auth": { + "username": "", + "password": "" + } + } + connection = { + "host": "host", + "port": "8080" + } + + mocked_return_value = '{"messages":[{"type":"INFO","text":"Search job cancelled."}]}' + mock_results_response.return_value = SplunkMockResponse(200, mocked_return_value) + + module = splunk_connector + search_id = "1536832140.4293" + results_response = module.Connector(connection, config).delete_query_connection(search_id) + + assert results_response is not None + assert results_response['success'] is True \ No newline at end of file diff --git a/tests/stix_transmission/test_async_dummy.py b/tests/stix_transmission/test_async_dummy.py new file mode 100644 index 000000000..0e18976b6 --- /dev/null +++ b/tests/stix_transmission/test_async_dummy.py @@ -0,0 +1,109 @@ +from stix_transmission.src.modules.async_dummy import async_dummy_connector +from stix_transmission.src.modules.base.base_status_connector import Status +import unittest + + +class TestAsyncDummyConnection(unittest.TestCase, object): + def test_dummy_async_query(self): + connection = { + "host": "hostbla", + "port": "8080", + "path": "/" + } + query_interface = async_dummy_connector.Connector(connection, None) + + query = "placeholder query text" + + query_response = query_interface.create_query_connection(query) + query_id = query_response['query_id'] + + assert query_id == "uuid_1234567890" + + def test_dummy_async_status(self): + connection = { + "host": "hostbla", + "port": "8080", + "path": "/" + } + status_interface = async_dummy_connector.Connector(connection, None) + + query_id = "uuid_1234567890" + + status_response = status_interface.create_status_connection(query_id) + status = status_response["status"] + assert status == Status.COMPLETED.value + + def test_dummy_async_results_error(self): + connection = { + "host": "hostbla", + "port": "8080", + "path": "/" + } + results_interface = async_dummy_connector.Connector(connection, None) + + query_id = "uuid_should_error" + offset = 0 + length = 1 + results_response = results_interface.create_results_connection(query_id, offset, length) + + success = results_response["success"] + assert success is not True + query_results = results_response["error"] + assert query_results == "Error: query results not found" + + def test_dummy_async_results_running(self): + connection = { + "host": "hostbla", + "port": "8080", + "path": "/" + } + results_interface = async_dummy_connector.Connector(connection, None) + + query_id = "uuid_not_done" + offset = 0 + length = 1 + results_response = results_interface.create_results_connection(query_id, offset, length) + + query_results = results_response["error"] + success = results_response["success"] + assert success is not True + assert query_results == "Query is not 
finished processing" + + def test_dummy_async_results_success(self): + connection = { + "host": "hostbla", + "port": "8080", + "path": "/" + } + results_interface = async_dummy_connector.Connector(connection, None) + + query_id = "uuid_1234567890" + offset = 0 + length = 1 + + results_response = results_interface.create_results_connection(query_id, offset, length) + query_results = results_response["data"] + + success = results_response["success"] + assert success + assert query_results + + def test_is_async(self): + connection = { + "host": "hostbla", + "port": "8080", + "path": "/" + } + check_async = async_dummy_connector.Connector(connection, None).is_async + assert check_async + + def test_ping(self): + connection = { + "host": "hostbla", + "port": "8080", + "path": "/" + } + + interface = async_dummy_connector.Connector(connection, None) + ping_result = interface.ping() + assert ping_result == "async ping" diff --git a/tests/stix_transmission/test_bigfix.py b/tests/stix_transmission/test_bigfix.py new file mode 100644 index 000000000..dfa0b6200 --- /dev/null +++ b/tests/stix_transmission/test_bigfix.py @@ -0,0 +1,559 @@ +from stix_transmission.src.modules.bigfix import bigfix_connector +from unittest.mock import patch +import unittest + + +class BigFixMockJsonResponse: + def __init__(self, response_code, object): + self.code = response_code + self.object = object + + def read(self): + return self.object + + +class MockHttpResponse: + def __init__(self, string): + self.string = string + + def decode(self, string): + return self.string + + +class BigFixMockHttpXMLResponse: + def __init__(self, response_code, object): + self.code = response_code + self.object = object + + def read(self): + + return self.object + + +config = { + "auth": { + "user_name": "fake", + "password": "fake" + } +} + +connection = { + "host": "fake", + "port": "fake", + "cert": "fake" +} + + +@patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.__init__', autospec=True) +class TestBigfixConnection(unittest.TestCase): + def test_is_async(self, mock_api_client): + mock_api_client.return_value = None + module = bigfix_connector + + check_async = module.Connector(connection, config).is_async + assert check_async + + @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.ping_box') + def test_ping_endpoint_good_return(self, mock_ping_response, mock_api_client): + mock_api_client.return_value = None + mocked_return_value = MockHttpResponse('/api/clientquery') + mock_ping_response.return_value = BigFixMockHttpXMLResponse(200, mocked_return_value) + + module = bigfix_connector + + ping_response = module.Connector(connection, config).ping() + assert ping_response is not None + assert 'success' in ping_response + assert ping_response['success'] + + @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.ping_box') + def test_ping_endpoint_not_working_return(self, mock_ping_response, mock_api_client): + mock_api_client.return_value = None + mocked_return_value = MockHttpResponse('/missing') + mock_ping_response.return_value = BigFixMockHttpXMLResponse(200, mocked_return_value) + + module = bigfix_connector + + ping_response = module.Connector(connection, config).ping() + assert ping_response is not None + assert 'success' in ping_response + assert ping_response['success'] == False + + @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.ping_box') + def test_ping_endpoint_exception(self, mock_ping_response, mock_api_client): + 
mock_api_client.return_value = None
+        mocked_return_value = MockHttpResponse('/exception')
+        mock_ping_response.return_value = BigFixMockHttpXMLResponse(200, mocked_return_value)
+        mock_ping_response.side_effect = Exception('an error occurred retrieving ping information')
+
+        module = bigfix_connector
+
+        ping_response = module.Connector(connection, config).ping()
+        assert ping_response is not None
+        assert 'success' in ping_response
+        assert ping_response['success'] == False
+        assert ping_response['error'] is not None
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.ping_box')
+    def test_ping_endpoint_bad_return_code(self, mock_ping_response, mock_api_client):
+        mock_api_client.return_value = None
+        mocked_return_value = MockHttpResponse('/exception')
+        mock_ping_response.return_value = BigFixMockHttpXMLResponse(500, mocked_return_value)
+
+        module = bigfix_connector
+
+        ping_response = module.Connector(connection, config).ping()
+        assert ping_response is not None
+        assert 'success' in ping_response
+        assert ping_response['success'] == False
+        assert ping_response['error'] is not None
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.create_search')
+    def test_query_response_found(self, mock_query_response, mock_api_client):
+        mock_api_client.return_value = None
+        # representative BigFix client query XML response; the host and port in
+        # the Resource URL are placeholder values
+        big_fix_return_value = '<BESAPI xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"' \
+                               ' xsi:noNamespaceSchemaLocation="BESAPI.xsd">' \
+                               '<ClientQuery Resource="https://server:52311/api/clientquery/105">' \
+                               '<ID>105</ID></ClientQuery></BESAPI>'
+
+        mocked_return_value = MockHttpResponse(big_fix_return_value)
+        mock_query_response.return_value = BigFixMockHttpXMLResponse(200, mocked_return_value)
+
+        module = bigfix_connector
+
+        query = 'bigfix query text'
+
+        query_response = module.Connector(connection, config).create_query_connection(query)
+
+        assert query_response is not None
+        assert 'success' in query_response
+        assert query_response['success'] == True
+        assert 'search_id' in query_response
+        assert query_response['search_id'] == "105"
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.create_search')
+    def test_query_response_not_found(self, mock_query_response, mock_api_client):
+        mock_api_client.return_value = None
+        big_fix_return_value = 'big fix did not return proper value'
+        mocked_return_value = MockHttpResponse(big_fix_return_value)
+        mock_query_response.return_value = BigFixMockHttpXMLResponse(200, mocked_return_value)
+
+        module = bigfix_connector
+
+        query = 'bigfix query text'
+
+        query_response = module.Connector(connection, config).create_query_connection(query)
+
+        assert query_response is not None
+        assert 'success' in query_response
+        assert query_response['success'] == False
+        assert 'error' in query_response
+        assert 'search_id' in query_response
+        assert query_response['search_id'] == "UNKNOWN"
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.create_search')
+    def test_query_response_exception(self, mock_query_response, mock_api_client):
+        mock_api_client.return_value = None
+        big_fix_return_value = 'big fix did not return proper value'
+        mocked_return_value = MockHttpResponse(big_fix_return_value)
+        mock_query_response.return_value = BigFixMockHttpXMLResponse(200, mocked_return_value)
+        mock_query_response.side_effect = Exception('an error occurred creating search')
+
+        module = bigfix_connector
+
+        query = 'bigfix query text'
+
+        query_response = module.Connector(connection, config).create_query_connection(query)
+
+        assert query_response is not None
+        assert 'success' in query_response
+        assert query_response['success'] == False
+        assert 'error' in query_response
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.create_search')
+    def test_query_response_bad_return_code(self, mock_query_response, mock_api_client):
+        mock_api_client.return_value = None
+        big_fix_return_value = 'big fix did not return proper value'
+        mocked_return_value = MockHttpResponse(big_fix_return_value)
+        mock_query_response.return_value = BigFixMockHttpXMLResponse(500, mocked_return_value)
+        module = bigfix_connector
+
+        query = 'bigfix query text'
+
+        query_response = module.Connector(connection, config).create_query_connection(query)
+
+        assert query_response is not None
+        assert 'success' in query_response
+        assert query_response['success'] == False
+        assert 'error' in query_response
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_completed(self, mock_sync_query_results, mock_status_response, mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('2')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+
+        mocked_search_results_status = '{"reportingAgents": "2", "totalResults": "100"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(200, mocked_search_results_status)
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == True
+        assert 'status' in status_response
+        assert status_response['status'] == "COMPLETED"
+        assert 'progress' in status_response
+        assert status_response['progress'] == 100
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_running(self, mock_sync_query_results, mock_status_response, mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('2')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+
+        mocked_search_results_status = '{"reportingAgents": "0", "totalResults": "100"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(200, mocked_search_results_status)
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == True
+        assert 'status' in status_response
+        assert status_response['status'] == "RUNNING"
+        assert 'progress' in status_response
+        assert status_response['progress'] == 0
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_running_50_complete(self, mock_sync_query_results, mock_status_response, mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('2')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+
+        mocked_search_results_status = '{"reportingAgents": "1", "totalResults": "100"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(200, mocked_search_results_status)
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == True
+        assert 'status' in status_response
+        assert status_response['status'] == "RUNNING"
+        assert 'progress' in status_response
+        assert status_response['progress'] == 50
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_running_75_complete(self, mock_sync_query_results, mock_status_response, mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('10000')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+
+        mocked_search_results_status = '{"reportingAgents": "7500", "totalResults": "100"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(200, mocked_search_results_status)
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == True
+        assert 'status' in status_response
+        assert status_response['status'] == "RUNNING"
+        assert 'progress' in status_response
+        assert status_response['progress'] == 75
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_error(self, mock_sync_query_results, mock_status_response, mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('2')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+
+        mocked_search_results_status = '{"reportingAgents": "2", "totalResults": "0"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(200, mocked_search_results_status)
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == True
+        assert 'status' in status_response
+        assert status_response['status'] == "ERROR"
+        assert 'progress' in status_response
+        assert status_response['progress'] == 100
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_running_bad_client_query(self, mock_sync_query_results, mock_status_response,
+                                                      mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('bad answer')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+
+        mocked_search_results_status = '{"reportingAgents": "2", "totalResults": "0"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(200, mocked_search_results_status)
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == True
+        assert 'status' in status_response
+        assert status_response['status'] == "RUNNING"
+        assert 'progress' in status_response
+        assert status_response['progress'] == 0
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_error_exception_status(self, mock_sync_query_results, mock_status_response,
+                                                    mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('bad answer')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+
+        mocked_search_results_status = '{"reportingAgents": "2", "totalResults": "0"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(200, mocked_search_results_status)
+        mock_status_response.side_effect = Exception('an error getting status')
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == False
+        assert 'error' in status_response
+        assert 'progress' not in status_response
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_error_exception_result(self, mock_sync_query_results, mock_status_response,
+                                                    mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('bad answer')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+        mock_sync_query_results.side_effect = Exception('an error occurred executing sync query')
+        mocked_search_results_status = '{"reportingAgents": "2", "totalResults": "0"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(200, mocked_search_results_status)
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == False
+        assert 'error' in status_response
+        assert 'progress' not in status_response
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_sync_query_results')
+    def test_status_response_error_exception_result_bad_return_code(self, mock_sync_query_results,
+                                                                    mock_status_response,
+                                                                    mock_api_client):
+        mock_api_client.return_value = None
+
+        mocked_sync_query_return_value = MockHttpResponse('2')
+        mock_sync_query_results.return_value = BigFixMockHttpXMLResponse(200, mocked_sync_query_return_value)
+
+        mocked_search_results_status = '{"reportingAgents": "2", "totalResults": "0"}'
+        mock_status_response.return_value = BigFixMockJsonResponse(500, mocked_search_results_status)
+
+        module = bigfix_connector
+
+        search_id = "104"
+
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == False
+        assert 'error' in status_response
+        assert 'progress' not in status_response
+
+    def test_delete_query(self, mock_api_client):
+        mock_api_client.return_value = None
+
+        search_id = "104"
+
+        module = bigfix_connector
+        status_response = module.Connector(connection, config).delete_query_connection(search_id)
+        assert status_response is not None
+        assert 'success' in status_response
+        assert status_response['success'] == True
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    def test_results_response(self, mock_results_response, mock_api_client):
+
+        mock_api_client.return_value = None
+        mocked_return_value = """{
+            "reportingAgents": "100",
+            "totalResults": "201",
+            "results":
+            [
+                {
+                    "computerID":12369754,
+                    "computerName":"fake.computer.name",
+                    "subQueryID":1,"isFailure":false,
+                    "result":".err, d41d8cd98f00b204e9800998ecf8427e, u002f.err","ResponseTime":0
+                },
+                {
+                    "computerID":14821900,
+                    "computerName":"DESKTOP-C30V1JF",
+                    "subQueryID":1,
+                    "isFailure":true,
+                    "result":"12520437.cpx, 0a0feb9eb28bde8cd835716343b03b14, C:\\\\Windows\\\\system32\\\\12520437.cpx","ResponseTime":62000
+                }
+            ]
+        }"""
+        mock_results_response.return_value = BigFixMockJsonResponse(200, mocked_return_value)
+
+        module = bigfix_connector
+
+        search_id = "102"
+        offset = "0"
+        length = "100"
+        results_response = module.Connector(connection, config).create_results_connection(search_id, offset, length)
+
+        assert results_response is not None
+        assert 'success' in results_response
+        assert results_response['success'] == True
+        assert 'data' in results_response
+        assert len(results_response['data']) == 2
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    def test_results_response_exception(self, mock_results_response, mock_api_client):
+
+        mock_api_client.return_value = None
+        mock_results_response.side_effect = Exception('an error getting data')
+
+        module = bigfix_connector
+
+        search_id = "102"
+        offset = "0"
+        length = "100"
+        results_response = module.Connector(connection, config).create_results_connection(search_id, offset, length)
+
+        assert results_response is not None
+        assert 'success' in results_response
+        assert results_response['success'] == False
+        assert 'error' in results_response
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    def test_results_response_bad_return_code(self, mock_results_response, mock_api_client):
+
+        mock_api_client.return_value = None
+        mocked_return_value = """{
+            "reportingAgents": "100",
+            "totalResults": "201",
+            "results":
+            [
+                {
+                    "computerID":12369754,
+                    "computerName":"fake.computer.name",
+                    "subQueryID":1,"isFailure":false,
+                    "result":".err, d41d8cd98f00b204e9800998ecf8427e, u002f.err","ResponseTime":0
+                },
+                {
+                    "computerID":14821900,
+                    "computerName":"DESKTOP-C30V1JF",
+                    "subQueryID":1,
+                    "isFailure":true,
+                    "result":"12520437.cpx, 0a0feb9eb28bde8cd835716343b03b14, C:\\\\Windows\\\\system32\\\\12520437.cpx","ResponseTime":62000
+                }
+            ]
+        }"""
+        mock_results_response.return_value = BigFixMockJsonResponse(500, mocked_return_value)
+
+        module = bigfix_connector
+
+        search_id = "102"
+        offset = "0"
+        length = "100"
+        results_response = module.Connector(connection, config).create_results_connection(search_id, offset, length)
+
+        assert results_response is not None
+        assert 'success' in results_response
+        assert results_response['success'] == False
+        assert 'error' in results_response
+
+    @patch('stix_transmission.src.modules.bigfix.bigfix_api_client.APIClient.get_search_results')
+    def test_results_response_bad_json(self, mock_results_response, mock_api_client):
+
+        mock_api_client.return_value = None
+        mocked_return_value = """{
+            "reportingAgents": "100",
+            "totalResults": "201",
+            "results":
+            [aDAsdadDAS
+                {
+                    "computerID":12369754,
+                    "computerName":"fake.computer.name",
+                    "subQueryID":1,"isFailure":false,
+                    "result":".err, d41d8cd98f00b204e9800998ecf8427e, u002f.err","ResponseTime":0
+                },
+                {
+                    "computerID":14821900,
+                    "computerName":"DESKTOP-C30V1JF",
+                    "subQueryID":1,
+                    "isFailure":true,
+                    "result":"12520437.cpx, 0a0feb9eb28bde8cd835716343b03b14, C:\\\\Windows\\\\system32\\\\12520437.cpx","ResponseTime":62000
+                }
+            ]
+        }"""
+        mock_results_response.return_value = BigFixMockJsonResponse(200, mocked_return_value)
+
+        module = bigfix_connector
+
+        search_id = "102"
+        offset = "0"
+        length = "100"
+        results_response = module.Connector(connection, config).create_results_connection(search_id, offset, length)
+
+        assert results_response is not None
+        assert 'success' in results_response
+        assert results_response['success'] == False
+        assert 'error' in results_response
diff --git a/tests/stix_transmission/test_qradar.py b/tests/stix_transmission/test_qradar.py
new file mode 100644
index 000000000..78b41a197
--- /dev/null
+++ b/tests/stix_transmission/test_qradar.py
@@ -0,0 +1,205 @@
+from stix_transmission.src.modules.qradar import qradar_connector
+from stix_transmission.src.modules.base.base_status_connector import Status
+from unittest.mock import patch
+import unittest
+
+
+class QRadarMockResponse:
+    def __init__(self, response_code, object):
+        self.code = response_code
+        self.object = object
+
+    def read(self):
+        return self.object
+
+
+@patch('stix_transmission.src.modules.qradar.arielapiclient.APIClient.__init__', autospec=True)
+class TestQRadarConnection(unittest.TestCase, object):
+    def test_is_async(self, mock_api_client):
+        mock_api_client.return_value = None
+        module = qradar_connector
+
+        config = {
+            "auth": {
+                "SEC": "bla"
+            }
+        }
+        connection = {
+            "host": "hostbla",
+            "port": "8080",
+            "cert": "cert"
+        }
+        check_async = module.Connector(connection, config).is_async
+
+        assert check_async
+
+    @patch('stix_transmission.src.modules.qradar.arielapiclient.APIClient.ping_box')
+    def test_ping_endpoint(self, mock_ping_response, mock_api_client):
+        mock_api_client.return_value = None
+        mocked_return_value = '["mock", "placeholder"]'
+        mock_ping_response.return_value = QRadarMockResponse(200, mocked_return_value)
+
+        module = qradar_connector
+        config = {
+            "auth": {
+                "SEC": "bla"
+            }
+        }
+        connection = {
+            "host": "hostbla",
+            "port": "8080",
+            "cert": "cert"
+        }
+        ping_response = module.Connector(connection, config).ping()
+
+        assert ping_response is not None
+        assert ping_response['success']
+
+    @patch('stix_transmission.src.modules.qradar.arielapiclient.APIClient.create_search')
+    def test_query_response(self, mock_query_response, mock_api_client):
+        mock_api_client.return_value = None
+        mocked_return_value = '{"search_id": "108cb8b0-0744-4dd9-8e35-ea8311cd6211"}'
+        mock_query_response.return_value = QRadarMockResponse(201, mocked_return_value)
+
+        module = qradar_connector
+        config = {
+            "auth": {
+                "SEC": "bla"
+            }
+        }
+        connection = {
+            "host": "hostbla",
+            "port": "8080",
+            "cert": "cert"
+        }
+        query = '{"query":"SELECT sourceIP from events"}'
+        query_response = module.Connector(connection, config).create_query_connection(query)
+
+        assert query_response is not None
+        assert 'search_id' in query_response
+        assert query_response['search_id'] == "108cb8b0-0744-4dd9-8e35-ea8311cd6211"
+
+    @patch('stix_transmission.src.modules.qradar.arielapiclient.APIClient.get_search', autospec=True)
+    def test_status_response(self, mock_status_response, mock_api_client):
+        mock_api_client.return_value = None
+        mocked_return_value = '{"search_id": "108cb8b0-0744-4dd9-8e35-ea8311cd6211", "status": "COMPLETED", "progress": "100"}'
+        mock_status_response.return_value = QRadarMockResponse(200, mocked_return_value)
+
+        module = qradar_connector
+        config = {
+            "auth": {
+                "SEC": "bla"
+            }
+        }
+        connection = {
+            "host": "hostbla",
+            "port": "8080",
+            "cert": "cert"
+        }
+        search_id = "108cb8b0-0744-4dd9-8e35-ea8311cd6211"
+        status_response = module.Connector(connection, config).create_status_connection(search_id)
+
+        assert status_response['success']
+        assert status_response is not None
+        assert 'status' in status_response
+        assert status_response['status'] == Status.COMPLETED.value
+
+    @patch('stix_transmission.src.modules.qradar.arielapiclient.APIClient.get_search_results', autospec=True)
+    def test_results_response(self, mock_results_response, mock_api_client):
+        mock_api_client.return_value = None
+        mocked_return_value = """{
+            "search_id": "108cb8b0-0744-4dd9-8e35-ea8311cd6211",
+            "events": {
+                "events": [
+                    {
+                        "sourceIP":"9.21.122.81"
+                    },
+                    {
+                        "sourceIP":"9.21.122.81"
+                    }
+                ]
+            }
+        }"""
+        mock_results_response.return_value = QRadarMockResponse(200, mocked_return_value)
+
+        module = qradar_connector
+        config = {
+            "auth": {
+                "SEC": "bla"
+            }
+        }
+        connection = {
+            "host": "hostbla",
+            "port": "8080",
+            "cert": "cert"
+        }
+        search_id = "108cb8b0-0744-4dd9-8e35-ea8311cd6211"
+        offset = 0
+        length = 1
+        results_response = module.Connector(connection, config).create_results_connection(search_id, offset, length)
+
+        assert results_response is not None
+        assert results_response['success']
+        assert 'data' in results_response
+        assert 'events' in results_response['data']
+        assert len(results_response['data']) > 0
+
+    @patch('stix_transmission.src.modules.qradar.arielapiclient.APIClient.create_search', autospec=True)
+    @patch('stix_transmission.src.modules.qradar.arielapiclient.APIClient.get_search', autospec=True)
+    @patch('stix_transmission.src.modules.qradar.arielapiclient.APIClient.get_search_results', autospec=True)
+    def test_query_flow(self, mock_results_response, mock_status_response, mock_query_response, mock_api_client):
+        mock_api_client.return_value = None
+        query_mock = '{"search_id": "108cb8b0-0744-4dd9-8e35-ea8311cd6211"}'
+        status_mock = '{"search_id": "108cb8b0-0744-4dd9-8e35-ea8311cd6211", "status": "COMPLETED", "progress": "100"}'
+        results_mock = """{
+            "search_id": "108cb8b0-0744-4dd9-8e35-ea8311cd6211",
+            "events": {
+                "events": [
+                    {
+                        "sourceIP":"9.21.122.81"
+                    },
+                    {
+                        "sourceIP":"9.21.122.81"
+                    }
+                ]
+            }
+        }"""
+        mock_results_response.return_value = QRadarMockResponse(200, results_mock)
+        mock_status_response.return_value = QRadarMockResponse(200, status_mock)
+        mock_query_response.return_value = QRadarMockResponse(201, query_mock)
+        module = qradar_connector
+
+        config = {
+            "auth": {
+                "SEC": "bla"
+            }
+        }
+        connection = {
+            "host": "hostbla",
+            "port": "8080",
+            "cert": "cert"
+        }
+
+        query = '{"query":"SELECT sourceIP from events"}'
+
+        query_response = module.Connector(connection, config).create_query_connection(query)
+
+        assert 
query_response is not None + assert 'search_id' in query_response + assert query_response['search_id'] == "108cb8b0-0744-4dd9-8e35-ea8311cd6211" + + search_id = "108cb8b0-0744-4dd9-8e35-ea8311cd6211" + status_response = module.Connector(connection, config).create_status_connection(search_id) + + assert status_response is not None + assert 'status' in status_response + assert status_response['status'] == Status.COMPLETED.value + + offset = 0 + length = 1 + results_response = module.Connector(connection, config).create_results_connection(search_id, offset, length) + + assert results_response is not None + assert 'data' in results_response + assert 'events' in results_response['data'] + assert len(results_response['data']) > 0 diff --git a/tests/stix_transmission/test_synchronous_dummy.py b/tests/stix_transmission/test_synchronous_dummy.py new file mode 100644 index 000000000..23dc5b5bc --- /dev/null +++ b/tests/stix_transmission/test_synchronous_dummy.py @@ -0,0 +1,37 @@ +from stix_transmission.src.modules.synchronous_dummy import synchronous_dummy_connector +from stix_transmission.src.modules.synchronous_dummy import synchronous_dummy_results_connector +from stix_transmission.src.modules.synchronous_dummy import synchronous_dummy_ping +import unittest + + +class TestSynchronousDummyConnection(unittest.TestCase, object): + def test_is_async(self): + module = synchronous_dummy_connector + check_async = module.Connector().is_async + assert check_async == False + + def test_ping(self): + ping_interface = synchronous_dummy_ping.SynchronousDummyPing() + ping_result = ping_interface.ping() + + assert ping_result == "synchronous ping" + + def test_dummy_sync_results(self): + results_interface = synchronous_dummy_results_connector.SynchronousDummyResultsConnector() + options = {} + params = { + "config": { + "port": 443, + "ip": "127.0.0.1", + "host": "localhost", + "path": "/async_dummy/query_path" + }, + "query": "placeholder query text" + } + + results_response = results_interface.create_results_connection(params, options) + response_code = results_response["response_code"] + query_results = results_response["query_results"] + + assert response_code == 200 + assert isinstance(query_results, dict)
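+
+
+# A minimal sketch of how these transmission tests might be run locally.
+# Assumptions (not part of the original change): the repository root is on
+# PYTHONPATH and standard unittest discovery applies; the exact invocation
+# may differ per environment:
+#
+#     python -m unittest discover -s tests/stix_transmission -p "test_*.py"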