diff --git a/README.md b/README.md
index af6a98a..4e14da2 100644
--- a/README.md
+++ b/README.md
@@ -18,7 +18,6 @@ it.
 Once a job is finished, if the result is successful, the server will re-schedule
 it to retrieve new data.
 By default, items fetched by each job will be published using a Redis queue.
-Additionally, they can be written to an Elastic Search index.
 
 ## Usage
 
@@ -26,7 +25,7 @@ Additionally, they can be written to an Elastic Search index.
 ### arthurd
 ```
 usage: arthurd [-c <file>] [-g] [-h <host>] [-p <port>] [-d <database>]
-               [--es-index <index>] [--log-path <path>] [--archive-path <path>]
+               [--log-path <path>] [--archive-path <path>]
                [--no-archive] [--no-daemon] | --help
 
 King Arthur commands his loyal knight Perceval on the quest
@@ -46,7 +45,6 @@ optional arguments:
   -p, --port            set listening TCP port (default: 8080)
   -d, --database        URL database connection (default: 'redis://localhost/8')
   -s, --sync            work in synchronous mode (without workers)
-  --es-index            output ElasticSearch server index
   --log-path            path where logs are stored
   --archive-path        path to archive manager directory
   --no-archive          do not archive fetched raw data
@@ -121,13 +119,12 @@ $ python3 setup.py install
 ## How to run it
 
 The first step is to run a Redis server that will be used for communicating
-Arthur's components. Moreover, an Elastic Search server can be used to store
-the items generated by jobs. Please refer to their documentation to know how to
-install and run them both.
+Arthur's components. Please refer to its documentation to know how to
+install and run it.
 
 To run Arthur server:
 ```
-$ arthurd -g -d redis://localhost/8 --es-index http://localhost:9200/items --log-path /tmp/logs/arthud --no-archive
+$ arthurd -g -d redis://localhost/8 --log-path /tmp/logs/arthud --no-archive
 ```
 
 To run a worker:
@@ -186,9 +183,6 @@ Then, send this JSON stream to the server calling `add` method.
 $ curl -H "Content-Type: application/json" --data @tasks.json http://127.0.0.1:8080/add
 ```
 
-For this example, items will be stored in the `items` index on the
-Elastic Search server (http://localhost:9200/items).
-
 ## Listing tasks
 
 The list of tasks currently scheduled can be obtained using the method `tasks`.
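With `--es-index` gone from the README, the Redis queue is documented as the only default output for fetched items. As a rough sketch of how a downstream process could still pick those items up, the snippet below pops entries from a Redis list; the queue name (`items`) and the JSON serialization are assumptions made here for illustration, not something this patch defines, so check `arthur.common` for the actual values.

```
#!/usr/bin/env python3
# Hypothetical downstream consumer for the items arthurd publishes to Redis.
# Assumptions (not part of this patch): the queue is a Redis list named
# 'items' and each entry is a JSON-encoded Perceval item.

import json

import redis


def consume_items(url='redis://localhost/8', queue='items'):
    """Yield items popped from the given Redis list, blocking between entries."""
    conn = redis.StrictRedis.from_url(url)
    while True:
        # blpop blocks until an entry arrives and returns (queue_name, payload)
        _, payload = conn.blpop(queue)
        yield json.loads(payload)


if __name__ == '__main__':
    for item in consume_items():
        # 'uuid' and 'backend_name' are standard Perceval item fields
        print(item.get('uuid'), item.get('backend_name'))
```

Running a consumer like this alongside the workers keeps item processing decoupled from the server, which is the model the README now describes.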
diff --git a/bin/arthurd b/bin/arthurd
index 1865b2b..90461ee 100644
--- a/bin/arthurd
+++ b/bin/arthurd
@@ -33,7 +33,6 @@ import redis
 
 from arthur.common import ARCHIVES_DEFAULT_PATH
 from arthur.server import ArthurServer
-from arthur.writers import ElasticItemsWriter
 
 
 ARTHURD_USAGE_MSG = \
@@ -59,7 +58,6 @@ optional arguments:
   -p, --port            set listening TCP port (default: 8080)
   -d, --database        URL database connection (default: 'redis://localhost/8')
   -s, --sync            work in synchronous mode (without workers)
-  --es-index            output ElasticSearch server index
   --log-path            path where logs are stored
   --archive-path        path to archive manager directory
   --no-archive          do not archive fetched raw data
@@ -81,17 +79,11 @@ def main():
 
     conn = connect_to_redis(args.database)
 
-    if args.es_index:
-        writer = ElasticItemsWriter(args.es_index)
-    else:
-        writer = None
-
     # Set archive manager directory
     base_archive_path = None if args.no_archive else args.archive_path
 
     app = ArthurServer(conn, base_archive_path,
-                       async_mode=args.sync_mode,
-                       writer=writer)
+                       async_mode=args.sync_mode)
 
     run_daemon = not args.no_daemon
 
@@ -196,8 +188,6 @@ def create_common_arguments_parser(defaults):
     parser.add_argument('-s', '--sync', dest='sync_mode',
                         action='store_false',
                         help=argparse.SUPPRESS)
-    parser.add_argument('--es-index', dest='es_index',
-                        help=argparse.SUPPRESS)
    parser.add_argument('--log-path', dest='log_path',
                         default=os.path.expanduser('~/.arthur/logs/'),
                         help=argparse.SUPPRESS)
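Since `ElasticItemsWriter` is no longer wired into `arthurd`, anything that still wants an Elasticsearch copy of the items has to index them outside the server. The sketch below shows one possible way to do that with the official `elasticsearch` Python client; the server URL, the `items` index name, and the use of the item `uuid` as document id are assumptions for illustration, not behaviour defined by this patch.

```
# Hypothetical external indexer standing in for the removed --es-index output.
# Assumptions: Elasticsearch listens on http://localhost:9200, documents go to
# an 'items' index, and each item dict carries a 'uuid' usable as document id.

from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk


def index_items(items, es_url='http://localhost:9200', index='items'):
    """Bulk-index already-decoded item dicts; returns (ok_count, errors)."""
    client = Elasticsearch(es_url)
    actions = ({'_index': index,
                '_id': item.get('uuid'),
                '_source': item}
               for item in items)
    return bulk(client, actions)
```

Paired with a queue consumer like the one sketched above, this keeps indexing as a separate process instead of a responsibility of `ArthurServer`.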