Commit

[readme] Remove writer information
valeriocos committed Mar 23, 2018
1 parent c935dab commit 7d93d9f
Showing 2 changed files with 5 additions and 21 deletions.
14 changes: 4 additions & 10 deletions README.md
@@ -18,15 +18,14 @@ it. Once a job is finished, if the result is successful, the server will
re-schedule it to retrieve new data.

By default, items fetched by each job will be published using a Redis queue.
Additionally, they can be written to an Elastic Search index.
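As an illustrative aside (not part of this commit), here is a minimal sketch of how a consumer could read those published items straight from Redis. The queue key (`items`) and the item fields used below are assumptions, so check `arthur.common` in the installed version for the real names:

```
#!/usr/bin/env python3
# Sketch only: drain items that arthur jobs publish to the Redis queue.
# The queue key ('items') and the item fields are assumptions; verify
# them against arthur.common before relying on this.
import json

import redis

conn = redis.StrictRedis.from_url('redis://localhost/8')

while True:
    # BLPOP blocks until a job pushes a new item onto the list
    _, raw_item = conn.blpop('items')
    item = json.loads(raw_item.decode('utf-8'))
    print(item.get('backend_name'), item.get('uuid'))
```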


## Usage

### arthurd
```
usage: arthurd [-c <file>] [-g] [-h <host>] [-p <port>] [-d <database>]
               [--es-index <index>] [--log-path <path>] [--archive-path <cpath>]
               [--log-path <path>] [--archive-path <cpath>]
               [--no-archive] [--no-daemon] | --help
King Arthur commands his loyal knight Perceval on the quest
@@ -46,7 +45,6 @@ optional arguments:
-p, --port set listening TCP port (default: 8080)
-d, --database URL database connection (default: 'redis://localhost/8')
-s, --sync work in synchronous mode (without workers)
--es-index output ElasticSearch server index
--log-path path where logs are stored
--archive-path path to archive manager directory
--no-archive do not archive fetched raw data
@@ -121,13 +119,12 @@ $ python3 setup.py install
## How to run it

The first step is to run a Redis server that will be used for communicating
Arthur's components. Moreover, an Elastic Search server can be used to store
the items generated by jobs. Please refer to their documentation to know how to
install and run them both.
Arthur's components. Please refer to its documentation to know how to
install and run it.
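A quick way to get a local Redis instance up for testing (assuming a default installation; it runs in the foreground):

```
$ redis-server
```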

To run Arthur server:
```
$ arthurd -g -d redis://localhost/8 --es-index http://localhost:9200/items --log-path /tmp/logs/arthud --no-archive
$ arthurd -g -d redis://localhost/8 --log-path /tmp/logs/arthud --no-archive
```

To run a worker:
@@ -186,9 +183,6 @@ Then, send this JSON stream to the server calling `add` method.
$ curl -H "Content-Type: application/json" --data @tasks.json http://127.0.0.1:8080/add
```
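For reference, `tasks.json` could look roughly like the sketch below; the field names follow the project's upstream examples and may differ between versions, so treat them as assumptions rather than the canonical schema:

```
{
    "tasks": [
        {
            "task_id": "arthur.git",
            "backend": "git",
            "backend_args": {
                "uri": "https://github.com/grimoirelab/arthur.git",
                "gitpath": "/tmp/git/arthur.git"
            },
            "category": "commit",
            "scheduler": {
                "delay": 10
            }
        }
    ]
}
```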

For this example, items will be stored in the `items` index on the
Elastic Search server (http://localhost:9200/items).

## Listing tasks

The list of tasks currently scheduled can be obtained using the method `tasks`.
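A sketch of that call, assuming the server started earlier is listening on the default host and port:

```
$ curl http://127.0.0.1:8080/tasks
```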
12 changes: 1 addition & 11 deletions bin/arthurd
@@ -33,7 +33,6 @@ import redis

from arthur.common import ARCHIVES_DEFAULT_PATH
from arthur.server import ArthurServer
from arthur.writers import ElasticItemsWriter


ARTHURD_USAGE_MSG = \
@@ -59,7 +58,6 @@ optional arguments:
-p, --port set listening TCP port (default: 8080)
-d, --database URL database connection (default: 'redis://localhost/8')
-s, --sync work in synchronous mode (without workers)
--es-index output ElasticSearch server index
--log-path path where logs are stored
--archive-path path to archive manager directory
--no-archive do not archive fetched raw data
@@ -81,17 +79,11 @@ def main():

conn = connect_to_redis(args.database)

if args.es_index:
    writer = ElasticItemsWriter(args.es_index)
else:
    writer = None

# Set archive manager directory
base_archive_path = None if args.no_archive else args.archive_path

app = ArthurServer(conn, base_archive_path,
                   async_mode=args.sync_mode,
                   writer=writer)
                   async_mode=args.sync_mode)

run_daemon = not args.no_daemon

@@ -196,8 +188,6 @@ def create_common_arguments_parser(defaults):
parser.add_argument('-s', '--sync', dest='sync_mode',
                    action='store_false',
                    help=argparse.SUPPRESS)
parser.add_argument('--es-index', dest='es_index',
                    help=argparse.SUPPRESS)
parser.add_argument('--log-path', dest='log_path',
                    default=os.path.expanduser('~/.arthur/logs/'),
                    help=argparse.SUPPRESS)
