Merge pull request #65 from beshiniii/master
Update readme file
sajithaliyanage authored Mar 20, 2021
2 parents 9e76c1d + 26d19d6 commit 4718e09
Showing 1 changed file with 14 additions and 6 deletions.
20 changes: 14 additions & 6 deletions README.md
@@ -17,9 +17,21 @@ CrawlerX includes the following runtimes to do the crawling jobs for you.
- **MongoDB Server** - for storing crawled data
- **ElasticSearch** - for the job/query searching mechanism

### Setup
### Setup on a Container-based Environment

Please follow the steps below to set up CrawlerX in your VM-based environment.
#### Docker Compose

Please follow the steps below to set up CrawlerX in a container environment.

```sh
docker-compose up --build
```

Open http://localhost:8080 to view the CrawlerX web UI in the browser.
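
Once the stack is up, the standard Docker Compose commands can be used to inspect or stop it. A minimal sketch, assuming the stack was started with the command above (service names depend on the repository's docker-compose.yml):

```sh
# List the containers started by docker-compose
docker-compose ps

# Follow the logs of all services while they start up
docker-compose logs -f

# Stop and remove the containers when finished
docker-compose down
```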

### Setup on a VM-based Environment

#### Please follow the steps below to set up CrawlerX in your VM-based environment.

Start the frontend:
```sh
...
```

@@ -69,10 +81,6 @@
```sh
$ docker pull elasticsearch:7.8.1
$ docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" elasticsearch:7.8.1
```
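
As a quick sanity check before starting CrawlerX, you can confirm that the containers from the commands above are running and that ElasticSearch answers on its default port (assuming the ports used above):

```sh
# ElasticSearch responds with cluster information on its REST port
curl http://localhost:9200

# Confirm the containers started above are running
docker ps
```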

#### Docker + Kubernetes

The above server configurations need to be wrapped in Docker images, and k8s deployment files need to be written for them.
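
A rough sketch of that workflow, with a hypothetical image name and manifest path used purely for illustration:

```sh
# Build and publish an image for one of the servers (image name is hypothetical)
docker build -t crawlerx/backend:latest .
docker push crawlerx/backend:latest

# Apply a matching Kubernetes deployment manifest (path is hypothetical)
kubectl apply -f k8s/backend-deployment.yaml
```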

### Todos

- Improve the VueJS-based frontend with more functionality.

