Prerequisites:
- Install Go
- Install Docker
- Install the Hasura DDN CLI
Clone the repository:
git clone https://github.com/hasura/ndc-elasticsearch.git
cd ndc-elasticsearch
Set up the required environment variables for the Elasticsearch connector:
export ELASTICSEARCH_URL=<Your_Elasticsearch_Instance_URL>
export ELASTICSEARCH_USERNAME=<Your_Elasticsearch_Username>
export ELASTICSEARCH_PASSWORD=<Your_Elasticsearch_Password>
export ELASTICSEARCH_API_KEY=<Your_Elasticsearch_API_Key>
export ELASTICSEARCH_CA_CERT_PATH=<Path_To_Your_CA_Certificate>
export ELASTICSEARCH_INDEX_PATTERN=<Regex_Pattern_For_Indices>
Replace the placeholders with your actual Elasticsearch details.
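As a sketch, the exports might look like this for a local instance (the values below are hypothetical placeholders, not real credentials — substitute your own):

```shell
# Hypothetical values for a local Elasticsearch instance — replace with your own.
export ELASTICSEARCH_URL="https://localhost:9200"
export ELASTICSEARCH_USERNAME="elastic"
export ELASTICSEARCH_PASSWORD="changeme"
# Alternatively, authenticate with an API key instead of username/password:
# export ELASTICSEARCH_API_KEY="<Your_API_Key>"
```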
Compile the executable:
go build
Introspect your Elasticsearch instance and generate the connector configuration:
ndc-elasticsearch update
Run the connector:
ndc-elasticsearch serve
The connector is now available at http://localhost:8080
Check the schema:
curl http://localhost:8080/schema
Use the /query endpoint to run queries.
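As a rough sketch, a request to the /query endpoint might look like the following (the collection name kibana_sample_data_logs and the _id field are assumptions — substitute a collection and fields from your own /schema response):

```shell
# Hypothetical NDC query request; the collection and field names are assumptions.
PAYLOAD='{
  "collection": "kibana_sample_data_logs",
  "query": {
    "fields": {
      "_id": { "type": "column", "column": "_id" }
    }
  },
  "arguments": {},
  "collection_relationships": {}
}'

# POST the request to the running connector.
curl -X POST http://localhost:8080/query \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD"
```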
Instructions for building and running the connector using Docker:
Build the Docker image using the provided Dockerfile:
docker build -t ndc-elasticsearch .
Run the Docker container:
docker run -p 8080:8080 \
  -v <path_to_your_configuration.json>:/etc/connector/configuration.json \
  -e ELASTICSEARCH_URL=<Your_URL> \
  -e ELASTICSEARCH_USERNAME=<Your_Username> \
  -e ELASTICSEARCH_PASSWORD=<Your_Password> \
  -it ndc-elasticsearch
Replace placeholders with your Elasticsearch details.
Use docker-compose.yaml to set up a local dev environment with the Hasura v3-engine and other services:
docker compose up -d
open http://localhost:3000 # Graphiql
open http://localhost:4002 # Jaeger
open http://localhost:9090 # Prometheus
Update /etc/hosts to add the required local hostname mappings.
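For example, an entry might look like the line below (the hostname is an assumption — check docker-compose.yaml for the hostnames the services actually expect):

```
127.0.0.1 local.hasura.dev
```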
Load Sample Data:
Run the script below after starting Elasticsearch and Kibana to load sample data:
python resources/data/load_sample_data.py
Test with GraphiQL:
query MyQuery {
  app_dsKibanaSampleDataLogs {
    id
  }
}
Set x-hasura-role to admin in the request headers for testing:
{"x-hasura-role": "admin"}