Merge pull request #7 from allora-network/diego/ora-1488-reputer-fix-namings-and-refactor

Change truth to data-provider
xmariachi authored May 24, 2024
2 parents a1aeab6 + 0e1443b commit b472298
Showing 8 changed files with 49 additions and 34 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/build_push_ecr.yml
@@ -34,8 +34,8 @@ jobs:
id: login-ecr
uses: aws-actions/amazon-ecr-login@v1

- - name: Truth Build, tag, and push image to Amazon ECR
- id: build-push-image-truth
+ - name: Data Provider Build, tag, and push image to Amazon ECR
+ id: build-push-image-data-provider
env:
ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
ECR_REPOSITORY: ${{github.event.repository.name}}-truth
@@ -52,7 +52,7 @@ jobs:
fi
# Build a docker container and push it to ECR so that it can be deployed to ECS.
- docker build --pull -f Dockerfile_truth \
+ docker build --pull -f Dockerfile_data_provider \
-t $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG .
docker push $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG
2 changes: 1 addition & 1 deletion .gitignore
@@ -6,5 +6,5 @@ logs/*
.env
worker-data
head-data
- truth-data
+ data-provider-data
prices.db
File renamed without changes.
27 changes: 22 additions & 5 deletions README.md
@@ -1,6 +1,6 @@
# Coin Price Reputer

- An example application: a node to repute and provide ground truth for ETH predictions.
+ An example application: a node to repute and provide reputation for ETH predictions.

This is an example setup for running an Allora Network reputer node that provides ground truth and reputation. The Allora Network node defers requests to a separate container, which is responsible for providing the ground truth.
It also provides a means of updating the internal database of the ground truth provider.
@@ -95,10 +95,10 @@ The head node has the only open port, and responds to requests in port 6000.
Example request:
```
curl --location 'http://localhost:6000/api/v1/functions/execute' --header 'Accept: application/json, text/plain, */*' --header 'Content-Type: application/json;charset=UTF-8' --data '{
- "function_id": "bafybeigpiwl3o73zvvl6dxdqu7zqcub5mhg65jiky2xqb4rdhfmikswzqm",
- "method": "allora-inference-function.wasm",
+ "function_id": "bafybeihrfb7zic7ffb3vr7xelzve2pi75wizzfz3j3yslzre62xh3tef2u",
+ "method": "loss-calculation-eth.wasm",
"parameters": null,
- "topic": "1",
+ "topic": "1/reputer",
"config": {
"env_vars": [
{
@@ -107,9 +107,18 @@ curl --location 'http://localhost:6000/api/v1/functions/execute' --header 'Accep
},
{
"name": "ALLORA_ARG_PARAMS",
- "value": "1711064725"
+ "value": "1712337671"
} ,
+ {
+ "name":"ALLORA_BLOCK_HEIGHT_CURRENT",
+ "value":"200"
+ },
+ {
+ "name":"ALLORA_BLOCK_HEIGHT_EVAL",
+ "value":"100"
+ }
],
"stdin": "{\"networkInference\":\"46071353120000000000\",\"inferrerInferences\":[{\"node\":\"allo1inf1\",\"value\":\"46071353100000000000\"},{\"node\":\"allo1inf2\",\"value\":\"46071353220000000000\"},{\"node\":\"allo1inf0000\",\"value\":\"46071353121000000000\"}],\"forecasterInferences\":[{\"node\":\"allo1inf1\",\"value\":\"46071353110000000000\"},{\"node\":\"allo1inf2\",\"value\":\"46071353320000000000\"},{\"node\":\"allo1inf1111\",\"value\":\"4607135311000000000\"}],\"naiveNetworkInference\":\"46071353100000000000\",\"oneOutNetworkInferences\":[{\"node\":\"allo1inf1\",\"value\":\"46071353000000000000\"},{\"node\":\"allo1inf2\",\"value\":\"46071353124000000000\"},{\"node\":\"allo1inf0000\",\"value\":\"46071353160000000000\"}],\"oneInNetworkInferences\":[{\"node\":\"allo1inf1\",\"value\":\"46071353050000000000\"},{\"node\":\"allo1inf2\",\"value\":\"46071353125000000000\"},{\"node\":\"allo1inf1111\",\"value\":\"46071353080000000000\"}]}",
"number_of_nodes": -1,
"timeout" : 2
}
@@ -133,6 +142,14 @@ To only test the ground truth model, you can simply follow these steps:
```


+ ## Env vars
+
+ ALLORA_BLOCK_HEIGHT_CURRENT: Current block being reputed
+ ALLORA_BLOCK_HEIGHT_EVAL: Previous block to evaluate (to build EMA with)
+ LOSS_FUNCTION_ALLOWS_NEGATIVE: whether the loss function allows negative values or not. Default: true.
+ ALLORA_ARG_PARAMS: The timestamp at which ground truth must be obtained.


## Connecting to the Allora network
To connect to the Allora network, both the head and the worker need to register with it. More details can be found in the [allora-inference-base](https://github.com/allora-network/allora-inference-base) repo.
The following optional flags are used in the `command:` section of the `docker-compose.yml` file to define the connectivity with the Allora network.
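
As a quick illustration of how the env vars documented above fit together, the sketch below queries the data provider's `/gt` and `/losses` routes (defined in `app.py` and used by `main.py` further down in this diff) for the ground truth at `ALLORA_ARG_PARAMS` and the previous losses at `ALLORA_BLOCK_HEIGHT_EVAL`. The base URL, topic id, and fallback values are assumptions for local testing, not something this PR defines.

```
# Illustrative sketch only: maps the documented env vars onto the data
# provider's /gt and /losses routes. The base URL, topic id, and fallback
# values below are assumptions for local testing.
import os
import requests

base_url = os.environ.get("DATA_PROVIDER_API_ADDRESS", "http://localhost:8000")
topic = "1"  # example topic id
timestamp = os.environ.get("ALLORA_ARG_PARAMS", "1712337671")          # ground-truth timestamp
block_height_eval = os.environ.get("ALLORA_BLOCK_HEIGHT_EVAL", "100")  # previous block, used for the EMA

# Ground truth for ETHUSD at (or nearest to) the requested timestamp
ground_truth = requests.get(f"{base_url}/gt/ETHUSD/{timestamp}").text

# Previously reported network losses for the topic at the evaluation block height
previous_losses = requests.get(f"{base_url}/losses/{topic}/{block_height_eval}").text

print(ground_truth)
print(previous_losses)
```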
22 changes: 10 additions & 12 deletions app.py
@@ -9,11 +9,11 @@


ETHUSD_TOKEN = "ETHUSD"
- API_PORT = int(os.environ.get('TRUTH_API_PORT', 5000))
+ API_PORT = int(os.environ.get('API_PORT', 5000))
ALLORA_VALIDATOR_API_URL = str(os.environ.get('ALLORA_VALIDATOR_API_URL','http://localhost:1317/emissions/v1/network_loss/'))
app = Flask(__name__)

- TRUTH_DATABASE_PATH = os.environ.get('TRUTH_DATABASE_PATH', 'prices.db')
+ DATABASE_PATH = os.environ.get('DATABASE_PATH', 'prices.db')
GEN_TEST_DATA = bool(os.environ.get('GEN_TEST_DATA', False))
WORKER_ADDRESS_TEST_1 = str(os.environ.get('WORKER_ADDRESS_TEST_1', "allo1tvh6nv02vq6m4mevsa9wkscw53yxvfn7xt8rud"))

@@ -31,7 +31,7 @@ def fetch_prices(url):
return response.json()

def check_create_table():
- conn = sqlite3.connect(TRUTH_DATABASE_PATH)
+ conn = sqlite3.connect(DATABASE_PATH)
cursor = conn.cursor()
cursor.execute('''CREATE TABLE IF NOT EXISTS prices
(timestamp INTEGER PRIMARY KEY, token TEXT, price REAL)''')
@@ -61,7 +61,7 @@ def update_price(token_name, token_from, token_to):
token = token_name.lower()

# Save price into database
- conn = sqlite3.connect(TRUTH_DATABASE_PATH)
+ conn = sqlite3.connect(DATABASE_PATH)
cursor = conn.cursor()
cursor.execute("INSERT INTO prices (timestamp, token, price) VALUES (?, ?, ?)", (timestamp, token, price))
cursor.close()
@@ -80,7 +80,7 @@ def update_price(token_name, token_from, token_to):

@app.route('/gt/<token>/<timestamp>')
def get_price(token, timestamp):
- conn = sqlite3.connect(TRUTH_DATABASE_PATH)
+ conn = sqlite3.connect(DATABASE_PATH)
cursor = conn.cursor()
cursor.execute("SELECT timestamp, price FROM prices WHERE token=? ORDER BY ABS(timestamp - ?) LIMIT 1", (token.lower(), timestamp,))
result = cursor.fetchone()
@@ -95,7 +95,7 @@ def init_price_token(token_name, token_from, token_to):
try:
check_create_table()
# Check if there is any existing data for the specified token
- conn = sqlite3.connect(TRUTH_DATABASE_PATH)
+ conn = sqlite3.connect(DATABASE_PATH)
cursor = conn.cursor()
cursor.execute("SELECT COUNT(*) FROM prices WHERE token=? ", (token_name.lower(),))
count = cursor.fetchone()[0]
@@ -119,7 +119,7 @@ def init_price_token(token_name, token_from, token_to):
historical_data = response.json()['prices']

# Parse and insert historical data into the database
- conn = sqlite3.connect(TRUTH_DATABASE_PATH)
+ conn = sqlite3.connect(DATABASE_PATH)
cursor = conn.cursor()
for data_point in historical_data:
timestamp = int(data_point[0] / 1000) # Convert milliseconds to seconds
@@ -150,13 +150,12 @@ def get_test_losses_data():
def get_losses_data(topic, blockHeight):
try:
url = ALLORA_VALIDATOR_API_URL + topic + "/" + blockHeight
- print(f"url: {url}")
response = requests.get(url)
+ print(f"url: {url} , response: {str(response)}")
response.raise_for_status() # Raise exception if request fails
return response.json(), HTTP_RESPONSE_CODE_200
except Exception as e:
- print(f'Failed to get data for {topic} token: {str(e)}')
- print(f'Not providing last losses for {topic} token')
+ print(f'Failed to get data for {topic} topic for url: {url}: {str(e)}')
return '{}', HTTP_RESPONSE_CODE_500


@@ -181,8 +180,7 @@ def get_losses(topic, blockHeight):
return jsonify(losses_data_json), HTTP_RESPONSE_CODE_200

except Exception as e:
- print(f'Failed to get data for {topic} token: {str(e)}')
- print(f'Not providing last losses for {topic} token')
+ print(f'Failed to get data for {topic} topic: {str(e)}')
return '{}', HTTP_RESPONSE_CODE_500

if __name__ == '__main__':
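
A note on the `/gt` route above: `ORDER BY ABS(timestamp - ?) LIMIT 1` returns the stored price whose timestamp is closest to the requested one. The following standalone sketch reproduces that lookup with an in-memory database and made-up rows; it is illustrative only and not part of this commit.

```
# Standalone sketch of the nearest-timestamp lookup behind the /gt route.
# Uses an in-memory SQLite database and made-up prices; not part of this commit.
import sqlite3

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute('''CREATE TABLE IF NOT EXISTS prices
                  (timestamp INTEGER PRIMARY KEY, token TEXT, price REAL)''')
cursor.executemany(
    "INSERT INTO prices (timestamp, token, price) VALUES (?, ?, ?)",
    [(1712337600, "ethusd", 3305.1), (1712337660, "ethusd", 3306.4), (1712337720, "ethusd", 3304.9)],
)

requested = 1712337671
# The closest stored timestamp wins: 1712337660 is 11s away, 1712337720 is 49s away.
cursor.execute(
    "SELECT timestamp, price FROM prices WHERE token=? ORDER BY ABS(timestamp - ?) LIMIT 1",
    ("ethusd", requested),
)
print(cursor.fetchone())  # -> (1712337660, 3306.4)
conn.close()
```

If two stored timestamps are equally close, SQLite returns whichever row it scans first, so the route's answer in that edge case is effectively arbitrary.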
18 changes: 9 additions & 9 deletions docker-compose.yml
@@ -1,30 +1,30 @@
version: '3'

services:
- truth:
+ data-provider:
build:
context: .
- dockerfile: Dockerfile_truth
+ dockerfile: Dockerfile_data_provider
command: python -u /app/app.py
environment:
- - TRUTH_DATABASE_PATH=/app/data/prices.db
- - TRUTH_API_PORT=8000
+ - DATABASE_PATH=/app/data/prices.db
+ - API_PORT=8000
# - ALLORA_VALIDATOR_API_URL=https://allora-api.testnet.allora.network/emissions/v1/network_loss/
- ALLORA_VALIDATOR_API_URL=https://localhost:1317/emissions/v1/network_loss/
# For local testing via URL without workers/heads up:
ports:
- "8000:8000"
volumes:
- - ./truth-data:/app/data
+ - ./data-provider-data:/app/data
networks:
eth-model-local:
aliases:
- - truth
+ - data-provider
ipv4_address: 172.20.0.4

worker:
environment:
- - TRUTH_API_ADDRESS=http://truth:8000
+ - DATA_PROVIDER_API_ADDRESS=http://data-provider:8000
- HOME=/data
- LOG_FILE=/tmp/app.log
build:
@@ -50,7 +50,7 @@ services:
- ./worker-data:/data
working_dir: /data
depends_on:
- - truth
+ - data-provider
- head
networks:
eth-model-local:
@@ -97,4 +97,4 @@ networks:
volumes:
worker-data:
head-data:
- truth-data:
+ data-provider-data:
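
With the service, network alias, and volume renamed, the worker reaches the provider via the `data-provider` alias, while local testing can go through the published port 8000. The smoke-test sketch below assumes the compose stack is up; the `/update` route is the one referenced by `update_app.py` further down, and the fallback URL is an assumption.

```
# Smoke-test sketch for the renamed data-provider service; assumes the compose
# stack is running. Inside the worker container, DATA_PROVIDER_API_ADDRESS
# resolves via the "data-provider" alias; from the host, the published port
# 8000 works instead.
import os
import requests

base_url = os.environ.get("DATA_PROVIDER_API_ADDRESS", "http://localhost:8000")

# /update refreshes the provider's price database (route referenced by update_app.py).
resp = requests.get(f"{base_url}/update", timeout=10)
print(f"{base_url}/update -> HTTP {resp.status_code}")
```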
6 changes: 3 additions & 3 deletions main.py
@@ -15,17 +15,17 @@ def config_logging(log_file):
logging.basicConfig(filename=log_file, level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

ETHUSD_TOKEN = "ETHUSD"
- TRUTH_ADDRESS = os.environ['TRUTH_API_ADDRESS']
+ DATA_PROVIDER_API_ADDRESS = os.environ['DATA_PROVIDER_API_ADDRESS']
LOG_FILE = os.environ.get('LOG_FILE', '/tmp/app.log')
config_logging(LOG_FILE)

def get_ground_truth(token_name, timestamp):
- url = f"{TRUTH_ADDRESS}/gt/{token_name}/{timestamp}"
+ url = f"{DATA_PROVIDER_API_ADDRESS}/gt/{token_name}/{timestamp}"
response = requests.get(url)
return response.text

def get_previous_losses(topic, blockHeight):
- url = f"{TRUTH_ADDRESS}/losses/{topic}/{blockHeight}"
+ url = f"{DATA_PROVIDER_API_ADDRESS}/losses/{topic}/{blockHeight}"
response = requests.get(url)
return response.text

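
A short usage sketch of the renamed helpers above. It assumes a data provider is reachable, that the script runs from the repository root, and that `DATA_PROVIDER_API_ADDRESS` is set before `main.py` is imported; the token, topic, and block height values are illustrative only.

```
# Illustrative driver for the helpers above; the values are examples, not from this PR.
import os

# main.py reads DATA_PROVIDER_API_ADDRESS at import time, so set it first.
os.environ.setdefault("DATA_PROVIDER_API_ADDRESS", "http://localhost:8000")

from main import get_ground_truth, get_previous_losses

# Ground truth for ETHUSD near the given timestamp (data provider /gt route)
print(get_ground_truth("ETHUSD", "1712337671"))

# Previously reported losses for topic 1 at block height 100 (/losses route)
print(get_previous_losses("1", "100"))
```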
2 changes: 1 addition & 1 deletion update_app.py
@@ -1,7 +1,7 @@
import os
import requests

- inference_address = os.environ['TRUTH_API_ADDRESS']
+ inference_address = os.environ['DATA_PROVIDER_API_ADDRESS']
url = f"{inference_address}/update"

response = requests.get(url)
