Merge pull request #322 from jembi/add-openfn-to-platform
Add the openfn package to platform. Also fixed CU-86bzwyh69
  • Loading branch information
drizzentic authored Dec 3, 2024
2 parents 8de658c + 5450bee commit 9a5a967
Showing 20 changed files with 6,304 additions and 2 deletions.
1 change: 1 addition & 0 deletions config.yaml
Original file line number Diff line number Diff line change
@@ -27,6 +27,7 @@ packages:
- database-postgres
- reprocess-mediator
- fhir-ig-importer
- openfn
- datalake

profiles:
40 changes: 40 additions & 0 deletions documentation/packages/openfn/README.md
@@ -0,0 +1,40 @@
# Package Documentation

## Introduction

Welcome to the documentation for the `openfn` package! This package is designed to provide a platform for seamless integration and automation of data workflows. Whether you are a developer, data analyst, or data scientist, this package will help you streamline your data processing tasks.

## Usage

Once you have added the `openfn` package, you can start using it in your projects. Initialise the package with:

`instant package init -n openfn --dev`

## Demo

To get a hands-on experience with the `openfn` package, try out the demo. The demo showcases the package's capabilities and provides a sample project used to export data from CDR to NDR with transformations. It utilizes a Kafka queue and a custom adapter to map Bundles to be compliant with the FHIR Implementation Guide (IG).

### Getting Started

To access the demo, follow these steps:

1. Visit the [OpenFn Demo](http://localhost:4000) website.
2. Use the following demo credentials:

```
username: [email protected]
password: instant101
```

3. Configure the Kafka trigger:
   - Change the trigger type from webhook to “Kafka Consumer”.
   - Enter the configuration details (see the [docs](https://docs.google.com/document/d/1cefHnp6IS6zvFwqQs8EsMQo5D4npsoE-S33R4OBpQjI/edit?usp=sharing)):
     - Kafka topic: {whichever you want to use} (e.g., “cdr-ndr”)
     - Hosts: {cdr host name}
     - Initial offset reset policy: earliest
     - Connection timeout: 30 (the default, but it can be adjusted)
   - Warning: check “Disable this trigger” to ensure that consumption doesn’t start until you are ready to run the workflow. Once unchecked, the trigger immediately starts consuming messages off the topic.
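
To exercise the trigger end to end, you can publish a test Bundle onto the topic yourself. The snippet below is a hypothetical sketch (not part of the package) using the `kafkajs` client; the topic name `cdr-ndr` and broker address `localhost:9092` are assumptions based on the demo values above.

```javascript
// Wrap a FHIR resource in a minimal transaction Bundle, since the demo
// workflow expects Bundles on the topic.
function buildBundleMessage(resource) {
  const bundle = {
    resourceType: "Bundle",
    type: "transaction",
    entry: [
      {
        resource,
        request: { method: "POST", url: resource.resourceType },
      },
    ],
  };
  return { value: JSON.stringify(bundle) };
}

// Publish one Bundle to the topic the OpenFn Kafka trigger consumes.
// Broker and topic are illustrative assumptions; adjust for your setup.
async function publish(resource) {
  const { Kafka } = require("kafkajs"); // requires `npm install kafkajs`
  const kafka = new Kafka({ clientId: "cdr-producer", brokers: ["localhost:9092"] });
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({ topic: "cdr-ndr", messages: [buildBundleMessage(resource)] });
  await producer.disconnect();
}
```

With the trigger enabled, a call such as `publish({ resourceType: "Patient", id: "example" })` should cause a new run to appear in the workflow's history.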

### Documentation

For more detailed information on the `openfn` package and its functionalities, please refer to the [official documentation](https://github.com/openfn/docs). The documentation covers various topics, including installation instructions, usage guidelines, and advanced features.
184 changes: 184 additions & 0 deletions documentation/packages/openfn/environment-variables.md
@@ -0,0 +1,184 @@
## Environment Variables

<table>
<thead>
<tr>
<th>Variable Name</th>
<th>Description</th>
<th>Type</th>
<th>Relevance</th>
<th>Required</th>
<th>Default</th>
</tr>
</thead>
<tbody>
<tr>
<td>DATABASE_URL</td>
<td>The URL of the PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>DISABLE_DB_SSL</td>
<td>Whether to disable SSL for the database connection</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>IS_RESETTABLE_DEMO</td>
<td>Whether the application is running in resettable demo mode</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>LISTEN_ADDRESS</td>
<td>The IP address to listen on</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>LOG_LEVEL</td>
<td>The log level for the application</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>ORIGINS</td>
<td>The allowed origins for CORS</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>PRIMARY_ENCRYPTION_KEY</td>
<td>The primary encryption key</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>SECRET_KEY_BASE</td>
<td>The secret key base</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>WORKER_RUNS_PRIVATE_KEY</td>
<td>The private key for worker runs</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>POSTGRES_USER</td>
<td>The username for the PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>POSTGRES_SERVICE</td>
<td>The service name for the PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>POSTGRES_DATABASE</td>
<td>The name of the PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>POSTGRES_PASSWORD</td>
<td>The password for the PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>POSTGRES_PORT</td>
<td>The port number for the PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>OpenFn_POSTGRESQL_DB</td>
<td>The name of the OpenFn PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>OpenFn_POSTGRESQL_USERNAME</td>
<td>The username for the OpenFn PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>OpenFn_POSTGRESQL_PASSWORD</td>
<td>The password for the OpenFn PostgreSQL database</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>WORKER_LIGHTNING_PUBLIC_KEY</td>
<td>The public key for the worker lightning</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>WORKER_SECRET</td>
<td>The secret key for the worker</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>OpenFn_IMAGE</td>
<td>The image name for OpenFn</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td>OpenFn_WORKER_IMAGE</td>
<td>The image name for OpenFn worker</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
</tbody>
</table>
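
For reference, a hypothetical `.env` fragment wiring up a subset of these variables (using the `OPENFN_`-prefixed names consumed by the compose file below) might look like the following. Every value shown is an illustrative assumption, not a default shipped with the package:

```sh
# Illustrative values only -- adjust for your deployment.
OPENFN_DATABASE_URL=postgres://openfn:instant101@postgres-1:5432/openfn
OPENFN_DISABLE_DB_SSL=true
OPENFN_LISTEN_ADDRESS=0.0.0.0
OPENFN_LOG_LEVEL=debug
OPENFN_ORIGINS=http://localhost:4000
OPENFN_KAFKA_TRIGGERS_ENABLED=true
```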
2 changes: 1 addition & 1 deletion kafka-mapper-consumer/docker-compose.config.yml
@@ -27,7 +27,7 @@ services:
configs:
kafka-mapper-consumer-openhimConfig.js:
file: ./openhimConfig.js
-      name: kafka-mapper-consumer-openhimConfig.js-${fhir_ig_importer_config_importer_openhimConfig_js_DIGEST:?err}
+      name: kafka-mapper-consumer-openhimConfig.js-${kafka_mapper_consumer_openhimConfig_js_DIGEST:?err}
labels:
name: kafka-mapper-consumer
kafka-mapper-consumer-consumer-ui-app.json:
13 changes: 13 additions & 0 deletions openfn/docker-compose.dev.yml
@@ -0,0 +1,13 @@
version: '3.9'

services:
openfn:
ports:
- target: 4000
published: 4000
mode: host
worker:
ports:
- target: 2222
published: 2222
mode: host
64 changes: 64 additions & 0 deletions openfn/docker-compose.yml
@@ -0,0 +1,64 @@
version: '3.9'

services:
openfn:
image: ${OPENFN_IMAGE}
# This command runs a shell script that performs the following actions:
# 1. Executes the Lightning.Release.migrate() function to handle database migrations.
# 2. Sets up a user with the provided first name, last name, email, password, and role using the Lightning.Setup.setup_user function.
# - The user details are hardcoded with the first name "Test", last name "User", email "[email protected]", and password "instant101".
# - The role assigned to the user is "superuser".
# - The function also takes an API key from the environment variable ${OPENFN_API_KEY}.
# - Additionally, it sets up a schema with the name "openhim ndr" and type "http", including credentials and a base URL from environment variables.
# 3. Starts the Lightning application using the /app/bin/lightning start command.
command: >
sh -c "/app/bin/lightning eval 'Lightning.Release.migrate()' && /app/bin/lightning eval 'Lightning.Setup.setup_user(%{first_name: \"Test\",last_name: \"User\",email: \"[email protected]\",password: \"instant101\", role: :superuser}, \"${OPENFN_API_KEY}\" ,[%{name: \"openhim ndr\", schema: \"http\", body: %{username: \"${FHIR_SERVER_USERNAME}\", password: \"${FHIR_SERVER_PASSWORD}\", baseUrl: \"${FHIR_SERVER_BASE_URL}\"}}])' && /app/bin/lightning start"
deploy:
resources:
limits:
cpus: '${OPENFN_DOCKER_WEB_CPUS:-0}'
memory: '${OPENFN_DOCKER_WEB_MEMORY:-0}'
environment:
- DATABASE_URL=${OPENFN_DATABASE_URL}
- DISABLE_DB_SSL=${OPENFN_DISABLE_DB_SSL}
- IS_RESETTABLE_DEMO=${OPENFN_IS_RESETTABLE_DEMO}
- LISTEN_ADDRESS=${OPENFN_LISTEN_ADDRESS}
- LOG_LEVEL=${OPENFN_LOG_LEVEL}
- ORIGINS=${OPENFN_ORIGINS}
- PRIMARY_ENCRYPTION_KEY=${OPENFN_PRIMARY_ENCRYPTION_KEY}
- SECRET_KEY_BASE=${OPENFN_SECRET_KEY_BASE}
- WORKER_RUNS_PRIVATE_KEY=${OPENFN_WORKER_RUNS_PRIVATE_KEY}
- WORKER_SECRET=${OPENFN_WORKER_SECRET}
- KAFKA_TRIGGERS_ENABLED=${OPENFN_KAFKA_TRIGGERS_ENABLED}
healthcheck:
test: '${DOCKER_WEB_HEALTHCHECK_TEST:-curl localhost:4000/health_check}'
interval: '10s'
timeout: '3s'
start_period: '5s'
retries: 3
networks:
- kafka_public
- postgres
worker:
image: ${OPENFN_WORKER_IMAGE}
deploy:
resources:
limits:
cpus: '${OPENFN_DOCKER_WORKER_CPUS:-0}'
memory: '${OPENFN_DOCKER_WORKER_MEMORY:-0}'
environment:
- WORKER_LIGHTNING_PUBLIC_KEY=${OPENFN_WORKER_LIGHTNING_PUBLIC_KEY}
- WORKER_SECRET=${OPENFN_WORKER_SECRET}
- NODE_ENV=production
command: [ 'pnpm', 'start:prod', '-l', 'ws://openfn:${URL_PORT-4000}/worker' ]
networks:
- kafka_public
- postgres

networks:
kafka_public:
name: kafka_public
external: true
postgres:
name: postgres_public
external: true
73 changes: 73 additions & 0 deletions openfn/importer/postgres/create-db.js
@@ -0,0 +1,73 @@
const { Pool } = require("pg");

const user = process.env.POSTGRES_USER || "postgres";
const host = process.env.POSTGRES_SERVICE || "postgres-1";
const database = process.env.POSTGRES_DATABASE || "postgres";
const password = process.env.POSTGRES_PASSWORD || "instant101";
const port = process.env.POSTGRES_PORT || 5432;
const newDb = process.env.NEW_DATABASE_NAME || "openfn";
const newUser = process.env.NEW_DATABASE_USER || "openfn";
const newUserPassword = process.env.NEW_DATABASE_PASSWORD || "instant101";

const pool = new Pool({
user,
host,
database,
password,
port,
});

// Optional schema-setup statements; both lists are empty in this script.
const tableQueries = [];
const insertQueries = [];

(async () => {
const client = await pool.connect();

const createDb = async (db) => {
    // Check whether the database exists before creating it

const result = await client.query(
"SELECT 1 FROM pg_database WHERE datname = $1",
[db]
);

if (!result.rows.length) {
await client.query(`CREATE DATABASE ${db};`);

console.log(`Database '${db}' created successfully`);
} else {
console.log(`Database '${db}' already exists`);
}
};

const createUser = async () => {
const user = await client.query(
"SELECT 1 FROM pg_user WHERE usename = $1",
[newUser]
);

if (!user.rows.length) {
await client.query(
`CREATE USER ${newUser} WITH ENCRYPTED PASSWORD '${newUserPassword}';`
);
await client.query(
`GRANT ALL PRIVILEGES ON DATABASE ${newDb} TO ${newUser};`
);
console.log(`User ${newUser} created`);
}
};

try {
await createDb(newDb);

await createUser();
await Promise.all(tableQueries.map((query) => client.query(query)));

await Promise.all(insertQueries.map((query) => client.query(query)));
} catch (error) {
console.error("Error in db operations:", error.message);
} finally {
client.release();
pool.end();
}
})();
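
One caveat about the script above: `newDb`, `newUser`, and `newUserPassword` are interpolated directly into the SQL strings, and Postgres identifiers cannot be bound as query parameters. Since the values come from environment variables, a small quoting helper like this sketch (an assumption, not part of the committed script) would guard against a misconfigured or malicious value breaking out of the statement:

```javascript
// Quote a Postgres identifier: double any embedded double quotes,
// then wrap the whole name in double quotes.
function quoteIdent(name) {
  return '"' + String(name).replace(/"/g, '""') + '"';
}

// Usage inside the script would then look like:
//   await client.query(`CREATE DATABASE ${quoteIdent(newDb)};`);
//   await client.query(`GRANT ALL PRIVILEGES ON DATABASE ${quoteIdent(newDb)} TO ${quoteIdent(newUser)};`);
```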