T28 write access #35

Merged: 10 commits, Jan 9, 2025
5 changes: 3 additions & 2 deletions Dockerfile
@@ -2,13 +2,14 @@ FROM python:3.9-slim

WORKDIR /opt/orcid_integration

RUN apt-get update && apt-get install -y
RUN apt-get install -y libxml2-dev libxmlsec1-dev libxmlsec1-openssl build-essential pkg-config

COPY *.py ./
COPY requirements.txt .
COPY orcidflask/*.py ./orcidflask/
COPY orcidflask/templates ./orcidflask/templates/

RUN apt-get update && apt-get install -y
RUN apt-get install -y libxml2-dev libxmlsec1-dev libxmlsec1-openssl build-essential pkg-config
RUN pip install -r requirements.txt

ENV FLASK_APP=orcidflask
49 changes: 20 additions & 29 deletions README.md
@@ -9,47 +9,38 @@ ORCID middleware to enable our researchers to designate GW as a trusted partner
3. Copy the example Flask configuration file and edit it to provide sensitive keys, including the SERVER_KEY, ORCID client ID and ORCID client secret. The `SERVER_KEY` should be the key used to encrypt the Flask session objects, as described [here](https://flask.palletsprojects.com/en/2.2.x/config/).
`cp example.config.py config.py`
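   For reference, a minimal sketch of what `config.py` might contain is shown below. Note that `orcidflask/__init__.py` reads `SECRET_KEY` for the Flask session key; the client-credential variable names here are illustrative assumptions, so follow `example.config.py` for the exact keys.
   ```
   # Sketch only -- copy example.config.py and fill in real values.
   SECRET_KEY = 'replace-with-a-long-random-string'  # key used to encrypt Flask session objects
   orcid_client_id = 'APP-XXXXXXXXXXXXXXXX'          # ORCID client ID (assumed variable name)
   orcid_client_secret = 'xxxxxxxx-xxxx-xxxx-xxxx'   # ORCID client secret (assumed variable name)
   ```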
4. Copy `example.docker-compose.yml` to `docker-compose.yml` and `example.env` to `.env`.
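   That is:
   ```
   cp example.docker-compose.yml docker-compose.yml
   cp example.env .env
   ```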
5. Bring up the Docker container(s): `docker-compose up -d`. This will install all necessary dependencies and launch the Flask app with gunicorn on port `8080`. For development, comment out the first three lines under the `volumes` section of the `flask-app` service and uncomment the line `.:/opt/orcid_integration`. This will use the local copy of the Python code.
5. Add the hostname of your server to the `VIRTUAL_HOST` environment variable in `.env`.
- If using SSL, see the additional instructions below for configuring the Nginx Docker container.
- If not using SSL, comment out the volume mapping in the `docker-compose.yml` file under the `nginx-proxy` service.
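   For example, `.env` might set the hostname as follows (this uses the hostname from this repository's earlier Nginx configuration purely as an illustration; substitute your own server's hostname):
   ```
   VIRTUAL_HOST=gworcid-dev.wrlc.org
   ```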
6. Bring up the Docker container(s): `docker-compose up -d`. This will install all necessary dependencies and launch the Flask app with gunicorn on port `8080`, and it will start an Nginx server that proxies ports `80`/`443` to the app on port `8080`.
- For development, comment out the first three lines under the `volumes` section of the `flask-app` service and uncomment the line `.:/opt/orcid_integration`. This will use the local copy of the Python code.
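   For reference, the development form of the `flask-app` `volumes` section looks roughly like the sketch below. Only the mounts visible in `example.docker-compose.yml` are shown, so treat it as illustrative rather than exact.
   ```
       volumes:
         # host mounts commented out for development ...
         #- ./config.py:/opt/orcid_integration/config.py
         #- ./orcidflask/db:/opt/orcid_integration/orcidflask/db
         - .:/opt/orcid_integration
   ```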
6. When the Flask app starts up, it will check for the presence of a database encryption key file (as specified in `example.env`). If the file is not present, it will create a new database encryption key. **Be careful with this key.** Once the data has been encrypted using it, the key is necessary to decrypt the data again. Loss of the key means loss of the data.
7. The postgres container will store data outside of the container, in the `./data` directory.
- When first run, postgres will assign ownership of this directory to a system user.
- To avoid having to reset permissions on `./data` every time you start up the container, **after starting the container the first time**, modify the `db` service in `docker-compose.yml` to include the following line, where `UID` and `GID` are the numeric system IDs of the user and group that should own the `./data` directory (see the sketch after this snippet):
```
user: "UID:GID"
```
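   For example, assuming the host user and group that should own `./data` both have ID `1000` (a hypothetical value; substitute your own), the `db` service would look roughly like:
   ```
     db:
       # ...existing image, environment, and volumes settings...
       user: "1000:1000"
   ```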
- To set up the database, run the migrations:
```
docker exec -it orcid-integration-flask-app-1 /bin/bash
flask db upgrade
```
8. If you need to provide an XML file to your SAML IdP, with the `flask-app` container running, do the following:
```
docker exec -it orcid-integration_flask-app_1 /bin/bash
python generate_saml_metadata.xml
docker exec -it orcid-integration-flask-app-1 /bin/bash
python generate_saml_metadata.py
```
The SAML metadata file should be written to the `orcidflask/saml` directory (bind-mounted outside the container).
9. For SSL, use gunicorn with nginx:

### SSL with Nginx proxy

1. Create SSL key and cert (either self-signed or using a certificate authority)
2. Install nginx: `sudo apt-get install nginx`
3. Remove the default SSL configuration:
`cd /etc/nginx/sites-enabled`
`sudo rm default`
4. Create a new nginx configuration to proxy to the Flask app as follows:
```
server {
    listen 80;
    listen [::]:80;
    server_name gworcid-dev.wrlc.org;
    return 302 https://$server_name$request_uri;
}
server {
    listen 443 ssl;
    listen [::]:443 ssl;
    ssl_certificate /etc/ssl/certs/server.crt;
    ssl_certificate_key /etc/ssl/private/server.key;
    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header HOST $http_host;
        proxy_pass http://127.0.0.1:8080;
        proxy_redirect off;
    }
}
```
10. To quickly serialize the database as a JSON file, you can run the following command (if outside the container), providing the path to a file in a mounted volume:
2. Follow the naming conventions in the [nginx-proxy documentation](https://github.com/nginx-proxy/nginx-proxy/tree/main/docs#ssl-support), ensuring that the key and certificate files are placed in the same directory, which should be mapped to the `/etc/nginx/certs` directory in the `docker-compose.yml` file.
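   For example, if `VIRTUAL_HOST=gworcid-dev.wrlc.org`, nginx-proxy expects the certificate and key to be named after that host inside the mounted certs directory. This is a sketch assuming the `/etc/ssl/certs` mapping from `example.docker-compose.yml`; adjust paths and hostname to your setup:
   ```
   sudo cp server.crt /etc/ssl/certs/gworcid-dev.wrlc.org.crt
   sudo cp server.key /etc/ssl/certs/gworcid-dev.wrlc.org.key
   ```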

### Serializing the database

To quickly serialize the database as a JSON file, you can run the following command from outside the container, providing the path to a file in a mounted volume:
```
docker exec -it orcid-integration_flask-app_1 flask serialize-db ./data/token-dump.json
```
Empty file modified: data/.keep (mode 100755 → 100644)
14 changes: 14 additions & 0 deletions example.docker-compose.yml
@@ -29,3 +29,17 @@ services:
      - ./config.py:/opt/orcid_integration/config.py
      - ./orcidflask/db:/opt/orcid_integration/orcidflask/db
      #- .:/opt/orcid_integration
    restart: always
  nginx-proxy:
    image: nginxproxy/nginx-proxy:1.5
    environment:
      - LOG_JSON=true
    ports:
      - "443:443"
      - "80:80"
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro
      # Note that the nginx-proxy image requires the cert & key to reside in the same directory
      # and to follow certain naming conventions
      - /etc/ssl/certs:/etc/nginx/certs
    restart: always
1 change: 1 addition & 0 deletions example.env
@@ -6,3 +6,4 @@ POSTGRES_PORT=5432
DB_ENCRYPTION_FILE=/opt/orcid_integration/orcidflask/db/db-encrypt.key
# Values are sandbox or prod
ORCID_SERVER=sandbox
VIRTUAL_HOST=
1 change: 1 addition & 0 deletions migrations/README
@@ -0,0 +1 @@
Single-database configuration for Flask.
50 changes: 50 additions & 0 deletions migrations/alembic.ini
@@ -0,0 +1,50 @@
# A generic, single database configuration.

[alembic]
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false


# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic,flask_migrate

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[logger_flask_migrate]
level = INFO
handlers =
qualname = flask_migrate

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
113 changes: 113 additions & 0 deletions migrations/env.py
@@ -0,0 +1,113 @@
import logging
from logging.config import fileConfig

from flask import current_app

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
logger = logging.getLogger('alembic.env')


def get_engine():
    try:
        # this works with Flask-SQLAlchemy<3 and Alchemical
        return current_app.extensions['migrate'].db.get_engine()
    except (TypeError, AttributeError):
        # this works with Flask-SQLAlchemy>=3
        return current_app.extensions['migrate'].db.engine


def get_engine_url():
    try:
        return get_engine().url.render_as_string(hide_password=False).replace(
            '%', '%%')
    except AttributeError:
        return str(get_engine().url).replace('%', '%%')


# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
config.set_main_option('sqlalchemy.url', get_engine_url())
target_db = current_app.extensions['migrate'].db

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def get_metadata():
    if hasattr(target_db, 'metadatas'):
        return target_db.metadatas[None]
    return target_db.metadata


def run_migrations_offline():
    """Run migrations in 'offline' mode.
    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.
    Calls to context.execute() here emit the given string to the
    script output.
    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url, target_metadata=get_metadata(), literal_binds=True
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.
    In this scenario we need to create an Engine
    and associate a connection with the context.
    """

    # this callback is used to prevent an auto-migration from being generated
    # when there are no changes to the schema
    # reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html
    def process_revision_directives(context, revision, directives):
        if getattr(config.cmd_opts, 'autogenerate', False):
            script = directives[0]
            if script.upgrade_ops.is_empty():
                directives[:] = []
                logger.info('No changes in schema detected.')

    conf_args = current_app.extensions['migrate'].configure_args
    if conf_args.get("process_revision_directives") is None:
        conf_args["process_revision_directives"] = process_revision_directives

    connectable = get_engine()

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=get_metadata(),
            **conf_args
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
24 changes: 24 additions & 0 deletions migrations/script.py.mako
@@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade():
    ${upgrades if upgrades else "pass"}


def downgrade():
    ${downgrades if downgrades else "pass"}
39 changes: 39 additions & 0 deletions migrations/versions/ac9a61050c66_initial_migration.py
@@ -0,0 +1,39 @@
"""Initial migration.

Revision ID: ac9a61050c66
Revises:
Create Date: 2024-04-26 13:11:48.534267

"""
from alembic import op
import sqlalchemy as sa
from orcidflask.models import EncryptedValue


# revision identifiers, used by Alembic.
revision = 'ac9a61050c66'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('token',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('userId', sa.String(length=80), nullable=False),
    sa.Column('access_token', EncryptedValue(), nullable=False),
    sa.Column('refresh_token', EncryptedValue(), nullable=False),
    sa.Column('expires_in', sa.Integer(), nullable=False),
    sa.Column('token_scope', sa.String(length=80), nullable=False),
    sa.Column('orcid', sa.String(length=80), nullable=False),
    sa.Column('timestamp', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=True),
    sa.PrimaryKeyConstraint('id')
    )
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_table('token')
    # ### end Alembic commands ###
5 changes: 3 additions & 2 deletions orcidflask/__init__.py
@@ -1,5 +1,6 @@
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
import os
import click
from orcid_utils import load_encryption_key, new_encryption_key
@@ -20,8 +21,8 @@
app.config['orcid_register_url'] = base_url + '/oauth/authorize?client_id={orcid_client_id}&response_type=code&scope={scopes}&redirect_uri={redirect_uri}&family_names={lastname}&given_names={firstname}&email={emailaddress}&show_login=false'
app.config['orcid_token_url'] = base_url + '/oauth/token'
app.config['SAML_PATH'] = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'saml')
app.config["SESSION_COOKIE_DOMAIN"] = app.config["SERVER_NAME"]
app.secret_key = app.config['SECRET_KEY']

postgres_user = os.getenv('POSTGRES_USER')
postgres_pwd = os.getenv('POSTGRES_PASSWORD')
postgres_db_host = os.getenv('POSTGRES_DB_HOST')
@@ -32,7 +33,7 @@
db_key_file = os.getenv('DB_ENCRYPTION_FILE')
app.config['db_encryption_key'] = load_encryption_key(db_key_file)
db = SQLAlchemy(app)
db.create_all()
migrate = Migrate(app, db)

import orcidflask.views
from orcidflask.models import Token
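With `Migrate(app, db)` wired in place of `db.create_all()`, schema changes are now handled by Flask-Migrate. A typical workflow inside the running `flask-app` container might look like the sketch below, using the standard Flask-Migrate commands; the README's setup step only requires `flask db upgrade`.
```
# generate a new migration after changing models (review the generated script before applying)
flask db migrate -m "describe the change"
# apply any pending migrations to the database
flask db upgrade
```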