Merge pull request #59 from CDLUC3/development
Initial sync of development to main
briri authored Aug 8, 2024
2 parents e697be2 + 1d722ef commit 413280a
Showing 101 changed files with 17,248 additions and 1,318 deletions.
1 change: 1 addition & 0 deletions .dockerignore
@@ -0,0 +1 @@
node_modules/
23 changes: 23 additions & 0 deletions .editorconfig
@@ -0,0 +1,23 @@
# https://editorconfig.org

# Taken from Facebook's config

# top-most EditorConfig file
root = true

[*]
charset = utf-8
# Unix-style newlines with a newline ending every file
end_of_line = lf
insert_final_newline = true
indent_size = 2
indent_style = space
max_line_length = 80
trim_trailing_whitespace = true

[*.md]
max_line_length = 0
trim_trailing_whitespace = false

[COMMIT_EDITMSG]
max_line_length = 0
29 changes: 29 additions & 0 deletions .env-example
@@ -0,0 +1,29 @@
# Basic config used whether running in the local Docker env or in the cloud
LOG_LEVEL=debug
DMSP_BASE_URL=https://doi.org/10.11111/ZZ

# If you are running this system locally and want to run "offline", you should
# set this variable to `true`
USE_MOCK_DATA=false

# If you want access to live data, you will need to set the above variable to `false`
# and then fill out the following variables
AWS_REGION=us-west-2

# JSON Web Token (JWT) settings
JWT_SECRET=ihef93hgf9-u3hgfi3hfte4g4tg4tg4
JWT_TTL=1hr

# DMPHub API
DMPHUB_AUTH_URL=https://auth.mydomain.edu
DMPHUB_API_BASE_URL=https://api.mydomain.edu
DMPHUB_API_CLIENT_ID=1234567890
DMPHUB_API_CLIENT_SECRET=zyxwvutsrq

# MySQL database connections
MYSQL_CONNECTION_LIMIT=5
MYSQL_HOST=localhost
MYSQL_PORT=3306
MYSQL_DATABASE=dmsp
MYSQL_USER=root
MYSQL_PASSWORD=
17 changes: 13 additions & 4 deletions .gitignore
@@ -1,14 +1,23 @@
# Ignore the Dotenv file
.env

# Ignore the local data-migration log.
data-migrations/processed.log

# Skip all of the dependencies in node_modules
node_modules/

# Skip the compiled distribution files
dist/

# testing
/coverage
# Skip the Jest code coverage folder
coverage/

# Ignore the persisted data sources for the local Docker dev env
docker/

# Ignore VS Code files, like the launch.json file
.vscode/

# Visual Studio Code
.vscode
# OS specific files
.DS_Store
34 changes: 34 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,34 @@

### Added
- Schema, Mocks, Models and Resolvers for Affiliations and tests for the Models and Resolvers
- Added new DataSource for the DmptoolApi with endpoints for Affiliations and a new mock for this data source for use in tests

### Updated
- Updated schemas.ts, resolvers.ts, mocks.ts and codegen.ts to use new Affiliation files
- Updated express.ts middleware file to pull in and initialize the new DmptoolApi datasource

## v0.1
Initial Apollo Server build

### Added
- Added unit tests for User model and contributorRole resolver, and added @types/pino
- Added editor config
- Initial Apollo server config
- Initial Schema for ContributorRole
- Initial Schema for DMSP
- Resolvers for ContributorRole
- Initial resolver to fetch a single DMSP from the DMPHub API
- Initial DMPHub API data source
- Initial MySQL data source
- Custom GraphQL scalars for ROR, ORCID and DMSP IDs
- Mechanism for Apollo to use mocks when a resolver has not yet been implemented
- Mocks for User
- Data migration mechanism `./data-migrations/process.sh`
- Documentation!
- Local Docker Compose config
- Pino logger with ECS formatter
- Plugin to log request/response lifecycle events
- Added Logger to the context and used it in the resolvers

### Updated
- Made some updates to auth code based on testing out recent changes with the frontend [#34]
26 changes: 26 additions & 0 deletions Dockerfile
@@ -0,0 +1,26 @@
# Dockerfile
# preferred node version chosen here (22.1.0-alpine3.19 as of 05/04/2024)
FROM public.ecr.aws/docker/library/node:22.1.0-alpine3.19

# Create the directory on the node image
# where our Apollo server will live
RUN mkdir -p /app

# Set /app as the working directory in container
WORKDIR /app

# Copy package.json and package-lock.json
# to the /app working directory
COPY package*.json tsconfig.json codegen.ts .env ./

# Copy the rest of our Apollo Server folder into /app
COPY . .

# Install dependencies in /app
RUN npm install

# Ensure port 4000 is accessible to our system
EXPOSE 4000

# Command to run the Apollo server in development mode
CMD ["npm", "run", "dev"]
20 changes: 20 additions & 0 deletions Dockerfile.aws
@@ -0,0 +1,20 @@
# syntax = docker/dockerfile:1

# This version of the Dockerfile is used by the buildspec.yaml within the AWS environment
FROM public.ecr.aws/docker/library/node:current-alpine

# Create the directory on the node image where our Apollo server will live
RUN mkdir -p /dist

# Copy package.json and package-lock.json to the working directory
COPY package*.json ./

# Build the node_modules for production mode
RUN npm install

# The app was built in the CodeBuild buildspec.yaml, so just copy dist/ in
COPY dist/ ./dist

EXPOSE 4000

CMD ["node", "dist/index.js"]
63 changes: 62 additions & 1 deletion README.md
@@ -2,4 +2,65 @@

Apollo Server to support GraphQL interactions with the UI and external partners.

To run locally: `npm start`
Our Apollo server installation consists of:
- Data Sources: Code used to communicate with external APIs, databases, file systems, etc.
- GraphQL Schemas: The definition of our Graph including types, queries and mutations
- Resolvers: The code that processes incoming queries and uses the available data sources to generate the results
- Mocks: Stub or placeholder data that can be returned when a resolver has not yet been developed

## Installation
- Make a local copy of the example dotenv file: `cp .env-example .env`
- Set up MySQL:
  - If you are running on a Mac with Homebrew installed, run `brew install mysql` and then start it with `brew services start mysql`
  - Initialize the database and the dataMigrations table: `./data-migrations/database-init.sh`
  - Run all database migrations: `./data-migrations/process.sh`
- Install all of the dependencies: `npm install`
- Generate the TypeScript types: `npm run generate`
- Start up the application in development mode: `npm run dev`
- Navigate to `http://localhost:4000` in your browser

## Running current database migrations
- See the README file in the `data-migrations` directory for instructions on running data migrations in your local environment.

## Adding a new query/mutation
You should always start by updating an existing GraphQL Schema file or adding a new one to the `src/schemas` directory.

Please include comments everywhere. These comments appear in the GraphQL explorer and help others understand how to interact with the query or mutation.

If you added a new schema file, make sure you update the `src/schemas.ts` file to pull in the new file when the server starts up.

Once the schema has been added, you will need to run `npm run generate`. This kicks off a script that builds out the TypeScript types for the new schema and queries.
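
For illustration, a new schema file might look like the sketch below. The `Section` type, its fields, and the file name are hypothetical examples (not part of the actual codebase); they only show the file layout and the explorer-facing comments described above:

```ts
// src/schemas/section.ts -- hypothetical example of a new schema file
import gql from "graphql-tag";

export const typeDefs = gql`
  extend type Query {
    "Fetch a single Section by its id (these comments appear in the GraphQL explorer)"
    section(sectionId: Int!): Section
  }

  "A hypothetical Section type, shown only to illustrate schema file layout"
  type Section {
    id: Int!
    title: String!
    displayOrder: Int
  }
`;
```

Remember to register the new file in `src/schemas.ts` and re-run `npm run generate` so the generated `src/types.ts` picks it up.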

### Create a new Model
You will need to create a Model if your new query/mutation needs to transform data in any way, whether transforming the data source's response before sending it to the caller, or transforming incoming data before sending it to the data source.

For example:
- If my data source returns a property called `funder_id` and I want to send a boolean flag called `isFunder` to the caller, I perform that logic in a Model.
- If I simply want to rename a property before returning it to the client (e.g. the data source returns `identifier` but the caller expects `DMPId`), I do that in a Model as well.

Make sure that you transform the raw response from the data source into your Model in your new resolver.
For example:
```ts
const response = await someDataSource.query('test');
return new MyModel(response);
```
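
A slightly fuller sketch of the `funder_id` → `isFunder` example above might look like this (all names are illustrative; a real Model would mirror your schema's types):

```ts
// src/models/MyModel.ts -- hypothetical Model performing the transforms described above
export class MyModel {
  public DMPId: string;
  public isFunder: boolean;

  // `raw` is the untransformed response from the data source
  constructor(raw: { identifier: string; funder_id?: string }) {
    // Rename `identifier` to the property name the caller expects
    this.DMPId = raw.identifier;
    // Derive a boolean flag from the presence of `funder_id`
    this.isFunder = !!raw.funder_id;
  }
}
```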

### Create a Mock
If you are unable to create the corresponding resolver(s) at this point, because of time constraints or because the data source is not yet ready, then you should add a new Mock file to the `src/schemas/` directory (or update an existing one with your changes). If you add a new mock, be sure to update `src/mocks.ts` to pull in your new mock when the server starts up.

Note that mocks should represent the data structure that will be returned from your resolver to the caller, NOT the data structure that the resolver receives from the data source!
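
For example, a mock for the hypothetical `Section` type above might be a minimal sketch like this (again, the names and values are illustrative):

```ts
// Hypothetical mock -- note it mirrors the resolver's output shape, not the data source's
export const mock = {
  Section: () => ({
    id: 1,
    title: "Data Description",
    displayOrder: 1,
  }),
};
```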

### Create a Resolver
If your data source is ready and you have the bandwidth, add a new Resolver to the `src/resolvers/` directory (or update an existing one with your new query/mutation). If you add a new Resolver, be sure to update the `src/resolvers.ts` file so that it is included when the server starts up.
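
Continuing the hypothetical example, a resolver wrapping the Model might look like the sketch below. The `Resolvers` type is the one generated into `src/types.ts` by `npm run generate`; `someDataSource`, its `query` signature, and a `Section: "./models/MyModel#MyModel"` entry in the `mappers` section of `codegen.ts` are all assumptions for illustration, not the project's actual API:

```ts
// src/resolvers/section.ts -- hypothetical resolver using the Model transform shown earlier
import { Resolvers } from "../types";
import { MyModel } from "../models/MyModel";

export const resolvers: Resolvers = {
  Query: {
    // Fetch the raw record from a data source and transform it via the Model
    section: async (_parent, { sectionId }, context) => {
      // `someDataSource` is a hypothetical data source exposed on the context
      const response = await context.dataSources.someDataSource.query(
        "SELECT * FROM sections WHERE id = ?",
        [sectionId],
      );
      return new MyModel(response);
    },
  },
};
```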

### Add tests
You MUST add tests if you added or modified a Model! To do so, find the corresponding file (or add a new one) in the `src/models/__tests__/` directory.
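
A minimal Jest test for the hypothetical Model sketched above might look like this:

```ts
// src/models/__tests__/MyModel.spec.ts -- hypothetical test for the Model sketch above
import { MyModel } from "../MyModel";

describe("MyModel", () => {
  it("derives isFunder from the raw funder_id property", () => {
    const model = new MyModel({ identifier: "abc123", funder_id: "https://ror.org/00x0x0x00" });
    expect(model.DMPId).toEqual("abc123");
    expect(model.isFunder).toBe(true);
  });

  it("sets isFunder to false when funder_id is absent", () => {
    const model = new MyModel({ identifier: "abc123" });
    expect(model.isFunder).toBe(false);
  });
});
```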

Resolver tests are not yet particularly useful. We will be adding proper integration tests for resolvers in the near future.

## Useful commands
- To run the Codegen utility to generate our TypeScript types: `npm run generate`
- To run the server in development mode: `npm run dev`
- To run the server normally: `npm start`
- To build the application: `npm run build`
79 changes: 79 additions & 0 deletions buildspec.yaml
@@ -0,0 +1,79 @@
# Build specifications for AWS CodeBuild
# See: https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html

# Each input artifact is extracted to its own directory by CodePipeline, the locations of which
# are stored in environment variables. The directory for the primary source artifact (this repo)
# is made available with $CODEBUILD_SRC_DIR. The directory for the DMPTool push artifacts is
# made available with $CODEBUILD_SRC_DIR_dmptool-commit.
# Do not change version. This is the version of the AWS buildspec syntax, not the version of your buildspec file.
version: 0.2

phases:
install:
runtime-versions:
# Apollo gives error when building on < 21
nodejs: 21
commands:
# Install any libraries necessary for testing and compilation
# - echo Installing Mocha...
# - npm install -g mocha
pre_build:
commands:
# Set some ENV variables here because CF only allows a limit of 1000 characters in the
# EnvironmentVariable config for the Pipeline action :(
- export AWS_VERSION=$(aws --version)

# Fetch the ECR repository name
- echo $ECR_REPOSITORY_URI >> .ecr
- export SHORT_ECR_URI=$(awk '{gsub("$ECR_REPOSITORY_NAME", "");print}' .ecr)
- rm .ecr

# Set the repository URI to your ECR image and add an image tag with the first seven characters of the Git
# commit ID of the source.
- echo Logging in to Amazon ECR ...
- aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin $SHORT_ECR_URI
- IMAGE_TAG=${COMMIT_HASH:=apollo-latest}

# Install MySQL so we can run DB migrations
# - dnf -y install mariadb105
build:
commands:
- echo "Running build in ${NODE_ENV} mode - started on `date`"
- cd $CODEBUILD_SRC_DIR

# - echo Checking for DB migrations
# - cd $CODEBUILD_SRC_DIR
# - cd data-migrations && ./process-aws.sh $NODE_ENV && cd ..

# Install all of the dependencies (including dev so we can compile TS)
- npm install --production=false

# Generate all of the GraphQL schema types
- npm run generate

# Run any tests here
# - npm run test

# Build the Apollo server which writes to the ./dist dir
- npm run build

- echo Building the Docker image...
- docker build -f Dockerfile.aws -t $SHORT_ECR_URI:apollo-latest .
- docker tag $ECR_REPOSITORY_URI:apollo-latest $SHORT_ECR_URI:$IMAGE_TAG
post_build:
commands:
# Push the Docker image to the ECR repository. Fargate will pick it up and deploy automatically
- echo Build completed on `date`
- echo Pushing the Docker images...
- cd $CODEBUILD_SRC_DIR
- docker push $SHORT_ECR_URI:apollo-latest
- docker push $SHORT_ECR_URI:$IMAGE_TAG

- echo Writing image definitions file...
- printf '[{"name":"%s","imageUri":"%s"}]' $TASK_DEFINITION_CONTAINER_NAME $ECR_REPOSITORY_URI:$IMAGE_TAG > imagedefinitions.json

- echo Build completed on `date`

artifacts:
# The Deploy step is expecting this name
files: imagedefinitions.json
20 changes: 20 additions & 0 deletions codegen.ts
@@ -0,0 +1,20 @@
import type { CodegenConfig } from "@graphql-codegen/cli";

const config: CodegenConfig = {
schema: "./src/schemas/*.ts",
generates: {
"./src/types.ts": {
plugins: ["typescript", "typescript-resolvers"],
config: {
contextType: "./context#MyContext",
enumsAsTypes: true,
mappers: {
Dmsp: "./models/Dmsp#DmspModel",
ContributorRole: "./models/ContributorRole#ContributorRoleModel",
},
},
},
},
};

export default config;
11 changes: 11 additions & 0 deletions data-migrations/2024-05-02-1530-create-contributor-roles.sql
@@ -0,0 +1,11 @@
CREATE TABLE `contributorRoles` (
`id` int NOT NULL AUTO_INCREMENT,
`label` varchar(255) NOT NULL,
`url` varchar(255) NOT NULL,
`description` text,
`displayOrder` int NOT NULL,
`created` timestamp DEFAULT CURRENT_TIMESTAMP,
`modified` timestamp DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (`id`),
CONSTRAINT unique_contributor_role_url UNIQUE (`url`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb3;
6 changes: 6 additions & 0 deletions data-migrations/2024-05-03-1414-seed-contributor-roles.sql
@@ -0,0 +1,6 @@
INSERT INTO contributorRoles (label, url, description, displayOrder)
VALUES
('Data Manager', 'https://credit.niso.org/contributor-roles/data-curation/', 'An individual engaged in management activities to annotate (produce metadata), scrub data and maintain research data (including software code, where it is necessary for interpreting the data itself) for initial use and later re-use.', 3),
('Principal Investigator (PI)', 'https://credit.niso.org/contributor-roles/investigation/', 'An individual conducting a research and investigation process, specifically performing the experiments, or data/evidence collection.', 1),
('Project Administrator', 'https://credit.niso.org/contributor-roles/project-administration/', 'An individual with management and coordination responsibility for the research activity planning and execution.', 2),
('Other', 'http://dmptool.org/contributor_roles/other', '', 4);
7 changes: 7 additions & 0 deletions data-migrations/2024-06-04-1103-create-user-table.sql
@@ -0,0 +1,7 @@
CREATE TABLE `users` (
`id` INT AUTO_INCREMENT PRIMARY KEY,
`email` varchar(255) NOT NULL,
`password` varchar(255) NOT NULL,
`created` TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
`modified` TIMESTAMP DEFAULT CURRENT_TIMESTAMP
)
8 changes: 8 additions & 0 deletions data-migrations/2024-06-04-1255-update-user-table.sql
@@ -0,0 +1,8 @@
ALTER TABLE `users`
ADD COLUMN `role` VARCHAR(16) AFTER `password`;

ALTER TABLE `users`
ADD COLUMN `givenName` VARCHAR(255) AFTER `role`;

ALTER TABLE `users`
ADD COLUMN `surName` VARCHAR(255) AFTER `givenName`;
12 changes: 12 additions & 0 deletions data-migrations/2024-06-07-1259-create-oauthClients.sql
@@ -0,0 +1,12 @@
CREATE TABLE `oauthClients` (
`id` INT AUTO_INCREMENT PRIMARY KEY,
`name` varchar(255) NOT NULL,
`redirectUris` text NOT NULL,
`grants` varchar(255) NOT NULL,
`clientId` varchar(255) NOT NULL,
`clientSecret` varchar(255) NOT NULL,
`userId` INT NOT NULL,
`created_at` TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
`modified` TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT unique_contributor_name UNIQUE (`name`)
);
11 changes: 11 additions & 0 deletions data-migrations/2024-06-07-1311-create-oauthCodes.sql
@@ -0,0 +1,11 @@
CREATE TABLE `oauthCodes` (
`code` varchar(255) PRIMARY KEY,
`redirectUri` varchar(255) NOT NULL,
`scope` varchar(255) NOT NULL,
`clientId` varchar(255) NOT NULL,
`userId` INT NOT NULL,
`codeChallenge` varchar(255),
`codeChallengeMethod` varchar(255),
`created_at` TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
`modified` TIMESTAMP DEFAULT CURRENT_TIMESTAMP
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb3;