Content Sources

What is it?

Content Sources is an application for storing information about external content (currently YUM repositories) in a central location.

Developing

Requirements:

  1. podman & podman-compose installed or docker & docker-compose installed (and docker running)
    • This is used to start a set of containers that are dependencies for content-sources-backend
  2. yaml2json tool installed (pip install json2yaml).
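
As a rough example, on a Fedora-like system the requirements can be installed like this (the distribution packages are an assumption; use the docker equivalents if you prefer docker):

$ sudo dnf install podman podman-compose
$ pip install json2yaml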

Create your configuration

Create a config file from the example:

$ cp ./configs/config.yaml.example ./configs/config.yaml

Build needed kafka container

$ make compose-build

Start dependency containers

$ make compose-up
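
You can verify the dependency containers came up with podman (or docker, whichever you use):

$ podman ps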

Run the server!

$ make run

Hit the API:

  $ curl -H "$( ./scripts/header.sh 9999 1111 )" http://localhost:8000/api/content-sources/v1.0/repositories/
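
The header.sh script generates the identity header that the curl call above sends. To peek at it (assuming the script prints a single header in the form "name: <base64-encoded identity JSON>", which is how it is used above), you can inspect and decode it directly:

$ ./scripts/header.sh 9999 1111                               # print the raw header
$ ./scripts/header.sh 9999 1111 | cut -d' ' -f2- | base64 -d  # decode the identity JSON (format assumption as noted)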

Stop dependency containers

When it's time to shut down the running containers:

$ make compose-down

To clean up the volumes it uses (this stops the containers first if they are running):

$ make compose-clean

There are other make rules that may be helpful; run make help to list them. Some are highlighted below.

Database Commands

Migrate the Database

$ make db-migrate-up

Seed the database

$ make db-migrate-seed

Get an interactive shell:

$ make db-shell

Or open a postgres client directly by running:

$ make db-cli-connect

Kafka commands

You can open an interactive shell with:

$ make kafka-shell

You can run kafka-console-consumer.sh against a given KAFKA_TOPIC with:

$ make kafka-topic-consume KAFKA_TOPIC=my-kafka-topic
$ make kafka-topic-consume # Uses the first topic in the KAFKA_TOPICS list


Start / Stop prometheus

Create the configuration for prometheus, starting from the example file.

Update the configs/prometheus.yaml file to use your hostname instead of localhost in scrape_configs.job_name.targets:

# Note that the targets entry cannot reference localhost; it needs the name of the host where
# the prometheus container runs.
$ cat ./configs/prometheus.example.yaml | sed "s/localhost/$(hostname)/g" > ./configs/prometheus.yaml
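
To double-check the substitution, your hostname should now appear in the generated file:

$ grep -n "$(hostname)" ./configs/prometheus.yaml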

To start prometheus run:

$ make prometheus-up

To stop the prometheus container run:

$ make prometheus-down

Once the container is up, open the prometheus web UI by running:

$ make prometheus-ui

Start / Stop mock for RBAC

Configuration requirements

  • To use this you need to enable RBAC in the configs/config.yaml file:

    clients:
      rbac_enabled: True
      rbac_base_url: http://localhost:8800/api/rbac/v1
      rbac_timeout: 30
    mocks:
      rbac:
        user_read_write: ["[email protected]","jdoe"]
        user_read: ["[email protected]","tdoe"]

Running it

  • Run the application with make run, or directly with ./release/content-sources api consumer instrumentation mock_rbac.
  • Make requests using ./scripts/header.sh 12345 [email protected] for an admin user, or ./scripts/header.sh 12345 [email protected] for a viewer; see the example below.

The RBAC mock service is started automatically by make run. To use it when running the service directly, add the mock_rbac option: ./release/content-sources api consumer instrumentation mock_rbac.
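
For example, the repositories endpoint used earlier in this README can be hit with the mocked admin identity (the header arguments are the ones shown in the Running it list above):

$ curl -H "$( ./scripts/header.sh 12345 [email protected] )" http://localhost:8000/api/content-sources/v1.0/repositories/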

Migrate your database (and seed it if desired)

$ make db-migrate-up
$ make db-migrate-seed

Run the server!

$ make run

Hit the API:

$ curl -H "$( ./scripts/header.sh 9999 1111 )" http://localhost:8000/api/content-sources/v1.0/repositories/

Generating new openapi docs:

$ make openapi

Configuration

The default configuration file in ./configs/config.yaml.example shows all available config options. Any of these can be overridden with an environment variable. For example "database.name" can be passed in via an environment variable named "DATABASE_NAME".
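
For example, to override database settings for a single run without touching the config file (the values here are illustrative, and DATABASE_HOST assumes the same naming rule applied to database.host):

$ DATABASE_NAME=content DATABASE_HOST=localhost make run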

Linting

To use golangci-lint:

  1. make install-golangci-lint
  2. make lint

To use the pre-commit linter: make install-pre-commit
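
Once installed, the hooks can also be run across the whole tree with the upstream pre-commit CLI (a standard pre-commit invocation, not a make rule in this repo):

$ pre-commit run --all-files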

Code Layout

| Path | Description |
| --- | --- |
| api | Openapi docs and doc generation code |
| db/migrations | Database Migrations |
| pkg/api | API Structures that are used for handling data within our API Handlers |
| pkg/config | Config loading and application bootstrapping code |
| pkg/dao | Database Access Object. Abstraction layer that provides an interface and implements it for our default database provider (postgresql). It is separated out for abstraction and easier testing |
| pkg/db | Database connection and migration related code |
| pkg/handler | Methods that directly handle API requests |
| pkg/middleware | Holds all the middleware components created for the service |
| pkg/event | Event message logic. More info here. |
| pkg/models | Structs that represent database models (Gorm) |
| pkg/seeds | Code to help seed the database for both development and testing |

More info