DM-41664: Rename prompt_prototype package and containers #96

Merged 6 commits on Nov 14, 2023
2 changes: 1 addition & 1 deletion .github/workflows/build-base.yml
@@ -32,7 +32,7 @@ jobs:
name: Update base image
runs-on: ubuntu-latest
env:
IMAGE_NAME: prompt-proto-base
IMAGE_NAME: prompt-base
STACK_TAG: ${{ inputs.stackTag }}
steps:
- name: Checkout code
12 changes: 6 additions & 6 deletions .github/workflows/build-service.yml
@@ -61,14 +61,14 @@ jobs:
- name: Run tests
run: |
docker run \
-v $GITHUB_WORKSPACE:/home/lsst/prompt_prototype \
ghcr.io/${{ github.repository_owner }}/prompt-proto-base:${{ matrix.baseTag }} \
-v $GITHUB_WORKSPACE:/home/lsst/prompt_processing \
ghcr.io/${{ github.repository_owner }}/prompt-base:${{ matrix.baseTag }} \
bash -c '
cd /home/lsst/prompt_prototype
cd /home/lsst/prompt_processing
source /opt/lsst/software/stack/loadLSST.bash
setup -r .
# Fix permissions; arg must be absolute path.
git config --global --add safe.directory /home/lsst/prompt_prototype
git config --global --add safe.directory /home/lsst/prompt_processing
scons'
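The test step above can be reproduced locally for debugging. This is a minimal sketch, assuming Docker access and a checkout of the repository in the current directory; the ``latest`` base tag is illustrative:

.. code-block:: sh

   # Run the unit tests inside a published base container, mounting the local checkout.
   docker run \
       -v "$PWD":/home/lsst/prompt_processing \
       ghcr.io/lsst-dm/prompt-base:latest \
       bash -c '
           cd /home/lsst/prompt_processing
           source /opt/lsst/software/stack/loadLSST.bash
           setup -r .
           # Mark the mounted checkout as safe for git; the argument must be an absolute path.
           git config --global --add safe.directory /home/lsst/prompt_processing
           scons'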

update-service-image:
@@ -81,7 +81,7 @@
matrix:
baseTag: ${{ fromJSON(needs.matrix-gen.outputs.matrix) }}
env:
IMAGE_NAME: prompt-proto-service
IMAGE_NAME: prompt-service
BASE_TAG: ${{ matrix.baseTag }}
steps:
- name: Checkout code
@@ -94,7 +94,7 @@
password: ${{ secrets.GITHUB_TOKEN }}
- name: Determine eups tag
run: |
docker run ghcr.io/${{ github.repository_owner }}/prompt-proto-base:"$BASE_TAG" bash -c "cat stack/miniconda*/ups_db/global.tags" > eups.tag
docker run ghcr.io/${{ github.repository_owner }}/prompt-base:"$BASE_TAG" bash -c "cat stack/miniconda*/ups_db/global.tags" > eups.tag
echo "Eups tag = $(< eups.tag)"
- name: Build image
run: |
4 changes: 2 additions & 2 deletions Dockerfile.activator
@@ -1,8 +1,8 @@
ARG BASE_TAG=latest
FROM ghcr.io/lsst-dm/prompt-proto-base:${BASE_TAG}
FROM ghcr.io/lsst-dm/prompt-base:${BASE_TAG}
ENV PYTHONUNBUFFERED True
ENV APP_HOME /app
ENV PROMPT_PROTOTYPE_DIR $APP_HOME
ENV PROMPT_PROCESSING_DIR $APP_HOME
ARG PUBSUB_VERIFICATION_TOKEN
ARG PORT
WORKDIR $APP_HOME
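For reference, a local build of the service container from ``Dockerfile.activator`` could look like the following sketch; the tag, port, and token values are placeholders, and the remainder of the Dockerfile is collapsed in this view:

.. code-block:: sh

   # Build the activator (service) image on top of a published base image.
   docker build \
       -f Dockerfile.activator \
       --build-arg BASE_TAG=latest \
       --build-arg PORT=8080 \
       --build-arg PUBSUB_VERIFICATION_TOKEN=placeholder-token \
       -t prompt-service:local .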
14 changes: 8 additions & 6 deletions README.rst
@@ -1,13 +1,15 @@
################
prompt_prototype
################
#################
prompt_processing
#################

``prompt_prototype`` is a package in the `LSST Science Pipelines <https://pipelines.lsst.io>`_.
``prompt_processing`` is a package in the `LSST Science Pipelines <https://pipelines.lsst.io>`_.

``prompt_prototype`` contains code used for testing concepts for the Rubin Observatory Prompt Processing framework.
The design for the framework is described in `DMTN-219`_.
``prompt_processing`` contains code used for the Rubin Observatory Prompt Processing framework.
The design for the framework is described in `DMTN-219`_ and `DMTN-260`_.

.. _DMTN-219: https://dmtn-219.lsst.io/

.. _DMTN-260: https://dmtn-260.lsst.io/

At present, the package does not conform exactly to the layout or conventions of LSST stack packages.
In particular, the Python code does not use the ``lsst`` namespace, and the docs do not support Sphinx builds (even local ones).
2 changes: 1 addition & 1 deletion SConstruct
@@ -1,4 +1,4 @@
# -*- python -*-
from lsst.sconsUtils import scripts
# Python-only package
scripts.BasicSConstruct("prompt_prototype", disableCc=True, noCfgFile=True)
scripts.BasicSConstruct("prompt_processing", disableCc=True, noCfgFile=True)
2 changes: 1 addition & 1 deletion bin.src/make_hsc_rc2_export.py
@@ -1,5 +1,5 @@
#!/usr/bin/env python
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
2 changes: 1 addition & 1 deletion bin.src/make_latiss_export.py
@@ -1,5 +1,5 @@
#!/usr/bin/env python
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
2 changes: 1 addition & 1 deletion bin.src/make_preloaded_export.py
@@ -1,5 +1,5 @@
#!/usr/bin/env python
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
4 changes: 2 additions & 2 deletions bin.src/make_remote_butler.py
@@ -1,5 +1,5 @@
#!/usr/bin/env python
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
@@ -49,7 +49,7 @@ def _make_parser():
parser.add_argument("--target-repo", required=True,
help="The URI of the repository to create.")
parser.add_argument("--seed-config",
default=os.path.join(getPackageDir("prompt_prototype"), "etc", "db_butler.yaml"),
default=os.path.join(getPackageDir("prompt_processing"), "etc", "db_butler.yaml"),
help="The config file to use for the new repository. Defaults to etc/db_butler.yaml.")
parser.add_argument("--export-file", default="export.yaml",
help="The export file containing the repository contents. Defaults to ./export.yaml.")
2 changes: 1 addition & 1 deletion bin.src/make_template_export.py
@@ -1,5 +1,5 @@
#!/usr/bin/env python
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
@@ -1,5 +1,5 @@
#!/bin/bash
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
@@ -21,7 +21,7 @@
# along with this program. If not, see <https://www.gnu.org/licenses/>.

# This script uploads the raw files from the HSC PDR2 run to a bucket at USDF.
# It renames the files to match prompt_prototype conventions. The user must
# It renames the files to match prompt_processing conventions. The user must
# have bucket access already configured.

set -e # Abort on any error
37 changes: 18 additions & 19 deletions doc/playbook.rst
@@ -1,6 +1,6 @@
#########################################################
Playbook for the Prompt Processing Proposal and Prototype
#########################################################
######################################################
Developers' Playbook for the Prompt Processing Service
######################################################

.. _DMTN-219: https://dmtn-219.lsst.io/

@@ -9,24 +9,24 @@ Table of Contents

* `Containers`_
* `Buckets`_
* `Prototype Service`_
* `Development Service`_
* `tester`_
* `Databases`_


Containers
==========

The prototype consists of two containers.
The service consists of two containers.
The first is a base container with the Science Pipelines "stack" code and networking utilities.
The second is a service container made from the base that has the Prompt Processing prototype service code.
All containers are managed by `GitHub Container Registry <https://github.com/orgs/lsst-dm/packages?repo_name=prompt_prototype>`_ and are built using GitHub Actions.
The second is a service container made from the base that has the Prompt Processing service code.
All containers are managed by `GitHub Container Registry <https://github.com/orgs/lsst-dm/packages?repo_name=prompt_processing>`_ and are built using GitHub Actions.

To build the base container:

* If there are changes to the container, push them to a branch, then open a PR.
The container should be built automatically.
* If there are no changes (typically because you want to use an updated Science Pipelines container), go to the repository's `Actions tab <https://github.com/lsst-dm/prompt_prototype/actions/workflows/build-base.yml>`_ and select "Run workflow".
* If there are no changes (typically because you want to use an updated Science Pipelines container), go to the repository's `Actions tab <https://github.com/lsst-dm/prompt_processing/actions/workflows/build-base.yml>`_ and select "Run workflow".
From the dropdown, select the branch whose container definition will be used, and the label of the Science Pipelines container.
* New containers built from ``main`` are tagged with the corresponding Science Pipelines release (plus ``w_latest`` or ``d_latest`` if the release was requested by that name).
For automatic ``main`` builds, or if the corresponding box in the manual build is checked, the new container also has the ``latest`` label.
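The same manual trigger can also be issued from the command line with the GitHub CLI. A sketch, assuming ``gh`` is installed and authenticated; the ``stackTag`` input name comes from ``build-base.yml``, and the tag value is illustrative:

.. code-block:: sh

   # Trigger a base-container build from main against a chosen Science Pipelines tag.
   gh workflow run build-base.yml \
       --repo lsst-dm/prompt_processing \
       --ref main \
       -f stackTag=w_latest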
@@ -42,7 +42,7 @@ To build the service container:

* If there are changes to the service, push them to a branch, then open a PR.
The container should be built automatically using the ``latest`` base container.
* To force a rebuild manually, go to the repository's `Actions tab <https://github.com/lsst-dm/prompt_prototype/actions/workflows/build-service.yml>`_ and select "Run workflow".
* To force a rebuild manually, go to the repository's `Actions tab <https://github.com/lsst-dm/prompt_processing/actions/workflows/build-service.yml>`_ and select "Run workflow".
From the dropdown, select the branch whose code should be built.
The container will be built using the ``latest`` base container, even if there is a branch build of the base.
* To use a base other than ``latest``, edit ``.github/workflows/build-service.yml`` on the branch and override the ``BASE_TAG_LIST`` variable.
@@ -112,8 +112,8 @@ To inspect them with the MinIO Client ``mc`` tool, first set up an alias (e.g. `

For Butler not to complain about the bucket names, set the environment variable ``LSST_DISABLE_BUCKET_VALIDATION=1``.
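A minimal setup sketch; the alias name, endpoint URL, and credentials are placeholders:

.. code-block:: sh

   # Register an alias for the USDF S3 endpoint (endpoint and keys are placeholders).
   mc alias set usdf-pp https://s3-endpoint.example.org "$ACCESS_KEY" "$SECRET_KEY"
   # Keep Butler from rejecting the non-standard bucket names.
   export LSST_DISABLE_BUCKET_VALIDATION=1
   # List the contents of the test bucket through the alias.
   mc ls usdf-pp/rubin:rubin-pp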

Prototype Service
=================
Development Service
===================

The service can be controlled with ``kubectl`` from ``rubin-devl``.
You must first `get credentials for the development cluster <https://k8s.slac.stanford.edu/usdf-prompt-processing-dev>`_ on the web; ignore the installation instructions and copy the commands from the second box.
@@ -146,8 +146,7 @@ This file fully supports the Go template syntax.

A few useful commands for managing the service:

* ``kubectl config set-context usdf-prompt-processing-dev --namespace=prompt-proto-service`` sets the default namespace for the following ``kubectl`` commands to ``prompt-proto-service``.
Note that many of the workflows in `slaclab/rubin-usdf-prompt-processing`_ run in the ``knative-serving`` or ``knative-eventing`` namespaces; to examine the resources of these workflows, add e.g. ``-n knative-eventing`` to the examples below.
* ``kubectl config set-context usdf-prompt-processing-dev --namespace=prompt-proto-service-<instrument>`` sets the default namespace for the following ``kubectl`` commands to ``prompt-proto-service-<instrument>``.
* ``kubectl get serving`` summarizes the state of the service, including which revision(s) are currently handling messages.
A revision with 0 replicas is inactive.
* ``kubectl get pods`` lists the Kubernetes pods that are currently running, how long they have been active, and how recently they crashed.
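Put together, a typical inspection session might look like the following sketch; the instrument suffix in the namespace is a placeholder:

.. code-block:: sh

   # Point kubectl at the per-instrument namespace, then check service and pod state.
   kubectl config set-context usdf-prompt-processing-dev --namespace=prompt-proto-service-latiss
   kubectl get serving
   kubectl get pods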
@@ -179,14 +178,14 @@ To delete such services manually:
Identifying a Pod's Codebase
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To identify which version of ``prompt-prototype`` a pod is running, run
To identify which version of Prompt Processing a pod is running, run

.. code-block:: sh

kubectl describe pod <pod name> | grep "prompt-proto-service@"
kubectl describe pod <pod name> | grep "prompt-service@"

This gives the hash of the service container running on that pod.
Actually mapping the hash to a branch version may require a bit of detective work; `the GitHub container registry <https://github.com/lsst-dm/prompt_prototype/pkgs/container/prompt-proto-service>`_ (which calls hashes "Digests") is a good starting point.
Actually mapping the hash to a branch version may require a bit of detective work; `the GitHub container registry <https://github.com/lsst-dm/prompt_processing/pkgs/container/prompt-service>`_ (which calls hashes "Digests") is a good starting point.

To find the version of Science Pipelines used, find the container's page in the GitHub registry, then search for ``EUPS_TAG``.
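If Docker is available locally, the tag can also be read straight from the published image metadata; a sketch, assuming the ``latest`` service container:

.. code-block:: sh

   # Dump the image metadata and pick out the recorded EUPS_TAG.
   docker pull ghcr.io/lsst-dm/prompt-service:latest
   docker inspect ghcr.io/lsst-dm/prompt-service:latest | grep EUPS_TAG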

@@ -218,12 +217,12 @@ It can be run from ``rubin-devl``, but requires the user to install the ``conflu

You must have a profile set up for the ``rubin:rubin-pp`` bucket (see `Buckets`_, above).

Install the prototype code, and set it up before use:
Install the Prompt Processing code, and set it up before use:

.. code-block:: sh

git clone https://github.com/lsst-dm/prompt_prototype
setup -r prompt_prototype
git clone https://github.com/lsst-dm/prompt_processing
setup -r prompt_processing

The tester scripts send ``next_visit`` events for each detector via Kafka on the ``next-visit-topic`` topic.
They then upload a batch of files representing the snaps of the visit to the ``rubin:rubin-pp`` S3 bucket, simulating incoming raw images.
2 changes: 1 addition & 1 deletion etc/db_butler.yaml
@@ -1,4 +1,4 @@
# Seed config for generating Prompt Prototype central repo.
# Seed config for generating Prompt Processing central repo.
# This is the test repo to be used with upload.py, not the repo for processing AuxTel data.
registry:
db: postgresql://[email protected]/ppcentralbutler
2 changes: 1 addition & 1 deletion pipelines/LATISS/ApPipe.yaml
@@ -2,7 +2,7 @@ description: Alert Production pipeline specialized for LATISS

# This file should strive to contain just an import from ap_pipe.
# Exceptions are allowed temporarily when urgent bug fixes and
# prompt_prototype build can't wait for the lsst_distrib
# prompt_processing build can't wait for the lsst_distrib
# release schedule.
imports:
- location: $AP_PIPE_DIR/pipelines/LATISS/ApPipe.yaml
2 changes: 1 addition & 1 deletion pipelines/LATISS/Isr.yaml
@@ -1,6 +1,6 @@
description: ISR-only pipeline specialized for LATISS

imports:
- location: $PROMPT_PROTOTYPE_DIR/pipelines/LATISS/ApPipe.yaml
- location: $PROMPT_PROCESSING_DIR/pipelines/LATISS/ApPipe.yaml
include:
- isr
2 changes: 1 addition & 1 deletion pipelines/LATISS/SingleFrame.yaml
@@ -1,6 +1,6 @@
description: Single-frame pipeline specialized for LATISS

imports:
- location: $PROMPT_PROTOTYPE_DIR/pipelines/LATISS/ApPipe.yaml
- location: $PROMPT_PROCESSING_DIR/pipelines/LATISS/ApPipe.yaml
include:
- processCcd
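These imports rely on the renamed ``PROMPT_PROCESSING_DIR`` variable, which EUPS defines once the package is set up. A quick check that a specialized pipeline still resolves, as a sketch run from the package root:

.. code-block:: sh

   # After setting up the package, expand the LATISS ISR pipeline to confirm the import resolves.
   setup -r .
   echo "$PROMPT_PROCESSING_DIR"
   pipetask build -p "$PROMPT_PROCESSING_DIR/pipelines/LATISS/Isr.yaml" --show pipeline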
4 changes: 2 additions & 2 deletions python/activator/activator.py
@@ -1,4 +1,4 @@
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
@@ -46,7 +46,7 @@
)
from .visit import FannedOutVisit

PROJECT_ID = "prompt-proto"
PROJECT_ID = "prompt-processing"

# The short name for the instrument.
instrument_name = os.environ["RUBIN_INSTRUMENT"]
2 changes: 1 addition & 1 deletion python/activator/config.py
@@ -1,4 +1,4 @@
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
4 changes: 2 additions & 2 deletions python/activator/logger.py
@@ -1,4 +1,4 @@
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
@@ -99,7 +99,7 @@ def _set_context_logger():


def setup_usdf_logger(labels=None):
"""Set global logging settings for prompt_prototype.
"""Set global logging settings for prompt_processing.

Calling this function redirects all warnings to go through the logger.

2 changes: 1 addition & 1 deletion python/activator/middleware_interface.py
@@ -1,4 +1,4 @@
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
2 changes: 1 addition & 1 deletion python/activator/raw.py
@@ -1,4 +1,4 @@
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
21 changes: 21 additions & 0 deletions python/activator/visit.py
@@ -1,3 +1,24 @@
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
# (https://www.lsst.org).
# See the COPYRIGHT file at the top-level directory of this distribution
# for details of code ownership.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

__all__ = ["FannedOutVisit", "SummitVisit", "BareVisit"]

from dataclasses import dataclass, field, asdict
21 changes: 21 additions & 0 deletions python/tester/upload.py
@@ -1,3 +1,24 @@
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
# (https://www.lsst.org).
# See the COPYRIGHT file at the top-level directory of this distribution
# for details of code ownership.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

import dataclasses
import datetime
import itertools
8 changes: 3 additions & 5 deletions python/tester/upload_hsc_rc2.py
@@ -1,4 +1,4 @@
# This file is part of prompt_prototype.
# This file is part of prompt_processing.
#
# Developed for the LSST Data Management System.
# This product includes software developed by the LSST Project
@@ -149,10 +149,8 @@ def get_hsc_visit_list(butler, n_sample):
def prepare_one_visit(kafka_url, group_id, butler, visit_id):
"""Extract metadata and send next_visit events for one HSC-RC2 visit

One ``next_visit`` message is sent for each detector, to mimic the
current prototype design in which a single message is sent from the
Summit to the USDF and then a USDF-based server translates it into
multiple messages.
One ``next_visit`` message is sent to the development fan-out service,
which translates it into multiple messages.

Parameters
----------