Production deploy 10/2/2023 #526

Merged Oct 5, 2023 · 30 commits

Commits
4ad3c73
add ADR 0007: ADR: Manage total record retention dynamically
stvnrlly Sep 19, 2023
21fdaeb
Bump vulture from 2.8 to 2.9.1
dependabot[bot] Sep 26, 2023
f32e3f6
Merge pull request #509 from GSA/dependabot/pip/vulture-2.9.1
stvnrlly Sep 27, 2023
5a4e828
add date
stvnrlly Sep 27, 2023
5c08762
Merge branch 'main' into adr/auto/0007
stvnrlly Sep 27, 2023
f1a4516
notify-api-360 stop worrying about aws regions
Sep 27, 2023
a6c563c
Merge pull request #467 from GSA/adr/auto/0007
stvnrlly Sep 27, 2023
6ac331e
Merge pull request #510 from GSA/notify-api-360
ccostino Sep 27, 2023
7355a21
Bump async-timeout from 4.0.2 to 4.0.3
dependabot[bot] Sep 27, 2023
bebce82
instructions for bulk testing and change delivery receipt delay to 2 …
Sep 28, 2023
294723c
merge from main
Sep 28, 2023
e019e9c
Update OWASP ZAP scans
ccostino Sep 28, 2023
27197b6
code review feedback
Sep 29, 2023
96b224d
streamline setup instructions
stvnrlly Sep 29, 2023
cd37c24
remove unmaintained docker stuff
stvnrlly Sep 29, 2023
6ae3417
Merge pull request #511 from GSA/dependabot/pip/async-timeout-4.0.3
stvnrlly Sep 29, 2023
66494f9
Merge pull request #519 from GSA/stvnrlly/setup-streamline
ccostino Sep 29, 2023
d5d432d
Merge pull request #516 from GSA/bulk_testing
ccostino Sep 29, 2023
f81f048
Merge pull request #517 from GSA/update-zap-scans
stvnrlly Sep 29, 2023
8af7a55
notify-api-520 persist the provider response even for successful sms …
Sep 29, 2023
819cbb3
Merge pull request #523 from GSA/notify-api-520
ccostino Sep 29, 2023
bd09c63
notify-api-521 fix sms temporary failure message
Oct 2, 2023
2487aeb
remove debugging
Oct 2, 2023
d3c9614
Bump urllib3 from 1.26.16 to 1.26.17
dependabot[bot] Oct 3, 2023
6af998a
notify-api-521 code review feedback and fix code coverage
Oct 3, 2023
ef64bd4
fix time passed to boto3
Oct 3, 2023
679dee8
fix tests
Oct 3, 2023
d666020
Merge pull request #529 from GSA/dependabot/pip/urllib3-1.26.17
ccostino Oct 3, 2023
5c31365
merge from main
Oct 3, 2023
f8bb8af
Merge pull request #527 from GSA/notify-api-521
ccostino Oct 3, 2023
2 changes: 1 addition & 1 deletion .github/workflows/checks.yml
@@ -131,7 +131,7 @@ jobs:
run: make run-flask &
env:
SQLALCHEMY_DATABASE_TEST_URI: postgresql://user:password@localhost:5432/test_notification_api
- name: Run OWASP Baseline Scan
- name: Run OWASP API Scan
uses: zaproxy/[email protected]
with:
docker_name: 'ghcr.io/zaproxy/zaproxy:weekly'
10 changes: 7 additions & 3 deletions .github/workflows/daily_checks.yml
@@ -75,14 +75,18 @@ jobs:
run: make bootstrap
env:
SQLALCHEMY_DATABASE_TEST_URI: postgresql://user:password@localhost:5432/test_notification_api
NOTIFY_E2E_TEST_EMAIL: ${{ secrets.NOTIFY_E2E_TEST_EMAIL }}
NOTIFY_E2E_TEST_HTTP_AUTH_PASSWORD: ${{ secrets.NOTIFY_E2E_TEST_HTTP_AUTH_PASSWORD }}
NOTIFY_E2E_TEST_HTTP_AUTH_USER: ${{ secrets.NOTIFY_E2E_TEST_HTTP_AUTH_USER }}
NOTIFY_E2E_TEST_PASSWORD: ${{ secrets.NOTIFY_E2E_TEST_PASSWORD }}
- name: Run server
run: make run-flask &
env:
SQLALCHEMY_DATABASE_TEST_URI: postgresql://user:password@localhost:5432/test_notification_api
- name: Run OWASP Baseline Scan
uses: zaproxy/action-api-scan@v0.4.0
- name: Run OWASP API Scan
uses: zaproxy/action-api-scan@v0.5.0
with:
docker_name: 'owasp/zap2docker-weekly'
docker_name: 'ghcr.io/zaproxy/zaproxy:weekly'
target: 'http://localhost:6011/docs/openapi.yml'
fail_action: true
allow_issue_writing: false
44 changes: 9 additions & 35 deletions README.md
@@ -29,6 +29,7 @@ On MacOS, using [Homebrew](https://brew.sh/) for package management is highly re
* [postgresql](https://www.postgresql.org/): `brew install postgresql@15` (Homebrew requires a version pin, but any recent version will work)
* [redis](https://redis.io/): `brew install redis`
* [pyenv](https://github.com/pyenv/pyenv): `brew install pyenv`
* [poetry](https://python-poetry.org/docs/#installation): `brew install poetry`
1. [Log into cloud.gov](https://cloud.gov/docs/getting-started/setup/#set-up-the-command-line): `cf login -a api.fr.cloud.gov --sso`
1. Ensure you have access to the `notify-local-dev` and `notify-staging` spaces in cloud.gov
1. Run the development terraform with:
@@ -39,19 +40,19 @@ On MacOS, using [Homebrew](https://brew.sh/) for package management is highly re
```

1. If you want to send data to New Relic from your local development environment, set `NEW_RELIC_LICENSE_KEY` within `.env`
1. Follow the instructions for either `Direct installation` or `Docker installation` below
1. Start Postgres && Redis

### Direct installation

1. Set up Postgres && Redis on your machine

1. Install [poetry](https://python-poetry.org/docs/#installation)
```
brew services start postgresql@15
brew services start redis
```

1. Install
1. Run the project setup

`make bootstrap`

1. Run the web server and background worker
1. Run the web server and background workers

`make run-procfile`

@@ -65,33 +66,6 @@ On MacOS, using [Homebrew](https://brew.sh/) for package management is highly re

`make run-celery`


### VS Code && Docker installation

If you're working in VS Code, you can also leverage Docker for a containerized dev environment

1. Uncomment the `Local Docker setup` lines in `.env` and comment out the `Local direct setup` lines.

1. Install the Remote-Containers plug-in in VS Code

1. With Docker running, create the network:

`docker network create notify-network`

1. Using the command palette (shift+cmd+p) or green button thingy in the bottom left, search and select “Remote Containers: Open Folder in Container...” When prompted, choose **devcontainer-api** folder (note: this is a *subfolder* of notifications-api). This will start the container in a new window, replacing the current one.

1. Wait a few minutes while things happen 🍵

1. Open a VS Code terminal and run the Flask application:

`make run-flask`

1. Open another VS Code terminal and run Celery:

`make run-celery`

NOTE: when you change .env in the future, you'll need to rebuild the devcontainer for the change to take effect. VS Code _should_ detect the change and prompt you with a toast notification during a cached build. If not, you can find a manual rebuild in command pallette or just `docker rm` the notifications-api container.

### Known installation issues

On M1 Macs, if you get a `fatal error: 'Python.h' file not found` message, try a different method of installing Python. Installation via `pyenv` is known to work.
@@ -156,4 +130,4 @@ Work through [commit `e604385`](https://github.com/GSA/notifications-api/commit/

## Contributing

As stated in [CONTRIBUTING.md](CONTRIBUTING.md), all contributions to this project will be released under the CC0 dedication. By submitting a pull request, you are agreeing to comply with this waiver of copyright interest.
As stated in [CONTRIBUTING.md](CONTRIBUTING.md), all contributions to this project will be released under the CC0 dedication. By submitting a pull request, you are agreeing to comply with this waiver of copyright interest.
37 changes: 28 additions & 9 deletions app/celery/provider_tasks.py
@@ -1,6 +1,5 @@
import os
from datetime import datetime, timedelta
from time import time

from flask import current_app
from sqlalchemy.orm.exc import NoResultFound
@@ -21,8 +20,13 @@
NOTIFICATION_DELIVERED,
NOTIFICATION_FAILED,
NOTIFICATION_TECHNICAL_FAILURE,
NOTIFICATION_TEMPORARY_FAILURE,
)

# This is the amount of time to wait after sending an sms message before we check the aws logs and look for delivery
# receipts
DELIVERY_RECEIPT_DELAY_IN_SECONDS = 120


@notify_celery.task(
bind=True,
@@ -45,9 +49,17 @@ def check_sms_delivery_receipt(self, message_id, notification_id, sent_at):
status = "success"
provider_response = "this is a fake successful localstack sms message"
else:
status, provider_response = aws_cloudwatch_client.check_sms(
message_id, notification_id, sent_at
)
try:
status, provider_response = aws_cloudwatch_client.check_sms(
message_id, notification_id, sent_at
)
except NotificationTechnicalFailureException as ntfe:
provider_response = "Unable to find carrier response -- still looking"
status = "pending"
update_notification_status_by_id(
notification_id, status, provider_response=provider_response
)
raise self.retry(exc=ntfe)

if status == "success":
status = NOTIFICATION_DELIVERED
@@ -56,7 +68,9 @@ def check_sms_delivery_receipt(self, message_id, notification_id, sent_at):
# if status is not success or failure the client raised an exception and this method will retry

if status == NOTIFICATION_DELIVERED:
sanitize_successful_notification_by_id(notification_id)
sanitize_successful_notification_by_id(
notification_id, provider_response=provider_response
)
current_app.logger.info(
f"Sanitized notification {notification_id} that was successfully delivered"
)
@@ -74,8 +88,6 @@ def check_sms_delivery_receipt(self, message_id, notification_id, sent_at):
)
def deliver_sms(self, notification_id):
try:
# Get the time we are doing the sending, to minimize the time period we need to check over for receipt
now = round(time() * 1000)
current_app.logger.info(
"Start sending SMS for notification id: {}".format(notification_id)
)
@@ -96,11 +108,18 @@ def deliver_sms(self, notification_id):
message_id = send_to_providers.send_sms_to_provider(notification)
# We have to put it in UTC. For other timezones, the delay
# will be ignored and it will fire immediately (although this probably only affects developer testing)
my_eta = datetime.utcnow() + timedelta(seconds=300)
my_eta = datetime.utcnow() + timedelta(
seconds=DELIVERY_RECEIPT_DELAY_IN_SECONDS
)
check_sms_delivery_receipt.apply_async(
[message_id, notification_id, now], eta=my_eta, queue=QueueNames.CHECK_SMS
[message_id, notification_id, notification.created_at],
eta=my_eta,
queue=QueueNames.CHECK_SMS,
)
except Exception as e:
update_notification_status_by_id(
notification_id, NOTIFICATION_TEMPORARY_FAILURE
)
if isinstance(e, SmsClientResponseException):
current_app.logger.warning(
"SMS notification delivery for id: {} failed".format(notification_id),
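
To see the whole round trip in one place, here is a minimal, self-contained sketch of the pattern this file now uses. The broker URL, queue name, retry settings, and the `look_up_receipt`/`record_status` helpers are illustrative stand-ins for `aws_cloudwatch_client.check_sms` and `update_notification_status_by_id`, not the project's real configuration: sending schedules a follow-up task `DELIVERY_RECEIPT_DELAY_IN_SECONDS` later, and that task records an interim status and retries while the carrier receipt has not yet shown up in the logs.

```python
from datetime import datetime, timedelta

from celery import Celery

# Broker URL, queue name, and retry settings are illustrative only.
app = Celery("notifications", broker="redis://localhost:6379/0")

DELIVERY_RECEIPT_DELAY_IN_SECONDS = 120  # the new delay introduced by this PR


def look_up_receipt(message_id, notification_id, sent_at):
    # Stand-in for aws_cloudwatch_client.check_sms(); raises if no receipt has landed yet.
    raise LookupError("no delivery receipt found yet")


def record_status(notification_id, status, provider_response):
    # Stand-in for update_notification_status_by_id().
    print(notification_id, status, provider_response)


@app.task(bind=True, max_retries=48, default_retry_delay=300)
def check_sms_delivery_receipt(self, message_id, notification_id, sent_at):
    try:
        status, provider_response = look_up_receipt(message_id, notification_id, sent_at)
    except LookupError as exc:
        # No CloudWatch event yet: record an interim status and try again later,
        # mirroring the new NotificationTechnicalFailureException branch in the diff.
        record_status(
            notification_id, "pending", "Unable to find carrier response -- still looking"
        )
        raise self.retry(exc=exc)
    record_status(
        notification_id, "delivered" if status == "success" else "failure", provider_response
    )


def schedule_receipt_check(message_id, notification_id, created_at):
    # Per the comment in the diff, the eta must be computed in UTC;
    # otherwise the delay is ignored and the task fires immediately.
    eta = datetime.utcnow() + timedelta(seconds=DELIVERY_RECEIPT_DELAY_IN_SECONDS)
    check_sms_delivery_receipt.apply_async(
        [message_id, notification_id, created_at], eta=eta, queue="check-sms"
    )
```
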
37 changes: 16 additions & 21 deletions app/clients/cloudwatch/aws_cloudwatch.py
@@ -2,12 +2,14 @@
import os
import re
import time
from datetime import datetime, timedelta

from boto3 import client
from flask import current_app

from app.clients import AWS_CLIENT_CONFIG, Client
from app.cloudfoundry_config import cloud_config
from app.exceptions import NotificationTechnicalFailureException


class AwsCloudwatchClient(Client):
@@ -59,14 +61,14 @@ def _get_log(self, my_filter, log_group_name, sent_at):
logGroupName=log_group_name,
filterPattern=my_filter,
nextToken=next_token,
startTime=beginning,
startTime=int(beginning.timestamp() * 1000),
endTime=now,
)
else:
response = self._client.filter_log_events(
logGroupName=log_group_name,
filterPattern=my_filter,
startTime=beginning,
startTime=int(beginning.timestamp() * 1000),
endTime=now,
)
log_events = response.get("events", [])
@@ -80,28 +82,17 @@ def _get_log(self, my_filter, log_group_name, sent_at):
break
return all_log_events

def _extract_account_number(self, ses_domain_arn, region):
account_number = ses_domain_arn
# handle cloud.gov case
if "aws-us-gov" in account_number:
account_number = account_number.replace(f"arn:aws-us-gov:ses:{region}:", "")
account_number = account_number.split(":")
account_number = account_number[0]
# handle staging case
else:
account_number = account_number.replace(f"arn:aws:ses:{region}:", "")
account_number = account_number.split(":")
account_number = account_number[0]
def _extract_account_number(self, ses_domain_arn):
account_number = ses_domain_arn.split(":")
return account_number

def check_sms(self, message_id, notification_id, created_at):
region = cloud_config.sns_region
# TODO this clumsy approach to getting the account number will be fixed as part of notify-api #258
account_number = self._extract_account_number(
cloud_config.ses_domain_arn, region
)
account_number = self._extract_account_number(cloud_config.ses_domain_arn)

log_group_name = f"sns/{region}/{account_number}/DirectPublishToPhoneNumber"
time_now = datetime.utcnow()
log_group_name = f"sns/{region}/{account_number[4]}/DirectPublishToPhoneNumber"
current_app.logger.info(
f"Log group name: {log_group_name} message id: {message_id}"
)
@@ -115,9 +106,9 @@ def check_sms(self, message_id, notification_id, created_at):
return "success", message["delivery"]["providerResponse"]

log_group_name = (
f"sns/{region}/{account_number}/DirectPublishToPhoneNumber/Failure"
f"sns/{region}/{account_number[4]}/DirectPublishToPhoneNumber/Failure"
)
# current_app.logger.info(f"Failure log group name: {log_group_name}")
current_app.logger.info(f"Failure log group name: {log_group_name}")
all_failed_events = self._get_log(filter_pattern, log_group_name, created_at)
if all_failed_events and len(all_failed_events) > 0:
current_app.logger.info("SHOULD RETURN FAILED BECAUSE WE FOUND A FAILURE")
@@ -126,6 +117,10 @@ def check_sms(self, message_id, notification_id, created_at):
current_app.logger.info(f"MESSAGE {message}")
return "failure", message["delivery"]["providerResponse"]

raise Exception(
if time_now > (created_at + timedelta(hours=3)):
# see app/models.py Notification. This message corresponds to "permanent-failure",
# but we are copy/pasting here to avoid circular imports.
return "failure", "Unable to find carrier response."
raise NotificationTechnicalFailureException(
f"No event found for message_id {message_id} notification_id {notification_id}"
)
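
A quick worked example makes the two more subtle changes above easier to verify: why indexing the split ARN at position 4 yields the account number for both the aws and aws-us-gov partitions, and why the start time handed to boto3's `filter_log_events` is converted to epoch milliseconds. The ARN below is fabricated; only its shape matters.

```python
from datetime import datetime, timezone

# Fabricated cloud.gov-style SES domain ARN; the real value comes from cloud_config.
ses_domain_arn = "arn:aws-us-gov:ses:us-gov-west-1:123456789012:identity/notify.example.gov"

# ARNs are colon-delimited: arn:partition:service:region:account-id:resource.
# Splitting therefore works for both partitions without any string replacement.
parts = ses_domain_arn.split(":")
region, account_number = parts[3], parts[4]
print(f"sns/{region}/{account_number}/DirectPublishToPhoneNumber")
# -> sns/us-gov-west-1/123456789012/DirectPublishToPhoneNumber

# filter_log_events expects startTime/endTime as milliseconds since the epoch,
# hence the int(beginning.timestamp() * 1000) conversion in the diff.
beginning = datetime(2023, 10, 2, 12, 0, tzinfo=timezone.utc)
print(int(beginning.timestamp() * 1000))  # -> 1696248000000
```
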
23 changes: 9 additions & 14 deletions app/dao/notifications_dao.py
@@ -139,7 +139,11 @@ def update_notification_status_by_id(
notification.provider_response = provider_response
if not notification.sent_by and sent_by:
notification.sent_by = sent_by
return _update_notification_status(notification=notification, status=status)
return _update_notification_status(
notification=notification,
status=status,
provider_response=notification.provider_response,
)


@autocommit
@@ -288,24 +292,15 @@ def _filter_query(query, filter_dict=None):
return query


@autocommit
def sanitize_successful_notification_by_id(notification_id):
# TODO what to do for international?
# phone_prefix = '1'
# Notification.query.filter(
# Notification.id.in_([notification_id]),
# ).update(
# {'to': phone_prefix, 'normalised_to': phone_prefix, 'status': 'delivered'}
# )
# db.session.commit()

def sanitize_successful_notification_by_id(notification_id, provider_response):
update_query = """
update notifications set notification_status='delivered', "to"='1', normalised_to='1'
update notifications set provider_response=:response, notification_status='delivered', "to"='1', normalised_to='1'
where id=:notification_id
"""
input_params = {"notification_id": notification_id}
input_params = {"notification_id": notification_id, "response": provider_response}

db.session.execute(update_query, input_params)
db.session.commit()


@autocommit
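For readers less used to the raw-SQL style in `sanitize_successful_notification_by_id`, here is a small self-contained sketch of the same pattern against a throwaway SQLite table (the schema and response text are illustrative, not the real ones): the named bind parameters are what keep the hand-written UPDATE injection-safe, and the `"to"`/`normalised_to` columns are blanked to `'1'` so the recipient's phone number is no longer stored once delivery succeeds.

```python
from sqlalchemy import create_engine, text

# Throwaway in-memory SQLite table that loosely mirrors the notifications table.
engine = create_engine("sqlite:///:memory:")

with engine.begin() as conn:
    conn.execute(text("""
        create table notifications (
            id text primary key,
            notification_status text,
            "to" text,
            normalised_to text,
            provider_response text
        )
    """))
    conn.execute(text(
        "insert into notifications values ('abc123', 'sending', '+15551234567', '+15551234567', null)"
    ))

    # Same shape as the new sanitize_successful_notification_by_id: named bind
    # parameters (:response, :notification_id) keep the hand-written UPDATE safe,
    # and the phone number columns are overwritten once delivery succeeds.
    update_query = text("""
        update notifications
        set provider_response=:response, notification_status='delivered', "to"='1', normalised_to='1'
        where id=:notification_id
    """)
    conn.execute(update_query, {"notification_id": "abc123", "response": "example carrier response"})

    row = conn.execute(
        text('select notification_status, "to", provider_response from notifications')
    ).fetchone()
    print(row)  # -> ('delivered', '1', 'example carrier response')
```
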
4 changes: 2 additions & 2 deletions app/models.py
@@ -1727,8 +1727,8 @@ def formatted_status(self):
"sms": {
"failed": "Failed",
"technical-failure": "Technical failure",
"temporary-failure": "Phone not accepting messages right now",
"permanent-failure": "Phone number doesn’t exist",
"temporary-failure": "Unable to find carrier response -- still looking",
"permanent-failure": "Unable to find carrier response.",
"delivered": "Delivered",
"sending": "Sending",
"created": "Sending",
16 changes: 5 additions & 11 deletions app/notifications/receive_notifications.py
@@ -54,19 +54,13 @@ def receive_sns_sms():
)
return jsonify(result="success", message="SMS-SNS callback succeeded"), 200

content = message.get("messageBody")
from_number = message.get("originationNumber")
provider_ref = message.get("inboundMessageId")
date_received = post_data.get("Timestamp")
provider_name = "sns"

inbound = create_inbound_sms_object(
service,
content=content,
from_number=from_number,
provider_ref=provider_ref,
date_received=date_received,
provider_name=provider_name,
content=message.get("messageBody"),
from_number=message.get("originationNumber"),
provider_ref=message.get("inboundMessageId"),
date_received=post_data.get("Timestamp"),
provider_name="sns",
)

tasks.send_inbound_sms_to_service.apply_async(