This repository has been archived by the owner on Nov 25, 2024. It is now read-only.

Save k8s logs to a volume with a retention period #278

Closed
dleard opened this issue May 29, 2024 · 5 comments

Comments


dleard commented May 29, 2024

Our pod logs currently only write to stdout & are lost when a pod is cycled. We do have access to Kibana, but it is managed outside of the team: we can't download any of the logs & the retention period is out of our hands.

We should save our own set of logs to a volume & set a retention period of 2 years for prod & 3 months for dev/test environments.

There are likely several different ways we could handle this. One possible solution would be to run a sidecar container that tails the logs to a file on an interval. There's a suggestion for implementation here. One upgrade I'd make to that implementation is to write to a different file daily (i.e. app-name-yyyy-mm-dd.log).
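For illustration only, a sidecar loop along those lines might look roughly like the sketch below. It assumes the app also writes (or is redirected) to a file on a shared emptyDir; the paths, variable names, and 10-minute interval are placeholders, not part of the linked suggestion.

```sh
#!/bin/sh
# Hypothetical sidecar loop: every 10 minutes, append any new content from a
# shared log file to a per-day file on the retained volume,
# e.g. /logs/app-name-2024-05-29.log.
APP_NAME="${APP_NAME:-app-name}"   # assumed env var
SRC="/shared/app.log"              # assumed emptyDir file the app writes to
DEST_DIR="/logs"                   # assumed PVC mount for retained logs
OFFSET=0

while true; do
  DEST="${DEST_DIR}/${APP_NAME}-$(date +%Y-%m-%d).log"
  SIZE=$(wc -c < "$SRC")
  if [ "$SIZE" -gt "$OFFSET" ]; then
    # Append only the bytes written since the last pass.
    tail -c +"$((OFFSET + 1))" "$SRC" >> "$DEST"
    OFFSET=$SIZE
  fi
  sleep 600
done
```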

Update
After some exploration, the above implementation with the sidecar container appears to assume control of cluster-level logging, which I don't think we have at our permission level. We can access the logs for a container with the following command: oc logs <pod-name> -c <container-name>.
oc logs documentation

If we can fetch the logs from a sidecar container dynamically & save them to a file, the EFK stack described below could read that file.
A container built from a k8s or OpenShift CLI image that can run kubectl logs ... or oc logs ... should be able to do this (see the sketch after the steps below):

  • Get the pod name from ENV
  • Get the container name from Helm values / templates
  • oc logs --since 10m <pod name> -c <container name> (get the logs for the last 10 minutes)
  • Sleep 10 minutes, then repeat
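A minimal sketch of that loop, assuming the pod and container names arrive as environment variables and the volume is mounted at /logs (all of those are assumptions, not settled choices):

```sh
#!/bin/sh
# Hypothetical polling loop for a utility/sidecar container that has oc installed.
POD_NAME="${POD_NAME:?expected from the pod spec, e.g. via the Downward API}"
CONTAINER_NAME="${CONTAINER_NAME:?expected from Helm values / templates}"
LOG_DIR="/logs"   # assumed mount point of the persistent volume

while true; do
  # Fetch the last 10 minutes of logs and append them to today's file,
  # e.g. /logs/<container>-2024-05-29.log.
  oc logs --since=10m "$POD_NAME" -c "$CONTAINER_NAME" \
    >> "${LOG_DIR}/${CONTAINER_NAME}-$(date +%Y-%m-%d).log"
  sleep 600
done
```

Note that the 10-minute window and the 10-minute sleep can drift apart, so a real implementation would want some overlap (or a cursor/timestamp) to avoid dropping lines.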

Suggested actions for this ticket:

  • Reach out via RC to ask if other teams are saving their app logs to a PVC / ask platform services if we're able to tap into that logging agent ourselves (programmatically in k8s, beyond the above oc logs... command).
  • If not, we could potentially run a cronjob that fetches each pod's logs, either using a Python script like trigger-k8s-cronjob or a k8s image running the above command on a schedule, and saves them to a volume that way (see the sketch below). The cronjob will need a service account with the permissions necessary to fetch the logs & save to the volume.
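As one possible shape for that cronjob (every name, the image, the schedule, and the PVC here are placeholders/assumptions, not decisions):

```sh
# Hypothetical setup for a log-fetching CronJob; names, image, and schedule are placeholders.

# 1. Service account that is allowed to read pod logs in the namespace.
oc create serviceaccount log-fetcher
oc policy add-role-to-user view -z log-fetcher

# 2. CronJob that runs the log fetch every 10 minutes and appends the output
#    to a pre-existing PVC (assumed here to be called "app-logs-pvc").
oc apply -f - <<'EOF'
apiVersion: batch/v1
kind: CronJob
metadata:
  name: fetch-app-logs
spec:
  schedule: "*/10 * * * *"
  jobTemplate:
    spec:
      template:
        spec:
          serviceAccountName: log-fetcher
          restartPolicy: Never
          containers:
            - name: fetch-logs
              image: quay.io/openshift/origin-cli   # assumed: any image that ships oc/kubectl
              command:
                - /bin/sh
                - -c
                - >
                  oc logs --since=10m "$POD_NAME" -c "$CONTAINER_NAME"
                  >> /logs/app-$(date +%Y-%m-%d).log
              env:
                - name: POD_NAME
                  value: placeholder-pod         # would need to be resolved per pod, e.g. via a label selector
                - name: CONTAINER_NAME
                  value: placeholder-container
              volumeMounts:
                - name: app-logs
                  mountPath: /logs
          volumes:
            - name: app-logs
              persistentVolumeClaim:
                claimName: app-logs-pvc
EOF
```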

Second Update
Something to explore from reaching out in RC: https://github.com/bcgov/elmsd-nodejs/tree/main/packages/openshift/templates/efk-stack
A couple of other suggestions from the RC thread on where to look:
https://stackoverflow.developer.gov.bc.ca/questions/147
https://stackoverflow.developer.gov.bc.ca/questions/732


dleard commented May 29, 2024

@pbastia @hannavovk @patriciarussellCAS @marcellmueller
Here is a card for the log retention thing I've brought up recently. I've put this in cas-obps for the registration namespace, but once this is done for one namespace I'd suggest doing it for all of our app pods in the other namespaces as well.

@ayeshmcg

@dleard @pbastia The EFK stack is ready; I created a Helm chart and added it to the repository https://github.com/bcgov/cas-efk.
As discussed with @dleard, the EFK stack looks good; now we need to look into how to access the logs.
As of now the logs are not in a log file, so I'm researching whether we can read the logs directly from STDOUT or whether we need to generate a log file and then configure our EFK stack to read it.


ayeshmcg commented Jul 30, 2024

@dleard
I have created this document that includes the details about the logging architecture and how I configured it.
https://bcgov-my.sharepoint.com/:w:/g/personal/ayesha_ayesha_gov_bc_ca/ESpUM3y-Nv1Alx3ZofHv4JsBTQD9YqLBbIIfQ_tVsHRyBw?e=KTDkEa

Logging architecture: included in the document, along with a Miro link.

This is the link to the code changes I tested with on CIF-Dev: bcgov/cas-cif#1926

Elasticsearch and Kibana are running in 9212c9-tools.

@patrickisaac

Moving the status on the Moose project side to match Giraffe.
