Reduce install-chaos-faults duration #4796

Open · wants to merge 11 commits into master

Conversation

@jemlog commented Jul 28, 2024

Proposed changes

[Screenshot: 2024-07-29, 1:00:01 AM]

Previously, even after all ChaosExperiment CRs had been created, the install-chaos-faults step still slept for an additional 30 seconds, which made its duration unnecessarily long.

To solve this, I count the number of YAML files under the /tmp directory and assign it to the faultCount variable. Then a loop checks whether the number of CRs returned by kubectl get -f /tmp/ --no-headers | wc -l equals faultCount.

[Screenshot: 2024-07-29, 1:02:39 AM]

This change shortens the install-chaos-faults execution time.
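For reference, the core of the change is a wait loop along these lines (a simplified sketch of the command quoted in the review comment below; the exact quoting inside the workflow YAML args differs):

    kubectl apply -f /tmp/ -n {{workflow.parameters.adminModeNamespace}}
    # count the fault manifests that were rendered into /tmp
    faultCount=$(ls /tmp/ | grep -E "(\.yaml$|\.yml$)" | wc -l)
    # poll until the cluster reports the same number of CRs as manifests
    until [[ $(kubectl get -f /tmp/ --no-headers | wc -l) -eq $faultCount ]]; do
      echo "Waiting for ChaosExperiment CR Ready..."
      sleep 1
    done
    echo "ChaosExperiment CR Ready"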

Types of changes

What types of changes does your code introduce to Litmus? Put an x in the boxes that apply

  • New feature (non-breaking change which adds functionality)
  • Bugfix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation Update (if none of the other choices applies)

Checklist

Put an x in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your code.

  • I have read the CONTRIBUTING doc
  • I have signed the commit so that the DCO check passes.
  • Lint and unit tests pass locally with my changes
  • I have added tests that prove my fix is effective or that my feature works (if appropriate)
  • I have added necessary documentation (if appropriate)

Dependency

  • Please add links to any dependent PRs that need to be merged before this one (if any).

Special notes for your reviewer:

@namkyu1999 changed the title from "Update install-chaos-faults duration" to "Reduce install-chaos-faults duration" on Jul 29, 2024
@namkyu1999 (Member) left a comment:

LGTM 🚀

@amityt (Contributor) left a comment:

LGTM 🚀

Comment on lines 67 to 69
args: [
'kubectl apply -f /tmp/ -n {{workflow.parameters.adminModeNamespace}} && faultCount=$(ls /tmp/ | grep -E "(.yaml$|.yml$)" | wc -l) && until [[ $(kubectl get -f /tmp/ --no-headers | wc -l) -eq $faultCount ]]; do sleep 1; echo "Waiting for ChaosExperiment CR Ready..." ; done; echo "ChaosExperiment CR Ready"'
]
@Jonsy13 (Contributor) left a comment:

Hi @jemlog
There are some edge cases that can happen here.
Let's say older ChaosExperiment CRs already exist on the user's cluster. With this command, the check will always pass and won't actually wait for the new CRs to be applied. That would be fine if we weren't making any changes to the experiment CRs, but with every release we update the go-runner in the chaos-experiment CR. So we should move on to the next step only after the new experiment CRs have actually been applied to the cluster.

@jemlog (Author) replied:

Hi @Jonsy13
To solve the edge case you mentioned, I added a step that deletes the existing ChaosExperiment CRs first.
I went with this approach because CR creation does not take much time. Please let me know if there are other edge cases.
Thanks!
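A minimal sketch of the delete-first variant described above, assuming --ignore-not-found is used so the step does not fail on a fresh cluster (the exact command in the final diff may differ):

    # remove any stale CRs from a previous release before applying the new ones
    kubectl delete -f /tmp/ -n {{workflow.parameters.adminModeNamespace}} --ignore-not-found
    kubectl apply -f /tmp/ -n {{workflow.parameters.adminModeNamespace}}
    # ...followed by the same faultCount wait loop shown in the proposed changes above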

@namkyu1999 (Member) commented:
Can you check the build pipeline, @jemlog?

Projects: In Review
4 participants