
Improve format consistency with lint plugins #863

Open · wants to merge 16 commits into base: release/v4.1.0
Conversation

Tgenz1213

Supersedes #833

The previous PR was based on the wrong branch, and rebasing led to a cumbersome number of merge conflicts. It also contained too many ambitious edits.

Together with issue #818, this PR heads off future basic linting and style problems, assuming that everyone uses the lint tooling and reviewers enforce most of the rules.

@surapuramakhil changed the title from "v4.1.0 Contribution" to "Improve format consistency with lint plugins" on Nov 15, 2024
.isort.cfg (outdated; resolved)
config.py (outdated; resolved)
main.py (resolved)
requirements.txt (outdated)
@@ -19,13 +19,14 @@ Levenshtein==0.25.1
loguru==0.7.2
openai==1.37.1
pdfminer.six==20221105
pre-commit
Collaborator

Can you pin the revision to avoid conflicts later? Please do that for all of the dependencies that don't have a specific revision.
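Pinning here means giving every entry an exact `==` version so installs stay reproducible. A minimal sketch of what the pinned block could look like; the `pre-commit` version below is purely hypothetical, not necessarily what the PR ended up using:

```
# requirements.txt -- every dependency pinned to an exact revision
Levenshtein==0.25.1
loguru==0.7.2
openai==1.37.1
pdfminer.six==20221105
pre-commit==4.0.1  # hypothetical pin; use the revision that was actually tested
```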

Author

Added versions to everything. Please let me know if they're all good.

Author — @Tgenz1213, Nov 17, 2024

Testing seems to have checked out. Was everything as expected @49Simon?

.pre-commit-config.yaml (outdated; resolved)
main.py (outdated; resolved)
@@ -483,7 +485,7 @@ def _handle_upload_fields(self, element: WebElement, job_context: JobContext) ->
job_context.job.resume_path = str(self.resume_path.resolve())
job_context.job_application.resume_path = str(self.resume_path.resolve())
logger.debug(f"Resume uploaded from path: {self.resume_path.resolve()}")
else:
elif config.LLM_MODEL_TYPE is constants.OPENAI:
logger.debug("Resume path not found or invalid, generating new resume")
Collaborator

Here it should break execution, not just log a debug message. Throw an error that can close the program.
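A minimal sketch of what "break execution" could look like in the branch shown in the diff above, reusing the names that appear there (`config`, `constants`, `logger`); the opening guard, the exception type, and the message are illustrative assumptions:

```python
if resume_file_is_valid:  # assumed stand-in for the method's existing guard
    # Upload the user-provided resume, as in the existing code.
    ...
elif config.LLM_MODEL_TYPE is constants.OPENAI:
    logger.debug("Resume path not found or invalid, generating new resume")
    ...
else:
    # Break execution instead of only logging: without a resume file and
    # without OpenAI, the application cannot supply a resume at all.
    raise RuntimeError(
        "No resume file provided and resume generation requires OpenAI; "
        "re-run with --resume <path> or configure an OpenAI model."
    )
```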

Author

@surapuramakhil That would cause the program to exit for anyone not using OpenAI.

It was not my goal to change any functionality. I simply wanted to update the conditional checks to match the current dependency logic. I kept seeing feder's old resume generator throw an error because I'm using Ollama. It was annoying and distracting, so this fix gets rid of that error message for those who don't need to see it.

I could add an else branch again with an info log saying that no resume is selected and resume generation is unavailable for the current LLM model?
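A minimal sketch of that alternative, again reusing the names from the diff above; the log message wording is illustrative:

```python
elif config.LLM_MODEL_TYPE is constants.OPENAI:
    logger.debug("Resume path not found or invalid, generating new resume")
    ...
else:
    # Non-OpenAI models: skip generation, but tell the user why nothing happened.
    logger.info(
        "No resume selected and resume generation is unavailable "
        "for the current LLM model."
    )
```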

Collaborator

This would only cause an exit when they are using the resume generator and are not on OpenAI, which is exactly the case to catch as of now, since the resume generator only supports OpenAI.

Author

@surapuramakhil no, that is exactly why I opened #820. The logic is incomplete. This program always tries to upload a resume whether the flag is set or not.

Collaborator

@Tgenz1213 Explain the current code behavior:

  1. No resume is passed.
  2. The resume generator won't be called because of the LLM type.

What would the agent submit for the resume?

Author — @Tgenz1213, Nov 17, 2024

The other option is: make --resume require an argument.

Let's think about the basic functionality of a CLI app. No arguments = default behavior. Allowing --resume to take an optional argument is either wrong or misleading: using --resume alone should change the default behavior to do "something," and using --resume with an argument should change the behavior of --resume itself.

Issue #820 is meant to point out that there's no behavior difference between `python main.py` and `python main.py --resume` when there should be.
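To illustrate the distinction in argparse terms (the project's actual CLI wiring may differ; only the `--resume` option name is taken from this conversation):

```python
import argparse

parser = argparse.ArgumentParser(prog="main.py")

# Required-argument form: a bare `--resume` is rejected by argparse.
parser.add_argument(
    "--resume", metavar="PATH",
    help="apply with your own resume PDF instead of generating one per application",
)

# The misleading alternative being argued against: an optional value, where a
# bare `--resume` silently falls back to some default.
# parser.add_argument("--resume", nargs="?", const="<default>", metavar="PATH")

args = parser.parse_args()
print(args.resume)  # None when the flag is absent, otherwise the given path
```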

Collaborator — @49Simon, Nov 17, 2024

Okay, I have a couple of comments:

  1. Since this PR is supposed to be about formatting/styling the whole codebase, this commit should not have been included here. It should be a separate PR addressing a separate issue.
  2. The --resume flag is supposed to require an argument. It is used when you want to use your own resume PDF file instead of having the bot generate a new resume for every application. I've checked out this PR, and --resume seems to work as expected:
     [Screenshot 2024-11-17 at 12:50:11 PM]
  3. In terms of fixing that issue (which should not be included in this PR), this commit will wait until the bot scrolls through jobs, selects a job, scrolls again, determines suitability, and starts filling forms, only to then find out it can't generate a resume if you're not using the OpenAI API. A better fix would be a check when the user runs `python main.py` that exits the app if they're not using the OpenAI API, stating that it won't be able to generate a resume and that they should apply with their own resume file by passing the flag. Currently, it won't apply to any jobs anyway if you're not providing your resume and you're not using OpenAI.
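A minimal sketch of the early check described in point 3. The function name and the model-type string below are hypothetical; a real implementation would compare against `constants.OPENAI` as in the diff earlier in this thread:

```python
def validate_resume_options(resume_path: str | None, llm_model_type: str) -> None:
    """Fail fast at startup instead of midway through form filling."""
    if resume_path is None and llm_model_type != "openai":
        # Without a resume file and without OpenAI, the bot cannot produce a
        # resume for any application, so exit before scrolling through jobs.
        raise SystemExit(
            "Resume generation is only supported with the OpenAI API. "
            "Pass --resume <path to your resume PDF> or switch to an OpenAI model."
        )
```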

Author

@49Simon I stand corrected, at least for v4.1.0. There's a reason I thought --resume worked without args, though. It might be an issue on main, and I was keeping that in mind without double-checking on this branch.

I'll revert the commit when I have a chance and then double-check my observations about the behavior and other things. I'll work on this issue and the others separately. I just wanted to quickly remedy a bug while I noticed it. Next time I'll open a bug issue or mention it in an existing one.

Collaborator

For the resume, I just tested it on main and it still requires an argument. My understanding is that it has always required one, tbh. If you find the instance where it allowed the --resume flag without a file path, let me know.

Author

Absolutely! I will research this at my next availability.

Labels: none yet
Projects: none yet
3 participants