
[Feature] Spike on supporting Py3.12 in dbt-spark #981

Closed
3 tasks done
colin-rogers-dbt opened this issue Feb 14, 2024 · 1 comment · Fixed by #1081
Labels
enhancement New feature or request

Comments

@colin-rogers-dbt
Contributor

Is this your first time submitting a feature request?

  • I have read the expectations for open source contributors
  • I have searched the existing issues, and I could not find an existing issue for this feature
  • I am requesting a straightforward extension of existing dbt-spark functionality, rather than a Big Idea better suited to a discussion

Describe the feature

We need to try upgrading to Python 3.12; either it works with minimal changes, or we scope out what needs to happen to make it work. A rough sketch of the kind of change involved follows below.
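
As a rough illustration of what the spike would touch, here is a minimal sketch of the packaging-metadata side of adding a new Python version. The field values are illustrative placeholders, not dbt-spark's actual setup.py.

```python
# Sketch only: the kind of metadata a Py3.12 spike typically has to touch.
# Versions and field values are illustrative, not dbt-spark's real setup.py.
from setuptools import setup

setup(
    name="dbt-spark",
    python_requires=">=3.8,<3.13",  # widen the upper bound so pip allows 3.12
    classifiers=[
        "Programming Language :: Python :: 3.10",
        "Programming Language :: Python :: 3.11",
        "Programming Language :: Python :: 3.12",  # advertise 3.12 support
    ],
)
```

Beyond metadata, the spike would also need 3.12 added to the CI test matrix and a pass over dependencies that do not yet publish 3.12 wheels.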

Describe alternatives you've considered

No response

Who will this benefit?

No response

Are you interested in contributing this feature?

No response

Anything else?

No response

@colin-rogers-dbt added the enhancement (New feature or request) and triage labels, then removed triage, on Feb 14, 2024
@colin-rogers-dbt changed the title from "[Feature] Spike on supporting Py3.12" to "[Feature] Spike on supporting Py3.12 in dbt-spark" on Jul 3, 2024
@VersusFacit
Contributor

VersusFacit commented Jul 28, 2024

We need to bump pyodbc to 5.0 or higher (already included in my PR).

Versions before 5.0 do not ship prebuilt wheels, per the official repo and Stack Overflow.
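
For concreteness, the bump itself is a one-line pin. A sketch, assuming the ODBC dependency lives in an extra in setup.py (the extra's name and layout here are assumptions, not the adapter's exact metadata):

```python
# Sketch only: pin pyodbc to 5.x so prebuilt wheels are available on Py3.12
# instead of forcing a source build. Extra name and layout are illustrative.
from setuptools import setup

setup(
    name="dbt-spark",
    extras_require={
        "ODBC": ["pyodbc>=5.0.0"],  # <5.0 publishes no wheels per the pyodbc repo
    },
)
```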

This ticket is blocked until the netcat issue breaking apache-spark is resolved.

Thankfully, other tests are running just swimmingly 🏊‍♀

Because Spark uses 3.12 for most non-test release functions, I really just need to run through local operations and CI to confirm a safe merge, I think.

2 participants