
Fix test failures for Spark 4.0.0 #11004

Open · 14 of 26 tasks
razajafri opened this issue on Jun 8, 2024 · 0 comments
Labels: bug (Something isn't working), Spark 4.0+ (Spark 4.0+ issues)

Comments

razajafri (Collaborator) commented on Jun 8, 2024

This is an epic that will collect all the issues for the various integration-test and unit-test failures on Spark 4.0.0.

Tasks

1. Spark 4.0+ bug (mythrocks)
2. Spark 4.0+ bug (mythrocks)
3. Spark 4.0+ bug
4. Spark 4.0+ bug
5. Spark 4.0+ bug
6. Spark 4.0+ bug
7. Spark 4.0+ bug
8. Spark 4.0+ bug
9. Spark 4.0+ bug (0 of 3 sub-tasks)
10. Spark 4.0+ bug
11. Spark 4.0+ bug
12. Spark 4.0+ bug
13. Spark 4.0+ bug (razajafri)
14. Spark 4.0+ bug (razajafri)
15. Spark 4.0+ bug (razajafri)
16. Spark 4.0+ bug (mythrocks)
17. Spark 4.0+ bug (mythrocks)
18. Spark 4.0+ bug (mythrocks)
19. Spark 4.0+ bug (mythrocks)
20. Spark 4.0+ bug (mythrocks)
21. Spark 4.0+ bug (mythrocks)
22. Spark 4.0+ bug (mythrocks)
23. Spark 4.0+ bug (mythrocks)
24. Spark 4.0+ bug (mythrocks)
25. Spark 4.0+ bug (mythrocks)
26. Spark 4.0+ bug (mythrocks)
razajafri added the bug (Something isn't working), ? - Needs Triage (Need team to review and classify), and Spark 4.0+ (Spark 4.0+ issues) labels on Jun 8, 2024
mattahrens removed the ? - Needs Triage (Need team to review and classify) label on Jun 11, 2024
mythrocks added a commit to mythrocks/spark-rapids that referenced this issue on Sep 27, 2024:
Fixes NVIDIA#11015.
Contributes to NVIDIA#11004.

This commit addresses the tests in parquet_test.py that fail when run on Spark 4.

1. Some of the tests were failing as a result of NVIDIA#5114. Those tests have been disabled, at least until we get around to supporting aggregations with ANSI mode enabled. (A skip-marker sketch follows the commit message below.)

2. `test_parquet_check_schema_compatibility` fails on Spark 4 regardless of ANSI mode, because it tests implicit type promotions where the read schema includes wider columns than the write schema. Supporting this will require new code, so the test is disabled until NVIDIA#11512 is addressed. (A standalone illustration of the promotion scenario follows below.)

3. `test_parquet_int32_downcast` had an erroneous setup phase that fails in ANSI mode. This has been corrected, and the test was refactored to run in both ANSI and non-ANSI mode. (See the ANSI parametrization sketch below.)

Signed-off-by: MithunR <[email protected]>
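
To make item 1 concrete, here is a minimal sketch of how such tests can be skipped on Spark 4. The helper `is_before_spark_400()` is an assumption standing in for the version helpers the spark-rapids test harness keeps around; a local stand-in is included so the snippet runs on its own.

```python
import pytest

def is_before_spark_400():
    # Stand-in for a harness helper (name assumed): True when the Spark
    # under test is older than 4.0.0.
    from pyspark.version import __version__
    return tuple(int(p) for p in __version__.split(".")[:2]) < (4, 0)

@pytest.mark.skipif(not is_before_spark_400(),
                    reason="aggregations with ANSI mode enabled are not yet "
                           "supported (NVIDIA#5114)")
def test_parquet_aggregation_example():
    # Placeholder body; the real tests live in parquet_test.py.
    ...
```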
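Item 2's failure mode can be reproduced without the plugin at all. The sketch below, using an illustrative file path, writes an INT32 column and reads it back through a schema that declares the column as LONG: Spark 4 promotes the values implicitly, while the Spark 3.x vectorized reader raises a schema-compatibility error instead.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, IntegerType, LongType

spark = SparkSession.builder.appName("schema-promotion-demo").getOrCreate()
path = "/tmp/int32_data.parquet"  # hypothetical location

# Write a column as INT32.
write_schema = StructType([StructField("a", IntegerType())])
spark.createDataFrame([(1,), (2,)], write_schema) \
    .write.mode("overwrite").parquet(path)

# Read it back with a wider (LONG) schema: succeeds on Spark 4 via implicit
# type promotion; fails on earlier Spark versions.
read_schema = StructType([StructField("a", LongType())])
spark.read.schema(read_schema).parquet(path).show()
```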
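Finally, a self-contained sketch of the refactor described in item 3: parametrize the test so the same body runs with ANSI mode both on and off. The overflowing cast shows why an unguarded setup phase fails only under ANSI; the real test goes through the spark-rapids harness rather than plain PySpark.

```python
import pytest
from pyspark.sql import SparkSession

@pytest.mark.parametrize("ansi_enabled", ["true", "false"])
def test_int32_downcast_both_modes(ansi_enabled):
    spark = SparkSession.builder.appName("ansi-demo").getOrCreate()
    spark.conf.set("spark.sql.ansi.enabled", ansi_enabled)
    df = spark.createDataFrame([(2**31,)], "a: long")  # just above INT32 max
    if ansi_enabled == "true":
        # ANSI mode: an out-of-range downcast raises instead of wrapping.
        with pytest.raises(Exception):
            df.selectExpr("CAST(a AS INT) AS v").collect()
    else:
        # Legacy mode: the value silently wraps to INT32 min.
        assert df.selectExpr("CAST(a AS INT) AS v").collect()[0].v == -(2**31)
```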