Figure out the correct procedure for performing incremental writes to a partitioned Parquet directory. This is necessary when working with datasets that exceed available memory. Dask DataFrame supports appends natively, but the Polars documentation redirects to PyArrow, which doesn't provide a very robust example. Note that it is important to update the dataset metadata when writing a new partition.
Here is starter code that needs further testing:
Check the test.parquet result against Polars' native partitioned-write path: