
Import Tables -- Batch Processing #2862 (Open)

therealslimhsiehdy (Contributor) opened this issue Sep 27, 2024 · 0 comments

Issue description

We are currently unable to re-import data from Stable onto a target sandbox because too much data is processed at once. We need to break the import up into batches of 1000 rows per file, as we already do in export_tables. See the Slack discussion for reference and images.
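As a rough illustration of the 1000-rows-per-file pattern used on the export side, here is a minimal sketch of splitting a source CSV into chunk files. The function name, file paths, and chunk-writing details are assumptions for illustration, not the actual export_tables.py code:

```python
import csv
from itertools import islice

CHUNK_SIZE = 1000  # rows per output file, matching the export_tables batch size


def split_csv(source_path: str, dest_prefix: str, chunk_size: int = CHUNK_SIZE) -> list[str]:
    """Split source_path into numbered CSV files of at most chunk_size rows each.

    Hypothetical helper: returns the list of chunk file paths it wrote.
    """
    written = []
    with open(source_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)  # repeat the header in every chunk file
        chunk_number = 0
        while True:
            rows = list(islice(reader, chunk_size))
            if not rows:
                break
            chunk_number += 1
            dest = f"{dest_prefix}_{chunk_number}.csv"
            with open(dest, "w", newline="", encoding="utf-8") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(rows)
            written.append(dest)
    return written
```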

Acceptance criteria

  • Break up import processing into 1000 rows per file in import_tables.py -- reference the code in export_tables.py, which does something very similar for the export process (see the sketch after this list).
  • When importing, the process should not end up marked as "Killed"; it should complete successfully.
  • Run the new script successfully on litterbox, which will close Re-import data from stable onto litterbox #2709 (please reference that ticket in your PR with the "resolves" keyword).
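Below is a hedged sketch of what the import side could look like once the chunk files exist: each file is read and flushed to the database on its own, so memory use stays bounded and the process is not killed partway through. The Contact model, the registrar.models import path, and the assumption that CSV column names match model field names are illustrative, not confirmed details of import_tables.py:

```python
import csv
import glob

from registrar.models import Contact  # hypothetical import path


def import_contacts(dest_prefix: str) -> None:
    """Import one chunk file at a time so no more than ~1000 rows are in memory."""
    for chunk_path in sorted(glob.glob(f"{dest_prefix}_*.csv")):
        with open(chunk_path, newline="", encoding="utf-8") as f:
            reader = csv.DictReader(f)
            # Assumes CSV column names match the model's field names.
            batch = [Contact(**row) for row in reader]
        # One bulk insert per chunk file; skip rows that already exist.
        Contact.objects.bulk_create(batch, ignore_conflicts=True)
```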

Additional context

Currently it is failing on the Contact table.

Links to other issues
