Add comments for exporting urls from S3 in BigQueryDriver
KSDaemon committed Sep 30, 2024
1 parent 0d02a3a commit 17ed037
Showing 1 changed file with 5 additions and 0 deletions.
5 changes: 5 additions & 0 deletions packages/cubejs-bigquery-driver/src/BigQueryDriver.ts
@@ -325,6 +325,11 @@ export class BigQueryDriver extends BaseDriver implements DriverInterface {
const bigQueryTable = this.bigquery.dataset(schema).table(tableName);
const [job] = await bigQueryTable.createExtractJob(destination, { format: 'CSV', gzip: true });
await this.waitForJobResult(job, { table }, false);
// There is an existing implementation for extracting and signing unloaded file URLs:
// @see BaseDriver->extractUnloadedFilesFromS3()
// Use it if it suits your case. This driver follows a different flow because
// BigQuery already requires the storage/bucket object for other operations,
// so there is no need to instantiate a second one (as extractUnloadedFilesFromS3() does).
const [files] = await this.bucket.getFiles({ prefix: `${table}-` });
const urls = await Promise.all(files.map(async file => {
const [url] = await file.getSignedUrl({
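The signed-URL fan-out in the hunk above can be sketched in isolation. This is a minimal sketch, not the driver's actual code: the `SignableFile` interface and the `signExportedFiles` helper are hypothetical names introduced here, shaped to be compatible with `@google-cloud/storage`'s `File#getSignedUrl`:

```typescript
// A file object that can produce a time-limited signed URL, matching the
// shape of @google-cloud/storage's File#getSignedUrl (hypothetical interface).
interface SignableFile {
  name: string;
  getSignedUrl(opts: { action: 'read'; expires: number }): Promise<[string]>;
}

// Sign every exported chunk produced by the extract job. Promise.all
// preserves the input order, so URLs line up with the file list.
async function signExportedFiles(
  files: SignableFile[],
  ttlMs: number = 60 * 60 * 1000, // assumed TTL of one hour
): Promise<string[]> {
  return Promise.all(
    files.map(async (file) => {
      const [url] = await file.getSignedUrl({
        action: 'read',
        expires: Date.now() + ttlMs,
      });
      return url;
    }),
  );
}
```

With the real client, `files` would come from `this.bucket.getFiles({ prefix: `${table}-` })` as in the diff, reusing the bucket object the driver already holds rather than constructing a new storage client.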
