When exporting a result set containing a timestamp column, the timestamp values appear as raw epoch numbers in the CSV file. The user expects these timestamps to appear in the same format as displayed in the result set.
Versions of Everything
Databricks Driver for SQLTools v0.4.2
Visual Studio Code v1.92.2 (Universal)
Date: 2024-08-14T17:29:30.058Z
Electron: 30.1.2
ElectronBuildId: 9870757
Chromium: 124.0.6367.243
Node.js: 20.14.0
V8: 12.4.254.20-electron.0
OS: Darwin arm64 23.5.0
Steps to Recreate
I run the following query against a SQL Warehouse from Visual Studio Code via the Databricks Driver for SQLTools:
select dtv from explode(
  array(
    curdate(),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint))
  )
) as t(dtv);
The query runs successfully, and the output appears in a results pane in VS Code.
I right-click on the result set and choose Save Results as CSV from the context menu.
This produces a CSV file in which the dates are represented as epoch values rather than as formatted date strings.
The user is trying to use the CSV export feature to share results with non-technical personnel, and finds manually converting the epochs back into date/time strings tedious.
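For reference, undoing that conversion on the database side would look something like the sketch below. The epoch value is hypothetical, and I have not confirmed whether the export emits seconds or milliseconds, so both variants are shown:

-- hypothetical epoch value; 1723593600 corresponds to 2024-08-14 00:00:00 UTC
select
  from_unixtime(1723593600) as from_seconds,       -- if the export writes seconds
  timestamp_millis(1723593600000) as from_millis;  -- if the export writes milliseconds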
Workaround
Explicitly converting the timestamp to a string via the date_format function produces the desired output. For example, the following query:
select date_format(dtv, 'M/d/y') as dtv from explode(
  array(
    curdate(),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
    date_add(curdate(), cast(-100 * round(random(), 2) as tinyint))
  )
) as t(dtv);
This creates saved CSV output in which the dates appear as formatted strings rather than epochs.

NOTE: This issue was initially filed against the sqltools-databricks-driver project; see issue #88. That team indicated the code for the export feature is part of the sqltools project, so I'm filing the issue with sqltools.
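As an aside, a plain cast may be a lighter-weight workaround than date_format when the default yyyy-MM-dd rendering is acceptable; a minimal sketch, not yet tested against the export path:

-- casting a date to string yields the default 'yyyy-MM-dd' rendering
select cast(dtv as string) as dtv
from explode(array(curdate())) as t(dtv);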