Datetime Fields Appear as Epoch when Exported from Visual Studio Code #1374

Open

adamfortuno opened this issue Aug 29, 2024 · 0 comments

When exporting a result set containing a timestamp column, the timestamp values appear as Unix epoch values (milliseconds) in the CSV file. The user expects these timestamps to appear in the same format as they are displayed in the result set.

Versions of Everything

  • Databricks Driver for SQLTools v0.4.2
  • Visual Studio Code v1.92.2 (Universal)
    • Date: 2024-08-14T17:29:30.058Z
    • Electron: 30.1.2
    • ElectronBuildId: 9870757
    • Chromium: 124.0.6367.243
    • Node.js: 20.14.0
    • V8: 12.4.254.20-electron.0
    • OS: Darwin arm64 23.5.0

Steps to Recreate

I ran the following query against a SQL Warehouse from Visual Studio Code via the Databricks Driver for SQLTools:

select dtv from explode (
    array(
        curdate(),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint))
    )
) as t(dtv);

The query runs successfully. I see the output in a results pane in VSC:

[screenshot: query results displayed as dates in the VSC results pane]

I right-click on the result set and choose Save Results as CSV from the context menu:

[screenshot: Save Results as CSV option in the context menu]

This produces a CSV file where the dates are represented as epochs:

[screenshot: CSV file contents showing epoch values]

The values that appear in the CSV file are as follows:

"dtv"
1724198400000
1719360000000
1719014400000
1720569600000
1721433600000
1719792000000
1715904000000

The user is trying to use the CSV export feature to share results with non-technical personnel and finds manually converting the epoch values back into date/time strings tedious.
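
For reference, that manual conversion looks roughly like the following (a minimal sketch assuming Node.js; reading the exported values as Unix epoch milliseconds is an inference, though converting the first value back does yield 2024-08-21, the curdate() of the run):

// Sketch only: turn exported CSV cells (epoch milliseconds) back into dates.
// Sample values are taken from the CSV output above.
const exported = ["1724198400000", "1719360000000", "1715904000000"];

for (const cell of exported) {
  const d = new Date(Number(cell));          // interpret the cell as epoch ms
  console.log(d.toISOString().slice(0, 10)); // first value prints "2024-08-21"
}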

Workaround

Explicitly converting the timestamp to a string with the date_format function produces the desired output. For example, the following query:

select date_format(dtv, 'M/d/y') as dtv from explode (
    array(
        curdate(),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint)),
        date_add(curdate(), cast(-100 * round(random(), 2) as tinyint))
    )
) as t(dtv);

This produces a saved CSV file whose contents appear as follows:

"dtv"
"8/21/2024"
"8/11/2024"
"8/3/2024"
"7/22/2024"
"5/28/2024"
"6/29/2024"
"7/19/2024"

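Presumably the fix belongs in the export path rather than in user queries. As a purely hypothetical sketch (this is not the actual sqltools serializer, and toCsvCell is an invented name), a CSV writer could special-case Date cells instead of serializing their numeric epoch value:

// Hypothetical illustration only -- not sqltools code. The idea: render Date
// cells as display strings at export time instead of epoch numbers.
function toCsvCell(value: unknown): string {
  if (value instanceof Date) {
    return `"${value.toISOString()}"`; // or match the grid's display format
  }
  return typeof value === "string" ? `"${value}"` : String(value);
}

console.log(toCsvCell(new Date(1724198400000))); // "2024-08-21T00:00:00.000Z"
console.log(toCsvCell(1724198400000));           // 1724198400000 (current behavior)
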
NOTE: This issue was initially filed against the sqltools-databricks-driver project (see issue #88). That team indicated the code for the export feature is part of the sqltools project, so I'm filing this issue with sqltools.
