The `extract` function pulls out a part of a timestamp. In particular, this issue is about these components:
* `MILLISECOND` — returns the number of milliseconds since the last full second.
* `MICROSECOND` — returns the number of microseconds since the last full millisecond.
* `NANOSECOND` — returns the number of nanoseconds since the last full microsecond.
* `SUBSECOND` — returns the number of microseconds since the last full second of the given timestamp.
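The component definitions above can be sketched in Python using `datetime` (which only stores microsecond precision, so `NANOSECOND` is omitted); the names here are just illustrative variables, not part of any engine's API:

```python
# Sketch of the MILLISECOND / MICROSECOND / SUBSECOND definitions above.
# datetime stores sub-second precision as microseconds since the last full second.
from datetime import datetime

ts = datetime.fromisoformat("2016-12-31 13:30:15.100259")

subsecond = ts.microsecond        # microseconds since the last full second
millisecond = subsecond // 1000   # milliseconds since the last full second
microsecond = subsecond % 1000    # microseconds since the last full millisecond

print(millisecond)   # 100
print(microsecond)   # 259
print(subsecond)     # 100259
```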
Consider the query:

```sql
SELECT EXTRACT(MICROSECONDS FROM TIMESTAMP '2016-12-31 13:30:15.100259');
```

According to that definition, I would expect to get `259`. However, I get the following:
| Engine | Result |
| --- | --- |
| Postgres | 15100259 |
| DuckDB | 15100259 |
| DataFusion | not supported |
| PyArrow Compute | 259 |
| SQL Server (`datepart`) | 100259 |
| Velox | unknown |
| Spark | can only extract seconds, which returns 15.100259 |
So we have at least three different behaviors. Maybe we should just drop these from the method?
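For reference, all three observed `MICROSECOND` results can be derived from the same timestamp by choosing a different reference point; this is an assumed interpretation of each engine's behavior, not documentation of their internals:

```python
# Deriving the three observed results for '13:30:15.100259'
# from different reference points (assumed interpretations):
from datetime import datetime

ts = datetime.fromisoformat("2016-12-31 13:30:15.100259")

# PyArrow-style: microseconds since the last full millisecond
pyarrow_style = ts.microsecond % 1000

# SQL Server-style: microseconds since the last full second
sqlserver_style = ts.microsecond

# Postgres/DuckDB-style: microseconds since the last full minute
postgres_style = ts.second * 1_000_000 + ts.microsecond

print(pyarrow_style)     # 259
print(sqlserver_style)   # 100259
print(postgres_style)    # 15100259
```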
At least with Spark 3.4, it doesn't appear to be possible to extract anything smaller than a second from a timestamp, and extracting seconds gives fractional seconds. That is, `SELECT EXTRACT(SECONDS FROM TIMESTAMP '2016-12-31 13:30:15.100259')` returns `15.100259`.