
convert_datatype could have improved coerce flexibility #69

Open
geekpete opened this issue Mar 1, 2019 · 0 comments
geekpete commented Mar 1, 2019

For all general issues, please provide the following details for fast resolution:

  • Version: 6.6.0
  • Operating System: Ubuntu 18.04.2 LTS
  • Sample Data:

This was the farequote dataset: a CSV file with a field containing a numeric value that uses a comma as a thousands separator, e.g. 10,144.183

"February 11th 2017, 18:59:25.000",AVjeRNz82UMJWf0OwkY7,"farequote-2017",,responsetime,JBU,"10,144.183"

  • Steps to Reproduce:

Found that Dissect's convert_datatype cannot coerce a number containing a thousands-separator comma, resulting in this warning:

[2019-02-08T15:01:06,484][WARN ][org.logstash.dissect.Dissector] Dissector datatype conversion, value cannot be coerced, field: responsetime, value: 10,144.183
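For context, a minimal dissect configuration along these lines reproduces the warning (the field names are assumptions based on the sample row above; only responsetime matters here):

```
filter {
  dissect {
    # Field names are guesses from the sample CSV row; adjust to taste.
    mapping => {
      "message" => "\"%{timestamp}\",%{doc_id},\"%{index_name}\",%{},%{metric},%{airline},\"%{responsetime}\""
    }
    # Fails with "value cannot be coerced" when responsetime is "10,144.183"
    convert_datatype => { "responsetime" => "float" }
  }
}
```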

The Mutate filter's convert option, by contrast, was able to coerce it.
This means having to add a separate mutate filter block after dissect just to get
this data into a numeric type.
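The workaround looks something like the following: a defensive sketch that strips the thousands separator before converting, rather than relying on any particular coercion behaviour in mutate:

```
filter {
  mutate {
    # Strip thousands-separator commas so the value parses cleanly,
    # then convert the cleaned string to a float.
    gsub    => [ "responsetime", ",", "" ]
    convert => { "responsetime" => "float" }
  }
}
```

It would be preferable for dissect's convert_datatype to handle this directly, which is the ask of this issue.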

I also saw the especially illuminating ticket #10, which goes into some detail on how different the implementations are between this plugin and others with seemingly similar functionality, so I understand if this enhancement isn't a trivial ask.
