Added basic JSON column support #65

Open
kalinon wants to merge 4 commits into master

Conversation

kalinon (Contributor) commented Aug 14, 2018

At a minimum we should be able to read/write to JSON columns via a String. This solves for that.

#64
amberframework/granite#193

kalinon (Contributor, Author) commented Aug 14, 2018

Changed it to serialize/deserialize a JSON::Any object
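
For illustration, a rough usage sketch of what reading and writing a JSON column as JSON::Any could look like through crystal-db once this lands. The connection URL, the docs table, and the data column are made up for the example, and whether query results map back to JSON::Any depends on how this PR ends up:

require "db"
require "json"
require "mysql"

DB.open("mysql://root@localhost/test") do |db|
  payload = JSON.parse(%({"example": "json"}))   # JSON::Any
  # Assumes this PR lets JSON::Any be passed as a bind value.
  db.exec "INSERT INTO docs (data) VALUES (?)", payload
  # Assumes this PR maps JSON columns back to JSON::Any.
  value = db.query_one "SELECT data FROM docs LIMIT 1", as: JSON::Any
  puts value["example"].as_s   # "json"
end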

ysbaddaden left a comment

Looks good. Maybe just use MySQL::Type::JSON instead of MySQL::Type::Json but that's nitpicking.

kalinon (Contributor, Author) commented Aug 18, 2018 via email

ysbaddaden commented

@bcardiff any input?

RX14 requested a review from bcardiff on August 31, 2018
bcardiff (Member) left a comment

Besides some specific feedback, this is my overall thinking on the topic:

  1. I would prefer the JSON parsing to work directly from the IO instead of reading all the input with read_lenenc_string. For this, an IO::Sized should probably be used, plus some checks in the connection. (I would even want to expose that when doing ResultSet#read(IO).) A rough sketch follows below.

  2. I would like to see support for types using JSON.mapping in order to have a type-safe version, both for reading and writing. Similar to 1, without allocating the whole JSON in a String before sending it to the socket.

  3. At the same time, the JSON API in the stdlib has had some changes in recent versions. I would wait for those to settle down a bit before adding them as a hard dependency.

I would move forward using IO for JSON types for now, and build the parsing of JSON on top of that later, or at least allow the user to choose how to do it.
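
To make point 1 concrete, here is a rough sketch of IO-based reading, under the assumption that the packet can expose its length-encoded payload size and its underlying IO; read_lenenc_int and packet.io are hypothetical names, not the driver's actual API:

def self.read(packet)
  size = packet.read_lenenc_int                         # hypothetical: payload length from the protocol
  json_io = IO::Sized.new(packet.io, read_size: size)   # hypothetical packet.io accessor
  json_io.skip(1)                                       # mirrors the PR's lchop of the leading \u0013 byte
  JSON.parse(json_io)                                   # JSON.parse accepts an IO directly
end

For point 2, a type declared with JSON.mapping could be fed the same IO for a type-safe read, e.g. Example.from_json(json_io) with:

class Example
  JSON.mapping(example: String)
end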

@@ -49,6 +49,7 @@ DB::DriverSpecs(MySql::Any).run do
sample_value Time.utc(2016, 2, 15, 10, 15, 30, nanosecond: 543_012_000), "timestamp(6)", "TIMESTAMP '2016-02-15 10:15:30.543012'"
sample_value Time::Span.new(0, 10, 15, 30, nanoseconds: 543_000_000), "Time(3)", "TIME '10:15:30.543'"
sample_value Time::Span.new(0, 10, 15, 30, nanoseconds: 543_012_000), "Time(6)", "TIME '10:15:30.543012'"
sample_value JSON::Any.new({"example" => JSON::Any.new("json")}), "JSON", "'{\"example\": \"json\"}'", type_safe_value: false
Member

I thought it would be a type_safe_value. Why isn't it?

  str = packet.read_lenenc_string.lchop("\u0013")
  JSON.parse(str)
rescue e
  nil
Member

Why should exceptions be swallowed?
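
A minimal alternative that lets parse errors surface instead of turning them into nil; this is the PR's own read path minus the rescue, shown as a sketch rather than a prescription:

def self.read(packet)
  str = packet.read_lenenc_string.lchop("\u0013")
  JSON.parse(str)   # raises JSON::ParseException on malformed input
end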

end

def self.read(packet)
  str = packet.read_lenenc_string.lchop("\u0013")
Member

Does MySQL send a trailing \u0013 on all JSON values? Is there documentation or a reference for this?

kalinon (Contributor, Author)

It seems to be a prefix for all the JSON data. I have not been able to find MySQL documentation on it, however.

kalinon (Contributor, Author) commented Jun 6, 2019

Bringing the branch up to speed and checking in.

Does this need more work?
