Hi!
I have a table called worker (id, user_id, surname, forename) and the following function (tracked as a mutation in Hasura):
CREATE OR REPLACE FUNCTION public.insert_records(objects worker[])
RETURNS SETOF worker
LANGUAGE plpgsql VOLATILE
AS $function$
BEGIN
  RETURN QUERY
    INSERT INTO public.worker
    SELECT * FROM unnest(objects)
    RETURNING *;
END;
$function$;
When I run the mutation, I get this error:
{
"errors": [
{
"extensions": {
"code": "parse-failed",
"path": "$.selectionSet.insert_records.args.args.objects"
},
"message": "A string is expected for type: _worker"
}
]
}
Why would Hasura interpret a worker array as a _worker string?
Can I use the Hasura-generated [worker_insert_input!]! type as the input argument instead, and make it work with a function?
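(For anyone hitting the same error: _worker is PostgreSQL's internal name for the worker[] array type, and Hasura exposes user-defined types it cannot map as String, expecting PostgreSQL's textual literal form. A minimal sketch of what that string would look like, assuming plain values with no commas, quotes, parentheses, or NULLs; the helper name is mine, not part of any API:)

```python
def worker_array_literal(rows):
    """Format rows as a PostgreSQL array-of-composite literal, i.e. the
    textual form a _worker "string" argument would have to parse as.
    Sketch only: real values containing commas, quotes, parentheses,
    or NULLs would need additional escaping."""
    elems = ['"({})"'.format(",".join(str(v) for v in row)) for row in rows]
    return "{" + ",".join(elems) + "}"

print(worker_array_literal([(1, 10, "Doe", "John"), (2, 11, "Roe", "Jane")]))
# {"(1,10,Doe,John)","(2,11,Roe,Jane)"}
```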
My goal is to circumvent the 65535-parameter limit, so I can bulk insert hundreds of thousands of records in the same transaction. I found that the UNNEST SQL function can 'unpack' an array, which is why I created a new SQL function for the bulk insert. If not the method above, what would be the correct way to achieve this goal? Batching the inserts would be very complicated because of my insert triggers and other side effects (not explained above), so I am trying to avoid it.
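(For context on the number in question: 65535 is the maximum count of bind parameters in one statement under PostgreSQL's extended query protocol, where the parameter count is a 16-bit integer. A quick sketch of the arithmetic, using the four columns of the worker table:)

```python
# PostgreSQL's extended query protocol encodes the bind-parameter count
# as a 16-bit integer, so one statement carries at most 65535 parameters.
MAX_PARAMS = 65535
COLUMNS = 4  # worker(id, user_id, surname, forename)

# A multi-row "INSERT ... VALUES" binds one parameter per column per row,
# which caps the batch size:
max_rows_per_insert = MAX_PARAMS // COLUMNS
print(max_rows_per_insert)  # 16383

# Passing the whole batch as a single worker[] argument (unpacked
# server-side with unnest) uses one parameter regardless of row count.
params_for_array_call = 1
```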
regards,
Norbi