Distributed batch or streaming serving with Apache Spark #2177
Unanswered
jack1981 asked this question in Show and tell
Replies: 1 comment

jack1981:
Hi team,

I found in the feature list that BentoML supports "Distributed batch or streaming serving with Apache Spark", but I couldn't find any example or demo explaining it. Could you please publish one?

Thanks!
Jack Song from Mastercard

Reply:
Hi @jack1981, we have some sample code for running batch inference on Spark with the 0.13 release, although it is not an official API yet. The new BentoML 1.0 release is around the corner, and we are building new APIs for batch inference; it would be great to discuss and hear more about your use case. Are you in the community Slack? Feel free to ping me there.
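The reply above mentions unofficial sample code for batch inference on Spark but does not show it. A common pattern for this kind of distributed batch scoring is to wrap a pandas-in, pandas-out predict function in a Spark pandas UDF so each partition is scored in parallel. The sketch below illustrates only that pattern; the `load_model` helper and the toy model are hypothetical placeholders, not a BentoML API, and the Spark wiring is shown as comments since it assumes a live SparkSession.

```python
# A minimal sketch of the pandas-UDF batch-inference pattern.
# NOTE: load_model() is a hypothetical placeholder, not a BentoML API.
import pandas as pd


def load_model():
    # Placeholder: in practice this would load the packaged model once
    # per executor (e.g. from a saved model bundle on shared storage).
    return lambda df: df["x"] * 2.0  # toy "model": doubles the feature


def predict_batch(pdf: pd.DataFrame) -> pd.Series:
    """Pandas-in, pandas-out scoring function used as the UDF body."""
    model = load_model()
    return pd.Series(model(pdf))


# Spark wiring (requires a live SparkSession; shown for context only):
#   from pyspark.sql.functions import pandas_udf
#   score = pandas_udf(
#       lambda x: predict_batch(pd.DataFrame({"x": x})), "double"
#   )
#   spark_df = spark_df.withColumn("prediction", score(spark_df["x"]))

if __name__ == "__main__":
    out = predict_batch(pd.DataFrame({"x": [1.0, 2.0, 3.0]}))
    print(out.tolist())  # [2.0, 4.0, 6.0]
```

Loading the model inside the UDF (rather than on the driver) matters because each Spark executor runs in a separate process and cannot share the driver's in-memory model.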