GSoC 2024 Project 12: Prototype for JAX/Flax Models Support (or Keras 3, Mindspore) #23531
@MonalSD, @awayzjj, @chux0519, @LucaTamSapienza, @AsakusaRinne, FYI
Hi Roman,

I'm very interested in exploring this extended choice because I'm familiar with TensorFlow and XLA (as you can see, I'm one of the maintainers of Tensorflow.NET). Could you please explain the rationale behind this idea in more depth? The idea description states that "Jax/Flax provides much faster training than PyTorch/Tensorflow". However, I find this claim not entirely convincing, because the speedup actually comes from the XLA compiler, which makes a comparison with native PyTorch code unfair. (Please correct me if I'm mistaken here 😊)

I hope to maintain this JAX/Flax support in the long term if my prototype is accepted, even after GSoC ends in September, so I would be more than happy if supporting JAX/Flax proves valuable for users and the OpenVINO team.

Would you mind if I sent you an email to ask for some suggestions directly? The proposal submission deadline is approaching, and I believe I can complete the draft this week. :) Additionally, should I aim to complete the work within 175 hours? I'm not sure whether more time is needed for a thorough completion.

Thank you for your time and support.

Best regards,
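For context on the XLA point above: in JAX the compilation step is explicit via `jax.jit`, which traces a function once and compiles it with XLA so later calls with the same shapes reuse the compiled kernel. A minimal sketch (the `mlp_layer` function and its shapes are hypothetical, just for illustration):

```python
import jax
import jax.numpy as jnp

def mlp_layer(w, b, x):
    # A simple dense layer with tanh activation (hypothetical example).
    return jnp.tanh(x @ w + b)

# jax.jit traces the function and compiles it with XLA; subsequent
# calls with the same input shapes reuse the compiled executable.
fast_layer = jax.jit(mlp_layer)

w = jnp.ones((4, 3))
b = jnp.zeros((3,))
x = jnp.ones((2, 4))

out = fast_layer(w, b, x)
print(out.shape)  # (2, 3)
```

This is why a JAX-vs-PyTorch benchmark conflates framework and compiler: the un-jitted `mlp_layer` runs eagerly, while the jitted version runs XLA-compiled code.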
Hey @rkazants, @AsakusaRinne and @chux0519,

It's truly invigorating to see so many innovative ideas brought to the table for this project. As a newcomer to the world of open source, I find this level of enthusiasm incredibly motivating.

I agree with Yongsheng's point about the need for preprocessing on the Python side before parsing and generating the corresponding IR. Developing an adapter that standardizes JAX-based models into an abstract format seems like a good starting point; it would make subsequent processing and compatibility much smoother.

Moreover, since locally saved models essentially adhere to a pytree structure, reconstructing this pytree during IR conversion would let us mirror the practices embraced by the wider community, giving new contributors a familiar framework to engage with.

Additionally, I propose benchmarking our converted OpenVINO models against the original JAX models. This would give us insight into the impact we can achieve and bolster the credibility of our proof of concept.

Looking forward to delving deeper into these concepts with all of you.
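To make the pytree point concrete: Flax model parameters are nested dicts of arrays, and JAX's `tree_util` can flatten them into a list of weight leaves plus a `treedef` that records the nesting, which is roughly what an IR converter would need to walk. A minimal sketch, with a hypothetical two-layer parameter layout:

```python
import jax
import jax.numpy as jnp

# Hypothetical Flax-style parameter pytree for a two-layer model.
params = {
    "Dense_0": {"kernel": jnp.ones((2, 2)), "bias": jnp.zeros((2,))},
    "Dense_1": {"kernel": jnp.ones((2, 1)), "bias": jnp.zeros((1,))},
}

# tree_flatten yields the leaves (weight arrays) and a treedef
# describing the exact nesting, so the structure can be rebuilt.
leaves, treedef = jax.tree_util.tree_flatten(params)
print(len(leaves))  # 4

# Reconstruction mirrors what a converter could do after mapping
# each leaf to an OpenVINO constant.
rebuilt = jax.tree_util.tree_unflatten(treedef, leaves)
```

Because the `treedef` fully captures the nesting, a converter only has to map leaves to IR constants and can restore the original structure on demand.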
Dear contributors (@MonalSD, @awayzjj, @chux0519, @LucaTamSapienza, @AsakusaRinne),

If you want to apply for

Best regards,
Dear participants in GSoC'24,
Please note that project idea 12 has been extended with the option to implement a prototype for Flax/JAX model support in OpenVINO.
Feel free to ask your questions under this thread.
Best regards,
Roman