This repository has been archived by the owner on Jul 18, 2024. It is now read-only.

Intel® End-to-End AI Optimization Kit release v1.1

Released by @xuechendi on 28 Apr 04:02 · 407 commits to main since this release · commit dc292e4

Highlights

This release introduces a new component: Model Adaptor. It applies transfer-learning methodologies to reduce training time, improve inference throughput, and reduce data-labeling effort by taking advantage of public pretrained models and datasets. Model Adaptor provides three methods: Finetuner, Distiller, and Domain Adapter. It currently supports ResNet, BERT, GPT-2, and 3D U-Net models, covering the Image Classification, Natural Language Processing, and Medical Segmentation domains.
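To make the Distiller idea concrete, below is a minimal, generic knowledge-distillation sketch in PyTorch. It is an illustration of the technique only, not the e2eAIOK Distiller API: the models, temperature `T`, and loss weights are all placeholder choices, and a tiny linear "teacher" stands in for a large pretrained model.

```python
# Generic knowledge-distillation sketch (NOT the e2eAIOK Distiller API):
# a small student learns from a teacher's softened logits plus hard labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

teacher = nn.Linear(8, 4)   # placeholder for a large pretrained model
student = nn.Linear(8, 4)   # smaller model to be trained
optimizer = torch.optim.SGD(student.parameters(), lr=0.1)
T = 2.0                     # temperature: softens the teacher's distribution

x = torch.randn(32, 8)
labels = torch.randint(0, 4, (32,))

losses = []
for _ in range(20):
    with torch.no_grad():
        t_logits = teacher(x)            # teacher is frozen
    s_logits = student(x)
    # KL divergence between softened distributions, scaled by T^2
    soft = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                    F.softmax(t_logits / T, dim=1),
                    reduction="batchmean") * T * T
    # standard supervised loss on the hard labels
    hard = F.cross_entropy(s_logits, labels)
    loss = 0.5 * soft + 0.5 * hard
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
```

The combined loss lets the student match the teacher's inter-class similarities (the soft term) while still fitting the ground-truth labels (the hard term); the 0.5/0.5 weighting here is arbitrary.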

This release provides the following major features:

  • Model Adaptor Finetuner
  • Model Adaptor Distiller
  • Model Adaptor Domain Adapter
  • Support for Hugging Face models in training-free NAS
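The training-free NAS feature above can be illustrated with a toy zero-cost proxy: candidate networks are ranked by a cheap statistic computed on one random batch, with no training loop. This is a generic sketch under assumed names (`grad_norm_score`, the two candidate models), not e2eAIOK's actual scoring function.

```python
# Toy training-free NAS sketch (NOT e2eAIOK's API): rank candidate networks
# by a zero-cost proxy -- here, the parameter-gradient norm on one batch.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

def grad_norm_score(model, x, y):
    """Sum of parameter-gradient norms on a single random batch."""
    model.zero_grad()
    F.cross_entropy(model(x), y).backward()
    return sum(p.grad.norm().item()
               for p in model.parameters() if p.grad is not None)

x = torch.randn(16, 8)
y = torch.randint(0, 4, (16,))

# Two hypothetical candidate architectures from a search space.
candidates = {
    "narrow": nn.Sequential(nn.Linear(8, 4)),
    "wide": nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 4)),
}
scores = {name: grad_norm_score(m, x, y) for name, m in candidates.items()}
best = max(scores, key=scores.get)   # highest proxy score wins
```

In practice a real search space would contain many candidates (e.g. Hugging Face model configurations), and the proxy replaces full training as the ranking signal.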

Improvements

  • Updated demos with Colab click-to-run support
  • Updated Docker image with Jupyter support

Versions and Components

  • TensorFlow 2.10.0
  • PyTorch 1.5, 1.12
  • Intel® Extension for TensorFlow 2.10.x
  • Intel® Extension for PyTorch 0.2, 1.12.x
  • Horovod 0.26
  • Python 3.9.12

Links

Full Changelog: https://github.com/intel/e2eAIOK/commits/v1.1