AIVM is a cutting-edge framework designed for privacy-preserving inference using advanced cryptographic protocols. With AIVM, you can deploy a local development network (devnet) to explore private inference using provided examples or custom models.
- Create a virtual environment:

  ```shell
  python3 -m venv .venv
  ```
- Activate the virtual environment:

  On Linux/macOS:

  ```shell
  source .venv/bin/activate
  ```

  On Windows:

  ```shell
  .\.venv\Scripts\activate
  ```
- Install the package:

  If you are going to run the examples, install the package with the example dependencies:

  ```shell
  pip install "nillion-aivm[examples]"
  ```

  Otherwise, if you are going to write your own code, the base package is enough:

  ```shell
  pip install nillion-aivm
  ```
- Start the AIVM devnet:

  ```shell
  aivm-devnet
  ```
- Open the provided Jupyter notebook `examples/getting-started.ipynb` to run private inference examples on AIVM.
- After completing your tasks, terminate the devnet process by pressing `CTRL+C`.
For additional usage, refer to the examples folder, which demonstrates how to set up private inference workflows using AIVM.
- First, import the AIVM client and check available models:

  ```python
  import aivm_client as aic

  # List all supported models
  available_models = aic.get_supported_models()
  print(available_models)
  ```
- Prepare your input data. Here's an example using PyTorch to generate a random input:

  ```python
  import torch

  # Create a sample input (e.g., for LeNet5 MNIST)
  random_input = torch.randn((1, 1, 28, 28))  # Batch size 1, 1 channel, 28x28 pixels
  ```
- Encrypt your input using the appropriate Cryptensor:

  ```python
  # Encrypt the input
  encrypted_input = aic.LeNet5Cryptensor(random_input)
  ```
- Perform secure inference:

  ```python
  # Get prediction while maintaining privacy
  result = aic.get_prediction(encrypted_input, "LeNet5MNIST")
  ```
The `get_prediction` function automatically handles the secure computation protocol with the `aivm-devnet` nodes, ensuring that your input data remains private throughout the inference process.
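Putting the steps together, a minimal end-to-end script might look like the following. This is a sketch based on the calls shown above; it assumes the `aivm-devnet` process is already running and that `LeNet5MNIST` appears among the supported models:

```python
import torch

import aivm_client as aic

# Confirm the model we want to use is available on the devnet
available_models = aic.get_supported_models()
assert "LeNet5MNIST" in available_models, available_models

# Build an MNIST-shaped input: batch size 1, 1 channel, 28x28 pixels
random_input = torch.randn((1, 1, 28, 28))

# Encrypt the input so the devnet nodes never see the raw tensor
encrypted_input = aic.LeNet5Cryptensor(random_input)

# Run secure inference and print the prediction
result = aic.get_prediction(encrypted_input, "LeNet5MNIST")
print(result)
```

For real data, apply the same preprocessing the model was trained with before encrypting the input.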
You can deploy your own trained models to AIVM, provided they follow the supported architectures (BertTiny or LeNet5).
- Import the AIVM client:

  ```python
  import aivm_client as aic
  ```
- Upload your custom model:

  ```python
  # For BertTiny models
  aic.upload_bert_tiny_model(model_path, "MyCustomBertTiny")

  # For LeNet5 models
  aic.upload_lenet5_model(model_path, "MyCustomLeNet5")
  ```
- Perform inference with your custom model:

  ```python
  # For BertTiny models
  result = aic.get_prediction(private_berttiny_input, "MyCustomBertTiny")

  # For LeNet5 models
  result = aic.get_prediction(private_lenet5_input, "MyCustomLeNet5")
  ```
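Here, `model_path` points to your trained model's weights on disk (for example, a checkpoint written with `torch.save`; see the examples folder for the exact format each upload function expects). For LeNet5 models, the private input is built with `aic.LeNet5Cryptensor` exactly as in the earlier example. For BertTiny models, the text input must be tokenized before encryption; the following sketch assumes a Hugging Face `transformers` tokenizer and assumes `aic.BertTinyCryptensor` accepts the token IDs and attention mask:

```python
import aivm_client as aic
from transformers import BertTokenizer

# Tokenize the text input. The tokenizer name and padding length are
# illustrative choices, not values mandated by AIVM.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
tokens = tokenizer(
    "Example sentence to classify",
    padding="max_length",
    max_length=128,
    truncation=True,
    return_tensors="pt",
)

# Encrypt the tokenized input (assumed BertTinyCryptensor signature)
private_berttiny_input = aic.BertTinyCryptensor(
    tokens["input_ids"], tokens["attention_mask"]
)

# Run secure inference against the custom model uploaded above
result = aic.get_prediction(private_berttiny_input, "MyCustomBertTiny")
```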
This project is licensed under the MIT License.