The Julia Ecosystem around Flux
One of Julia's main strengths is an ecosystem of packages that together provide a rich and consistent user experience.
This is a non-exhaustive list of Julia packages that complement Flux nicely
in typical machine learning and deep learning workflows. To add your project, please send a PR. See also academic work citing Flux or citing Zygote.
Flux models
- Flux's model-zoo contains examples from many domains.
Computer vision
- ObjectDetector.jl provides ready-to-go image detection via YOLO.
- Metalhead.jl includes many state-of-the-art computer vision models which can easily be used for transfer learning (see the sketch after this list).
- UNet.jl is a generic UNet implementation.
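Metalhead's pretrained weights make it quick to start a transfer-learning model. Below is a minimal sketch, assuming Metalhead's `pretrain` keyword and `backbone` helper (its API at the time of writing) and a hypothetical 10-class target task:

```julia
# A minimal transfer-learning sketch with Metalhead.jl: reuse a pretrained
# ResNet backbone and attach a fresh classification head.
using Flux, Metalhead

model = ResNet(18; pretrain = true)    # ImageNet-pretrained weights
feat = Metalhead.backbone(model)       # convolutional feature extractor

# New head for a hypothetical 10-class task
# (ResNet-18's backbone ends with 512 channels).
head = Chain(AdaptiveMeanPool((1, 1)), Flux.flatten, Dense(512 => 10))
newmodel = Chain(feat, head)
```

One would then freeze or fine-tune `feat` while training `head` on the new data.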
Natural language processing
- Transformers.jl provides components for Transformer models for NLP, as well as providing several trained models out of the box.
- TextAnalysis.jl provides several NLP algorithms that use Flux models under the hood.
Reinforcement learning
- AlphaZero.jl provides a generic, simple and fast implementation of Deepmind's AlphaZero algorithm.
- ReinforcementLearning.jl offers a collection of tools for doing reinforcement learning research in Julia.
Graph learning
- GraphNeuralNetworks.jl is a fresh, performant and flexible graph neural network library based on Flux.jl.
- GeometricFlux.jl is the first graph neural network library for Julia.
- NeuralOperators.jl enables learning the solution operators of infinite-dimensional PDEs as continuous functions, instead of solving them with the finite element method.
- SeaPearl.jl is a Constraint Programming solver that uses Reinforcement Learning with graph-structured inputs.
Time series
- FluxArchitectures.jl is a collection of advanced network architectures for time series forecasting.
Robust networks
- RobustNeuralNetworks.jl includes classes of neural networks that are constructed to naturally satisfy robustness constraints.
Tools closely associated with Flux
Utility tools you're unlikely to have come across if you've never used Flux!
High-level training flows
- FastAI.jl is a Julia port of Python's fast.ai library.
- FluxTraining.jl is a package for using and writing powerful, extensible training loops for deep learning models. It supports callbacks for many common use cases like hyperparameter scheduling, metrics tracking and logging, checkpointing, early stopping, and more. It powers training in FastAI.jl.
- Ignite.jl is a Julia port of the Python library ignite, for simplifying neural network training and validation loops using events and handlers.
- Tsunami.jl adds high-level ways to control training, parameter schedules & logging, heavily inspired by pytorch-lightning.
Datasets
Commonly used machine learning datasets are provided by the following packages in the Julia ecosystem:
- MLDatasets.jl focuses on downloading, unpacking, and accessing benchmark datasets (see the sketch after this list).
- GraphMLDatasets.jl: a library for machine learning datasets on graphs.
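As a quick illustration of MLDatasets.jl, the sketch below loads MNIST; the `features`/`targets` field names follow the package's dataset containers:

```julia
# A minimal sketch of loading a benchmark dataset with MLDatasets.jl.
using MLDatasets

trainset = MNIST(:train)   # downloads on first use (may ask to accept terms)
x = trainset.features      # 28×28×60000 Float32 array
y = trainset.targets       # 60000 integer labels
```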
Plumbing
Tools to get data into the right shape and order for training a model.
- Augmentor.jl is a real-time image augmentation library for increasing the number of training images.
- DataAugmentation.jl aims to make it easy to build stochastic, label-preserving augmentation pipelines for vision use cases involving images, keypoints and segmentation masks.
- MLUtils.jl (replaces MLDataUtils.jl and MLLabelUtils.jl) is a library for processing Machine Learning datasets.
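For instance, MLUtils.jl's `DataLoader` handles shuffling and mini-batching; a minimal sketch with stand-in random data:

```julia
# A minimal sketch of mini-batching with MLUtils.jl's DataLoader.
using MLUtils

X = rand(Float32, 10, 100)   # 100 observations with 10 features each
y = rand(Float32, 100)

loader = DataLoader((X, y); batchsize = 16, shuffle = true)
for (xb, yb) in loader
    # xb is a 10×16 batch; the last batch may be smaller
end
```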
Parameters
- ParameterSchedulers.jl provides standard scheduling policies for machine learning.
Differentiable programming
Packages based on differentiable programming but not necessarily related to Machine Learning.
- The SciML ecosystem uses Flux and Zygote to mix neural nets with differential equations, to get the best of black box and mechanistic modelling.
- DiffEqFlux.jl provides tools for creating Neural Differential Equations.
- Flux3D.jl shows off machine learning on 3D data.
- RayTracer.jl combines ML with computer vision via a differentiable renderer.
- Duckietown.jl is a differentiable Duckietown simulator.
- The Yao.jl project uses Flux and Zygote for Quantum Differentiable Programming.
- AtomicGraphNets.jl enables learning graph-based models on atomic systems used in chemistry.
- DiffImages.jl provides differentiable computer vision modeling in Julia with the Images.jl ecosystem.
Probabilistic programming
- Turing.jl extends Flux's differentiable programming capabilities to probabilistic programming (see the sketch after this list).
- Omega.jl is a research project aimed at causal, higher-order probabilistic programming.
- Stheno.jl provides flexible Gaussian processes.
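As a taste of the probabilistic style (independent of Flux), here is a minimal coin-flip model using Turing.jl's `@model` macro and `sample`:

```julia
# A minimal Bayesian coin-flip model in Turing.jl.
using Turing

@model function coinflip(y)
    p ~ Beta(1, 1)              # prior over the probability of heads
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)     # likelihood of each observed flip
    end
end

data = [1, 0, 1, 1, 0, 1]
chain = sample(coinflip(data), NUTS(), 1_000)   # posterior samples for p
```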
Statistics
- OnlineStats.jl provides single-pass algorithms for statistics.
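A minimal sketch of the single-pass style, using OnlineStats.jl's `Mean` statistic:

```julia
# A minimal sketch of single-pass statistics with OnlineStats.jl.
using OnlineStats

o = Mean()
fit!(o, randn(10_000))   # one pass over the stream, constant memory
value(o)                 # current estimate, ≈ 0.0 here
```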
Useful miscellaneous packages
Some useful and random packages!
- AdversarialPrediction.jl provides a way to easily optimise generic performance metrics in supervised learning settings using the Adversarial Prediction framework.
- Mill.jl helps to prototype flexible multi-instance learning models.
- MLMetrics.jl is a utility for scoring models in data science and machine learning.
- Torch.jl exposes torch in Julia.
- ValueHistories.jl is a utility for efficient tracking of optimization histories, training curves or other information of arbitrary types and at arbitrarily spaced sampling times.
- InvertibleNetworks.jl provides building blocks for invertible neural networks in the Julia programming language.
- ProgressMeter.jl provides progress meters for long-running computations.
- TensorBoardLogger.jl provides easy-peasy logging to TensorBoard from Julia.
- ArgParse.jl is a package for parsing command-line arguments to Julia programs.
- Parameters.jl provides types with default field values, keyword constructors, and (un-)pack macros.
- BSON.jl is a package for working with the Binary JSON serialisation format (see the sketch after this list).
- DataFrames.jl provides in-memory tabular data in Julia.
- DrWatson.jl is a scientific project assistant software.
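As an example of how these fit into a Flux workflow, the sketch below saves and reloads a model with BSON.jl, following the pattern long used in Flux's saving docs:

```julia
# A minimal sketch of saving and reloading a Flux model with BSON.jl.
using Flux
using BSON: @save, @load

model = Chain(Dense(10 => 5, relu), Dense(5 => 2))
@save "mymodel.bson" model    # write the model to disk

# ... later, possibly in a fresh session (Flux must be loaded first):
@load "mymodel.bson" model    # binds the saved model to `model`
```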
This tight integration among Julia packages is shown in some of the examples in the model-zoo repository.
Alternatives to Flux
Julia has several other libraries for making neural networks.
- SimpleChains.jl is focused on making small, simple, CPU-based neural networks fast. It uses LoopVectorization.jl. (This was `FastChain` in DiffEqFlux.jl.)
- Knet.jl is a neural network library built around AutoGrad.jl.
- Lux.jl (earlier ExplicitFluxLayers.jl) shares much of the design, use-case, and NNlib.jl / Optimisers.jl back-end of Flux. But instead of encapsulating all parameters within the model structure, it separates this into 3 components: a model, a tree of parameters, and a tree of model states (see the sketch at the end of this section).
Flux's training docs describe the change from Zygote's implicit to explicit gradients, i.e. from dictionary-like to tree-like structures. (See also Zygote's description of these.) Lux also uses Zygote, but uses the word "explicit" to mean something unrelated, namely storing the tree of parameters (and of state) separately from the model.
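To make the contrast concrete, here is a minimal sketch of Lux's style, where `Lux.setup` produces separate parameter and state trees and every call threads them through explicitly:

```julia
# A minimal sketch of Lux.jl's explicit-parameter design: the model is a
# stateless description, while parameters `ps` and states `st` are
# separate trees passed to every call.
using Lux, Random

model = Lux.Dense(2 => 3, tanh)
ps, st = Lux.setup(Random.default_rng(), model)

x = randn(Float32, 2, 8)
y, st = model(x, ps, st)   # returns the output and the updated state tree
```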