Flyte is a container-native, type-safe workflow and pipelines platform written in Golang and optimized for large-scale processing and machine learning. Workflows can be written in any language, with out-of-the-box support for Python.
Homepage: https://flyte.org | Docs: https://lyft.github.io/flyte
Flyte is a fabric that connects disparate computation backends using a type-safe data dependency graph. It records all changes to a pipeline, making it possible to rewind time. It also stores a history of all executions and provides an intuitive UI, CLI and REST/gRPC API to interact with the computation.
Flyte is more than a workflow engine: it provides workflows as a core concept, and it also exposes the single unit of execution, the task, as a top-level concept. Multiple tasks arranged in data producer-consumer order create a workflow. Flyte workflows are pure specification and can be created using any language, and every task can also be written in any language. First-class support is provided for Python, making Flyte well suited for modern machine learning and data processing pipelines.
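As a rough sketch of how this model looks with the Python SDK, flytekit (the decorator-based API shown here is from newer flytekit releases, and the task and workflow names are purely illustrative):

```python
from flytekit import task, workflow


@task
def normalize(x: float, mean: float, std: float) -> float:
    # A task is the single unit of execution; its typed signature is its interface.
    return (x - mean) / std


@workflow
def pipeline(x: float) -> float:
    # A workflow is pure specification: calling a task here wires it into the
    # data dependency graph; the task body is not executed inline when the
    # workflow specification is compiled.
    return normalize(x=x, mean=0.0, std=1.0)
```

The workflow function captures only the dependency structure and typed bindings; the platform decides where and when each task's container actually runs.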
- Used at scale in production by 500+ users at Lyft, with more than 400k workflow executions and more than 20 million container executions per month
- Centralized Inventory of Tasks, Workflows and Executions
- gRPC / REST interface to define and execute tasks and workflows
- Type-safe construction of pipelines: each task has an interface characterized by its inputs and outputs, so invalid pipeline constructions fail at declaration time rather than at runtime (a minimal sketch follows this list)
- Types that help in creating machine learning and data processing pipelines, such as Blobs (images, arbitrary files), Directories, Schemas (columnar structured data), collections, maps, etc.
- Memoization and Lineage tracking
- Workflow features
  - Multiple schedules for every workflow
  - Parallel step execution
  - Extensible backend to add customized plugin experiences
  - Arbitrary container execution
  - Branching
  - Inline subworkflows (a workflow can be embedded within one node of the top-level workflow)
  - Distributed remote child workflows (a remote workflow can be triggered and statically verified at compile time)
  - Array tasks (map a function over a large dataset, with controlled execution of thousands of containers)
  - Dynamic workflow creation and execution, with runtime type safety
  - Container-side plugins with first-class support in Python
- Maintain an inventory of tasks and workflows
- Records a history of all executions; executions (as long as they follow convention) are completely repeatable
- Multi Cloud support (AWS, GCP and others)
- Extensible core
- Modularized
- Automated notifications to Slack, Email, Pagerduty
- Deep observability
- Multi K8s cluster support
- Comes with many systems supported out of the box on K8s, such as Spark
- Snappy Console
- Python CLI
- Written in Golang and optimized for performance
- Single Task Execution support
- Reactive pipelines
- More integrations
  - Containers
  - K8s Pods
  - AWS Batch Arrays
  - K8s Pod arrays
  - K8s Spark (native PySpark and Java/Scala)
  - Qubole Hive
  - Presto Queries
- Current deployments
  - Lyft Rideshare
  - Lyft L5 autonomous
  - Juno
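To illustrate the declaration-time type checking called out above, the sketch below wires one task's output into another's input. With the decorator-based flytekit API (newer releases; the task names here are hypothetical), interface types are matched while the workflow specification is built, so binding an int output to, say, a str input would be rejected before any container is scheduled.

```python
from flytekit import task, workflow


@task
def row_count(path: str) -> int:
    # Hypothetical task: count the rows in a local text file.
    with open(path) as f:
        return sum(1 for _ in f)


@task
def summarize(count: int) -> str:
    return f"rows: {count}"


@workflow
def report(path: str) -> str:
    # Outputs and inputs are matched by their declared types when this
    # specification is compiled; mismatched bindings fail here, not at runtime.
    n = row_count(path=path)
    return summarize(count=n)
```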
Repo | Language | Purpose |
---|---|---|
flyte | Kustomize,RST | deployment, documentation, issues |
flyteidl | Protobuf | interface definitions |
flytepropeller | Go | execution engine |
flyteadmin | Go | control plane |
flytekit | Python | python SDK and tools |
flyteconsole | TypeScript | admin console
datacatalog | Go | manage input & output artifacts |
flyteplugins | Go | flyte plugins |
flytestdlib | Go | standard library |
flytesnacks | Python | examples, tips, and tricks |
Repo | Language | Purpose |
---|---|---|
Spark | Go | Apache Spark batch |
Flink | Go | Apache Flink streaming |