
Argoproj - Get stuff done with Kubernetes

Quickstart

kubectl create namespace argo
kubectl apply -n argo -f https://raw.githubusercontent.com/argoproj/argo/stable/manifests/install.yaml

What is Argoproj?

Argoproj is a collection of tools for getting work done with Kubernetes.

  • Argo Workflows - Container-native Workflow Engine
  • Argo CD - Declarative GitOps Continuous Delivery
  • Argo Events - Event-based Dependency Manager
  • Argo Rollouts - Deployment CR with support for Canary and Blue Green deployment strategies

argoproj-labs is a separate GitHub org that we set up for community contributions related to the Argoproj ecosystem. Repos in argoproj-labs are administered by the owners of each project. Please reach out to us on the Argo Slack channel if you have a project that you would like to add to the org, to make it easier for others in the Argo community to find, use, and contribute to it.

What is Argo Workflows?

Argo Workflows is an open source container-native workflow engine for orchestrating parallel jobs on Kubernetes. Argo Workflows is implemented as a Kubernetes CRD (Custom Resource Definition).

  • Define workflows where each step in the workflow is a container (see the minimal example after this list).
  • Model multi-step workflows as a sequence of tasks or capture the dependencies between tasks using a directed acyclic graph (DAG).
  • Easily run compute-intensive jobs for machine learning or data processing in a fraction of the time using Argo Workflows on Kubernetes.
  • Run CI/CD pipelines natively on Kubernetes without configuring complex software development products.
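
For illustration, here is a minimal Workflow manifest in the spirit of the hello-world sample from the Argo documentation (the whalesay image, template name, and file name are only the sample's defaults, not requirements):

apiVersion: argoproj.io/v1alpha1
kind: Workflow                  # custom resource type installed by Argo
metadata:
  generateName: hello-world-    # the controller appends a random suffix
spec:
  entrypoint: whalesay          # the template to run first
  templates:
  - name: whalesay
    container:                  # each step runs as a container
      image: docker/whalesay:latest
      command: [cowsay]
      args: ["hello world"]

Saved as hello-world.yaml, it can be submitted with kubectl create -n argo -f hello-world.yaml or with the Argo CLI (argo submit).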

Why Argo Workflows?

  • Designed from the ground up for containers without the overhead and limitations of legacy VM and server-based environments.
  • Cloud agnostic and can run on any Kubernetes cluster.
  • Easily orchestrate highly parallel jobs on Kubernetes.
  • Argo Workflows puts a cloud-scale supercomputer at your fingertips!

Documentation

Features

  • DAG- or Steps-based declaration of workflows (see the DAG sketch after this list)
  • Artifact support (S3, Artifactory, HTTP, Git, raw)
  • Step level input & outputs (artifacts/parameters)
  • Loops
  • Parameterization
  • Conditionals
  • Timeouts (step & workflow level)
  • Retry (step & workflow level)
  • Resubmit (memoized)
  • Suspend & Resume
  • Cancellation
  • K8s resource orchestration
  • Exit Hooks (notifications, cleanup)
  • Garbage collection of completed workflows
  • Scheduling (affinity/tolerations/node selectors)
  • Volumes (ephemeral/existing)
  • Parallelism limits
  • Daemoned steps
  • DinD (docker-in-docker)
  • Script steps
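
Several of these features show up directly in how a workflow is declared. As a sketch (patterned on the dag-diamond example in the Argo documentation; task and parameter names are illustrative), a DAG template wires parameterized steps together through dependencies:

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: dag-diamond-
spec:
  entrypoint: diamond
  templates:
  - name: diamond
    dag:                               # DAG-based declaration: dependencies, not a fixed sequence
      tasks:
      - name: A
        template: echo
        arguments:
          parameters: [{name: message, value: A}]
      - name: B
        dependencies: [A]              # B and C run in parallel once A finishes
        template: echo
        arguments:
          parameters: [{name: message, value: B}]
      - name: C
        dependencies: [A]
        template: echo
        arguments:
          parameters: [{name: message, value: C}]
      - name: D
        dependencies: [B, C]           # D waits for both branches
        template: echo
        arguments:
          parameters: [{name: message, value: D}]
  - name: echo                         # parameterized step: each task passes its own input
    inputs:
      parameters:
      - name: message
    container:
      image: alpine:3.7
      command: [echo, "{{inputs.parameters.message}}"]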

Who uses Argo?

As the Argo Community grows, we'd like to keep track of our users. Please send a PR with your organization name.

Currently officially using Argo:

  1. Adevinta
  2. Admiralty
  3. Adobe
  4. Alibaba Cloud
  5. Ant Financial
  6. BasisAI
  7. BioBox Analytics
  8. BlackRock
  9. Canva
  10. Capital One
  11. CarTrack
  12. CCRi
  13. Codec
  14. Commodus Tech
  15. CoreFiling
  16. Cratejoy
  17. CyberAgent
  18. Cyrus Biotechnology
  19. Datadog
  20. DataStax
  21. EBSCO Information Services
  22. Equinor
  23. Fairwinds
  24. Gardener
  25. GitHub
  26. Gladly
  27. Google
  28. Greenhouse
  29. HOVER
  30. IBM
  31. InsideBoard
  32. Interline Technologies
  33. Intuit
  34. Karius
  35. KintoHub
  36. Localytics
  37. Maersk
  38. Max Kelsen
  39. Mirantis
  40. NVIDIA
  41. OVH
  42. Peak AI
  43. Preferred Networks
  44. Quantibio
  45. Ramboll Shair
  46. Red Hat
  47. SAP Fieldglass
  48. SAP Hybris
  49. Sidecar Technologies
  50. Styra
  51. Threekit
  52. Tiger Analytics
  53. Wavefront

Community Blogs and Presentations

Project Resources