
Burla

Scale your program across 1000s of computers with one line of code.

Goal:

It’s 2024; it should be trivial, even for complete beginners, to scale Python over thousands of computers in the cloud, with any hardware and any software environment. Moreover, the software used to do this should be free and open source.

Overview:

Burla is a Python package that makes it easy to run code on (lots of) other computers.

Burla has only one function: remote_parallel_map.
This function requires just two arguments; here's how it works:

from burla import remote_parallel_map

# Arg 1: Any python function:
def my_function(my_input):
    ...

# Arg 2: List of inputs for `my_function`
my_inputs = [1, 2, 3, ...]

# Calls `my_function` on every input in `my_inputs`,
# at the same time, each on a separate computer in the cloud.
remote_parallel_map(my_function, my_inputs)
  • Burla is fast and scalable.
    Code starts running within 1 second on up to 1,000 CPUs.
  • Running code remotely with Burla feels like local development. This means that:
    • Errors thrown on remote computers are raised on your local machine.
    • Anything you print appears in the terminal on your local machine.
    • Your Python environment is automatically cloned on all remote computers.
      This lets you call any locally installed Python package from a function sent to remote_parallel_map.
      After installing once, environments are cached to keep latency below 1 second.
  • Burla is easy to install.
    Try our managed service with two commands. Install Burla in your cloud with three commands.
  • Burla supports custom resource requirements.
    Allocate up to 96 CPUs and 360 GB of RAM to each individual function call with two simple arguments (see the combined sketch after this list).
  • Burla supports GPUs.
    Just add one argument: remote_parallel_map(my_function, my_inputs, gpu="A100")
  • Burla supports custom Docker images.
    Just add one argument: remote_parallel_map(my_function, my_inputs, dockerfile="./Dockerfile")
    After building once, images are cached to keep latency below 1 second.
  • Burla offers simple network storage.
    By default, all remote machines are attached to the same persistent network disk.
    Upload & download files to this disk through a simple CLI: > burla nas upload / download / ls / rm ...
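
For example, the per-call resource, GPU, and Docker options can be combined in a single call. This is a rough sketch, not a confirmed signature: the func_cpu and func_ram keyword names below are assumptions (only gpu= and dockerfile= appear in the examples above).

from burla import remote_parallel_map

def my_function(my_input):
    ...

my_inputs = [1, 2, 3, ...]

# Run every call on an A100, inside a custom image, with 16 CPUs and
# 64 GB of RAM per function call. `func_cpu` and `func_ram` are assumed
# parameter names; `gpu` and `dockerfile` are taken from the examples above.
remote_parallel_map(
    my_function,
    my_inputs,
    func_cpu=16,
    func_ram=64,
    gpu="A100",
    dockerfile="./Dockerfile",
)

Files on the shared network disk are managed with the CLI mentioned above; the exact argument forms shown here are assumptions:

> burla nas upload ./my_dataset.csv   # copy a local file onto the network disk
> burla nas ls                        # list files on the network disk
> burla nas download results.json     # copy a file back to your machine
> burla nas rm my_dataset.csv         # delete a file from the network disk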

Components / How it works:

Unlike many open-source projects, Burla does not use a monorepo.
Instead, its major components are split across four separate GitHub repositories:

  1. Burla
    The python package (the client).
  2. main_service
    Service representing a single cluster; manages nodes and routes requests to node_services.
  3. node_service
    Service running on each node; manages containers and routes requests to container_services.
  4. container_service
    Service running inside each container; executes user-submitted functions.

Read about how Burla works: How-Burla-works.md

Burla is currently under development and is not yet ready for use.

To join our mailing list, go to burla.dev.
If you have any questions, email me at jake@burla.dev, or join us on Discord.
