
Lablup Backend.AI Meta-package


Backend.AI is a streamlined backend service framework hosting heterogeneous programming languages and popular AI frameworks. It manages the underlying computing resources for multi-tenant computation sessions, which are spawned and executed instantly on demand.

In the names of sub-projects, we use a private code-name “Sorna”, taken from the science fiction novel “Jurassic Park” – meaning that we do all the dirty work behind the scenes. In the novel, Isla Nublar is the “front-end” island where tourists see the dinosaurs, and Isla Sorna is the “back-end” island that houses a secret dinosaur production facility.

All sub-projects are licensed under LGPLv3+.

Server-side Components

Manager with API Gateway

It routes external API requests from front-end services to individual agents. It also monitors and scales the cluster of agents (from a few tens to hundreds of instances).
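As an illustration of the placement idea only (the real Manager uses its own scheduler; the function below is hypothetical), a least-loaded routing decision might look like:

# Illustration only, not the actual Manager scheduler: place a new
# session on the agent that currently hosts the fewest sessions.
def pick_agent(session_counts):
    # session_counts maps an agent id to its number of running sessions
    return min(session_counts, key=session_counts.get)

print(pick_agent({"i-0001": 3, "i-0002": 1}))  # -> i-0002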

Agent

It manages individual server instances and launches/destroys Docker containers in which REPL daemons (kernels) run. Each agent on a new EC2 instance registers itself with the instance registry via heartbeats.
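As a rough sketch of the heartbeat idea (the registry endpoint, payload format, and interval below are assumptions, not the Agent's real interface), a self-registration loop could look like this:

# Hypothetical sketch of heartbeat-based self-registration.
# The registry URL, payload, and interval are assumptions, not the
# actual Agent/Manager protocol.
import json
import time
import urllib.request

REGISTRY_URL = "http://manager.example.com/registry/heartbeat"  # assumed
AGENT_ID = "i-0123456789abcdef0"  # e.g. the EC2 instance id

def send_heartbeat():
    payload = json.dumps({"agent_id": AGENT_ID, "status": "alive"}).encode()
    req = urllib.request.Request(
        REGISTRY_URL, data=payload,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

while True:
    send_heartbeat()  # the registry treats missing heartbeats as a dead agent
    time.sleep(3)     # heartbeat interval (assumed)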

REPL

A set of small ZMQ-based REPL daemons in various programming languages and configurations. It also includes a sandbox implemented using ptrace-based system call filtering, written in Go.
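For illustration only, a request/reply exchange with such a daemon could look like the following sketch (using pyzmq); the address and JSON message format are assumptions, not the real kernel protocol:

# Hypothetical sketch of a request/reply exchange with a ZMQ-based REPL daemon.
# Requires pyzmq; the address and message format are assumptions.
import zmq

ctx = zmq.Context()
sock = ctx.socket(zmq.REQ)
sock.connect("tcp://127.0.0.1:2001")  # assumed kernel address

sock.send_json({"code": "print(6 * 7)"})  # submit a code snippet for execution
reply = sock.recv_json()                  # receive the execution result
print(reply)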

Sorna Common

A collection of utility modules commonly shared throughout Backend.AI projects.

Client-side Components

Client Libraries

A client library to access the Sorna API servers with ease.

Sorna Media

The front-end support libraries for handling multi-media outputs (e.g., SVG plots, animated vector graphics).

  • The Python package (lablup) is installed inside kernel containers.

  • To interpret and display media generated by the Python package, you need to load the JavaScript part in the front-end.

  • https://github.com/lablup/sorna-media

Integrations with IDEs and Editors

Sorna Jupyter Kernel

Jupyter kernel integration of the Sorna Cloud API.

Visual Studio Code Extension

Extension for Visual Studio Code to run your code on the Lablup.AI clouds or your own Backend.AI servers.

Atom Editor plugin

An Atom Editor plugin for running your code on the Lablup.AI clouds or your own Backend.AI servers.

Installation

The Sorna project uses the latest features of Python 3.6+ and Docker CE 17.05+.

To install the manager with API gateway, run:

pip install backend.ai[manager]

For each computing server, install the agent using:

pip install backend.ai[agent]
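Note that some shells (e.g. zsh) treat the square brackets as glob characters, so you may need to quote the requirement, e.g. pip install 'backend.ai[agent]'.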

NOTE: More details about configuration will be released soon.

Development

git flow

The Sorna repositories use git-flow to streamline branching during development and deployment. We use the default configuration (master -> release preparation, develop -> main development, feature/ -> feature branches, etc.) as-is.
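For example, with the git-flow tooling a new feature branch is typically started with git flow feature start <name> and merged back into develop with git flow feature finish <name>.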

