
🦑 OpenLLM Core


OpenLLM Core: Core components for OpenLLM.

📖 Introduction

With OpenLLM, you can run inference with any open-source large language model, deploy to the cloud or on-premises, and build powerful AI apps.

To learn more about OpenLLM, please see OpenLLM's README.md.

This package holds the core components of OpenLLM and is considered internal.

Components include (see the usage sketch after the list):

  • Configuration generation.
  • Utilities for interacting with the OpenLLM server.
  • Schema and generation utilities for the OpenLLM server.
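
As a rough illustration, the sketch below builds a generation configuration with this package. It assumes that openllm_core exposes GenerationConfig at the package top level with fields such as max_new_tokens and temperature; because openllm-core is internal, the exact names may differ between releases.

import openllm_core

# Decoding parameters shared between the OpenLLM server and its clients.
# GenerationConfig is assumed to be exported at the package top level.
config = openllm_core.GenerationConfig(max_new_tokens=128, temperature=0.7)
print(config.max_new_tokens, config.temperature)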

[GIF: OpenLLM intro]

[GIF: Agent integration]

📔 Citation

If you use OpenLLM in your research, please use the following citation:

@software{Pham_OpenLLM_Operating_LLMs_2023,
  author = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month = jun,
  title = {{OpenLLM: Operating LLMs in production}},
  url = {https://github.com/bentoml/OpenLLM},
  year = {2023}
}

See the full changelog in the OpenLLM repository.




Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • openllm_core-0.4.22.tar.gz (62.7 kB)

Built Distribution

  • openllm_core-0.4.22-py3-none-any.whl (79.4 kB, Python 3)
