# lmclient

LM Async Client, with support for OpenAI and Azure OpenAI completion APIs.
## Install

```shell
pip install lmclient-core
```
## Usage

```python
from lmclient import LMClient, AzureCompletion, OpenAICompletion

openai_completion = OpenAICompletion(model='gpt-3.5-turbo')
# azure_completion = AzureCompletion()
client = LMClient(openai_completion, async_capacity=5, max_requests_per_minute=20)
prompts = [
    'Hello, my name is',
    'can you please tell me your name?',
    'i want to know your name',
    'what is your name?',
]
values = client.async_run(prompts=prompts)
print(values)
```
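The `async_capacity=5` argument caps the number of requests in flight at once. How such a cap is typically enforced can be sketched with `asyncio.Semaphore` (an illustrative sketch, not lmclient's actual internals; `worker` stands in for an API call):

```python
import asyncio

# Illustrative sketch: cap concurrent "requests" at 5, in the spirit of
# lmclient's async_capacity. worker() stands in for an API call.
CAPACITY = 5

async def worker(sem: asyncio.Semaphore, i: int, state: dict) -> int:
    async with sem:                      # at most CAPACITY tasks inside
        state["active"] += 1
        state["peak"] = max(state["peak"], state["active"])
        await asyncio.sleep(0.01)        # pretend network latency
        state["active"] -= 1
        return i

async def run_all(n: int) -> int:
    sem = asyncio.Semaphore(CAPACITY)
    state = {"active": 0, "peak": 0}
    await asyncio.gather(*(worker(sem, i, state) for i in range(n)))
    return state["peak"]

peak = asyncio.run(run_all(20))
print(peak)  # peak concurrency never exceeds CAPACITY
```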
## Advanced Usage

```python
# limit max_requests_per_minute to 20
# limit async_capacity to 5 (max 5 concurrent async requests)
# cache responses under cache_dir
# set error_mode to 'ignore' ('ignore' or 'raise')
from lmclient import LMClient, OpenAICompletion

openai_completion = OpenAICompletion(model='gpt-3.5-turbo')
client = LMClient(
    openai_completion,
    max_requests_per_minute=20,
    async_capacity=5,
    cache_dir='openai_cache',
    error_mode='ignore',
)
```
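The `error_mode` flag decides whether a failed request aborts the whole run or is skipped. The pattern behind an ignore-style mode can be sketched as follows (illustrative only, not lmclient's internals; `run_with_error_mode`, `ok`, and `boom` are hypothetical names):

```python
# Illustrative sketch of 'ignore' vs 'raise' error handling, in the spirit
# of lmclient's error_mode option (not its actual implementation).
def run_with_error_mode(funcs, error_mode="raise"):
    results = []
    for fn in funcs:
        try:
            results.append(fn())
        except Exception:
            if error_mode == "raise":
                raise                 # propagate the first failure
            results.append(None)      # 'ignore': keep going, record a placeholder
    return results

def ok():
    return "hello"

def boom():
    raise RuntimeError("API error")

print(run_with_error_mode([ok, boom, ok], error_mode="ignore"))
# ['hello', None, 'hello']
```

With `error_mode='ignore'`, one failed prompt does not discard the results of the others.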