lmclient

LM Async Client: an asynchronous client for OpenAI, Azure OpenAI, and other LLM APIs.

A client designed for large-scale asynchronous requests to the OpenAI API, suited to use cases such as self-instruct data generation and large-scale translation.
Features
- Supports large-scale asynchronous requests to the OpenAI API
- Progress bar support
- Supports limiting the maximum number of requests per minute
- Supports limiting async capacity, similar to a thread pool size (a conceptual sketch follows this list)
- Disk caching
- 100% type hints
- Very easy to use
- Supports OpenAI, Azure, Minimax, MinimaxPro, Zhipu, Baidu Wenxin (ERNIE), Tencent Hunyuan
- Supports FunctionCall
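The rate limit and async capacity above amount to pacing request starts and capping the number of in-flight requests. The following standalone sketch illustrates that idea with asyncio; it is a conceptual illustration only, not lmclient's internal implementation, and the Throttle class and fake_request function are hypothetical names used for the demo.

import asyncio

# Conceptual sketch of "async capacity" + "max requests per minute" throttling.
# Not lmclient's implementation; Throttle and fake_request are illustrative names.
class Throttle:
    def __init__(self, async_capacity: int, max_requests_per_minute: int) -> None:
        self._semaphore = asyncio.Semaphore(async_capacity)  # caps concurrent requests
        self._min_interval = 60 / max_requests_per_minute    # spacing between request starts
        self._next_start = 0.0
        self._lock = asyncio.Lock()

    async def __aenter__(self) -> 'Throttle':
        await self._semaphore.acquire()
        async with self._lock:
            now = asyncio.get_running_loop().time()
            wait = max(0.0, self._next_start - now)
            self._next_start = max(now, self._next_start) + self._min_interval
        if wait > 0:
            await asyncio.sleep(wait)
        return self

    async def __aexit__(self, exc_type, exc, tb) -> None:
        self._semaphore.release()

async def fake_request(i: int, throttle: Throttle) -> str:
    async with throttle:
        await asyncio.sleep(0.1)  # stand-in for a real API call
        return f'reply {i}'

async def main() -> None:
    throttle = Throttle(async_capacity=5, max_requests_per_minute=20)
    replies = await asyncio.gather(*(fake_request(i, throttle) for i in range(5)))
    print(replies)

asyncio.run(main())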
Installation
Requires Python 3.8 or above.
pip install lmclient-core
Usage
- CompletionEngine
from lmclient import CompletionEngine, OpenAIChat, OpenAIChatParameters
model = OpenAIChat('gpt-3.5-turbo', parameters=OpenAIChatParameters(temperature=0))
# limit requests to at most 20 per minute, with an async capacity of 5
client = CompletionEngine(model, async_capacity=5, max_requests_per_minute=20)
prompts = [
    'Hello, my name is',
    'can you please tell me your name?',
    [{'role': 'user', 'content': 'hello, who are you?'}],
    'what is your name?',
]
outputs = client.async_run(prompts=prompts)
for output in outputs:
    print(output.reply)
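For batch workloads such as translation or self-instruct data generation, the replies can be written straight to a JSONL file. A minimal sketch, assuming only the prompts list and the output.reply attribute shown above; the file name is arbitrary.

import json

# Persist each prompt/reply pair as one JSON line (the file name is arbitrary).
with open('outputs.jsonl', 'w', encoding='utf-8') as f:
    for prompt, output in zip(prompts, outputs):
        record = {'prompt': prompt, 'reply': output.reply}
        f.write(json.dumps(record, ensure_ascii=False) + '\n')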
- ChatEngine
from lmclient import ChatEngine, OpenAIChat
model = OpenAIChat('gpt-3.5-turbo')
chat_engine = ChatEngine(model)
print(chat_engine.chat('hello, my name is chat_engine'))
print(chat_engine.chat('what was my last message?'))
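As the example shows, ChatEngine keeps the conversation history across calls, so it can drive a simple interactive loop. A minimal sketch, assuming only the chat method shown above:

from lmclient import ChatEngine, OpenAIChat

# A tiny interactive loop on top of ChatEngine; type 'exit' or 'quit' to stop.
chat_engine = ChatEngine(OpenAIChat('gpt-3.5-turbo'))
while True:
    user_input = input('you> ')
    if user_input.strip() in ('exit', 'quit'):
        break
    print('assistant>', chat_engine.chat(user_input))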