# llm-gemini

LLM plugin to access Google's Gemini family of models.
## Installation

Install this plugin in the same environment as LLM:

```bash
llm install llm-gemini
```
## Usage

Configure the model by setting a key called "gemini" to your API key:

```bash
llm keys set gemini
```
```
<paste key here>
```
Now run the model using `-m gemini-pro`, for example:

```bash
llm -m gemini-pro "A joke about a pelican and a walrus"
```

> Why did the pelican get mad at the walrus?
>
> Because he called him a hippo-crit.
To chat interactively with the model, run `llm chat`:

```bash
llm chat -m gemini-pro
```
If you have access to the Gemini 1.5 Pro preview, you can use `-m gemini-1.5-pro-latest` to work with that model.
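Behind these commands, the plugin talks to Google's `generateContent` REST endpoint. As a rough sketch of what such a request looks like (the URL and payload shape follow Google's public v1beta Gemini API; this is illustrative only and not necessarily how the plugin itself builds its requests):

```python
import json
import urllib.request


def build_gemini_request(model, prompt, api_key):
    """Construct (but do not send) a generateContent request for a Gemini model."""
    url = (
        "https://generativelanguage.googleapis.com/v1beta/"
        f"models/{model}:generateContent?key={api_key}"
    )
    # The API expects a JSON body with a list of "contents", each holding "parts"
    body = json.dumps({"contents": [{"parts": [{"text": prompt}]}]}).encode()
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )


req = build_gemini_request("gemini-pro", "A joke about a pelican", "YOUR_KEY")
# Sending is left to the caller; the JSON response carries the reply under
# candidates[0].content.parts[0].text
```

Using `llm keys set gemini` as shown above means you never have to handle the key or the HTTP layer yourself.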
## Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

```bash
cd llm-gemini
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:

```bash
llm install -e '.[test]'
```
To run the tests:

```bash
pytest
```