# llm-gemini

API access to Google's Gemini family of models
## Installation

Install this plugin in the same environment as LLM:

```bash
llm install llm-gemini
```
## Usage

Configure the model by setting a key called "gemini" to your API key:

```bash
llm keys set gemini
# <paste key here>
```

Now run the model using `-m gemini-pro`, for example:

```bash
llm -m gemini-pro "A joke about a pelican and a walrus"
```
> Why did the pelican get mad at the walrus?
>
> Because he called him a hippo-crit.
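Under the hood, a plugin like this has to build a request against Google's Generative Language API. The following is only a rough sketch of what that request looks like; the endpoint and payload shape here are assumptions based on Google's public Gemini REST documentation, not taken from this plugin's source:

```python
import json

# Assumed base URL for Google's Generative Language REST API
API_BASE = "https://generativelanguage.googleapis.com/v1beta"

def build_request(model: str, prompt: str, api_key: str):
    """Return (url, body) for an assumed generateContent call."""
    url = f"{API_BASE}/models/{model}:generateContent?key={api_key}"
    # Assumed payload shape: contents -> parts -> text
    body = json.dumps({"contents": [{"parts": [{"text": prompt}]}]})
    return url, body

url, body = build_request("gemini-pro", "A joke about a pelican and a walrus", "KEY")
```

The actual plugin may add generation parameters and safety settings on top of this minimal payload.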
To chat interactively with the model, run `llm chat`:

```bash
llm chat -m gemini-pro
```
If you have access to the Gemini 1.5 Pro preview, you can use `-m gemini-1.5-pro-latest` to work with that model.
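Whichever model you pick, the JSON that Google's API returns nests the generated text inside a list of candidates. The helper below is illustrative only; the response shape is an assumption based on Google's public Gemini docs, and `extract_text` is not part of this plugin:

```python
def extract_text(response: dict) -> str:
    """Pull the generated text out of an assumed Gemini response shape:
    candidates -> content -> parts -> text."""
    parts = response["candidates"][0]["content"]["parts"]
    return "".join(part.get("text", "") for part in parts)

# A response stub in the assumed shape:
stub = {
    "candidates": [
        {"content": {"parts": [{"text": "Because he called him a hippo-crit."}]}}
    ]
}
print(extract_text(stub))  # -> Because he called him a hippo-crit.
```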
## Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

```bash
cd llm-gemini
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:

```bash
llm install -e '.[test]'
```
To run the tests:

```bash
pytest
```
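Tests for a plugin like this typically stub out the HTTP layer rather than hit the real API. A hypothetical sketch in that style (every name here is invented for illustration; the project's actual tests will differ):

```python
# Hypothetical test sketch, runnable with pytest or plain python
def fake_gemini_response(prompt: str) -> dict:
    # Stand-in for the real API: echoes the prompt back
    # in an assumed candidates -> content -> parts shape
    return {"candidates": [{"content": {"parts": [{"text": f"echo: {prompt}"}]}}]}

def test_fake_response_shape():
    resp = fake_gemini_response("hi")
    text = resp["candidates"][0]["content"]["parts"][0]["text"]
    assert text == "echo: hi"

if __name__ == "__main__":
    test_fake_response_shape()
```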