
LLM toolkit for lightning-fast, high-quality development


Mirascope

  • Simplicity through idiomatic syntax: faster and more reliable releases
  • Semi-opinionated methods: reduced complexity that speeds up development
  • Reliability through validation: more robust applications with fewer bugs



Mirascope is an open-source Python toolkit built on top of Pydantic that makes working with Large Language Models (LLMs):

  • Durable: Seamlessly customize and extend functionality.
  • Intuitive: Editor support that you expect (e.g. autocomplete, inline errors).
  • Clean: Pydantic together with our Prompt CLI eliminates prompt-related bugs.
  • Integrable: Easily integrate with JSON Schema and other tools such as FastAPI.
  • Convenient: Tooling that is clean, elegant, and delightful, and that you don't need to maintain.
  • Open: Dedication to building open-source tools you can use with your choice of LLM.

We support any model that works with the OpenAI API, as well as other models such as Gemini.

Installation

Install Mirascope and start building with LLMs in minutes.

pip install mirascope

You can also install additional optional dependencies if you’re using the corresponding features:

pip install mirascope[wandb]   # WandbPrompt
pip install mirascope[gemini]  # GeminiPrompt, ...

Usage

With Mirascope, everything happens with prompts. The idea is to colocate any functionality that may impact the quality of your prompt — from the template variables to the temperature — so that you don’t need to worry about code changes external to your prompt affecting quality. For simple use-cases, we find that writing prompts as docstrings provides enhanced readability:

from mirascope.openai import OpenAICallParams, OpenAIPrompt


class BookRecommendation(OpenAIPrompt):
    """Please recommend a {genre} book."""

    genre: str

    call_params = OpenAICallParams(
        model="gpt-4",
        temperature=0.3,
    )


recommendation = BookRecommendation(genre="fantasy").create()
print(recommendation)
#> I recommend "The Name of the Wind" by Patrick Rothfuss. It is...

If you add any of the OpenAI message roles (SYSTEM, USER, ASSISTANT, TOOL) as keywords to your prompt docstring, they will automatically get parsed into a list of messages:

from mirascope.openai import OpenAIPrompt


class BookRecommendation(OpenAIPrompt):
    """
    SYSTEM:
    You are the world's greatest librarian.

    USER:
    Please recommend a {genre} book.
    """

    genre: str


prompt = BookRecommendation(genre="fantasy")
print(prompt.messages)
#> [{'role': 'system', 'content': "You are the world's greatest librarian."},
#   {'role': 'user', 'content': 'Please recommend a fantasy book.'}]

If you want to write the messages yourself instead of using the docstring message parsing, there’s nothing stopping you!

from mirascope.openai import OpenAIPrompt
from openai.types.chat import ChatCompletionMessageParam


class BookRecommendation(OpenAIPrompt):
    """This is now just a normal docstring.

    Note that you'll lose any functionality dependent on it,
    such as `template`.
    """

    genre: str

    @property
    def messages(self) -> list[ChatCompletionMessageParam]:
        """Returns the list of OpenAI prompt messages."""
        return [
            {"role": "system", "content": "You are the world's greatest librarian."},
            {"role": "user", "content": f"Please recommend a {self.genre} book."},
        ]


recommendation = BookRecommendation(genre="fantasy").create()
print(recommendation)
#> I recommend "The Name of the Wind" by Patrick Rothfuss. It is...

Create, Stream, Extract

Prompt classes such as OpenAIPrompt have three methods for interacting with the LLM:

  • create: Generate a response given a prompt. This will generate raw text unless tools are provided as part of the call_params.
  • stream: Same as create except the generated response is returned as a stream of chunks. All chunks together become the full completion.
  • extract: Convenience method built on top of tools that makes it easy to extract structured information given a prompt and schema.

Using Different LLM Providers

The OpenAIPrompt class works with any endpoint that supports the OpenAI API, including (but not limited to) Anyscale, Together, and Groq. Simply update the base_url and set the proper API key in your environment:

import os

from mirascope.openai import OpenAICallParams, OpenAIPrompt

# The OpenAI client reads this variable, so set it to your Together API key
os.environ["OPENAI_API_KEY"] = "TOGETHER_API_KEY"


class BookRecommendation(OpenAIPrompt):
    """Please recommend a {genre} book."""

    genre: str

    call_params = OpenAICallParams(
        model="mistralai/Mixtral-8x7B-Instruct-v0.1",
        base_url="https://api.together.xyz/v1",
    )


recommendation = BookRecommendation(genre="fantasy").create()

We also support other providers such as Gemini.

Dive Deeper

Examples

You can find more usage examples in our examples directory, such as how to easily integrate with FastAPI.

We also have more detailed walkthroughs in our Cookbook docs section. Each cookbook has corresponding full code examples in the cookbook directory.

What’s Next?

We have a lot on our minds for what to build next, but here are a few things (in no particular order) that come to mind first:

  • Extracting structured information using LLMs
  • Agents
  • Support for more LLM providers:
    • Claude
    • Mistral
    • HuggingFace
  • Integrations:
    • Weights & Biases
    • LangSmith
    • … tell us what you’d like integrated!
  • Evaluating prompts and their quality by version
  • Additional docstring parsing for more complex messages

Versioning

Mirascope uses Semantic Versioning.

License

This project is licensed under the terms of the MIT License.

