Cookbook: Maxim Mistral Integration
Details
File: third_party/Maxim/cookbook_maxim_mistral_integration.ipynb
Type: Jupyter Notebook
Use Cases:
Integrations: Maxim
Content
Notebook content (JSON format):
{ "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "provenance": [] }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "language_info": { "name": "python" } }, "cells": [ { "attachments": {}, "cell_type": "markdown", "id": "a3b1829d", "metadata": {}, "source": [ "<a href=\"https://colab.research.google.com/drive/1gqJkkLQQTtSdUla2oiIg-QyR6i5BpmvV?usp=sharing\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>" ] }, { "cell_type": "markdown", "source": [ "<h1 align=\"center\">Observability with Mistral AI and Maxim</h1>\n", "\n", "In this cookbook, we show you how to use [Maxim](https://www.getmaxim.ai/) to observe Mistral LLM calls and metrics.\n", "\n", "## What is Maxim?\n", "\n", "Maxim AI provides comprehensive observability for your Mistral-based AI applications. 
With Maxim's one-line integration, you can easily trace and analyse LLM calls, metrics, and more.\n", "\n", "**Key benefits:**\n", "\n", "* Performance Analytics: Track latency, tokens consumed, and costs\n", "* Advanced Visualisation: Understand agent trajectories through intuitive dashboards\n" ], "metadata": { "id": "m7Fv4fGJ7BFc" } }, { "cell_type": "markdown", "source": [ "## Install and Import Required Modules\n", "You need to install the `mistralai` and `maxim-py` packages from [PyPI](https://pypi.org/)." ], "metadata": { "id": "qzuTYI0l7Ear" } }, { "cell_type": "code", "source": [ "!pip install mistralai maxim-py" ], "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "P0zpZNlQ25f2", "outputId": "932c6ce6-4989-4a6c-df12-c89e784cf347" }, "execution_count": null, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Collecting mistralai\n", " Downloading mistralai-1.8.1-py3-none-any.whl.metadata (33 kB)\n", "Collecting maxim-py\n", " Downloading maxim_py-3.8.1-py3-none-any.whl.metadata (13 kB)\n", "Collecting eval-type-backport>=0.2.0 (from mistralai)\n", " Downloading eval_type_backport-0.2.2-py3-none-any.whl.metadata (2.2 kB)\n", "Requirement already satisfied: httpx>=0.28.1 in /usr/local/lib/python3.11/dist-packages (from mistralai) (0.28.1)\n", "Requirement already satisfied: pydantic>=2.10.3 in /usr/local/lib/python3.11/dist-packages (from mistralai) (2.11.5)\n", "Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.11/dist-packages (from mistralai) (2.9.0.post0)\n", "Requirement already satisfied: typing-inspection>=0.4.0 in /usr/local/lib/python3.11/dist-packages (from mistralai) (0.4.1)\n", "Requirement already satisfied: requests in /usr/local/lib/python3.11/dist-packages (from maxim-py) (2.32.3)\n", "Requirement already satisfied: urllib3 in /usr/local/lib/python3.11/dist-packages (from maxim-py) (2.4.0)\n", "Requirement already satisfied: typing-extensions in /usr/local/lib/python3.11/dist-packages 
(from maxim-py) (4.13.2)\n", "Collecting filetype (from maxim-py)\n", " Downloading filetype-1.2.0-py2.py3-none-any.whl.metadata (6.5 kB)\n", "Requirement already satisfied: anyio in /usr/local/lib/python3.11/dist-packages (from httpx>=0.28.1->mistralai) (4.9.0)\n", "Requirement already satisfied: certifi in /usr/local/lib/python3.11/dist-packages (from httpx>=0.28.1->mistralai) (2025.4.26)\n", "Requirement already satisfied: httpcore==1.* in /usr/local/lib/python3.11/dist-packages (from httpx>=0.28.1->mistralai) (1.0.9)\n", "Requirement already satisfied: idna in /usr/local/lib/python3.11/dist-packages (from httpx>=0.28.1->mistralai) (3.10)\n", "Requirement already satisfied: h11>=0.16 in /usr/local/lib/python3.11/dist-packages (from httpcore==1.*->httpx>=0.28.1->mistralai) (0.16.0)\n", "Requirement already satisfied: annotated-types>=0.6.0 in /usr/local/lib/python3.11/dist-packages (from pydantic>=2.10.3->mistralai) (0.7.0)\n", "Requirement already satisfied: pydantic-core==2.33.2 in /usr/local/lib/python3.11/dist-packages (from pydantic>=2.10.3->mistralai) (2.33.2)\n", "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.11/dist-packages (from python-dateutil>=2.8.2->mistralai) (1.17.0)\n", "Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.11/dist-packages (from requests->maxim-py) (3.4.2)\n", "Requirement already satisfied: sniffio>=1.1 in /usr/local/lib/python3.11/dist-packages (from anyio->httpx>=0.28.1->mistralai) (1.3.1)\n", "Downloading mistralai-1.8.1-py3-none-any.whl (373 kB)\n", "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m373.2/373.2 kB\u001b[0m \u001b[31m3.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading maxim_py-3.8.1-py3-none-any.whl (174 kB)\n", "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m174.2/174.2 kB\u001b[0m \u001b[31m7.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25hDownloading 
eval_type_backport-0.2.2-py3-none-any.whl (5.8 kB)\n", "Downloading filetype-1.2.0-py2.py3-none-any.whl (19 kB)\n", "Installing collected packages: filetype, eval-type-backport, maxim-py, mistralai\n", "Successfully installed eval-type-backport-0.2.2 filetype-1.2.0 maxim-py-3.8.1 mistralai-1.8.1\n" ] } ] }, { "cell_type": "markdown", "source": [ "## Set the environment variables\n", "Sign up on [Maxim](https://getmaxim.ai/signup) and create a new API key from Settings. Then go to the Logs section and create a new Log Repository; you will receive a Log Repository ID. Also have your Mistral API key ready." ], "metadata": { "id": "93c3DfpN7KV9" } }, { "cell_type": "code", "source": [ "import os\n", "from google.colab import userdata\n", "\n", "MAXIM_API_KEY=userdata.get(\"MAXIM_API_KEY\")\n", "MAXIM_LOG_REPO_ID=userdata.get(\"MAXIM_REPO_ID\")\n", "MISTRAL_API_KEY=userdata.get(\"MISTRAL_API_KEY\")\n", "\n", "os.environ[\"MAXIM_API_KEY\"] = MAXIM_API_KEY\n", "os.environ[\"MAXIM_LOG_REPO_ID\"] = MAXIM_LOG_REPO_ID\n", "os.environ[\"MISTRAL_API_KEY\"] = MISTRAL_API_KEY" ], "metadata": { "id": "UtDtXy573YoU" }, "execution_count": null, "outputs": [] }, { "cell_type": "markdown", "source": [ "## Initialize logger\n", "\n", "Create an instance of the Maxim logger." ], "metadata": { "id": "nGp-lef67OC4" } }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "RCIqz-iS2lcN", "outputId": "769e7d8b-6de5-4a2e-98a6-d7c98f753a8e" }, "outputs": [ { "output_type": "stream", "name": "stderr", "text": [ "INFO:maxim:[MaximSDK] Starting flush thread with interval {10} seconds\n" ] }, { "output_type": "stream", "name": "stdout", "text": [ "\u001b[32m[MaximSDK] Initializing Maxim AI(v3.8.1)\u001b[0m\n", "\u001b[32m[MaximSDK] Using info logging level.\u001b[0m\n", "\u001b[32m[MaximSDK] For debug logs, set global logging level to debug logging.basicConfig(level=logging.DEBUG).\u001b[0m\n" ] } ], "source": [ "from 
maxim import Maxim\n", "\n", "logger = Maxim().logger()" ] }, { "cell_type": "markdown", "source": [ "## Make LLM calls using MaximMistralClient\n", "\n", "Make a call to Mistral via the Mistral API client provided by Maxim; define the model you want to use and the list of messages." ], "metadata": { "id": "4zztRoOV4j1d" } }, { "cell_type": "code", "source": [ "from mistralai import Mistral\n", "from maxim.logger.mistral import MaximMistralClient\n", "import os\n", "\n", "with MaximMistralClient(Mistral(\n", " api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n", "), logger) as mistral:\n", "\n", " res = mistral.chat.complete(\n", " model=\"mistral-medium-latest\",\n", " messages=[\n", " {\n", " \"content\": \"Who is the best French painter? Answer in one short sentence.\",\n", " \"role\": \"user\",\n", " },\n", " ]\n", " )\n", "\n", " # Handle response\n", " print(res)" ], "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "x574Q-Nx38UI", "outputId": "3266ba87-81ca-48f9-e5c4-f03e96aa9209" }, "execution_count": null, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "id='e161b1cd452042549d5292b6a60f6b83' object='chat.completion' model='mistral-medium-latest' usage=UsageInfo(prompt_tokens=16, completion_tokens=23, total_tokens=39) created=1749107242 choices=[ChatCompletionChoice(index=0, message=AssistantMessage(content='Claude Monet is often regarded as the best French painter, renowned for his pioneering role in Impressionism.', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]\n" ] } ] }, { "cell_type": "markdown", "source": [ "To check the logs shared by the Mistral SDK with Maxim:\n", "1. Go to the Logs section in the Maxim platform.\n", "2. Go to the respective Log Repository you created.\n", "3. 
Switch to the `Logs` tab at the top and analyse the traces received.\n", "\n", "" ], "metadata": { "id": "VOvIeagzKwef" } }, { "cell_type": "markdown", "source": [ "## Async LLM call" ], "metadata": { "id": "-38BV06J4gYo" } }, { "cell_type": "code", "source": [ "async with MaximMistralClient(Mistral(\n", " api_key=os.getenv('MISTRAL_API_KEY', ''),\n", "), logger) as mistral:\n", "\n", " response = await mistral.chat.complete_async(\n", " model='mistral-small-latest',\n", " messages=[\n", " {\n", " 'role': 'user',\n", " 'content': 'Explain the difference between async and sync programming in Python in one sentence.'\n", " }\n", " ]\n", " )\n", " print(response)" ], "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "me5mxEG94ST1", "outputId": "21f29a32-13f2-4dd7-eaf6-26c7fbc39c85" }, "execution_count": null, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "id='ef228964167649278f1eeecfe6d985d4' object='chat.completion' model='mistral-small-latest' usage=UsageInfo(prompt_tokens=18, completion_tokens=34, total_tokens=52) created=1749106669 choices=[ChatCompletionChoice(index=0, message=AssistantMessage(content='Async programming in Python allows for concurrent execution of tasks using `async` and `await` keywords, while sync programming executes tasks sequentially, blocking until each task completes.', tool_calls=None, prefix=False, role='assistant'), finish_reason='stop')]\n" ] } ] }, { "cell_type": "markdown", "source": [ "# Feedback\n", "---\n", "\n", "If you have any feedback or requests, please create a GitHub [Issue](https://github.com/maximhq/maxim-docs)." ], "metadata": { "id": "UvT-PbyHLqli" } } ] }
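Note: the notebook above reads its keys from Colab's `google.colab.userdata` secret store, which is unavailable outside Colab. A minimal sketch of an equivalent setup for a local environment is shown below; the placeholder values are hypothetical and must be replaced with your own keys (or, better, exported in your shell before launching Python).

```python
import os

# Placeholder values -- replace with your real keys. Outside of Google
# Colab there is no `userdata` secret store, so either export these
# variables in your shell or set them here directly. `setdefault` keeps
# any values already exported in the environment.
os.environ.setdefault("MAXIM_API_KEY", "your-maxim-api-key")
os.environ.setdefault("MAXIM_LOG_REPO_ID", "your-log-repo-id")
os.environ.setdefault("MISTRAL_API_KEY", "your-mistral-api-key")

# The clients in the notebook read these variables via os.getenv /
# os.environ, so once they are set, the rest of the notebook code
# should run unchanged.
for name in ("MAXIM_API_KEY", "MAXIM_LOG_REPO_ID", "MISTRAL_API_KEY"):
    print(name, "is set:", bool(os.environ.get(name)))
```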