Welcome to this tutorial on using Mistral AI’s function calling capabilities with the Linkup API for web search integration. This guide will help you leverage Mistral’s intelligence combined with real-time web data to create powerful and up-to-date applications.

By combining Mistral’s advanced language understanding with Linkup’s search capabilities, you can create applications that:

  1. Access up-to-date information beyond the model’s training data
  2. Find specific facts, statistics, and current events
  3. Research topics with accurate citations and references
  4. Verify information from authoritative sources

Mistral AI’s function calling is available in their latest models including mistral-small-latest, mistral-medium-latest, and mistral-large-latest.

2. Set Up Your Environment

First, install the required packages:

pip install linkup-sdk mistralai

Then import the necessary libraries:

from mistralai import Mistral, SystemMessage, UserMessage, AssistantMessage, ToolMessage
from mistralai.models import TextChunk
from linkup import LinkupClient
import json
from typing import List, Literal

Configure your API keys by setting environment variables or storing them securely in your application:

LINKUP_API_KEY = 'your_linkup_api_key'
MISTRAL_API_KEY = 'your_mistral_api_key'
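
For instance, a minimal sketch that reads the keys from environment variables and falls back to a placeholder only for local experimentation:

```python
import os

# Read API keys from the environment so credentials never end up in
# version control; the fallback placeholders are for local testing only.
LINKUP_API_KEY = os.environ.get("LINKUP_API_KEY", "your_linkup_api_key")
MISTRAL_API_KEY = os.environ.get("MISTRAL_API_KEY", "your_mistral_api_key")
```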

Initialize the Mistral client:

client = Mistral(api_key=MISTRAL_API_KEY)

3. Implement Core Functions

Create the chat completion function to handle Mistral interactions:

def chat_completion_request(
    messages: List[dict], tools: List[dict] = None, model: str = "mistral-large-latest"
):
    try:
        response = client.chat.complete(
            model=model,
            messages=messages,
            tools=tools,
            # Only force a tool call when tools are actually provided;
            # the final answer is generated without any tools.
            tool_choice="any" if tools else None,
        )
        return response
    except Exception as e:
        print("Unable to generate ChatCompletion response")
        print(f"Exception: {e}")
        raise

Create a helper function for message creation:

def create_message(role: str, content: str, **kwargs):
    # The message classes are used here only to validate the role;
    # the Mistral SDK also accepts plain dicts for messages.
    message_classes = {
        "system": SystemMessage,
        "user": UserMessage,
        "assistant": AssistantMessage,
        "tool": ToolMessage,
    }
    if role not in message_classes:
        raise ValueError(f"Invalid role: {role}")
    return {"role": role, "content": content, **kwargs}
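
For example, the helper produces plain dicts in the shape the Mistral SDK accepts (restated here in a minimal standalone form so the snippet runs on its own):

```python
def create_message(role: str, content: str, **kwargs):
    # Standalone restatement of the helper above.
    if role not in {"system", "user", "assistant", "tool"}:
        raise ValueError(f"Invalid role: {role}")
    return {"role": role, "content": content, **kwargs}

user_msg = create_message("user", "What is Linkup?")
print(user_msg)  # {'role': 'user', 'content': 'What is Linkup?'}

# Extra keyword arguments pass straight through, which is how the tool
# role carries its function name and tool_call_id:
tool_msg = create_message("tool", "...", name="linkup_web_search", tool_call_id="abc123")
print(tool_msg["tool_call_id"])  # abc123
```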

Set up the Linkup integration with search and formatting functions:

linkup_output_type: Literal["searchResults", "sourcedAnswer"] = "searchResults"

def linkup_web_search(query: str, depth: str = "standard"):
    # Returns a Linkup response object; use format_linkup_response
    # below to turn it into a string.
    client = LinkupClient(api_key=LINKUP_API_KEY)
    return client.search(query=query, depth=depth, output_type=linkup_output_type)


def format_linkup_response(
    response,
    output_type: Literal["searchResults", "sourcedAnswer"] = linkup_output_type,
) -> str:
    if output_type == "sourcedAnswer":
        return response.answer
    elif output_type == "searchResults":
        results = getattr(response, "results", [])
        if not results:
            return "Search Results:\nNo results found."
        answer = "\n".join(
            f"{i}. {doc.content}" for i, doc in enumerate(results, start=1)
        )
        return f"Search Results:\n{answer}"
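
To see what the "searchResults" formatting produces without a live API call, here is a sketch using a mock response object whose field names mirror the Linkup search-results shape (the result texts are made up):

```python
from types import SimpleNamespace

# Mock stand-in for a Linkup "searchResults" response: an object with
# a .results list whose items each expose a .content attribute.
mock_response = SimpleNamespace(results=[
    SimpleNamespace(content="Result one."),
    SimpleNamespace(content="Result two."),
])

results = getattr(mock_response, "results", [])
answer = "\n".join(f"{i}. {doc.content}" for i, doc in enumerate(results, start=1))
formatted = f"Search Results:\n{answer}"
print(formatted)
```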

4. Configure Function Calling Tools

Define the tools that Mistral can use to interact with Linkup:

tools = [
    {
        "type": "function",
        "function": {
            "name": "linkup_web_search",
            "description": "Performs an online search using the Linkup search engine and retrieves the top results as a string. This function is useful for accessing real-time information, including news, articles, and other relevant web content.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {
                        "type": "string",
                        "description": "The search query to perform.",
                    },
                    "depth": {
                        "type": "string",
                        "enum": ["standard", "deep"],
                        "default": "standard",
                        "description": "The search depth to perform. Use 'standard' for straightforward queries with likely direct answers (e.g., facts, definitions, simple explanations). Use 'deep' for: 1) complex queries requiring comprehensive analysis or information synthesis, 2) queries containing uncommon terms, specialized jargon, or abbreviations that may need additional context, or 3) questions likely requiring up-to-date or specialized web search results to answer effectively.",
                    },
                },
                "required": ["query", "depth"],
            },
        },
    }
]
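
When the model decides to use the tool, it returns the arguments as a JSON string whose keys match the schema above. The dispatch step can be sketched like this, with a stand-in function in place of `linkup_web_search` so it runs without an API key:

```python
import json

def fake_search(query: str, depth: str = "standard") -> str:
    # Stand-in for linkup_web_search, just to illustrate dispatch.
    return f"results for {query!r} (depth={depth!r})"

# Map the tool name declared in the schema to a local callable.
available_tools = {"linkup_web_search": fake_search}

# The model delivers arguments as a JSON string:
raw_args = '{"query": "latest quantum computing news", "depth": "deep"}'
args = json.loads(raw_args)
print(available_tools["linkup_web_search"](**args))
```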

5. Implement the Chatbot Interaction

Create the main chatbot interaction function:

def chatbot_interaction(user_message):
    # Create chat history
    messages = [
        create_message(role="system", content="You are a helpful assistant."),
        create_message(role="user", content=user_message),
    ]

    # Generate initial response
    completion = chat_completion_request(messages=messages, tools=tools)

    # Handle tool calls
    tool_calls = completion.choices[0].message.tool_calls or []
    if tool_calls:
        # Append the assistant message (which carries the tool calls)
        # once, before adding the individual tool results.
        messages.append(completion.choices[0].message)
    for tool_call in tool_calls:
        function_name = tool_call.function.name
        args = json.loads(tool_call.function.arguments)
        print(f"Calling Linkup tool with parameters {args}")
        result = format_linkup_response(linkup_web_search(**args))

        # Update messages with the tool result
        messages.append(
            {
                "role": "tool",
                "name": function_name,
                "content": result,
                "tool_call_id": tool_call.id,
            }
        )

    # Generate the final response if tool calls were made
    if tool_calls:
        completion = chat_completion_request(messages=messages)

    response = completion.choices[0].message.content
    if isinstance(response, list):
        # The content may arrive as a list of chunks rather than a
        # single string; join the text chunks into one string.
        response = " ".join(
            chunk.text for chunk in response if isinstance(chunk, TextChunk)
        )
    return response

6. Test Your Integration

Try out your chatbot with a sample query:

result = chatbot_interaction("What are the latest developments in quantum computing?")
print(result)

Best Practices

  1. Model Selection: Choose the appropriate Mistral model based on your needs:

    • mistral-small-latest: Good for basic tasks
    • mistral-medium-latest: Balanced performance and cost
    • mistral-large-latest: Best performance for complex tasks
  2. Error Handling: Always implement proper error handling for API calls and tool executions.

  3. Rate Limiting: Be mindful of API rate limits for both Mistral and Linkup.

  4. Security: Never expose API keys in your code. Use environment variables or secure secret management.

  5. Response Formatting: Consider customizing the format_linkup_response function to better suit your needs.

  6. System Messages: Use appropriate system messages to guide the model’s behavior and responses.
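
The error-handling and rate-limiting advice above can be combined into a small retry wrapper. This is a sketch only: which exceptions are actually retryable depends on the SDK, so it catches Exception broadly here, and the short base delay is chosen just for the demo.

```python
import random
import time

def with_backoff(fn, max_retries=4, base_delay=0.01):
    # Retry fn with exponential backoff plus jitter; re-raise after the
    # final attempt so callers still see hard failures.
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))

# Demo: a function that fails twice before succeeding.
calls = {"count": 0}
def flaky():
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("transient error")
    return "ok"

print(with_backoff(flaky))  # ok
```

In real use you would wrap the actual API call, e.g. `with_backoff(lambda: chat_completion_request(messages, tools))`.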

Conclusion

This integration allows you to combine Mistral AI’s powerful language understanding with Linkup’s real-time web search capabilities. You can now create applications that provide up-to-date information while maintaining the model’s natural language processing abilities.

For more information, visit: