Welcome to this tutorial on using Claude’s function calling capabilities with the Linkup API for web search integration. This guide will help you leverage Claude’s intelligence combined with real-time web data to create powerful and up-to-date applications.

By combining Claude’s advanced language understanding with Linkup’s search capabilities, you can create applications that:

  1. Access up-to-date information beyond Claude’s training data
  2. Find specific facts, statistics, and current events
  3. Research topics with accurate citations and references
  4. Verify information from authoritative sources

Check out the Google Colab version of this tutorial if you prefer.

Set Up Your Environment

First, install the required packages:

pip install linkup-sdk anthropic

Then import the necessary libraries:

import anthropic
from linkup import LinkupClient
import json
from pprint import pprint
from typing import List, Dict, Literal

Configure your API keys by setting environment variables or storing them securely in your application:

LINKUP_API_KEY = 'your_linkup_api_key'
ANTHROPIC_API_KEY = 'your_anthropic_api_key'
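
For example, a minimal sketch that reads the keys from environment variables instead of hard-coding them (the environment variable names here are just a convention):

import os

# Load the API keys from environment variables rather than hard-coding them in source.
LINKUP_API_KEY = os.environ["LINKUP_API_KEY"]
ANTHROPIC_API_KEY = os.environ["ANTHROPIC_API_KEY"]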

Initialize the Anthropic client:

client = anthropic.Anthropic(api_key=ANTHROPIC_API_KEY)

Implement Core Functions

Create the chat completion function to handle Claude interactions:

def chat_completion_request(
    messages: List[dict],
    tools: List[dict] = None,
    model: str = "claude-3-7-sonnet-20250219",
    system_message: str = "You are a helpful AI assistant",
):
    try:
        response = client.messages.create(
            model=model,
            max_tokens=1000,
            temperature=1,
            system=system_message,
            messages=messages,
            tools=tools,
        )
        return response
    except Exception as e:
        print(f"Unable to generate a response from Claude: {e}")
        raise

Create a helper function for message creation:

def create_message(role: str, content: str, **kwargs):
    return {'role': role, 'content': content, **kwargs}

Set up the Linkup integration with search and formatting functions:

# Choose how Linkup should return results: raw search results or a sourced answer.
linkup_output_type: Literal["searchResults", "sourcedAnswer"] = "searchResults"

def linkup_web_search(query: str):
    """Run a Linkup web search and return the raw response object."""
    linkup_client = LinkupClient(api_key=LINKUP_API_KEY)

    response = linkup_client.search(
        query=query,
        depth="standard",
        output_type=linkup_output_type,
    )
    return response

def format_linkup_response(response, output_type: Literal["searchResults", "sourcedAnswer"] = linkup_output_type) -> str:
    if output_type == "sourcedAnswer":
        return response.answer
    elif output_type == "searchResults":
        results = getattr(response, "results", [])
        if not results:
            return "Search Results:\nNo results found."
        answer = "\n".join(f"{i}. {doc.content}" for i, doc in enumerate(results, start=1))
        return f"Search Results:\n{answer}"
    return str(response)
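
Assuming your Linkup API key is valid, you can sanity-check these helpers on their own before wiring them into Claude (the query below is only an example):

# Quick standalone check of the Linkup search and formatting helpers.
raw_response = linkup_web_search("latest developments in AI technology")
print(format_linkup_response(raw_response))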

Configure Function Calling Tools

Define the tools that Claude can use to interact with Linkup:

tools = [{
    "name": "linkup_web_search",
    "description": "Performs a search for user input query using Linkup sdk then returns a string of the top search results. Should be used to search real-time data.",
    "input_schema": {
        "type": "object",
        "properties": {
            "query": {
                "type": "string",
                "description": "The search query to perform."
            }
        },
        "required": [
            "query"
        ]
    }
}]
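
The chatbot loop in the next step calls a process_tool_call helper to dispatch Claude's tool requests to the Linkup functions defined earlier. A minimal sketch of that helper might look like this:

def process_tool_call(tool_name: str, tool_input: dict) -> str:
    # Route the tool request from Claude to the matching local function.
    if tool_name == "linkup_web_search":
        raw_response = linkup_web_search(tool_input["query"])
        return format_linkup_response(raw_response)
    return f"Unknown tool: {tool_name}"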

Implement the Chatbot Interaction

Create the main chatbot interaction function:

def chatbot_interaction(user_message):
    # chat history that will be passed to the model
    messages = [create_message(role="user", content=user_message)]

    # generate the initial model response
    response = chat_completion_request(messages=messages, tools=tools)

    # keep executing tools for as long as the model requests them
    while response.stop_reason == "tool_use":
        tool_use = next(block for block in response.content if block.type == "tool_use")
        tool_name = tool_use.name
        tool_input = tool_use.input

        # perform the actual function call with the arguments parsed from the model response
        tool_result = process_tool_call(tool_name, tool_input)

        # extend the chat history with the tool request and its result
        messages.append(create_message(role="assistant", content=response.content))
        messages.append(create_message(
            role="user",
            content=[
                {
                    "type": "tool_result",
                    "tool_use_id": tool_use.id,
                    "content": str(tool_result),
                }
            ],
        ))

        # generate the next model response based on the updated chat history
        response = chat_completion_request(messages=messages, tools=tools)

    final_response = next(
        (block.text for block in response.content if hasattr(block, "text")),
        None,
    )
    return final_response

Test Your Integration

Try out your chatbot with a sample query:

result = chatbot_interaction("What are the latest developments in AI technology?")
print(result)

Best Practices

  1. Error Handling: Always implement proper error handling for API calls and tool executions.
  2. Rate Limiting: Be mindful of API rate limits for both Claude and Linkup.
  3. Security: Never expose API keys in your code. Use environment variables or secure secret management.
  4. Response Formatting: Consider customizing the format_linkup_response function to better suit your needs.
  5. System Messages: Use appropriate system messages to guide Claude’s behavior and responses, as sketched below.
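
For example, building on the helpers above, a custom system message can be passed straight into chat_completion_request (the prompt text here is only illustrative):

# Steer Claude's behavior with a custom system message (illustrative prompt text).
response = chat_completion_request(
    messages=[create_message(role="user", content="Summarize this week's AI news.")],
    tools=tools,
    system_message="You are a research assistant. Always cite the sources returned by the linkup_web_search tool.",
)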

Conclusion

This integration allows you to combine Claude’s powerful language understanding with Linkup’s real-time web search capabilities. You can now create applications that provide up-to-date information while maintaining Claude’s natural language processing abilities.

For more information, visit: