Baseten provides serverless GPU inference for open-source models with no infrastructure to manage. With Linkup, you can ground these models in real-time web data, giving them the ability to retrieve current facts, news, and source-backed information beyond their training data. This guide demonstrates the integration using GLM 5, but the same approach works with any model available on Baseten that supports function calling.
1. Get your API Keys

Get your Linkup API key

Create a Linkup account for free to get your API key.

Get your Baseten API key

Navigate to your Baseten settings, select API Keys, click Create API key, and copy the generated key.
2. Set Up Your Environment

Initialize your project and create a virtual environment:
mkdir baseten-linkup
cd baseten-linkup
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
Install dependencies:
pip install openai linkup-sdk
Configure your API keys as environment variables:
export BASETEN_API_KEY=paste_your_baseten_key_here
export LINKUP_API_KEY=paste_your_linkup_key_here
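Before moving on, you can sanity-check that both keys are actually visible to Python. The `missing_keys` helper below is just for illustration, not part of either SDK:

```python
import os

REQUIRED_KEYS = ["BASETEN_API_KEY", "LINKUP_API_KEY"]

def missing_keys(env=os.environ):
    """Return the names of required API keys that are unset or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]

if __name__ == "__main__":
    missing = missing_keys()
    if missing:
        print("Missing environment variables:", ", ".join(missing))
    else:
        print("All API keys found.")
```

Run it in the same shell where you exported the keys; a key exported in one terminal is not visible in another.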
3. Build the Agent

Create a file named agent.py and add the following code:
agent.py
import os
import json
from datetime import datetime
from openai import OpenAI
from linkup import LinkupClient

# Baseten serves the model through an OpenAI-compatible endpoint,
# so the standard OpenAI client works with a custom base_url.
client = OpenAI(
    api_key=os.environ.get("BASETEN_API_KEY"),
    base_url="https://inference.baseten.co/v1"
)
linkup_client = LinkupClient(api_key=os.environ.get("LINKUP_API_KEY"))

# A single tool definition: real-time web search backed by Linkup.
tools = [{
    "type": "function",
    "function": {
        "name": "search_web",
        "description": "Search the web in real time. Use this tool whenever the user needs trusted facts, news, or source-backed information. Returns comprehensive content from the most relevant sources.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "The search query"
                }
            },
            "required": ["query"]
        }
    }
}]

def main():
    print("--- GLM 5 + Linkup ---")
    print("Type 'quit' to exit.\n")

    today_str = datetime.now().strftime("%B %d, %Y")
    system_prompt = (
        f"You are a helpful assistant. Today is {today_str}. "
        f"Use web search when you need current information. "
        f"Prefer searching over relying on your training data for anything that could be outdated."
    )

    history = [{"role": "system", "content": system_prompt}]

    while True:
        try:
            user_input = input("You: ")
            if user_input.lower() in ["quit", "exit"]:
                print("Goodbye!")
                break

            history.append({"role": "user", "content": user_input})

            response = client.chat.completions.create(
                model="zai-org/GLM-5",
                messages=history,
                tools=tools,
                tool_choice="auto"
            )
            message = response.choices[0].message

            # Keep resolving tool calls until the model returns a final answer.
            while message.tool_calls:
                history.append(message)
                for tc in message.tool_calls:
                    q = json.loads(tc.function.arguments)["query"]
                    print(f"Searching using Linkup: {q}...")
                    try:
                        result = linkup_client.search(
                            query=q,
                            depth="standard",
                            output_type="searchResults"
                        )
                        content = "\n\n".join(
                            f"{r.name}\n{r.url}\n{r.content}"
                            for r in result.results
                        )
                    except Exception as e:
                        content = f"Search error: {e}"
                    history.append({
                        "role": "tool",
                        "tool_call_id": tc.id,
                        "content": content
                    })

                response = client.chat.completions.create(
                    model="zai-org/GLM-5",
                    messages=history,
                    tools=tools,
                    tool_choice="auto"
                )
                message = response.choices[0].message

            print(f"Agent: {message.content}\n")
            history.append(message)

        except (KeyboardInterrupt, EOFError):
            print("\nGoodbye!")
            break
        except Exception as e:
            print(f"Error: {e}")

if __name__ == "__main__":
    main()
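The tool handler above joins every result's name, URL, and content into one string. One practical refinement is to cap each source's content so a single long page cannot crowd out the model's context window. The helper below and its truncation cap are our addition, not part of the Linkup SDK; `StubResult` just mimics the shape of the result objects the SDK returns:

```python
from dataclasses import dataclass

@dataclass
class StubResult:
    """Stand-in for the result objects returned by the Linkup SDK."""
    name: str
    url: str
    content: str

def format_search_results(results, max_chars_per_source=2000):
    """Join search results into one tool-message string, truncating
    each source's content to max_chars_per_source characters."""
    return "\n\n".join(
        f"{r.name}\n{r.url}\n{r.content[:max_chars_per_source]}"
        for r in results
    )
```

To use it, replace the `"\n\n".join(...)` expression in `agent.py` with `format_search_results(result.results)`.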
4. Run the Agent

python agent.py
5. Try Different Scenarios

Internal knowledge (no tool call):
You: What is the definition of philosophy?
Agent: Philosophy is the study of fundamental questions about existence,
       knowledge, values, reason, mind, and language...
Tool-augmented reasoning:
You: What are the latest books published on logic?
Searching using Linkup: latest logic books 2025...
Agent: Synthesizes a response citing recent publications found via Linkup.
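Behind the "Searching using Linkup" line, the model's tool call carries its arguments as a JSON-encoded string, which `agent.py` decodes with `json.loads` before searching. A minimal illustration of that round-trip (`extract_query` is a hypothetical helper mirroring that line of the agent):

```python
import json

def extract_query(arguments: str) -> str:
    """Decode tool-call arguments (a JSON string) and pull out the
    search query the model requested."""
    return json.loads(arguments)["query"]

# For the scenario above, the model's tool call would carry
# arguments along the lines of:
raw_arguments = '{"query": "latest logic books 2025"}'
```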
Facing issues? Reach out to our engineering team at support@linkup.so or via our Discord, or book a 15-minute call with a member of our technical team.