This guide shows you how to combine Cerebras’ lightning-fast inference (3,000+ tokens/sec) on OpenAI’s gpt-oss-120b model with real-time web data from Linkup to build powerful, responsive applications.

2. Set Up Your Environment

First, install the Linkup and Cerebras SDKs, then import the libraries we’ll need:
pip install linkup-sdk cerebras-cloud-sdk
import os
from cerebras.cloud.sdk import Cerebras
from linkup import LinkupClient
import json
Configure your API keys as environment variables (or load them from whatever secret store your application uses), then read them at startup:
LINKUP_API_KEY = os.environ.get("LINKUP_API_KEY")
CEREBRAS_API_KEY = os.environ.get("CEREBRAS_API_KEY")
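If you prefer to fail fast when a key is missing, a small guard (an optional addition to this guide, not required by either SDK) saves debugging time later:
# Optional: stop early with a clear message if either key is missing
for name, value in [("LINKUP_API_KEY", LINKUP_API_KEY), ("CEREBRAS_API_KEY", CEREBRAS_API_KEY)]:
    if not value:
        raise RuntimeError(f"Missing {name}; set it as an environment variable before continuing.")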
Initialize the Cerebras client:
client = Cerebras(api_key=CEREBRAS_API_KEY)
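To confirm the client is configured correctly, you can send a quick completion with no tools attached; this sanity check is optional and uses the same model name as the rest of the guide:
# Optional sanity check: a plain completion, no tools
ping = client.chat.completions.create(
    model="gpt-oss-120b",
    messages=[{"role": "user", "content": "Say 'ready' if you can read this."}],
)
print(ping.choices[0].message.content)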

3. Set Up the Linkup Search Function

Create a simple function to search the web with Linkup:
def search_web(query: str):
    linkup_client = LinkupClient(api_key=LINKUP_API_KEY)
    response = linkup_client.search(query=query, depth="standard", output_type="searchResults")
    
    # Format the results
    results = []
    for i, doc in enumerate(response.results, 1):
        results.append(f"{i}. {doc.content}")
    
    return "\n".join(results)
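Before wiring the function into the model, you can call it directly to verify your Linkup credentials; the query below is only an illustrative example:
# Quick manual test of the search helper
print(search_web("latest AI inference benchmarks"))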

4. Define the Search Tool

Tell Cerebras about the search function it can use:
tools = [
    {
        "type": "function",
        "function": {
            "name": "search_web",
            "description": "Search the web in real time. Use this tool whenever the user needs trusted facts, news, source-backed information, or anything to which you don't have the answer. Returns comprehensive content from the most relevant sources.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {
                        "type": "string",
                        "description": "What to search for"
                    }
                },
                "required": ["query"]
            }
        }
    }
]
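If you plan to add more tools later, a small name-to-function mapping keeps the tool-call handling generic. This dispatch table is an optional pattern, not something the Cerebras API requires:
# Optional: map tool names to Python functions so tool calls can be dispatched generically
available_tools = {
    "search_web": search_web,
}

def call_tool(name: str, arguments: dict) -> str:
    # Look up the requested tool and call it with the model-provided arguments
    if name not in available_tools:
        return f"Unknown tool: {name}"
    return available_tools[name](**arguments)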

5. Use It

Now you can chat with web search capabilities:
# Create your conversation
messages = [
    {"role": "system", "content": "You are a helpful assistant. Use web search when you need current information."},
    {"role": "user", "content": "What are the latest AI developments this week?"}
]

# Get response from Cerebras
response = client.chat.completions.create(
    model="gpt-oss-120b",
    messages=messages,
    tools=tools
)

# Check if the model wants to search
if response.choices[0].message.tool_calls:
    # The model decided to search
    tool_call = response.choices[0].message.tool_calls[0]
    search_query = json.loads(tool_call.function.arguments)["query"]
    
    print(f"Searching for: {search_query}")
    search_results = search_web(search_query)
    
    # Add the search results to the conversation
    messages.append(response.choices[0].message)
    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": search_results
    })
    
    # Get final response with search results
    final_response = client.chat.completions.create(
        model="gpt-oss-120b",
        messages=messages
    )
    print(final_response.choices[0].message.content)
else:
    # No search needed
    print(response.choices[0].message.content)
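The code above handles a single tool call, which keeps the example easy to follow. If the model issues several tool calls in one turn, you can answer all of them before requesting the final response. The sketch below is an alternative version of the tool-handling block and assumes the optional call_tool helper from the previous step:
# Sketch: answer every tool call in the assistant message before the follow-up request
if response.choices[0].message.tool_calls:
    messages.append(response.choices[0].message)
    for tool_call in response.choices[0].message.tool_calls:
        arguments = json.loads(tool_call.function.arguments)
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": call_tool(tool_call.function.name, arguments),
        })
    final_response = client.chat.completions.create(
        model="gpt-oss-120b",
        messages=messages,
    )
    print(final_response.choices[0].message.content)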

6. Try Different Questions

Test with different types of questions:
# Try other questions by changing the user message:

# For current events:
# {"role": "user", "content": "What's happening with Tesla stock today?"}

# For general knowledge (no search needed):
# {"role": "user", "content": "What is Python programming?"}

# For recent news:
# {"role": "user", "content": "What are today's top tech headlines?"}
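To make experimenting easier, you can wrap the whole flow in a helper function. The ask function below is a name introduced for this sketch, not part of either SDK:
# Hypothetical helper that runs the full search-augmented flow for one question
def ask(question: str) -> str:
    messages = [
        {"role": "system", "content": "You are a helpful assistant. Use web search when you need current information."},
        {"role": "user", "content": question},
    ]
    response = client.chat.completions.create(model="gpt-oss-120b", messages=messages, tools=tools)
    message = response.choices[0].message
    if not message.tool_calls:
        return message.content

    # Answer each tool call with live search results, then ask for the final answer
    messages.append(message)
    for tool_call in message.tool_calls:
        arguments = json.loads(tool_call.function.arguments)
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": search_web(arguments["query"]),
        })
    final_response = client.chat.completions.create(model="gpt-oss-120b", messages=messages)
    return final_response.choices[0].message.content

print(ask("What's happening with Tesla stock today?"))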
Facing issues? Reach out to our engineering team at support@linkup.so, join our Discord, or book a 15-minute call with a member of our technical team.