```python
from llama_index.core.agent import FunctionCallingAgent
from llama_index.llms.openai import OpenAI
from llama_index.tools.linkup_research.base import LinkupToolSpec

# Tool initialization
linkup_tool = LinkupToolSpec(
    api_key="<YOUR LINKUP API KEY>",
    depth="standard",  # Options: "fast", "standard", or "deep"
    output_type="searchResults",  # Options: "searchResults", "sourcedAnswer", or "structured"
)
```
| Parameter | Options | Description |
| --- | --- | --- |
| `depth` | `fast`, `standard`, `deep` | Controls search depth. `fast` is the quickest and skips LLM processing, `standard` leverages agentic search, and `deep` performs more thorough research. |
| `output_type` | `searchResults`, `sourcedAnswer`, `structured` | Determines the format of the returned information. |
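As a quick illustration of the options in the table above, the sketch below validates the two parameters before constructing the tool spec. The constants and the `validate_linkup_params` helper are hypothetical, written only for this example; they are not part of the Linkup SDK.

```python
# Illustrative only: a hypothetical validator mirroring the option table above.
# Neither these constants nor this function exist in the Linkup SDK.
VALID_DEPTHS = {"fast", "standard", "deep"}
VALID_OUTPUT_TYPES = {"searchResults", "sourcedAnswer", "structured"}

def validate_linkup_params(depth: str, output_type: str) -> None:
    """Raise ValueError if a parameter is outside the documented options."""
    if depth not in VALID_DEPTHS:
        raise ValueError(f"depth must be one of {sorted(VALID_DEPTHS)}, got {depth!r}")
    if output_type not in VALID_OUTPUT_TYPES:
        raise ValueError(
            f"output_type must be one of {sorted(VALID_OUTPUT_TYPES)}, got {output_type!r}"
        )

validate_linkup_params("standard", "searchResults")  # passes silently
```

Catching a bad value at construction time gives a clearer error than a failed search call later.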
4. Initialize and run your agent
```python
# Agent initialization
agent = FunctionCallingAgent.from_tools(
    linkup_tool.to_tool_list(),
    llm=OpenAI(
        api_key="<YOUR OPENAI API KEY>",
        model="gpt-4o-mini",
    ),
)

# Agent invocation
response = agent.chat("Can you tell me which women were awarded the Physics Nobel Prize?")
print(response)
```
```python
# Sample output
{
    "response": "Marie Curie (1903), Maria Goeppert Mayer (1963), Donna Strickland (2018), and Andrea Ghez (2020) have been awarded the Nobel Prize in Physics.",
    "sources": [
        # Source information would appear here
    ]
}
```
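If you need the laureates as structured data rather than prose, a small post-processing step works on text shaped like the `response` field in the sample output above. The regex and variable names here are illustrative, not part of the integration, and the pattern assumes the exact "Name (Year)" format shown:

```python
import re

# Text shaped like the "response" field of the sample output above.
answer = (
    "Marie Curie (1903), Maria Goeppert Mayer (1963), Donna Strickland (2018), "
    "and Andrea Ghez (2020) have been awarded the Nobel Prize in Physics."
)

# Capture "Name (Year)" pairs; this assumes the prose format shown above.
laureates = re.findall(r"([A-Z][\w.'-]+(?: [A-Z][\w.'-]+)*) \((\d{4})\)", answer)
print(laureates)
```

For anything beyond a quick script, prefer `output_type="structured"` so the API returns structured data directly instead of prose you have to parse.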
Need help? Email support@linkup.so, ping us on Discord, or talk to us.