Documentation Index

Fetch the complete documentation index at: https://docs.linkup.so/llms.txt

Use this file to discover all available pages before exploring further.

Overview

Linkup can be used with LlamaIndex to create advanced AI agents and workflows based on internal and web data.

Getting Started with Linkup in LlamaIndex

1. Get your Linkup API key

Create a Linkup account for free to get your API key.
2. Install dependencies

pip install llama-index llama-index-tools-linkup-research
3. Create your agent

from llama_index.core.agent import FunctionCallingAgent
from llama_index.llms.openai import OpenAI
from llama_index.tools.linkup_research.base import LinkupToolSpec

# Tool initialization
linkup_tool = LinkupToolSpec(
    api_key="<YOUR LINKUP API KEY>",
    depth="standard", # Options: "fast", "standard" or "deep"
    output_type="searchResults", # Options: "searchResults", "sourcedAnswer", or "structured"
)
| Parameter | Options | Description |
| --- | --- | --- |
| depth | "fast", "standard", "deep" | Controls search depth. "fast" is the fastest with no LLM processing; "standard" leverages agentic search; "deep" performs more thorough research. |
| output_type | "searchResults", "sourcedAnswer", "structured" | Determines the format of the returned information. |
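An invalid depth or output_type only surfaces as an error when the tool is used, so it can be convenient to check the values up front. Here is a minimal pure-Python sketch; the allowed values come from the table above, and the validate_linkup_params helper is hypothetical, not part of the Linkup SDK:

```python
# Hypothetical helper: validate Linkup tool parameters before building the tool.
# Allowed values are taken from the parameter table above.
ALLOWED_DEPTHS = {"fast", "standard", "deep"}
ALLOWED_OUTPUT_TYPES = {"searchResults", "sourcedAnswer", "structured"}

def validate_linkup_params(depth: str, output_type: str) -> None:
    """Raise ValueError early instead of failing at query time."""
    if depth not in ALLOWED_DEPTHS:
        raise ValueError(f"depth must be one of {sorted(ALLOWED_DEPTHS)}, got {depth!r}")
    if output_type not in ALLOWED_OUTPUT_TYPES:
        raise ValueError(
            f"output_type must be one of {sorted(ALLOWED_OUTPUT_TYPES)}, got {output_type!r}"
        )

validate_linkup_params("standard", "searchResults")  # passes silently
```

Calling the helper once before constructing LinkupToolSpec turns a late runtime failure into an immediate, descriptive error.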
4. Initialize and run your agent

# Agent initialization
agent = FunctionCallingAgent.from_tools(
    linkup_tool.to_tool_list(),
    llm=OpenAI(
      api_key="<YOUR OPENAI API KEY>",
      model="gpt-4o-mini"
    ),
)

# Agent invocation
response = agent.chat("Can you tell me which women were awarded the Physics Nobel Prize?")
print(response)

Example Response

# Sample output
{
  "response": "Marie Curie (1903), Maria Goeppert Mayer (1963), Donna Strickland (2018), and Andrea Ghez (2020) have been awarded the Nobel Prize in Physics.",
  "sources": [
    # Source information would appear here
  ]
}
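With output_type="sourcedAnswer", the response pairs the answer text with the sources backing it, as in the sample above. The sketch below pulls both fields out of such a payload; the dict shape mirrors the sample output, and the source entry fields ("name", "url") plus the example URL are illustrative assumptions, not the API's documented schema:

```python
# Parse a sourcedAnswer-style payload shaped like the sample above.
# Source field names ("name", "url") and the URL are assumed for illustration.
sample = {
    "response": "Marie Curie (1903), Maria Goeppert Mayer (1963), "
                "Donna Strickland (2018), and Andrea Ghez (2020) "
                "have been awarded the Nobel Prize in Physics.",
    "sources": [
        {"name": "nobelprize.org", "url": "https://www.nobelprize.org/prizes/physics/"},
    ],
}

answer = sample["response"]
# .get() keeps the parsing robust if "sources" is absent or an entry lacks a URL.
source_urls = [s.get("url", "") for s in sample.get("sources", [])]

print(answer)
print(source_urls)
```

Keeping the answer and its source URLs separate makes it easy to render citations next to the generated text.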
Need help? Email support@linkup.so or ping us on Discord.