Overview

Linkup can be used with LlamaIndex to create advanced AI agents and workflows based on internal and web data.

Getting Started with Linkup in LLamaIndex

2. Install dependencies

pip install llama-index llama-index-tools-linkup-research
3. Create your agent

from llama_index.core.agent import FunctionCallingAgent
from llama_index.llms.openai import OpenAI
from llama_index.tools.linkup_research.base import LinkupToolSpec

# Tool initialization
linkup_tool = LinkupToolSpec(
    api_key="<YOUR LINKUP API KEY>",
    depth="standard", # Options: "standard" or "deep"
    output_type="searchResults", # Options: "searchResults", "sourcedAnswer", or "structured"
)

Linkup supports the following configuration options:

| Parameter | Options | Description |
| --- | --- | --- |
| `depth` | `standard`, `deep` | Controls search depth: `deep` performs more thorough research, `standard` is faster. |
| `output_type` | `searchResults`, `sourcedAnswer`, `structured` | Determines the format of the returned information. |
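The table above can be expressed as a small validation helper that builds the keyword arguments for `LinkupToolSpec`. This is an illustrative sketch: the helper and constant names below are our own, not part of the Linkup SDK.

```python
# Allowed values, taken from the configuration table above.
VALID_DEPTHS = {"standard", "deep"}
VALID_OUTPUT_TYPES = {"searchResults", "sourcedAnswer", "structured"}

def validate_linkup_config(depth: str, output_type: str) -> dict:
    """Return a kwargs dict for LinkupToolSpec, or raise on invalid values."""
    if depth not in VALID_DEPTHS:
        raise ValueError(f"depth must be one of {sorted(VALID_DEPTHS)}, got {depth!r}")
    if output_type not in VALID_OUTPUT_TYPES:
        raise ValueError(
            f"output_type must be one of {sorted(VALID_OUTPUT_TYPES)}, got {output_type!r}"
        )
    return {"depth": depth, "output_type": output_type}

# Example: a fast search returning raw results.
config = validate_linkup_config("standard", "searchResults")
```

Validating early like this turns a typo such as `"shallow"` into an immediate, descriptive error instead of a failed API call later.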
4. Initialize and run your agent

# Agent initialization
agent = FunctionCallingAgent.from_tools(
    linkup_tool.to_tool_list(),
    llm=OpenAI(
      api_key="<YOUR OPENAI API KEY>",
      model="gpt-4o-mini"
    ),
)

# Agent invocation
response = agent.chat("Can you tell me which women were awarded the Physics Nobel Prize?")
print(response)
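Rather than hardcoding API keys in source as in the snippet above, you may prefer to read them from environment variables. A minimal stdlib-only sketch; the variable names `LINKUP_API_KEY` and `OPENAI_API_KEY` are conventional choices on our part, not requirements of either SDK:

```python
import os

def require_env(name: str) -> str:
    """Fetch a required environment variable or fail with a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# These values would then be passed to LinkupToolSpec(api_key=...) and
# OpenAI(api_key=...) in place of the hardcoded placeholders:
# linkup_key = require_env("LINKUP_API_KEY")
# openai_key = require_env("OPENAI_API_KEY")
```

Failing fast with a named error is easier to debug than an authentication failure surfacing mid-request.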

Example Response

# Sample output
{
  "response": "Marie Curie (1903), Maria Goeppert Mayer (1963), Donna Strickland (2018), and Andrea Ghez (2020) have been awarded the Nobel Prize in Physics.",
  "sources": [
    # Source information would appear here
  ]
}
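A result like the sample above pairs the answer text with its supporting sources. A minimal sketch of consuming such a dict; since the exact source-entry fields are elided in the sample, each source is treated opaquely here:

```python
# Hypothetical result shaped like the sample output above.
result = {
    "response": "Marie Curie (1903), Maria Goeppert Mayer (1963), "
                "Donna Strickland (2018), and Andrea Ghez (2020) have been "
                "awarded the Nobel Prize in Physics.",
    "sources": [],  # source entries elided in the sample
}

# Print the answer, then report how many sources back it.
print(result["response"])
print(f"{len(result['sources'])} source(s) returned")
```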

Facing issues? Reach out to our engineering team at support@linkup.so or via our Discord.