Overview

Linkup can be used with LangChain as a Retriever. This integration allows you to build powerful applications that retrieve contextual information from the internet.

Getting Started with Linkup in LangChain

1. Install the LangChain integration

pip install langchain-linkup
2. Create and use the Linkup Retriever

import os

from langchain_linkup import LinkupSearchRetriever

os.environ["LINKUP_API_KEY"] = "PASTE_YOUR_API_KEY_HERE"

retriever = LinkupSearchRetriever(
    depth="deep",  # "standard" or "deep"
)

# Perform a search query
documents = retriever.invoke(input="What is Linkup, the new French AI startup?")
print(documents)
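Each result is a LangChain Document, with the page text in page_content and the source's name and URL in metadata. The sketch below uses a minimal stand-in dataclass (not the real langchain_core.documents.Document) and invented sample values, just to illustrate the shape:

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class Document:
    """Minimal stand-in for langchain_core.documents.Document."""
    page_content: str
    metadata: dict[str, Any] = field(default_factory=dict)


# Shape of a typical retriever result (sample values invented for illustration)
doc = Document(
    page_content="Linkup is a French AI startup that provides a search API for LLMs.",
    metadata={"name": "Linkup", "url": "https://www.linkup.fr"},
)

print(f"{doc.metadata['name']} ({doc.metadata['url']})")
print(doc.page_content)
```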

The Linkup retriever supports the following configuration options:

Parameter     Options                                     Description
depth         standard, deep                              Controls search depth: deep performs more thorough research, standard is faster.
output_type   searchResults, sourcedAnswer, structured    Determines the format of the returned information.
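Both options are set when constructing the retriever. A minimal sketch, assuming output_type is accepted as a constructor argument alongside depth; verify the exact parameter names against your installed version of langchain-linkup:

```python
from langchain_linkup import LinkupSearchRetriever

# Assumption: output_type is passed like depth; check your installed version.
retriever = LinkupSearchRetriever(
    depth="standard",             # "standard" (faster) or "deep" (more thorough)
    output_type="searchResults",  # "searchResults", "sourcedAnswer", or "structured"
)
```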

Example Project: RAG Pipeline with OpenAI

1. Install dependencies

pip install langchain-openai python-dotenv
2. Set up API keys

You need API keys for both Linkup and OpenAI. You can create an OpenAI API key from the OpenAI platform dashboard.

import os
from dotenv import load_dotenv

# Load from .env file if available
load_dotenv()

# Or set manually
os.environ["LINKUP_API_KEY"] = "YOUR_LINKUP_API_KEY"
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"
3. Create the RAG pipeline

from typing import Any, Literal
from langchain_core.documents import Document
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import Runnable, RunnableLambda, RunnablePassthrough
from langchain_openai import ChatOpenAI
from langchain_linkup import LinkupSearchRetriever

# Configuration
query: str = "What is Linkup, the new French AI startup?"
linkup_depth: Literal["standard", "deep"] = "standard"
open_ai_model: str = "gpt-4o-mini"

# Initialize retriever
retriever = LinkupSearchRetriever(depth=linkup_depth)

# Format documents helper function
def format_retrieved_documents(docs: list[Document]) -> str:
    return "\n\n".join(
        [
            f"{document.metadata['name']} ({document.metadata['url']}):\n{document.page_content}"
            for document in docs
        ]
    )

# Debug helper function
def inspect_context(state: dict[str, Any]) -> dict[str, Any]:
    print(f"Context: {state['context']}\n\n")
    return state

# Create prompt and model
generation_prompt_template = """Answer the question based only on the following context:

{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(generation_prompt_template)
model = ChatOpenAI(model=open_ai_model)
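To preview the context string that format_retrieved_documents builds before it reaches the prompt, here is a self-contained sketch; the SimpleNamespace objects stand in for real Document results, and their values are invented:

```python
from types import SimpleNamespace


# Same logic as format_retrieved_documents in the pipeline above
def format_retrieved_documents(docs) -> str:
    return "\n\n".join(
        f"{document.metadata['name']} ({document.metadata['url']}):\n{document.page_content}"
        for document in docs
    )


# SimpleNamespace mimics a Document's attributes; sample values are invented
docs = [
    SimpleNamespace(
        metadata={"name": "Linkup", "url": "https://www.linkup.fr"},
        page_content="Linkup provides a search API for LLMs.",
    ),
    SimpleNamespace(
        metadata={"name": "Example Doc", "url": "https://example.com"},
        page_content="A second retrieved page.",
    ),
]

print(format_retrieved_documents(docs))
```

Each document becomes a "name (url):" header followed by its text, with a blank line between documents, which is exactly what lands in the {context} slot of the prompt.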
4. Run the pipeline

# Build and execute the chain
chain: Runnable[Any, str] = (
    {"context": retriever | format_retrieved_documents, "question": RunnablePassthrough()}
    | RunnableLambda(inspect_context)
    | prompt
    | model
    | StrOutputParser()
)

# Get response
response = chain.invoke(input=query)
print(f"Response: {response}")

Example Response

Context: Linkup (https://www.linkup.fr):
Linkup is a French AI startup that provides a search API for LLMs, enabling them to search the web and access up-to-date information.

Response: Linkup is a French AI startup that has developed a search API specifically designed for Large Language Models (LLMs). Their technology allows LLMs to search the web and access current information, which helps overcome the limitation of outdated training data that many AI models face. This enables applications built with LLMs to provide more accurate and up-to-date responses by connecting them to real-time information from the internet.
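The LCEL chain above composes left to right: the opening dict fans the query out into a formatted context plus the untouched question, then the prompt template, the model, and the output parser each transform the previous step's result. A stdlib-only sketch of that data flow, with stub functions standing in for the retriever and model (all stub values are invented):

```python
def retrieve_and_format(question: str) -> str:
    # Stands in for: retriever | format_retrieved_documents
    return "Linkup (https://www.linkup.fr):\nLinkup provides a search API for LLMs."


def build_prompt(state: dict) -> str:
    # Stands in for: ChatPromptTemplate.from_template(...)
    return (
        "Answer the question based only on the following context:\n\n"
        f"{state['context']}\n\nQuestion: {state['question']}\n"
    )


def fake_model(prompt: str) -> str:
    # Stands in for: ChatOpenAI; returns a canned answer instead of calling an API
    return "Linkup is a French AI startup."


def run_chain(question: str) -> str:
    # {"context": ..., "question": RunnablePassthrough()} step
    state = {"context": retrieve_and_format(question), "question": question}
    # prompt | model | StrOutputParser() steps
    return fake_model(build_prompt(state))


print(run_chain("What is Linkup, the new French AI startup?"))
```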

Facing issues? Reach out to our engineering team at support@linkup.so or via our Discord.