πŸ‘€ Overview

Linkup can be used with LangChain as a Retriever. This tutorial shows how to set up a basic LangChain pipeline that leverages Linkup as a Retriever to fetch contextual information from the internet.

πŸ“¦ Installation

Install the LangChain integration using pip:

shell
pip install langchain-linkup
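Optionally, you can verify the install with a quick import check. This is a stdlib sketch; `is_installed` is an illustrative helper, not part of the package:

```python
from importlib.util import find_spec

def is_installed(module_name: str) -> bool:
    """Return True if the module can be imported in the current environment."""
    return find_spec(module_name) is not None

# After `pip install langchain-linkup`, this should print True.
print(is_installed("langchain_linkup"))
```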

πŸ› οΈ Usage

Setting Up Your Environment

  1. πŸ”‘ Get an API Key:

Get a free API key by creating an account on the Linkup App.

  1. βš™οΈ Set-up the API Key:

    Option 1: Export the LINKUP_API_KEY environment variable in your shell before using the Linkup LangChain component.

    shell
    export LINKUP_API_KEY='YOUR_LINKUP_API_KEY'
    

    Option 2: Set the LINKUP_API_KEY environment variable directly within Python, for instance using os.environ or python-dotenv with a .env file (python-dotenv must be installed separately in that case), before creating the Linkup LangChain component.

    python
    import os
    from langchain_linkup import LinkupSearchRetriever
    
    os.environ["LINKUP_API_KEY"] = "YOUR_LINKUP_API_KEY"
    # or dotenv.load_dotenv()
    retriever = LinkupSearchRetriever(...)
    ...
    

    Option 3: Pass the Linkup API key to the Linkup LangChain component when creating it.

    python
    from langchain_linkup import LinkupSearchRetriever
    
    retriever = LinkupSearchRetriever(linkup_api_key="YOUR_LINKUP_API_KEY", ...)
    ...
    

Use the Linkup Retriever

You can then use the Linkup retriever to fetch web information relevant to your question.

python
documents = retriever.invoke(input="<YOUR_QUESTION_HERE>")
print(documents)

πŸ“‹ Example

Linkup search queries can run in one of two very different modes:

  • with depth="standard", the search is straightforward and fast, suited to relatively simple queries (e.g. β€œWhat’s the weather in Paris today?”)
  • with depth="deep", the search uses an agentic workflow, which generally makes it slower, but able to solve more complex queries (e.g. β€œWhat is the company profile of LangChain across the last few years, and how does it compare to its competitors?”)
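If you route queries programmatically, a naive heuristic might pick the mode from surface features of the query. This is purely illustrative (not part of the Linkup API), and assumes simple markers of complexity:

```python
from typing import Literal

def pick_depth(query: str) -> Literal["standard", "deep"]:
    """Naive heuristic: long, comparative, or multi-clause questions get "deep"."""
    markers = ("compare", "across", "trend", " and ", " versus ")
    if len(query) > 120 or any(m in query.lower() for m in markers):
        return "deep"
    return "standard"
```

Treat this only as a starting point: the right depth ultimately depends on how much reasoning the query requires, not on its length.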

πŸ” Retriever (Easy)

python
from langchain_linkup import LinkupSearchRetriever
import os

os.environ["LINKUP_API_KEY"] = "YOUR_LINKUP_API_KEY"

retriever = LinkupSearchRetriever(
    depth="deep",  # "standard" or "deep"
)

# Perform a search query
documents = retriever.invoke(input="What is Linkup, the new French AI startup?")
print(documents)
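Each item in `documents` is a `langchain_core.documents.Document`, with the text in `page_content` and the source name and URL in `metadata`. The sketch below mimics that shape with plain dicts (illustrative values) so you can see the access pattern without calling the API:

```python
# Plain-dict stand-ins for the retrieved Document objects (illustrative values).
documents = [
    {
        "page_content": "Linkup provides a search API for grounding LLMs...",
        "metadata": {"name": "Example source", "url": "https://example.com"},
    },
]

for doc in documents:
    print(f"{doc['metadata']['name']} ({doc['metadata']['url']})")
    print(doc["page_content"])
```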

πŸ€– RAG with the OpenAI LLM (Intermediate)

This is a RAG example using the Linkup API and LangChain’s LCEL (LangChain Expression Language), with OpenAI for generation.

You need a Linkup API key, and an OpenAI API key for the final generation.

You can get an OpenAI API key here.

You can set these keys manually as the LINKUP_API_KEY and OPENAI_API_KEY environment variables, or you can duplicate the file .env.example in a .env file, fill the missing values, and the environment variables will be automatically loaded from it. Alternatively, you can replace the corresponding variables below.
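If you would rather not depend on python-dotenv, the .env loading it performs can be approximated in a few lines of stdlib Python. This is a simplified sketch; the real library also handles quoting rules, variable interpolation, and other edge cases:

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Minimal stand-in for dotenv.load_dotenv: read KEY=VALUE lines into os.environ."""
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue  # skip blanks, comments, and malformed lines
                key, _, value = line.partition("=")
                # Like load_dotenv's default, do not override existing variables.
                os.environ.setdefault(key.strip(), value.strip().strip("'\""))
    except FileNotFoundError:
        pass  # no .env file is fine; variables may be set in the shell instead
```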

python
# pip install langchain-openai

import os
from typing import Any, Literal
from dotenv import load_dotenv
from langchain_core.documents import Document
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import Runnable, RunnableLambda, RunnablePassthrough
from langchain_openai import ChatOpenAI

from langchain_linkup import LinkupSearchRetriever

# os.environ["LINKUP_API_KEY"] = "YOUR_LINKUP_API_KEY"  # or set via your shell / .env
# os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

# You can change the RAG query and parameters here. If you prefer not to use
# environment variables, you can fill in the API keys here instead.
query: str = "What is Linkup, the new French AI startup?"
linkup_depth: Literal["standard", "deep"] = "standard"
linkup_api_key = None
open_ai_model: str = "gpt-4o-mini"
openai_api_key = None

load_dotenv()  # Load environment variables from .env file if there is one


retriever = LinkupSearchRetriever(linkup_api_key=linkup_api_key, depth=linkup_depth)


def format_retrieved_documents(docs: list[Document]) -> str:
    """Format the documents retrieved by the Linkup API as a text."""

    return "\n\n".join(
        [
            f"{document.metadata['name']} ({document.metadata['url']}):\n{document.page_content}"
            for document in docs
        ]
    )


def inspect_context(state: dict[str, Any]) -> dict[str, Any]:
    """Print the context retrieved by the retriever."""
    print(f"Context: {state['context']}\n\n")
    return state


generation_prompt_template = """Answer the question based only on the following context:

{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(generation_prompt_template)
model = ChatOpenAI(model=open_ai_model, api_key=openai_api_key)


chain: Runnable[Any, str] = (
    {"context": retriever | format_retrieved_documents, "question": RunnablePassthrough()}
    | RunnableLambda(inspect_context)
    | prompt
    | model
    | StrOutputParser()
)
response = chain.invoke(input=query)
print(f"Response: {response}")
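The `|` operator in the chain above is LCEL composition: the output of each stage feeds the next. Conceptually it behaves like plain function composition, which this stdlib sketch imitates (illustrative only; LCEL Runnables also add batching, streaming, and async support):

```python
def compose(*stages):
    """Run each stage on the previous stage's output, left to right, like LCEL's `|`."""
    def run(value):
        for stage in stages:
            value = stage(value)
        return value
    return run

# A toy pipeline mirroring the shape of retrieve -> prompt -> model -> parse.
pipeline = compose(
    lambda q: {"context": "retrieved text", "question": q},          # build inputs
    lambda s: f"Answer '{s['question']}' using: {s['context']}",     # fill the prompt
    str.upper,                                                       # stand-in for the model
)
print(pipeline("what is linkup?"))
```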

Facing issues? Reach out to our engineering team at support@linkup.so