Documentation Index
Fetch the complete documentation index at: https://docs.linkup.so/llms.txt
Use this file to discover all available pages before exploring further.
Drop-in agent instructions
Paste the block below verbatim into your agent’s instruction file (AGENTS.md, agent.md, CLAUDE.md, .cursorrules, or any system prompt). No edits required.
Click Copy prompt to download the multi-endpoint integration guide for AI agents.
For an npx-installable version of the same guidance, see
Skill installation below.
Per-endpoint integration briefs
Each endpoint also has a self-contained for-agents page with a
copy-pasteable function-calling tool definition and a focused integration
prompt:
/search for agents
Synchronous web search. Three modes, three output types. Latency from under 1 s to roughly 30 s.
/fetch for agents
Real-time URL → clean LLM-ready markdown. ~1s.
/research for agents
Async, multi-step research with audit-trail citations. 2–20 minutes.
/tasks for agents
Async batch wrapper for bulk and scheduled workloads.
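As a sketch of what one of those tool definitions can look like, here is a minimal OpenAI-style function-calling entry for /search. The field names (`q`, `depth`, `outputType`) and their allowed values are illustrative assumptions, not Linkup’s documented schema; copy the real definition from the /search for-agents page.

```python
# Illustrative OpenAI-style tool definition for Linkup's /search endpoint.
# Parameter names and enum values are assumptions for this sketch; the
# authoritative schema lives on the "/search for agents" page.
SEARCH_TOOL = {
    "type": "function",
    "function": {
        "name": "linkup_search",
        "description": "Synchronous web search. Three modes, three output types.",
        "parameters": {
            "type": "object",
            "properties": {
                "q": {
                    "type": "string",
                    "description": "Natural-language search query.",
                },
                "depth": {
                    "type": "string",
                    "enum": ["standard", "deep"],
                    "description": "Search mode; deeper modes take longer.",
                },
                "outputType": {
                    "type": "string",
                    "enum": ["sourcedAnswer", "searchResults", "structured"],
                    "description": "Shape of the response.",
                },
            },
            "required": ["q"],
        },
    },
}


def tool_names(tools):
    """Collect the function names an agent would register from a tools list."""
    return [t["function"]["name"] for t in tools]
```

A framework that accepts OpenAI-format tools can take `SEARCH_TOOL` as-is; the async endpoints (/research, /tasks) would follow the same pattern with their own parameters.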
Plug into your agent stack
Linkup integrates with most ways of building or running an agent. Pick the path that matches your stack:
MCP server
Hosted endpoint, MCPB bundle, or local install. The default path for
Cursor, Claude Desktop, Claude Code, and other MCP clients.
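For MCP clients that take a JSON configuration, a local install typically looks like the fragment below. The package name and environment variable here are assumptions for illustration; use the exact command and settings from the MCP server page.

```json
{
  "mcpServers": {
    "linkup": {
      "command": "npx",
      "args": ["-y", "linkup-mcp-server"],
      "env": { "LINKUP_API_KEY": "<your-api-key>" }
    }
  }
}
```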
Linkup Skill
npx skills package that teaches coding agents (Cursor, Claude Code,
etc.) the same integration guidance as this page.
Agent frameworks
LangChain, CrewAI, LlamaIndex, Vercel AI SDK, Agno, Composio, and more —
Linkup ships as a tool you can register.
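For frameworks without a prebuilt integration, a plain function wrapping the /search HTTP call is usually enough to register as a tool. This is a minimal stdlib-only sketch: the base URL and request fields are assumptions taken from the tool-definition example above, not a definitive client.

```python
import json
import urllib.request

# Assumed base URL for this sketch; confirm against the /search for-agents page.
API_URL = "https://api.linkup.so/v1/search"


def build_search_request(api_key: str, query: str, depth: str = "standard"):
    """Build the HTTP request for a /search call without sending it.

    Splitting "build" from "send" keeps the function easy to register as a
    framework tool and easy to test offline.
    """
    body = json.dumps({"q": query, "depth": depth, "outputType": "sourcedAnswer"})
    return urllib.request.Request(
        API_URL,
        data=body.encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def linkup_search(api_key: str, query: str) -> str:
    """Tool entry point: run a synchronous web search, return the raw JSON body."""
    req = build_search_request(api_key, query)
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read().decode("utf-8")
```

Most frameworks (LangChain, CrewAI, Vercel AI SDK, etc.) can register `linkup_search` directly as a tool, or wrap it in their own tool class with the description and parameter schema from the endpoint’s for-agents page.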
Coding agents & platforms
OpenClaw and other agent platforms with Linkup pre-wired.
The skill installs the same guidance as this page into your agent’s
instruction file.