diff --git a/sources/_partials/_third-party-integration.mdx b/sources/_partials/_third-party-integration.mdx
new file mode 100644
index 0000000000..fa3688dae4
--- /dev/null
+++ b/sources/_partials/_third-party-integration.mdx
@@ -0,0 +1,5 @@
+:::info Help keep this page up to date
+
+This integration uses a third-party service. If you find outdated content, please [submit an issue on GitHub](https://github.com/apify/apify-docs/issues).
+
+:::
diff --git a/sources/platform/integrations/ai/agno.md b/sources/platform/integrations/ai/agno.md
index fd77a6f224..fcdd7b3291 100644
--- a/sources/platform/integrations/ai/agno.md
+++ b/sources/platform/integrations/ai/agno.md
@@ -6,15 +6,11 @@ sidebar_position: 19
slug: /integrations/agno
---
-## What is Agno?
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
-[Agno](https://docs.agno.com/) is an open-source framework for building intelligent AI agents. It provides a flexible architecture to create agents with custom tools, enabling seamless integration with external services like Apify for tasks such as web scraping, data extraction and automation.
+[Agno](https://docs.agno.com/) is an open-source framework for building intelligent AI agents. It provides a flexible architecture to create agents with custom tools, enabling seamless integration with external services like Apify for tasks such as web scraping, data extraction, and automation. Check out the [Agno documentation](https://docs.agno.com/introduction) for more details on building AI agents.
-:::note Agno documentation
-
-Check out the [Agno documentation](https://docs.agno.com/introduction) for more details on building AI agents.
-
-:::
+<ThirdPartyDisclaimer />
## How to use Apify with Agno
diff --git a/sources/platform/integrations/ai/aws_bedrock.md b/sources/platform/integrations/ai/aws_bedrock.md
index 945dff06bf..78b26a19a4 100644
--- a/sources/platform/integrations/ai/aws_bedrock.md
+++ b/sources/platform/integrations/ai/aws_bedrock.md
@@ -6,6 +6,8 @@ sidebar_position: 15
slug: /integrations/aws_bedrock
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Amazon Bedrock](https://aws.amazon.com/bedrock/) is a fully managed service that provides access to large language models (LLMs), allowing users to create and manage retrieval-augmented generative (RAG) pipelines, and create AI agents to plan and perform actions.
AWS Bedrock supports a wide range of models from providers such as A21 Labs, Anthropic, Cohere, Meta, and Mistral AI.
These models are designed to handle complex, multistep tasks across systems, knowledge bases, and APIs, making them versatile for various use cases.
@@ -14,6 +16,8 @@ In this tutorial, we’ll demonstrate how to create and use AWS Bedrock AI agent
The AI agent will be configured to either answer questions from an internal LLM knowledge or to leverage the [RAG Web Browser](https://apify.com/apify/rag-web-browser) to perform internet searches for relevant information.
This approach enables the agent to provide more comprehensive and accurate responses by combining internal knowledge with real-time data from the web.
+<ThirdPartyDisclaimer />
+
## AWS Bedrock AI agents
Amazon Bedrock allows you to create AI agents powered by large language models to analyze user input and determine the required data sources, and execute actions needed to fulfill the user requests.
diff --git a/sources/platform/integrations/ai/chatgpt.md b/sources/platform/integrations/ai/chatgpt.md
index 8b5259baa0..6b6069ba29 100644
--- a/sources/platform/integrations/ai/chatgpt.md
+++ b/sources/platform/integrations/ai/chatgpt.md
@@ -6,6 +6,8 @@ sidebar_position: 12
slug: /integrations/chatgpt
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
The _ChatGPT_ integration enables you to connect ChatGPT to Apify's extensive library of [Actors](https://apify.com/store) through the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/docs/getting-started/intro).
This allows ChatGPT to access real-time web data and automation capabilities by using Apify tools directly in conversations.
By default, the Apify MCP server exposes a set of tools that let you search and run any Actor you have access to, including all public Actors and rental Actors you have rented.
@@ -14,6 +16,8 @@ _Example query_: "Find and run an Actor that scrapes Instagram profiles and gets
In this tutorial, you'll learn how to connect _ChatGPT_ to the _Apify MCP server_ using a custom connector.
+<ThirdPartyDisclaimer />
+
## Prerequisites
Before connecting ChatGPT to Apify, you'll need:
diff --git a/sources/platform/integrations/ai/crewai.md b/sources/platform/integrations/ai/crewai.md
index 9270a03ad9..7d17bff301 100644
--- a/sources/platform/integrations/ai/crewai.md
+++ b/sources/platform/integrations/ai/crewai.md
@@ -6,15 +6,11 @@ sidebar_position: 3
slug: /integrations/crewai
---
-## What is CrewAI
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
-[CrewAI](https://www.crewai.com/) is an open-source Python framework designed to orchestrate autonomous, role-playing AI agents that collaborate as a "crew" to tackle complex tasks. It enables developers to define agents with specific roles, assign tasks, and integrate tools - like Apify Actors - for real-world data retrieval and automation.
+[CrewAI](https://www.crewai.com/) is an open-source Python framework designed to orchestrate autonomous, role-playing AI agents that collaborate as a "crew" to tackle complex tasks. It enables developers to define agents with specific roles, assign tasks, and integrate tools - like Apify Actors - for real-world data retrieval and automation. For more details, check out the [CrewAI documentation](https://docs.crewai.com/).
-:::note Explore CrewAI
-
-For more in-depth details on CrewAI, check out its [official documentation](https://docs.crewai.com/).
-
-:::
+<ThirdPartyDisclaimer />
## How to use Apify with CrewAI
@@ -36,6 +32,7 @@ First, import all required packages:
```python
import os
+
from crewai import Agent, Task, Crew
from crewai_tools import ApifyActorsTool
from langchain_openai import ChatOpenAI
diff --git a/sources/platform/integrations/ai/flowise.md b/sources/platform/integrations/ai/flowise.md
index 8d3295dc88..fed789b8b0 100644
--- a/sources/platform/integrations/ai/flowise.md
+++ b/sources/platform/integrations/ai/flowise.md
@@ -6,9 +6,11 @@ sidebar_position: 10
slug: /integrations/flowise
---
-## What is Flowise?
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
-Flowise is an open-source UI visual tool to build your customized LLM flow using Langchain.
+[Flowise](https://flowiseai.com/) is an open-source visual UI tool for building customized LLM flows using LangChain.
+<ThirdPartyDisclaimer />
+
## How to use Apify with Flowise
diff --git a/sources/platform/integrations/ai/google-adk.md b/sources/platform/integrations/ai/google-adk.md
index 100c5b1c9f..a3fd3940b3 100644
--- a/sources/platform/integrations/ai/google-adk.md
+++ b/sources/platform/integrations/ai/google-adk.md
@@ -6,15 +6,11 @@ sidebar_position: 2
slug: /integrations/google-adk
---
-## What is the Google ADK
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
-[Google Agent Development Kit](https://github.com/google/adk-python) is a framework for developing and deploying AI agents.
+[Google Agent Development Kit](https://github.com/google/adk-python) (ADK) is a framework for developing and deploying AI agents. For more details, check out the [Google ADK documentation](https://google.github.io/adk-docs/).
-:::note Explore Google ADK
-
-For more details, check out [Google ADK documentation](https://google.github.io/adk-docs/).
-
-:::
+<ThirdPartyDisclaimer />
## How to use Apify with Google ADK
diff --git a/sources/platform/integrations/ai/haystack.md b/sources/platform/integrations/ai/haystack.md
index 110a1a6b74..49c089aba5 100644
--- a/sources/platform/integrations/ai/haystack.md
+++ b/sources/platform/integrations/ai/haystack.md
@@ -6,6 +6,8 @@ sidebar_position: 4
slug: /integrations/haystack
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Haystack](https://haystack.deepset.ai/) is an open source framework for building production-ready LLM applications, agents, advanced retrieval-augmented generative pipelines, and state-of-the-art search systems that work intelligently over large document collections. For more information on Haystack, visit its [documentation](https://docs.haystack.deepset.ai/docs/intro).
In this example, we'll use the [Website Content Crawler](https://apify.com/apify/website-content-crawler) Actor, which can deeply crawl websites such as documentation sites, knowledge bases, or blogs, and extract text content from the web pages.
@@ -175,6 +177,8 @@ for doc in results["retriever"]["documents"]:
To run it, you can use the following command: `python apify_integration.py`
+<ThirdPartyDisclaimer />
+
## Resources
- [Apify-haystack integration documentation](https://haystack.deepset.ai/integrations/apify)
diff --git a/sources/platform/integrations/ai/langchain.md b/sources/platform/integrations/ai/langchain.md
index 807f9d0b47..15626114ee 100644
--- a/sources/platform/integrations/ai/langchain.md
+++ b/sources/platform/integrations/ai/langchain.md
@@ -6,6 +6,8 @@ sidebar_position: 5
slug: /integrations/langchain
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
> For more information on LangChain visit its [documentation](https://docs.langchain.com/oss/python/langchain/overview).
In this example, we'll use the [Website Content Crawler](https://apify.com/apify/website-content-crawler) Actor, which can deeply crawl websites such as documentation, knowledge bases, help centers, or blogs and extract text content from the web pages.
@@ -148,6 +150,8 @@ print("Documents:", loader.load())
Similarly, you can use other Apify Actors to load data into LangChain and query the vector index.
+<ThirdPartyDisclaimer />
+
## Resources
- [LangChain quickstart](https://docs.langchain.com/oss/python/langchain/quickstart)
diff --git a/sources/platform/integrations/ai/langflow.md b/sources/platform/integrations/ai/langflow.md
index d06df1060e..74e6543e33 100644
--- a/sources/platform/integrations/ai/langflow.md
+++ b/sources/platform/integrations/ai/langflow.md
@@ -6,15 +6,11 @@ sidebar_position: 6
slug: /integrations/langflow
---
-## What is Langflow
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
-[Langflow](https://www.langflow.org/) is a low-code, visual tool that enables developers to build powerful AI agents and workflows that can use any API, models, or databases.
+[Langflow](https://www.langflow.org/) is a low-code, visual tool that enables developers to build powerful AI agents and workflows that can use any API, models, or databases. For more information, visit the [Langflow documentation](https://docs.langflow.org/).
-:::note Explore Langflow
-
-For more information on Langflow, visit its [documentation](https://docs.langflow.org/).
-
-:::
+<ThirdPartyDisclaimer />
## How to use Apify with Langflow
diff --git a/sources/platform/integrations/ai/langgraph.md b/sources/platform/integrations/ai/langgraph.md
index b89d9824f4..b325287cc4 100644
--- a/sources/platform/integrations/ai/langgraph.md
+++ b/sources/platform/integrations/ai/langgraph.md
@@ -6,15 +6,11 @@ sidebar_position: 8
slug: /integrations/langgraph
---
-## What is LangGraph
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
-[LangGraph](https://www.langchain.com/langgraph) is a framework designed for constructing stateful, multi-agent applications with Large Language Models (LLMs), allowing developers to build complex AI agent workflows that can leverage tools, APIs, and databases.
+[LangGraph](https://www.langchain.com/langgraph) is a framework for constructing stateful, multi-agent applications with large language models (LLMs). It allows developers to build complex AI agent workflows that can leverage tools, APIs, and databases. For more details, check out the [LangGraph documentation](https://langchain-ai.github.io/langgraph/).
-:::note Explore LangGraph
-
-For more in-depth details on LangGraph, check out its [official documentation](https://langchain-ai.github.io/langgraph/).
-
-:::
+<ThirdPartyDisclaimer />
## How to use Apify with LangGraph
diff --git a/sources/platform/integrations/ai/lindy.md b/sources/platform/integrations/ai/lindy.md
index 1aa382c35f..4661e7c116 100644
--- a/sources/platform/integrations/ai/lindy.md
+++ b/sources/platform/integrations/ai/lindy.md
@@ -6,8 +6,12 @@ sidebar_position: 9
slug: /integrations/lindy
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Lindy](https://www.lindy.ai/) is an AI-powered automation platform that lets you create intelligent workflows and automate complex tasks. By integrating Apify with Lindy, you can leverage Apify's web scraping capabilities within Lindy's AI-driven automation workflows to extract data, monitor websites, and trigger actions based on scraped information.
+<ThirdPartyDisclaimer />
+
## Prerequisites
To use the Apify integration with Lindy, you need:
diff --git a/sources/platform/integrations/ai/llama.md b/sources/platform/integrations/ai/llama.md
index 94ec03ba68..79dbdf6df0 100644
--- a/sources/platform/integrations/ai/llama.md
+++ b/sources/platform/integrations/ai/llama.md
@@ -6,8 +6,12 @@ sidebar_position: 7
slug: /integrations/llama-index
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
> For more information on LlamaIndex, visit its [documentation](https://developers.llamaindex.ai/python/framework/).
+<ThirdPartyDisclaimer />
+
## What is LlamaIndex?
LlamaIndex is a platform that allows you to create and manage vector databases and LLMs.
diff --git a/sources/platform/integrations/ai/mastra.md b/sources/platform/integrations/ai/mastra.md
index c3996a0c6c..1605bd821f 100644
--- a/sources/platform/integrations/ai/mastra.md
+++ b/sources/platform/integrations/ai/mastra.md
@@ -6,15 +6,11 @@ sidebar_position: 11
slug: /integrations/mastra
---
-## What is Mastra
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
-[Mastra](https://mastra.ai) is an open-source TypeScript framework for building AI applications efficiently. It provides essential tools like agents, workflows, retrieval-augmented generation (RAG), integrations, and evaluations. Supporting any LLM (e.g., GPT-4, Claude, Gemini). You can run it locally or deploy it to a serverless cloud like [Apify](https://apify.com).
+[Mastra](https://mastra.ai) is an open-source TypeScript framework for building AI applications efficiently. It provides essential tools like agents, workflows, retrieval-augmented generation (RAG), integrations, and evaluations. It supports any LLM (e.g. GPT-4, Claude, Gemini), and you can run it locally or deploy it to a serverless cloud like [Apify](https://apify.com). For more information, check out the [Mastra documentation](https://mastra.ai/docs).
-:::note Explore Mastra
-
-Check out the [Mastra docs](https://mastra.ai/docs) for more information.
-
-:::
+<ThirdPartyDisclaimer />
## What is MCP server
@@ -43,6 +39,7 @@ First, import all required packages:
import { Agent } from '@mastra/core/agent';
import { MastraMCPClient } from '@mastra/mcp';
import { openai } from '@ai-sdk/openai';
+
// For Anthropic use
// import { anthropic } from '@ai-sdk/anthropic';
```
diff --git a/sources/platform/integrations/ai/mcp.md b/sources/platform/integrations/ai/mcp.md
index 3842045fc5..1fdbb940ed 100644
--- a/sources/platform/integrations/ai/mcp.md
+++ b/sources/platform/integrations/ai/mcp.md
@@ -11,6 +11,7 @@ toc_max_heading_level: 4
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
The Apify's MCP server ([mcp.apify.com](https://mcp.apify.com)) allows AI applications and agents to interact with the Apify platform
using [Model Context Protocol](https://modelcontextprotocol.io/docs/getting-started/intro). The server enables AI agents to
@@ -19,6 +20,8 @@ and enables AI coding assistants to access Apify documentation and tutorials.

+<ThirdPartyDisclaimer />
+
## Prerequisites
Before connecting your AI to Apify, you'll need three things:
@@ -303,7 +306,6 @@ Use the UI configurator `https://mcp.apify.com/` to select your tools visually,
| `add-actor`* | experimental | ❔ | Add an Actor as a new tool for the user to call |
| `get-actor-output`* | - | ✅ | Retrieve the output from an Actor call which is not included in the output preview of the Actor tool. |
-
:::note Retrieving full output
The `get-actor-output` tool is automatically included with any Actor-related tool, such as `call-actor`, `add-actor`, or specific Actor tools like `apify-slash-rag-web-browser`. When you call an Actor, you receive an output preview. Depending on the output format and length, the preview may contain the complete output or only a limited version to avoid overwhelming the LLM. To retrieve the full output, use the `get-actor-output` tool with the `datasetId` from the Actor call. This tool supports limit, offset, and field filtering.
@@ -318,7 +320,6 @@ It can search Apify Store for relevant Actors using the `search-actors` tool, in
This dynamic discovery means your AI can adapt to new tasks without manual configuration.
Each discovered Actor becomes immediately available for future use in the conversation.
-
:::note Dynamic tool discovery
When you use the `actors` tool category, clients that support dynamic tool discovery (such as Claude.ai web and VS Code) will automatically receive the `add-actor` tool instead of `call-actor` for enhanced Actor discovery capabilities.
diff --git a/sources/platform/integrations/ai/milvus.md b/sources/platform/integrations/ai/milvus.md
index 759a758936..e9a88d7d92 100644
--- a/sources/platform/integrations/ai/milvus.md
+++ b/sources/platform/integrations/ai/milvus.md
@@ -8,12 +8,16 @@ toc_min_heading_level: 2
toc_max_heading_level: 4
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Milvus](https://milvus.io/) is an open-source vector database optimized for performing similarity searches on large datasets of high-dimensional vectors.
Its focus on efficient vector similarity search allows for the creation of powerful and scalable retrieval systems.
The Apify integration for Milvus allows exporting results from Apify Actors and Dataset items into a Milvus collection.
It can also be connected to a managed Milvus instance on [Zilliz Cloud](https://cloud.zilliz.com).
+<ThirdPartyDisclaimer />
+
## Prerequisites
Before you begin, ensure that you have the following:
diff --git a/sources/platform/integrations/ai/openai_agents.md b/sources/platform/integrations/ai/openai_agents.md
index ad653d85fc..43bff03f45 100644
--- a/sources/platform/integrations/ai/openai_agents.md
+++ b/sources/platform/integrations/ai/openai_agents.md
@@ -6,11 +6,15 @@ sidebar_position: 13
slug: /integrations/openai-agents
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
The _OpenAI Agents Python SDK_ enables you to build AI agents powered by OpenAI's language models that can use tools, manage context, and interact with external systems through the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/docs/getting-started/intro).
By connecting to the Apify MCP server, your agents can access Apify's extensive library of Actors to perform web scraping, data extraction, and automation tasks in real time.
For more details about the OpenAI Agents SDK, refer to the [official documentation](https://openai.github.io/openai-agents-python/).
+<ThirdPartyDisclaimer />
+
## Prerequisites
Before integrating Apify with OpenAI Agents SDK, you'll need:
@@ -38,7 +42,6 @@ from agents.mcp import MCPServerStreamableHttp
os.environ["APIFY_TOKEN"] = "Your Apify API token"
os.environ["OPENAI_API_KEY"] = "Your OpenAI API key"
-
async def main() -> None:
# Create MCP server connection with Bearer token
async with MCPServerStreamableHttp(
@@ -62,7 +65,6 @@ async def main() -> None:
result = await Runner.run(agent, "Search the web and summarize recent trends in AI agents")
print(result.final_output)
-
if __name__ == "__main__":
asyncio.run(main())
```
@@ -133,7 +135,6 @@ from agents.mcp import MCPServerStreamableHttp
os.environ["APIFY_TOKEN"] = "Your Apify API token"
os.environ["OPENAI_API_KEY"] = "Your OpenAI API key"
-
async def main() -> None:
# Create MCP server connection
async with MCPServerStreamableHttp(
@@ -157,7 +158,6 @@ async def main() -> None:
result = await Runner.run(agent, "Search the web and summarize recent trends in AI agents")
print(result.final_output)
-
if __name__ == "__main__":
asyncio.run(main())
```
@@ -176,7 +176,6 @@ from agents.mcp import MCPServerStreamableHttp
os.environ["APIFY_TOKEN"] = "Your Apify API token"
os.environ["OPENAI_API_KEY"] = "Your OpenAI API key"
-
async def main() -> None:
# Create MCP server connection with Instagram scraper
async with MCPServerStreamableHttp(
@@ -202,7 +201,6 @@ async def main() -> None:
)
print(result.final_output)
-
if __name__ == "__main__":
asyncio.run(main())
```
@@ -221,7 +219,6 @@ from agents.mcp import MCPServerStreamableHttp
os.environ["APIFY_TOKEN"] = "Your Apify API token"
os.environ["OPENAI_API_KEY"] = "Your OpenAI API key"
-
async def main() -> None:
# Connect to Apify MCP server for testing
async with MCPServerStreamableHttp(
@@ -251,7 +248,6 @@ async def main() -> None:
)
print(result.final_output)
-
if __name__ == "__main__":
asyncio.run(main())
```
diff --git a/sources/platform/integrations/ai/openai_assistants.md b/sources/platform/integrations/ai/openai_assistants.md
index e260c41f50..06ab8497b3 100644
--- a/sources/platform/integrations/ai/openai_assistants.md
+++ b/sources/platform/integrations/ai/openai_assistants.md
@@ -6,6 +6,8 @@ sidebar_position: 14
slug: /integrations/openai-assistants
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[OpenAI Assistants API](https://platform.openai.com/docs/assistants/overview) allows you to build your own AI applications such as chatbots, virtual assistants, and more.
The OpenAI Assistants can access OpenAI knowledge base ([vector store](https://platform.openai.com/docs/api-reference/vector-stores)) via file search and use function calling for dynamic interaction and data retrieval.
@@ -14,6 +16,8 @@ Unlike Custom GPT, OpenAI Assistants are available via API, enabling integration
In this tutorial, we’ll start by demonstrating how to create an assistant and integrate real-time data using function calling with the [RAG Web Browser](https://apify.com/apify/rag-web-browser).
Next, we’ll show how to save data from Apify Actors into the OpenAI Vector Store for easy retrieval through [file-search](https://platform.openai.com/docs/assistants/tools/file-search).
+<ThirdPartyDisclaimer />
+
## Real-time search data for OpenAI Assistant
We'll use the [RAG Web Browser](https://apify.com/apify/rag-web-browser) Actor to fetch the latest information from the web and provide it to the OpenAI Assistant through [function calling](https://platform.openai.com/docs/assistants/tools/function-calling?context=without-streaming).
diff --git a/sources/platform/integrations/ai/pinecone.md b/sources/platform/integrations/ai/pinecone.md
index 52a370de1c..e0edf1ae79 100644
--- a/sources/platform/integrations/ai/pinecone.md
+++ b/sources/platform/integrations/ai/pinecone.md
@@ -8,10 +8,14 @@ toc_min_heading_level: 2
toc_max_heading_level: 4
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Pinecone](https://www.pinecone.io) is a managed vector database that allows users to store and query dense vectors for AI applications such as recommendation systems, semantic search, and retrieval augmented generation (RAG).
The Apify integration for Pinecone enables you to export results from Apify Actors and Dataset items into a specific Pinecone vector index.
+<ThirdPartyDisclaimer />
+
## Prerequisites
Before you begin, ensure that you have the following:
diff --git a/sources/platform/integrations/ai/qdrant.md b/sources/platform/integrations/ai/qdrant.md
index 6ac6c29f56..3e7ca39215 100644
--- a/sources/platform/integrations/ai/qdrant.md
+++ b/sources/platform/integrations/ai/qdrant.md
@@ -8,10 +8,14 @@ toc_min_heading_level: 2
toc_max_heading_level: 4
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Qdrant](https://qdrant.tech) is a high performance managed vector database that allows users to store and query dense vectors for next generation AI applications such as recommendation systems, semantic search, and retrieval augmented generation (RAG).
The Apify integration for Qdrant enables you to export results from Apify Actors and Dataset items into a specific Qdrant collection.
+<ThirdPartyDisclaimer />
+
## Prerequisites
Before you begin, ensure that you have the following:
diff --git a/sources/platform/integrations/ai/skyfire.md b/sources/platform/integrations/ai/skyfire.md
index e6674a73dd..e07c2af655 100644
--- a/sources/platform/integrations/ai/skyfire.md
+++ b/sources/platform/integrations/ai/skyfire.md
@@ -8,6 +8,7 @@ slug: /integrations/skyfire
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
Agentic payments enable AI agents to autonomously run Apify Actors using third-party payment providers, without requiring traditional Apify user accounts. This allows agents to discover, execute, and pay for web scraping and automation tasks independently.
@@ -19,6 +20,8 @@ Keep in mind that agentic payments are an experimental feature and may undergo s
:::
+<ThirdPartyDisclaimer />
+
## What is Skyfire?
[Skyfire](https://skyfire.xyz/) is a payment network built specifically for AI agents, enabling autonomous transactions with digital wallets and spending controls. It provides the infrastructure necessary for agents to make payments on behalf of users, allowing autonomous AI-driven workflows.
diff --git a/sources/platform/integrations/ai/vercel-ai-sdk.md b/sources/platform/integrations/ai/vercel-ai-sdk.md
index ddaa75c215..89f8461027 100644
--- a/sources/platform/integrations/ai/vercel-ai-sdk.md
+++ b/sources/platform/integrations/ai/vercel-ai-sdk.md
@@ -6,15 +6,11 @@ sidebar_position: 2
slug: /integrations/vercel-ai-sdk
---
-## What is the Vercel AI SDK
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
-[Vercel AI SDK](https://ai-sdk.dev/) is the TypeScript toolkit designed to help developers build AI-powered applications and agents with React, Next.js, Vue, Svelte, Node.js, and more.
+[Vercel AI SDK](https://ai-sdk.dev/) is a TypeScript toolkit designed to help developers build AI-powered applications and agents with React, Next.js, Vue, Svelte, Node.js, and more. For more details, check out the [Vercel AI SDK documentation](https://ai-sdk.dev/docs/introduction).
-:::note Explore Vercel AI SDK
-
-For more in-depth details, check out [Vercel AI SDK documentation](https://ai-sdk.dev/docs/introduction).
-
-:::
+<ThirdPartyDisclaimer />
## How to use Apify with Vercel AI SDK
@@ -37,6 +33,7 @@ First, import all required packages:
import { experimental_createMCPClient as createMCPClient, generateText, stepCountIs } from 'ai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
+
```
Connect to the Apify MCP server and get all available tools for the AI agent. You can use the [UI configurator](https://mcp.apify.com/) to select your tools visually and generate the configuration code below:
diff --git a/sources/platform/integrations/data-storage/airbyte.md b/sources/platform/integrations/data-storage/airbyte.md
index 7bf5b7d5c5..d8e688038d 100644
--- a/sources/platform/integrations/data-storage/airbyte.md
+++ b/sources/platform/integrations/data-storage/airbyte.md
@@ -6,6 +6,8 @@ sidebar_position: 1
slug: /integrations/airbyte
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
Airbyte is an open-source data integration platform that allows you to move your data between different sources and destinations using pre-built connectors, which are maintained either by Airbyte itself or by its community.
One of these connectors is the Apify Dataset connector, which makes it simple to move data from Apify datasets to any supported destination.
@@ -14,6 +16,8 @@ To use Airbyte's Apify connector you need to:
* Have an Apify account.
* Have an Airbyte account.
+<ThirdPartyDisclaimer />
+
## Set up Apify connector in Airbyte
Once you have all the necessary accounts set up, you need to set up the Apify connector.
diff --git a/sources/platform/integrations/data-storage/airtable/console_integration.md b/sources/platform/integrations/data-storage/airtable/console_integration.md
index 9d6798c09b..8375bb7229 100644
--- a/sources/platform/integrations/data-storage/airtable/console_integration.md
+++ b/sources/platform/integrations/data-storage/airtable/console_integration.md
@@ -6,10 +6,14 @@ sidebar_position: 1
slug: /integrations/airtable/console
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Airtable](https://www.airtable.com/) is a cloud-based platform for organizing, managing, and collaborating on data. With Apify integration for Airtable, you can automatically upload Actor run results to Airtable after a successful run.
This integration uses OAuth 2.0, a secure authorization protocol, to connect your Airtable account to Apify and manage data transfers.
+<ThirdPartyDisclaimer />
+
## Connect Apify with Airtable
To use the Apify integration for Airtable, ensure you have:
diff --git a/sources/platform/integrations/data-storage/airtable/index.md b/sources/platform/integrations/data-storage/airtable/index.md
index 0fc9a65beb..c1026ab870 100644
--- a/sources/platform/integrations/data-storage/airtable/index.md
+++ b/sources/platform/integrations/data-storage/airtable/index.md
@@ -6,10 +6,14 @@ sidebar_position: 4
slug: /integrations/airtable
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Airtable](https://www.airtable.com/) is a cloud-based platform for organizing, managing, and collaborating on data. With the Apify integration for Airtable, you can automatically upload Actor run results to Airtable after a successful run.
This integration uses OAuth 2.0, a secure authorization protocol, to connect your Airtable account to Apify and manage data transfers.
+<ThirdPartyDisclaimer />
+
## Connect Apify with Airtable
To use the Apify integration for Airtable, ensure you have:
diff --git a/sources/platform/integrations/data-storage/drive.md b/sources/platform/integrations/data-storage/drive.md
index 3f8bc8e026..685eeaac83 100644
--- a/sources/platform/integrations/data-storage/drive.md
+++ b/sources/platform/integrations/data-storage/drive.md
@@ -6,6 +6,12 @@ sidebar_position: 3
slug: /integrations/drive
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Save Apify Actor run results directly to Google Drive. Set up the integration on your task to automatically upload files after each successful run.
+
+<ThirdPartyDisclaimer />
+
## Get started
To use the Apify integration for Google Drive, you will need:
diff --git a/sources/platform/integrations/data-storage/keboola.md b/sources/platform/integrations/data-storage/keboola.md
index 20b21da5ee..ec96c842ad 100644
--- a/sources/platform/integrations/data-storage/keboola.md
+++ b/sources/platform/integrations/data-storage/keboola.md
@@ -6,10 +6,14 @@ sidebar_position: 2
slug: /integrations/keboola
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
With Apify integration for [Keboola](https://www.keboola.com/), you can extract data from various sources using your Apify Actors and load it into Keboola for further processing, transformation, and integration with other platforms.
The Keboola integration allows you to run your Actors, fetch items from datasets, and retrieve results, all within the Keboola platform.
+
+<ThirdPartyDisclaimer />
## Connect Apify with Keboola
To use the Apify integration on Keboola, you will need to:
diff --git a/sources/platform/integrations/workflows-and-notifications/activepieces.md b/sources/platform/integrations/workflows-and-notifications/activepieces.md
index 58aa2cf40b..b2bc2b1740 100644
--- a/sources/platform/integrations/workflows-and-notifications/activepieces.md
+++ b/sources/platform/integrations/workflows-and-notifications/activepieces.md
@@ -6,6 +6,8 @@ sidebar_position: 7
slug: /integrations/activepieces
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Activepieces](https://www.activepieces.com) is an open-source automation platform that lets you build workflows to connect apps and automate tasks without writing code. With the Apify piece, you can connect your Apify Actors and tasks to other services, build data pipelines, and react to scraping results in real time.
This guide shows you how to integrate Apify with Activepieces to build automated workflows. You'll learn how to:
@@ -22,6 +24,8 @@ An Activepieces flow consists of three key parts:
The Apify piece lets you trigger flows when an Actor or task run finishes, start Actor or task runs from any other trigger, or retrieve data from datasets and key-value stores.
+
+<ThirdPartyDisclaimer />
## Prerequisites
Before using the Apify piece in Activepieces, you need:
diff --git a/sources/platform/integrations/workflows-and-notifications/bubble.md b/sources/platform/integrations/workflows-and-notifications/bubble.md
index e85fc568ea..2b2f1e8e1a 100644
--- a/sources/platform/integrations/workflows-and-notifications/bubble.md
+++ b/sources/platform/integrations/workflows-and-notifications/bubble.md
@@ -6,6 +6,8 @@ sidebar_position: 7
slug: /integrations/bubble
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Bubble](https://bubble.io/) is a no-code platform that allows you to build web applications without writing code. With the [Apify integration for Bubble](https://bubble.io/plugin/apify-1749639212621x698168698147962900), you can easily connect your Apify Actors to your Bubble applications to automate workflows and display scraped data.
:::tip Explore the live demo
@@ -14,6 +16,8 @@ Open the demo Bubble app to check out the integration end-to-end before building
:::
+
+<ThirdPartyDisclaimer />
## Get started
To use the Apify integration for Bubble, you will need:
diff --git a/sources/platform/integrations/workflows-and-notifications/dify.md b/sources/platform/integrations/workflows-and-notifications/dify.md
index 5f5e99ee07..231a41547a 100644
--- a/sources/platform/integrations/workflows-and-notifications/dify.md
+++ b/sources/platform/integrations/workflows-and-notifications/dify.md
@@ -6,6 +6,8 @@ sidebar_position: 9
slug: /integrations/dify
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
**Connect Apify with Dify to automate workflows by running Actors, extracting structured data, and responding to Actor or task events.**
---
@@ -14,6 +16,8 @@ slug: /integrations/dify
This guide explains how to set up authentication and incorporate the Apify plugin into your Dify applications as either a tool (action) or a trigger.
+
+<ThirdPartyDisclaimer />
## Prerequisites
Before you begin, make sure you have:
diff --git a/sources/platform/integrations/workflows-and-notifications/gmail.md b/sources/platform/integrations/workflows-and-notifications/gmail.md
index a41cd88810..ed9f3726ac 100644
--- a/sources/platform/integrations/workflows-and-notifications/gmail.md
+++ b/sources/platform/integrations/workflows-and-notifications/gmail.md
@@ -6,6 +6,12 @@ sidebar_position: 6
slug: /integrations/gmail
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Send automated email notifications with Actor run results to any Gmail address. Set up the integration on your task to receive emails after each successful run.
+
+<ThirdPartyDisclaimer />
+
## Get started
To use the Apify integration for Gmail, you will need:
diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/index.md b/sources/platform/integrations/workflows-and-notifications/gumloop/index.md
index 6df6ec189d..1ece39ad9c 100644
--- a/sources/platform/integrations/workflows-and-notifications/gumloop/index.md
+++ b/sources/platform/integrations/workflows-and-notifications/gumloop/index.md
@@ -6,6 +6,8 @@ sidebar_position: 3
slug: /integrations/gumloop
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
With the Gumloop Apify integration you can retrieve key data for your AI-powered workflows in a flash.
Gumloop supports two types of integrations with Apify:
@@ -13,6 +15,8 @@ Gumloop supports two types of integrations with Apify:
- Direct integrations with Apify Actors through MCP nodes, where you can prompt the data you need (Recommended)
- General Apify integration using the Apify task runner node
+
+<ThirdPartyDisclaimer />
## Direct integrations with Apify Actors (recommended)
Gumloop offers native nodes for popular Apify use cases that provide enhanced functionality and easier configuration.
diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/instagram.md b/sources/platform/integrations/workflows-and-notifications/gumloop/instagram.md
index 67a7f8f2fa..4757dbf339 100644
--- a/sources/platform/integrations/workflows-and-notifications/gumloop/instagram.md
+++ b/sources/platform/integrations/workflows-and-notifications/gumloop/instagram.md
@@ -6,6 +6,8 @@ sidebar_position: 1
slug: /integrations/gumloop/instagram
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
Get Instagram profile posts, details, stories, reels, post comments and hashtags, users, and tagged posts in Gumloop.
---
@@ -14,6 +16,8 @@ The Gumloop integration for Instagram provides a direct interface for running Ap
Using the Gumloop Instagram MCP node, you can prompt the Instagram data you need and Gumloop will retrieve it from relevant Apify Actors. From there you can connect this data to other tools and AI models to process the information.
+
+<ThirdPartyDisclaimer />
## Available actions
You can pull the following types of data from public Instagram accounts using Gumloop’s Instagram node (via Apify). Each action has a credit cost.
diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/maps.md b/sources/platform/integrations/workflows-and-notifications/gumloop/maps.md
index 005399362f..63b82f426c 100644
--- a/sources/platform/integrations/workflows-and-notifications/gumloop/maps.md
+++ b/sources/platform/integrations/workflows-and-notifications/gumloop/maps.md
@@ -6,6 +6,8 @@ sidebar_position: 2
slug: /integrations/gumloop/maps
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
Search, extract, and enrich business data from Google Maps in Gumloop.
---
@@ -14,6 +16,8 @@ The Gumloop Google Maps integration provides a native interface for running Apif
Using the Gumloop Google Maps MCP node, you can simply prompt the location data you need and Gumloop will retrieve it from relevant Apify Actors. From there, you can connect it to your favorite tools and AI agents to process the information.
+
+<ThirdPartyDisclaimer />
## Available actions
You can pull the following types of place data from Google Maps using Gumloop’s Google Maps node (via Apify). Each action has a credit cost.
@@ -26,7 +30,6 @@ You can pull the following types of place data from Google Maps using Gumloop’
| Get place reviews | Fetch reviews for specific locations, including text, rating, and reviewer info. | 3 credits per item |
| Find places in area | Return all visible places within a defined map area or bounding box. | 3 credits per item |
-
## Retrieve Google Maps data in Gumloop
1. _Add the Gumloop Google Maps MCP node._
diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md b/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md
index f6cf1a2db0..5145231e12 100644
--- a/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md
+++ b/sources/platform/integrations/workflows-and-notifications/gumloop/tiktok.md
@@ -6,6 +6,8 @@ sidebar_position: 3
slug: /integrations/gumloop/tiktok
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
Get TikTok hashtag videos, profile videos, followers, video details, and search results in Gumloop.
---
@@ -13,6 +15,8 @@ Get TikTok hashtag videos, profile videos, followers, video details, and search
The Gumloop TikTok integration provides a native interface for running Apify’s TikTok scrapers directly in your workflows. No API tokens or manual polling required. All you need is a Gumloop account.
Using the Gumloop TikTok MCP node, you can simply prompt the TikTok data you need and Gumloop will retrieve it from relevant Apify Actors. From there, you can connect it to your favorite tools and AI agents to process the information.
+
+<ThirdPartyDisclaimer />
## Available actions
You can pull the following types of data from TikTok using Gumloop’s TikTok node (via Apify). Each action has a credit cost.
diff --git a/sources/platform/integrations/workflows-and-notifications/gumloop/youtube.md b/sources/platform/integrations/workflows-and-notifications/gumloop/youtube.md
index 1f4946e701..6c03528114 100644
--- a/sources/platform/integrations/workflows-and-notifications/gumloop/youtube.md
+++ b/sources/platform/integrations/workflows-and-notifications/gumloop/youtube.md
@@ -6,6 +6,8 @@ sidebar_position: 4
slug: /integrations/gumloop/youtube
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
Get YouTube search results, video details, channel videos, playlists, and channel metadata in Gumloop.
---
@@ -14,6 +16,8 @@ The Gumloop YouTube integration provides a native interface for running Apify’
Using the Gumloop YouTube MCP node, you can simply prompt the YouTube data you need and Gumloop will retrieve it from relevant Apify Actors. From there, you can connect it to your favorite tools and AI agents to process the information.
+
+<ThirdPartyDisclaimer />
## Available actions
You can pull the following types of data from YouTube using Gumloop’s YouTube node (via Apify). Each action has a credit cost:
diff --git a/sources/platform/integrations/workflows-and-notifications/ifttt.md b/sources/platform/integrations/workflows-and-notifications/ifttt.md
index 6fa9751962..2ba68372b0 100644
--- a/sources/platform/integrations/workflows-and-notifications/ifttt.md
+++ b/sources/platform/integrations/workflows-and-notifications/ifttt.md
@@ -6,6 +6,8 @@ sidebar_position: 7
slug: /integrations/ifttt
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[IFTTT](https://ifttt.com) is a service that helps you create automated workflows called Applets. With the [Apify integration for IFTTT](https://ifttt.com/apify), you can connect your Apify Actors to hundreds of services like Twitter, Gmail, Google Sheets, Slack, and more.
This guide shows you how to integrate Apify Actors with IFTTT to build automated workflows. You'll learn how to create IFTTT Applets that can be triggered by Apify events or that can execute Apify tasks.
@@ -18,6 +20,8 @@ An IFTTT Applet consists of three key parts:
The Apify integration lets you trigger workflows when an Actor or task run finishes, start Actor or task runs from other triggers, or retrieve data from datasets and key-value stores.
+
+<ThirdPartyDisclaimer />
## Prerequisites
Before using the Apify integration with IFTTT, you need:
diff --git a/sources/platform/integrations/workflows-and-notifications/kestra.md b/sources/platform/integrations/workflows-and-notifications/kestra.md
index 9501cd706f..534fbf7bd9 100644
--- a/sources/platform/integrations/workflows-and-notifications/kestra.md
+++ b/sources/platform/integrations/workflows-and-notifications/kestra.md
@@ -6,10 +6,14 @@ sidebar_position: 7
slug: /integrations/kestra
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Kestra](https://kestra.io/) is an open-source, event-driven orchestration platform. The [Apify plugin for Kestra](https://github.com/kestra-io/plugin-kestra) connects Apify Actors and storage to your workflows. Run scrapers and extract structured data, all defined declaratively in YAML and orchestrated directly from the UI.
This guide shows you how to set up the integration, configure authentication, and create a workflow that runs an Actor and processes its results.
+
+<ThirdPartyDisclaimer />
## Prerequisites
Before you begin, make sure you have:
diff --git a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md
index ba07ea07bb..e9f47eb789 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/ai-crawling.md
@@ -7,6 +7,12 @@ slug: /integrations/make/ai-crawling
toc_max_heading_level: 4
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the Apify Scraper for AI Crawling module in [Make](https://www.make.com/) to extract clean Markdown from websites and feed AI models, RAG pipelines, or LLM frameworks at scale.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for AI Crawling
Apify Scraper for AI Crawling from [Apify](https://apify.com/) lets you extract text content from websites to feed AI models, LLM applications, vector databases, or Retrieval Augmented Generation (RAG) pipelines. It supports rich formatting using Markdown, cleans the HTML of irrelevant elements, downloads linked files, and integrates with AI ecosystems like LangChain, LlamaIndex, and other LLM frameworks.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/amazon.md b/sources/platform/integrations/workflows-and-notifications/make/amazon.md
index 28475015b7..7a808ac6fb 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/amazon.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/amazon.md
@@ -7,6 +7,12 @@ slug: /integrations/make/amazon
unlisted: true
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the Amazon Scraper module in [Make](https://www.make.com/) to extract product listings, search results, and category data from Amazon.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for Amazon Data
The Amazon Scraper module from [Apify](https://apify.com) allows you to extract product, search, or category data from Amazon.
@@ -233,5 +239,4 @@ There are other native Make Apps powered by Apify. You can check out Apify Scrap
- [YouTube Data](/platform/integrations/make/youtube)
- [AI crawling](/platform/integrations/make/ai-crawling)
-
And more! You can access any of the thousands of scrapers on Apify Store by using the [general Apify connections](https://www.make.com/en/integrations/apify).
diff --git a/sources/platform/integrations/workflows-and-notifications/make/facebook.md b/sources/platform/integrations/workflows-and-notifications/make/facebook.md
index 58ae6116b4..172ade8029 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/facebook.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/facebook.md
@@ -7,6 +7,12 @@ slug: /integrations/make/facebook
unlisted: true
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the Facebook Scraper modules in [Make](https://www.make.com/) to extract posts, comments, and profile data from Facebook.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for Facebook Data
The Facebook Scraper modules from [Apify](https://apify.com/) allow you to extract posts, comments, and profile data from Facebook.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/index.md b/sources/platform/integrations/workflows-and-notifications/make/index.md
index e4660f64cd..0f8e022221 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/index.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/index.md
@@ -6,8 +6,12 @@ sidebar_position: 2
slug: /integrations/make
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Make](https://www.make.com/) _(formerly Integromat)_ allows you to create scenarios where you can integrate various services (modules) to automate and centralize jobs. Apify has its own module you can use to run Apify Actors, get notified about run statuses, and receive Actor results directly in your Make scenario.
+
+<ThirdPartyDisclaimer />
## Connect Apify to Make
To use the Apify integration on Make, you will need:
diff --git a/sources/platform/integrations/workflows-and-notifications/make/instagram.md b/sources/platform/integrations/workflows-and-notifications/make/instagram.md
index 7b1b5c51dd..851ebdabfb 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/instagram.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/instagram.md
@@ -7,6 +7,12 @@ slug: /integrations/make/instagram
unlisted: true
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the Instagram Scraper modules in [Make](https://www.make.com/) to extract posts, comments, and profile data from Instagram.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for Instagram Data
The Instagram Scraper modules from [Apify](https://apify.com) allow you to extract posts, comments, and profile data from Instagram.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/llm.md b/sources/platform/integrations/workflows-and-notifications/make/llm.md
index 9253cc620a..376ab3ea66 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/llm.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/llm.md
@@ -7,6 +7,12 @@ slug: /integrations/make/llm
toc_max_heading_level: 4
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the Apify Scraper for LLMs module in [Make](https://www.make.com/) to query Google Search, scrape the top results, and return clean Markdown for AI assistants and RAG pipelines.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for LLMs
Apify Scraper for LLMs from [Apify](https://apify.com) is a web browsing module for OpenAI Assistants, RAG pipelines, and AI agents. It can query Google Search, scrape the top results, and return page content as Markdown for downstream AI processing.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/maps.md b/sources/platform/integrations/workflows-and-notifications/make/maps.md
index 9882b1f941..6169a5eb1b 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/maps.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/maps.md
@@ -8,6 +8,12 @@ toc_max_heading_level: 4
unlisted: true
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the Google Maps Leads Scraper modules in [Make](https://www.make.com/) to extract business contact details, reviews, and location data for sales and marketing workflows.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for Google Maps Leads
The Google Maps Leads Scraper modules from [apify.com](http://apify.com/) allow you to extract valuable business lead data from Google Maps, including contact information, email addresses, social media profiles, business websites, phone numbers, and detailed location data. Perfect for sales teams, marketers, and business developers looking to build targeted lead lists, or for commercial teams that want to mine reviews and run sentiment analysis across wide geographies.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/search.md b/sources/platform/integrations/workflows-and-notifications/make/search.md
index 6af241e4f1..d1256e3818 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/search.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/search.md
@@ -7,6 +7,12 @@ slug: /integrations/make/search
unlisted: true
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the Google Search Scraper modules in [Make](https://www.make.com/) to crawl SERPs and extract structured results in JSON, XML, CSV, or Excel.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for Google Search
The Google Search modules from [Apify](https://apify.com) allow you to crawl Google Search Results Pages (SERPs) and extract data from those web pages in structured formats such as JSON, XML, CSV, or Excel.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/tiktok.md b/sources/platform/integrations/workflows-and-notifications/make/tiktok.md
index 2b84ca90e2..1b40585693 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/tiktok.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/tiktok.md
@@ -7,6 +7,12 @@ slug: /integrations/make/tiktok
unlisted: true
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the TikTok Scraper modules in [Make](https://www.make.com/) to extract hashtag videos, comments, and profile data from TikTok.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for TikTok Data
The TikTok Scraper modules from [Apify](https://apify.com) allow you to extract hashtag, comment, and profile data from TikTok.
diff --git a/sources/platform/integrations/workflows-and-notifications/make/youtube.md b/sources/platform/integrations/workflows-and-notifications/make/youtube.md
index 475d12e7d6..f6c399bfea 100644
--- a/sources/platform/integrations/workflows-and-notifications/make/youtube.md
+++ b/sources/platform/integrations/workflows-and-notifications/make/youtube.md
@@ -7,6 +7,12 @@ slug: /integrations/make/youtube
unlisted: true
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+Use the YouTube Scraper module in [Make](https://www.make.com/) to extract channel info, video data, streams, shorts, and search results from YouTube.
+
+<ThirdPartyDisclaimer />
+
## Apify Scraper for YouTube Data
The YouTube Scraper module from [apify.com](https://apify.com) allows you to extract channel, video, streams, shorts, and search data from YouTube.
diff --git a/sources/platform/integrations/workflows-and-notifications/n8n/index.md b/sources/platform/integrations/workflows-and-notifications/n8n/index.md
index 289d314f32..e8c762873b 100644
--- a/sources/platform/integrations/workflows-and-notifications/n8n/index.md
+++ b/sources/platform/integrations/workflows-and-notifications/n8n/index.md
@@ -6,10 +6,14 @@ sidebar_position: 7
slug: /integrations/n8n
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[n8n](https://n8n.io/) is an open source, fair-code licensed tool for workflow automation. With the [Apify integration for n8n](https://github.com/apify/n8n-nodes-apify), you can connect Apify Actors and storage to hundreds of services. You can run scrapers, extract data, and trigger workflows based on Actor or task events.
In this guide, you'll learn how to install the Apify node, set up authentication, and incorporate it into your n8n workflows as either a trigger or an action.
+
+<ThirdPartyDisclaimer />
## Prerequisites
Before you begin, make sure you have:
diff --git a/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md b/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md
index 7dd0c75423..0c7a68ea24 100644
--- a/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md
+++ b/sources/platform/integrations/workflows-and-notifications/n8n/website-content-crawler.md
@@ -7,10 +7,14 @@ slug: /integrations/n8n/website-content-crawler
toc_max_heading_level: 4
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
Website Content Crawler from [Apify](https://apify.com/apify/website-content-crawler) lets you extract text content from websites to feed AI models, LLM applications, vector databases, or Retrieval Augmented Generation (RAG) pipelines. It supports rich formatting using Markdown, cleans the HTML of irrelevant elements, downloads linked files, and integrates with AI ecosystems like LangChain, LlamaIndex, and other LLM frameworks.
To use these modules, you need an [API token](https://docs.apify.com/platform/integrations/api#api-token). You can find your token in the [Apify Console](https://console.apify.com/) under **Settings > Integrations**. After connecting, you can automate content extraction at scale and incorporate the results into your AI workflows.
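Under the hood, the n8n node calls the Apify REST API with that token. As a rough, illustrative sketch only (the `username~actor-name` path form follows the Apify API v2 URL convention; the helper function is ours, not part of the n8n node):

```python
from urllib.parse import quote, urlencode

API_BASE = "https://api.apify.com/v2"

def actor_run_url(actor_id: str, token: str) -> str:
    # Apify API paths address Actors as "username~actor-name".
    path_id = quote(actor_id.replace("/", "~"), safe="~")
    return f"{API_BASE}/acts/{path_id}/runs?{urlencode({'token': token})}"

print(actor_run_url("apify/website-content-crawler", "MY_TOKEN"))
```

POSTing to a URL like this starts a run; the n8n node wraps these calls for you, so you normally never build them by hand.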
+
+<ThirdPartyDisclaimer />
## Prerequisites
Before you begin, make sure you have:
@@ -56,7 +60,6 @@ See the [**Connect** section for n8n self-hosted](#connect-self-hosted) for deta
With authentication set up, you can now create workflows that incorporate the Apify node.
-
## n8n self-hosted setup
This section explains how to install and connect the Apify node when running your own n8n instance.
@@ -96,7 +99,6 @@ If you're running a self-hosted n8n instance, you can install the Apify communit

-
## Website Content Crawler by Apify module
This module provides complete control over the content extraction process, allowing you to fine-tune every aspect of the crawling and transformation pipeline. This module is ideal for complex websites, JavaScript-heavy applications, or when you need precise control over content extraction.
diff --git a/sources/platform/integrations/workflows-and-notifications/slack.md b/sources/platform/integrations/workflows-and-notifications/slack.md
index ed46810a76..6d32321057 100644
--- a/sources/platform/integrations/workflows-and-notifications/slack.md
+++ b/sources/platform/integrations/workflows-and-notifications/slack.md
@@ -6,6 +6,8 @@ sidebar_position: 5
slug: /integrations/slack
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
**Learn how to integrate your Apify Actors with Slack. This article guides you from installation through to automating your whole workflow in Slack.**
A tutorial can be found on [Apify Help](https://help.apify.com/en/articles/6454058-apify-integration-for-slack).
@@ -14,9 +16,10 @@ A tutorial can be found on [Apify Help](https://help.apify.com/en/articles/64540
> Explore the [integration for Slack tutorial](https://help.apify.com/en/articles/6454058-apify-integration-for-slack).
-
[Slack](https://slack.com/) allows you to install various services in your workspace in order to automate and centralize jobs. Apify is one of these services, and it allows you to run your Apify Actors, get notified about their run statuses, and receive your results, all without opening your browser.
+
+<ThirdPartyDisclaimer />
## Get started
To use the Apify integration for Slack, you will need:
diff --git a/sources/platform/integrations/workflows-and-notifications/telegram.md b/sources/platform/integrations/workflows-and-notifications/telegram.md
index 9bb6443288..edeee642c4 100644
--- a/sources/platform/integrations/workflows-and-notifications/telegram.md
+++ b/sources/platform/integrations/workflows-and-notifications/telegram.md
@@ -6,6 +6,8 @@ sidebar_position: 4
slug: /integrations/telegram
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
With [Apify integration for Zapier](https://zapier.com/apps/apify/integrations), you can connect your Apify Actors to Slack, Trello, Google Sheets, Dropbox, Salesforce, and loads more.
Your Zapier workflows can start Apify Actors or tasks, fetch items from a dataset, set and get records from key-value stores, or find Actor or task runs.
@@ -16,6 +18,8 @@ Complementary to the following guide we've created a detailed video, that will g
+
+<ThirdPartyDisclaimer />
## Connect Apify with Zapier
To use the Apify integration on Zapier, you will need to:
diff --git a/sources/platform/integrations/workflows-and-notifications/windmill.md b/sources/platform/integrations/workflows-and-notifications/windmill.md
index f63fad7c1b..d70b187624 100644
--- a/sources/platform/integrations/workflows-and-notifications/windmill.md
+++ b/sources/platform/integrations/workflows-and-notifications/windmill.md
@@ -6,10 +6,14 @@ sidebar_position: 8
slug: /integrations/windmill
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
[Windmill](https://www.windmill.dev/) is an open-source automation platform for building scripts and flows that connect your tools and data. With the Apify integration for Windmill, you can run Actors and tasks, scrape websites, extract data from storage, and trigger workflows based on Apify events.
This guide shows you how to install the Apify package, set up authentication, and create automated workflows that integrate with Apify.
+
+<ThirdPartyDisclaimer />
## Prerequisites
Before you begin, make sure you have:
diff --git a/sources/platform/integrations/workflows-and-notifications/workato.md b/sources/platform/integrations/workflows-and-notifications/workato.md
index edf628df71..86595f873d 100644
--- a/sources/platform/integrations/workflows-and-notifications/workato.md
+++ b/sources/platform/integrations/workflows-and-notifications/workato.md
@@ -6,8 +6,12 @@ sidebar_position: 7
slug: /integrations/workato
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
+[Workato](https://www.workato.com/) is an automation platform where you can build recipes: automated workflows that connect your apps with no-code connectors. With the [Apify Connector](https://apify.com), you can run _Apify Actors_ inside your recipes to launch web scraping and automation jobs, watch for run events, and work with the results.
+
+<ThirdPartyDisclaimer />
## Get started
To use the Apify integration with Workato, you will need:
diff --git a/sources/platform/integrations/workflows-and-notifications/zapier.md b/sources/platform/integrations/workflows-and-notifications/zapier.md
index 9c47fd5f1e..c641ed8531 100644
--- a/sources/platform/integrations/workflows-and-notifications/zapier.md
+++ b/sources/platform/integrations/workflows-and-notifications/zapier.md
@@ -6,12 +6,16 @@ sidebar_position: 1
slug: /integrations/zapier
---
+import ThirdPartyDisclaimer from '@site/sources/_partials/_third-party-integration.mdx';
+
With [Apify integration for Zapier](https://zapier.com/apps/apify/integrations), you can connect your Apify Actors to Slack, Trello, Google Sheets, Dropbox, Salesforce, and loads more.
Your Zapier workflows can start Apify Actors or tasks, fetch items from a dataset, set and get records from key-value stores, or find Actor or task runs.
You can use the Zapier integration to trigger a workflow whenever an Actor or a task finishes.
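For orientation, fetching dataset items, one of the capabilities listed above, boils down to a single GET request against the Apify API. A minimal sketch (the helper function is illustrative; the endpoint and `format` parameter follow the Apify API v2 documentation):

```python
API_BASE = "https://api.apify.com/v2"

def dataset_items_url(dataset_id: str, token: str, fmt: str = "json") -> str:
    # Dataset items can be exported as json, csv, xlsx, and more.
    return f"{API_BASE}/datasets/{dataset_id}/items?format={fmt}&token={token}"

print(dataset_items_url("abc123", "MY_TOKEN"))
```

Zapier makes the same kind of call for you behind the scenes whenever a Zap fetches results from a dataset.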
+
+<ThirdPartyDisclaimer />
## Connect Apify with Zapier
To use the Apify integration on Zapier, you will need to: