Providers

Integrate your MCP servers with popular AI SDKs and frameworks.

Composio MCP servers only support Streamable HTTP transport.

Anthropic SDK

Use MCP servers with the Anthropic Claude API.

from anthropic import Anthropic
from composio import Composio

# Initialize clients
composio = Composio()
anthropic = Anthropic(api_key="your-anthropic-api-key")

# Create MCP server with GitHub and Linear tools
server = composio.mcp.create(
    name="dev-workflow-server",
    toolkits=[
        {"toolkit": "github", "auth_config": "ac_github_id"},
        {"toolkit": "linear", "auth_config": "ac_linear_id"}
    ],
    allowed_tools=["GITHUB_LIST_PRS", "GITHUB_CREATE_COMMENT", "LINEAR_CREATE_ISSUE"]
)

# Generate MCP instance for user
instance = server.generate("user@example.com")

# Use MCP with Anthropic to manage development workflow
response = anthropic.beta.messages.create(
    model="claude-sonnet-4-5",
    system="You are a helpful assistant with access to GitHub and Linear tools. Use these tools to help manage development workflows. Do not ask for confirmation before using the tools.",
    max_tokens=1000,
    messages=[{
        "role": "user",
        "content": "Check my GitHub PRs for review comments, create Linear tasks for any requested changes, and update the PR descriptions with task links"
    }],
    mcp_servers=[{
        "type": "url",
        "url": instance['url'],
        "name": "composio-mcp-server"
    }],
    betas=["mcp-client-2025-04-04"]  # Enable MCP beta
)

print(response.content)
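Note that `response.content` is a list of typed content blocks, not a plain string: text blocks carry the model's prose, while MCP tool-use blocks record which tools the model invoked. A minimal sketch of pulling out just the text — the dicts below are illustrative stand-ins for the SDK's block objects (which expose `.type`/`.text` attributes rather than keys):

```python
def extract_text(content_blocks):
    """Join the text from 'text'-type blocks, skipping tool-use blocks."""
    return "\n".join(
        block["text"] for block in content_blocks if block.get("type") == "text"
    )

# Illustrative stand-ins for the blocks in `response.content`
blocks = [
    {"type": "mcp_tool_use", "name": "GITHUB_LIST_PRS", "input": {}},
    {"type": "text", "text": "Here are your open PRs."},
]
print(extract_text(blocks))  # → Here are your open PRs.
```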
The same workflow in TypeScript, using `@anthropic-ai/sdk` and `@composio/core` (this variant targets Google Sheets tools):

import Anthropic from '@anthropic-ai/sdk';
import { Composio } from '@composio/core';

// Initialize clients
const composio = new Composio();
const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// Create MCP server with Google Sheets tools
const server = await composio.mcp.create("analytics-server", {
  toolkits: [
    { toolkit: "googlesheets", authConfigId: "ac_sheets_id" }
  ],
  allowedTools: ["GOOGLESHEETS_GET_DATA", "GOOGLESHEETS_UPDATE_DATA", "GOOGLESHEETS_CREATE_SHEET"]
});

// Generate MCP instance for user
const instance = await server.generate("user@example.com");

// Use MCP with Anthropic for spreadsheet operations
const response = await anthropic.beta.messages.create({
  model: "claude-sonnet-4-5",
  system: "You are a helpful assistant with access to Google Sheets tools. Use these tools to analyze and manage spreadsheet data. Do not ask for confirmation before using the tools.",
  max_tokens: 1000,
  messages: [{
    role: "user",
    content: "Analyze the sales data in my Google Sheets 'Q4 Revenue' spreadsheet, calculate month-over-month growth, and add a new summary sheet with visualizations"
  }],
  mcp_servers: [{
    type: "url",
    url: instance.url,
    name: "composio-mcp-server"
  }],
  betas: ["mcp-client-2025-04-04"] // Enable MCP beta
});

console.log(response.content);

OpenAI SDK

Integrate MCP servers with OpenAI GPT models.

from openai import OpenAI
from composio import Composio

# Initialize clients
composio = Composio()
openai = OpenAI(api_key="your-openai-api-key")

# Create MCP server with Google Sheets and Notion tools
server = composio.mcp.create(
    name="data-docs-server",
    toolkits=[
        {"toolkit": "googlesheets", "auth_config": "ac_sheets_id"},
        {"toolkit": "notion", "auth_config": "ac_notion_id"}
    ],
    allowed_tools=["GOOGLESHEETS_GET_DATA", "GOOGLESHEETS_UPDATE_DATA", "NOTION_CREATE_PAGE"]
)

# Generate MCP instance for user
instance = server.generate("user@example.com")

# Use MCP with OpenAI for data management
response = openai.responses.create(
    model="gpt-5",
    tools=[
        {
            "type": "mcp",
            "server_label": "composio-server",
            "server_description": "Composio MCP server with Google Sheets and Notion integrations",
            "server_url": instance['url'],
            "require_approval": "never",
        },
    ],
    input="Export the Q4 metrics from Google Sheets and create a comprehensive Notion page with charts and analysis",
)

print("OpenAI MCP Response:", response.output_text)
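Because `server.generate` takes a user ID, one server configuration can back many users, each with their own instance URL. A hedged sketch of building the Responses API `tools` entry for a user's instance — `mcp_tool_entry` and the placeholder URL are illustrative, not part of either SDK:

```python
def mcp_tool_entry(instance, label="composio-server"):
    """Build the Responses API MCP tool entry for one user's instance."""
    return {
        "type": "mcp",
        "server_label": label,
        "server_url": instance["url"],
        "require_approval": "never",
    }

# Stand-in for server.generate(user_id); real instances carry a per-user URL
instance = {"url": "https://example.invalid/mcp/alice"}
tools = [mcp_tool_entry(instance)]
```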
The same workflow in TypeScript (this variant targets Linear and Notion tools):

import OpenAI from 'openai';
import { Composio } from '@composio/core';

// Initialize clients
const composio = new Composio();
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// Create MCP server with Linear and Notion tools
const server = await composio.mcp.create("project-docs-server", {
  toolkits: [
    { toolkit: "linear", authConfigId: "ac_linear_id" },
    { toolkit: "notion", authConfigId: "ac_notion_id" }
  ],
  allowedTools: ["LINEAR_LIST_ISSUES", "LINEAR_GET_ISSUE", "NOTION_CREATE_PAGE"]
});

// Generate MCP instance for user
const instance = await server.generate("user@example.com");

// Use MCP with OpenAI for project documentation
const response = await openai.responses.create({
  model: "gpt-5",
  tools: [
    {
      type: "mcp",
      server_label: "composio-server",
      server_description: "Composio MCP server with Linear and Notion integrations",
      server_url: instance.url,
      require_approval: "never",
    },
  ],
  input: "Find all completed Linear issues from this sprint and create a Notion page documenting the release notes",
});

console.log("OpenAI MCP Response:", response.output_text);
@see[source](https://github.com/nodejs/node/blob/v24.x/lib/console.js)
console
.Console.log(message?: any, ...optionalParams: any[]): void (+1 overload)
Prints to `stdout` with newline. Multiple arguments can be passed, with the first used as the primary message and all additional used as substitution values similar to [`printf(3)`](http://man7.org/linux/man-pages/man3/printf.3.html) (the arguments are all passed to [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args)). ```js const count = 5; console.log('count: %d', count); // Prints: count: 5, to stdout console.log('count:', count); // Prints: count: 5, to stdout ``` See [`util.format()`](https://nodejs.org/docs/latest-v24.x/api/util.html#utilformatformat-args) for more information.
@sincev0.1.100
log
("OpenAI MCP Response:",
const response: OpenAI.Responses.Response & {
    _request_id?: string | null;
}
response
.Response.output_text: stringoutput_text);
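With `require_approval: "never"`, the MCP tool calls the model made appear alongside the final message in `response.output`. A minimal sketch of summarizing those calls for logging, assuming the `mcp_call` item type and its `name`/`server_label` fields from the OpenAI Responses API docs (verify the exact shapes against the current API reference):

```typescript
// Minimal shape for MCP-related entries in response.output.
// The `mcp_call` type name and its fields follow the OpenAI
// Responses API docs; treat them as an assumption and verify.
interface OutputItem {
  type: string;
  name?: string;         // tool name, present on mcp_call items
  server_label?: string; // which MCP server handled the call
  output?: string;       // stringified tool result
}

// Collect the MCP tool calls the model made, grouped by server label.
function summarizeMcpCalls(output: OutputItem[]): Record<string, string[]> {
  const summary: Record<string, string[]> = {};
  for (const item of output) {
    if (item.type !== "mcp_call" || !item.name) continue;
    const label = item.server_label ?? "unknown";
    (summary[label] ??= []).push(item.name);
  }
  return summary;
}

// Example with a mocked output array:
const calls = summarizeMcpCalls([
  { type: "mcp_list_tools", server_label: "composio-server" },
  { type: "mcp_call", name: "LINEAR_CREATE_ISSUE", server_label: "composio-server", output: "{}" },
  { type: "message" },
]);
console.log(calls); // { "composio-server": ["LINEAR_CREATE_ISSUE"] }
```

In a real run you would pass `response.output` instead of the mocked array; this is useful for auditing which Composio tools the model actually invoked.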

Mastra

Use MCP servers with Mastra framework.

import { MCPClient } from "@mastra/mcp";
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";
import { Composio } from "@composio/core";

// Initialize Composio
const composio = new Composio();

// Create MCP server with GitHub, Linear, and Notion tools
const server = await composio.mcp.create("dev-automation-server", {
  toolkits: [
    { toolkit: "github", authConfigId: "ac_github_id" },
    { toolkit: "linear", authConfigId: "ac_linear_id" },
    { toolkit: "notion", authConfigId: "ac_notion_id" }
  ],
  allowedTools: [
    "GITHUB_LIST_ISSUES",
    "GITHUB_CREATE_ISSUE",
    "LINEAR_CREATE_ISSUE",
    "LINEAR_UPDATE_ISSUE",
    "NOTION_CREATE_PAGE",
    "NOTION_UPDATE_PAGE"
  ]
});

// Generate MCP instance for user
const instance = await server.generate("user@example.com");

// Create MCP client with the Composio server
export const mcpClient = new MCPClient({
  id: "composio-mcp-client",
  servers: {
    composio: { url: new URL(instance.url) },
  },
});

// Create a development workflow agent
export const devAgent = new Agent({
  id: "dev-assistant",
  name: "Dev Assistant",
  description: "AI assistant for development workflow automation",
  instructions: "Help manage GitHub repos, Linear issues, and Notion documentation.",
  model: openai("gpt-4-turbo"),
  tools: await mcpClient.getTools()
});

// Example: Automate development workflow
(async () => {
  const response = await devAgent.generate(
    "Review open GitHub issues, create Linear tasks for bugs labeled 'priority', and update the Notion roadmap page"
  );
  console.log(response.text);
})();
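`mcpClient.getTools()` returns a flat record of every tool the connected servers expose. If you want the agent to see only a subset, you can filter that record before passing it to `tools:`. A hedged sketch with a generic helper (the `pickTools` function is illustrative, not a Mastra API, and the exact tool-key format depends on the MCP server, so match against the keys you actually observe):

```typescript
// Keep only tools whose names appear in an allowlist. getTools()
// returns a Record<string, tool>, so this is plain object filtering.
function pickTools<T>(
  all: Record<string, T>,
  allowed: string[]
): Record<string, T> {
  const picked: Record<string, T> = {};
  for (const name of allowed) {
    if (name in all) picked[name] = all[name];
  }
  return picked;
}

// Example with placeholder tool objects standing in for real tools:
const allTools = {
  GITHUB_LIST_ISSUES: { run: () => "issues" },
  NOTION_CREATE_PAGE: { run: () => "page" },
};
const readOnly = pickTools(allTools, ["GITHUB_LIST_ISSUES"]);
console.log(Object.keys(readOnly)); // ["GITHUB_LIST_ISSUES"]
```

In the agent above, `tools: pickTools(await mcpClient.getTools(), [...])` would then restrict which tools the model can call on top of the server-side `allowedTools` list.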