
From Internal Experiment to Public Release: Bringing Conversational Payments Data to our Customers via MCP

February 17, 2026

Anthropic announced the Model Context Protocol (MCP) in November 2024 and we immediately perked up. After all, Pagos is an AI-powered payments intelligence platform with a relatively small engineering team; we’re the target audience for this kind of technology. 

By June of 2025, we publicly released our first open source MCP server, giving Pagos users access to our BIN Database through AI chat clients. But internally, we started experimenting with something more ambitious: a hosted MCP server providing access to all Pagos payment data. This was just the beginning.

Testing and Learning

To test our internal MCP server, we connected it to a Pagos account with our own data and took on personas representing our customers’ potential users. Using clients like Cursor and Claude, we asked questions of our data like: “Based on my current transaction data, are there some emerging markets we should focus on?” 

The results were astounding. This simple question sent us down an investigatory rabbit hole that ended with Claude searching LinkedIn for employees of a card issuer in the Philippines we could reach out to for potential partnerships! This is exactly the kind of payments insight and intelligence Pagos was founded on, suddenly accessible via a natural language chat with an AI agent.

We further proved the usefulness of these tools during our internal AI hackathon, where a Pagos engineer expanded our internal MCP server to include Pagos Alerts data. Armed with this data and a few web searches, Claude suggested a detected outage was related to an issue at a downstream service provider rather than some sort of internal error. This saved us—in our persona as a mock Pagos merchant—countless hours looking for an internal issue that didn’t exist. (Note: We don’t yet support Alerts within our public MCP server, but rest assured it’s coming soon.)

These positive experiences (among many others) convinced us it was time to make this method of payments data access available to Pagos customers. 

Announcing: Conversational Access to Harmonized Payments Data via MCP

Today, we released our new MCP server, giving all Pagos customers conversational access to their harmonized payments data via the AI agent of their choosing. The server exposes four core data domains—transactions, disputes, fees, and refunds—each with a consistent interface: 

  • `list_*_metrics` to see what you can query

  • `list_*_dimensions` for available groupings

  • `query_*_data` to actually pull the data

We went with this self-describing pattern so the LLM can introspect what's available rather than relying on capabilities hardcoded in prompts. This introspection also makes it easier for us to introduce new dimensions or metrics over time.

You can slice transaction data by scheme, processor, card type, or issuer, group disputes by reason or status, and even break down fees by category or code. Everything supports date ranges and time granularity (day/week/month/year), and results come back as CSV. There's also a `get_bin_data` tool for BIN lookups, providing LLMs access to Pagos’ extensive BIN database.
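To make the pattern concrete, here's a rough sketch of what one domain's tool trio could look like when declared with the Model Context Protocol C# SDK. The attribute names come from the pre-release SDK, while the metric names, parameter shapes, and return values are illustrative assumptions rather than the actual Pagos tool signatures.

```csharp
using System.ComponentModel;
using System.Threading.Tasks;
using ModelContextProtocol.Server;

// Illustrative only: parameter names, metric names, and return values are
// assumptions, not the actual Pagos tool signatures.
[McpServerToolType]
public static class TransactionTools
{
    [McpServerTool(Name = "list_transactions_metrics")]
    [Description("Lists the transaction metrics available for querying.")]
    public static string ListMetrics() =>
        "approved_count,declined_count,approval_rate,total_volume"; // hypothetical metric names

    [McpServerTool(Name = "list_transactions_dimensions")]
    [Description("Lists the dimensions transactions can be grouped by.")]
    public static string ListDimensions() =>
        "scheme,processor,card_type,issuer"; // dimensions named in this post

    [McpServerTool(Name = "query_transactions_data")]
    [Description("Queries transaction data and returns the results as CSV.")]
    public static Task<string> QueryData(
        [Description("Comma-separated metrics from list_transactions_metrics")] string metrics,
        [Description("Comma-separated dimensions from list_transactions_dimensions")] string dimensions,
        [Description("Inclusive start date, e.g. 2025-01-01")] string startDate,
        [Description("Inclusive end date, e.g. 2025-12-31")] string endDate,
        [Description("day, week, month, or year")] string granularity)
    {
        // The real implementation runs a query against the harmonized Pagos
        // data model; returning a tiny hard-coded CSV keeps this sketch
        // self-contained.
        return Task.FromResult("period,scheme,approval_rate\n2025-01,visa,0.93");
    }
}
```

In practice, an agent answering a question like the emerging-markets one above would first call the `list_*` tools to learn the available vocabulary, then issue one or more `query_*_data` calls.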

Under the Hood

The Pagos MCP server is implemented within our C# modular monolith using the Model Context Protocol C# SDK. Adopting a technology this early meant moving fast and adapting to frequent changes: the C# SDK is still only available in pre-release versions, and we've had to hand-roll parts of the OAuth flow. That said, we're generally satisfied with the SDK.
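For context, wiring the SDK into an ASP.NET Core host only takes a few lines. The sketch below uses extension-method names from the pre-release ModelContextProtocol.AspNetCore package, so treat it as an approximation of our setup rather than a copy of it.

```csharp
// Assumes the pre-release ModelContextProtocol.AspNetCore package; these
// extension methods may change before a stable release.
var builder = WebApplication.CreateBuilder(args);

// Register the MCP server alongside the monolith's other services and
// discover [McpServerToolType] classes in this assembly.
builder.Services
    .AddMcpServer()
    .WithHttpTransport()
    .WithToolsFromAssembly();

var app = builder.Build();

// Expose the MCP endpoint next to the monolith's existing HTTP surface.
app.MapMcp();

app.Run();
```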

The Pagos MCP server uses OAuth with Dynamic Client Registration for authentication and authorization of LLM clients. We're grateful to Clerk, our third-party auth provider, for supporting these use cases.
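Dynamic Client Registration (RFC 7591) is the step that lets an arbitrary LLM client obtain its own OAuth client ID before starting the authorization flow. The sketch below shows roughly what that registration request looks like from a client's perspective; the endpoint URL is hypothetical, since clients discover the real one from the authorization server's metadata.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;

// Sketch of the Dynamic Client Registration (RFC 7591) request an MCP client
// performs before the OAuth flow. The registration URL below is hypothetical.
var http = new HttpClient();

var registration = new
{
    client_name = "My LLM client",
    redirect_uris = new[] { "http://127.0.0.1:33418/callback" },
    grant_types = new[] { "authorization_code" },
    token_endpoint_auth_method = "none", // public client using PKCE
};

var response = await http.PostAsJsonAsync(
    "https://auth.example.com/oauth/register", registration);

// The server replies with a client_id (and possibly a secret) that the client
// then uses in the standard authorization-code flow.
Console.WriteLine(await response.Content.ReadAsStringAsync());
```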

The Pagos MCP server relies on the same platform as the rest of Pagos Insights. Merchants connect their payment processors to Pagos; we then ingest all transactions, disputes, refunds, and fees from each processor and harmonize that data into a single model, enriching it with additional data where possible. Merchants can visualize, filter, and analyze that data within the Pagos Service Panel in near real time. The Pagos MCP server is simply a new way to access this verified and normalized data: via conversation instead of dashboards.
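As a purely illustrative example, you can picture the harmonized model as a single record type that every processor's raw data is mapped into; the field names below are our own sketch, not the actual Pagos schema.

```csharp
using System;

// Purely illustrative: one normalized shape that transactions from any
// connected processor are mapped into. Field names are assumptions, not the
// actual Pagos data model.
public record HarmonizedTransaction(
    string ProcessorName,      // which connected processor the record came from
    string Scheme,             // card scheme, e.g. Visa or Mastercard
    string CardType,           // credit, debit, prepaid, ...
    string? IssuerName,        // enriched from BIN data where possible
    string Status,             // approved, declined, ...
    decimal Amount,
    string Currency,
    DateTimeOffset OccurredAt);
```

Because the MCP tools query this one normalized shape, they work the same way regardless of which processors a merchant has connected.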

Our internal testing taught us a valuable lesson: tool descriptions matter. With barebones internal tool descriptions, the LLMs had to retry tool calls several times and make many introspection calls. Before releasing our public MCP server, we substantially improved the tool descriptions so that most customer queries succeed on the first try. To measure that improvement, we created a set of evals: 

  1. We built a GitHub workflow that ran our MCP server and dumped its tool schema into a JSON file.

  2. We loaded this tool schema into a separate process that ran a set of predefined prompts. 

  3. We scored performance on each prompt based on (a) whether the model called the correct tools with the correct arguments, and (b) how many introspection calls it needed. 

We run these evals in CI whenever we modify the tool descriptions.
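In simplified form, the scoring step looks something like the sketch below. The `runPromptAsync` delegate is a stand-in for the part of the harness that sends a prompt (plus the dumped tool schema) to an LLM and records the tool calls it makes, and the penalty weight for extra introspection calls is an arbitrary illustrative value.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.Json;
using System.Threading.Tasks;

// Simplified sketch of the eval scoring; not the actual Pagos harness.
record EvalCase(string Prompt, string ExpectedTool, Dictionary<string, string> ExpectedArgs);
record ToolCall(string Name, Dictionary<string, string> Args);

static class ToolDescriptionEvals
{
    public static async Task<double> ScoreAsync(
        string toolSchemaPath,
        IReadOnlyList<EvalCase> cases,
        Func<string, JsonDocument, Task<IReadOnlyList<ToolCall>>> runPromptAsync)
    {
        // Load the tool schema dumped by the GitHub workflow.
        using var schema = JsonDocument.Parse(await File.ReadAllTextAsync(toolSchemaPath));
        double total = 0;

        foreach (var evalCase in cases)
        {
            var calls = await runPromptAsync(evalCase.Prompt, schema);

            // (a) Did the model call the expected tool with the expected arguments?
            bool correct = calls.Any(c =>
                c.Name == evalCase.ExpectedTool &&
                evalCase.ExpectedArgs.All(kv =>
                    c.Args.TryGetValue(kv.Key, out var value) && value == kv.Value));

            // (b) Penalize extra introspection round-trips (the list_* tools);
            // the 0.1 weight is an arbitrary illustrative choice.
            int introspectionCalls = calls.Count(c => c.Name.StartsWith("list_"));

            total += (correct ? 1.0 : 0.0) - 0.1 * introspectionCalls;
        }

        return cases.Count == 0 ? 0 : total / cases.Count;
    }
}
```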

Test the Pagos MCP Server Today

The Pagos MCP server is available now to any user with a Pagos account and connected payment data sources. Don't have payment data connected yet? You can still access Pagos' extensive BIN database through the server while you get set up. 

We're excited to see how our customers use conversational access to transform their payments operations. Oh, and we're only just getting started; expect more Pagos capabilities through MCP soon. For instructions on how to set up the server within your own LLM client, see the Pagos Product Documentation.
