Enterprise Security & Architecture

Querex MCP servers connect your Power BI and SSAS semantic models to leading AI platforms using the Model Context Protocol. Your data stays where it is—we just make it conversational.

⚡ Executive Summary

  • Querex MCP servers let AI tools query your existing SSAS / Power BI semantic models without copying data
  • Recommended deployment: local STDIO on your hardened enterprise machines — no network, no cloud compute
  • For cloud AI, we align with your existing data classification policy (Tier 1–3)
  • Security model mirrors Microsoft-approved connectors (e.g., SharePoint) for ChatGPT Enterprise / Copilot
  • Your data governance policy determines which AI platforms are appropriate — we just provide the bridge

Querex develops multiple MCP servers—each optimized for different analytical domains. All can run locally on your computer in STDIO mode, or if you prefer, on your own server or cloud infrastructure.

Querex MCP Server Portfolio

🗄️ SSAS MCP Server

Power BI, Fabric, SQL Tabular models

📊 Quantitative Finance MCP

Portfolio optimization, risk modeling, derivatives

📈 Statistical Analysis MCP

Hypothesis testing, regression, forecasting

🔬 Operations Research MCP

Linear programming, optimization, simulation

🔍 Data Flow Summary: Where Does Your Data Go?

Understanding what stays in your infrastructure vs. what reaches the AI provider is critical for security discussions:

  • Query: Natural language questions flow from AI client → MCP server (in your infrastructure)
  • Your Data: Semantic models, databases, and raw data never leave your infrastructure
  • Results: Numerical/textual query outputs return to the AI provider (same pattern as SharePoint/Teams connectors)

Key Point: This is the same data flow as Microsoft-approved connectors like SharePoint, GitHub, or Teams. Your files stay in SharePoint, but file contents appear in AI conversations. With SSAS MCP, your semantic models stay in your infrastructure, but query results appear in conversations.
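
The boundary can be made concrete with a short sketch. This is illustrative only: the names (`LOCAL_MODEL`, `run_local_query`, `mcp_tool_response`) are hypothetical stand-ins, not the actual SSAS MCP Server API.

```python
# Hypothetical sketch of the data-flow boundary described above.
# The semantic model stays local; only result rows cross to the AI provider.

LOCAL_MODEL = {  # lives in your infrastructure, never transmitted
    "sales_by_region": [("EMEA", 1_200_000), ("APAC", 950_000)],
}

def run_local_query(measure: str) -> list:
    """Execute a query against the local semantic model."""
    return LOCAL_MODEL[measure]

def mcp_tool_response(measure: str) -> dict:
    """Build the MCP tool result.

    This payload (the aggregated rows, not the model itself) is the
    only thing that appears in the AI conversation.
    """
    return {"measure": measure, "rows": run_local_query(measure)}
```

The AI provider sees the same kind of content it would see from a SharePoint connector: the retrieved result, not the underlying store.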

Flexible Deployment Options

Option 1: Local STDIO (Recommended)

```mermaid
graph TB
    subgraph local["🖥️ Your Hardened Computer"]
        A["⚙️ MCP Server<br/>(STDIO Mode)"]
        D["📊 Your Data Sources<br/>Power BI / SSAS / Files"]
    end
    B["🔧 VS Code / Claude Desktop<br/>(AI Client)"]
    B <-->|"stdin/stdout<br/>Zero Network Latency"| A
    A <-->|"Local Queries"| D
    style A fill:#00A7B5,stroke:#008A96,stroke-width:3px,color:#fff
    style B fill:#E8E3F0,stroke:#00A7B5,stroke-width:2px
    style D fill:#D4E8E8,stroke:#00A7B5,stroke-width:2px
    style local fill:#F5F1ED,stroke:#00A7B5,stroke-width:2px,stroke-dasharray: 5 5
```

✅ Benefits:

  • Zero compute costs (uses your local CPU)
  • Blazing fast performance (no network latency)
  • Complete privacy (nothing leaves your computer)
  • No server management required
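
For Claude Desktop, STDIO mode is registered in `claude_desktop_config.json` under the `mcpServers` key. A minimal sketch, assuming a hypothetical binary path and connection flag (check your server's actual command-line options):

```json
{
  "mcpServers": {
    "ssas-mcp": {
      "command": "C:\\Tools\\ssas-mcp-server.exe",
      "args": ["--connection", "localhost\\TABULAR"]
    }
  }
}
```

Claude Desktop launches the process itself and talks to it over stdin/stdout; no port is opened and no HTTP endpoint exists.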

Option 2: Remote HTTP Server

```mermaid
graph TB
    A["🌐 AI Platforms<br/>Claude / ChatGPT / Copilot"]
    B["🔒 MCP Server<br/>(HTTPS + OAuth)"]
    C["☁️ Your Cloud<br/>Azure / AWS / GCP"]
    D["📊 Data Sources<br/>Power BI / SSAS"]
    A <-->|"HTTPS<br/>API Key / OAuth"| B
    B -.->|"Hosted on"| C
    B <-->|"Private Network"| D
    style A fill:#E8E3F0,stroke:#00A7B5,stroke-width:2px
    style B fill:#00A7B5,stroke:#008A96,stroke-width:3px,color:#fff
    style C fill:#D4E8E8,stroke:#00A7B5,stroke-width:2px
    style D fill:#E8DED2,stroke:#00A7B5,stroke-width:2px
```

✅ Benefits:

  • Centralized server for team collaboration
  • Auto-scaling based on demand
  • Accessible from web-based AI interfaces
  • IAM-based access control
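
In HTTP mode, the minimum bar is an API-key check on every request. A minimal sketch, assuming a hypothetical header name (`X-Api-Key`) and a key loaded from your secret manager; this is not the shipped implementation:

```python
import hmac

# In production, load this from Azure Key Vault / AWS Secrets Manager,
# never from source code. The literal here is a placeholder.
EXPECTED_KEY = "key-from-your-secret-manager"

def is_authorized(headers: dict) -> bool:
    """Return True only when the request carries the expected API key.

    hmac.compare_digest performs a constant-time comparison, which
    avoids leaking key prefixes through response-timing differences.
    """
    presented = headers.get("X-Api-Key", "")
    return hmac.compare_digest(presented, EXPECTED_KEY)
```

OAuth 2.0 replaces the shared key with per-user tokens, but the gate sits in the same place: before any query reaches your data sources.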

Why Querex Prioritizes Local STDIO Deployment

💻

Save on Compute Costs

Most modern computers are extremely powerful. Why pay for cloud compute when your laptop can handle blazing-fast queries locally?

⚡

Minimal Dependencies

No external runtime dependencies or third-party services—everything runs in a single self-contained binary. Each query executes in milliseconds—perfect for conversational AI interactions.

🔒

Maximum Security

In regulated environments (finance, defense, health, banking), endpoints are typically hardened according to your internal security standards. Data never leaves your machine in STDIO mode.

🌉

We're Just the Bridge

We don't replace your analytics stack or your AI providers. We bridge them. Same data, same infrastructure—new conversational interface.

🎯 The Querex Philosophy

We don't decide what data goes to cloud AI. Your data classification policy does. MCP is just a standardized protocol that bridges AI assistants to your existing analytical capabilities—Power BI semantic models, quantitative finance libraries, statistical packages, or operations research solvers.

By running locally in STDIO mode, analyses happen on machines already compliant with your security standards. Nothing changes about your infrastructure—you gain conversational access to what you already have.

SSAS MCP Server uses the Model Context Protocol (MCP)—an open standard for connecting AI assistants to data sources. This means you're not locked into a single AI vendor.

| AI Platform | Availability | Data Governance | Data Location | Best For |
|---|---|---|---|---|
| Claude Enterprise (Anthropic) | Plus, Pro, Team, Enterprise | ✅ No training on data; 🇪🇺 EU servers available | 🇪🇺 EU (available), 🇺🇸 US | European companies requiring GDPR compliance and EU data residency |
| ChatGPT Enterprise (OpenAI) | Plus, Pro, Business, Enterprise, Edu | ✅ No training on data; RBAC controls | 🇺🇸 US (primary); US data residency | Organizations with US data residency, advanced Deep Research capabilities |
| Microsoft Copilot (Azure OpenAI) | Microsoft 365, Copilot Studio | ✅ Microsoft enterprise agreements | Regional Azure data centers | Organizations already using the Microsoft 365 ecosystem |

⚡ Key Insight: Same Security Model as Microsoft SharePoint Connectors

When deployed with TLS, OAuth 2.0, and within your existing Azure / Microsoft 365 security perimeter, SSAS MCP follows the same data flow and access pattern as Microsoft-approved connectors such as SharePoint, GitHub, or Teams. The only difference is the data source—instead of files in SharePoint, you're querying Power BI semantic models.

For ChatGPT Business / Enterprise

  • ✅ No training on your data (contractual guarantee)
  • ✅ Admin controls with RBAC for granular permissions
  • ✅ Query-on-demand (no data pre-indexing)
  • ✅ Deep Research mode for complex analytical reports
  • ✅ Full audit trail and compliance controls
  • ✅ Admin-managed deployment across workspace

For Claude Enterprise (EU)

  • ✅ No training on your data (contractual guarantee)
  • 🇪🇺 EU server hosting (GDPR compliance)
  • ✅ Data Processing Agreement (DPA) available
  • ✅ Enterprise-grade security controls
  • ✅ SOC 2 Type II certified
  • ✅ Regular security audits

For Microsoft Copilot Studio

  • ✅ Microsoft 365 enterprise agreements
  • ✅ Azure AD integration
  • ✅ Power Platform admin controls
  • ✅ Data Loss Prevention (DLP) policies
  • ✅ Regional Azure data centers
  • ✅ Compliance: ISO 27001, SOC 2, GDPR

We don't decide what can go to cloud AI. Your data classification policy does—we just align SSAS MCP to it.

The key question isn't just whether SSAS MCP is technically secure—it's which data classification tiers your organization approves for cloud AI.

Tier 1

Public Cloud AI

(Claude Enterprise, ChatGPT Enterprise, Microsoft Copilot)

✅ Acceptable Data Types:

  • Market research and trends
  • General business analytics
  • Non-client specific portfolio performance
  • Operational metrics
  • Public financial data

Recommendation: Use SSAS MCP with ChatGPT Enterprise or Claude Enterprise (EU)

Tier 2

Private Cloud AI

(Azure OpenAI with Private Endpoints)

✅ Acceptable Data Types:

  • Client portfolio analysis
  • Risk modeling
  • Trading analytics
  • Internal performance metrics

Recommendation: Wait for Microsoft's native Copilot-SSAS integration or use Azure OpenAI with APIM

Tier 3

On-Premise Only

(No cloud AI)

❌ Restricted Data Types:

  • M&A deal modeling
  • Regulatory reporting
  • Data under specific NDAs
  • Chinese Wall protected information
  • Client-confidential financial structures

Recommendation: For Tier 3 data, use on-premise AI only (local NPU-based models or private GPU clusters). SSAS MCP integrates with those environments without exposing data to public cloud AI.

💡 This is a Universal AI Question

This data classification challenge applies to all cloud AI services—not just SSAS MCP. Whether you use ChatGPT, Claude, Microsoft Copilot, or Google Gemini, you must decide which data tiers are acceptable for cloud AI processing. SSAS MCP simply provides the bridge; your data governance policy determines which AI platforms are approved.

Microsoft Copilot Studio natively supports the Model Context Protocol (MCP), allowing you to create custom AI agents that consume SSAS MCP Server or any other MCP-compatible data source.

Demo: Creating an HR agent that consumes a .NET-based MCP server published over HTTP. This video is part of the Copilot Developer Camp labs.

How to Connect SSAS MCP to Copilot Studio

  1. Deploy SSAS MCP Server

    Host your SSAS MCP Server on Azure Container Apps, Azure Functions, or your own infrastructure with HTTP endpoints.

  2. Create Custom Connector

    In Copilot Studio, navigate to Tools → Add a tool → Model Context Protocol. Use the MCP onboarding wizard to configure your server URL and authentication.

  3. Configure Authentication

    Choose from: None, API Key (header or query), or OAuth 2.0 (Dynamic discovery, Dynamic, or Manual).

  4. Test Connection

    Use the built-in test functionality to verify your SSAS MCP Server responds correctly to queries.

  5. Publish to Workspace

    For Business/Enterprise users, publish the connector to make it available to your entire workspace.

  6. Create Agent

    Build your custom Copilot agent that can query Power BI semantic models through natural conversation.

Additional Resources

📚 Microsoft Documentation

Add MCP Server Tools to Your Agent

🔧 Technical Samples

Copilot Studio Samples on GitHub

🎓 Developer Camp

Try Copilot Studio (Free Trial)

Deployment Architecture Options

✅ Recommended: Local STDIO Mode

Your Computer (Hardened): VS Code / Claude Desktop ↔ MCP Server (STDIO)

The MCP server runs entirely on your local machine. Communication happens through standard input/output (stdin/stdout)—no network connections, no HTTP endpoints, no cloud compute costs. Perfect for finance, defense, health, or banking environments where data must remain on certified, hardened machines.

Supported AI Clients:

  • ✅ VS Code with GitHub Copilot, Claude, or custom endpoints
  • ✅ Claude Desktop (official Anthropic app)
  • ✅ Cline VS Code extension
  • ✅ Any MCP-compatible local client

Best for: Maximum security, zero cloud costs, blazing-fast performance, regulated industries

✅ Supported: Public HTTP Endpoint

Internet → MCP Server (HTTPS + Auth) → Your Data Sources

Deploy your MCP server as an HTTP service (Streamable HTTP transport) with authentication. Accessible from web-based AI interfaces like ChatGPT.com or Claude.ai. Can be hosted on Azure, AWS, Google Cloud Run, or your own servers.

Deployment Options:

  • ✅ Google Cloud Run (deploy in under 10 minutes)
  • ✅ Azure Container Apps or Azure Functions
  • ✅ AWS Lambda or ECS
  • ✅ Your own Docker container infrastructure

Best for: Team collaboration, web-based AI interfaces, centralized management

✅ Supported: Private Azure/AWS Endpoints

VS Code / Copilot Studio → Private Azure/AWS Endpoint → MCP Server (Private VNet)

Deploy MCP server in your private VNet and connect through custom Azure OpenAI or AWS endpoints. Configure VS Code, Copilot Studio, or Azure AI Foundry to use your private endpoint configuration.

Best for: Enterprise security requirements, complete network isolation

✅ Supported: Azure API Management (APIM)

AI Clients → Azure APIM (with Private Link) → MCP Server → Data Sources

Route all MCP requests through Azure API Management for additional security layers: IP restrictions, rate limiting, Microsoft Entra ID authentication, managed identities, DDoS protection.

Best for: Enterprise governance, advanced security controls, Azure-native deployments

🔒 Private Network Deployments

For maximum security, deploy SSAS MCP Server in a private network and connect through:

  • VS Code with Custom Azure OpenAI Endpoints (private VNet supported)
  • Microsoft Copilot Studio (private network connectors)
  • Azure AI Foundry (private endpoint support)
  • GitHub Copilot (custom model endpoint configuration)

For public deployments, secure your SSAS MCP Server with:

  • API Key authentication (minimum requirement)
  • OAuth 2.0 for user-specific access control
  • IP allowlisting (if your AI platform provides static IPs)
  • Rate limiting to prevent abuse
  • TLS/HTTPS for all traffic (required)
  • Audit logging for compliance tracking
  • Secret management via Azure Key Vault / AWS Secrets Manager (store all credentials securely)
  • Log hygiene configured to exclude sensitive query parameters and PII from logs

Best Practice: Store all credentials (API keys, OAuth secrets, database passwords) in your existing secret management solution (e.g., Azure Key Vault, AWS Secrets Manager), and configure MCP server logs to redact sensitive values and personally identifiable information.
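
The log-hygiene point can be sketched with Python's standard `logging` filters. The redaction patterns below are illustrative examples, not a complete PII catalogue; extend them to match your own data classification:

```python
import logging
import re

# Matches key=value / key: value pairs whose key looks credential-like.
SENSITIVE = re.compile(r"(api[_-]?key|password|secret|token)\s*[=:]\s*\S+",
                       re.IGNORECASE)

class RedactingFilter(logging.Filter):
    """Scrub credential-like values from log records before they are emitted."""

    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = SENSITIVE.sub(r"\1=[REDACTED]", str(record.msg))
        return True  # keep the record, just with secrets removed

logger = logging.getLogger("mcp")
logger.addFilter(RedactingFilter())
```

A filter attached to the logger runs before every handler downstream, so one attachment point covers console and file logs alike.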

Q: Does my data leave my infrastructure?

Your semantic model data stays on your infrastructure. When you ask a question, SSAS MCP Server queries your models and returns results. Those results appear in the AI conversation, which is processed by the AI provider (Anthropic, OpenAI, or Microsoft). This is the same as using SharePoint connectors—your files stay in SharePoint, but file contents appear in conversations.

Q: Will AI providers train models on my data?

For Enterprise customers: No. Both ChatGPT Business/Enterprise and Claude Enterprise have contractual guarantees that your data is not used for model training. For Plus/Pro users, check your "Improve the model for everyone" settings.

Q: Can I use SSAS MCP with Azure OpenAI private endpoints?

Yes! When using development tools like VS Code, Copilot Studio, or Azure AI Foundry, you can configure custom Azure OpenAI endpoints, including private endpoints in your Azure VNet. This allows your SSAS MCP Server to remain completely private within your network.

However, the public web interfaces (ChatGPT.com, Claude.ai) cannot reach private network resources. For private deployments, use VS Code with custom endpoint configuration or Microsoft Copilot Studio.

Q: How does this compare to Microsoft's native Copilot-SSAS integration?

Microsoft is building native integration between Copilot and Power BI semantic models. When released, that will offer the tightest integration within the Microsoft ecosystem. SSAS MCP Server's advantage is multi-platform support—you can use Claude, ChatGPT, and future MCP-compatible AI platforms, not just Microsoft Copilot.

Q: What compliance certifications does SSAS MCP Server have?

SSAS MCP Server itself is infrastructure—compliance depends on how you deploy it and which AI platforms you connect to. If you deploy on Azure, you inherit Azure's compliance certifications (ISO 27001, SOC 2, GDPR, etc.). The AI platforms (Claude, ChatGPT, Copilot) each have their own certifications.

Q: Who should NOT use cloud AI with SSAS MCP?

Organizations with strict network isolation requirements should avoid cloud AI—but that doesn't mean giving up AI entirely. For Tier 3 data (M&A deals, Chinese Wall protected data, specific NDAs), you have powerful on-premise options:

  • Local NPU-Powered LLMs: Modern Windows laptops with AMD Ryzen™ AI or Intel Core Ultra NPUs can run local LLMs using FastFlowLM. These run entirely on-device with zero cloud connectivity—perfect for air-gapped, hardened machines in finance, defense, or healthcare.
  • Private GPU Clusters: Organizations can deploy modern 8+ GPU clusters to run state-of-the-art LLMs completely within your private network. (Hardware examples and pricing vary; contact us for current recommendations.)
  • MCP Works Everywhere: Querex MCP servers work identically with local LLMs, private clusters, or cloud AI—you choose the deployment that matches your security requirements.

Bottom line: Tier 3 data doesn't mean "no AI." It means on-premise AI using local NPUs, private GPU clusters, or hardened infrastructure that never connects to the internet.

Ready to Discuss Your Security Requirements?

Schedule a 30–60 minute technical consultation to determine the right deployment approach for your organization.

What We'll Cover:

  • ✅ Map your existing data classification tiers to MCP deployment options
  • ✅ Identify which AI platforms are safe for Tier 1 & 2 data
  • ✅ Decide whether local STDIO, private endpoints, or public HTTP best fit your risk profile
  • ✅ Review compliance requirements (SOC 2, GDPR, ISO 27001, industry-specific)
  • ✅ Discuss on-premise AI options for Tier 3 data (local NPU / private GPU clusters)