apprentice: building an MCP-native app from day one

designed as a set of tools that AI clients call — not a web app with an AI integration bolted on.

package: mcp server build sprint · client: apprentice — reducibl internal product · industry: education / consumer AI · timeline: 1 week

the situation

apprentice is an AI-powered art study tool with 25,000+ enriched masterworks. the goal was to build it as an MCP-native app from day one — not a web app that later gets an AI integration, but a system designed from the ground up to be used through AI clients.

the dataset and enrichment pipeline were already built (that story here). the question was: how do you turn a curated dataset into a production MCP server with auth, scoping, and multi-client support — in a week?

what we did

four MCP tools, each with scope-based access control through gatewaystack.

each tool declares its required scopes (read vs write). we applied the same gatewaystack identity layer used in inner — OAuth via Auth0, scope-based access control, per-user data isolation in firestore, audit logging. the governance setup took hours instead of days because the patterns were already proven.
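
to make the pattern concrete, here is a minimal sketch of how a tool handler can declare the scopes it needs and have them checked against the caller's token before it runs. the tool names, the require_scopes decorator, and the context shape are illustrative assumptions, not gatewaystack's or the MCP SDK's actual interfaces.

```python
from functools import wraps

class ScopeError(Exception):
    """raised when the caller's token is missing a required scope."""

def require_scopes(*needed):
    """declare the scopes a tool needs and enforce them against the caller's token."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(ctx, *args, **kwargs):
            granted = set(ctx["token_scopes"])   # scopes from the verified Auth0 access token
            missing = set(needed) - granted
            if missing:
                raise ScopeError(f"missing scopes: {sorted(missing)}")
            return fn(ctx, *args, **kwargs)
        wrapper.required_scopes = set(needed)    # advertised alongside the tool definition
        return wrapper
    return decorator

@require_scopes("artworks:read")
def search_artworks(ctx, query: str, limit: int = 10):
    """hypothetical read-only tool: query the enriched masterworks dataset."""
    ...

@require_scopes("studies:write")
def save_study_note(ctx, artwork_id: str, note: str):
    """hypothetical write tool: store a note in the caller's own firestore partition."""
    ...
```

the point of the pattern: the scope declaration lives next to the tool, so adding a new tool means adding one decorator line rather than new auth plumbing.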

deployed to cloud run, submitted as a chatgpt app, and launched at learnart.app.
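
for reference, a minimal sketch of what the cloud run entrypoint can look like. cloud run injects a PORT environment variable that the container must listen on; the apprentice.server module, its app object, and the use of uvicorn here are assumptions for illustration rather than the actual deployment.

```python
# minimal cloud run entrypoint sketch: the `apprentice.server` module and its ASGI
# `app` object are hypothetical; only the PORT convention is cloud run's own.
import os

import uvicorn

from apprentice.server import app  # hypothetical module exposing the MCP server as an ASGI app

if __name__ == "__main__":
    # cloud run sets PORT (default 8080); the container must bind to it on all interfaces
    uvicorn.run(app, host="0.0.0.0", port=int(os.environ.get("PORT", "8080")))
```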

the result

apprentice works through any AI client that supports MCP. when new agents add MCP support, apprentice is already there — no rebuild required. the “UI” is whatever AI client the user prefers.

the reusable governance pattern meant auth and scoping — usually the slowest part of a new service — became the fastest. one identity layer, applied to new tool definitions, done.

key decisions

mcp-native from day one: no web UI to build or maintain; the AI client is the interface.
reuse the gatewaystack identity layer: auth, scoping, firestore isolation, and audit logging carried over from inner instead of being rebuilt.
one server, many clients: any agent that adds MCP support can use apprentice, no rebuild required.

mcp server build sprint — zero to production MCP server in one week.


interested in working together? let's talk