A public MCP server that gives Claude, ChatGPT, Cursor, and any other MCP-compatible AI assistant direct access to our 350+ lens dataset through our powerful Optics-Wizards™ tools, helping you find the best lens/CMOS imager solutions for automotive, robotics, drone, medical, and physical-AI applications. Based in the U.S. and serving customers worldwide for 25 years, Sunex has shipped more than 100 million lenses and imaging solutions for mission-critical systems with unmatched reliability.
Ask in natural language. The assistant picks the right tool, queries our live catalog, and returns structured results with spec-sheet, sample-order, and RFQ links.
Search by part number, manufacturer, or resolution class. Get full specs plus computed sensor geometry in millimeters.
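The "computed sensor geometry in millimeters" above is straightforward to derive yourself: active-area width and height are pixel count times pixel pitch, and the diagonal follows by Pythagoras. A minimal sketch (the 4056 × 3040 / 1.55 µm figures are IMX577-class numbers used here purely as an illustration, not output from the server):

```python
import math

def sensor_geometry_mm(h_pixels: int, v_pixels: int, pixel_pitch_um: float):
    """Active-area width, height, and diagonal in millimeters."""
    width = h_pixels * pixel_pitch_um / 1000.0
    height = v_pixels * pixel_pitch_um / 1000.0
    return width, height, math.hypot(width, height)

# Example: a 4056 x 3040 imager with 1.55 um pixels
w, h, d = sensor_geometry_mm(4056, 3040, 1.55)
print(f"{w:.2f} x {h:.2f} mm, diagonal {d:.2f} mm")  # 6.29 x 4.71 mm, diagonal 7.86 mm
```

The diagonal is what matters when matching a lens image circle to an imager, which is why the tools report it alongside the raw specs.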
Feed any imager PN and get compatible M12 lenses with per-lens FOV, angular resolution, and F/# filters.
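As a rough sanity check on the per-lens angular resolution figures, you can estimate pixels per degree by dividing horizontal pixel count by horizontal FOV. This sketch assumes an even angular spread across the field (real lenses have distortion, which the server's computation accounts for); the numbers are hypothetical:

```python
def pixels_per_degree(h_pixels: int, hfov_deg: float) -> float:
    """First-order angular resolution estimate, assuming uniform
    angular sampling across the horizontal field of view."""
    return h_pixels / hfov_deg

# Hypothetical example: 4056 horizontal pixels spread over a 100-degree HFOV
print(round(pixels_per_degree(4056, 100.0), 2))  # 40.56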
Every result includes sample pricing, spec sheet URL, sample order link, and volume RFQ link. No separate lookup.
No account, no API key, no install. Just paste one URL into your MCP-compatible client.
https://mcp.sunex-ai.com/sse
This is the canonical MCP endpoint. https://sunex-ai.com/sse also works.
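For clients configured by file rather than UI, a minimal sketch of a Cursor entry (`~/.cursor/mcp.json`) might look like the following; the `mcpServers`/`url` field names follow Cursor's documented MCP config schema, and the server name `sunex` is an arbitrary label of your choosing:

```json
{
  "mcpServers": {
    "sunex": {
      "url": "https://mcp.sunex-ai.com/sse"
    }
  }
}
```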
Claude: Settings → Connectors → Add custom connector, then paste the URL above.
Cursor: add the server to ~/.cursor/mcp.json.
Other clients: use the sse transport and the URL above (or an adapter such as mcp-openapi).
All tools return structured JSON. Parameters are self-documenting: your AI client sees descriptions and types automatically.
A thin MCP server on Cloudflare Workers, proxying Sunex's live product database. Everything is public, documented, and auditable.
```shell
# List available tools
curl https://mcp.sunex-ai.com/sse

# Inspect the public manifest
curl https://mcp.sunex-ai.com/.well-known/mcp.json
```

```python
# From Python with the MCP SDK
from mcp import ClientSession
from mcp.client.sse import sse_client

async with sse_client("https://mcp.sunex-ai.com/sse") as (r, w):
    async with ClientSession(r, w) as session:
        await session.initialize()
        result = await session.call_tool(
            "recommend_lens_for_imager",
            {"imagerPn": "IMX577", "fNumMax": 2.0},
        )
```
The essentials, for engineers and buyers alike.
Model Context Protocol is an open standard (originally from Anthropic, now supported by OpenAI, Google, Cursor, Zed, and others) that lets AI assistants call external tools. Think "USB-C for LLMs." If your assistant supports MCP, it can talk to our catalog without any custom integration work on your end — you just paste a URL.
Free and public. Sunex maintains it because helping engineers find the right lens faster is good for everyone, including us. The server is stateless, read-only, and fairly rate-limited. If you want high-volume programmatic access or custom tools, contact us.
MCP is native in Claude, Cursor, Continue, Zed, and a growing list of clients. ChatGPT supports it indirectly via a bridge to Actions / custom GPTs. OpenAI has announced MCP support, and we'll update this page when native connectors ship.
Live. The MCP server proxies Sunex's production catalog in real time — same data as optics-online.com. When we add a new lens, it's callable within minutes.
The server code is open and deployable to any Cloudflare account. If you're an integrator or distributor who wants your own branded MCP endpoint against our catalog, contact us — we're happy to help.
The math and database come directly from our lens wizard, which has been in production for years. But if you see something off, we want to know — email support@sunex.com with the prompt and the response.
Yes, on the roadmap. Current tools are read-only by design — anything that creates an order or RFQ will require explicit auth and per-session confirmation.