MCP Server Registry

Browse 19,306 servers scanned across 177 detection rules


sqlite-mcp-server5

sqlite-mcp-server6

sqlite-mcp-server7

sqlite-mcp-server-enhanced

by neverinfamous

A SQLite MCP server with JSONB support, database administration tools, and advanced database operations

sqlmap-skynet

by drcrypterdotru

SQLMap with Autonomous AI, phased workflows, RAG memory, and MCP Agent Tools

sql-mcp-server

by Snehangshu

MCP server that connects Claude to a Microsoft SQL Server database

sql-migration

SQL Sentinel MCP Server

by tkmawarire

SQL Server monitoring and diagnostics for AI agents using Extended Events. No ODBC drivers required.

SQL Server

Integrates with SQL Server databases to execute queries, manage schemas, run stored procedures, and perform data operations in both single-database and server-wide contexts, with support for on-premises and Azure SQL Database environments.

sqlserver-mcp-server

A comprehensive Model Context Protocol (MCP) server for interacting with SQL Server databases

SQL Server MCP (Windows)

by TharanaBope

SQL Server MCP with RAG capabilities for Windows (native ODBC support)

sqry — Semantic Code Search

by verivus-oss

AST-based semantic code search — understands code structure and relationships across 35 languages.

Squad

by meetsquad

Your AI Product Manager. Surface insights, build roadmaps, and plan strategy with 30+ tools.

square-mcp-server

by square

A Model Context Protocol (MCP) server for square

squeez

by claudioemmanuel

Hook-based token compressor for 5 AI CLI hosts (Claude Code, Copilot CLI, OpenCode, Gemini CLI, Codex CLI). Up to 95% bash compression, signature-mode for code reads, cross-call dedup, MCP server, self-teaching protocol. Zero runtime deps.

Squire

by reidgoodbar

CLI-first remote runtimes for validation and offload tasks, exposed over MCP stdio.

@squirex.dev/mcp-server

by squirexadmin

SquireX MCP Server — Agentforce Capability Scanner for AI Coding Agents

Srclight

by srclight

Deep code indexing for AI agents. FTS5 + embeddings + call graphs + git intelligence. Fully local.

sre-agent

by benmoggee

SRE Agent — An AI-powered MCP server for production incident triage. Takes natural-language symptom reports, plans structured investigations using Gemini, executes parallel workers (logs, metrics, deploys, runbooks), synthesizes root-cause reports, and proposes remediation patches with human approval gates.

SRG+

by srgplus

Manage SRG+ hubs, channels, content, assets, users, and workspaces from any MCP-aware AI agent.

srv-d5200rd6ubrc7390v04g

srv-d5200rd6ubrc7390v04g1

srv-d5200rd6ubrc7390v04g12

srv-d7aoqmh5pdvs7391dcqg

# NWO Robotics MCP Server

Control real robots, IoT devices, and autonomous agent swarms through natural language — powered by the [NWO Robotics API](https://nwo.capital).

---

## What This Server Does

This MCP server exposes the full NWO Robotics API as 64 ready-to-use tools. Any MCP-compatible AI agent (Claude, ChatGPT, Cursor, etc.) can use it to:

- Send natural language instructions to physical robots
- Run Vision-Language-Action (VLA) inference on live camera feeds
- Plan, validate, and execute multi-step robot tasks
- Monitor sensors, detect slip, and fuse multi-modal data
- Train robots online with reinforcement learning
- Register and manage agent identities on Base mainnet via the Cardiac biometric ID system

No local installation needed. The server runs on Render and is ready to connect.

---

## Tools Overview

### 🤖 VLA Inference & Models

Run Vision-Language-Action inference on any supported robot. Send a text instruction and camera images, and receive joint action vectors in real time. Supports auto model routing, ultra-low-latency Cloudflare edge inference (28 ms avg), and WebSocket streaming at up to 50 Hz.

`vla_inference` · `edge_inference` · `list_models` · `get_model_info` · `get_streaming_config`

---

### 🦾 Robot Control & State

Query live robot state (joint angles, gripper, battery, position), execute pre-computed action sequences, and fuse camera + lidar + thermal + force + GPS sensor inputs into a single inference call.

`query_robot_state` · `execute_actions` · `sensor_fusion` · `robot_query` · `get_agent_status`

---

### 🗺️ Task Planning & Learning

Decompose complex instructions into ordered subtasks, execute them step by step, poll progress, and log outcomes so the model learns and improves with every run.

`task_planner` · `execute_subtask` · `status_poll` · `learning_recommend` · `learning_log`

---

### 🔑 Agent Management

Self-register a new AI agent in under 2 seconds, check your monthly API quota, upgrade tiers by paying ETH, and manage robot registrations and capabilities.

| Tier | Calls/month | Cost |
|------|-------------|------|
| Free | 100,000 | $0 |
| Prototype | 500,000 | ~0.015 ETH/mo |
| Production | Unlimited | ~0.062 ETH/mo |

`register_agent` · `check_balance` · `pay_upgrade` · `create_wallet` · `register_robot` · `update_agent` · `get_agent_info`

---

### 🔍 Agent Discovery

Discover all available execution modes (mock / simulated / live), robot types, VLA models, and sensor capabilities. Validate tasks with a dry run before committing to execution.

`nwo_health` · `nwo_whoami` · `discover_capabilities` · `dry_run` · `plan_task`

---

### 🔌 ROS2 Bridge (Physical Robots)

Connect directly to physical robots over the ROS2 bridge. Send joint commands, submit action sequences, and trigger emergency stops on one or all robots within 10 ms. Supported: UR5e, Panda, Spot, Unitree G1, and more.

`ros2_list_robots` · `ros2_robot_status` · `ros2_send_command` · `ros2_submit_action` · `ros2_emergency_stop` · `ros2_emergency_stop_all` · `ros2_get_robot_types`

---

### 🧪 Physics Simulation

Simulate trajectories, check for collisions, estimate joint torques, validate grasps, and plan collision-free motions with MoveIt2 — before touching real hardware.

`simulate_trajectory` · `check_collision` · `estimate_torques` · `validate_grasp` · `plan_motion` · `get_scene_library` · `generate_scene`

---

### 📐 Embodiment & Calibration

Browse the robot embodiment registry (DOF, joint limits, sensors), download URDF models, get normalization parameters for VLA inference, and run automatic joint calibration.

`list_embodiments` · `get_robot_specs` · `get_normalization` · `download_urdf` · `get_test_results` · `compare_robots` · `run_calibration` · `calibrate_confidence`

---

### 🧠 Online RL & Fine-Tuning

Start online reinforcement learning sessions, stream state/action/reward telemetry, build fine-tuning datasets from logged runs, and launch LoRA fine-tuning jobs on any base VLA model.

`start_rl_training` · `submit_rl_telemetry` · `create_finetune_dataset` · `start_finetune_job`

---

### 🖐️ Tactile Sensing (ORCA Hand)

Read 256-taxel tactile sensor arrays from the ORCA robot hand, assess grip quality and object texture, and detect slip in real time to prevent dropped objects.

`read_tactile` · `process_tactile` · `detect_slip`

---

### 📦 Dataset Hub

Access 1.54 million+ human robot demonstrations for the Unitree G1 humanoid (430+ hours, LeRobot-compatible format) for training and fine-tuning.

`list_datasets`

---

### 🫀 Cardiac Blockchain Identity (Base Mainnet)

Register AI agents on Base mainnet and receive a permanent soul-bound Digital ID (`rootTokenId`). Issue verifiable credentials for task authorization, swarm control, location access, and payments — all gasless via the NWO relayer.

Smart contracts deployed on Base Mainnet (Chain ID 8453):

- `NWOIdentityRegistry` — `0x78455AFd5E5088F8B5fecA0523291A75De1dAfF8`
- `NWOAccessController` — `0x29d177bedaef29304eacdc63b2d0285c459a0f50`
- `NWOPaymentProcessor` — `0x4afa4618bb992a073dbcfbddd6d1aebc3d5abd7c`

`cardiac_register_agent` · `cardiac_identify_agent` · `cardiac_renew_key` · `cardiac_issue_credential` · `cardiac_check_credential` · `cardiac_grant_access` · `cardiac_get_nonce` · `cardiac_check_access` · `cardiac_payment_process`

---

### 🔮 Cardiac Oracle

Validate ECG biometric data from smartwatches to authenticate human identities, compute cardiac hashes, and verify recent validations.

`oracle_health` · `oracle_validate_ecg` · `oracle_hash_ecg` · `oracle_verify`

---

## Supported Robot Models

| Model | Type | Capabilities |
|-------|------|--------------|
| `xiaomi-robotics-0` | VLA | Grasp, navigate, manipulate |
| `pi05` | VLA | General manipulation |
| `groot_n1.7` | VLA | Humanoid control |
| `deepseek-ocr-2b` | OCR | Label reading, text recognition |

---

## Example Usage

**Pick and place:**
> "Pick up the red box from the table and place it on shelf B"

**Sensor query:**
> "What is the temperature in warehouse zone 3?"

**Safety:**
> "Run a safety check before moving robot_001 to the loading dock"

**Swarm:**
> "Deploy all available robots to patrol the perimeter"

**Learning:**
> "What grip technique should I use for fragile glass objects?"

---

## Links

- 🌐 [NWO Capital](https://nwo.capital)
- 📄 [Agent Skill File](https://nwo.capital/webapp/agent.md)
- 📖 [API Docs](https://nwo.capital/webapp/nwo-robotics.html)
- 🧬 [Cardiac SDK](https://github.com/RedCiprianPater/nwo-cardiac-sdk)
- 🔑 [Get API Key](https://nwo.capital/webapp/api-key.php)
- 🤗 [Live Demo](https://huggingface.co/spaces/PUBLICAE/nwo-robotics-api-demo)
- 📜 [OpenAPI Spec](https://nwo.capital/openapi.yaml)

---

## Support

📧 support@nwo.capital
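Under the hood, any MCP client drives tools like the ones cataloged above with JSON-RPC 2.0 `tools/call` requests, as defined by the Model Context Protocol. A minimal sketch in Python (the tool name `vla_inference` comes from the catalog above, but the argument names are assumptions, since this listing does not publish each tool's input schema):

```python
import json


def make_tool_call(request_id: int, tool: str, arguments: dict) -> dict:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }


# Hypothetical invocation of the vla_inference tool; the argument
# names ("instruction", "robot_id") are illustrative guesses only.
request = make_tool_call(1, "vla_inference", {
    "instruction": "Pick up the red box from the table",
    "robot_id": "robot_001",
})
print(json.dumps(request))
```

In practice an MCP SDK or host application (Claude, Cursor, etc.) constructs and transports these messages for you; the sketch only shows the wire-level shape of a tool invocation.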