cortex-m — community ExecuTorch skill for Claude Code, Cursor, and Windsurf

v1.0.0
GitHub

About this Skill

Ideal for edge AI agents that need efficient on-device AI across mobile, embedded, and edge devices with PyTorch support.

pytorch
Updated: 3/5/2026

Agent Capability Analysis

The cortex-m skill by pytorch is an open-source community AI agent skill for Claude Code and other IDE workflows, helping agents execute tasks with better context, repeatability, and domain-specific guidance.

Ideal Agent Persona

Ideal for Edge AI Agents requiring efficient on-device AI across mobile, embedded, and edge devices with PyTorch support.

Core Value

Empowers agents to use the Cortex-M (CMSIS-NN) backend: custom ops and graph passes replace ATen quantized ops, and standard PT2E quantization combined with the CortexMPassManager rewrites quantized ops to their cortex_m:: equivalents, building on the executorch and CMSIS-NN libraries.

Capabilities Granted for cortex-m

Deploying PyTorch models on edge devices with CMSIS-NN
Optimizing on-device AI performance with custom ops and graph passes
Quantizing models using PT2E and rewriting with CortexMPassManager

Prerequisites & Limits

  • Requires PyTorch support
  • Limited to edge devices with Cortex-M architecture
  • Custom implementation needed for non-standard quantization
Labs Demo

Try this skill in a zero-setup browser sandbox powered by WebContainers; no installation required.


SKILL.md

Cortex-M (CMSIS-NN) Backend

Architecture

Not a delegate backend — no partitioner. Custom ops and graph passes replace ATen quantized ops with CMSIS-NN equivalents at the graph level.
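As a toy illustration of this kind of graph-level rewrite (not the actual pass implementation), a torch.fx pass that swaps one call target for another looks like the sketch below; torch.mul merely stands in for a cortex_m:: kernel:

```python
import torch
import torch.fx as fx

class Add(torch.nn.Module):
    def forward(self, a, b):
        return torch.add(a, b)

gm = fx.symbolic_trace(Add())

# Walk the graph and retarget matching call_function nodes —
# the same shape of transformation the Cortex-M passes perform
# on quantized ATen ops.
for node in gm.graph.nodes:
    if node.op == "call_function" and node.target is torch.add:
        node.target = torch.mul  # stand-in for a cortex_m:: op
gm.recompile()
```

After `recompile()`, calling `gm(a, b)` multiplies instead of adds, which is the essence of rewriting ops in place without delegating the whole graph to a backend partition.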

Pipeline

Uses standard PT2E quantization (prepare_pt2e / convert_pt2e), then CortexMPassManager rewrites quantized ops to cortex_m:: equivalents.

```python
from executorch.backends.cortex_m.quantizer.quantizer import CortexMQuantizer
from executorch.backends.cortex_m.passes.cortex_m_pass_manager import CortexMPassManager
from torch.export import export
from torchao.quantization.pt2e.quantize_pt2e import convert_pt2e, prepare_pt2e
from executorch.exir import to_edge_transform_and_lower, EdgeCompileConfig

quantizer = CortexMQuantizer()
captured = export(model, example_inputs).module()
prepared = prepare_pt2e(captured, quantizer)
prepared(*example_inputs)  # calibration
quantized = convert_pt2e(prepared)

exported = export(quantized, example_inputs)
edge = to_edge_transform_and_lower(
    exported,
    compile_config=EdgeCompileConfig(_check_ir_validity=False),
)
edge._edge_programs["forward"] = CortexMPassManager(
    edge.exported_program(), CortexMPassManager.pass_list
).transform()
et_program = edge.to_executorch()
```

In tests, CortexMTester wraps this pipeline:

```python
from executorch.backends.cortex_m.test.tester import CortexMTester

tester = CortexMTester(model, example_inputs)
tester.quantize().export().to_edge().run_passes().to_executorch()
```

Key Files

| File | Purpose |
| --- | --- |
| backends/cortex_m/quantizer/quantizer.py | CortexMQuantizer — quantizes the model for CMSIS-NN |
| backends/cortex_m/passes/cortex_m_pass_manager.py | CortexMPassManager — rewrites ATen ops → cortex_m:: ops |
| backends/cortex_m/test/tester.py | CortexMTester — test harness with test_dialect() and test_implementation() |
| backends/cortex_m/ops/operators.py | Python op definitions and reference implementations (cortex_m:: namespace) |
| backends/cortex_m/ops/operators.yaml | C++ kernel registration schemas (used by the build system) |

C++ kernels calling CMSIS-NN APIs live under backends/cortex_m/ops/.

Testing

Toolchain setup (required for test_implementation tests):

```bash
./examples/arm/setup.sh --i-agree-to-the-contained-eula
source ./examples/arm/arm-scratch/setup_path.sh
```

Run all tests:

```bash
source ./examples/arm/arm-scratch/setup_path.sh
pytest backends/cortex_m/test/
```

test_dialect_* tests verify graph correctness (pure Python, no toolchain needed). test_implementation_* tests verify numerical accuracy on the Corstone-300 FVP (requires toolchain on PATH).

Baremetal build:

```bash
backends/cortex_m/test/build_test_runner.sh
```

Adding a New Op

  1. Define the op schema, meta function, and reference implementation in operators.py
  2. Write the C++ kernel in backends/cortex_m/ops/ calling CMSIS-NN APIs
  3. Register the .out kernel in operators.yaml
  4. Add a pass to rewrite the ATen op → cortex_m:: op
  5. Test with CortexMTester.test_dialect() (graph correctness) and test_implementation() (numerical accuracy on FVP)
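Step 1 can be sketched with torch.library; the op name below (quantized_add_demo) is hypothetical, and the real schemas and reference implementations live in backends/cortex_m/ops/operators.py:

```python
import torch
from torch.library import Library, impl

# Hypothetical demo op in the cortex_m namespace; this is only an
# illustration of the registration mechanics, not a real backend op.
lib = Library("cortex_m", "DEF")
lib.define("quantized_add_demo(Tensor a, Tensor b) -> Tensor")

@impl(lib, "quantized_add_demo", "CompositeExplicitAutograd")
def quantized_add_demo(a, b):
    # Reference implementation used for graph-level checks; on device
    # the registered C++ kernel would call into CMSIS-NN instead.
    return a + b

x = torch.full((4,), 1, dtype=torch.int8)
y = torch.full((4,), 1, dtype=torch.int8)
out = torch.ops.cortex_m.quantized_add_demo(x, y)
```

Once defined this way, the op is callable as torch.ops.cortex_m.quantized_add_demo and can be targeted by a rewrite pass.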

FAQ & Installation Steps


Frequently Asked Questions

What is cortex-m?

cortex-m is a community skill for edge AI agents that need efficient on-device AI across mobile, embedded, and edge devices with PyTorch support.

How do I install cortex-m?

Run the command: npx killer-skills add pytorch/executorch. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.

What are the use cases for cortex-m?

Key use cases include: Deploying PyTorch models on edge devices with CMSIS-NN, Optimizing on-device AI performance with custom ops and graph passes, Quantizing models using PT2E and rewriting with CortexMPassManager.

Which IDEs are compatible with cortex-m?

This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.

Are there any limitations for cortex-m?

Requires PyTorch support. Limited to edge devices with Cortex-M architecture. Custom implementation needed for non-standard quantization.

How To Install

  1. Open your terminal

    Open the terminal or command line in your project directory.

  2. Run the install command

    Run: npx killer-skills add pytorch/executorch. The CLI will automatically detect your IDE or AI agent and configure the skill.

  3. Start using the skill

    The skill is now active. Your AI agent can use cortex-m immediately in the current project.
