Agent Capability Analysis
run-thor by RamboRogers is an open-source community skill for Claude Code and other IDE workflows. It helps agents execute tasks with better context, repeatability, and domain-specific guidance.
Ideal Agent Persona
Ideal for AI agents that need GPU-accelerated inference server management with a cyberpunk-themed interface for edge deployment.
Core Value
Empowers agents to manage OpenAI-compatible inference servers built on llama.cpp, with automatic model management, dynamic resource allocation, and SSH access. It uses cyber-inference to provide production-like test environments on Thor GPU lab servers.
↓ Capabilities Granted for run-thor
! Prerequisites & Limits
- Requires SSH access to thor.lab
- Depends on llama.cpp and cyber-inference
- Specific to OpenAI-compatible inference servers
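Before invoking the skill, an agent can verify the prerequisites above. The sketch below is an assumption-laden preflight check: `thor.lab` is the lab host named in these docs, and port 22 (standard SSH) is assumed for reachability.

```python
import shutil
import socket

def ssh_client_available() -> bool:
    """True if an ssh client binary is on PATH."""
    return shutil.which("ssh") is not None

def host_reachable(host: str = "thor.lab", port: int = 22,
                   timeout: float = 3.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout.

    This only confirms the port is open; it does not verify that your
    SSH key is actually authorized on the host.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("ssh client on PATH:", ssh_client_available())
print("thor.lab:22 reachable:", host_reachable())
```

If either check fails, fix SSH access first; the skill cannot manage the inference server without it.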
FAQ & Installation Steps
? Frequently Asked Questions
What is run-thor?
run-thor is an AI agent skill for deploying and managing Cyber-Inference, a web GUI management tool for running OpenAI-compatible inference servers. Built on llama.cpp, Cyber-Inference provides automatic model management, dynamic resource allocation, and a cyberpunk-themed interface designed for edge deployment.
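Because the server speaks the OpenAI chat-completions format, an agent can address it with standard tooling. The sketch below builds a request body in that format; the host, port, and model name are assumptions to substitute with your actual deployment.

```python
import json

# Hypothetical endpoint; match your cyber-inference deployment.
BASE_URL = "http://thor.lab:8080/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "local-model",
                       temperature: float = 0.7) -> dict:
    """Build a request body in the OpenAI chat-completions format,
    which llama.cpp-based servers accept."""
    return {
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_chat_request("Summarize the server's current load.")
print(json.dumps(body, indent=2))

# To actually send it (requires a reachable server):
#   import urllib.request
#   req = urllib.request.Request(
#       BASE_URL, data=json.dumps(body).encode(),
#       headers={"Content-Type": "application/json"})
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

Any OpenAI-compatible client library pointed at the server's base URL should work the same way.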
How do I install run-thor?
Run the command: npx killer-skills add RamboRogers/cyber-inference. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.
What are the use cases for run-thor?
Key use cases include: deploying and testing cyber-inference on Thor; managing OpenAI-compatible inference servers; and automating model updates and resource allocation on edge devices.
Which IDEs are compatible with run-thor?
This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.
Are there any limitations for run-thor?
It requires SSH access to thor.lab, depends on llama.cpp and cyber-inference, and is specific to OpenAI-compatible inference servers.
↓ How To Install
1. Open your terminal
Open the terminal or command line in your project directory.
2. Run the install command
Run: npx killer-skills add RamboRogers/cyber-inference. The CLI will automatically detect your IDE or AI agent and configure the skill.
3. Start using the skill
The skill is now active. Your AI agent can use run-thor immediately in the current project.