develop-ai-functions-example — a community IDE skill for Claude Code, Cursor, Windsurf, and other agent workflows

v1.0.0
GitHub

About this Skill

Ideal for TypeScript-based AI agent developers working with multimodal video and language models. A TypeScript library for connecting videos in your Mux account to multimodal LLMs.

muxinc
Updated: 2/24/2026

Agent Capability Analysis

The develop-ai-functions-example skill by muxinc is an open-source community AI agent skill for Claude Code and other IDE workflows, helping agents execute tasks with better context, repeatability, and domain-specific guidance.

Ideal Agent Persona

Ideal for TypeScript-based AI Agent Developers working with multimodal video and language models.

Core Value

Enables direct integration between Mux video assets and AI providers through the AI SDK's generateText() function. Provides structured examples for validating and testing multi-modal LLM connections with streaming and non-streaming text generation capabilities.
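As a sketch of what that integration looks like, the AI SDK accepts file parts inside chat messages. The snippet below shows a plausible message shape for sending a Mux video to a multimodal model; the playback ID and the `capped-1080p.mp4` rendition URL are hypothetical placeholders, and the skill's actual wiring may differ.

```typescript
// Sketch only: field names follow AI SDK v5 conventions (e.g. mediaType)
// and may differ from the skill's actual code.
const playbackId = "YOUR_PLAYBACK_ID"; // hypothetical Mux playback ID

const messages = [
  {
    role: "user",
    content: [
      { type: "text", text: "Summarize this video." },
      {
        type: "file",
        mediaType: "video/mp4",
        // Mux can serve MP4 renditions of an asset from stream.mux.com
        // when static renditions are enabled on the asset.
        data: new URL(`https://stream.mux.com/${playbackId}/capped-1080p.mp4`),
      },
    ],
  },
];
```

A `messages` array like this would then be passed to `generateText()` alongside a multimodal-capable model.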

Capabilities Granted for develop-ai-functions-example

Testing Mux video integration with multi-modal LLMs
Validating AI SDK function implementations across providers
Iterating on streaming vs non-streaming text generation workflows

Prerequisites & Limits

  • Requires Mux account credentials
  • TypeScript library dependency
  • Limited to AI SDK compatible providers
Labs Demo

Browser Sandbox Environment

⚡️ Ready to unleash?

Experience this Agent in a zero-setup browser environment powered by WebContainers. No installation required.

Boot Container Sandbox

develop-ai-functions-example

Install develop-ai-functions-example, an AI agent skill for AI agent workflows and automation. Works with Claude Code, Cursor, and Windsurf with one-command...

SKILL.md

AI Functions Examples

The examples/ai-functions/ directory contains scripts for validating, testing, and iterating on AI SDK functions across providers.

Example Categories

Examples are organized by AI SDK function in examples/ai-functions/src/:

| Directory | Purpose |
| --- | --- |
| `generate-text/` | Non-streaming text generation with `generateText()` |
| `stream-text/` | Streaming text generation with `streamText()` |
| `generate-object/` | Structured output generation with `generateObject()` |
| `stream-object/` | Streaming structured output with `streamObject()` |
| `agent/` | `ToolLoopAgent` examples for agentic workflows |
| `embed/` | Single embedding generation with `embed()` |
| `embed-many/` | Batch embedding generation with `embedMany()` |
| `generate-image/` | Image generation with `generateImage()` |
| `generate-speech/` | Text-to-speech with `generateSpeech()` |
| `transcribe/` | Audio transcription with `transcribe()` |
| `rerank/` | Document reranking with `rerank()` |
| `middleware/` | Custom middleware implementations |
| `registry/` | Provider registry setup and usage |
| `telemetry/` | OpenTelemetry integration |
| `complex/` | Multi-component examples (agents, routers) |
| `lib/` | Shared utilities (not examples) |
| `tools/` | Reusable tool definitions |

File Naming Convention

Examples follow the pattern: `{provider}-{feature}.ts`

| Pattern | Example | Description |
| --- | --- | --- |
| `{provider}.ts` | `openai.ts` | Basic provider usage |
| `{provider}-{feature}.ts` | `openai-tool-call.ts` | Specific feature |
| `{provider}-{sub-provider}.ts` | `amazon-bedrock-anthropic.ts` | Provider with sub-provider |
| `{provider}-{sub-provider}-{feature}.ts` | `google-vertex-anthropic-cache-control.ts` | Sub-provider with feature |

Example Structure

All examples use the run() wrapper from lib/run.ts which:

  • Loads environment variables from .env
  • Provides error handling with detailed API error logging
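The wrapper itself can be small. Below is a hypothetical sketch of what `lib/run.ts` might look like; the real implementation may differ, and the `.env` loading step (e.g. via dotenv) is noted only in a comment here to keep the sketch self-contained.

```typescript
// Hypothetical sketch of lib/run.ts. The real file also loads environment
// variables from .env (e.g. via dotenv) before executing the example body.
export async function run(fn: () => Promise<void>): Promise<void> {
  try {
    await fn();
  } catch (error) {
    // Log the failure in full detail, then signal a non-zero exit
    // without throwing, so the process can finish flushing output.
    console.error("Example failed:", error);
    process.exitCode = 1;
  }
}
```

Wrapping the example body this way means every example gets uniform error reporting without its own try/catch.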

Basic Template

```typescript
import { providerName } from "@ai-sdk/provider-name";
import { generateText } from "ai";

import { run } from "../lib/run";

run(async () => {
  const result = await generateText({
    model: providerName("model-id"),
    prompt: "Your prompt here.",
  });

  console.warn(result.text);
  console.warn("Token usage:", result.usage);
  console.warn("Finish reason:", result.finishReason);
});
```

Streaming Template

```typescript
import { providerName } from "@ai-sdk/provider-name";
import { streamText } from "ai";

import { printFullStream } from "../lib/print-full-stream";
import { run } from "../lib/run";

run(async () => {
  const result = streamText({
    model: providerName("model-id"),
    prompt: "Your prompt here.",
  });

  await printFullStream({ result });
});
```

Tool Calling Template

```typescript
import { providerName } from "@ai-sdk/provider-name";
import { generateText, tool } from "ai";
import { z } from "zod";

import { run } from "../lib/run";

run(async () => {
  const result = await generateText({
    model: providerName("model-id"),
    tools: {
      myTool: tool({
        description: "Tool description",
        inputSchema: z.object({
          param: z.string().describe("Parameter description"),
        }),
        execute: async ({ param }) => {
          return { result: `Processed: ${param}` };
        },
      }),
    },
    prompt: "Use the tool to...",
  });

  console.warn(JSON.stringify(result, null, 2));
});
```

Structured Output Template

```typescript
import { providerName } from "@ai-sdk/provider-name";
import { generateObject } from "ai";
import { z } from "zod";

import { run } from "../lib/run";

run(async () => {
  const result = await generateObject({
    model: providerName("model-id"),
    schema: z.object({
      name: z.string(),
      items: z.array(z.string()),
    }),
    prompt: "Generate a...",
  });

  console.warn(JSON.stringify(result.object, null, 2));
  console.warn("Token usage:", result.usage);
});
```

Running Examples

From the examples/ai-functions directory:

```bash
pnpm tsx src/generate-text/openai.ts
pnpm tsx src/stream-text/openai-tool-call.ts
pnpm tsx src/agent/openai-generate.ts
```

When to Write Examples

Write examples when:

  1. Adding a new provider: Create basic examples for each supported API (generateText, streamText, generateObject, etc.)

  2. Implementing a new feature: Demonstrate the feature with at least one provider example

  3. Reproducing a bug: Create an example that shows the issue for debugging

  4. Adding provider-specific options: Show how to use providerOptions for provider-specific settings

  5. Creating test fixtures: Use examples to generate API response fixtures (see capture-api-response-test-fixture skill)
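For item 4 above, provider-specific settings travel under a `providerOptions` object keyed by provider name. A minimal sketch follows; `cacheControl` here is an Anthropic-specific prompt-caching option, and which options exist depends entirely on the provider.

```typescript
// providerOptions is keyed by provider name; options under other providers'
// keys are ignored. cacheControl follows Anthropic's prompt-caching option.
const providerOptions = {
  anthropic: {
    cacheControl: { type: "ephemeral" },
  },
};
```

An object like this is passed to `generateText()` (or attached to individual messages) next to `model` and `prompt`.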

Utility Helpers

The lib/ directory contains shared utilities:

| File | Purpose |
| --- | --- |
| `run.ts` | Error-handling wrapper with `.env` loading |
| `print.ts` | Clean object printing (removes `undefined` values) |
| `print-full-stream.ts` | Colored streaming output for tool calls, reasoning, text |
| `save-raw-chunks.ts` | Save streaming chunks for test fixtures |
| `present-image.ts` | Display images in terminal |
| `save-audio.ts` | Save audio files to disk |

Using print utilities

```typescript
import { print } from "../lib/print";

// Pretty print objects without undefined values
print("Result:", result);
print("Usage:", result.usage, { depth: 2 });
```
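Stripping `undefined` values before printing can be done with a small recursive filter. The sketch below is a hypothetical reimplementation of the idea, not the actual `lib/print.ts`; it assumes plain objects and arrays (class instances such as `Date` would be flattened).

```typescript
import { inspect } from "node:util";

// Recursively drop undefined-valued keys so printed objects stay clean.
// Assumes plain objects/arrays; class instances are not preserved.
export function stripUndefined(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(stripUndefined);
  if (value !== null && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>)
        .filter(([, v]) => v !== undefined)
        .map(([k, v]) => [k, stripUndefined(v)]),
    );
  }
  return value;
}

export function print(label: string, value: unknown, options?: { depth?: number }): void {
  // util.inspect with depth: null expands nested objects fully by default.
  console.log(label, inspect(stripUndefined(value), { depth: options?.depth ?? null }));
}
```

The optional `depth` parameter mirrors the `{ depth: 2 }` usage shown above.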

Using printFullStream

```typescript
// import { printFullStream } from '../lib/print-full-stream';

// const result = streamText({ ... });
// await printFullStream({ result }); // Colored output for text, tool calls, reasoning
```

Reusable Tools

The tools/ directory contains reusable tool definitions:

```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

import { weatherTool } from "../tools/weather-tool";

const result = await generateText({
  model: openai("gpt-4o"),
  tools: { weather: weatherTool },
  prompt: "What is the weather in San Francisco?",
});
```

Best Practices

  1. Keep examples focused: Each example should demonstrate one feature or use case

  2. Use descriptive prompts: Make it clear what the example is testing

  3. Handle errors gracefully: The run() wrapper handles this automatically

  4. Use realistic model IDs: Use actual model IDs that work with the provider

  5. Add comments for complex logic: Explain non-obvious code patterns

  6. Reuse tools when appropriate: Use weatherTool or create new reusable tools in tools/

FAQ & Installation Steps

These questions and steps mirror the structured data on this page for better search understanding.

Frequently Asked Questions

What is develop-ai-functions-example?

Ideal for TypeScript-based AI agent developers working with multimodal video and language models. A TypeScript library for connecting videos in your Mux account to multimodal LLMs.

How do I install develop-ai-functions-example?

Run the command: `npx killer-skills add muxinc/ai/develop-ai-functions-example`. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.

What are the use cases for develop-ai-functions-example?

Key use cases include: Testing Mux video integration with multi-modal LLMs, Validating AI SDK function implementations across providers, Iterating on streaming vs non-streaming text generation workflows.

Which IDEs are compatible with develop-ai-functions-example?

This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.

Are there any limitations for develop-ai-functions-example?

Requires Mux account credentials. TypeScript library dependency. Limited to AI SDK compatible providers.

How To Install

  1. Open your terminal

    Open the terminal or command line in your project directory.

  2. Run the install command

    Run: `npx killer-skills add muxinc/ai/develop-ai-functions-example`. The CLI will automatically detect your IDE or AI agent and configure the skill.

  3. Start using the skill

    The skill is now active. Your AI agent can use develop-ai-functions-example immediately in the current project.

Related Skills

Looking for an alternative to develop-ai-functions-example or another community skill for your workflow? Explore these related open-source skills.

  • widget-generator, by f (f.k.a. Awesome ChatGPT Prompts): share, discover, and collect prompts from the community; free and open source, self-hostable with complete privacy.
  • flags, by vercel: a Next.js feature management skill that lets developers efficiently add or modify framework feature flags, streamlining React application development.
  • zustand, by lobehub
  • data-fetching, by lobehub