explore-dataset — community skill for Claude Code, Cursor, and Windsurf

v1.0.0
GitHub

About this Skill

Perfect for Data Analysis Agents needing advanced dataset exploration and schema discovery capabilities. The power of Axiom on the command line.

axiomhq
Updated: 3/5/2026

Agent Capability Analysis

The explore-dataset skill by axiomhq is an open-source community AI agent skill for Claude Code and other IDE workflows, helping agents execute tasks with better context, repeatability, and domain-specific guidance.

Ideal Agent Persona

Perfect for Data Analysis Agents needing advanced dataset exploration and schema discovery capabilities.

Core Value

Empowers agents to systematically explore Axiom datasets using JSON-formatted output and axiom query commands, providing insights into dataset structure, content, and potential use cases through schema discovery and field type analysis.

Capabilities Granted for explore-dataset

Discovering actual field names and types in Axiom datasets
Listing available datasets for further analysis
Analyzing dataset content for potential use cases

Prerequisites & Limits

  • Requires Axiom dataset access
  • Command line interface needed
  • Depends on the axiom query and axiom dataset list commands
Labs Demo

Browser Sandbox Environment

Experience this skill in a zero-setup browser environment powered by WebContainers. No installation required.

explore-dataset

Install explore-dataset, an AI agent skill for agent workflows and automation. Works with Claude Code, Cursor, and Windsurf with one-command setup.

SKILL.md

Dataset Exploration

Systematically explore an Axiom dataset to understand its structure, content, and potential use cases.

Arguments

When invoked with a dataset name (e.g., /explore-dataset logs), the name is available as $ARGUMENTS.

Exploration Protocol

1. List Available Datasets

If no dataset specified, list what's available:

```bash
axiom dataset list -f json
```
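
The listing step can be scripted. Below is a minimal Python sketch that shells out to the CLI and pulls dataset names from the JSON output; the exact JSON shape (a top-level array of objects with a "name" key) is an assumption, so check the actual output of your CLI version first.

```python
import json
import subprocess

def list_dataset_names(raw_json: str) -> list[str]:
    """Extract dataset names from `axiom dataset list -f json` output.
    Assumes a top-level array of objects carrying a "name" key."""
    return [entry["name"] for entry in json.loads(raw_json)]

def fetch_dataset_names() -> list[str]:
    # Shells out to the Axiom CLI; requires authenticated dataset access.
    result = subprocess.run(
        ["axiom", "dataset", "list", "-f", "json"],
        capture_output=True, text=True, check=True,
    )
    return list_dataset_names(result.stdout)
```

Keeping the parsing separate from the subprocess call makes the JSON-handling part easy to test without the CLI installed.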

2. Schema Discovery

Always start here. Discover actual field names and types:

```bash
axiom query "['<dataset>'] | getschema" --start-time -1h
```

Identify:

  • Field names and types
  • Dotted fields requiring bracket notation
  • Timestamp fields
  • Key dimensions (service, status, level)

OTel trace data: If schema contains trace_id, span_id, attributes.*, note that:

  • Service fields are promoted: use ['service.name'] not ['resource.service.name']
  • Custom attributes: ['attributes.custom']['field'] with tostring() for aggregations
  • See axiom-apl skill's OTel reference for field mappings
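
The bracket-notation rule can be made mechanical. A small hypothetical Python helper that flags dotted fields in a getschema result; the schema shape (a list of objects with a "name" key) is an assumption:

```python
def fields_needing_brackets(schema: list[dict]) -> list[str]:
    """Return field names containing dots; these must be written with
    bracket notation in APL queries (e.g. ['service.name'])."""
    return [f["name"] for f in schema if "." in f["name"]]

def bracketed(field: str) -> str:
    """Render a field reference for APL, quoting dotted names only."""
    return f"['{field}']" if "." in field else field
```

This keeps the quoting decision in one place when later steps compose queries over discovered fields.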

3. Sample Data

Examine actual values:

```bash
axiom query "['<dataset>'] | limit 10" --start-time -1h -f json
```

Look for:

  • Data structure and relationships
  • Field value formats
  • Data quality issues

4. Volume Analysis

Understand data volume patterns:

```bash
axiom query "['<dataset>'] | summarize count() by bin(_time, 1h) | sort by _time asc" --start-time -24h
```

Analyze:

  • Event volume over time
  • Data freshness
  • Collection gaps
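
Collection gaps can be detected programmatically from the hourly bins. A sketch, assuming you have parsed the query result into (bin_start, count) tuples sorted ascending, and that the aggregation omits empty bins (typical, but verify against your output):

```python
from datetime import datetime, timedelta

def find_gaps(bins, step=timedelta(hours=1)):
    """Given (bin_start, count) tuples sorted ascending by time,
    return the bin starts missing between consecutive rows."""
    gaps = []
    for (t0, _), (t1, _) in zip(bins, bins[1:]):
        expected = t0 + step
        while expected < t1:
            gaps.append(expected)
            expected += step
    return gaps
```

Any returned timestamps mark hours where the dataset received no events, which is exactly the "collection gaps" signal above.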

5. Categorical Field Analysis

For each key categorical field (status, level, service):

```bash
axiom query "['<dataset>'] | summarize count() by <field> | top 20 by count_" --start-time -1h

Identify:

  • Value distributions
  • Cardinality
  • Key dimensions for filtering
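
Iterating this query over several fields is easy to script. A sketch of a query builder; the helper name is hypothetical, and the dotted-field quoting mirrors the bracket-notation note from the schema step:

```python
def categorical_query(dataset: str, field: str, top_n: int = 20) -> str:
    """Compose the per-field distribution query from step 5.
    Dotted fields get APL bracket notation."""
    ref = f"['{field}']" if "." in field else field
    return f"['{dataset}'] | summarize count() by {ref} | top {top_n} by count_"

# Example: generate one query per key dimension.
for f in ("status", "level", "service.name"):
    print(categorical_query("logs", f))
```

Each printed string can be passed to axiom query with the same --start-time flag as above.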

6. Numerical Field Statistics

For numeric fields (duration, bytes, count):

```bash
axiom query "['<dataset>'] | summarize count(), min(<field>), max(<field>), avg(<field>), percentiles(<field>, 50, 95, 99)" --start-time -1h
```
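
As with the categorical step, the statistics query can be generated and handed to the CLI. A hypothetical helper that builds the full argv (execute it with subprocess.run; requires an authenticated Axiom CLI):

```python
def numeric_stats_command(dataset: str, field: str, start: str = "-1h") -> list[str]:
    """Build the argv for the step-6 statistics query over one numeric field."""
    apl = (
        f"['{dataset}'] | summarize count(), min({field}), max({field}), "
        f"avg({field}), percentiles({field}, 50, 95, 99)"
    )
    return ["axiom", "query", apl, "--start-time", start]
```

Returning an argv list (rather than a shell string) avoids quoting bugs when field names contain special characters.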

7. Error Pattern Detection

Search for error indicators:

```bash
axiom query "search in (['<dataset>']) 'error' or 'fail' or 'exception' | limit 20" --start-time -1h
```

Output Format

Provide a summary including:

```markdown
## Dataset Summary: <name>

### Purpose
<What system generated this data, what it represents>

### Key Fields
| Field | Type | Description |
|-------|------|-------------|
| ...   | ...  | ...         |

### Volume
- Events per hour: ~X
- Data freshness: last event at X

### Key Dimensions
- `status`: 200, 400, 500, ...
- `service.name`: api, web, worker, ...

### Recommended Queries
<Common queries for this dataset>

### Monitoring Opportunities
<What could be alerted on>
```

When NOT to Use

  • Known datasets: If you already understand the schema, skip exploration and query directly
  • Quick field check: Use getschema directly for single field lookups
  • Production queries: Exploration uses expensive operations (search); extract patterns then optimize
  • Repeated analysis: Once explored, document findings and reuse—don't re-explore

APL Reference

For query syntax, invoke the axiom-apl skill which provides comprehensive documentation on operators, functions, and patterns.

FAQ & Installation Steps


Frequently Asked Questions

What is explore-dataset?

Perfect for Data Analysis Agents needing advanced dataset exploration and schema discovery capabilities. The power of Axiom on the command line.

How do I install explore-dataset?

Run the command: npx killer-skills add axiomhq/cli. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.

What are the use cases for explore-dataset?

Key use cases include: Discovering actual field names and types in Axiom datasets, Listing available datasets for further analysis, Analyzing dataset content for potential use cases.

Which IDEs are compatible with explore-dataset?

This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.

Are there any limitations for explore-dataset?

Requires Axiom dataset access. Command line interface needed. Dependent on axiom query and dataset list commands.

How To Install

  1. Open your terminal

    Open the terminal or command line in your project directory.

  2. Run the install command

    Run: npx killer-skills add axiomhq/cli. The CLI will automatically detect your IDE or AI agent and configure the skill.

  3. Start using the skill

    The skill is now active. Your AI agent can use explore-dataset immediately in the current project.

Related Skills

Looking for an alternative to explore-dataset or another community skill for your workflow? Explore these related open-source skills:

  • widget-generator (by f)
  • flags (by vercel): a Next.js feature management skill that lets developers add or modify framework feature flags, streamlining React application development
  • zustand (by lobehub)
  • data-fetching (by lobehub)