polars-expertise — community IDE skill for Claude Code, Cursor, and Windsurf

v1.0.0

About this Skill

Perfect for Data Analysis Agents needing high-performance DataFrame processing with Apache Arrow and Python. Distributed as part of pyref, a Python toolkit for processing and analyzing Polarized Resonant Reflectivity Data.

WSU-Carbon-Lab
Updated: 3/5/2026

Agent Capability Analysis

The polars-expertise skill by WSU-Carbon-Lab is an open-source community AI agent skill for Claude Code and other IDE workflows, helping agents execute tasks with better context, repeatability, and domain-specific guidance.

Ideal Agent Persona

Perfect for Data Analysis Agents needing high-performance DataFrame processing with Apache Arrow and Python.

Core Value

Empowers agents to efficiently process and analyze large datasets with expression-based API, lazy evaluation, and automatic parallelization using Polars, supporting both CPU and GPU acceleration.

Capabilities Granted for polars-expertise

Analyzing Polarized Resonant Reflectivity Data with high-performance DataFrames
Optimizing data processing workflows with lazy evaluation and automatic parallelization
Building scalable data analysis pipelines with Python and Rust support

Prerequisites & Limits

  • Requires Python or Rust programming knowledge
  • Dependent on Apache Arrow for high-performance capabilities
  • GPU support requires additional installation step


SKILL.md

Polars

High-performance DataFrame library built on Apache Arrow. Supports Python and Rust with expression-based API, lazy evaluation, and automatic parallelization.

Quick Start

Python

```bash
uv pip install polars
# GPU support: uv pip install polars[gpu]
```
```python
import polars as pl

# Eager: immediate execution
df = pl.DataFrame({"symbol": ["AAPL", "GOOG"], "price": [150.0, 140.0]})
df.filter(pl.col("price") > 145).select("symbol", "price")

# Lazy: optimized execution (preferred for large data)
lf = pl.scan_parquet("trades.parquet")
result = lf.filter(pl.col("volume") > 1000).group_by("symbol").agg(
    pl.col("price").mean().alias("avg_price")
).collect()
```

Rust

```toml
# Cargo.toml - select features you need
[dependencies]
polars = { version = "0.46", features = ["lazy", "parquet", "temporal"] }
```
```rust
use polars::prelude::*;

fn main() -> PolarsResult<()> {
    // Eager
    let df = df![
        "symbol" => ["AAPL", "GOOG"],
        "price" => [150.0, 140.0]
    ]?;

    // Lazy (preferred)
    let lf = LazyFrame::scan_parquet("trades.parquet", Default::default())?;
    let result = lf
        .filter(col("volume").gt(lit(1000)))
        .group_by([col("symbol")])
        .agg([col("price").mean().alias("avg_price")])
        .collect()?;
    Ok(())
}
```

Core Pattern: Expressions

Everything in Polars is an expression. Expressions are composable, lazy, and parallelized.

```python
# Expression building blocks
pl.col("price")                        # column reference
pl.col("price") * pl.col("volume")     # arithmetic
pl.col("price").mean().over("symbol")  # window function
pl.when(cond).then(a).otherwise(b)     # conditional
```

Expressions execute in contexts: `select()`, `with_columns()`, `filter()`, and `group_by().agg()`.

When to Use Lazy

| Use Lazy (`scan_*`, `.lazy()`) | Use Eager (`read_*`) |
| --- | --- |
| Large files (> RAM) | Small data, exploration |
| Complex pipelines | Simple one-off ops |
| Need query optimization | Interactive notebooks |
| Streaming required | Immediate feedback |

Lazy benefits: predicate pushdown, projection pushdown, parallel execution, streaming.

Style: Use .alias() for Column Naming

Always use .alias("name") instead of name=expr kwargs:

```python
# GOOD: Explicit .alias() - works everywhere, composable
df.with_columns(
    (pl.col("price") * pl.col("volume")).alias("value"),
    pl.col("price").mean().over("symbol").alias("avg_price")
)

# AVOID: Kwarg style - less flexible, doesn't chain
df.with_columns(
    value=pl.col("price") * pl.col("volume"),        # avoid
    avg_price=pl.col("price").mean().over("symbol")  # avoid
)
```

.alias() is explicit, chains with other methods, and works consistently in all contexts.

Anti-Patterns - AVOID

```python
# BAD: Python functions kill parallelization
df.with_columns(pl.col("x").map_elements(lambda x: x * 2))  # SLOW

# GOOD: Native expressions are parallel
df.with_columns((pl.col("x") * 2).alias("x"))  # FAST

# BAD: Row iteration
for row in df.iter_rows():  # SLOW
    process(row)

# GOOD: Columnar operations
df.with_columns(process_expr)  # FAST

# BAD: Late projection
lf.filter(...).collect().select("a", "b")  # reads all columns

# GOOD: Early projection
lf.select("a", "b").filter(...).collect()  # reads only needed columns
```

Performance Checklist

  • Using `scan_*` (lazy) for large files?
  • Projecting columns early in the pipeline?
  • Using native expressions (no `map_elements`)?
  • `Categorical` dtype for low-cardinality strings?
  • Appropriate integer sizes (`i32` vs `i64`)?
  • Streaming for out-of-memory data? (`collect(engine="streaming")`)

Reference Navigator

Python References

| Topic | File | When to Load |
| --- | --- | --- |
| Expressions, types, lazy/eager | python/core_concepts.md | Understanding fundamentals |
| Select, filter, group_by, window | python/operations.md | Common operations |
| CSV, Parquet, streaming I/O | python/io_guide.md | Reading/writing data |
| Joins, pivots, reshaping | python/transformations.md | Combining/reshaping data |
| Performance, patterns | python/best_practices.md | Optimization |

Rust References

| Topic | File | When to Load |
| --- | --- | --- |
| DataFrame, Series, ChunkedArray | rust/core_concepts.md | Rust API fundamentals |
| Expression API in Rust | rust/operations.md | Operations syntax |
| Readers, writers, streaming | rust/io_guide.md | I/O operations |
| Feature flags, crates | rust/features.md | Cargo setup |
| Allocators, SIMD, nightly | rust/performance.md | Performance tuning |
| Zero-copy, FFI, Arrow | rust/arrow_interop.md | Arrow integration |

Shared References

| Topic | File | When to Load |
| --- | --- | --- |
| SQL queries on DataFrames | sql_interface.md | SQL syntax needed |
| Query optimization, streaming | lazy_deep_dive.md | Understanding lazy engine |
| NVIDIA GPU acceleration | gpu_support.md | GPU setup/usage |

Migration Guides

| From | File | When to Load |
| --- | --- | --- |
| pandas | migration_pandas.md | Converting pandas code |
| PySpark | migration_spark.md | Converting Spark code |
| q/kdb+ | migration_qkdb.md | Converting kdb code |

Time Series / Financial Data Quick Patterns

```python
# OHLCV resampling
df.group_by_dynamic("timestamp", every="1m").agg(
    pl.col("price").first().alias("open"),
    pl.col("price").max().alias("high"),
    pl.col("price").min().alias("low"),
    pl.col("price").last().alias("close"),
    pl.col("volume").sum()
)

# Rolling statistics
df.with_columns(
    pl.col("price").rolling_mean(window_size=20).alias("sma_20"),
    pl.col("price").rolling_std(window_size=20).alias("volatility")
)

# As-of join for market data alignment
trades.join_asof(quotes, on="timestamp", by="symbol", strategy="backward")
```

Load python/best_practices.md for comprehensive time series patterns.

Runnable Examples

| Example | File | Purpose |
| --- | --- | --- |
| Financial OHLCV | examples/financial_ohlcv.py | OHLCV resampling, rolling stats, VWAP |
| Pandas Migration | examples/pandas_migration.py | Side-by-side pandas vs polars |
| Streaming Large Files | examples/streaming_large_file.py | Out-of-memory processing patterns |

Development Tips

Use LSP for navigating Polars code:

  • Python: Pyright/Pylance provides excellent type inference for Polars expressions
  • Rust: rust-analyzer understands Polars types and expression chains

LSP operations like goToDefinition and hover help explore Polars API without leaving the editor.

FAQ & Installation Steps

These questions and steps mirror the structured data on this page for better search understanding.

Frequently Asked Questions

What is polars-expertise?

Perfect for Data Analysis Agents needing high-performance DataFrame processing with Apache Arrow and Python. Distributed as part of pyref, a Python toolkit for processing and analyzing Polarized Resonant Reflectivity Data.

How do I install polars-expertise?

Run the command: `npx killer-skills add WSU-Carbon-Lab/pyref/polars-expertise`. It works with Cursor, Windsurf, VS Code, Claude Code, and 19+ other IDEs.

What are the use cases for polars-expertise?

Key use cases include: Analyzing Polarized Resonant Reflectivity Data with high-performance DataFrames, Optimizing data processing workflows with lazy evaluation and automatic parallelization, Building scalable data analysis pipelines with Python and Rust support.

Which IDEs are compatible with polars-expertise?

This skill is compatible with Cursor, Windsurf, VS Code, Trae, Claude Code, OpenClaw, Aider, Codex, OpenCode, Goose, Cline, Roo Code, Kiro, Augment Code, Continue, GitHub Copilot, Sourcegraph Cody, and Amazon Q Developer. Use the Killer-Skills CLI for universal one-command installation.

Are there any limitations for polars-expertise?

Requires Python or Rust programming knowledge. Dependent on Apache Arrow for high-performance capabilities. GPU support requires additional installation step.

How To Install

  1. Open your terminal

     Open the terminal or command line in your project directory.

  2. Run the install command

     Run: `npx killer-skills add WSU-Carbon-Lab/pyref/polars-expertise`. The CLI will automatically detect your IDE or AI agent and configure the skill.

  3. Start using the skill

     The skill is now active. Your AI agent can use polars-expertise immediately in the current project.
