ai-pipeline
ai-pipeline is a CI/CD pipeline automation skill that generates and validates workflow files, ensuring compliance with security and quality standards.
Browse and install thousands of AI Agent skills in the Killer-Skills directory. Supports Claude Code, Windsurf, Cursor, and more.
BioETL is a data processing framework for acquiring, normalizing, and validating bioactivity-related datasets from multiple external sources.
book-sft-pipeline is a complete system for converting books into SFT (supervised fine-tuning) datasets and training style-transfer models, with text segmentation pipelines for long-form content.
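The core of any long-form segmentation step is splitting text into bounded chunks without cutting paragraphs apart. A minimal sketch of that idea (the size limit and paragraph heuristic are assumptions for illustration, not book-sft-pipeline's actual logic):

```python
# Hypothetical segmentation sketch: split long-form text into chunks of
# at most max_chars characters, breaking only on paragraph boundaries.
def segment(text: str, max_chars: int = 500) -> list[str]:
    chunks, current = [], ""
    for para in text.split("\n\n"):
        para = para.strip()
        if not para:
            continue
        # start a new chunk when adding this paragraph would overflow
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks

print(len(segment("A" * 300 + "\n\n" + "B" * 300)))  # → 2
```

A real pipeline would likely segment on tokens rather than characters, but the boundary-respecting accumulation loop is the same.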
cicd-pipeline is a skill that automates continuous integration and deployment using GitHub Actions, facilitating build checks, automated test runs, and production deployment automation.
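A workflow of the sort this skill automates can be sketched as plain GitHub Actions YAML (the file path, job names, and commands below are illustrative assumptions, not the skill's generated output):

```yaml
# .github/workflows/ci.yml — illustrative sketch only
name: ci
on:
  push:
    branches: [main]
  pull_request:

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt   # build check
      - run: pytest                            # automated test run

  deploy:
    needs: build-and-test
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - run: echo "production deployment step goes here"
```

The `needs:` and `if:` guards encode the usual gate: deployment runs only after tests pass and only on the main branch.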
rdf-ttl-pipeline is a skill that converts JSON data to TTL format and checks ontology consistency, ensuring data integrity and accuracy.
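To make the JSON-to-TTL conversion concrete, here is a minimal stdlib-only sketch that flattens a JSON object into Turtle triples. The `ex:` prefix, base IRI, and key-to-predicate mapping are illustrative assumptions (a real converter would more likely use a library such as rdflib and a proper ontology mapping):

```python
import json

BASE = "http://example.org/"  # assumed base IRI for illustration

def json_to_ttl(subject_id: str, record: dict) -> str:
    """Emit one Turtle triple per scalar field of a JSON record."""
    lines = [f"@prefix ex: <{BASE}> ."]
    subject = f"ex:{subject_id}"
    for key, value in record.items():
        if isinstance(value, bool):          # check bool before int
            obj = "true" if value else "false"
        elif isinstance(value, str):
            obj = json.dumps(value)          # quoted, escaped literal
        elif isinstance(value, (int, float)):
            obj = str(value)
        else:
            continue                         # nested structures skipped in this sketch
        lines.append(f"{subject} ex:{key} {obj} .")
    return "\n".join(lines)

print(json_to_ttl("compound42", {"name": "aspirin", "active": True, "ic50": 1.2}))
```

Consistency checking against an ontology is a separate step the sketch does not cover; the point here is only the shape of the JSON-to-triples mapping.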
Learn to use opencode and speckit to build data pipelines.
Pluggable sample-level metadata versioning for incremental multimodal pipelines.
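One common way to implement sample-level versioning for incremental runs is to content-hash each sample's metadata and reprocess only samples whose hash changed. A sketch of that idea under assumed names (none of this is the skill's actual API):

```python
import hashlib
import json

def metadata_version(meta: dict) -> str:
    """Deterministic version string: hash of the canonicalized metadata."""
    blob = json.dumps(meta, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()[:12]

def changed_samples(previous: dict[str, str], current_meta: dict[str, dict]) -> list[str]:
    """Sample ids whose metadata version differs from the stored one."""
    return [
        sid for sid, meta in current_meta.items()
        if previous.get(sid) != metadata_version(meta)
    ]

prev = {"s1": metadata_version({"label": "cat"})}
now = {"s1": {"label": "cat"}, "s2": {"label": "dog"}}
print(changed_samples(prev, now))  # → ['s2']
```

Hashing canonicalized JSON makes the version independent of key order, which is what lets an incremental multimodal pipeline skip untouched samples cheaply.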
kafka-producer-pattern is an asynchronous Kafka producer implementation for FastAPI, ensuring mandatory user isolation and efficient data processing.
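The essence of the pattern, mandatory per-user isolation on an async send path, can be sketched with a stand-in producer (a real implementation would use an async Kafka client such as aiokafka's `AIOKafkaProducer`; the topic name and helper below are assumptions):

```python
import asyncio
import json

class AsyncProducer:
    """Stand-in for an async Kafka client, recording sends in memory."""
    def __init__(self):
        self.sent: list[tuple[str, bytes, bytes]] = []

    async def send(self, topic: str, key: bytes, value: bytes) -> None:
        await asyncio.sleep(0)  # yield control, as a real network send would
        self.sent.append((topic, key, value))

async def publish_event(producer: AsyncProducer, user_id: str, event: dict) -> None:
    # Mandatory isolation: refuse to publish anything without a user id.
    if not user_id:
        raise ValueError("user_id is mandatory: events must be isolated per user")
    await producer.send(
        topic="user-events",            # illustrative topic name
        key=user_id.encode(),           # partition key = user -> per-user ordering
        value=json.dumps(event).encode(),
    )

async def main() -> None:
    producer = AsyncProducer()
    await publish_event(producer, "alice", {"action": "login"})
    print(producer.sent[0][1])  # → b'alice'

asyncio.run(main())
```

Keying every message by `user_id` routes one user's events to a single partition, which is how Kafka gives per-user ordering; the upfront `ValueError` makes the isolation requirement impossible to bypass silently.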
dbt-fabric-notebook is a skill that enables running dbt pipelines inside Microsoft Fabric Python notebooks using DuckDB as the compute engine and DuckLake for Delta Lake table management.