HF Trackio
Track and visualize ML training experiments with Trackio — log metrics via the Python API, fire training alerts, and retrieve logged metrics in real-time dashboards.
Skills for AI model inference, RAG pipelines, image generation, and ML operations
Build reusable CLI scripts for Hugging Face API operations — chain API calls and automate repeated tasks.
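A minimal sketch of the chained-API-call pattern such a script would use: search the public Hub REST API (`https://huggingface.co/api/models`) for models matching a query, then fetch details for the first hit. The script name, flags, and helper names are illustrative, not part of any official tool.

```python
"""Illustrative CLI: search Hub models, then fetch details for the top hit.
Usage (hypothetical): python hf_search.py llama --limit 3"""
import argparse
import json
import urllib.parse
import urllib.request

API_BASE = "https://huggingface.co/api"  # public Hugging Face Hub REST API


def build_search_url(query: str, limit: int) -> str:
    """Compose the model-search endpoint URL from CLI arguments."""
    params = urllib.parse.urlencode({"search": query, "limit": limit})
    return f"{API_BASE}/models?{params}"


def fetch_json(url: str):
    """Perform a GET request and decode the JSON body (requires network)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def main(argv=None):
    parser = argparse.ArgumentParser(description="Search Hub models, then fetch details")
    parser.add_argument("query", help="search string, e.g. 'llama'")
    parser.add_argument("--limit", type=int, default=5, help="max search results")
    args = parser.parse_args(argv)

    # Chain two API calls: search, then look up the first result's full record.
    hits = fetch_json(build_search_url(args.query, args.limit))
    if hits:
        detail = fetch_json(f"{API_BASE}/models/{hits[0]['id']}")
        print(json.dumps(detail, indent=2))
```

Keeping URL construction in its own function makes each step of the chain reusable from other scripts, which is the point of this skill.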
Publish and manage research papers on Hugging Face Hub — create paper pages, link to models/datasets, claim authorship.
Train or fine-tune language models using TRL on Hugging Face Jobs — SFT, DPO, GRPO, reward modeling, GGUF conversion, and Trackio monitoring.
Run workloads on Hugging Face Jobs infrastructure — UV scripts, Docker-based jobs, hardware selection, cost estimation, and secrets management.
Add and manage evaluation results in Hugging Face model cards — extract eval tables, import scores, and run custom evaluations.
Create and manage datasets on Hugging Face Hub — initialize repos, define configs, stream row updates, and run SQL-based queries.
Execute Hugging Face Hub operations — download models/datasets/spaces, upload files, create repos, manage local cache, and run compute jobs.
Build Gradio web UIs and demos in Python — create apps, components, event listeners, layouts, and chatbots.
Guide for Netlify AI Gateway — access AI models from OpenAI, Anthropic, and Google via a unified proxy without managing API keys directly.
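The proxy pattern described here can be sketched as building an OpenAI-style chat request against a gateway base URL instead of a provider endpoint, so no provider API key lives in the client. The base URL and environment-variable name below are placeholders for illustration, not Netlify's documented values.

```python
"""Sketch of calling a model through a unified AI gateway proxy.
GATEWAY_BASE and the AI_GATEWAY_URL variable are placeholders, not official names."""
import json
import os
import urllib.request

GATEWAY_BASE = os.environ.get("AI_GATEWAY_URL", "https://gateway.example.com/v1")


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat-completion request.

    The gateway injects provider credentials server-side, so the client
    attaches no OpenAI/Anthropic/Google key — only the JSON payload.
    """
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{GATEWAY_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def send(req: urllib.request.Request) -> dict:
    """Send the request and decode the JSON response (requires a live gateway)."""
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the request shape is OpenAI-compatible, switching providers behind the gateway means changing only the `model` string, not the client code.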
MCP server for Amazon Nova Canvas — generate, edit, and manipulate images using the Nova foundation model on Bedrock.
MCP server for Amazon Bedrock Knowledge Bases — query RAG-powered knowledge bases with foundation model retrieval and generation.
MCP server for Google Vertex AI Creative Studio — generate images, edit media, and create visual content using Vertex AI models.
MCP server for Cloudflare AutoRAG — build and query retrieval-augmented generation pipelines with Vectorize and Workers AI.
MCP server for Cloudflare AI Gateway — manage AI request routing, caching, rate limiting, and observability across LLM providers.