Hugging Face AI Updates: April 17, 2026
1. transformers-to-mlx Claude Code Skill Automates Model Ports to Apple Silicon
Hugging Face. Pedro Cuenca, Awni Hannun, and the MLX community released a Claude Code Skill that automates porting models from transformers to mlx-lm. The skill handles virtual-environment setup, downloads models, reads the Transformers modeling code, writes an MLX implementation, runs tests, and iterates until they pass, including gnarly parts like RoPE configuration and dtype inference from safetensors metadata. A separate, reproducible test harness performs per-layer comparisons without LLM involvement, producing PR-ready code plus transparent reports for reviewers. Install via uvx hf skills add --claude. A practical example of Skills shifting agentic workflows from research demos into maintainable open-source contribution pipelines. Source
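The core idea behind the harness (comparing a reference implementation against a port layer by layer, outside the LLM loop) can be sketched in plain Python. This is an illustrative sketch, not the skill's actual code: real harnesses compare model activation tensors, and the function name, tolerance, and toy numbers below are assumptions.

```python
# Hypothetical per-layer comparison: given per-layer activations from a
# reference model and a ported model, report the max absolute difference
# per layer and flag any layer that exceeds the tolerance.
def compare_layers(reference, candidate, atol=1e-3):
    """reference/candidate: lists of per-layer activations (flat float lists).
    Returns a list of (layer_index, max_abs_diff, within_tolerance)."""
    report = []
    for idx, (ref, cand) in enumerate(zip(reference, candidate)):
        max_diff = max(abs(r - c) for r, c in zip(ref, cand))
        report.append((idx, max_diff, max_diff <= atol))
    return report

# Toy example: layer 0 matches, layer 1 drifts beyond tolerance.
ref_acts  = [[0.10, -0.20], [0.50, 0.75]]
port_acts = [[0.10, -0.20], [0.50, 0.80]]
for idx, diff, ok in compare_layers(ref_acts, port_acts):
    print(f"layer {idx}: max|diff|={diff:.4f} {'OK' if ok else 'MISMATCH'}")
```

A report like this is what makes the output reviewable: a mismatch pinpoints the first diverging layer (often RoPE or dtype handling) rather than just a failing end-to-end generation.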
2. Training Multimodal Embedding and Reranker Models with Sentence Transformers
Hugging Face. The Sentence Transformers team published a hands-on guide to fine-tuning multimodal embedding and reranker models. The post is aimed at practitioners building retrieval systems over text plus image data, covering training data preparation, loss function selection, and evaluation — a timely addition as multimodal RAG adoption grows. Source
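A central choice the guide covers, the loss function, usually comes down to some form of in-batch-negatives contrastive loss (the family Sentence Transformers implements as MultipleNegativesRankingLoss). The sketch below shows the math in plain Python, not the library's API; the scale factor and toy embeddings are made-up assumptions.

```python
# Hedged sketch of an in-batch-negatives contrastive loss: each query_i
# should score highest against its paired document_i, with every other
# document in the batch serving as a negative.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def in_batch_negatives_loss(queries, documents, scale=20.0):
    """Mean cross-entropy over the batch; logits are scaled cosine similarities."""
    losses = []
    for i, q in enumerate(queries):
        logits = [scale * cosine(q, d) for d in documents]
        log_z = math.log(sum(math.exp(x) for x in logits))
        losses.append(log_z - logits[i])  # -log softmax probability of the positive
    return sum(losses) / len(losses)

# Toy batch: two query embeddings, each paired with a matching document.
qs = [[1.0, 0.0], [0.0, 1.0]]
ds = [[0.9, 0.1], [0.1, 0.9]]
print(in_batch_negatives_loss(qs, ds))  # near zero: pairs are well aligned
```

The same objective extends to the multimodal case by swapping the text encoder for an image (or mixed) encoder on one side; rerankers instead score each query-document pair jointly and are trained with pointwise or listwise ranking losses.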