mlx-vlm

Community

Run Vision Language Models (VLMs) locally on Apple Silicon Macs using MLX. Use when: installing mlx-vlm, running VLM inference (image + text → response), fine-tuning vision models on custom datasets, batch processing images with local AI, comparing local VLMs to cloud APIs (GPT-4V, Claude Vision), or working with LLaVA, Phi-3-Vision, Qwen2-VL, Pixtral, or Llama-3.2-Vision on a Mac.

Install

skillpm install mlx-vlm
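Once the skill is installed, the underlying Python package is used for actual inference. A minimal sketch of a first run, assuming mlx-vlm's `python -m mlx_vlm.generate` command-line entry point; the model name is one illustrative example from the mlx-community collection on Hugging Face, and the image path is a placeholder:

```shell
# Install the underlying Python package (requires an Apple Silicon Mac).
pip install mlx-vlm

# Single-image inference from the command line. Flags shown are the
# commonly documented ones; adjust the model and paths to your setup.
python -m mlx_vlm.generate \
  --model mlx-community/Qwen2-VL-2B-Instruct-4bit \
  --max-tokens 100 \
  --prompt "Describe this image." \
  --image path/to/photo.jpg
```

The 4-bit quantized model variant keeps memory use low enough for most Apple Silicon machines; larger or unquantized variants trade memory for output quality.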

Format score

100/100

Spec

v1.0

Installs

0

Published

April 11, 2026