mlx-vlm
Community

Run Vision Language Models locally on Apple Silicon Macs using MLX. Use when: installing mlx-vlm, running VLM inference (image + text → response), fine-tuning vision models on custom datasets, batch processing images with local AI, comparing local VLMs to cloud APIs (GPT-4V, Claude Vision), or working with LLaVA, Phi-3-Vision, Qwen2-VL, Pixtral, or Llama-3.2-Vision on a Mac.
Install
skillpm install mlx-vlm