Apple just dropped a bombshell: two open-source vision-language models, FastVLM and MobileCLIP2, are now available on Hugging Face, built on Apple's MLX framework for fast on-device AI. In this quick 1:30 tech update we break down what FastVLM and MobileCLIP2 can do (instant subtitles, accurate captions, video and image analysis), why FastVLM is reported to be up to 85x faster at producing its first token with a 3.4x smaller vision encoder, and what this means ahead of Apple's Sept 9 event and possible iPhone 17 integrations. Developers can test these models directly on Apple hardware without cloud dependencies, a bold move for privacy and performance.

If you enjoyed this update, please like and share!

#Apple #AI #FastVLM #MobileCLIP2 #HuggingFace #iPhone17 #OnDeviceAI
@Apple @AppleBrasil @AppleMexico @AppleThailand

Subscribe Now: / @profilenews
Join our WhatsApp Community for exclusive updates: https://chat.whatsapp.com/FizEyvGiwoS...
Facebook: / 18zpyrcjrm
Instagram: https://www.instagram.com/profilenews...
TikTok: https://www.tiktok.com/@pn_profilenew...
Twitter (X): https://x.com/profilenews_us?s=09
Website: https://www.profilenews.com