Learn how to use MLX, Apple’s machine-learning framework, with local models to build a powerful on-device search engine for your text and images.
https://github.com/preternatural-explore/omt-conf-2025
Prerequisites
- Xcode 16.4
- macOS 15.0 (Sequoia) or later
Technologies
- MLX Swift: a Swift API built on MLX, a high-performance numerical computing framework optimized for Apple silicon.
- Conceptually similar to libraries like NumPy or PyTorch, but tailored specifically for Swift and Apple hardware.
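As a minimal sketch of the NumPy-like feel (array creation and operator names follow the MLX Swift API, but exact signatures may vary between versions):

```swift
import MLX

// Create a 2x3 array from a range, NumPy-style.
let a = MLXArray(0 ..< 6, [2, 3])

// Operations build a lazy computation graph; nothing runs yet.
let b = (a * 2).sum(axis: 1)

// Evaluation is triggered when values are needed, e.g. by printing.
print(b)
```

Like PyTorch, MLX evaluates lazily, and its unified-memory model means arrays live in memory shared by the CPU and GPU on Apple silicon.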
MLX Swift Examples (GitHub)
- Complementary repository containing practical implementations built with MLX Swift.
- Demonstrates real-world applications:
- Embedding models.
- Large Language Models (LLMs).
- Vision-Language Models (VLMs).
- Stable Diffusion and other popular ML architectures.
- Serves as a ready-to-use reference for integrating MLX Swift into actual Swift projects.
MLXEmbedders (part of mlx-swift-examples)
- Provides implementations of popular embedding models pre-ported into Swift.
- Allows straightforward integration of embedding models into your Swift projects.
- Facilitates text-to-embedding conversions:
- Load pre-trained embedding models (e.g., Nomic’s text embeddings).
- Tokenize text inputs with automatic padding for consistent length.
- Generate normalized embedding vectors useful for similarity search and semantic matching tasks.
- Simplifies embedding workflows by offering standardized methods and reusable components.
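The load/tokenize/embed workflow above might look roughly like the following sketch, modeled on the MLXEmbedders API in mlx-swift-examples. The configuration name, the `perform` closure shape, and the pooling call are assumptions based on that repository; check it for the exact signatures.

```swift
import MLX
import MLXEmbedders

// Sketch: embed a batch of sentences with a pre-ported Nomic text model.
// (Assumed API; verify against the mlx-swift-examples repository.)
func embed(_ texts: [String]) async throws -> [[Float]] {
    // Load a pre-trained embedding model (weights download on first use).
    let container = try await MLXEmbedders.loadModelContainer(
        configuration: .nomic_text_v1_5)

    return await container.perform { (model, tokenizer, pooling) in
        // Tokenize each input.
        let encoded = texts.map {
            tokenizer.encode(text: $0, addSpecialTokens: true)
        }

        // Pad every sequence to the longest one so they batch cleanly.
        let maxLength = encoded.map(\.count).max() ?? 0
        let padId = tokenizer.eosTokenId ?? 0
        let padded = stacked(encoded.map {
            MLXArray($0 + Array(repeating: padId, count: maxLength - $0.count))
        })
        let mask = padded .!= padId

        // Run the model, then pool into normalized embedding vectors
        // suitable for cosine-similarity search.
        let output = pooling(
            model(padded, positionIds: nil, tokenTypeIds: nil,
                  attentionMask: mask),
            normalize: true, applyLayerNorm: true)
        return output.map { $0.asArray(Float.self) }
    }
}
```

Because the returned vectors are normalized, a dot product between two of them is their cosine similarity, which is what a semantic-search index would rank by.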