MLX
MLX is a NumPy-like array framework designed for efficient and flexible machine learning on Apple silicon, brought to you by Apple machine learning research.
Here are 673 public repositories matching this topic...
The open-source voice synthesis studio powered by Qwen3-TTS.
Updated Feb 23, 2026 - TypeScript
Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TensorFlow, and others)
Updated Feb 20, 2026 - Python
A text-to-speech (TTS), speech-to-text (STT) and speech-to-speech (STS) library built on Apple's MLX framework, providing efficient speech analysis on Apple Silicon.
Updated Mar 10, 2026 - Python
The open-source research environment for AI researchers to seamlessly train, evaluate, and scale models from local hardware to GPU clusters.
Updated Mar 10, 2026 - Python
AI edge infrastructure for macOS. Run local or cloud models, share tools across apps via MCP, and power AI workflows with a native, always-on runtime.
Updated Mar 10, 2026 - Swift
LLM inference server with continuous batching and SSD caching for Apple Silicon, managed from the macOS menu bar
Updated Mar 11, 2026 - Python
MLX-VLM is a package for inference and fine-tuning of Vision Language Models (VLMs) on your Mac using MLX.
Updated Mar 7, 2026 - Python
MLX native implementations of state-of-the-art generative image models
Updated Mar 9, 2026 - Python
This repository provides the code and model checkpoints for AIMv1 and AIMv2 research projects.
Updated Aug 4, 2025 - Python
AirPods as an AI posture coach on iOS (launching 2026)
Updated Feb 19, 2026
🤖✨ ChatMLX is a modern, open-source, high-performance chat application for macOS based on large language models.
Updated Mar 12, 2025 - Swift
Optimized Whisper models for streaming and on-device use
Updated Mar 1, 2026 - Python
Bringing the Unsloth experience to Mac users via Apple's MLX framework
Updated Mar 7, 2026 - Python
MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. It implements OpenAI-compatible API endpoints, enabling seamless integration with existing OpenAI SDK clients while leveraging the power of local ML inference.
Updated Mar 10, 2026 - Python
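Because MLX Omni Server speaks the OpenAI wire format, any HTTP client can talk to it. The sketch below builds a minimal chat-completions request with only the Python standard library; the port number and model name are illustrative assumptions, not values taken from the project's documentation.

```python
import json
import urllib.request

# Assumed local endpoint; check the server's own docs for its actual port.
BASE_URL = "http://localhost:10240/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble a minimal OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def prepare_post(payload: dict) -> urllib.request.Request:
    """Prepare (but do not send) the HTTP POST for the payload."""
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical model identifier, for illustration only.
payload = build_chat_request("mlx-community/example-model", "Hello!")
request = prepare_post(payload)
```

Since the endpoints are OpenAI-compatible, the official `openai` Python SDK should also work by pointing its `base_url` at the local server instead of api.openai.com.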