MLX Swift LM is a Swift package to build tools and applications with large language models (LLMs) and vision language models (VLMs) in MLX Swift.
Some key features include:
- Integration with the Hugging Face Hub to easily use thousands of LLMs with a single line of code.
- Low-rank (LoRA) and full model fine-tuning with support for quantized models.
- Many model architectures for both LLMs and VLMs.
For example applications and tools that use MLX Swift LM, check out the MLX Swift Examples.
The MLXLLM, MLXVLM, MLXLMCommon, and MLXEmbedders libraries are available as Swift Packages.
Add the following dependency to your `Package.swift`:

```swift
.package(url: "https://github.com/ml-explore/mlx-swift-lm/", branch: "main"),
```

or use the latest release:

```swift
.package(url: "https://github.com/ml-explore/mlx-swift-lm/", .upToNextMinor(from: "2.29.1")),
```

Then add one or more libraries to the target as a dependency:

```swift
.target(
    name: "YourTargetName",
    dependencies: [
        .product(name: "MLXLLM", package: "mlx-swift-lm")
    ]),
```

Alternatively, add https://github.com/ml-explore/mlx-swift-lm/ to the
Project Dependencies and set the Dependency Rule to Branch and main in
Xcode.
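For orientation, a complete minimal `Package.swift` might look like the sketch below. The package name, target name, and platform versions are placeholders to adjust for your project:

```swift
// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyLLMTool",  // hypothetical package name
    platforms: [
        .macOS(.v14), .iOS(.v16)  // assumed deployment targets; set these for your app
    ],
    dependencies: [
        .package(url: "https://github.com/ml-explore/mlx-swift-lm/", .upToNextMinor(from: "2.29.1"))
    ],
    targets: [
        .executableTarget(
            name: "MyLLMTool",
            dependencies: [
                .product(name: "MLXLLM", package: "mlx-swift-lm")
            ])
    ]
)
```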
See also MLXLMCommon. You can get started with a wide variety of open-weight LLMs and VLMs using this simplified API:

```swift
let model = try await loadModel(id: "mlx-community/Qwen3-4B-4bit")
let session = ChatSession(model)
print(try await session.respond(to: "What are two things to see in San Francisco?"))
print(try await session.respond(to: "How about a great place to eat?"))
```

The `ChatSession` keeps the conversation history, so the follow-up question can refer to the earlier exchange. Or use the underlying API to control every aspect of the evaluation.
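As a rough sketch of that lower-level path, the factory/container API in MLXLMCommon can be driven directly. Treat the details here (for example the `GenerateParameters` fields) as assumptions that may differ between releases:

```swift
import MLXLLM
import MLXLMCommon

// Load the model into a container that serializes access to it.
let container = try await LLMModelFactory.shared.loadContainer(
    configuration: ModelConfiguration(id: "mlx-community/Qwen3-4B-4bit"))

let result = try await container.perform { context in
    // Tokenize and prepare the prompt for this model.
    let input = try await context.processor.prepare(
        input: UserInput(prompt: "What are two things to see in San Francisco?"))

    // Generate with explicit sampling parameters, observing tokens as they arrive.
    return try MLXLMCommon.generate(
        input: input,
        parameters: GenerateParameters(temperature: 0.6),  // assumed parameter value
        context: context
    ) { tokens in
        .more  // keep generating; return .stop to end early
    }
}

print(result.output)
```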
Developers can use these examples in their own programs -- just import the Swift package! A short import sketch follows the list below.
- Porting and implementing models
- MLXLMCommon -- common API for LLMs and VLMs
- MLXLLM -- large language model example implementations
- MLXVLM -- vision language model example implementations
- MLXEmbedders -- example implementations of popular encoder / embedding models
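For instance, a program that uses the simplified API together with the LLM implementations would import the corresponding products (a minimal sketch; which modules you need depends on the models you use):

```swift
import MLXLMCommon  // shared API: loadModel, ChatSession, and friends
import MLXLLM       // LLM model implementations
// import MLXVLM        // add for vision language models
// import MLXEmbedders  // add for encoder / embedding models
```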