After loading my custom model - unsupportedTokenizer error

Back in Oct '25, using mlx_lm.lora, I created a LoRA adapter, fused it into the base model, and uploaded the result to Hugging Face. I was able to incorporate this model into my SwiftUI app using the MLX Swift package (MLX libraries 2.25.8). My base LLM was mlx-community/Mistral-7B-Instruct-v0.3-4bit.

Looking at LLMModelFactory.swift in the current version (2.29.1), the only changes I can see are the addition of a few models.

The earlier model was called pharmpk/pk-mistral-7b-v0.3-4bit; the new model is called pharmpk/pk-mistral-2026-03-29.

The base model (mlx-community/Mistral-7B-Instruct-v0.3-4bit) must still be available. Could the 'unsupportedTokenizer' error be related to changes in the MLX Swift package? I noticed mention of the package being split into two parts, but I don't see anything about it on GitHub.
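One thing I plan to check: my assumption (not confirmed) is that 'unsupportedTokenizer' gets thrown when the tokenizer_class declared in a repo's tokenizer_config.json isn't one the Swift tokenizer layer recognizes. A quick way to compare the old and new repos is to read that field directly; the snapshot paths below are placeholders, not my actual local paths:

```python
import json
from pathlib import Path


def declared_tokenizer_class(repo_dir: str) -> str:
    """Read the tokenizer_class field from a repo's tokenizer_config.json."""
    config_path = Path(repo_dir) / "tokenizer_config.json"
    config = json.loads(config_path.read_text())
    return config.get("tokenizer_class", "<missing>")


# Placeholder paths -- point these at your downloaded model snapshots.
# print(declared_tokenizer_class("snapshots/pk-mistral-7b-v0.3-4bit"))
# print(declared_tokenizer_class("snapshots/pk-mistral-2026-03-29"))
```

If the two repos declare different tokenizer classes, that would at least narrow the problem down to how the new model was exported rather than to the Swift package version.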

Feeling rather lost. Does anyone have any thoughts and/or suggestions?

Thanks, David

Running the same code with MLX libraries 2.25.8 but the new model, I get the same error. I might need to revisit the new model.
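Before changing anything on the Swift side, I'll try loading the fused model in Python with mlx_lm: if it loads and generates there, the repo's files are probably fine and the problem is somewhere in the Swift tokenizer path. Roughly this (a sketch using mlx_lm's load/generate API, untested against this exact repo):

```python
# Sanity-check the fused repo outside Swift: if mlx_lm can load it and
# generate from it, the model and tokenizer files are probably intact.
from mlx_lm import load, generate

model, tokenizer = load("pharmpk/pk-mistral-2026-03-29")
print(generate(model, tokenizer, prompt="Test prompt", max_tokens=20))
```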
