I built a local AI assistant that runs natively on Mac OS 9 - custom C89 inference engine, Speech Manager integration, AppleScript automation

Hey everyone! I wanted to share a project I've been building. MacinAI Local is a fully offline AI assistant that runs natively on classic Macs. No internet, no cloud, just a custom C89 engine doing transformer inference directly on the hardware.

The demo was recorded on my PowerBook G4 Titanium running OS 9.2.2. The assistant runs five different models, including GPT-2, Qwen 2.5, and a billion-parameter TinyLlama (with disk paging).

It can actually do useful things beyond just chatting - it generates AppleScript to launch apps, move files, and automate tasks, with a confirmation dialog before it runs anything. It also speaks every response using the Mac's built-in PlainTalk voices.

The inference engine is written in C89 targeting the Mac Toolbox, with AltiVec SIMD optimization that gets 2.66 tokens/sec on the G4. Runs on anything from Mac OS 9 down to System 7.5.3. Fast? No. Possible? Absolutely.

Next milestone: getting it running on a 68040 Mac from 1993.

Demo video:

Technical write-up/downloads: https://oldapplestuff.com/blog/MacinAI-Local/

Thanks for the read!