MacMind is a complete transformer neural network implemented entirely in HyperTalk. Embeddings, positional encoding, self-attention, backpropagation, and gradient descent, all running inside a HyperCard stack.
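For readers who want the core mechanism in a familiar notation, scaled dot-product self-attention can be sketched in a few lines of NumPy. This is an illustration of the standard formulation, not code from the stack or from the repo's Python reference implementation; the dimensions are arbitrary.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model); project into queries, keys, values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # scaled dot-product
    return softmax(scores) @ V               # attention-weighted sum of values

# toy example: 4 tokens, model width 8
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Every one of these matrix multiplies has to be done element by element in HyperTalk, which is where the slowness comes from.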
The task is learning the bit-reversal permutation, the opening step of the FFT. The model discovers the pattern purely through attention and repeated trial and error. By step 193 it was already oscillating between 50%, 75%, and 100% accuracy on successive steps before settling into convergence.
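The target function itself is simple to state: for sequence length 2^k, position i maps to the index obtained by reversing the low k bits of i. A minimal Python sketch of the permutation the model has to learn:

```python
def bit_reverse(i, bits):
    """Reverse the low `bits` bits of the integer i."""
    r = 0
    for _ in range(bits):
        r = (r << 1) | (i & 1)  # shift the lowest bit of i onto r
        i >>= 1
    return r

# the permutation for 8 elements (3 bits)
perm = [bit_reverse(i, 3) for i in range(8)]
print(perm)  # [0, 4, 2, 6, 1, 5, 3, 7]
```

The model never sees this formula; it only sees input/output pairs and has to recover the mapping through gradient descent.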
The "intelligence" is 1,216 numbers stored in hidden fields. Save the stack, quit, reopen: the trained model is still there. Option-click any button and read the actual math in the script editor.
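Since HyperCard fields hold plain text, persistence amounts to writing the weights out as numbers and parsing them back on load. The exact field layout in the stack is not specified here; the following is a hypothetical sketch of that round-trip in Python.

```python
def weights_to_text(weights):
    # serialize a flat parameter list as plain text, one number per line
    # (the way a HyperCard field might store it; layout is an assumption)
    return "\n".join(repr(w) for w in weights)

def text_to_weights(text):
    # parse the text blob back into floats
    return [float(line) for line in text.splitlines()]

params = [0.25, -1.5, 3.0]
blob = weights_to_text(params)
restored = text_to_weights(blob)
print(restored == params)  # True: repr/float round-trips exactly
```

Because the numbers live in ordinary fields, saving the stack saves the model for free.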
Built for HyperCard 2.1, System 7 through Mac OS 9. It's a bit slow but correct.
GitHub has a pre-trained stack, a blank stack you can train yourself, and a Python reference implementation that validates the math:
github.com/SeanFDZ/macmind: single-layer transformer in HyperTalk for the classic Macintosh
