A transformer model with approximately 25,000 parameters runs on an unmodified Commodore 64, implemented in hand-written 6502/6510 assembly. The 2-layer decoder-only architecture features real multi-head causal self-attention, softmax, and RMSNorm, and loads from a floppy disk.
github.com
5 min
4h ago
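The described architecture (2-layer decoder-only transformer with multi-head causal self-attention, softmax, and RMSNorm) can be sketched in a few dozen lines. The sketch below is illustrative only: all dimensions (model width 16, 2 heads, vocabulary of 64, max sequence of 32), the ReLU MLP, learned positional embeddings, and the tied output head are assumptions, not details taken from the C64 implementation, which works in 6502 assembly with its own fixed-point arithmetic.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, V, L, T_MAX = 16, 2, 64, 2, 32  # model dim, heads, vocab, layers, max seq (all assumed)

def rmsnorm(x, g, eps=1e-5):
    # RMSNorm: rescale by root-mean-square of activations (no mean subtraction).
    return x / np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps) * g

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention(x, p):
    # Multi-head causal self-attention over a (T, D) sequence of activations.
    T, d = x.shape
    hd = d // H
    split = lambda W: (x @ W).reshape(T, H, hd).transpose(1, 0, 2)  # (H, T, hd)
    q, k, v = split(p["Wq"]), split(p["Wk"]), split(p["Wv"])
    s = q @ k.transpose(0, 2, 1) / np.sqrt(hd)
    s = np.where(np.triu(np.ones((T, T), dtype=bool), 1), -np.inf, s)  # causal mask
    o = softmax(s) @ v
    return o.transpose(1, 0, 2).reshape(T, d) @ p["Wo"]

def block(x, p):
    # Pre-norm residual block: attention, then a ReLU MLP (nonlinearity assumed).
    x = x + attention(rmsnorm(x, p["g1"]), p)
    h = np.maximum(rmsnorm(x, p["g2"]) @ p["W1"], 0)
    return x + h @ p["W2"]

def init_block():
    w = lambda *s: rng.normal(0, 0.02, s)
    return {"Wq": w(D, D), "Wk": w(D, D), "Wv": w(D, D), "Wo": w(D, D),
            "W1": w(D, 4 * D), "W2": w(4 * D, D),
            "g1": np.ones(D), "g2": np.ones(D)}

emb = rng.normal(0, 0.02, (V, D))       # token embeddings
pos = rng.normal(0, 0.02, (T_MAX, D))   # learned positional embeddings (assumed)
blocks = [init_block() for _ in range(L)]

def forward(tokens):
    x = emb[tokens] + pos[: len(tokens)]
    for p in blocks:
        x = block(x, p)
    x = rmsnorm(x, np.ones(D))
    return softmax(x @ emb.T)  # tied output head -> next-token probabilities

probs = forward(np.array([3, 1, 4, 1, 5]))  # one probability row per position
```

At these toy sizes the parameter count is well under 25,000; scaling the width and vocabulary up would bring it into that range, but the real model's split between embeddings and layers is not stated in the post.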