Notes from the lab

Research, engineering write-ups, and field notes from the team. Full posts are on the way; here's what's coming.

Research

Why we rebuilt attention from scratch

Most transformers inherit quadratic attention almost by accident. We took the problem back to first principles and arrived somewhere different.

Coming soon

Engineering

Hierarchical tokenization explained

How a two-stage tokenizer preserves both coarse meaning and fine-grained nuance while keeping the parameter budget small.

Coming soon

Field notes

Small models, real work

Case studies from teams running Kronos Mini on edge devices — latency, privacy, and the tradeoffs that actually matter.

Coming soon

Want the posts as they land? Create an account and we'll let you know.

Get Early Access