Research
Research, engineering write-ups, and field notes from the team. Proper posts are on the way — here’s what’s coming.
Coming soon
Why we rebuilt attention from scratch
Most transformers inherit quadratic attention almost by accident. We took the problem back to first principles and arrived somewhere different.
How a two-stage tokenizer preserves both coarse meaning and fine-grained nuance while keeping the parameter budget small.
Case studies from teams running Kronos Mini on edge devices — latency, privacy, and the tradeoffs that actually matter.
Want the posts as they land? Create an account and we’ll let you know.
Get Early Access