Halcyonic Systems · Research Series
On the kind of knowledge computation cannot produce. A three-part investigation into what today's most powerful tools actually compute, where they stop, and what fills the gap.
Every major AI paradigm learns structure from data. Systems models assert structure from theory. Herbert Simon knew the difference before the field had a name. This essay traces a 57-year arc from Simon's Sciences of the Artificial to a working systems specification language, and argues that authored ontological commitment is a fundamentally different kind of knowledge than statistical learning.
Read the essay

Recognition, prediction, generation, reasoning, decision, discovery. Six capabilities, cataloged honestly: what each class of computational tool actually computes, the hard ceiling defined by its mathematical operation, and the question none of them ask. The gap is not more power. It is an authored commitment: a human who says this is what I believe this system is and accepts the consequences of being wrong.
Read the catalog

Neuromorphic hardware computes using physics that already remembers, already decays, already adapts. From Intel's silicon to honey-and-carbon-nanotube memristors. Six functions compared across three substrate classes, with maturity assessments and honest limits. The pattern: the substrate changes what's efficient, not what's knowable. The ceiling is epistemic, not computational.
Read the comparison