Failing information states
When information structures don't work
You'd think it would be easy to see when information structures fail. It's information, right? So when it fails, it should be filled with lies. Right?
Nope.
When information structures fail, it's because of gaps, misaligned data focus, and/or a skewed connectome. Gaps mean there are missing data nodes that would harbor highly relevant information. Misaligned focus means you are including information that hasn't found its meaning and function in the problem solving (potentially "yet"). The connectome is the set of conduits for information, so when it skews you can no longer get to the nodes. Each failure tips the system of information into stress along at least one of the system's connectome aspects: flow, expansion/contraction, or elasticity; and often, also, in terms of resource tolerance.
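If it helps to make those three failure modes concrete, here is a minimal sketch, assuming you model an information structure as a plain graph of nodes and connectome conduits. Everything in it is illustrative and hypothetical; it's not a real tool or anyone's production schema.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    relevant: bool = True               # has it found meaning/function in the problem solving?
    metadata: dict = field(default_factory=dict)

@dataclass
class InfoStructure:
    nodes: dict = field(default_factory=dict)        # name -> Node
    connectome: set = field(default_factory=set)     # (from_name, to_name) conduits

    def failure_modes(self, needed):
        """Report the three failure modes: gaps, misaligned focus, skewed connectome."""
        gaps = sorted(n for n in needed if n not in self.nodes)
        misaligned = [name for name, node in self.nodes.items() if not node.relevant]
        # A conduit is skewed when it points at a node you can't actually reach.
        skewed = [f"{a}->{b}" for a, b in self.connectome
                  if a not in self.nodes or b not in self.nodes]
        return {"gaps": gaps, "misaligned_focus": misaligned, "skewed_connectome": skewed}

structure = InfoStructure(
    nodes={
        "pricing": Node("pricing"),
        "legacy_codes": Node("legacy_codes", relevant=False),   # included, but not meaningful here
    },
    connectome={("pricing", "contracts")},                      # points at a node that doesn't exist
)
print(structure.failure_modes(needed={"pricing", "contracts"}))
# {'gaps': ['contracts'], 'misaligned_focus': ['legacy_codes'], 'skewed_connectome': ['pricing->contracts']}
```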
So when information structures are nearing a failure state, the easiest thing to see is that the information is no longer hanging together; or, to make it hang together, you have to add more and more metadata until the structure oscillates/wobbles and starts disintegrating into unusefulness. In my mind, it feels very much like Bouguer's metacenter concept, which I finally really grasped when reading Neal Stephenson's description of it in The Confusion.
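One way to watch for that metadata creep, building on the toy structure above and just as hypothetical: track how much metadata the average node is carrying over time, and treat a steady climb as the start of the wobble.

```python
def metadata_load(structure):
    """Average metadata fields per node: a rough proxy for how much scaffolding
    is being bolted on just to keep the structure hanging together."""
    if not structure.nodes:
        return 0.0
    return sum(len(node.metadata) for node in structure.nodes.values()) / len(structure.nodes)

def is_wobbling(history, window=3):
    """Flag a structure whose metadata load has risen across the last few snapshots."""
    recent = history[-window:]
    return len(recent) == window and all(a < b for a, b in zip(recent, recent[1:]))

# Snapshots of metadata_load taken over successive releases (made-up numbers).
print(is_wobbling([1.2, 1.9, 2.7, 3.6]))   # True: the scaffolding keeps growing
```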
Information structures need to hang together to support findability, navigation, and process, and so that people can trust information systems.
Information structures need to be kept out of a disintegrating oscillation to be able to shift along the system aspects as information changes through time. The most robust information structures can account for emergence. They are very, very rare.
I've seen so many places where culture is used to prop up a failed information structure. Most frequently in my work, it's in navigation. Navigation so frequently mirrors internal organization (Conway's Law) that it's the first thing I look to, and was even before I knew Conway's Law as a named concept. When it's really bad, there are often some pretty gnarly cultural standards used to keep the information processing, and those standards make the necessary shifts highly political.
Misinformation, disinformation, and (yes) lies are outcomes of a failed information architecture. So are carrot/stick behavioral modifiers used to keep gnarly processes functioning.
Where the qualities in the previous paragraph exist, where people are systemically engaged with regular pain, the information structures have been failing long enough for the damage to accumulate. They are long past failed: solidly, irrefutably broken. They engage people in the snap to enforce and maintain the structures as-is, becoming culturally moribund and very open to manipulation by our bad actors.
bad actors, connectome, environment, implicit process, metadata, nodes, system elastic, system expand/contract, system flow, system resources, time, trust, who-ness