Information architecture as a facet of our social makeup

How many apps are you going to touch today? I’ve been awake half an hour and already focused on 6 — not just glanced at, but 6 that became the center of my attention for at least a minute.

Apps are tools specifically designed to manage, share, move, update, and work with information. That’s their sole function, all of them. Every pixel rendered, every keystroke or finger poke translated to a letter on your screen, is information. Whether that information was ultimately used to design water main structures, text your mom, play a game, or read the news, none of it was possible without the movement of information.

The workings of apps tend to be more complex. Developers more or less dip their toes into the network every time they call the output of one function into another function, whether those functions live as lines deep in the same code or as separate files. That ease of writing, “make a new connection right here,” holds regardless of parent-child relationships, silos, or whether the information was ever intended to be used this way (like adding weather to a health app — whaaaa? Yet someone with migraines or an autoimmune disorder would applaud…).
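
As a sketch of how small that connection is (hypothetical function names and a stubbed weather value, not any real app’s code), wiring weather into a symptom tracker is just one function’s output fed into another:

    # A hypothetical cross-silo connection: a symptom tracker reaching
    # into weather data it was never designed around.

    def fetch_barometric_pressure(city: str) -> float:
        # Stand-in for a call to some weather service; returns pressure in hPa.
        return 1009.2  # stubbed value for the sketch

    def log_symptom_entry(symptom: str, severity: int, city: str) -> dict:
        # The "new connection right here": one domain's output dropped
        # straight into another domain's record.
        return {
            "symptom": symptom,
            "severity": severity,
            "pressure_hPa": fetch_barometric_pressure(city),
        }

    print(log_symptom_entry("migraine", 7, "Portland"))

Nobody has to redesign either silo; the connection just gets written, and the architecture quietly grows a new edge.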

Most developers will also, with more or less alacrity, frustration, and sighs, admit that some codebases are so chaotic that finding the one legacy bit that’s screwing with the whole system right now is the bane of their existence.

That’s a part of the information architecture that most of us don’t want to see. Most of us, for most apps, just want to use the information the app is wrangling. The codebase is too much information in a format that is mind-boggling for anyone who doesn’t code, so the guts of a program become opaque. Most people want to enjoy the sausage, not learn which cuts of meat went through the grinder, watch the natural casing be processed from raw materials, or see what the kitchen looked like in the aftermath.

The information architecture isn’t a straight line from idea, to developer, to user. Developers spend most of their days wrangling code and codebases into usable, functioning output. That output works with a machine that, deep in its architecture, beneath several layers of code acting as an interface, is binary. People like binary solutions for the problems they find uninteresting or too complex, but we are not binary in how we think; so developers have to keep shifting their minds into “how machine information works.”
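
A tiny illustration of those layers, offered only as a sketch: the letter you just typed is already several translations away from anything the hardware recognizes.

    # One letter's trip down the layers, from human toward machine:
    letter = "a"
    code_point = ord(letter)          # 97: the agreed-upon number for "a"
    bits = format(code_point, "08b")  # '01100001': closer to what the hardware moves
    print(letter, "->", code_point, "->", bits)

And even those bits are an abstraction over voltages; every layer down is another act of translation.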

Then they work with people on the team who are doing all the other things, each of them having gone up their own steep learning curves. There are exceptions, but more often those specialized tasks are shared, and the whole contribution is aimed at the user.

The opportunities for confoundedness compound with every person added to the team. Every person added means that much more can be built, or shortens the timeframe, or allows for a moreness (more nuance, more holistic, more cutting edge, more elegant, more likely adhesion with users, more insights, etc.). It also accumulates cognitive bias and forms a support structure for unspoken agreements that skew the user experience.

[Image: nuances.png]

We cannot assume the nuances of one are the reflection of everyone, or that the mean of everyone is the only available truth for one.

The biggest bully on the team — depending on the team — might be the reason certain biases get baked in. Conversely, one willingness to speak up can spark a cascade of additional opinions, and a different decision gets baked in. People, in all our dynamics, are part and parcel of the data, information, technology, and interface. We touch everything, in all of our complexity.

Hierarchy in our code structures

We structure many of our working interactions based on a top-down architecture. It tends to be efficient and quick: the most bang for the buck, with reasonably easy information scents that work for a broad population — a best practice. What we forget, consistently and without hesitation, is that we’re pruning information to fit in those architectures.
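
To make the pruning concrete, here is a minimal sketch (hypothetical topics, nothing more): a tree grants each item exactly one parent, so every additional connection has nowhere to live.

    # Hedged sketch: the same topics as a network (many connections)
    # versus a tree (one parent each). Whatever the tree can't hold gets pruned.

    connections = [
        ("health", "weather"), ("health", "food"),
        ("travel", "weather"), ("travel", "food"),
        ("food", "weather"),
    ]

    tree = {}      # child -> its single allowed parent
    pruned = []    # connections the hierarchy has no place for

    for parent, child in connections:
        if child in tree:
            pruned.append((parent, child))  # a second parent: not allowed in a tree
        else:
            tree[child] = parent

    print("kept:", tree)      # {'weather': 'health', 'food': 'health'}
    print("pruned:", pruned)  # every cross-link the tree can't express

Three of the five connections vanish, and nothing in the tree even records that they existed.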

The intent behind our hierarchical structures was not to be holistic, but to get enough information organized to not fuck up consistently. As the information built up, I think the intent moved to refining how little we could fuck up while maintaining a reasonably learnable information structure.

The goal was to get a team of people going. As a whole, they didn’t need to understand all the nuances to get stones set to build a pyramid. The ones who were interested in understanding more could tap into the network of information reasonably fluidly. We focused on what was needed. The rest didn’t go away, but also wasn’t highlighted. This worked for ages — literally, entire ages of human history. Entire social structures formed around the dissemination of information…and its control.

Then we started encoding it. 

I think part of what happened as we built our information technology is that we resolved and solidified those architectures and didn’t imbue them with people. We assumed that the way most people wanted to deal with any given point of information (just tell me what to do here!) was the whole. If people just wanted to get it over with because the person controlling it was an asshole ruining their day, it didn’t matter: they conceded to the hierarchy, and that concession was the only thing we looked for.

We played for the mean, and left out the rest. 

We danced with the glory of “it’s working!!!!” 

Eventually, we even learned that if we thought about the people using the process (hello, UX!) and smoothed out the interactions, it upticked use.

Overall, we focused on managed data, pruned to fit the function at hand, with just enough empathy to get social traction.

Why? We were problem solving in the snap. It was at-hand: use the business social hierarchy model to focus and get shit done. It was mental modeling: leverage the social hierarchy to manage and build out the functions that needed building (money over there, legal in the footnotes, focus on the product). And it was core precept: limit cost, avoid risk, profit soon, profit high, focus on the next potential profit.

Social core precepts got in the mix, too. How many edge cases are tolerated? None – don’t waste company time, our profit center can be fulfilled by working with the 70% that fit our criteria. The rest are just a drag on cost, too risky, and can find their needs met elsewhere. We’ll chase that margin when we’ve got our (moving target) core built out.

In the midst of all of this effective, getting-shit-done problem solving, we pruned people. 

We decided — for the shortcut of other-found learning, for the respect of our elders and history, for the goal of efficiency, for the ersatz need of making as much profit as quickly as we could, and for the sake of simplicity and ease — to take a top-down architecture and impress it upon the system. And then do it again, and again, and again, ad infinitum, app after app, business model after business model, culture after culture. 

Inside the architecture of the code, in how the process was managed, in the adjacent support systems, in the navigation and findability tools provided: we chose. Some choices were small, their impacts cumulative across the broader social fabric. Some were amplified for as much money as possible, as quickly as possible. Whichever people didn’t fit were considered edge cases and left on their own, while those the new tools served sped forward. Those of us who thought about it probably told ourselves we were doing as close to right as we could manage in the circumstances, and possibly that there would be time later to go back and fix it. Later never came; instead we chased the next big idea that could be turned into profit.

This is our information technology, one facet of our environment that, currently and with more influence every week, touches ubiquitously on our environmental quadrants.

People run around with cognitive information architectures in their heads, troubleshooting their moments in time and their relationship through time with their environment. As they butt up against each other, their interactions inform and patterns evolve, both in their internalized cognition and their ouroboros of perception. This continues up and through the interactions until culture forms in groups that interact regularly.

Culture is an information architecture, a shorthand that lets people who interact frequently get to the key points of an interaction efficiently. By leveraging existing culture, we don’t need to relitigate the entire story of the interaction point, or re-contextualize it to make it meaningful. We jump into the middle, share the part that needs to be shared, and trust that the context will self-manage.

We navigate one culture with our family, another culture with our religion/spirituality, another culture with our cohort (like UX designers or welders or singles looking for love/sex), another culture with our coworkers, another culture as we work outside our immediate coworkers (e.g., accounting working with marketing), another culture as we navigate the roadways.

In other words, we have a vast array of cultures. The cultures become self-sustaining, and forget that they were forged by people finding ways to get along within their broad environment. Cultures shift with the individuals in them, but sometimes large cultures will set aside individuals instead of shifting with them. 

We’ve even formed cultures that are so intrinsically top-down hierarchical that they cannot abide another hierarchy not being subsumed into theirs.

This is on top of all the navigation we do with information. Each app, each website, each culture, each mind with its own architecture. Some communicate their structure better than others. Some allow for fungibility; some demand conformity and adherence to process.

Then we wade through a sea of marketing, each message intended to short-circuit our decision making. Marketing both adds to the cognitive load and greases those cognitive bias pathways so it’s easier to make those purchases. Through it, we are trained every day to give in to desire and fear. We are urged every day to follow the fate we’re building, just because it’s good for someone else’s bottom line.

We people are a complex system intertwingled with a multitude of other complex systems trying to make sense of crossed spectrum nodes, aligning with future-senses based on a variable and fungible set of internalized cognition tools that may or may not be added to and modified for the next use case, which together makes a near-infinite set of potential reactions and responses that we quickly pare down to likely and then done, all to start again on a different problem in the next moment, ad infinitum until the end of our lives.

Then we want users to bend their cognitive chain to accommodate our app or site because it benefits us. That’s what we’re putting out in the universe when we decide it’s too complicated or costly to meet people halfway. 

This is not an argument that will find traction in any business meeting. Tell people this without plenty of space for them to work through their emotional reactions and their defensive logics (which will likely be sharp, unless they recognize the overwhelm themselves and are willing to admit it in a room whose only true goals are upticked profit and downticked cost), and the room will erupt. I’ve been called ballsy, and I have yet to say this particular thing to a meeting full of decision makers. I wait for my moments and environments.

We know by now that doing something just because it’s the right thing for users will almost never make it past the chopping block.

What might find traction? It depends on the business and the people involved, the current cultural overtones, our ever-changing landscape of laws and regulations, and your personal appetite for loggerheads and ethics.

In our current system, profit drives first. Laws and regulations drive next. Then keeping up with competitors, then not letting technical/design debt increase to the point of collapsing the system or the reputation. If all of that is already contended with, finding a way to one-up the competition might find resonance.

Delight worked for a hot minute, then became more about surface and less about function. Loyalty might be the next catchphrase, as general overwhelm makes marketing less effective. I fear trustlessness is our next big generational information wave; what resonates for that has yet to emerge.

Putting in the time and work for free — above and beyond the salaried work — can work for a while, but I’ve found that what usually happens is the business decides that’s your capacity, and fills it with heady glee and stated relief that they won’t have to try for another hire after all. It also only works if the humanism you’ve found to add is recognizably framable as a profit contributor.

Have at least two arguments for each humanism trait that don’t cite “because it’s fair and ethical.” It’s shifting grape vines to the edge of the garden plot instead of focusing on their care; but as long as they aren’t pulled up to make room for radishes, enough might survive to meet the long-term vision.

In other words: late-stage capitalism is deeply, inherently not humanist. Long-term thinking, win-win scenarios, and accepting anything less than what can be tricked or bullied out of customers as quickly as possible are no longer generally tolerated if approached full-frontal.

It feels like a trick, like a tactic out of the bad actors’ playbook, to hide the humanist vegetables in cookies. Years ago I could do it. After dealing with the effects of a seriously misaligned, composite information system, and writing this book — getting all my ideas articulated and balanced holistically — it no longer squares with my design ethics.

These days, personally, I tend to front-load my design ethics in my project conversations. I see more traction with others doing the same in the EU.

It's a culture problem, and we're now (2025) entering an era where mass decisions will be made. We could have a mass shift of people saying, "wait, this really sucks for me, I get why it's bad now, let's not make it possible." Or the people intent on creating a world in their specific, narrow image will succeed, decimate the economy, finish killing the climate, and kill off or effectively enslave those not deemed to be in the authority caste.

It doesn’t mean capitalism is inherently bad, just that we’ve pushed certain patterns so hard that…here we are. I think we can come back, and still have capitalism. I, personally, frankly, don’t care what we call the economic system as long as humanism is central. I also don't think any of the systems we've tried to date have succeeded in not being co-opted by our bad actors. Our best bet is probably an emerging, novel approach.


social IA:
cognitive bias, cognitive IA, connections, core precepts, environment, future-sense, hierarchy, implicit process, information structures, internalized, learning, mental model, network, ouroboros, people are complex, reactions, response, time, top-down, who-ness

...autoimmune...
Chiavolini, D. (2023, May 19). Stay Cool: How to Manage Autoimmune Flare-ups in Hot Temperatures. Global Autoimmune Institute. https://www.autoimmuneinstitute.org/articles/stay-cool-how-to-manage-autoimmune-flare-ups-in-hot-temperatures/

...migraine...
Sacks, O. (1981). Migraine. Macmillan.

...autoimmune...
Rheumatism and arthritis are now considered autoimmune disorders, and we have pre-industrial stories accepting them as reacting to environment (Jane Eyre, A Room with a View). This is only now emerging in accepted medicine, but it has been experientially understood for a long time. Superstition is a fuzzy orientation within our fantasy-to-quality-truth spectrum.

...set aside individuals...
These concepts are much more evident as we approach Environments. There are massive efforts around researching and understanding them, and good starting points are included in the footnotes in that section.