Psychology

Power and Psychological Permission


How Knowledge Creates Moral Space: Information as Conscience-Enabler

Knowledge creates a moral gap. The more you know that others don't know, the larger the gap between what you believe is true and what people acting on incomplete information believe is true. In that gap, conscience dies. You're not lying. You're not consciously deceiving. You are acting on information they don't have access to. Your knowledge itself becomes moral permission.

Alexander knows the enemy's position through scouts. He knows their psychological state through intelligence networks. He knows their weaknesses through historians and analysts. His soldiers have none of this information. They see an impossible charge, such as the cavalry repositioning at the Hydaspes, and they execute it because Alexander commands it. But they cannot see the information that makes the charge not impossible but precisely calculated.

This creates epistemic authority—soldiers trust Alexander's judgment because he demonstrably has information they lack. His orders have proven correct so many times that soldiers develop deep confidence in his strategic judgment. But this same epistemic authority becomes moral permission. Alexander can demand impossible speeds because he knows (and soldiers don't know) that the supply lines will hold. He can sacrifice soldiers because he knows the information that justifies the sacrifice. The soldiers cannot access the information that would allow them to evaluate the decision. They can only trust. And trust, when given to someone with superior information, becomes a blank check for moral action.

The Mechanism: Epistemic Authority as Moral Disengagement

Milgram's obedience experiments showed that subjects could inflict apparent harm on others if an authority figure told them the authority took responsibility. The subjects didn't stop being moral—they deferred morality to the authority figure. The authority's presence created a space where normal moral constraint could be suspended.

Epistemic authority works similarly but more insidiously. The authority figure isn't asking you to suspend morality—the authority figure is offering superior information that supposedly justifies the action. "I know something you don't know. Trust me, this is necessary." The information gap becomes moral justification.

Alexander's harsh demands on soldiers—the impossible pace at Hyphasis, the sacrifice of the Mallians, the demand for perpetual forward motion—are made bearable by soldiers' belief that Alexander has superior information justifying these demands. The soldiers don't know if the information justifies the demands because they don't have the information. They can only believe.

This creates a particularly insidious dynamic: as the leader accumulates more information and more power, the gap between what the leader knows and what followers know grows. The larger the gap, the easier it becomes to justify harsh actions. At some point, the leader is not actually consulting the information anymore—the leader is simply acting, secure in the knowledge that their superior information provides moral permission even if the followers cannot see it.

The dangerous outcome is when the leader's confidence in their own information creates isolation from reality-checking. The leader knows something, but what they know becomes increasingly disconnected from what is actually true. The information advantage becomes a bubble that prevents the leader from seeing contradictions.

Evidence: Intelligence and Moral Drift

Research in behavioral ethics shows that people with expert knowledge tend to experience moral drift—they gradually adopt harsher positions because they believe their expertise justifies harsher actions. The expert's knowledge becomes a permission structure for moral actions that non-experts would reject.

Doctors in research studies were willing to administer stronger electric shocks to research subjects if they believed the shocks were medically justified. Executives were willing to mislead investors if they believed they had "superior information" about long-term returns. Military commanders were willing to sacrifice soldiers if they believed they had superior strategic information.

This is not conscious evil. This is the subtle psychological mechanism by which epistemic authority becomes moral permission. The more you know that others don't know, the easier it becomes to justify action that would seem unjustifiable if you had to explain your reasoning to the people affected.

Alexander's most shocking decisions—forcing soldiers to march beyond exhaustion, demanding sacrifices that seem disproportionate to the strategic gain—are made bearable to him by his sense that he understands something the soldiers don't. He sees the long-term pattern. He understands the enemy's psychology. He knows what it takes to build an empire. The soldiers see the immediate suffering. They see the impossible demands. They cannot see Alexander's information. So they have to trust. And in that trust, Alexander gains moral permission.

Cross-Domain Handshakes

Behavioral-Mechanics: Information Asymmetry as Epistemic Privilege

Information advantage is a behavioral advantage—you can move faster, predict better, coordinate more precisely when you have information others don't. But information advantage is also a moral advantage—it creates psychological permission for actions that would be questioned if you had to justify them to the people affected.

The diagnostic difference: Are you making decisions that you would justify to the people affected if they had the same information you do? Or are you making decisions that rely on the information gap itself as moral permission?

A decision that survives full transparency is a decision you can defend on its own merits. A decision that depends on the information gap is a decision that would be questioned if the gap closed. This is the marker of conscience-space—the gap itself is the permission structure.

Philosophy/Psychology: Evil as Knowledge

The philosophical and psychological tradition suggests that evil often has the structure of knowledge-as-permission. The person with superior knowledge believes their knowledge justifies actions that seem harsh to those with inferior knowledge. Gnostic traditions, Milton's Satan ("I know something they don't"), psychopaths who rationalize harm through superior understanding of human motivation—all follow this pattern.

What appears to non-experts as cruelty appears to the expert as justified necessity. The expert's knowledge becomes the evidence that they are correct and the critics are simply ignorant of relevant information.

History: Intelligence Advantage and Moral Drift

Empires built on intelligence advantages (Rome's spy networks, Britain's naval intelligence, modern asymmetric military advantage) tend to develop moral drift over time. The more successful the intelligence apparatus, the easier it becomes to justify harsh actions as "necessary based on what we know."

Conversely, polities that required leaders to justify their reasoning to followers (the Roman Republic's Senate deliberations, constitutional constraints on executive power) maintained stronger moral constraint precisely because decisions had to be justifiable to people without access to superior information.

Tensions: Knowledge as Power and Knowledge as Hubris

Information Advantage Enables Brilliant Strategy AND Moral Disengagement

Superior information creates the conditions for brilliant strategic decisions. It also creates psychological permission for harsh actions. The advantage that makes you more effective also makes you more dangerous.

Expertise as Justification AND Expertise as Isolation

The more expert you become, the more you know that others don't know, and the larger the gap between your reasoning and their understanding. At some point, you can no longer explain your decisions to the people affected by them because they lack the information foundation to understand the explanation. This isolation protects you from challenge and makes you vulnerable to error.

Confidence in Knowledge AND Confidence in Error

The leader with superior information develops deep confidence in their decisions. This confidence is warranted when the information is accurate and well-interpreted. But the leader has no way to distinguish between justified confidence (based on accurate information) and unjustified confidence (based on information that is incomplete or misinterpreted). The phenomenology is identical.

The Live Edge

The Sharpest Implication

The more information you have that your team doesn't have, the easier it becomes to justify decisions that your team would question if they could see your full reasoning. At some point, the information gap itself becomes the permission structure for moral actions. You are not consciously being deceptive. You have simply moved into a space where you justify your decisions using information your team cannot access. If you had to justify your decisions using only information your team has, many of your decisions would not survive scrutiny.

Generative Questions

  • What information do you have that your team doesn't? How does that information shape your major decisions? Could your team evaluate the decision if they had the same information?
  • Where is your conscience most easily satisfied by "you don't understand the full picture"? Where do you find yourself overriding team members' concerns because you have information they lack?
  • How could you make your information and reasoning transparent enough that your moral decisions can be evaluated by people affected by them? Where would your decisions change if you had to justify them fully?

Connected Concepts

  • Authority and Moral Obedience — psychological mechanisms of authority-based compliance
  • Information Asymmetry as Epistemic Privilege — strategic advantage and moral permission
  • Moral Disengagement Mechanisms — how intelligent people justify harsh actions
  • Evil as Knowledge — philosophical/theological dimension

Footnotes

  • Domain: Psychology
  • Status: developing
  • Sources: 1
  • Complexity:
  • Created: Apr 25, 2026
  • Inbound links: 5