Plato described memory as a waxen tablet: what you impress upon it, you remember; what gets wiped away, you forget and don't know. Ewen Cameron read this and thought it sounded like a treatment plan.
Cameron was the founding director of the Allan Memorial Psychiatric Institute at McGill and one of the most prominent academic psychiatrists of the 1950s. He was impatient — with the slow pace of psychotherapy, with patients who wouldn't improve quickly, with the messiness of minds that had years of accumulated problems. His solution was the tabula rasa: destroy the existing patterns first, then write new ones.1
The name he gave to the second step — the writing — was psychic driving. Tape recordings of therapeutic messages, played through pillows and helmets at 10-20 hours per day, for days or weeks, while patients lay sedated in sensory deprivation or under drug disinhibition. If a conditioned reflex could be taught through repetition, then enough repetition would implant new patterns in the newly cleared substrate.
He was wrong about this in a way that became one of the clearest demonstrations of what behaviorist assumptions cannot do with human minds.
Cameron's full protocol ran in two stages:
Stage 1 — Depatterning. Before psychic driving could begin, existing patterns had to be removed. Cameron's depatterning sequence combined intensive electroconvulsive therapy, heavy drug regimens, and prolonged sleep deprivation.
The combination reliably achieved its stated aim. Patients emerged with "vast chunks of their memories obliterated" — gaps covering anywhere from six months to ten years of their lives before treatment. Sixty percent of Cameron's follow-up patients showed significant memory impairment.4
Stage 2 — Psychic driving. With the substrate cleared, Cameron played tape loops of therapeutic messages for up to 20 hours per day over 10-15 days, in sensory-deprivation conditions, with patients semi-sedated or asleep. He had crafted the messages himself — distillations of each patient's "core issue." The sequences alternated negative and positive:
"Do you realize that you are a very hostile person? Do you know you are hostile with the nurses? Do you know that you are hostile with the patients? Why do you think you are so hostile? Did you hate your mother? Did you hate your father?"
Then, days later, switching to:
"It's all right to be myself. I am affectionate and warm-hearted. It is good to be affectionate and warm-hearted. . . . I don't need to drive myself. People like me as I am."5
He modulated the tape technically — adjusting treble and bass, varying volume, introducing echo — to trigger what he described as a Pavlovian orientation reflex, keeping patients from habituating to the message.
Cameron admitted the failure publicly in his 1963 presidential address to the American Psychopathological Association, though he framed it as a setback on the way to success:
"If this thing worked after thirty repetitions, it was only common sense to see what would happen if the repetition was increased tenfold, a hundredfold or even more. . . . We soon found, however, that it did not work out quite as we had planned it. . . . We found it was possible for the individual to be exposed to the repetition . . . a quarter to one-half million times and yet be unable to repeat these few short sentences. . . . But we have made a beginning."6
A quarter of a million repetitions and the patient couldn't parrot back a few sentences. Cameron called this making a beginning.
His CIA site-visitors reached a different conclusion. Harold Wolff visited the Allan, reviewed the charts, and looked at Cameron's "successful" patients lying in their beds staring at the ceiling. "Are these typical of your successes?" Wolff asked. Cameron confirmed they were. Wolff's colleague Monroe observed: "We were distinctively living in two worlds. His and the real one." The CIA concluded that the intervention "didn't seem to really live up to the expectations that we had hoped might come from it."7
Donald Hebb, whose sensory deprivation work Cameron had appropriated, gave the verdict that's hardest to argue with: "Cameron was irresponsible — criminally stupid, in that there was no reason to expect that he would get any results from the experiments. Anyone with any appreciation of the complexity of the human mind would not expect that you could erase an adult mind and then add things back with this stupid psychic driving."8
The technical failure of psychic driving reveals a deep mistake in how Cameron conceptualized human minds.
The tabula rasa model assumes that memories and behavioral patterns can be destroyed selectively — that you can wipe the problematic contents while leaving the substrate intact and receptive. This is wrong in two directions.
First, the depatterning process that Cameron used (ECT + drugs + sleep deprivation) didn't selectively destroy problematic patterns. It destroyed the neurological substrate itself. Patients weren't left with cleared, receptive minds ready for new content. They were left with degraded neurological function — unable to form new memories, unable to maintain attention, unable to process the repetitive input Cameron's driving was delivering. The wax tablet metaphor fails because the wax and the impressions aren't separable: the neurological hardware that holds patterns is the same hardware that makes new learning possible. Erase sufficiently and you've destroyed the medium, not just the content.
Second, psychic driving assumes that conditioning operates the same way in conscious, socially embedded minds as in Pavlov's dogs in isolated chambers. It doesn't. Human belief change requires more than repetition — it requires the social and epistemic context that gives messages meaning. A message loop saying "It's all right to be myself" means nothing if the patient has no intact sense of self to apply it to, no relational context in which to evaluate it, no narrative memory in which to integrate it. The depatterning had destroyed the very structures that would have made the driving meaningful.
Cameron wasn't an outlier — he was the clearest test case of a model implicitly shared by all the major coercive persuasion research of the 1950s: the mind as mechanism, beliefs as contents that can be extracted and replaced through sufficient technical intervention.
The drugs chapter's truth-serum research tested the same model pharmacologically and found the same limit. The psychic driving research tested it cognitively and found the same limit. Both failed at the same point: the point at which the intervention that was supposed to clear the substrate destroyed the capacity for the intended outcome.
What Cameron's failure demonstrates is not that minds can't be changed — they clearly can, and the DDD framework documents how. It's that coercive belief implantation fails when it destroys the mind's own generative capacity. You can produce compliance through DDD. You can produce genuine belief change through extended social-environmental coercion (the Korean War reeducation results, while contested, showed real change in some cases). What you can't do is destroy the belief-formation apparatus and then operate it.
Cameron's work was funded as MKUltra Subproject 68, approved in January 1957 and supported for three years. The CIA's interest was clear: if you could reliably erase memories and implant new behavioral patterns, you'd have the perfect covert operative — someone who remembered their cover story and nothing else.
What the CIA actually got was a catalog of psychically destroyed patients, a lawsuit settled by the US and Canadian governments for damages, and Hebb's verdict that the entire enterprise was criminally stupid. CIA observers characterized the work as "culpably negligent, professionally unethical, bordering on illegal, repugnant, and totally abhorrent."9
The patients were the actual cost. Women admitted for mild postpartum depression, executives with anxiety disorders, alcoholics — all subjected to the full Cameron protocol because Cameron's eligibility criteria were, in his words, a matter of "singular difficulty" to assess. Sixty percent emerged with memory impairments covering years of their lives.10
Dimsdale frames Cameron as a tragic figure — a reformer with genuine care for patients, corrupted by ambition and the era's primitive research ethics. He emphasizes the institutional context: Hebb, Penfield, and Lehmann were all doing important work nearby; Cameron was the outlier who couldn't keep the clinical impulse and the scientific discipline in balance.
Meerloo's framework, in Why Do They Yield: The Psychodynamics of False Confession, offers a different angle: Cameron was attempting to engineer externally what the Soviet interrogators produced through the psychodynamics of transference, regression, and substitute-father dependency. Meerloo would argue that the Soviet results (genuine belief-change in some cases, documented compliance in nearly all) worked precisely because they operated through the patient's own psychological architecture — the infantile regression, the dependency, the substitute-father bond. Cameron tried to bypass that architecture with technology, and bypassing it was exactly why his results were empty.
The combined reading is sharp: what Meerloo's framework predicts is that coercive belief change requires the target's own psychological processes to do some of the work. You can drive those processes into a particular channel; you can't replace them with tape loops.
Psychology → Identity Disruption Under Coercive Pressure: Cameron's depatterning produced what looked like blank slates but were actually identity-disrupted patients — people whose narrative continuity and self-concept had been shattered. The handshake: the psychology page describes what identity disruption does to the target's relationship to their own beliefs and agency; the Cameron failure page shows what happens when a practitioner attempts to exploit that disruption for belief implantation. The insight neither page produces alone: identity disruption doesn't make a person receptive to new implanted content — it destroys the cognitive architecture that makes any content stick. The space that opens up when identity is disrupted is not an empty receptive field; it's a cognitive void that can't organize new material.
Behavioral-mechanics → DDD Framework: Cameron's protocol was an attempt to engineer DDD conditions deliberately and push them to their maximum intensity. He succeeded at DDD — debility (sleep deprivation and drug-induced regression), dependency (total institutional control), dread (the trauma of the protocol itself). But his error was assuming that maximum DDD + psychic driving would produce durable belief change. What DDD produces is compliance — behavioral change during the period of the conditions. Cameron wanted implantation — new beliefs that persist after the conditions end. The failure shows these are different outcomes requiring different mechanisms.
The Sharpest Implication
Cameron got a quarter million repetitions and couldn't get patients to parrot back a sentence. This isn't just a clinical failure — it's an important theoretical result. It shows that the mechanism of coercive persuasion that actually works — the mechanism documented across the Korean War cases, the Stockholm cases, the cult cases — operates through the target's own psychological architecture, not around it. Every documented case of real belief change involves the target's own attachment system, their need for approval, their infantile dependency structures, their group identification doing some of the generative work. The behaviorist fantasy of the mind as mechanism — blank it, then rewrite it — failed at every point it was seriously tested. The implication: you can't engineer the substrate out of the process. Effective coercion is always collaborative, in the most disturbing sense of that word.
Generative Questions