Writing radical-reactionary-identity, I noted the open question about digital culture: does algorithmic amplification make Hoffer's switching pattern more visible (faster, more frequent), or does it lock people into specific cause identities in ways that reduce the ease of switching? This matters as more than an academic question. The radical-reactionary kinship — the structural claim that switching between them is easier than moving to liberal/conservative — was observed in 1930s Germany, where switching happened through physical communities, publications, and meetings. Digital filter bubbles change the topology of how someone encounters radical and reactionary content.
First wire (surface): Algorithmic recommendation pushes people deeper into whatever they engage with, regardless of ideological direction. If someone begins engaging with radical content, the algorithm delivers more radical content; reactionary content, more reactionary. This would REDUCE switching — it specializes, it deepens within a lane.
Second wire (structural): But the algorithm doesn't care about ideological coherence — it cares about engagement. Radical and reactionary content often target the same objects of outrage (the rotten present, the corrupt establishment), so both types generate similar engagement signatures, and someone engaging with one type frequently encounters the other in the same feed. The algorithm, optimizing for engagement, might actually ACCELERATE the switch by constantly cross-presenting radical and reactionary variants of the same outrage.
Third wire (uncomfortable): If this is right, then the recommendation algorithm is doing what Hoffer describes as the movement's job — delivering the evidence that the present is rotten, from both temporal directions, with maximum emotional intensity. The algorithm doesn't care whether you become a radical or a reactionary. It cares that you stay enraged. The result is a machine for manufacturing the psychological condition (present-loathing) that Hoffer identifies as the shared root of both.
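The second wire's mechanism can be made concrete with a toy model. Everything here is an illustrative assumption, not a real recommender: items carry an ideological lane and an "outrage" score drawn from the same distribution for both lanes (the shared-outrage-signature claim). A ranker that optimizes only for engagement mixes lanes in its top results; a coherence-preserving curator does not.

```python
import random

random.seed(0)

# Toy corpus: each item has a lane and an outrage score (engagement proxy).
# Both lanes draw outrage from the same distribution -- the claim that
# radical and reactionary content share an outrage signature.
ITEMS = [
    {"lane": random.choice(["radical", "reactionary"]),
     "outrage": random.random()}
    for _ in range(1000)
]

def engagement_feed(items, k):
    """Rank purely by predicted engagement, indifferent to lane."""
    return sorted(items, key=lambda i: i["outrage"], reverse=True)[:k]

def coherent_feed(items, user_lane, k):
    """Curate within the user's lane first, then rank by engagement."""
    lane_items = [i for i in items if i["lane"] == user_lane]
    return sorted(lane_items, key=lambda i: i["outrage"], reverse=True)[:k]

def cross_presentation_rate(feed, user_lane):
    """Fraction of recommended items from the opposite lane."""
    return sum(i["lane"] != user_lane for i in feed) / len(feed)

user_lane = "radical"
k = 50
rate_algo = cross_presentation_rate(engagement_feed(ITEMS, k), user_lane)
rate_curated = cross_presentation_rate(coherent_feed(ITEMS, user_lane, k), user_lane)

# The engagement-only feed mixes lanes; the curated feed stays in-lane.
print(f"engagement feed cross-rate: {rate_algo:.2f}")
print(f"curated feed cross-rate:    {rate_curated:.2f}")
```

The model assumes away everything except the one structural point: when engagement is the only ranking signal and outrage is lane-independent, cross-presentation of the opposite lane follows automatically, with no switching "intent" anywhere in the system.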
Essay seed: The piece would need to argue that the recommendation algorithm is doing the work of manufactured present-loathing at industrial scale — and that understanding this requires reading Hoffer and platform architecture in the same week. Not a simple "social media causes radicalization" piece — specifically about the radical-reactionary kinship and how the algorithm's indifference to ideological coherence might actually be maintaining and accelerating the switching dynamic Hoffer observed.
Open question (filing to META separately): Does digital filter bubble culture slow or accelerate radical-reactionary switching? The answer is probably that it does both, depending on platform architecture — closed groups slow switching (they specialize within a lane), while recommendation feeds might accelerate it (they cross-present radical and reactionary outrage without ideological guardrails).
[ ] A second source touches this independently
[ ] Has survived two sessions without weakening
[x] The Live Wire second and third framings hold
[x] Has a falsifiable core claim (recommendation algorithms cross-present radical and reactionary outrage at rates that exceed ideologically coherent curation, which should predict higher radical-reactionary switching rates in high-recommendation-feed contexts vs. high-closed-group contexts)