The AI-Assisted Creator Framework
This hub maps the operational territory established at the 2026 Cozora AI Summit. It synthesizes practitioner-led systems into a cohesive framework that allows a solo creator to integrate generative AI without surrendering their voice, taste, or mission.
The framework operates sequentially across three distinct architectural layers: from existential alignment (Why) to tactical generation (How) to infrastructural environment (Where).
Phase 1: Existential Alignment (The Philosophy of Creation)
Before a single prompt is written, the creator must delineate exactly what they are doing and what the AI is allowed to touch.
- mission-excavator-alignment: The absolute foundation. A diagnostic protocol for locating the human's "unpromptable core." Without this, the creator falls into the Infinite Algorithm Trap, building efficient machines to chase trends they do not care about.
- tier-one-vs-tier-two-ideas: The strategic goal. The realization that AI statistically defaults to Tier 1 summarization (reorganizing the map), requiring explicit psychological forcing by the human to generate Tier 2 paradigm shifts (moving to a new continent).
- taste-judgment-labor-framework: The division of duties. The psychological permission structure that allows the human to outsource the "Labor" entirely to the AI, while rigorously maintaining their authority as the sole proprietor of "Taste" and the final arbiter of "Judgment."
Phase 2: Operational Generation (The Execution Mechanics)
Once the philosophy is set, these are the mechanical, prompt-based tools used to execute the physical labor of writing and design while enforcing the Taste parameter.
- vast-voice-print-method: The codification of Taste. A system of negative constraints (Vocabulary, Architecture, Stance, Tempo) acting as a rigid mold, preventing the LLM from flowing down the path of least resistance into generic AI slop.
- ai-first-design-workflow: The immediate medium manipulation. Collapsing the traditional wireframing handoff by using conversational AI to generate working, interactive prototypes (code) in real time.
- beta-editor-review-skill: The augmentation of Judgment. Feeding the finished draft back into a tightly constrained LLM persona: using the machine not to write the draft, but as a critical wind tunnel to expose structural failure points before publishing.
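The voice-print method above is described only at the concept level. As an illustration, here is a minimal sketch of how the four negative-constraint categories (Vocabulary, Architecture, Stance, Tempo) might be assembled into a reusable system prompt; every constraint string and function name below is hypothetical, not taken from the framework itself.

```python
# Hypothetical sketch: encoding a voice print (Vocabulary, Architecture,
# Stance, Tempo) as negative constraints prepended to every generation
# request. All constraint wording is illustrative.

VOICE_PRINT = {
    "Vocabulary": [
        "Never use 'delve', 'unleash', or 'game-changer'.",
        "Prefer plain verbs over corporate jargon.",
    ],
    "Architecture": [
        "No listicles; argue in connected paragraphs.",
        "Open with a concrete scene, not a thesis statement.",
    ],
    "Stance": [
        "Take one side; never hedge with 'it depends' framing.",
    ],
    "Tempo": [
        "Vary sentence length; no more than two long sentences in a row.",
    ],
}

def build_system_prompt(voice_print: dict) -> str:
    """Flatten the constraint mold into a single system prompt."""
    lines = ["You are drafting in the author's voice. Hard constraints:"]
    for category, rules in voice_print.items():
        lines.append(f"\n{category}:")
        lines.extend(f"- {rule}" for rule in rules)
    return "\n".join(lines)

prompt = build_system_prompt(VOICE_PRINT)
print(prompt)
```

The point of the rigid-mold framing is that the constraints travel with every request, so the model never gets the chance to fall back to its statistical defaults.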
Phase 3: Architectural Environment (The Workspace Integration)
The tools and protocols that physically wire these workflows into the creator's daily local environment, removing the friction of isolated browser windows.
- mcp-context-integration: The bridge. The Model Context Protocol gives the AI restricted, two-way access to the creator's local file system and IDE, eradicating the debilitating "copy/paste" context bottleneck.
- ai-second-brain-retrieval-shift: The death of manual filing. Moving PKM away from rigid folder taxonomy into conversational, semantic retrieval, allowing the creator to seamlessly query massive archives of past ideas directly during the generative workflow.
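The retrieval shift described above rests on semantic search over the creator's archive rather than folder lookup. A toy sketch of the idea, using bag-of-words cosine similarity as a stand-in for a real embedding model; the note filenames and contents are invented for illustration.

```python
# Toy sketch of semantic retrieval over a note archive.
# Production systems would use an embedding model; a simple
# bag-of-words cosine similarity stands in here. Notes are invented.
import math
from collections import Counter

NOTES = {
    "voice-print.md": "negative constraints that keep drafts in my voice",
    "tier-two-ideas.md": "forcing paradigm shifts instead of summarization",
    "mcp-setup.md": "wiring the model into my local file system",
}

def vectorize(text: str) -> Counter:
    # Lowercase, strip trailing punctuation, count term frequencies.
    return Counter(w.strip("?.,!:") for w in text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def query(question: str, top_k: int = 1) -> list[str]:
    qv = vectorize(question)
    ranked = sorted(NOTES, key=lambda n: cosine(qv, vectorize(NOTES[n])),
                    reverse=True)
    return ranked[:top_k]

print(query("how do I keep the model in my voice?"))
# → ['voice-print.md']
```

The query is phrased conversationally, yet it surfaces the right note without any knowledge of where it is filed, which is exactly the taxonomy-to-retrieval shift the bullet describes.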