Research Framework — Branko Lukić
Building the missing layer between human creativity and frontier AI. Backed by 30,000+ experiments with 250 designers at Logitech's CDAI Lab — proving that AI can amplify human originality instead of averaging it out.
"What if we could transmit what makes a creator irreplaceable — through the very systems designed to replace them?"
Every generation of AI tools brings us closer to a paradox that no one in Silicon Valley wants to name: the more powerful the tool, the more it erases the hand that wields it. We call this Creativity Collapse — the systematic flattening of creative diversity toward a statistical mean. Not through malice, but through mathematics.
Large language models don't create. They converge. They learn the central tendencies of billions of human expressions and produce the most probable next token — the average of everything humans have ever written, painted, composed, or designed. The result is not intelligence. It is the Platonic ideal of mediocrity, wrapped in impressive fluency.
This isn't speculation. In July 2024, Shumailov et al. published a landmark paper in Nature demonstrating that AI models trained on AI-generated data undergo model collapse — a degenerative process in which the model progressively loses information about the tails of the original data distribution. The rare, the unusual, the brilliantly weird — all of it disappears. First the fringes dissolve. Then the model converges toward a single point estimate with vanishing variance.
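The mechanism can be seen in a toy simulation — a sketch of the general dynamic, not the paper's experimental setup. Each "generation" fits a Gaussian to a small sample of the previous generation's output, then generates from the fit. The estimated spread shrinks generation over generation, and the tails vanish:

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "human" data with visible tails.
mu, sigma = 0.0, 1.0
stds = [sigma]

# Each generation trains (fits) on a small finite sample of the
# previous generation's output, then generates the next batch.
for _ in range(100):
    sample = rng.normal(mu, sigma, size=10)   # finite training set
    mu, sigma = sample.mean(), sample.std()   # MLE fit = "retraining"
    stds.append(sigma)

print(f"generation   0 std: {stds[0]:.3f}")
print(f"generation 100 std: {stds[-1]:.3f}")  # spread has collapsed
```

The collapse here comes purely from repeated finite-sample estimation — no model is "trying" to be average, which is exactly the point: convergence is the default mathematics, not a design choice.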
The tails are where creativity lives.
A separate study from Science Advances found what might be more alarming for creative fields: AI enhances individual creativity but reduces collective diversity. Writers given AI assistance produced stories rated as more creative — but those stories were significantly more similar to each other. Individual gain, collective loss. A social dilemma playing out across every creative industry simultaneously.
The PNAS study "Echoes in AI" quantified this in narrative: LLM-generated stories repeat the same idiosyncratic plot elements, echoing across different models and different generations. The same archetypes. The same narrative arcs. The same emotional beats. An entire species of stories converging toward a single genome.
And then: the scar. Research published in Technological Forecasting and Social Change (2025) found that when AI assistance is withdrawn, individual creativity drops — but the homogeneity it introduced keeps climbing. Even months later. The convergence isn't just a side effect. It's a wound that doesn't heal.
At Logitech, I co-founded and led the CDAI Lab (Creative & Design AI Lab) with a single mission: transform how 250 designers across global teams work with AI — without losing what makes each of them irreplaceable.
The mission wasn't adoption. Every company can get designers to use Midjourney. The mission was preservation — proving, through systematic experimentation, that AI tools can carry a creator's fingerprint forward rather than dissolving it into the statistical mean.
Over 18 months, we ran 30,000+ structured experiments across every domain in the design organization: visual identity, product design, brand communication, UX, packaging, motion, and strategic concept development. Every experiment was designed to answer one question: does the output carry the creator's signature, or did the AI flatten it?
The results were not what anyone expected. They didn't confirm that AI kills creativity — nor that it magically enhances it. They revealed something more nuanced: AI amplifies whatever you feed it. Feed it generic prompts, you get generic output. Feed it a rich, structured creative fingerprint, and it produces work that only that specific creator could have directed.
The difference wasn't the model. It was the protocol.
Six categories of experiments, each designed to isolate a different dimension of the human-AI creative interaction.
Can AI output be distinguished from a specific designer's work? We ran blind evaluations where teams rated whether AI-assisted outputs matched the authoring designer's known creative signature — their color instincts, compositional habits, conceptual vocabulary. We measured the "fingerprint fidelity score" across thousands of outputs.
Does AI-assisted work converge or diverge? We measured the variance in creative outputs across teams — comparing AI-assisted batches against human-only batches. We tracked lexical diversity, visual entropy, conceptual distance, and surprise factor. The results confirmed the "creativity paradox" from academic literature — but also revealed the escape routes.
Which workflows preserve identity? We A/B tested dozens of prompting strategies, reference architectures, and calibration methods. Generic prompts vs. fingerprint-enriched prompts. Style guides vs. vibe protocols. The protocols that won became the foundation of the Copyself framework.
Can an organization's creative DNA survive AI mediation? We built Logi Brand Guru — an internal AI agent trained on Logitech's brand identity — and measured whether it could guide designers toward brand-consistent output without flattening their individual voice. It reached 85%+ accuracy: proof that organizational fingerprint transmission works at scale.
How much creative intent survives the prompt bottleneck? We gave designers prelinguistic briefs — mood boards, sensory palettes, reference clusters — and measured how much of that intent arrived in AI outputs versus traditional text prompts. The structured vibe protocols recovered 2–3× more of the original creative direction.
Do collective fingerprints exist — and can AI carry them? We studied how design teams develop emergent creative identities and tested whether AI tools could amplify rather than homogenize these collective signatures. Project FRIDA (AI Clones) explored asynchronous creative collaboration through AI proxies calibrated to individual team members.
Five findings that changed how we think about human-AI creative collaboration.
The same AI model produces radically different creative fidelity depending on the workflow wrapping it. A designer using GPT-4 with a Copyself protocol produced more authentic work than one using the latest image model with a generic prompt. The intelligence layer between the human and the AI is the decisive variable.
Individual creative signatures can be encoded into structured protocols that AI systems can interpret and preserve. In blind tests, evaluators identified the authoring designer's fingerprint in AI-assisted outputs at rates significantly above chance — when the right protocols were used.
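"Significantly above chance" in a blind attribution test can be quantified with a one-sided binomial test — a generic sketch, not the lab's exact evaluation design, and the numbers below are hypothetical. If evaluators pick the true author out of k candidates, chance is 1/k, and the p-value is the probability of the observed hit count or better under that null:

```python
from math import comb

def binomial_p_value(hits: int, trials: int, p_chance: float) -> float:
    """One-sided P(X >= hits) under Binomial(trials, p_chance)."""
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(hits, trials + 1))

# Hypothetical numbers: evaluators pick the authoring designer
# out of 4 candidates (chance = 0.25), 140 correct in 400 trials.
p = binomial_p_value(hits=140, trials=400, p_chance=0.25)
print(f"p = {p:.2e}")  # small p => attribution above chance
```

A result at exactly the chance rate would yield a p-value near 0.5; the further the hit rate climbs above 1/k, the stronger the evidence that the fingerprint is genuinely detectable.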
Without intervention, AI-assisted creative work converges toward homogeneity exactly as the academic literature predicts. But with structured fingerprint protocols, diversity metrics stayed flat or even improved. The collapse is a default behavior, not a law of physics. It can be counteracted.
Prelinguistic creative direction — the "felt sense" of what you want before you can describe it — is the most valuable and most fragile signal in the creative process. Structured vibe protocols recovered 2–3× more of this signal compared to conventional text prompting.
At organizational scale (250 designers), unchecked AI use accelerated creative homogenization. But the same scale made fingerprint protocols more powerful — because the organization's collective identity became a richer calibration signal. The frameworks work better the more people use them.
The 30,000+ experiments didn't just produce data. They produced intellectual architecture. Each experiment family crystallized into a research framework: Creativity Collapse, Creative Fingerprint Transmission, Vibe Designing, and the Copyself Protocol. Not theory — distilled practice.
One creator's cognitive signature — their unique perceptual biases, associative patterns, aesthetic instincts, and creative decision-making rhythms. The irreducible unit of original thought. The fingerprint that 30,000 hours of lived experience produces and no dataset can replicate.
The emergent creative identity that arises when individual fingerprints interact. Not the average of its members, but the unique interference pattern — the specific way this group's ideas collide, combine, and transform. The collective intelligence that no single person contains.
The creative DNA encoded in culture, brand, and institutional memory. When 250 designers use AI tools, the organization's creative identity becomes the front line. Copyself at scale means ensuring that AI amplifies the company's creative signature rather than dissolving it into market averages.
AI is trained to be balanced, consistent, and predictable. Humans are none of these things — and that's the point. The specific way a designer overweights texture over color, or a writer reaches for unusual rhythm over conventional clarity — these asymmetries are the creative signal. They're what makes the work authored rather than generated. Imperfections aren't failures of precision. They're engines of originality. Every deviation from the statistical mean is information about the creator that no model contains.
The distance between a creative thought and its tangible realization is a measurable quantity — cognitive distance. Every tool either compresses or expands this distance. A piano compresses it for a trained pianist. A prompt-based AI often expands it, forcing complex intent through a text bottleneck. Creative Bitrate is the bandwidth of the channel between mind and artifact. The research goal: maximize throughput while minimizing the lossy compression that strips the creator's signature from the output.
The most important creative decisions happen before they have names. A direction that exists as a felt sense — a cluster of sensory associations, emotional weights, and aesthetic instincts — is often more precise than any verbal description could be. Yet every AI interaction starts with language. Pre-Linguistic Intent is the research into capturing and transmitting this prelingual creative field — the half-formed thought, the intuition, the direction that doesn't have words yet but is already specific.
In 2010, MIT Press published Nonobject — a book that proposed design should start not from the object, but from the space between people and objects. Written by Branko Lukić with Barry M. Katz (Stanford, IDEO), it argued that the most meaningful design lives in the relationship, not the artifact.
Fifteen years later, the same principle applies to something far more consequential. AI is the most powerful creative tool ever built, and the question it poses is identical to the one Nonobject asked about physical design: where do you start?
Most AI development starts from the model. Better architecture. More parameters. Faster inference. This is starting from the object. nonai starts from the space between — the gap between human creativity and machine intelligence, where the most important interactions happen and the most critical information is lost.
The concept of "Emotional Relevance" that Nonobject introduced — going beyond function and aesthetics to create meaningful experiences — directly informs the nonai principle of Pre-Linguistic Intent. The "Experience Essence" framework, which captures the underpinnings of an experience as a compass for design, is the philosophical ancestor of Vibe Designing.
The lineage is not incidental. It's structural. The same mind that saw design as the space between people and objects now sees AI as the space between human creativity and machine intelligence. The question hasn't changed. The scale has.
"Design starts not from the object, but from the space between people and objects." The philosophical foundation.
Designing the physical architecture of supercomputers that power frontier AI models. Understanding the hardware layer of intelligence.
Published five years before the transformer wave. Early recognition that AI would transform creative practice.
Co-founded the Creative & Design AI Lab. 30,000+ experiments in transmitting creative fingerprints through AI systems at organizational scale.
The experimental lab for creative fingerprint transmission. Live protocols. Open research.
"The Missing Layer" — presenting the research frameworks to the global design community.
The next phase: taking the Copyself Protocol from Logitech's internal lab to an open research project. Live experiments in encoding creative fingerprints into portable, AI-readable protocols that any creator can use. Follow the work as it happens.
The first major public presentation of the nonai research frameworks — Creativity Collapse, Creative Fingerprint Transmission, Vibe Designing, and the Copyself Protocol — to the global design community. Naming the problem. Showing the solution. Backed by 30,000+ experiments.
brankolukic.com
Designer, researcher, and author of Nonobject (MIT Press, 2010). First AI Fellow at Logitech. Co-founder, CDAI Lab. Lead designer of Cerebras supercomputers since 2017.
Branko builds systems that amplify human originality through AI. At Logitech, he ran 30,000+ experiments to prove that we can transmit the human "Creative Fingerprint" into AI systems without losing its authenticity — then shipped the tools, trained the teams, and partnered with Legal to establish governance for safe enterprise adoption.
His work spans the full stack: from the silicon architecture of frontier AI hardware (Cerebras) to the cognitive architecture of creative collaboration (CDAI Lab). Creator of the UE Boom (IDEA Gold), which redefined portable audio. On the hardware side, he co-developed Physical Intelligence systems where AI reads human micro-movements and responds through haptics.
His 15-year trajectory — from Nonobject philosophy to physical product design to AI infrastructure to creative AI research — represents a single continuous inquiry: how do we preserve what makes human creativity irreducible, even as the tools grow more powerful?
"AI doesn't diminish human creativity.
It reveals who actually has something to say."
Designing the physical architecture of frontier AI supercomputers since 2017
Co-founded CDAI Lab. 30,000+ experiments. Led AI transformation for 250 designers.
The philosophical foundation. "Design starts from the space between."
Built an internal AI agent (85%+ accuracy) for brand guidance — proof that organizational fingerprints are transmissible.
AI reads human micro-movements, responds through haptics. Closing the sensory loop between user and machine.
Created the product that redefined portable audio. Designed the entire line, through Megaboom, Roll, and Blast.
"The Missing Layer" — presenting nonai research to the global design community
UC Berkeley lectures. "AI Powered Design" published 2017 — five years before the transformer wave.
That's not a rhetorical question. It's a research program.