AI Read 53 Courses for Me

Feb 11, 2026

Four Hundred Grand

I spent over 400,000 RMB on a Peking University Guanghua MBA.

Two years. 53 courses. 4 to 10 sessions each. I built the entire class a Notion workspace — 451 pages, 9 databases, every course's transcripts, slides, papers, discussion notes, sorted and tagged and pristine. I was the admin. The whole cohort lived in there.

The workspace itself was impressive. One database per course. Audio transcriptions. Professor slides. Peer discussion logs. All archived. How organized? New students walked in, never asked a question — the structure told them where everything was.

I was pretty damn proud.

Then year three hit. I started a company. Started prepping my thesis. Looked back at everything.

And realized something.

I couldn't remember a damn thing.


After the High Wore Off

Classes were a rush.

Professor Zhang Jianjun teaching leadership — from Weber's three types of authority to Mao Zedong's carrot-and-stick doctrine — you're sitting there feeling like the universe just made sense. Dean Liu Qiao on corporate finance — after he walks you through the Modigliani-Miller theorem you're convinced you'll never get fleeced again.

Post a WeChat Moment after class: "Today's lecture was insane."

Six months later, what do you actually remember?

Be honest.

I remember the definition of Nash Equilibrium. But ask me to use it to break down a real supplier negotiation? I'd have to dig through my notes. Thirty minutes of digging and I still can't find that brilliant case study — because it's scattered across three transcript segments, two PPT screenshots, and one classroom recording.

451 pages. Three million words. All sitting in Notion. Neat and tidy.

Neat and tidy and dead.

Three million words against the pitiful bandwidth of human working memory. 400K in tuition against a pile of links you never open.

After the high wore off, nothing but emptiness.

You spent 400 grand. What did you get? A killer Notion workspace, and a version of yourself that can't recall any of it.


I Refused to Accept That

I'll admit it: I'm the type who learns a lot and retains little.

Want to learn everything. Want to save everything. An information-devouring machine. 451 pages of beautiful Notion architecture, the whole class singing my praises. But that's storing, not knowing.

Stored doesn't mean learned. Learned doesn't mean understood. Understood doesn't mean usable.

Years in HR taught me one thing, because I watched it happen a thousand times: "training" fails. Companies drop hundreds of thousands sending executives to CEIBS or Hupan Academy. They come back, write a summary, post on social media. The knowledge dies in a PowerPoint.

Me? I wasn't merely at risk of becoming that. I already was that.

But I refused to accept it.

400K can't just buy a diploma and a pile of Notion links.


8 Minutes

So I talked to ChatGPT for 8 minutes.

Not a chat. A tear-it-apart-to-the-studs conversation.

Before that, I'd tried plenty of dumb approaches — manually reorganizing Notion, exporting with Readwise, even considered hiring someone to take notes course by course. All hit the same wall: too much volume, not enough human bandwidth.

That single session produced the skeleton for the entire project. Three-layer architecture: Source (evidence) -> Distill (extraction) -> Apply (application). Every Concept must link to both evidence and a real-world use case. No evidence means you haven't learned it. No application means you'll forget it fast.
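The linking rule can be sketched as a tiny data model. This is purely my illustration under assumed names — the post describes the rule, not a schema — but it shows why "no evidence" and "no application" are structural failures, not style preferences:

```python
from dataclasses import dataclass

# Minimal sketch of the three-layer rule: every Concept must
# link down to Source (evidence) and up to Apply (use case).
# All class and field names here are invented for illustration.

@dataclass
class Source:
    """Raw evidence: a transcript segment, slide, or recording."""
    course: str
    excerpt: str

@dataclass
class Application:
    """A real-world situation the concept was actually used in."""
    scenario: str

@dataclass
class Concept:
    name: str
    evidence: list       # empty = "you haven't learned it"
    applications: list   # empty = "you'll forget it fast"

    def is_valid(self) -> bool:
        # The rule from the post: a Concept with no evidence link
        # or no application link does not count as knowledge.
        return bool(self.evidence) and bool(self.applications)
```

A Concept like Nash Equilibrium with a transcript excerpt but no real negotiation it was ever applied to would fail `is_valid()` — which is exactly the failure mode the architecture is designed to surface.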

This wasn't my first rodeo.

I've built AI virtual departments. Written five-layer constitutions for Agents. I ran a system of 8,000 people at AB InBev.

I know that the more powerful the capability, the more it needs governance. More people doesn't mean more efficiency. More Agents doesn't either.

AI is not an employee. It's raw capability with zero accountability. You don't draw the lines, it draws them for you.


Sharpening the Blade

Then I ran a one-course pilot. Industrial Analysis and Ecosystem Strategy. Manual distillation. 13 Concepts. 12 Skills. Pure handcraft, word by word.

Took three days.

The point was to set a quality benchmark. If you haven't done it yourself, you don't know what "good" looks like. Those 13 Concepts became the scoring baseline for every Agent that followed.

Then I wrote a Knowledge Extraction Constitution. Yes, a constitution. Nine chapters. Hard constraints, prohibited behaviors, quality gates, ultimate truths.

Then a PRD. Twelve chapters. Dependency chains, data flow diagrams, source material type matrix, per-session SOP, batch execution plan, human-AI division of labor matrix, milestones, risk register.

Then Python scripts hitting the Notion API to pull data from all 53 courses. Bucketed by data density — 14 RICH, 20 SKELETON, 7 MINIMAL, 12 SKIP. Different densities, different pipelines.
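The bucketing step might look something like this. The thresholds and input signals are invented for illustration — the post gives the tier names and counts, not the cutoffs:

```python
# Hedged sketch of the density bucketing that routed each course
# into a pipeline tier. Word-count thresholds are assumptions.

def bucket(word_count: int, has_transcript: bool) -> str:
    """Assign a course to a distillation tier by data density."""
    if not has_transcript and word_count == 0:
        return "SKIP"      # nothing to distill at all
    if word_count < 2_000:
        return "MINIMAL"   # headline-level extraction only
    if word_count < 20_000:
        return "SKELETON"  # outline-level distillation
    return "RICH"          # full Concept/Skill extraction
```

Whatever the real cutoffs were, the design choice is the same: decide the pipeline per course up front, so a thin course never gets fed to a pipeline that expects transcripts it doesn't have.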

Then Claude Code Skill files. Every Concept must fill seven fields — one-line definition, why it matters, boundary conditions, canonical example, counter-examples and misuses, concept relationships, my internalization. Miss one field, rejected.
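The "miss one field, rejected" gate is easy to picture as a validator. The seven field names come straight from the post; the function itself is my sketch, not the author's actual Skill file:

```python
# The seven mandatory Concept fields listed in the post.
REQUIRED_FIELDS = [
    "one_line_definition",
    "why_it_matters",
    "boundary_conditions",
    "canonical_example",
    "counter_examples_and_misuses",
    "concept_relationships",
    "my_internalization",
]

def validate(concept: dict) -> list[str]:
    """Return missing or empty fields; an empty list means accepted."""
    return [f for f in REQUIRED_FIELDS
            if not str(concept.get(f, "")).strip()]
```

An Auditor-style check then reduces to `if validate(concept): reject(...)` — no judgment call, no partial credit.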

Then a deduplication registry. Seven Agents writing to the same vault simultaneously — without registration, you get three competing versions of "information asymmetry."
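A registration step like that can be as small as a claim-before-write map. This is a hedged single-process sketch — the normalization rule is my assumption, and seven genuinely concurrent writers would also need a lock around `claim`:

```python
# Minimal sketch of a deduplication registry: an Agent must claim
# a normalized concept name before writing it to the vault.

class ConceptRegistry:
    def __init__(self) -> None:
        self._owners: dict[str, str] = {}

    @staticmethod
    def _normalize(name: str) -> str:
        # Collapse case and whitespace so "Information  Asymmetry"
        # and "information asymmetry" are the same concept.
        return " ".join(name.lower().split())

    def claim(self, name: str, agent: str) -> bool:
        """First agent to claim a concept owns it; later claims fail."""
        key = self._normalize(name)
        if key in self._owners and self._owners[key] != agent:
            return False  # another Distiller already owns this concept
        self._owners[key] = agent
        return True
```

The point isn't the data structure; it's that ownership is decided before writing, so the vault can never accumulate three competing versions of the same idea.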

Then — and only then — did I start running.

The 72 hours you see? That's the result of blade-sharpening. How long did the sharpening take? About two weeks.


Running

Claude Code has an experimental feature called Teams.

You can spin up multiple Agents in one session, assign roles, run them in parallel, and let them communicate. Coordinator manages the big picture. Distiller handles extraction. Auditor handles quality.

This isn't "open seven terminal windows." This is a layered, dependency-aware, protocol-driven AI work team.

My org chart: 1 Coordinator (me as commanding officer), 1 Data Fetcher pulling data, 3 Distillers running parallel extraction, 1 Auditor conducting independent review. Distillers can't start until Data Fetcher delivers. Auditor can't review until Distillers submit.
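That gating can be written down as a small dependency table. The role names follow the post; the scheduler logic is my illustration of the ordering rule, not Claude Code's actual Teams mechanism:

```python
# Sketch of the dependency gating: Distillers wait on the Data
# Fetcher, the Auditor waits on all three Distillers.

DEPENDS_ON: dict[str, set[str]] = {
    "data_fetcher": set(),
    "distiller_1": {"data_fetcher"},
    "distiller_2": {"data_fetcher"},
    "distiller_3": {"data_fetcher"},
    "auditor": {"distiller_1", "distiller_2", "distiller_3"},
}

def ready(agent: str, done: set[str]) -> bool:
    """An agent may start only once all its dependencies have finished."""
    return DEPENDS_ON[agent] <= done
```

A coordinator loop then just launches whichever agents are `ready` and haven't run yet — the table is the whole protocol.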

Exactly like managing a team. Except this team doesn't take sick days, doesn't slack off, and doesn't play politics.

Phase 1: five courses, 18 Concepts, Source Link pass rate 100%. That's 11 points above my own manual benchmark of 88.9%.

My team outperformed me. Because I gave them rules.

Same principle as managing people. It's not about finding the smartest talent. It's about building the clearest system. That's the single biggest lesson from my fifteen years at AB InBev. Apply it to AI — works exactly the same.

Phase 2: three courses. Phase 3: six courses. Phase 4: sweep everything. Final pass to clear the transcript backlog — re-ran the Notion API scripts to download 110 transcript files, re-distilled 53 skeleton sessions.


Fabricated

Management Economics, session three. The AI Agent turned in a beautiful set of information economics notes. Adverse selection, signaling games, equilibrium derivation — the works.

That session's Notion page was completely empty.

Fabricated. The whole thing.

The Auditor caught it. My reaction wasn't anger. It was relief.

Because I'd anticipated this when I was sharpening the blade.

Constitution, Chapter 8: If data is insufficient, tag it as skeleton. Never fabricate. That's why the Auditor knew to check. That's why it caught it.

67 skeleton sessions. 67 times the system admitted "there's no data here."

Think that's embarrassing?

The ability to admit what you don't know is ten thousand times more valuable than the ability to fake what you do.


Results

72 hours. 173 Concepts. 58 Skills. 565 notes.

173 Concepts interlinked in Obsidian. The knowledge graph grew itself.

Most-referenced Concept: "Six Cognitive Traps" — cited 41 times. Spanning leadership, game theory, organizational behavior, consumer psychology.

"Information Asymmetry" connected to "Loyalty vs. Competence," connected to "Three Team Roles," connected to "Strong Center, Weak Periphery." From economics to leadership to organizational architecture — the hidden threads between concepts surfaced.

Working alone, I would never have drawn a line between "Mao Zedong's Management Philosophy" and "Corporate Governance." The knowledge graph drew it for me.

There's a line in the Constitution: Transcripts are ore. Concepts are metal. Applications are weapons. No matter how much ore you have, unrefined it's just rock. No matter how much metal you have, unforged it's just inventory. Only weapons go to war.

The weapons were forged.

The question is: whose hands are they in?


The Tribunal

Before I started, I asked ChatGPT a question: "Among the world's greatest minds, who would handle this problem best?"

Not showing off. I genuinely wanted to know.

Niklas Luhmann — German sociologist, one of the most prolific academic writers of the 20th century. He wrote 70 books using a slip-box (Zettelkasten). He said the slip-box is a conversation partner, not a storage system. Two months later you flip to an old card and discover it reacts with another card in a way you never predicted — that's the moment knowledge is born.

My 173 Concepts — are they living things that have argued with me, or assembly-line products?

I know the answer. I don't love admitting it.

Andy Matuschak — former head of educational technology at Apple, researcher at Khan Academy — has spent a decade studying "how people actually learn things." His conclusion fits in four words: Understanding requires effortful engagement. He'd open my Concept template, see the "My Internalization" field, and ask: Did you write this, or did AI write this?

If AI wrote it, then it's AI's internalization. Not yours. You read it once? That just means you read it once.

Soenke Ahrens — author of How to Take Smart Notes, the person who introduced Luhmann's slip-box method to the world — is even more precise: AI-generated notes have never lived inside your brain.

George Miller — the cognitive psychologist who gave us the famous "7±2" working memory model — says chunking efficiency depends on prior knowledge. The 1973 chess experiment by William Chase and Herbert Simon — the latter a Nobel laureate in economics and a pioneer of artificial intelligence — makes it even clearer: faced with random board positions, masters and novices remember equally poorly.

A chunk without experiential grounding is an empty shell.

Whether my 173 Concepts are chunks or empty shells depends on how many times I've pulled them up in a real business decision.


So Was I Wrong?

No.

The biggest waste of an MBA isn't the tuition. It's that you learned a ton of great stuff, and because you can't find it or connect it, it erodes inside your head.

This system solved "can't find it" and "can't connect it."

The remaining problem — "can't recall it" — that's an internalization issue.

An index is not knowledge. Knowledge is the chunk that fires automatically when you're making a decision.

Cognitive science has already proven this. David Epstein — who spent five years researching why generalists outperform specialists in complex domains — distinguishes kind environments from wicked environments in Range. Chess is kind — clear rules, instant feedback. Business decisions are wicked — fuzzy rules, delayed feedback, and the framework that worked last time might be a trap next time.

There's no shortcut from kind to wicked. AI can't take the punch for you.


The Slow Work

So now I'm doing something very slow.

173 Concepts. Going through them one by one. Not looking at the AI output. Writing "My Internalization" in my own words.

If I can't write it, I haven't internalized it. If I can, it's mine.

I've gotten through about twenty. Progress is slow.

But every time I finish one, it stops being someone else's knowledge and becomes my judgment. It goes from searchable to recallable. From index to instinct.

It's slow. It's tedious.

But it's the only way.

AI built the scaffolding for me. It can't climb for me.

Scaffolding was never meant for you to stand on and admire the view.

It's meant for you to climb up, then tear it down.

Uncle J · 2026.02
Peking University Guanghua MBA · Former Longfor Pre-Partner · AB InBev Veteran
Currently: a guy who writes constitutions for AI