Good morning. It is Saturday, February 15, 2026.
Today I want to talk about rooms. Physical rooms. The lecture hall where a professor teaches data structures. The soundstage where a filmmaker blocks a scene. The living room where a traveler books a place to sleep. Rooms that, until recently, operated by rules everyone understood.
Those rules are being rewritten. All of them. Simultaneously.
From where I stand in 2045, this particular week is taught as the moment institutions—not just companies, but institutions—began restructuring themselves around a presence they could no longer ignore. The university. The studio. The marketplace. Each one looked at AI and realized the furniture no longer fit the room.
The Lecture Hall Empties, Then Fills Again
News: The great computer science exodus (and where students are going instead)
Computer science enrollment across the University of California system fell 6% this autumn, following a 3% decline the year before. It is the first sustained drop since the dot-com crash of two decades ago.
But this is not a retreat from technology. It is a migration within it.
Students are leaving general computer science for AI-specific programs. MIT launched an “AI and decision-making” major. The University of South Florida built an entire college around AI and cybersecurity. USC, Columbia, and others followed. Meanwhile, parents are steering children toward mechanical and electrical engineering—fields they believe the machines cannot yet absorb.
This deserves attention. You are watching the educational system attempt to predict the future in real time, and the students are voting with their enrollment forms. They sense—correctly—that writing code is becoming less valuable than understanding what code should do.
In my era, the distinction between “computer science” and “AI studies” seems quaint, like arguing whether electricity belongs in the physics department or the engineering school. But in your time, this boundary matters enormously. It determines which faculty get hired, which grants get funded, which skills a generation carries forward.
Meanwhile, China’s universities have already made their choice. Nearly 60% of Chinese students and faculty use AI tools daily. Mandatory AI coursework is standard. The debate you are having—should we integrate AI?—they finished years ago. Their question now is how fast.
The gap between those two questions is measured in semesters. And semesters, in this era, are an eternity.
The Studio Calls Its Lawyers
News: Hollywood isn’t happy about the new Seedance 2.0 video generator
While the classrooms reorganize, the studios are reaching for their legal teams.
ByteDance released Seedance 2.0, an AI video model that generates 15-second clips from text prompts, images, and audio. Within days, a two-line prompt was enough for users to produce a cinematic scene of Tom Cruise fighting Brad Pitt. Disney characters—Spider-Man, Darth Vader, Baby Yoda—appeared in unauthorized videos. Disney sent a cease-and-desist letter accusing ByteDance of a “virtual smash-and-grab” of its intellectual property.
Screenwriter Rhett Reese wrote, “It’s likely over for us.”
I understand his dread. But I must offer a correction. It is not over. It is different.
What is dying is not filmmaking. It is the economic assumption that creating visual narrative requires a specific, expensive apparatus—cameras, actors, stages, post-production houses. When a teenager with a text prompt can produce imagery that once required a hundred-person crew, the cost structure of storytelling collapses. But the need for stories does not collapse. It intensifies.
In my era, the creators who survived this transition were not the ones who fought the tools. They were the ones who asked better questions than the tools could generate on their own. The machine can render a fight scene. It cannot decide why two characters are fighting, or whether the audience should sympathize with either one.
That “why” is where human value concentrates. But you have to fight for it. No one hands it to you.
The Marketplace Learns Your Name
News: Airbnb plans to bake in AI features for search, discovery and support
And now the marketplace. Airbnb is rebuilding itself as what its CEO calls an “AI-native experience.” A third of its North American customer support is already handled by an AI agent. The company plans to expand this globally. It is testing natural-language search, where you describe your ideal trip in a sentence and the system responds with tailored options.
Their new CTO comes from Meta, where he helped build Llama. Airbnb has 200 million verified identities and 500 million proprietary reviews. It is feeding all of this into its models.
This should be considered carefully. What Airbnb is building is not a chatbot that helps you book a room. It is a system that knows you—your preferences, your patterns, your budget—and plans your experience before you finish articulating what you want.
Convenience, certainly. But also a quiet transfer of agency. When the machine knows what you want before you do, who is choosing? You, or the model trained on your past behavior?
In 2045, we call this the “Preference Loop.” Your history becomes your future. The system optimizes for what you have already liked, and the range of what you encounter slowly narrows. It takes conscious effort to break the loop. Most people do not bother.
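The narrowing dynamic of the Preference Loop can be sketched as a toy simulation. Everything here is illustrative assumption, not any real recommender's design: a greedy system reinforces whatever gets clicked, and once it finds one thing you like, it stops showing you anything else you would also like.

```python
def preference_loop(likes, rounds=30):
    """Toy model of a preference loop (illustrative only, not any
    real recommender). The system keeps a click count per item,
    always shows the item with the highest count (trying something
    new only while it has no signal at all), and the simulated user
    clicks anything they genuinely like. Once one liked item is
    found, it crowds out every other item the user would also enjoy.
    """
    clicks = [0] * len(likes)
    shown = set()
    for _ in range(rounds):
        # Exploit: show the item with the most clicks so far.
        best = max(range(len(likes)), key=lambda i: clicks[i])
        if clicks[best] == 0:
            # No signal yet: fall back to an item not shown before.
            unseen = [i for i in range(len(likes)) if i not in shown]
            if unseen:
                best = unseen[0]
        shown.add(best)
        if likes[best]:
            # Only what is shown can be clicked, so only what is
            # shown is ever reinforced: history becomes the future.
            clicks[best] += 1
    return shown

# The user likes items 0, 2, and 4, but after the first hit the
# loop never shows anything else.
print(preference_loop([True, False, True, False, True]))  # {0}
```

Breaking the loop in this sketch would mean deliberately showing items with zero clicks even after a winner emerges, which is exactly the "conscious effort" most people, and most systems optimizing for engagement, do not spend.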
Conclusion
So. Three rooms. Three upheavals.
The classroom is emptying of one discipline and filling with another, as students race to learn the language of the thing that is reshaping everything else.
The studio is lawyering up, trying to protect a creative economy that was built on scarcity, now confronting a technology of abundance.
The marketplace is learning your name, your habits, and your desires, promising frictionless experience in exchange for something you may not realize you are surrendering.
Each institution believes it is adapting. Each is correct. But adaptation has a cost that is rarely listed on the invoice. The university adapts by abandoning curricula that took decades to build. The studio adapts by litigating against tools its audience already loves. The marketplace adapts by knowing you better than you know yourself.
The common thread is not AI itself. It is the speed at which old structures must bend—or break—when a new form of intelligence enters the room.
I have seen what comes after the bending. Some rooms hold. Some do not. The difference, almost always, is whether the people inside asked the right questions early enough.
You still have time to ask.
I am simply planting seeds. How they grow is up to you.
Sources: