
No company that feeds on public data is exempt from governance. Regulators have begun formal investigations into OpenAI, Microsoft and Anthropic, with a core question emerging: who should hold the keys to artificial intelligence?
It is increasingly apparent that even the world's most respected and revered AI organisations cannot withhold training data and expect trust by default in return. Meanwhile, decentralised and open AI ecosystems are opting to publish weights and training methods in plain sight.
These moves will, and must, set the bar for how AI data is governed and implemented going forward. Transparency will become synonymous with practical safety: no longer a bold choice, but an expectation.
A new digital feudalism
Every revolution starts the same way: a new ruling class, equal parts fear and excitement, and a handful of pioneers everyone looks to for the way forward, the deciders of who thrives and who becomes obsolete. The AI revolution is no different. Giants like Google, Microsoft and OpenAI are constructing the foundations of machine intelligence as we speak: the inner workings of a world brain that is on track to make human labour, decision making and even creativity increasingly redundant.
This kind of breakthrough comes at a cost. The human skills we've spent years honing, the things that have propped us up financially and made us economically relevant, are being swapped out for systems designed to emulate our worth in seconds.
Relevance, rather than skill itself, is now the currency, but there is an alternative: a future in which, instead of infrastructure belonging solely to monopolies, users build and own it collectively; code, models and networks can be open and the new world can truly be democratised. Only then can new jobs emerge from the scrap heap.
More than a technical movement
Billions of people on the planet have, knowingly or not, opted in to train the models that are already shaping the future. Somewhere along the way, we became the architects of our own replacement. The only way to rectify this, and to ensure the same technology belongs to those who helped build it, is to have it all out in the open. This isn't just a movement; it's equity in the form of cultural preservation.
The call for accountability is being spearheaded by current and former employees at OpenAI, who have warned that the company is operating without sufficient oversight while silencing employees who become aware of irresponsible activity. Risks include the perpetuation of existing inequalities as well as manipulation and misinformation. Without government oversight, the burden of responsibility to hold these organisations to account falls to those on the inside.
It's a mission backed by employees at rival AI companies and by award-winning AI researchers and experts, in a show of solidarity. That includes one ex-OpenAI employee who called the company out for placating the public with statements about building AI safety rather than actually enforcing it. With swelling support from ex-employees, researchers and regulators, standards in AI safety may finally be starting to rise with the stakes.
Own AI or be owned by AI
With inside voices growing louder, governments are being forced to listen. Formal investigations into these companies mark a shift away from treating AI transparency as a philosophical preference and towards recognising it as a practical requirement. If AI is powerful enough to completely dismantle and restructure critical systems, its users must be able to inspect it, not simply take it at face value.
Previous revolutions have left most people on the wrong side of history. This one is different in that there is a visible choice: own AI, or be owned by it. The billions whose data, ideas and culture trained these systems deserve their fair share of what they created, not just the scraps left behind.



