November 28, 2023 // Did someone forward you this newsletter? Sign up to receive your own copy here.
OpenAI's governance lesson
Underneath the recent drama surrounding OpenAI and its CEO, Sam Altman, there are critical questions and important insights about how tech companies are governed.
“Legal structure” and “corporate governance” might sound like boring jargon, but they have the power to shape everything. This week, we’re exploring the importance of governance and accountability structures in building a better tech future.
//What is governance?
Beneath the apps we download and the platforms we use, there are governance decisions that determine how a company operates, the speed at which it moves, and the incentives that dictate its behavior. To the end user, these decisions are largely invisible, but they shape the products we use every day.
In the US:
Legal structure refers to how an organization is legally incorporated, and therefore what activities it can undertake. For example, a nonprofit legal structure cannot raise equity investments or distribute a profit to shareholders, whereas for-profit corporations can. Nonprofits have a charitable mission, whereas for-profits, in general, have a fiduciary responsibility to maximize value for their shareholders.
Corporate governance refers to the system of rules, practices, and processes by which a company is directed and controlled. A company’s corporate governance is primarily the purview of its board of directors, which is responsible for ensuring there's a system of checks, balances, and decisions to align the many interests of a company’s stakeholders.
Corporate governance is different from digital governance, which refers to the management of online communities, platforms, and data. But in the world of technology, the two are deeply related, which is a key reason why Project Liberty is dedicated to responsible digital governance as one of its pillars.
//OpenAI structure & governance
From a legal and corporate governance perspective, OpenAI is unique:
In 2015, OpenAI was founded as a nonprofit “with the goal of building safe and beneficial artificial general intelligence for the benefit of humanity.” This legal structure was chosen to resist the growth-at-all-costs approach common amongst for-profit tech firms.
In 2019, OpenAI launched a for-profit subsidiary that capped investor returns (at 100x for first-round investors). It had become clear that pursuing a nonprofit funding model through philanthropic donations would, according to OpenAI, “not scale with the cost of computational power and talent required to push core research forward, jeopardizing our mission.”
The nonprofit board became the overall governing body for all OpenAI activities, and its for-profit subsidiary was—technically via this governance structure—obligated to pursue the nonprofit’s original mission of safe and beneficial AI.
Technology lawyer Duane Valz wrote in a Medium post last week: “Either the nonprofit’s mission needed to be expanded, or more decision-making and leadership over its broader pursuits needed to be devolved to subsidiary entities operating its Generative AI business.”
//The tension at the heart of tech
The misalignment at the heart of OpenAI is a microcosm of a bigger tension in tech between the speed of AI development and the safety of AI systems.
Move fast and break things: In the last few years, the leading AI companies have been in an arms race to increase computational power, grow users, and gain market share.
Safety first: For all the companies seeking to win the AI race, there are also many voices raising concerns about how speed might come at the cost of safety. Earlier this year, over 33,000 people signed an open letter calling for a pause in the development of AI systems.
In the last decade, there’s been a renaissance of experiments and innovations in digital and corporate governance that aim to better align ethics with market-based growth.
In new initiatives, Project Liberty’s Institute and Alliance member Aspen Digital launched the Ethical Principles for Responsible Technology initiative to develop principles and governance recommendations for responsible technology innovation. Hundreds of experts from five continents participated in the 10 sessions. The final recommendations will be released in early 2024.
In corporate governance, a number of new for-profit legal structures that account for a broader set of societal impacts have emerged in the US, such as the Low-profit LLC and the Public Benefit Corporation.
In tech self-regulation, Anthropic, another AI company, has created a Long-Term Benefit Trust, whose trustees have a responsibility to intervene if they determine the company is on a path to create long-term harm.
In digital governance, we’ve seen the creation of Decentralized Autonomous Organizations (DAOs), which are experiments in digitally native, distributed governance where smart contracts govern how people organize and how decisions are made (a minimal sketch of this idea follows this list).
In employee ownership, “exit to community” is an emerging movement of companies that transition to being owned by their community of workers, customers, and other stakeholders.
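For readers curious what “smart contracts govern decisions” looks like in practice, here is a minimal, illustrative sketch in TypeScript of the kind of rule a DAO typically encodes on-chain: members cast token-weighted votes on a proposal, and the outcome is computed mechanically from the tally rather than decided by a board. The names (Proposal, castVote, tally) and numbers are hypothetical and not drawn from any real DAO framework; real DAOs implement this logic in smart-contract languages such as Solidity.

```typescript
// Illustrative sketch only: real DAOs encode rules like these in on-chain
// smart contracts; this TypeScript merely mirrors the decision logic.

type Vote = "for" | "against";

interface Proposal {
  id: number;
  description: string;
  votes: Map<string, { vote: Vote; weight: number }>; // member address -> weighted vote
}

// Each member's voting weight is their token balance; members vote once.
function castVote(proposal: Proposal, member: string, vote: Vote, tokenBalance: number): void {
  if (proposal.votes.has(member)) {
    throw new Error("member has already voted");
  }
  proposal.votes.set(member, { vote, weight: tokenBalance });
}

// The outcome is computed mechanically from the recorded votes:
// no board or executive can override the tally.
function tally(proposal: Proposal, quorum: number): "passed" | "failed" | "no quorum" {
  let forWeight = 0;
  let againstWeight = 0;
  for (const { vote, weight } of proposal.votes.values()) {
    if (vote === "for") forWeight += weight;
    else againstWeight += weight;
  }
  if (forWeight + againstWeight < quorum) return "no quorum";
  return forWeight > againstWeight ? "passed" : "failed";
}

// Example: a hypothetical treasury-spend proposal decided by three members.
const proposal: Proposal = { id: 1, description: "Fund audit", votes: new Map() };
castVote(proposal, "0xAlice", "for", 500);
castVote(proposal, "0xBob", "against", 200);
castVote(proposal, "0xCarol", "for", 100);
console.log(tally(proposal, 300)); // "passed"
```

The design point is that the decision procedure itself is code: once deployed, it binds every participant equally, which is what distinguishes this form of governance from a board resolution.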
//The future of digital governance
In the long term, innovations in digital governance might eventually make corporate governance principles and legal structures less important.
In 2022, Joshua Tan, an online governance expert and board member of the Metagovernance Project (a member of Project Liberty’s Global Alliance for Responsible Tech), predicted that by 2032, DAOs and other digital organizations would organize more assets and production than traditional, legally constituted corporations in the US.
Whether it’s enlightened corporate governance and hybrid legal structures, entirely new digital forms of governance that make corporate governance obsolete, or heightened regulation restricting extractive technology, one thing is clear: ethical, decentralized tech governance is a foundational building block of a better web.
Project Liberty in the news
// 🏛 Project Liberty founder Frank McCourt spoke to CNBC to explain exactly how big tech and social media giants inflict profound damage on our society.
// 🧱 Earlier this month, Project Liberty founder Frank McCourt joined Axios’ Ryan Heath at Web Summit in Lisbon to suggest that blockchain has the potential to fix the internet. Watch the conversation here.
// 🎙 Project Liberty CEO, Martina Larkin, was a guest on Tech’ed Up with Niki Christoff to explore why we need to reclaim our data. The podcast takes a deep dive into data, decentralization, and how DSNP gives power back to users.
Other notable headlines
// 🤯 OpenAI researchers reportedly created a new AI model, called Q*, that can solve grade-school-level math problems, a potential breakthrough in AI development, according to MIT Technology Review.
// 🗳 An article by Freedom House argues that while digital repression could threaten the packed 2024 electoral calendar, voters worldwide can still be protected.
// 🤖 According to an article in The Verge, Meta quietly disbanded its Responsible AI team as it puts more of its resources into generative AI.
// 🇪🇺 An article in Reuters reported that France, Germany, and Italy have reached an agreement on how artificial intelligence should be regulated, a move that is expected to accelerate negotiations at the European level.
// 📱 YouTube videos and Instagram Reels could decide India’s next election. Major political parties in India are courting social media influencers to reach rural voters, paying for reach and dodging tough questions, according to an article in WIRED.
// 🇨🇳 As Chinese influencers are being forced to use their real names online, they are quitting social media platforms, according to an article in Rest of the World.
// 🕵 An investigation by The Markup found that Meta tracks students from kindergarten through college via its Pixel, a tracking tool that silently collects and transmits information to Facebook as users browse the web.
Partner news & opportunities
// Virtual event: Artificial intelligence and the rule of law
November 30th–December 1st: Virtual and in person in Washington, D.C.
The Athens Roundtable will convene in Washington, D.C., to examine the risks of foundation models and generative AI, as well as promising AI governance mechanisms to mitigate them. The goal of the 2-day event is to chart pathways for implementing impactful governance solutions. Learn more and register here.
// Applications open to become a visiting scholar
The Institute for Rebooting Social Media (RSM), a three-year research initiative addressing social media’s most urgent problems, including misinformation, privacy breaches, harassment, and content governance, is accepting applications for its 2024-25 cohort of Visiting Scholars. The program is open to full-time academic faculty members, who will collaborate with RSM during the 2024-2025 academic year. Learn more and apply here.
/ Project Liberty is advancing responsible development of the internet, designed and governed for the common good. /