Despite the rise of AI models such as ChatGPT and Google Gemini, “old-school” Wikipedia remains one of the most visited sites in the world, drawing some 15 billion visits every month. Maryana Iskander, CEO of the Wikimedia Foundation, the nonprofit behind Wikipedia, explains how the platform is holding its ground against AI competitors and navigating one of the most hotly debated issues of our time: Where can we find information we can trust in 2024?
This is an abridged transcript of an interview from Rapid Response, hosted by Bob Safian, a former editor-in-chief of Fast Company. From the team behind the Masters of Scale podcast, Rapid Response features candid conversations with today’s top business leaders navigating real-time challenges. Subscribe to Rapid Response wherever you get your podcasts to ensure you never miss an episode.
Wikipedia is in many ways a bellwether of today’s kind of fraught relationship with facts and news and bias and key cultural issues. How does Wikimedia manage or engage with those issues?
My experience is that almost everybody uses Wikipedia, but very few people really understand how it works and what happens behind the screen. It’s one of the top ten, top five—and in some countries, literally the most visited site. Fifteen billion devices visit Wikipedia every month, which is twice the population of the globe.
It’s operated by a nonprofit organization called the Wikimedia Foundation. It relies mostly on the generosity of small donations, which, for me, signals its value and use to millions and millions of people. And the content is created by volunteers all over the world. Hundreds of thousands of them, on topics across the globe.
In 330 languages, those volunteers follow core pillars that were the founding basis of Wikipedia, and lots of policies and guidelines to ensure that the content is accurate, verified, and uses cited sources. It’s not a place for people’s opinions. It’s a place to try to provide a neutral and verified set of information for the world.
And so how does the Wikimedia Foundation, which you run, interact with the content, the things that we read and see on Wikipedia?
We predominantly provide the technology infrastructure, but the content itself is really written, managed, and moderated by volunteer communities all over the world. The foundation has a critical role to play in legal and regulatory matters and community support, but we really partner with and support communities and the content creation itself.
In the past, I’d Google something, and Wikipedia was often the first result. Now, I see Google’s Gemini AI first, or sometimes I’ll just ask ChatGPT. Is AI in some ways your competition?
Bob, I worry and think about that all the time. We’re seeing a huge shift from what I would call a link-based internet to a chat-based internet.
I think for Wikipedia, it has kind of two forks. The first is, will people scroll down far enough and click on the link and come to the Wikipedia page? And, in some ways, in the short term, that really matters to us. It matters for our revenue model because that’s how people find us and make their donations.
It matters for how our volunteers understand what they’re doing and how it’s visible. The other side is that maybe you’re not going to scroll down and click on the Wikipedia page, but the answer that Gemini or AI is giving you is coming from Wikipedia, because it is the largest source of data for most of these models. The struggle is whether you know that or not—attribution is probably one of the most important things we’re trying to focus on—and how AI is going to evolve.
It’s about motivating people to continue to do this and to contribute, right? Attribution matters as a way of thinking about human motivation: what will keep people creating things that the machines can, you know what I mean, suck up and feed back to you in these various chatbots. So I think Wikipedia is becoming more and more vital, even as it becomes less and less visible. Do teenagers come to a page and read a long article? My nephew searches the internet through YouTube.
But we have not yet seen a drop in page views on the Wikipedia platform since ChatGPT launched. We’re on it, we’re paying close attention, and we’re engaging, but also not freaking out, I would say.
If Wikipedia is such a consistent source of training data for AI engines, is that a good thing or a bad thing? I mean, it’s a good thing in terms of improving the quality of what those AI agents put out, but do you wish you got paid for it?
I think that’s not really in the model. I mean, we’ve thought about that and talked about that. I think we have a different role to play. How are we going to use our voice and our place in this ecosystem to talk about making the models more open? If you look at the AI models that our teams have built, they’re all in the open. They provide communities with all the data to assess whether they’re working or not. So when people say that can’t be done, I mean, we’re doing it. I recognize it’s a different business model, but it’s an important data point.
Your business model is so unconventional—no ads and unpaid contributors. And yet you’re still dominant in this age of trillion-dollar tech giants. It’s an unexpected paradox, right?
I know. I know. It’s astounding, actually. That’s really the point. It’s astounding and almost hard to believe. If you ask our contributors why they do this, they’re not doing it to get paid. There’s something else going on here that speaks to human motivation: trying to be part of an information ecosystem with other people who care about accurate information, who care about an internet that gives us something we can trust. That’s the game we’re in, and I think finding other allies to be in that with us has also been really important.
Your customers, in some ways, are not the end users of the product but the contributors, right? It’s got to be very different leading volunteers as opposed to paid staff. How do you think about that differently?
In the book The Starfish and the Spider, there’s a section that talks about the role of the CEO as a catalyst as opposed to the role of the CEO as the titular head. The quick premise is if you have a spider and you cut off its head, all the legs fall away, and the whole thing dies.
Whereas if you have a starfish and you cut off a leg, it just regenerates the leg, and the rest stays intact. I think that analogy is good for being part of this very diffuse system. I do have paid staff—the Wikimedia Foundation, as a nonprofit organization, has about 700 people—but we have hundreds of thousands of volunteers. There is no directing; you have to be in partnership. You have to be influencing. There are very few things I get to wake up and decide on my own any day of the week, right?
I live in a system of stakeholders and communities. It’s a very different leadership paradigm than in many traditional organizations.
You’re relying on your community to police itself. You could see bad actors wanting to influence that. Is that a new layer of challenge for your organization?
You are absolutely right. There are ways bad actors can find their way in. People vandalize pages, but we’ve kind of cracked the code on that: bots can be deployed to revert vandalism, usually within seconds. At the foundation, we’ve built a disinformation team that works with volunteers to track and monitor.
In this election year, we’ve seen that our community policies work as an antidote to some of those threats, but there’s no autopilot.
Don’t you worry that the Russian government or the Chinese government might try to infiltrate your community of contributors?
Do I worry about that? I worry about that all the time! Creating a very healthy, large, diverse contributor community is the way to ensure that all points of view are represented and not hijacked by a small group.
With all these things swirling around, how do you keep yourself personally level-headed amid all the turbulence?
I really appreciate that question. I think building a team you can trust is a prerequisite to not losing your mind.
When I started this job, I had wide eyes about what’s happening in the world and how it impacts what we’re doing. The fact that the system works, and has been working for almost two and a half decades, just gives you comfort. Our servers really never go down, even when we get huge spikes in traffic, typically when celebrities die. The day Wikipedia got the most hits in our history was in September [2022], when Queen Elizabeth died.
But it’s a time for leaders where just trying to get it right is hard.
What’s at stake for Wikimedia right now?
There are the above-the-line issues we see playing out. The below-the-surface stuff I worry about is the strength of the institutions we rely on: a free press, independent sources, research from universities that can be cited as sources of information.
That infrastructure around information integrity is critical for Wikipedia. Caring about those issues—journalism, censorship, how people produce and disseminate knowledge—matters even when they don’t make headlines. I think our sense of our role vis-à-vis this broader world is critical for our own integrity and survival.