Many AI labs have called for the democratization of AI. In a recent GovAI blog post, Elizabeth Seger summarizes four different ways of interpreting the phrase:

  • Democratizing AI use: Making it easier for people to use AI technologies
  • Democratizing AI development: Making it easier for a wider range of people to contribute to the development and design of AI systems
  • Democratizing AI benefits: Ensuring that the benefits of AI systems are widespread
  • Democratizing AI governance: Ensuring that decisions involving AI systems are informed by a variety of stakeholders and reflect democratic processes

Things I like about the post

  • “Democratizing AI” is a vague phrase, and the post usefully distinguishes between various ideas that the term can refer to.
  • The framework can help us distinguish between forms of democratization that are relatively safe and those that carry more risks.
    • Ex: Democratizing AI benefits seems robustly good, whereas democratizing AI use has risks (given that AI can be dual-use). 
  • The framework could allow AI developers to maintain their commitment to (some forms of) democratizing AI while acknowledging that some forms carry risks.  
  • The post analyzes decisions by AI labs through the lens of the framework. Example:

In declaring the company’s AI models will be made open source, Stability AI created a situation in which a single tech company made a major AI governance decision: the decision that a dual-use AI system should be made freely accessible to all. (Stable diffusion is considered a “dual-use technology” because it has both beneficial and damaging applications. It can be used to create beautiful art or easily modified, for instance, to create fake and damaging images of real people.) It is not clear, in the end, that Stability AI’s decision to open source was actually a step forward for the democratisation of AI governance.

  • The post is short! (About a 5-minute read.)

Read the full post here

Comments (4)

From the article:

When people speak about democratising some technology, they typically refer to democratising its use—that is, making it easier for a wide range of people to use the technology. For example the “democratisation of 3D printers” refers to how, over the last decade, 3D printers have become much more easily acquired, built, and operated by the general public.

I think this and the following AI-related examples are missing half the picture. With 3D printers, it's not just that more people have access to them now (I've never seen anyone talk about the "democratization" of smartphones, even though they're more accessible than 3D printers). It's that who gets to use them and how they're used is governed by the masses, not by some small group of influential actors. For example, anyone can make their own CAD files and share them with each other.

In the next paragraph:

Microsoft similarly claims to be undertaking an ambitious effort “to democratize Artificial Intelligence (AI), to take it from the ivory towers and make it accessible to all.” A salient part of its plan is “to infuse every application that we interact with, on any device, at any point in time, with intelligence.”

I think this is similar. When MS says they want to "infuse every application" with AI, this suggests more than just broader accessibility. The implication is that users will get to decide what to do with it.

When MS says they want to "infuse every application" with AI, this suggests more than just broader accessibility. The implication is that users will get to decide what to do with it.

Microsoft already infuses their products with telemetry. Does it imply that they want the users to decide what to do with it?

It's not that they use it in every application; it's that they're making a big show of telling everyone that they'll get to use it in every application. If they made a big public announcement about the democratization of telemetry and talked a lot about how I'll get to interact with their telemetry services everywhere I use an MS product, then yes, I think part of the message (not necessarily the intent) is that I get to decide how to use it.

Ah, okay, that makes sense.