
The urgent need for AI consent frameworks in open communities


The open source origin story describes a time when transparency, collaboration, and inclusion helped us build software that shifted power and value over the tools and infrastructure we depend on away from corporations and toward people and communities. It worked. Not without its own inequities and blind spots, but it worked. And like any democratization, it requires continual vigilance and collective action to sustain.

In this AI moment, it feels alarmingly like we're losing that power - watching it shift back toward corporations, away from the open communities who built the foundation AI runs on.

"Many people so desperately want to believe that they have a relationship with technology that is, at the very least, symbiotic. 'Bicycle for the mind' cliches galore. But computing technologies, built by the most powerful companies in the world, run by the richest men in the world (the richest the world has ever seen) are all fundamentally committed to something else altogether: not to symbiosis but to extraction, exploitation, and domination." — The Right to Say No, 2ndBreakfast, Audrey Watters

Open communities are feeling this loss of power in (at least) three ways:

  • AI being added to products, services and interactions open source communities use without consent
  • AI being added to products and services that open source communities co-build - without consent
  • Misalignment between communities and corporations about what "open source AI" actually means (independent of the OSI definition), obfuscating what communities are actually consenting to

Each of these is a deep topic on its own, but at the core they're about the same question: where do value and power flow, and how can communities make their consent a requirement for defining the parameters, including the right to say 'no'?

Note: Consent related to AI is not a new concept, but I couldn't find a framework specific to open communities (please correct me if I missed one); I'm happy to be wrong.

For open source communities using platforms and products shipped by companies

As open source became central to engineering workflows, maintainers, developers and communities have become the 'user' that products are designed for. It's increasingly critical that communities define their 'terms of being a user' (for want of a better term), and that they think in terms of collective action: not a long list of +1s in an issue, but actions that require attention.

Example goals:

  • (proactive) As a community of creators and users, define your AI terms: What AI integration is acceptable to you as a user building in the open? Opt-in only? What does it change? Model, data and weight openness? What crosses the line?
  • Establish response process: What AI changes (addition/location/purpose) trigger action? Who decides? What timeline do you give platforms to respond?
  • What if consent is denied?: Alternatives identified, migration guides ready, sponsors informed, communication channels outside the platform
  • (last resort) Exit together: Move as a unit, redirect funding, archive the old, document for others facing the same choice. Zig showed what this might look like:
"We look forward to fewer violations of our strict no LLM / no AI policy" — Zig, on their community standards around AI and migration from GitHub to Codeberg

We may discover at some point that products accept losing communities along their product paths, which is also very useful information. As we think about new areas for innovation and building, this becomes a gap that needs solving; a new openness.


For open communities building platforms and products governed by corporations

A lot of us contribute to, and build with, communities governed by companies; this is not inherently negative. In fact, many maintainers are hired from the community and care deeply about this relationship. However, with the AI race and (honestly) people fearing for their jobs right now, it's not a given. It feels increasingly important that the consent of communities be a milestone for building and shipping products:

"The response from the Firefox community has not just been overwhelmingly negative, it is universally negative as far as I can tell. At least among users willing to post on Mozilla's forums about the issue... Mozilla's core audience hates this move. At the very least, they would want all the AI components of Firefox to be opt-in, a choice that Firefox has been unwilling to make so far, instead enabling these new features by default." — The Mozilla Cycle, Part III, Taggart Tech
"It was a remarkable event – the first time in my 15+ years as an education writer (and 25+ years working in and adjacent to ed-tech) that I've been to a technology event where 'No' was presented as a viable (indeed, perhaps even the moral) response to computing." — The Right to Say No, Audrey Watters

Example goals for product teams:

  • Create a community consent milestone for each product release. This could be as easy as adding one more step to the open collaboration model.
  • Define AI contribution standards with your community: Be explicit about what AI-generated (or AI-anything) contributions will be accepted, disclosure expectations, and quality thresholds.
  • Be transparent: When proposing AI integration, explain what, why, what data, what's default. If you ship over objection, document it publicly and own the decision. Include the openness of models, weights and data in your product roadmap and how to contribute to the evolution.
  • Support exit paths: Business decisions sometimes go against community consensus. When that happens, be honest about those decisions, make space and dignity for disagreement and exit.

Example goals for communities:

  • Set terms proactively: Define your AI standards before there's a conflict. As one example, Drupal's agents.md is proposed to show how AI agents should interact with their project, authored by the community.
  • Participate in governance: Show up to roadmap discussions, RFCs, community calls. Consent requires presence: if you're not in the room, you can't shape direction.
  • Propose, don't just oppose: When AI features are proposed, offer alternatives. Opt-in instead of opt-out. Different defaults. Clearer disclosure. Give product teams something to say yes to.
  • Celebrate alignment: When products get consent right, amplify it. Positive examples create pressure on those who don't.
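To make "set terms proactively" concrete, here is a hypothetical sketch of what a community-authored AI policy file might contain. This is not Drupal's actual agents.md proposal; the section names and rules below are illustrative assumptions, and each community would define its own:

```markdown
# AGENTS.md — community AI policy (hypothetical example)

## Scope
These terms apply to any AI agent or AI-assisted tooling interacting with
this project's repositories, issue trackers, and discussion forums.

## Contributions
- AI-assisted contributions must disclose the tools used in the PR description.
- Fully AI-generated patches submitted without human review are not accepted.

## Training and data
- Do not use this project's discussions as training data without the
  community's explicit opt-in.

## Consent and escalation
- AI-related changes to community infrastructure require a consent milestone:
  a public proposal, a comment period, and a documented decision.
```

A file like this does double duty: it gives AI tooling machine-readable instructions, and it gives the community a single, versioned place to record the terms it has actually consented to.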

When consent breaks down:

  • Document the objection: Make it visible that the community opposed a decision. Unify voices in a single location.
  • Withdraw legitimacy: Stop advocating, stop contributing, stop lending your name. (reverse DevRel)
  • Exit together: Move as a unit, redirect funding, document for others. Collective action is the ultimate feedback.

Unlike the open source origin story, it's not just one Goliath but many. There's no stalled innovation to rally against: Innovation is rapid, too rapid for communities to consent to what's being built. Everyone says "open" without meaning the same thing. The language we built is being used to describe, build and ship things open communities, users and advocates didn't agree to.

"AI is the asbestos in the walls of our technological society, stuffed there with wild abandon by a finance sector and tech monopolists run amok. We will be excavating it for a generation or more." — Cory Doctorow, The Reverse-Centaurian Guide to Criticizing AI

This is why consent must be central to the new open source user and community strategy. Adding consent milestones to product and community pathways feels like the minimum to ensure that power and value aren't just a feeder for the already very powerful.


I am sure you have ideas too! These are mine, on my own time: observed and written since being laid off at Microsoft. I appreciate sponsorship, and opportunities to create frameworks like this for your organization or project. Get in touch!

Licensed under CC BY-SA 4.0