Hey everyone, this is Olga, the product manager for the summary feature again. Thank you all for engaging so deeply with this discussion and sharing your thoughts so far.

Reading through the comments, it’s clear we could have done a better job introducing this idea and opening up the conversation here on VPT back in March. As internet usage changes over time, we are trying to discover new ways to help new generations learn from Wikipedia to sustain our movement into the future. As a consequence, we need to figure out how we can experiment in safe ways that are appropriate for readers and the Wikimedia community. Looking back, we realize the next step with this message should have been to provide more of that context for you all and to make space for folks to engage further. With that in mind, we’d like to take a step back so we have more time to talk through things properly. We’re still in the very early stages of thinking about a feature like this, so this is actually a really good time for us to discuss it here.

A few important things to start with:

  1. Bringing generative AI into the Wikipedia reading experience is a serious set of decisions, with important implications, and we intend to treat it as such.
  2. We do not have any plans to bring a summary feature to the wikis without editor involvement. An editor moderation workflow is required under any circumstances, both for this idea and for any future idea around AI-summarized or adapted content.
  3. With all this in mind, we’ll pause the launch of the experiment so that we can focus on this discussion first and determine next steps together.

We’ve also started putting together some context around the main points brought up in the conversation so far, and will follow up with that in separate messages so we can discuss further.

  • lennivelkant@discuss.tchncs.de · 2 days ago

    Yeah, but the compilers compile improved versions. Like, if you manually curated the summaries to be even better, then fed them to the AI to produce a new summary you also curate… you’d end up with a carefully hand-trained LLM.

    • SufferingSteve@feddit.nu · 2 days ago

      So if the AI-generated summaries are better than man-made summaries, this wouldn’t be an issue, would it?

      • lennivelkant@discuss.tchncs.de · 16 hours ago

        If AI constantly refined its own output, sure, unless it hit a wall eventually or started spewing bullshit because of some quirk of training. But I doubt it could learn to summarise better without external input, just like a compiler won’t produce a more optimised version of itself without human development work.