The biggest challenge to reaching an agreement on the European Union’s proposed AI Act has come from France, Germany and Italy, which favour letting makers of generative AI models self-regulate instead of facing hard rules.

Well, we saw what happened (allegedly) with OpenAI “self-regulating” itself.

  • Skull giver@popplesburger.hilciferous.nl · 8 months ago

    TV and movie ratings seem to have been codified into law more often. I think this makes sense, given that TV was nationalised for its first few decades in many countries.

    The European PEGI and American ESRB are the result of the industry coming together. The ESRB may not have the same legal status PEGI has, but it’s still the result of industry regulating itself.

    Similarly, Germany’s USK, which rates video games, was also formed by the industry itself. Japan’s CERO has similar origins.

    PEGI ratings are done by NICAM (which also does TV and movie ratings in the Netherlands), but this institute is largely made up of companies and organisations that sell or produce media, along with a number of more independent and (semi-)government bodies.

    Even if your country has nationalised its rating system, many others haven’t, and I think the rating systems in those countries prove that industries can, in specific circumstances, regulate themselves. All they needed was the fear of their entire industry being banned altogether!

    • frog 🐸@beehaw.org · 8 months ago

      I suspect that the games industry has managed to self-regulate its own ratings in large part because TV and film ratings are so often codified in law. It provides a baseline for what is and isn’t acceptable for certain audiences, and makes it obvious that regulation will happen one way or another. The existing TV and film ratings systems also create an environment where the consumer expects something similar for games. Regulation of visual entertainment has basically been normalised for the entire lifetime of most people alive today.

      I think it also explains why the games industry has been bad at self-regulating on gambling mechanics like lootboxes. When a game has graphic violence or sex, it’s easy to draw a comparison with film and TV, and pre-emptively self-regulate. The gaming industry can manage that because everybody involved is familiar with film and TV, and may even have worked in that industry before, since there are many skill overlaps. But the organisations and institutes doing the ratings seem less familiar with the gambling industry, and therefore haven’t given enough thought to how they ought to self-regulate on that. There’s a sufficient lack of self-regulation on lootboxes that external regulation appears to be necessary.

      And I think this ultimately highlights why AI will need external regulation. The only sector that has successfully self-regulated is one that already had a base of comparison with a separate-but-similar sector that has an existing history of regulation. AI doesn’t have anything comparable to use as a baseline. While game devs could look at the history of film and TV regulation and make a good guess as to what would happen if they didn’t regulate themselves, the AI devs think they’re untouchable.