While the motives may be noble (regulating surveillance), it might happen that models like Stable Diffusion get caught in the regulatory crossfire to the point where using the original models becomes illegal and new models get castrated until they are useless. Further, this might make it impossible for individuals or smaller startups to train open source models (maybe even LoRAs). Adobe and the large corporations would rule the market.

    • voluntaryexilecat@lemmy.dbzer0.comOP · 7 points · 1 year ago

      They will just claim it “can be used to create illegal pornography” and ban it entirely. And “fix faces” will be a high-risk AI because it can detect faces. 🤦

      Only idea I’ve got to combat this is to make it available to as many people as possible so they can speak in favor of it. Other ideas welcome.

      • YellowZedman@lemmy.dbzer0.com · 1 point · 1 year ago

        I can use a pencil and paper to draw whatever I want. Will drawing be illegal just because anybody can draw prohibited stuff? I’m just saying that if this is the only argument they can find, it’s a dumb one.

        • TheForvalaka@lemmy.world · 1 point · 1 year ago

          But this happens on the COMPUTER, which we don’t understand, so we have to regulate the scary thing so nobody else can use it either.

        • voluntaryexilecat@lemmy.dbzer0.comOP · 1 point · 1 year ago

          A difficult comparison - the pencil is more like Photoshop or Krita. Drawing a (photorealistic) image from scratch requires mechanical skill. SD is something else, more like a magical canvas that guides the pencil while you draw…

          They could still try to enforce safeguards against generating with certain tokens, or only allow models trained on approved data. Or the almighty copyright-claim banhammer…

          While we here know it is futile to censor a model without breaking it, they do not understand that.

    • tumulus_scrolls@lemmy.fmhy.ml · 2 points · 1 year ago

      Corps passing their stuff off as “technically open source” would also be a problem. Google controls a lot of the web through open source Chromium, and Microsoft controls dev IDEs through open source VS Code. I’m sure OpenAI would find more ways to pretend to be “open” again if that were more profitable than saying they have scary monster AIs they can’t release publicly.

      Open source exceptions would have to go in tandem with breaking them up somehow and setting some limits on their activities.