• kromem@lemmy.world · 1 year ago

    Yes, but training is not infringement under current law.

    If you learn to draw by tracing Mickey Mouse, but then professionally draw original works, you haven’t infringed copyright.

    If you subsequently draw Mickey Mouse, you’ll hear from Disney’s lawyers.

    So yes, an AI producing IP-protected material when prompted results in infringement, just as any other production would.

    The things people seem to be up in arms about are copying style (not protected) and using works for training (not infringing).

    If anything, all of this discussion over the past year around AI has revealed just how little people understand about IP law. They complain that there need to be laws for things that are already protected and prohibited, and they complain that companies are infringing over things that are neither protected nor prohibited.

    For example, in relation to the OP’s question, in the US there is no federal protection of IP rights for voices, and there is case law to that effect.

    • Atemu@lemmy.ml · 1 year ago

      If you learn to draw by tracing Mickey Mouse, but then professionally draw original works, you haven’t infringed copyright.

      Tracing Mickey is already copyright infringement. It’s insane but that’s how it works.

      I too think copyright is a weird idea in this day and age, where you’re almost always standing on the shoulders of giants, but that doesn’t make it go away. The best we can do is hack it (copyleft).

      copying style (not protected)

      Not copyrightable, but I’m not sure whether a style could be trademarked instead.

      using works for training (not infringing)

      Whether that’s infringing or not is a hotly debated topic. The key point here is whether training falls under fair use. If it doesn’t, training on copyrighted material without a license would be infringing in most jurisdictions.

      in the US there is no federal protection of IP rights for voices, and there is case law to that effect.

      Because US == world. I’d love to leave this implication behind on Reddit.

      • kromem@lemmy.world · 1 year ago

        You think tracing for educational use, where the result is never distributed and so could not have a negative impact on market value, is infringement?

        What the generative AI field needs moving forward is a copyright discriminator that identifies infringing outputs among newly generated images.
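
        Purely as an illustration of what such a discriminator might look like, here is a minimal sketch: compare each generated image against a reference set of known protected works and flag anything that lands too close. Everything here is a hypothetical stand-in, including the average-hash comparison, the MAX_DISTANCE threshold, and the reference_dir of protected works; a real system would need far stronger similarity measures (learned embeddings, human review) and a carefully curated reference set.

        ```python
        # Hypothetical "copyright discriminator" sketch: flag generated images that
        # are perceptually close to a reference set of known protected works.
        # Average-hash is a deliberately crude stand-in for a real similarity model.
        from pathlib import Path

        from PIL import Image  # assumes Pillow is installed

        HASH_SIZE = 8        # 8x8 grid -> 64-bit perceptual hash
        MAX_DISTANCE = 10    # hypothetical threshold; picking this well is the hard part


        def average_hash(img: Image.Image) -> int:
            """Downscale to grayscale and set one bit per pixel above the mean brightness."""
            small = img.convert("L").resize((HASH_SIZE, HASH_SIZE))
            pixels = list(small.getdata())
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (1 if p > mean else 0)
            return bits


        def hamming(a: int, b: int) -> int:
            """Count the bits on which two hashes differ."""
            return bin(a ^ b).count("1")


        def looks_infringing(generated: Path, reference_dir: Path) -> bool:
            """Return True if the generated image is suspiciously close to any reference work."""
            gen_hash = average_hash(Image.open(generated))
            for ref in reference_dir.glob("*.png"):
                if hamming(gen_hash, average_hash(Image.open(ref))) <= MAX_DISTANCE:
                    return True
            return False
        ```

        In practice the hard parts are the ones this sketch waves away: assembling the reference set, choosing a threshold that catches near-copies without flagging mere similarity of style, and handling transformations that a simple hash cannot see.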

        But I’ll be surprised if cases claiming infringement on training in and of itself end up successful under current laws.

        And yeah, most of the discussion around this revolves around US law. If we don’t anchor to any jurisdiction at all, then there is no conversation to be had. Or we could choose arbitrary jurisdictions to support a position, for example Israel and Japan, which have already said training is fair use.

        • Atemu@lemmy.ml · 1 year ago

          You think tracing for educational use, where the result is never distributed and so could not have a negative impact on market value, is infringement?

          That’s not what I think, that’s what the law says.

          I said what I think in the second paragraph. Sorry if I wasn’t being extra clear on that.

          What the generative AI field needs moving forward is a copyright discriminator that identifies infringing outputs among newly generated images.

          Good luck with that.

          But I’ll be surprised if cases claiming infringement on training in and of itself end up successful under current laws.

          Depends. If the imitative AI imitates its source material too closely, that could absolutely be construed as distribution of copyrighted material.
          Think about it like this: if I distributed a tarball of copyrighted material, that would be infringement, even though you’d need tar to unpack it. Whether you need a transformer or tar to access the material should make no difference, in my layman interpretation.

          • kromem@lemmy.world · 1 year ago

            That’s not what I think, that’s what the law says.

            No, it doesn’t. The scenario outlined squarely falls under fair use, particularly because of the non-distribution combined with research/education use. Fair use is not infringement.

            Good luck with that.

            We’ll see.

            Depends. If the imitative AI imitates its source material too closely, that could absolutely be construed as distribution of copyrighted material.

            I mean, if we’re talking about hypothetical models that only produce infringing material, you might be right.

            But if we’re talking about current models, which cannot reproduce the entire training set and can only reproduce training images in limited edge cases with extensive prompt effort, I stand by being surprised (and by the view that your tar metaphor is a poor and misleading one).

            If we’re going with poor metaphors, I could offer up the alternative of saying that distributing or offering a cloud-based Photoshop isn’t infringement even though it can be used to reproduce copyrighted material. And much like diffusion-based models and unlike a tarball, Photoshop requires creative input and effort from a user in order to produce infringing material.