Thursday, November 7th

    Artists' lawsuit against Stability AI and Midjourney gains momentum

    A judge has allowed a lawsuit filed by several artists against Stability AI, Midjourney, and other AI-related companies to proceed, with some claims dismissed.

    A judge ruled yesterday that a lawsuit filed by several artists against Stability AI, Midjourney, and other AI-related companies can proceed, with some claims dismissed.


    Many artists claim that popular generative AI services violated copyright law by training on a dataset that included their work, and that in some cases users of these services can directly reproduce copies of that work. Last year, Judge William Orrick upheld a direct copyright infringement claim against Stability, the company behind the popular image-generating tool Stable Diffusion, but he rejected many of the other claims and asked the artists' lawyers to add more detail. In this latest decision, the revised arguments persuaded the judge to approve an additional copyright infringement claim against Stability, along with copyright claims against DeviantArt, which used a model based on Stable Diffusion, and Runway AI, the original startup behind Stable Diffusion. He also allowed copyright and trademark infringement claims against Midjourney.


    The latter claims include allegations that Midjourney misled users with a “Midjourney Style List,” which included 4,700 artists whose names could be used to generate works in their style. The artists argue that the list, created without their knowledge or consent, implies a false endorsement, and the judge found the claim plausible enough to move forward.


    Judge Orrick was not convinced by all of the arguments that were resubmitted for his consideration. He rejected claims that the image generators violated the Digital Millennium Copyright Act by removing or altering copyright management information, and that DeviantArt violated its terms of service by allowing users' work to be included in an AI training dataset. And, of course, the claims he allowed must still be argued in court.


    Kelly McKernan, one of the artists behind the suit, described the decision as “very exciting” and “a huge victory” on X. McKernan noted that clearing this preliminary step allows them to request information from the companies during discovery, potentially revealing details about software tools that often remain black boxes. “We can now uncover everything these companies don’t want us to know,” McKernan wrote. (Even if the companies are ordered to hand over information, it won't necessarily be made public.)


    But it's hard to predict how this case will play out. Numerous lawsuits have been filed against AI companies, alleging that tools like Stable Diffusion and ChatGPT illegally learn from vast amounts of copyrighted work and can easily replicate it. The companies counter that such replications are rare and difficult to produce, and they argue that training should be considered fair use. Some early suits have been dismissed, notably one involving GitHub Copilot, and that dismissal is referenced in yesterday's decision. Others, such as The New York Times' case against OpenAI, continue.


    At the same time, OpenAI, Google, and other tech giants have struck multimillion-dollar deals with publishers (including our parent company, Vox Media) and photo suppliers for ongoing access to data. Smaller companies like Stability and Midjourney have less capital to buy data access, and individual artists have less leverage to demand payment, so the legal stakes are especially high for both sides in this dispute.
