A group of hackers that says it believes “AI-generated artwork is detrimental to the creative industry and should be discouraged” is hacking people who use a popular interface for the AI image generation software Stable Diffusion, via a malicious extension for that interface shared on GitHub.

ComfyUI is an extremely popular graphical user interface for Stable Diffusion, shared freely on GitHub, that makes it easier for users to generate images and modify their image generation models. ComfyUI_LLMVISION, the extension that was compromised to hack users, is a ComfyUI extension that let users integrate the large language models GPT-4 and Claude 3 into the same interface.
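The article doesn’t reproduce the extension’s code, but for context, a ComfyUI extension is typically just a Python package of “custom nodes” dropped into ComfyUI’s custom_nodes folder and registered via a NODE_CLASS_MAPPINGS table. The minimal sketch below shows the general shape of such a node; the class name, sockets, and placeholder LLM call are hypothetical illustrations, not code from ComfyUI_LLMVISION.

```python
# Hypothetical sketch of a ComfyUI custom node -- NOT code from ComfyUI_LLMVISION.
# ComfyUI discovers nodes by importing packages in its custom_nodes/ directory
# and reading their NODE_CLASS_MAPPINGS.

class LLMCaptionNode:
    """Illustrative node that sends a prompt to an LLM and returns the reply."""

    @classmethod
    def INPUT_TYPES(cls):
        # Declares the input sockets the node exposes in the graph editor.
        return {
            "required": {
                "prompt": ("STRING", {"multiline": True, "default": ""}),
                "api_key": ("STRING", {"default": ""}),
            }
        }

    RETURN_TYPES = ("STRING",)   # one text output socket
    FUNCTION = "run"             # method ComfyUI calls when the node executes
    CATEGORY = "LLM (example)"   # where the node appears in the add-node menu

    def run(self, prompt, api_key):
        # Placeholder for the actual LLM call (e.g. an OpenAI or Anthropic
        # client); omitted since the real extension's implementation isn't public.
        reply = f"[LLM reply to: {prompt!r}]"
        return (reply,)


# Registration tables ComfyUI reads at startup.
NODE_CLASS_MAPPINGS = {"LLMCaptionNode": LLMCaptionNode}
NODE_DISPLAY_NAME_MAPPINGS = {"LLMCaptionNode": "LLM Caption (example)"}
```

Because these packages are plain Python that ComfyUI imports at startup, a compromised extension can run arbitrary code on the user’s machine, which is what makes this attack vector effective.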

The ComfyUI_LLMVISION GitHub page is currently down, but a Wayback Machine archive of it from June 9 states that it was “COMPROMISED BY NULLBULGE GROUP.”

  • Even_Adder@lemmy.dbzer0.com · 6 months ago

    You’ve got it backwards. Glaze and Nightshade aren’t FOSS, and Ben Zhao, the University of Chicago professor behind them, stole GPLv3 code for Glaze. GPLv3 is a copyleft license that requires you to share your source code and license your project under the same terms as the code you used; you also can’t distribute your project as binary-only or proprietary software. When pressed, they only released the code for their front end, remaining in violation of the terms of the GPLv3 license.

    Moreover, Nightshade and Glaze only work against open source models, because the only open models are Stable Diffusion’s; companies like Midjourney and OpenAI, whose models are closed source, aren’t affected by this. Attacking a tool that the public can inspect, collaborate on, and obtain free of cost isn’t something that should be celebrated.