By Chyung Eun-ju and Joel Cho
Unlike many tech giants, which keep their technology tightly centralized, Stability AI does little to prevent its image generator, Stable Diffusion, from producing violent, pornographic or copyright-infringing images. The model ships with only a basic safety filter, which people can easily disable to build their own versions of the tool. And Stable Diffusion now has more users than other generative AI services such as DALL-E 2 and Midjourney.
Before delving further into the matter at hand, it is important to understand what generative AI is and does.
AI, short for artificial intelligence, essentially refers to any machine system that can learn from data, often through a technique called deep learning, and work out problems on its own. Recently, AI has found its way to popularity in a new form: generative AI.
Generative AI began with text: computer systems that could autocomplete sentences while mimicking different writing styles, producing passages in the manner of Shakespeare or in the voice of a science fiction novel. Companies then expanded this form of AI into other kinds of media.
One of the recognized pioneers in generative AI was DALL-E, an image generator released by OpenAI that creates pictures from descriptions written in natural language. Not long after, Google came out with its own version, followed by Meta. These systems have stunned the public with their ability to produce hyper-realistic images from a few words typed into a text box.
And this technology is really taking off. DALL-E 2 has more than 1.5 million users who create more than 2 million images every day, and Midjourney has more than 3 million users. Sequoia Capital, a venture capital firm, has estimated that generative AI could create trillions of dollars in economic value, and Jasper, a startup developing "AI content," has raised $125 million at a $1.5 billion valuation.
We typed in "a bear playing poker in space" and it spat back an image that matched the description exactly. It was a eureka moment for us: we could finally grasp the limitless possibilities that make generative AI so popular.
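For readers curious about how simple this has become, here is a minimal sketch of generating an image from a text prompt with the openly released Stable Diffusion model via Hugging Face's diffusers library; the model name and settings are illustrative, not a recommendation.

```python
# Minimal sketch: text-to-image with Stable Diffusion via the diffusers library.
# Assumes the diffusers, transformers and torch packages are installed and a GPU is available.
import torch
from diffusers import StableDiffusionPipeline

# Load a publicly released Stable Diffusion checkpoint (illustrative model ID).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# A few words of natural language are the entire interface.
prompt = "a bear playing poker in space"
image = pipe(prompt).images[0]
image.save("bear_playing_poker_in_space.png")
```

Because the code and model weights are open, anyone can run a copy like this on their own hardware, which is precisely what makes the question of who controls the output so pressing.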
Circling back to Stable Diffusion: as its popularity exploded, social media posts were taken down for sharing crude and insensitive content generated with the tool, yet the company took no accountability, simply instructing users not to "generate anything you'd be ashamed to show your mother," and nothing else.
Congresswoman Anna Eshoo, a Democrat of California, wrote a letter to federal regulators calling attention to images of "violently beaten Asian women" made with Stable Diffusion and asking for a regulatory response.
Emad Mostaque, the founder of Stability AI, responded: "We trust people, and we trust the community, as opposed to having a centralized, unelected entity controlling the most powerful technology in the world."
So who is responsible for the output of the AI?
Mostaque has argued that such a powerful technology has to be decentralized and that algorithms should not be locked away, as tech companies have done. In an interview with The New York Times, he said that algorithms and data sets should be open to scrutiny for the public good. The incentive of tech companies, he argues, is not the public good but manipulating users and serving ads.
In the Netflix documentary "The Social Dilemma," tech experts who helped build Facebook and Twitter discuss the harmful human impact of social networks, admitting that they moved too fast and wish they had slowed down to consider safety and trust. Tristan Harris, a former Google design ethicist and co-founder of the Center for Humane Technology, said that tech companies run a "disinformation-for-profit business model," profiting by enabling "unregulated messages to reach anyone for the best price."
The motivation behind tech companies is usually where the danger lies, and the danger is equal or greater with artificial intelligence, a more powerful technology capable of distortion on an even larger scale. The people behind a technology may start out well intentioned, working for all the right motives, but when capital and corporations come into play, they more often than not stray from those original intentions. Perhaps we, the users, should be given the freedom to act according to our own ethics, without the intervention of corporations.
What Stability AI seems to be doing is building the technology and taking a populist approach to it, opening it up to everyone. If the problem lies in the end goal of building technology for maximum profit, then a decentralized AI could be a true form of democratized expression. In a world where everyone has their own AI, everyone could contribute feedback toward making the technology less harmful.
In the end, tech experts may make the tool, but what we do with it is really our responsibility. Giving us more responsibility may make us more responsible. The output of generative AI is in the hands of the user, so it seems reasonable to say that the user is responsible for it.
Generative AI is simply a form of expression, involving no social interaction, so should expression be regulated? Naturally, generative AI is already being integrated into social media and may soon be used everywhere. According to Mostaque, AI will develop the capacity to learn principles and search through logic, evolving into a "reflection of the human brain." A technology with the potential to be among the most powerful should be in the hands of everyone rather than a select few, but the question is whether the public can be trusted with that responsibility.
Chyung Eun-ju (ejchyung@snu.ac.kr) is studying for a master's degree in marketing at Seoul National University. Her research focuses on digital assets and the metaverse. Joel Cho (joelywcho@gmail.com) is a practicing lawyer specializing in IP and digital law.