Today, DALL-E 2, OpenAI's AI system that can generate images from a text prompt or edit and refine existing images, becomes more widely available. The company announced in a blog post that it will accelerate access for users on the waitlist, with the goal of reaching around 1 million people over the next few weeks.
With this beta launch, DALL-E 2, which had been free to use, moves to a credit-based fee structure. First-time users receive a limited number of credits that can be spent to generate or edit an image, or to create a variation of an existing image. (Generations return four images, while edits and variations return three.) Users get 50 free credits during their first month and 15 free credits each month thereafter, and can purchase additional credits in $15 increments.
Artists in need of financial assistance will be able to apply for subsidized access, says OpenAI.
The successor to DALL-E, DALL-E 2 was announced in April and became available to a select group of users earlier this year, recently crossing the threshold of 100,000 users. OpenAI says the broader rollout was made possible by new approaches to mitigate bias and toxicity in DALL-E 2's generations, as well as evolutions in the policies governing images created by the system.
For example, OpenAI said that this week it deployed a technique that encourages DALL-E 2 to generate images of people that "more accurately reflect the diversity of the world's population" when given a prompt describing a person of unspecified race or gender. The company also said it now rejects image uploads containing realistic faces and attempts to create likenesses of public figures, including prominent politicians and celebrities, and that it has improved the accuracy of its content filters.
Broadly speaking, OpenAI doesn't allow DALL-E 2 to be used to create images that aren't "G-rated" or that could "cause harm" (e.g., images of self-harm, hateful symbols or illegal activity). Previously, it also prohibited the use of generated images for commercial purposes. As of today, however, OpenAI grants users "full usage rights" to commercialize the images they create with DALL-E 2, including the right to reprint, sell and merchandise them. That extends to images generated during the early preview.
As demonstrated by DALL-E 2 derivatives like Craiyon (formerly DALL-E mini) and the unfiltered DALL-E 2 itself, image-generating AI can very easily pick up the biases and toxicity embedded in the millions of web images used to train these systems. Futurism was able to prompt Craiyon to create images of burning crosses and Ku Klux Klan rallies, and found that the system made racist assumptions about identities based on "ethnic-sounding" names. OpenAI researchers noted in an academic paper that an open source implementation of DALL-E can be trained to make stereotypical associations, such as generating images of white-passing men in business suits for prompts like "CEO."
While the OpenAI-hosted version of DALL-E 2 was trained on a dataset filtered to remove images containing explicitly violent, sexual or hateful content, filtering has its limits. Google recently said it would not release an image-generating AI model it developed, Imagen, due to the risk of abuse. Meanwhile, Meta has limited access to Make-A-Scene, its art-focused image-generation system, to "prominent AI artists."
OpenAI emphasizes that the hosted DALL-E 2 incorporates other safeguards as well, including "automated and human monitoring systems" designed to prevent things like the model memorizing faces that often appear on the internet. Still, the company admits there is more work to be done.
"Expanding access is an important part of our deploying AI systems responsibly because it allows us to learn more about real-world use and continue to iterate on our safety systems," OpenAI wrote in the blog post. "We continue to research how AI systems, like DALL-E, might reflect biases in their training data, and the different ways we can address them."