Google is launching ImageFX, an AI-powered image creation tool built on Imagen 2, a generative image model developed by Google DeepMind.
With ImageFX, users create images from a simple text prompt and then modify them using "expressive chips," which suggest alternative keywords for experimenting with variations on the prompt. According to Google's blog post, the tool is designed with experimentation and creativity in mind.
However, concerns about potential abuse have been raised, especially in light of recent events surrounding deepfake technology.
Google says ImageFX ships with technical safeguards to limit problematic outputs such as violent, offensive, and sexually explicit content, along with a prompt-level filter for named people, presumably public figures.
Google emphasized that it has invested in the safety of its training data from the outset, conducted extensive adversarial testing, and applied SynthID watermarks so that images generated with its AI tools can be identified.
Moreover, Imagen 2 is being integrated into more Google products and services, including its AI search experience, its managed AI service Vertex AI, and Google Ads, where it provides text-to-image capabilities.
Google also disclosed that Imagen 2 powers SGE (Search Generative Experience), which adds image generation to Google Search: users enter a prompt and receive generated images directly within the conversational experience.
Furthermore, Imagen 2 is available through an API to Google Cloud customers and can be invoked through Bard, Google's AI-driven chatbot, to generate high-quality images from text descriptions.
Google has not disclosed the specific data used to train Imagen 2, and whether publicly available data may be used to train commercial AI models remains the subject of open legal debate. As the relevant lawsuits unfold, Google is staying cautious about divulging further details.