Apple has once again pushed the boundaries of technology, this time changing the way we think about photo editing. With the introduction of the MGIE model, developed in collaboration with the University of California, Santa Barbara, Apple is making it easier for everyone to edit photos with just a few words. Gone are the days when mastery of sophisticated software was a prerequisite for photo editing. Now, if you want to crop, resize, flip, or add a little flair with filters, all you need to do is describe your vision.
Apple’s software will compete with established technologies in the industry
The advantage of MGIE, which stands for MLLM-Guided Image Editing, is its ability to understand and perform both simple and complex edits from text prompts alone. Imagine that you want to make the sky bluer in your photo; MGIE translates this request into a concrete editing action, such as increasing the brightness of the sky region, and applies it to fulfill your intent.
One of the most striking examples is transforming an image of a pepperoni pizza into a healthier-looking version simply by asking the model to “make it healthier,” which adds vegetable toppings to the pizza. Likewise, a dull image of tigers can be brightened by prompting the model to add more contrast, demonstrating the model’s ability to turn a plain-language description into a finished edit.
Apple’s entry into the AI editing market with MGIE is a notable move, especially against the backdrop of giants like Adobe, which has led the way with its own AI editing tools. Although Apple has traditionally been more reserved in the field of generative AI, this release marks a significant step toward integrating more AI features into its products, as hinted at by CEO Tim Cook. Access to MGIE via GitHub and a Hugging Face Spaces web demo opens up new possibilities for creatives and tech enthusiasts alike.