Alphabet to Relaunch Gemini Image Generator With Improved Accuracy

Alphabet is betting on the second time being a charm for Gemini AI, but not without some revisionist history. The tool was pulled after it depicted historical events somewhat "creatively"; it is now set to retry image generation with a more factually accurate approach.

Photo Credit: Google

Alphabet said Wednesday, Aug. 28, that it has improved its Gemini AI image-creation model and plans to restore the ability to generate images of people in the coming days. That comes after the tool was taken offline for several months beginning in February amid backlash over historically inaccurate results.

The pause came after users complained that Gemini's image generator misrepresented historical figures. The model, for example, replaced white historical figures such as the U.S. Founding Fathers with people of color, and did the same in images of 1943 German soldiers.

The flaws prompted a backlash so intense that Google CEO Sundar Pichai called the chatbot's answers "completely unacceptable" and pledged to rework its models.

Google doesn't deny the shortcomings. It said that after identifying the flaws, it worked to improve the product by following a series of "product principles" and stress-testing it against hypothetical scenarios where it might fail.

In a blog post, Prabhakar Raghavan, senior vice president and head of search at Google, said that Gemini had been tuned to encourage diversity in its outputs. However, the tuning failed to account for prompts where that variation was not relevant to a good answer. Raghavan also said the model had become over-cautious, treating some innocuous prompts as sensitive.

With those problems addressed, Google is preparing to introduce an updated version of Gemini's image generation tool that it says will offer improved historical accuracy and contextual fidelity.

