Think Twice Before Creating That ChatGPT Action Figure

At the start of April, an influx of action-figure images started appearing on social media sites including LinkedIn and X. Each figure depicted the person who had created it with uncanny accuracy, complete with personalized accessories such as reusable coffee cups, yoga mats, and headphones.

All this is possible because of OpenAI’s new GPT-4o-powered image generator, which supercharges ChatGPT’s ability to edit pictures, render text, and more. OpenAI’s ChatGPT image generator can also create pictures in the style of Japanese animated film company Studio Ghibli—a trend that quickly went viral, too.

The images are fun and easy to make—all you need is a free ChatGPT account and a photo. Yet to create an action figure or Studio Ghibli-style image, you also need to hand over a lot of data to OpenAI, which could be used to train its models.

Hidden Data

The data you are giving away when you use an AI image editor is often hidden. Every time you upload an image to ChatGPT, you’re potentially handing over “an entire bundle of metadata,” says Tom Vazdar, area chair for cybersecurity at Open Institute of Technology. “That includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot.”
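The scope of that metadata is easy to verify yourself. The minimal sketch below, written in Python and assuming the Pillow imaging library and a hypothetical photo file name, prints the EXIF tags a phone typically embeds in a picture, including the capture time and, via the GPSInfo block, the coordinates of where it was taken.

```python
# Minimal sketch: inspecting the EXIF metadata embedded in a typical photo.
# Assumes the Pillow library (pip install Pillow); the file name is hypothetical.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

image = Image.open("my_action_figure_selfie.jpg")  # hypothetical photo
exif = image.getexif()

# Common tags such as capture time, camera make, and model.
for tag_id, value in exif.items():
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")

# GPS data, if the phone recorded it, sits in a nested directory (tag 0x8825).
gps_ifd = exif.get_ifd(0x8825)
for tag_id, value in gps_ifd.items():
    print(f"{GPSTAGS.get(tag_id, tag_id)}: {value}")
```

Running something like this on a photo straight off a smartphone usually reveals far more than the image itself: timestamps, device details, and often a precise location.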

OpenAI also collects data about the device you’re using to access the platform. That means your device type, operating system, browser version, and unique identifiers, says Vazdar. “And because platforms like ChatGPT operate conversationally, there’s also behavioral data, such as what you typed, what kind of images you asked for, how you interacted with the interface and the frequency of those actions.”

It’s not just your face. If you upload a high-resolution photo, you’re giving OpenAI whatever else is in the image, too—the background, other people, things in your room and anything readable such as documents or badges, says Camden Woollven, group head of AI product marketing at risk management firm GRC International Group.

This type of voluntarily provided, consent-backed data is “a goldmine for training generative models,” especially multimodal ones that rely on visual inputs, says Vazdar.

OpenAI denies it is orchestrating viral photo trends as a ploy to collect user data, yet the firm certainly gains an advantage from it. OpenAI doesn’t need to scrape the web for your face if you’re happily uploading it yourself, Vazdar points out. “This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from diverse age groups, ethnicities, and geographies.”

OpenAI says it does not actively seek out personal information to train models—and it doesn’t use public data on the internet to build profiles about people to advertise to them or sell their data, an OpenAI spokesperson tells WIRED. However, under OpenAI’s current privacy policy, images submitted through ChatGPT can be retained and used to improve its models.

Any data, prompts, or requests you share help teach the algorithm—and personalized information helps fine-tune it further, says Jake Moore, global cybersecurity advisor at security outfit ESET, who created his own action figure to demonstrate the privacy risks of the trend on LinkedIn.

Uncanny Likeness

In some markets, your photos are protected by regulation. In the UK and EU, data protection regulations, including the GDPR, offer strong protections, such as the right to access or delete your data. The use of biometric data also requires explicit consent.

However, photographs become biometric data only when processed through a specific technical means allowing the unique identification of a specific individual, says Melissa Hall, senior associate at law firm MFMac. Processing an image to create a cartoon version of the subject in the original photograph is “unlikely to meet this definition,” she says.
