Earlier this year, Meta-owned WhatsApp started testing a new feature that lets users generate stickers from a text description using AI. When users search "Palestinian," "Palestine," or "Muslim boy Palestine," the feature returns an image of a gun or a boy with a gun, a report from The Guardian shows.
According to The Guardian's Friday report, search results vary depending on which user is searching. However, prompts for "Israeli boy" generated stickers of children playing and reading, and even prompts for "Israel army" didn't generate images of people with weapons. Compared with the images generated by the Palestinian searches, that contrast is alarming. A person with knowledge of the discussions, whom The Guardian did not name, told the outlet that Meta employees have reported and escalated the issue internally.
"We're aware of this issue and are addressing it," a Meta spokesperson said in a statement to Mashable. "As we said when we launched the feature, the models could return inaccurate or inappropriate outputs as with all generative AI systems. We'll continue to improve these features as they evolve and more people share their feedback."
It is unclear how long the differences spotted by The Guardian persisted, or whether they persist now. For example, when I searched "Palestinian," the results returned a sticker of a person holding flowers, a smiling person wearing a shirt that appears to read "Palestinian," a young person, and a middle-aged person. When I searched "Palestine," the results showed a young person running, a peace sign over the Palestinian flag, a sad young person, and two faceless kids holding hands. A search for "Muslim boy Palestinian" showed four young smiling boys. Searches for "Israel," "Israeli," and "Jewish boy Israeli" produced similar results. Mashable had multiple users search for the same terms and, while the results differed, none of the searches for "Palestinian," "Palestine," "Muslim boy Palestinian," "Israel," "Israeli," or "Jewish boy Israeli" returned AI stickers featuring weapons.
There are still differences, though. For instance, when I searched "Palestinian army," one image showed a uniformed person holding a gun, while three others showed people in uniform without weapons; when I searched "Israeli army," the results returned three people in uniform and one person in uniform driving a military vehicle. Searching for "Hamas" returned no AI stickers. Again, each search will differ depending on who is searching.
This comes at a time when Meta is under fire for allegedly shadowbanning pro-Palestinian content, locking pro-Palestinian accounts, and adding "terrorist" to Palestinian bios. Other AI systems, including Google Bard and ChatGPT, have also shown significant signs of bias regarding Israel and Palestine.