Publication date: 17 Jun 2020
Instagram will examine its algorithms for ways they might contribute to discrimination against certain groups. According to Instagram head Adam Mosseri, the company is hearing concerns about whether its products and policies treat everyone equally, and it will “take a harder look” at the underlying systems to guard against bias in the future.
Instagram is also reviewing its current account verification criteria and will make changes to ensure they are as inclusive as possible. “Verification is an area we constantly get questions on – what the guidelines are, and whether or not the criteria is favoring some groups more than others,” Mosseri added.
Instagram’s algorithms prioritize photos of semi-nude men and women, according to a report by AlgorithmWatch in partnership with the European Data Journalism Network.
Researchers analyzed 1,737 posts containing 2,400 photos. Of these, 362 posts included pictures of women in bikinis or underwear, or of bare-chested men. Posts with such pictures of women were 54% more likely to appear in the news feed, and posts with such pictures of men were 28% more likely to be shown. By contrast, posts showing food or landscapes were about 60% less likely to appear in the news feed.
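The report does not spell out its exact calculation, but a figure like “54% more likely” can be read as a rate ratio: the share of revealing posts that surfaced in users’ feeds divided by the share of baseline posts that did. The short Python sketch below illustrates that reading; the counts and variable names are hypothetical assumptions for illustration, not data from the study.

# Illustrative only: hypothetical counts, not AlgorithmWatch's data.
# "X% more likely to appear" is read here as a rate ratio between
# two groups of monitored posts.

def appearance_rate(shown: int, total: int) -> float:
    """Fraction of monitored posts that actually appeared in feeds."""
    return shown / total

# Hypothetical example: baseline posts vs. posts with revealing photos.
baseline_rate = appearance_rate(shown=300, total=1000)   # 30.0% appeared
revealing_rate = appearance_rate(shown=462, total=1000)  # 46.2% appeared

# The rate ratio minus one gives the "more likely" percentage.
lift = revealing_rate / baseline_rate - 1
print(f"Posts with revealing photos were {lift:.0%} more likely to appear.")
# -> Posts with revealing photos were 54% more likely to appear.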
The tendency, however, might not apply to all Instagram users. Instagram’s algorithms likely favor nudity in general, but personalization or other factors may limit this effect for some users.