In response to recent articles about content safety on GIPHY, our team has already mobilized to take action based on the facts we currently have on hand. In the spirit of transparency, we want to let you know what that means.
First and foremost, content safety is paramount at GIPHY. GIPHY is primarily a search engine, and as such it employs an extensive array of moderation protocols to ensure the safety of all publicly indexed content. We’re proud to have built a leading Trust + Safety division that follows industry-standard best practices, and we actively work with the National Center for Missing and Exploited Children (NCMEC) as well as government agencies to ensure that any content violating our Community Guidelines is removed.
The content available publicly through our search platform and our distribution network is continually vetted by our extensive moderation protocols to ensure safety. Private accounts on any platform are subject to abuse, and any search engine that indexes private, anonymous content uploaded to GIPHY against our explicit site instructions is not in compliance with industry standards. We are troubled to learn that third-party search engines may be doing this, and we are taking the following steps:
- As soon as we were informed that private content hosted on GIPHY was surfacing on third-party search engines, we took immediate action to block those engines from linking back to GIPHY while we investigate the issue.
- We have temporarily stopped anonymous uploads, and anonymous media will be unavailable.
- We are also conducting further analysis and considering additional changes to our infrastructure and products that will make it more difficult for people to abuse our platform.
GIPHY has a moderation system that combines imaging technologies with human validation, supported by both internal and external resources specializing in moderation. We respond quickly to onsite reports of inappropriate content and take whatever action is necessary to enforce our Community Guidelines.
We take every report of inappropriate content seriously and act immediately to remove content that violates our Community Guidelines upon discovery. We continue to encourage any company, researcher, or other party to work proactively with us to address concerns and help make the internet as safe a place as possible.
For more information about our practices related to content moderation and safety, please refer to: