Adaptive’s new AI-powered Automated Content Moderation tool scans every image in an airline’s IFE library of digital publications and assigns each a percentage-based score reflecting how confident the AI is that the image contains sensitive content: the higher the score, the more likely the image includes nudity. The solution can run automatically or be managed manually; the automation function makes it especially easy for airlines to ensure that passengers are not exposed to content that does not comply with their moderation guidelines.
“Passengers’ tolerance of nudity and sensitive imagery can vary significantly from country to country, so we created a solution that is fully customisable according to the specific preferences of each airline,” said Laurent Safar, co-founder & CEO of Adaptive. “Our solution also makes it possible for passengers of all ages to safely access Adaptive’s digital press content – before, during or after their flight – without the risk of being exposed to inappropriate content.”
If an airline chooses to automate the exclusion process, it can decide what level of nudity is appropriate for its passengers and set the solution to automatically exclude any image rated above that threshold. The moderation tool replaces each excluded image with a customisable placeholder page.
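The threshold logic described above can be sketched in a few lines. This is a minimal illustration only: the class and function names, the score scale, and the placeholder filename are all assumptions, as Adaptive's actual implementation is not public.

```python
from dataclasses import dataclass

# Hypothetical airline-customisable replacement page (see article: a
# "customisable placeholder page" is shown instead of excluded images).
PLACEHOLDER = "placeholder.png"

@dataclass
class Image:
    name: str
    score: float  # classifier confidence (0-100%) that the image is sensitive

def moderate(images, threshold):
    """Replace any image whose sensitivity score exceeds the airline's threshold."""
    return [PLACEHOLDER if img.score > threshold else img.name for img in images]

# Example: an airline with a moderate tolerance sets the threshold at 60%.
pages = [Image("cover.jpg", 2.0), Image("ad.jpg", 87.5), Image("article.jpg", 41.0)]
print(moderate(pages, threshold=60))  # only ad.jpg is replaced
```

A stricter airline would simply lower the threshold, excluding more borderline images; a permissive one would raise it.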
“Before our solution, airlines faced a difficult dilemma: either spend hours going through the digital content offered via their IFE solution page by page, or cross their fingers and hope that passengers weren’t offended by the content they chose to read during their flight,” said David Fairand, co-founder & COO of Adaptive. “Either way, it was a lose/lose situation, so it was easy for us to identify the need for a tool to automate the process.”