09-14-2018, 01:30 AM
How PhotoDNA for Video is being used to fight online child exploitation
<div style="margin: 5px 5% 10px 5%;"><img src="http://www.sickgaming.net/blog/wp-content/uploads/2018/09/how-photodna-for-video-is-being-used-to-fight-online-child-exploitation.jpg" width="1280" height="720" title="" alt="" /></div><div><p>PhotoDNA has also enabled content providers to remove millions of illegal photographs from the internet; helped convict child sexual predators; and, in some cases, helped law enforcement rescue potential victims before they were physically harmed.</p>
<p>In the meantime, though, the volume of child sexual exploitation material being shared in videos instead of still images has ballooned. The number of suspected videos reported to the <a href="http://www.missingkids.com/gethelpnow/cybertipline">CyberTipline</a> managed by the National Center for Missing and Exploited Children (NCMEC) in the United States increased more than tenfold, from 312,000 in 2015 to 3.5 million in 2017. As required by federal law, Microsoft reports all instances of known child sexual abuse material to NCMEC.</p>
<p>Microsoft has long been committed to protecting its customers from illegal content on its products and services, and applying technology the company had already created to combat this growth in illegal videos was a logical next step.</p>
<p>“Child exploitation video content is a crime scene. After exploring the development of new technology and testing other tools, we determined that the existing, widely used PhotoDNA technology could also be used to effectively address video,” says Courtney Gregoire, Assistant General Counsel with Microsoft’s Digital Crimes Unit. “We don’t want this illegal content shared on our products and services. And we want to put the PhotoDNA tool in as many hands as possible to help stop the re-victimization of children that occurs every time a video appears again online.”</p>
<p>A recent <a href="https://www.protectchildren.ca/pdfs/C3P_SurvivorsSurveyFullReport2017.pdf">survey of survivors of child sexual abuse</a> from the Canadian Centre for Child Protection found that the online sharing of images and videos documenting crimes committed against them intensified feelings of shame, humiliation, vulnerability and powerlessness. As one survivor was quoted in the report: “The abuse stops and at some point also the fear for abuse; the fear for the material never ends.”</p>
<p><img class="aligncenter wp-image-4392" src="http://www.sickgaming.net/blog/wp-content/uploads/2018/09/how-photodna-for-video-is-being-used-to-fight-online-child-exploitation.jpg" alt="Graphic showing how PhotoDNA for Video creates hashes from video frames and compares them to known images" width="900" height="506" /></p>
<p>The original PhotoDNA helps put a stop to this online recirculation by creating a “hash” or digital signature of an image: converting it into a black-and-white format, dividing it into squares and quantifying that shading. It does not employ facial recognition technology, nor can it identify a person or object in the image. It compares an image’s hash against a database of images that watchdog organizations and companies have already identified as illegal. The Internet Watch Foundation (IWF), which has been compiling a reference database of PhotoDNA signatures, now has 300,000 hashes of known child sexual exploitation materials.</p>
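<p>PhotoDNA’s actual algorithm is proprietary and licensed by Microsoft, but the general grid-and-shading idea described above can be sketched roughly in Python. Everything below — the helper names, the 16×16 grid, the 8-bit averaging and the distance cutoff — is an assumption made for illustration, not Microsoft’s implementation.</p>
<pre><code class="language-python">
# Illustrative sketch only: this is NOT PhotoDNA, whose algorithm is proprietary.
# It mirrors the steps described above -- convert an image to grayscale, divide
# it into a grid of squares, and quantify the shading of each square into a
# compact signature that can be compared against a database of known hashes.
from PIL import Image

GRID = 16  # hypothetical grid size; the real layout is not public

def grid_signature(img, grid=GRID):
    """Return one average-intensity value per grid cell of a PIL image."""
    img = img.convert("L").resize((grid * 8, grid * 8))  # grayscale, fixed size
    cell = img.size[0] // grid
    sig = []
    for row in range(grid):
        for col in range(grid):
            box = (col * cell, row * cell, (col + 1) * cell, (row + 1) * cell)
            pixels = list(img.crop(box).getdata())
            sig.append(sum(pixels) // len(pixels))  # mean shading of this square
    return sig

def distance(sig_a, sig_b):
    """Sum of absolute differences; small values indicate near-duplicate images."""
    return sum(abs(a - b) for a, b in zip(sig_a, sig_b))

# Usage: hash a suspect image and compare it against a reference database.
# known = [grid_signature(Image.open(p)) for p in known_illegal_paths]
# suspect = grid_signature(Image.open("upload.jpg"))
# matched = any(distance(suspect, k) < 200 for k in known)  # 200 is an arbitrary cutoff
</code></pre>
<p>Because the signature is built from coarse shading rather than exact pixels, small edits such as resizing or recompression leave the signature close enough to still match — which is the robustness property the prose attributes to PhotoDNA.</p>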
<p>PhotoDNA for Video breaks down a video into key frames and essentially creates hashes for those screenshots. In the same way that PhotoDNA can match an image that has been altered to avoid detection, PhotoDNA for Video can find child sexual exploitation content that’s been edited or spliced into a video that might otherwise appear harmless.</p>
<p>“When people embed illegal videos in other videos or try to hide them in other ways, PhotoDNA for Video can still find it. It only takes a hash from a single frame to create a match,” says Katrina Lyon-Smith, senior technical program manager who has implemented the use of PhotoDNA for Video on Microsoft’s own services.</p>
</div>