Microsoft joins coalition to scrub revenge and deepfake porn from Bing

Microsoft announced it has partnered with StopNCII to help remove non-consensual intimate images — including deepfakes — from its Bing search engine.

When a victim opens a "case" with StopNCII, the service creates a digital fingerprint, also called a "hash," of an intimate image or video stored on that individual's device, without the file itself ever being uploaded. The hash is then sent to participating industry partners, who can seek out matches for the original and remove them from their platforms if the content violates their policies. The process also applies to AI-generated deepfakes of a real person.

Several other tech companies have agreed to work with StopNCII to scrub intimate images shared without permission. Meta helped build the tool, and uses it on its Facebook, Instagram and Threads platforms; other services that have partnered with the effort include TikTok, Bumble, Reddit, Snap, Niantic, OnlyFans, PornHub, Playhouse and Redgifs.

Absent from that list is, strangely, Google. The tech giant has its own set of tools for reporting non-consensual images, including AI-generated deepfakes. However, failing to participate in one of the few centralized places for scrubbing revenge porn and other private images arguably places an additional burden on victims to take a piecemeal approach to recovering their privacy.

In addition to efforts like StopNCII, the US government has taken some steps this year to specifically address the harms done by the deepfake side of non-consensual images. The US Copyright Office called for new legislation on the subject, and a group of Senators moved to protect victims with the NO FAKES Act, introduced in July.

If you believe you've been the victim of non-consensual intimate image-sharing, you can open a case with StopNCII here and with Google here; if you're under the age of 18, you can file a report with NCMEC here.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/microsoft-joins-coalition-to-scrub-revenge-and-deepfake-porn-from-bing-195316677.html?src=rss

