Google Allows US Users to Search the Dark Web for Their Own Gmail ID

Google is now letting Gmail users in the US run scans to discover whether their Gmail ID appears on the dark web, the company announced at Google I/O, its annual developer conference.

This feature was first announced in March, when it was available only to Google One subscribers.

Report

It allows users to run scans and then receive a report indicating whether their information, including name, address, email address, phone number, and Social Security number, appears on dark web sites.

This information usually ends up on the dark web after a data breach (cyber-criminals are known to share or trade stolen personally identifiable information on underground forums), but it could also be taken from publicly available databases.

Matching Information

With the dark web report enabled, users are then automatically notified when matching information is found. Google will also provide guidance on how to protect the exposed information.

“For example, if your US Social Security number was found on the dark web, we might suggest you report it as stolen to the government or take steps to protect your credit,” Google stated in a March blog post.

Google also announced that “anyone with a Gmail account in the US will be able to run scans to see if their Gmail address appears on the dark web and receive guidance on what actions to take to protect themselves.”

The company said it will make the dark web report available in international markets ‘soon’.

Context

Google also announced a new ‘About this Image’ tool to provide users with ‘context on the visual content’ they find online, along with expanded spam protections in Drive to help users stay safe from potentially unwanted or abusive content, and a new option to delete recent searches in Maps.

To help platforms and organisations keep children safer online, Google is also expanding its Content Safety API, which has been publicly available since 2018, to cover potential CSAM (child sexual abuse material) in video content.
