What is Google SafeSearch?
Date: 19 January 2019
SafeSearch confuses many website owners and SEO professionals. Most questions concern how the feature actually works and how it affects traffic in practice.
First of all, this filter can hit the traffic of sites whose content borders on explicit (at least according to Google's algorithms), and that can be far from pornographic content. Publishers who write on topics within a particular niche suffer most. Sometimes their articles contain images or other content that SafeSearch mode blocks.
When this happens, those URLs and images are filtered out of the search results, which reduces your traffic from Google.
To be clear: filtering applies to images, videos, and websites, so it's not just standard search results that are affected; it has a direct impact on your media. That's why publishers should know as much as possible about SafeSearch: how it works, how to check whether your site's content is being filtered, and how to test new content to keep the site safe in the future.
Traffic can drop to varying degrees depending on how the filter is applied. Whether SafeSearch affects only certain sections of the site or the entire site depends on the site's settings, its content, and how far that content crosses the line into explicit territory.
Initially, SafeSearch used only text to filter content, but the algorithm has since become far more sophisticated: Google now uses machine learning to identify explicit website content.
Filtering takes keywords, links, and images into account. Google even has a dedicated SafeSearch team that focuses on this feature.
What can happen when a site gets into the SafeSearch filter?
When SafeSearch flags a site as containing inappropriate content, Google filters some (or all) of your pages for users who have SafeSearch enabled. The option lives in the search settings, so some users enable it manually; for others, it is switched on by an administrator, for example in schools, by IT departments, or by parents controlling a child's computer.
If the filter is enabled, the pages of your site simply will not appear in search results for those users. Google sends no notification; your pages are silently filtered out.
As you can guess, the traffic impact depends on your target audience. If your audience includes many people who have SafeSearch enabled, the filter may cause noticeable traffic problems. If most visitors have not enabled SafeSearch, you may not even know you are being filtered. This is hard to determine.
How can site structure help Google focus the SafeSearch filter?
Google explained that you can help their algorithms understand which sections contain adult content by using a specific URL structure to organize the content. For example, place all adult content in a specific directory.
If you do this, Google's algorithms can determine that only that specific directory, rather than the entire site, is subject to filtering. Since adult content is detected algorithmically, the classification is not always straightforward, so a clear URL structure gives the algorithms a strong hint.
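For example, a site might group its explicit material under a single directory. The paths below are purely illustrative (the domain and directory names are hypothetical, not from Google's documentation):

```
https://example.com/blog/...     general content
https://example.com/adult/...    all explicit content grouped here
```

With this layout, only the /adult/ section risks being filtered as a whole, rather than the entire site.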
Adult Meta Tags
Google explains that the best way to tell them a page contains adult content is to use certain metadata: it is the "strongest signal" you can provide. There are two meta tags you can use in your code; you only need one of them.
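Google's documentation describes a `rating` meta tag for this purpose, in two variants. A minimal example, placed in the page's `<head>`:

```html
<!-- Either variant marks the page as adult content for SafeSearch -->
<meta name="rating" content="adult">
<meta name="rating" content="RTA-5042-1996-1400-1577-RTA">
```

The second variant is the RTA (Restricted To Adults) label; both are treated as the same signal, so pick one and use it consistently on every explicit page.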
How to check whether your site is filtered
To visually see how SafeSearch works, use two browsers. On one, enable SafeSearch, and on the other, turn it off.
- First, in the browser with SafeSearch enabled, run a `site:` search for your domain. In the worst case, you will see 0 results, which means your entire site is filtered. You can check individual site directories the same way.
For sites that are only partially filtered, this method is unreliable: sometimes it shows partial filtering and sometimes it does not.
- Identify dangerous queries.
First, go to Google Search Console (GSC) and export the queries that drive traffic to your site. Find those that might be considered risky in terms of SafeSearch (based either on the query itself or on the content available on your site). Then start searching.
Enter a query in the browser with SafeSearch disabled and note your ranking, then run the same search in the browser with SafeSearch enabled. You may be surprised to find serious gaps in the results page where your site used to rank. If so, you have hit the filter.
Repeat this for other queries to determine which queries and pages on the site are filtered. The more of this data you analyze, the clearer your picture of what exactly crosses the line of acceptable content.
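One way to automate this comparison is Google's Custom Search JSON API, which accepts a `safe` parameter ("active" or "off"). The sketch below assumes you have an API key and a Programmable Search Engine ID (`cx`); the helper function names are my own, not part of the API:

```python
import json
import urllib.parse
import urllib.request

API = "https://www.googleapis.com/customsearch/v1"

def build_url(query, api_key, cx, safe="off"):
    """Build a Custom Search request URL; safe is 'off' or 'active'."""
    assert safe in ("off", "active")
    params = {"key": api_key, "cx": cx, "q": query, "safe": safe}
    return API + "?" + urllib.parse.urlencode(params)

def total_results(query, api_key, cx, safe="off"):
    """Return Google's estimated result count for the query."""
    with urllib.request.urlopen(build_url(query, api_key, cx, safe)) as resp:
        data = json.load(resp)
    return int(data.get("searchInformation", {}).get("totalResults", "0"))

def safesearch_gap(query, api_key, cx):
    """Compare result counts with SafeSearch off vs. on for one query."""
    return (total_results(query, api_key, cx, safe="off"),
            total_results(query, api_key, cx, safe="active"))
```

Running `safesearch_gap("site:yourdomain.com", key, cx)` over your exported GSC queries would reveal which of them lose results when the filter is active.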
A logical question: how do you identify problematic content before it gets filtered?
Generally, companies want to avoid having their site classified as one that hosts adult content.
For preliminary image analysis, the Google Vision API can be used.
It uses machine learning to automatically identify and classify images. One of the Vision API's features is explicit content detection (its SafeSearch detection feature), which can identify adult content, violence, and more. In addition, the Vision API uses optical character recognition to detect text within images.
If you do not need to check a large number of images on a regular basis, you can check individual photos directly on the Vision API demo page. It will analyze the photo and return a wealth of information, including pages that contain the image, detected text, and SafeSearch likelihoods.
There is a dedicated SafeSearch section in the response, which describes the likelihood that the image is "adult", "spoof", "medical", "violence", or "racy".
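Using the official google-cloud-vision client library, a check might look like the sketch below. `check_image` requires the library installed and Google Cloud credentials configured; the helper names are illustrative, not part of the API:

```python
from typing import Dict, List

# Vision returns a likelihood level per category; treat these as red flags.
RISKY = {"LIKELY", "VERY_LIKELY"}

def risky_categories(annotation: Dict[str, str]) -> List[str]:
    """Return the SafeSearch categories flagged LIKELY or VERY_LIKELY."""
    return [cat for cat, level in annotation.items() if level in RISKY]

def check_image(path: str) -> Dict[str, str]:
    """Run SafeSearch detection on a local image file.

    Requires `pip install google-cloud-vision` and the
    GOOGLE_APPLICATION_CREDENTIALS environment variable pointing
    at a service-account key.
    """
    from google.cloud import vision  # imported lazily: optional dependency
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    ann = client.safe_search_detection(image=image).safe_search_annotation
    return {
        "adult": ann.adult.name,
        "spoof": ann.spoof.name,
        "medical": ann.medical.name,
        "violence": ann.violence.name,
        "racy": ann.racy.name,
    }
```

A workflow could then be as simple as `risky_categories(check_image("hero.jpg"))` on every image before it is published.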
By testing various images with the Vision API, you can begin to understand exactly what crosses the line.
Note that while checking images gives you useful information, it is difficult to say for certain what the filter will actually block.
However, it certainly gives you a clearer picture of how SafeSearch works.
SafeSearch false positives
Machine learning algorithms are not perfect, and sites are sometimes incorrectly flagged as hosting adult content. If you face this situation, you can post in the webmaster forums and hope the report reaches Google. The bad news is that the whole world will learn about your problem, because the post is public.
To sum up:
- Take SafeSearch seriously, especially if you publish content that could be classified as explicit (for example, adult content). Test rigorously to determine whether your site is being filtered.
- If you provide adult content (or any other type of content that may be considered explicit), talk with your marketing experts and developers about the site's structure. Remember, Google has explained that placing all adult content in a specific directory helps its algorithms understand where that content lives on your site, so SafeSearch filtering can be as granular as possible.
- Use “adult” meta tags for pages with explicit content.
- When filtering, SafeSearch takes both images and text into account.
- Filtering can occur for web pages, videos, and images.
- Tell your employees about SafeSearch and show them how to test images using the Google Vision API. Requiring preventive checks before publishing content can help protect your site in the future.
- If you think SafeSearch has flagged your site incorrectly, write to the webmaster forums with detailed information. Hopefully your issue will reach Google, but there are no guarantees.
Now you know how to determine whether your site is being filtered, and you can make a plan to address the problem. Good luck, and safe searching!