How To Easily Submit Websites To Google, Bing, and Yahoo
Immediately after creating a website, site owners want to know what to do next. The answer is to submit the website to Google. A website development agency can advise you on how to submit your site to search engines. This question is popular and important: customers won't find your site until search engines recognize it. In this article, you will discover several ways to get a website to appear in the results of search engines like Google, Bing, and Yahoo.
Ways to submit a website to Google for indexing
Google Search Console
The most effective way to submit your website to search engines is to add it to Google Search Console. You can learn what Google Webmaster Tools is and how to use Google Search Console here.
After you have verified ownership of your site in Google Search Console, select URL Inspection from the menu:
In the search bar, enter the address of the site or a new page:
Next, select Request indexing:
If the page is submitted for crawling, it is added to the indexing queue, and we receive the following message:
To help Google index your site, you need to add an XML sitemap: sitemap.xml.
The sitemap.xml looks like this:
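A minimal sitemap.xml follows the sitemaps.org protocol. The URLs below are placeholders for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://mynewsite.com/</loc>
    <lastmod>2021-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://mynewsite.com/your-article</loc>
    <lastmod>2021-05-20</lastmod>
  </url>
</urlset>
```

Only the `<loc>` element is required; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints for crawlers.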
You can create sitemap.xml using plugins, online services, and programs:
for WordPress sites, using the best SEO plugin — Yoast SEO
using XML sitemap module for Drupal sites
xml-sitemaps.com (up to 500 pages for free)
Screaming Frog (free version allows you to crawl 500 URLs, license cost ~ £149.00 per year)
Netpeak Spider (free 14-day trial, license cost ~ $249.60 per year)
If you generated your sitemap.xml with a service or program, you need to upload it to the server. If you have a WordPress or Drupal site and create the sitemap with a plugin, the file will be placed on the server automatically.
After creating the sitemap in XML format, add it to the Google Search Console:
Google Add URL
Until July 2018, it was possible to submit a website to Google using the Add URL tool. The tool looked like this:
You could also submit your website to search engines directly from the search results page by entering the ‘Submit URL to google’ query:
Both of these methods have since been discontinued.
Google Chrome and Google Analytics
SEO specialists have theories about alternative ways of submitting pages for indexing, for example, using the Google Chrome browser or the Google Analytics tracker. In our consultation center, you can read how to integrate Google Analytics with a Drupal 8 site.
Rand Fishkin, one of the best-known SEO experts, ran a Twitter poll asking SEOs whether Google uses data from the Chrome browser to crawl pages. Most respondents answered yes.
The assumption is that if you open new pages in the browser, or add the Google Analytics tracking code to them, the browser or the analytics system tells Google about new pages that need to be crawled.
Perficient Digital ran two experiments to find out whether Google Chrome and Google Analytics affect the crawling of new pages:
Experiments have shown that Google doesn’t use the Chrome browser and Google Analytics to identify new pages.
How long does it take Google to index a site?
A new website appears in search engines after varying amounts of time, from one week to several months.
Indexing of pages doesn’t guarantee an immediate high ranking of a site in search results. Pages may not show up when people use keywords that are important to you. Our SEO articles will help you figure out how to improve your site's rankings:
How to check if your website is indexed
You can check the indexing of a site or individual pages on Google in the following ways:
Use a search query
Enter ‘site:mynewsite.com’ in the Google search bar (to check the indexing of the entire site) or ‘site:mynewsite.com/your-article’ (to check the indexing of a specific page). Note that there is no space after the colon.
The search results will show the approximate number of indexed pages.
Via Google Search Console.
Select the Index → Coverage report. The number of indexed pages is shown in the cell for valid pages (pages without errors).
Services, browser plugins, programs
To check the indexing of pages, you can use the free indexchecking.com service or its analogs. The service checks up to 25 links at a time.
RDS Bar plugin for Firefox, Chrome, Opera browsers.
You can also use the paid Netpeak Checker service to scan an entire site and find out which pages are in the index. It can check indexing in several search engines: Google, Bing, and Yahoo.
Why can a site fall out of indexing? Causes and solutions
There may be many reasons why submitting your website to search engines is not working; let's consider the main ones.
New site or page
During indexing, a search crawler scans the site's documents; the data is then processed and appears (or doesn't appear) in search results. Each search engine updates its index at its own rate. So if the pages of a newly created site aren't displayed in search, most often they simply haven't been indexed yet.
Ban on indexing in robots.txt
You can block the entire site from indexing, or only particular documents. So the second step in finding out why a site isn't indexed is to check the robots.txt file located in the root of the file system.
Open this file in any text editor; the following lines shouldn't be in it:
Block for the whole site:
Block for individual pages:
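For reference, blocking directives in robots.txt typically take the following shape (the page path here is a hypothetical example):

```
# Blocks the whole site:
User-agent: *
Disallow: /

# Blocks an individual page:
User-agent: *
Disallow: /your-article/
```

An empty `Disallow:` directive, by contrast, permits crawling of everything.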
Pages can also be closed to indexing by meta tags. If it's clear which pages aren't in the index, you can check them manually (or with automated tools such as Screaming Frog SEO Spider) for the following meta tag:
<meta name="robots" content="noindex" />
Non-unique content
Search engines want their results to be of the highest quality and to meet users' needs as fully as possible. It makes no sense to add content to your website that already exists elsewhere on the Internet.
Therefore, before publishing content, be sure to check it for uniqueness using a plagiarism-checking service.
P.S. Uniqueness shouldn’t be lower than 75%, but for technical or legal texts this limit may be lower due to the specifics of the content.
If the site is filled with rewritten texts, pay attention to their quality. Search engines now “understand” text structure, grammar, and other characteristics quite well. So if the rewriting was superficial, with simple replacement of words by synonyms, it will probably be detected during indexing.
Deny in .htaccess
The .htaccess file can also be used to block the site from indexing, so check that it contains no lines like these:
SetEnvIfNoCase User-Agent "^Googlebot" search_bot
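The line above usually appears as part of a larger blocking block. A typical variant (a sketch, assuming Apache 2.2-style access directives; Apache 2.4 uses `Require` instead) looks like this:

```
# Mark requests from crawlers as search bots
SetEnvIfNoCase User-Agent "^Googlebot" search_bot
SetEnvIfNoCase User-Agent "^Bingbot" search_bot

# Deny access to any request marked as a search bot
Order Allow,Deny
Allow from all
Deny from env=search_bot
```

If such a block is present, crawlers receive a 403 response and cannot index the site.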
Prohibition in the CMS itself
Blocked indexing is most often an issue on sites built with WordPress. The admin panel of this engine includes a setting (Settings → Reading) that discourages search engines from indexing the site. Even if indexing is allowed in robots.txt, the site won't be indexed while this option is on. Make sure the checkbox in this section is unticked and indexing is allowed.
Complicated site architecture
All content on the site should be in a structured form, at various levels of nesting (Home page - Category - Subcategory - etc.). Moreover, it is not recommended to create more than five levels of nesting. If these two rules aren’t followed, the indexing process may take a long time, especially if there are many pages on the site.
The page or website is unavailable
The site and its pages should be available to both visitors and search crawlers and return HTTP status code 200. For various reasons, the code may differ:
4xx - the page is not available at the requested address. It's not recommended to delete pages that are in the search engine's index, because this can hurt the site's ranking. It is better to set up a 301 redirect from the removed page to another relevant one.
5xx - the server is unable to fulfill the request.
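As a sketch, a 301 redirect from a deleted page to a relevant one can be configured in .htaccess on an Apache server (both paths here are hypothetical):

```
# Permanently redirect the removed page to a relevant replacement
Redirect 301 /old-article /your-article
```

This preserves most of the old page's ranking signals instead of losing them to a 404.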
To check the server response code, you can use a dedicated checking tool.
Filters applied to the site
If a site or its pages suddenly dropped out of the index, the search engine has probably applied a filter. Some filters can be checked in the webmaster panels.
Go to the ‘Manual actions’ item in Google Search Console. If the site is sanctioned by Google employees, then this information will be available here.
Automatically imposed sanctions are not displayed in the panel, but they can be detected by indirect signs.
Here is a list of some actions that can lead to sanctions from the search engines:
Inadequate external link building. Search engines don't like manipulation of ranking factors, so when it's discovered, they apply filters that lower the site in search results or remove it (or individual pages) from the index entirely.
Low quality or non-unique content.
Over-optimization of content. It is unacceptable to exceed a keyword density of 3-5% or to use keywords unnaturally in sentences.
Manipulating behavioral factors, for example, with fake site visits.
Duplicate content within the site
If duplicate pages appear on the site, some of them will drop out of the index, and the dropped page may well be your landing page.
Duplicates frequently appear due to the specifics of the CMS, when the same page is available at different addresses. For a search engine, these are different documents. To avoid this, account for the nuances of the CMS you use and configure it so that duplicates don't appear (for example, by installing modules), or set up a 301 redirect from the duplicates to the main page.
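Besides redirects, a standard way to tell search engines which address of a duplicated page is the preferred one is a canonical link tag in the page head (the URL shown is a hypothetical example):

```html
<link rel="canonical" href="https://mynewsite.com/your-article" />
```

With this tag present on all duplicates, search engines consolidate ranking signals onto the canonical URL instead of splitting them.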
These were the main reasons why a site can drop out of search. Once the causes have been identified and fixed, the site's pages will return to organic search results after one or several updates of the search base.
In this article, we covered the ways you can submit a website to Google, as well as the reasons a site can drop out of search engines. If you are having difficulty getting search engines to display your site, our web development experts are ready to help at any time.