A Complete Guide to Bing Webmaster Tools
Bing has always mattered to SEOs and webmasters, and nowhere is this more evident than with Bing Webmaster Tools. In several ways, Bing Webmaster Tools is more advanced – and more useful to SEO professionals – than its Google counterpart, Google Search Console. For that, we give Bing a round of applause. Would we even have a Google Disavow Links Tool if Bing hadn’t introduced one first? Maybe; but Bing still deserves credit for serving SEOs. Bing Webmaster Tools has plenty to offer, and in this article we aim to profile all of its key features. This information comes from both firsthand experience and Bing’s documentation.
Adding & Verifying Your Website
Adding a website to Bing Webmaster Tools is simple. After logging in, enter the URL of your website’s home page and click the “Add” button. From there, you will be taken to a screen where you can enter basic information and a sitemap URL. Before you can begin monitoring your website and viewing data, the site must be verified. After adding a website, a “Verify Now” option will appear, giving you three choices for verification. The first option is XML file verification: download the “BingSiteAuth.xml” file and place it in the root directory of your site. The second option is meta tag verification: place the custom line of code provided by Bing into the homepage of your website. Finally, there is DNS verification, which requires a bit more technical skill than the previous two methods. You will need access to your hosting account in order to edit a CNAME record so that it holds the verification code. After verification, expect it to take 1–2 days for Bing to gather and display data for newly added websites. This allows time for crawling and indexing.
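To illustrate the first two verification methods, here is roughly what the downloaded BingSiteAuth.xml file and the Bing-provided meta tag look like. The verification code shown is a placeholder, not a real value; always use the exact file and tag Bing generates for your account:

```
<!-- BingSiteAuth.xml: upload to the root of your site,
     e.g. https://www.example.com/BingSiteAuth.xml -->
<?xml version="1.0"?>
<users>
  <user>0123456789ABCDEF0123456789ABCDEF</user>
</users>

<!-- Meta tag verification: place inside the <head> of your homepage -->
<meta name="msvalidate.01" content="0123456789ABCDEF0123456789ABCDEF" />
```

For DNS verification, Bing instead asks you to add a CNAME record containing your verification code that points to verify.bing.com.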
Bing Webmaster Tools provides a user-friendly interface for managing multiple websites from a single account. When you log in, you will see a list of the sites you manage along with a snapshot of data about each, including clicks, impressions, pages crawled, and pages indexed.
Once your website has been verified, you will be able to access its Site Dashboard from the My Sites page. Site Dashboards provide an overview of your website’s recent activity in Bing: a list of the sitemaps you have submitted, your top organic search keywords, your top inbound links, and a small drop-down menu of URL diagnostic tools. You can click through to see more detailed data for any of these sections, or use the left-hand navigation to explore additional reports and tools.
Sitemaps
Your sitemap(s) tell Bing about your website’s structure, making it easier for the search engine to crawl and index your pages. While there are multiple ways to submit a sitemap to Bing, doing so through Bing Webmaster Tools’ Sitemaps feature is arguably the simplest. Here, you can submit a sitemap for your website and view sitemap details (such as the number of URLs, date submitted, and last crawl date). You can also resubmit or export your sitemaps here. If you are using XML sitemaps, double-check your submission against the Sitemaps.org protocol. If your website is largely static, pay special attention to your site architecture as well.
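For reference, a minimal XML sitemap that follows the Sitemaps.org protocol looks like this (the domain, dates, and optional changefreq/priority values below are placeholders):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2015-05-15</lastmod>
  </url>
</urlset>
```

Only the loc element is required per URL; the other elements are optional hints to crawlers.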
Submit URLs
You can ask Bing to re-crawl, or perform an initial crawl of, specific pages by using the Submit URLs feature. Use it to submit URLs that are not currently in Bing’s index or that need to be re-indexed due to changes. Add one URL per line and click “Submit” to have the URLs evaluated for indexation and inclusion in search results. As valuable as this feature is, it is currently limited to 50 URL submissions per month (with a maximum of 10 per day), so save your requests for your most important pages. Note that this feature is also limited to root domains only.
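The 50-per-month, 10-per-day quota means a long URL list has to be spread across several days. Here is a minimal Python sketch of that planning arithmetic; the function name and structure are our own illustration, not part of any Bing API:

```python
def plan_submissions(urls, per_day=10, per_month=50):
    """Split a prioritized URL list into daily batches that respect
    Bing's Submit URLs quota: at most `per_day` URLs per batch and
    at most `per_month` URLs in total. URLs beyond the monthly cap
    are dropped, so list your most important pages first."""
    capped = urls[:per_month]
    return [capped[i:i + per_day] for i in range(0, len(capped), per_day)]

# Example: 63 candidate URLs fit into 5 daily batches of 10;
# the remaining 13 must wait for next month's quota.
batches = plan_submissions(
    [f"https://www.example.com/page-{n}" for n in range(63)]
)
```

Ordering the input list by priority ensures that when the monthly cap truncates it, only the least important pages are deferred.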
Ignore URL Parameters
URL parameters are useful for tracking a variety of user behaviors, but the downside is that sites using them can end up with many versions of the same URL, each serving identical content. This can lead to duplicate content problems, which can hurt your SEO efforts. Luckily, Bing Webmaster Tools lets you indicate which URL parameters Bing’s crawlers should ignore. Similar to Google’s parameter handling, this feature lets you resolve parameter problems: you define which query parameters Bing’s crawlers should ignore in order to keep duplicate content out of the Bing index, prevent a page’s index value from being split across multiple URL variations, and avoid unnecessary bandwidth usage by the search crawler.
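To see why this matters, the short Python sketch below shows how several parameterized variants of a page collapse to a single canonical URL once tracking parameters are ignored. The parameter names are illustrative assumptions, not a list Bing itself uses:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative parameters that change tracking data, not page content.
IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonical(url):
    """Drop ignored query parameters (and any fragment), keeping the rest."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

# Three URLs, one page: all serve identical content.
variants = [
    "https://www.example.com/shoes?utm_source=newsletter",
    "https://www.example.com/shoes?sessionid=42&utm_campaign=spring",
    "https://www.example.com/shoes",
]
```

Without parameter handling, a crawler may treat each variant as a distinct page, splitting its index value three ways.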
Crawl Control Settings
Search engines want to pull as much useful information as possible from your site; however, they don’t want to weigh your site down in doing so. As a result, Bing lets you define a customized crawling pattern with the Crawl Control feature. You can select one of several presets or create your own custom crawl rate based on when your website traffic is lightest. Bing automatically chooses the best rate for your website, but you can adjust it manually. You may want to choose a custom crawl rate if you want Bingbot to visit during off-peak hours, especially if you are running promotions that may strain your bandwidth, expecting a large amount of traffic, or deploying a heavy online content push.
Deep Links
Bing has a feature similar to Google’s Sitelinks, known as Deep Links. These are the links that appear beneath top-ranked search results, linking directly to various landing pages within the website. Essentially, Deep Links provide more visibility in search results by giving searchers more content options to choose from. While you cannot add Deep Links yourself (they are generated automatically based on what content Bing deems most important), you can block specific URLs from appearing as Deep Links in search results.