
The first steps of your SEO audit: Indexing issues

Even a magic SEO wand will not get a web page to rank if the page has not been indexed. Contributor Janet Driscoll Miller suggests that making sure web pages can be indexed is key during an SEO audit.

Indexing is really the first step in any SEO audit. Why?

If your site is not being indexed, it is essentially unread by Google and Bing. And if the search engines can’t find and “read” it, no amount of magic or search engine optimization (SEO) will improve the ranking of your web pages.

In order to be ranked, a site must first be indexed.

Is your site being indexed?

There are many tools available to help you determine if a site is being indexed.

Indexing is, at its core, a page-level process. In other words, search engines read pages and treat them individually.

A quick way to check if a page is being indexed by Google is to use the site: operator with a Google search. Entering just the domain, as in my example below, will show you all of the pages Google has indexed for the domain. You can also enter a specific page URL to see if that individual page has been indexed.
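For example, using example.com as a stand-in for your own domain, either of these queries can be typed straight into the Google search box (the specific page URL here is just a placeholder):

  site:example.com
  site:example.com/some-specific-page/

The first form lists every page Google has indexed for the domain; the second checks whether a single URL has made it into the index.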

When a page is not indexed

If your site or page is not being indexed, the most common culprit is the meta robots tag being used on a page or the improper use of disallow in the robots.txt file.

Both the meta tag, which is on the page level, and the robots.txt file provide instructions to search engine indexing robots on how to treat content on your page or website.

The difference is that the robots meta tag appears on an individual page, while the robots.txt file provides instructions for the site as a whole. In the robots.txt file, however, you can single out pages or directories and specify how robots should treat those areas while indexing. Let’s examine how to use each.

Robots.txt

If you’re not sure if your site uses a robots.txt file, there’s an easy way to check. Simply enter your domain in a browser followed by /robots.txt.

Here is an example using Amazon (https://www.amazon.com/robots.txt):

The list of “disallows” for Amazon goes on for quite a while!

Google Search Console also has a convenient robots.txt Tester tool, helping you identify errors in your robots file. You can also test a page on the site using the bar at the bottom to see if your robots file in its current form is blocking Googlebot.


If a page or directory on the site is disallowed, it will appear after Disallow: in the robots file. As my example above shows, I have disallowed my landing page folder (/lp/) from indexing using my robots file. This prevents any pages residing in that directory from being indexed by search engines.

There are many other useful, and sometimes complex, directives you can employ in the robots.txt file. Google’s Developers site has a great rundown of all of the ways you can use the robots.txt file. Here are a few:
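The snippet below is purely illustrative (the paths and domain are placeholders, not taken from any real site), but it shows how a few common directives combine in a single robots.txt file:

  # Rules apply to all crawlers
  User-agent: *
  # Keep crawlers out of the landing page folder
  Disallow: /lp/
  # Keep crawlers out of internal search result pages
  Disallow: /search
  # Make an exception for one page inside the blocked folder
  Allow: /lp/public-offer.html
  # Tell crawlers where the XML sitemap lives
  Sitemap: https://www.example.com/sitemap.xml

The /lp/ entry mirrors the landing page folder mentioned earlier; the other lines simply demonstrate the Disallow, Allow and Sitemap directives.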

Robots meta tag

The robots meta tag is placed in the <head> section of a page’s HTML. Typically, there is no need to use both the robots meta tag and the robots.txt file to disallow indexing of a particular page.

In the Search Console image above, I don’t need to add the robots meta tag to all of my landing pages in the landing page folder (/lp/) to prevent Google from indexing them since I have disallowed the folder from indexing using the robots.txt file.

However, the robots meta tag does have other functions as well.

For example, you can tell search engines that links on the entire page should not be followed for search engine optimization purposes. That could come in handy in certain situations, like on press release pages.

Probably the two directives used most often for SEO with this tag are noindex/index and nofollow/follow (a sample tag is shown after the list below):

  • Index follow. Implied by default. Search engine indexing robots should index the information on this page. Search engine indexing robots should follow links on this page.
  • Noindex nofollow. Search engine indexing robots should NOT index the information on this page. Search engine indexing robots should NOT follow links on this page.
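As a simple, hypothetical illustration, here is how those directives look as tags in a page’s <head> (a real page would use only one robots meta tag, not both):

  <!-- Index this page, but do not follow its links (the press release scenario above) -->
  <meta name="robots" content="index, nofollow">

  <!-- Keep this page out of the index entirely and do not follow its links -->
  <meta name="robots" content="noindex, nofollow">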

The Google Developer’s site also has a thorough explanation of uses of the robots meta tag.

XML sitemaps

When you have a new page on your site, ideally you want search engines to find and index it quickly. One way to aid in that effort is to use an Extensible Markup Language (XML) sitemap and register it with the search engines.

XML sitemaps provide search engines with a listing of pages on your website. This is especially helpful when you have new content that likely doesn’t have many inbound links pointing to it yet, making it tougher for search engine robots to follow a link to find that content. Many content management systems now have XML sitemap capability built in or available via a plugin, like the Yoast SEO Plugin for WordPress.
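For reference, a minimal XML sitemap containing a single page looks like the sketch below (the domain, URL and date are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <!-- One <url> block per page you want search engines to know about -->
      <loc>https://www.example.com/new-blog-post/</loc>
      <lastmod>2018-05-16</lastmod>
    </url>
  </urlset>

Plugins like Yoast generate and maintain a file like this automatically; the point is simply that every important URL is listed in one place that crawlers can check.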

Make sure you have an XML sitemap and that it is registered with Google Search Console and Bing Webmaster Tools. This ensures that Google and Bing know where the sitemap is located and can continually come back to index it.

How quickly can new content be indexed using this method? I once did a test and found my new content had been indexed by Google in only eight seconds — and that was the time it took me to change browser tabs and perform the site: operator command. So it’s very quick!

JavaScript

In 2011, Google announced it was able to execute JavaScript and index certain dynamic elements. However, Google isn’t always able to execute and index all JavaScript. In Google Search Console, the Fetch and Render tool can help you determine if Google’s robot, Googlebot, is actually able to see your content in JavaScript.

In this example, the university website is using asynchronous JavaScript and XML (AJAX), a technique that uses JavaScript to load content dynamically, to generate a course subject menu that links to specific areas of study.

The Fetch and Render tool shows us that Googlebot is unable to see the content and links the same way humans will. This means that Googlebot cannot follow the links in the JavaScript to these deeper course pages on the site.
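As a hedged sketch of the kind of markup that causes this problem, the menu below is built entirely by JavaScript after the page loads, so the links only exist once the script runs (the course names and URLs are invented for illustration):

  <!-- The menu container is empty in the raw HTML -->
  <ul id="course-menu"></ul>

  <script>
    // Course links are injected after page load, so a crawler that does not
    // execute this script sees an empty list with no links to follow.
    var courses = [
      { name: "Biology", url: "/courses/biology/" },
      { name: "History", url: "/courses/history/" }
    ];
    var menu = document.getElementById("course-menu");
    courses.forEach(function (course) {
      var item = document.createElement("li");
      item.innerHTML = '<a href="' + course.url + '">' + course.name + '</a>';
      menu.appendChild(item);
    });
  </script>

If Fetch and Render shows Googlebot missing content like this, providing plain HTML links as a fallback (or rendering the menu on the server) gives crawlers something they can reliably follow.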

Conclusion

Always keep in mind your site has to be indexed in order to be ranked. If search engines can’t find or read your content, how can they evaluate and rank it? So be sure to prioritize checking your site’s indexability when you’re performing an SEO audit.

Source: The first steps of your SEO audit: Indexing issues – Search Engine Land


SEO Strategies and Keyword Rankings: Mobile Versus Desktop

As if we didn’t already have enough to think about in any given SEO campaign, it is now imperative to separate and refine your approaches to mobile and desktop search.

While mobile has become hugely significant over the last couple of years, this shouldn’t be to the neglect of desktop. Although SEO for mobile and desktop follow the same basic principles and best practices, there are nuances and discrepancies that need to be factored into your overall strategy.

Part of this is the keyword rankings: you won’t ever know how to adapt your strategies if you’re not tracking the rankings separately for each. Research from BrightEdge found that 79% of listings have a different rank on mobile devices compared with desktop, and the top-ranking result for a query is different on desktop and mobile 35% of the time. These are statistics that simply cannot be ignored.

Why do they differ?

Before delving into how to compare keyword rankings on mobile and desktop, it’s first important to acknowledge the why and the what: why they are different and what it means for your SEO strategy.

It’s paramount to understand that desktop and mobile searches use different algorithms. Ultimately, Google wants to provide the best user experience for searchers, whatever device they are using. This means creating a bespoke, device-tailored experience, and in order to do that, we need to delve deeper into user intent.

It’s all about user intent

The crux of the mobile versus desktop conundrum is that user intent tends to differ for each device. This is particularly important when considering how far along the funnel a user is. It’s a generalization, but overall mobile users are often closer to the transactional phase, while desktop users are usually closer to the informational phase.

For example, we can better understand user intent on mobile by understanding the prevalence of local search. If a user is searching for a product or service on mobile, it is likely to be local. In contrast, users searching for a product or service on desktop are more likely to be browsing non-location-specific ecommerce sites.

Let’s also consider the types of conversions likely to occur on each device, in terms of getting in touch. Users on mobile are far more likely to call, simply by tapping the number that appears in the local map pack section. Alternatively, desktop users would be more inclined to type an email or submit a contact form.

What on earth is a micro-moment?

To better understand the different ways in which consumers behave, it may help to spend a little time familiarizing yourself with micro-moments. These refer to Google’s ability to determine a searcher’s most likely intent, and they are particularly important for mobile users, where a consumer often needs to take immediate action.

For example, if a user is searching for a local product or service, the local map pack will appear, but if they are searching for information then the quick answer box will appear. These micro-moments therefore have a significant impact on the way the SERPs are constructed.

Once you’ve understood the user intent of a given searcher, you can ensure that you are providing content for both mobile and desktop users. However, it’s worth bearing in mind that content with longer word counts continues to perform well on mobile, despite the general consensus that people on mobile simply can’t be bothered to consume long form content. This harks back to Google’s prioritization of high quality content. Besides, anybody who has a long train commute into work will understand the need for a nice, long article to read on mobile.

Rankings tools

With that context, we can now return to the matter at hand: rankings. Of course, you could record the rankings for both desktop and mobile the old-fashioned, manual way, but who has time for that? In short, any good SEO tool worth its salt will enable you to track both desktop and mobile rankings separately. Here are some favorites:

  • SEMRush is a personal favorite among the plethora of fancy SEO tools. SEMRush provides a comprehensive breakdown of mobile vs desktop results (as well as tablet if you really want to geek out) and displays the percentage of mobile-friendly results for your domain.
  • SearchMetrics offers Desktop vs. Mobile Visibility metrics, detailing individual scores for desktop and mobile, as well as overlap metrics which show how many keyword search results appear in exactly the same position for both. You can also drill down further to view how a website performs with regard to localized results.
  • Google Search Console. Don’t have access to any of the above tools? Don’t worry as you can still rely on the trusty Google Search Console. When looking at your search analytics, filter devices by comparing mobile and desktop. Even if you do have access to an SEO tool that allows you to do comparison analysis, it’s definitely still worth checking in on your Search Console insights.

Rankings are only part of the picture

It’s important to remember that rankings are only a tiny part of the picture; it’s essential to take a more holistic approach to the mobile vs desktop issue. This means taking the time to dig around Google Analytics and unearth the data and meaning beyond the vanity metrics.

You may have higher rankings for mobile, but those users might be bouncing more regularly. Is this a reflection of user intent, or is it a poor user experience? Do higher rankings for one device correlate with higher conversions? If not, then you need to consider the reasons for this. There’s no one-size-fits-all answer, so you must take a tailored approach to your strategy.

Quick tips for differentiating your strategies

You’ve got your mobile and desktop rankings sorted. Now you need to create or amend your strategies for both devices. Here are some quick tips to do so:

  • Separate mobile and desktop-specific search terms in your keyword research
  • Factor in voice search for mobile devices
  • Consider implementing Accelerated Mobile Pages where appropriate
  • Carry out a mobile SEO audit on your site
  • Include mobile vs desktop into your tracking and reporting, going beyond the rankings
  • Revisit your content strategy to ensure you are factoring in both mobile and desktop optimized content – cater for both types of user.

In short, tracking your keywords on mobile and desktop is absolutely essential for both reporting accuracy and supporting separate SEO strategies for each device. But don’t stop there; it’s more important to understand why the rankings differ and how you can use that information to refine your SEO strategies.