16 May 2018
Posted By: DPadmin

The first steps of your SEO audit: Indexing issues

Even a magic SEO wand will not get a web page to rank if the page has not been indexed. Contributor Janet Driscoll Miller suggests that making sure web pages can be indexed is key during an SEO audit.

Indexing is really the first step in any SEO audit. Why?

If your site is not being indexed, it is essentially unread by Google and Bing. And if the search engines can’t find and “read” it, no amount of magic or search engine optimization (SEO) will improve the ranking of your web pages.

In order to be ranked, a site must first be indexed.

Is your site being indexed?

There are many tools available to help you determine if a site is being indexed.

Indexing is, at its core, a page-level process. In other words, search engines read pages and treat them individually.

A quick way to check if a page is being indexed by Google is to use the site: operator with a Google search. Entering just the domain, as in my example below, will show you all of the pages Google has indexed for the domain. You can also enter a specific page URL to see if that individual page has been indexed.
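
For illustration (using a placeholder domain rather than the original screenshot), the two forms of the query look like this:

  site:example.com
  site:example.com/blog/my-new-post/

The first returns every page Google has indexed for the domain; the second checks whether a single URL has been indexed.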

When a page is not indexed

If your site or page is not being indexed, the most common culprit is the meta robots tag being used on a page or the improper use of disallow in the robots.txt file.

Both the meta tag, which is on the page level, and the robots.txt file provide instructions to search engine indexing robots on how to treat content on your page or website.

The difference is that the robots meta tag appears on an individual page, while the robots.txt file provides instructions for the site as a whole. Within the robots.txt file, however, you can single out specific pages or directories and specify how robots should treat those areas while indexing. Let’s examine how to use each.

Robots.txt

If you’re not sure if your site uses a robots.txt file, there’s an easy way to check. Simply enter your domain in a browser followed by /robots.txt.

Here is an example using Amazon (https://www.amazon.com/robots.txt):

The list of “disallows” for Amazon goes on for quite a while!

Google Search Console also has a convenient robots.txt Tester tool, helping you identify errors in your robots file. You can also test a page on the site using the bar at the bottom to see if your robots file in its current form is blocking Googlebot.


If a page or directory on the site is disallowed, it will appear after Disallow: in the robots file. As my example above shows, I have disallowed my landing page folder (/lp/) from indexing using my robots file. This prevents any pages residing in that directory from being indexed by search engines.

There are many cool and complex options for employing the robots.txt file. Google’s Developers site has a great rundown of all of the ways you can use it; here are a few of them in action.
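
As a quick illustration (a made-up file, not Amazon’s or the author’s), a single robots.txt can target specific crawlers, block individual directories, and point crawlers to a sitemap:

  User-agent: *
  Disallow: /lp/
  Disallow: /tmp/

  User-agent: Googlebot-Image
  Disallow: /photos/

  Sitemap: https://www.example.com/sitemap.xml

Here every robot is asked to stay out of the /lp/ and /tmp/ directories, Google’s image crawler is additionally kept out of /photos/, and the Sitemap line tells crawlers where to find the XML sitemap discussed later in this article.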

Robots meta tag

The robots meta tag is placed in the <head> section of a page’s HTML. Typically, there is no need to use both the robots meta tag and the robots.txt file to disallow indexing of a particular page.

In the Search Console image above, I don’t need to add the robots meta tag to all of my landing pages in the landing page folder (/lp/) to prevent Google from indexing them since I have disallowed the folder from indexing using the robots.txt file.

However, the robots meta tag does have other functions as well.

For example, you can tell search engines that links on the entire page should not be followed for search engine optimization purposes. That could come in handy in certain situations, like on press release pages.

Probably the two directives used most often for SEO with this tag are noindex/index and nofollow/follow:

  • Index, follow. Implied by default. Search engine indexing robots should index the information on this page. Search engine indexing robots should follow links on this page.
  • Noindex, nofollow. Search engine indexing robots should NOT index the information on this page. Search engine indexing robots should NOT follow links on this page.
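
For reference, here is roughly what the tag looks like in a page’s HTML, using a hypothetical press release page where you want the copy indexed but the links not followed:

  <head>
    <!-- Index the page, but do not follow or pass credit through its links -->
    <meta name="robots" content="index, nofollow">
  </head>

Swapping the content attribute to "noindex, nofollow" would keep the page out of the index entirely.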

The Google Developers site also has a thorough explanation of the uses of the robots meta tag.

XML sitemaps

When you have a new page on your site, ideally you want search engines to find and index it quickly. One way to aid in that effort is to use an Extensible Markup Language (XML) sitemap and register it with the search engines.

XML sitemaps provide search engines with a listing of pages on your website. This is especially helpful when you have new content that likely doesn’t have many inbound links pointing to it yet, making it tougher for search engine robots to follow a link to find that content. Many content management systems now have XML sitemap capability built in or available via a plugin, like the Yoast SEO Plugin for WordPress.
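
For reference, an XML sitemap is just a structured list of URLs. A minimal, hypothetical two-page example looks like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2018-05-01</lastmod>
    </url>
    <url>
      <loc>https://www.example.com/new-blog-post/</loc>
      <lastmod>2018-05-15</lastmod>
    </url>
  </urlset>

Plugins like the Yoast SEO plugin mentioned above generate and update a file like this automatically as you publish content.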

Make sure you have an XML sitemap and that it is registered with Google Search Console and Bing Webmaster Tools. This ensures that Google and Bing know where the sitemap is located and can continually come back to index it.

How quickly can new content be indexed using this method? I once did a test and found my new content had been indexed by Google in only eight seconds — and that was the time it took me to change browser tabs and perform the site: operator command. So it’s very quick!

JavaScript

In 2011, Google announced it was able to execute JavaScript and index certain dynamic elements. However, Google isn’t always able to execute and index all JavaScript. In Google Search Console, the Fetch and Render tool can help you determine if Google’s robot, Googlebot, is actually able to see your content in JavaScript.

In this example, the university website is using asynchronous JavaScript and XML (AJAX), which is a form of JavaScript, to generate a course subject menu that links to specific areas of study.

The Fetch and Render tool shows us that Googlebot is unable to see the content and links the same way a human visitor does. This means that Googlebot cannot follow the links in the JavaScript to these deeper course pages on the site.
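
To illustrate the kind of pattern that causes this (a simplified, hypothetical version of such a menu, with a made-up /api/subjects endpoint), the links below only exist after the script runs, so a crawler that does not execute the JavaScript sees an empty list and finds no links to follow:

  <ul id="course-menu"></ul>
  <script>
    // Fetch the subject list and build the menu links entirely on the client side.
    fetch('/api/subjects')
      .then(function (response) { return response.json(); })
      .then(function (subjects) {
        var menu = document.getElementById('course-menu');
        subjects.forEach(function (subject) {
          var item = document.createElement('li');
          item.innerHTML = '<a href="/courses/' + subject.slug + '">' + subject.name + '</a>';
          menu.appendChild(item);
        });
      });
  </script>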

Conclusion

Always keep in mind your site has to be indexed in order to be ranked. If search engines can’t find or read your content, how can they evaluate and rank it? So be sure to prioritize checking your site’s indexability when you’re performing an SEO audit.

Source: The first steps of your SEO audit: Indexing issues – Search Engine Land

04 Apr 2018
Posted By: DPadmin

How to get started with local SEO

Local SEO proved to be one of the biggest trends throughout 2016 and 2017, and it is expected to remain so throughout 2018.

Businesses that have been able to optimize their on-page and off-page SEO strategies are already reaping substantial benefits from local SEO. For others, there are undeniable opportunities to begin their local SEO journeys.

Google suggests that 80% of users conduct online searches for local businesses, and that 50% of users who do a local search on mobile for a business visit its store within a day. Yet businesses continue to miss the opportunities that local SEO provides.

Don’t be that business. Instead, use the tips and tricks mentioned in this guide to get started with local SEO.

Claim your Google My Business page and optimize it

Google+ might have mostly fizzled out, but Google My Business continues to be a cornerstone for implementing local SEO. If you have not claimed a Google My Business listing for your business yet, now is the time to do so. The chances of your business featuring on the first page of a locally relevant search improve manifold simply by having a well-optimized, fully filled-out My Business listing.

Go to google.com/business, start the registration and verification process, and wait for Google to send you a postcard to your physical store location.

Keep in mind that Google only allows actual business owners to hold their My Business pages, so work out an arrangement with your digital marketing consultants that ensures you continue to own the My Business listing even if they depart.

Your business name, address, and phone number (abbreviated as NAP) must match what you have been using in your digital marketing so far. Also pay special attention to selecting categories, business hours, types of payment accepted, and so on.

Then upload high-quality photographs of your storefront and interior to the profile. Digital businesses without a physical location can hide their address and still claim a My Business listing.

Here’s what a well maintained and optimized Google My Business profile could look like on a search page.

Understand and master the art of citations

Put simply, every mention of your business online is a citation, and more citations are good for your business’s local SEO. When does Google count a mention as a citation? Your business NAP has to be included in the mention for it to count.

Too many businesses have already lost months of effort in getting themselves mentioned online, purely because of inconsistent NAP. Though there is growing consensus among digital marketers that Google triangulates data and can identify slightly different business names as belonging to the same business, we’d recommend you play it safe and keep your NAP consistent everywhere.
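
To make that concrete (with made-up details), a consistent NAP means every citation uses exactly the same form:

  Example Coffee House
  123 Main Street, Springfield, IL 62701
  (555) 555-0100

Listings that say “Example Coffee House LLC,” abbreviate “Main Street” to “Main St.,” or show an old phone number are exactly the kind of inconsistencies to avoid.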

Keep on optimizing your website for mobile

Though this is something every website owner must do, local business website owners particularly need to step up their game. That’s because a majority of local searches are done on mobile devices and are backed by immediate intent.

Responsive layouts and intuitive user experience and interface design are the basics; you need to go beyond them. Google’s Mobile-Friendly Test tool is a great starting point. I ran a test on a post I was reading recently and was impressed with the tool’s validation.
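
At the most basic level, a responsive layout assumes every page declares a viewport in its <head>; a page missing this tag will typically be flagged by the Mobile-Friendly Test:

  <meta name="viewport" content="width=device-width, initial-scale=1">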

Add business directories to your to-do list

Apart from giving you a valuable citation online, business directory pages also garner more visibility for your business. Here are some action points for you.

  • Start with the most notable business review directory websites such as Yelp and CitySearch
  • Next, use this list of business directories and create your business profiles on each (target at least 7 complete profiles per week)
  • Look for niche specific business directories and create your profiles there
  • Look for local business community websites, and grab your listing there
  • Check if the state government has a Chamber of Commerce or equivalent website, and look for a way to get a mention there
  • Use the services of citation aggregators like Infogroup, Acxiom, and Factual
  • Look for an opportunity for a citation via local newspaper websites
  • Of course, remember to get your NAP spot on every time.

‘Localize’ your website’s content

You can do a lot to help search engines understand your business’s local appeal by optimizing your website accordingly. Local content, for instance, can help search engines connect your website’s niche to the local area it serves. You could also include an interactive map widget to further enhance the local SEO appeal of your website.

Also, consider creating a separate local news section on your website, where you can post content about niche-related local events. This will serve you well by giving you a natural place to use locally relevant keywords.

Businesses such as restaurants, law firms, home repair services, and interior décor shops have a lot to gain from these basic tactics.

Be very hungry for online reviews

A Moz report attributes 8.4% of ranking value to online reviews. That doesn’t sound like much, but considering that 88% of users depend on online reviews to form opinions about the quality of businesses, brands, and products, the eventual impact of reviews is significant.

Google My Business reviews are the primary source of SEO juice; you need at least 5 reviews for Google to start showing your reviews. Facebook Business reviews must be the next on your radar, because of the trust they inspire among online users. 

There are several other review websites you need to take care of, to maximize the local SEO benefit from the same. To get more reviews, try out these tactics:

  • Motivate store managers and field sales personnel to ask customers for reviews on the spot, handing them a mobile device to log in to, for instance, Zomato or Yelp (consider offering customers a small discount in return)
  • Use email marketing, with a single link that takes users directly to the reviews page
  • Consider using a social listening tool such as HootSuite to be alerted to mentions of your business and brand, which you can turn into reviews
  • It’s also worthwhile to consider the services of online reputation management agencies for this.

Invest effort in local SEO-relevant rich schema markup

Schema markup can be added to your website’s code to enhance its readability for search engines. There are several schema markup tags that specifically focus on the local attributes of your website.

Local schema markup assists local SEO in two ways:

  • First, it allows search engines to understand your business’ local relevance
  • Second, it means search engines can show your business page result along with rich snippet info such as phone number, address, business working hours, ratings, reviews, etc.
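
As a rough sketch (all values are placeholders for a hypothetical business), a LocalBusiness JSON-LD block placed in a page’s HTML might look like this:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee House",
    "url": "https://www.example.com",
    "telephone": "+1-555-555-0100",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main Street",
      "addressLocality": "Springfield",
      "addressRegion": "IL",
      "postalCode": "62701"
    },
    "openingHours": "Mo-Sa 08:00-18:00"
  }
  </script>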

Here’s an example of how web results with local SEO schema markup appear on SERPs.

Local schema markup is beyond the scope of this guide, but here’s a good tutorial from Schema App.

Don’t forget to run your website through Google’s Structured Data Testing Tool to confirm that the schema markup has been implemented correctly.

Concluding remarks

As you read this, there are hundreds of potential customers searching for businesses in your neighborhood. Your website could be right in front of them on their desktops and mobile phones as soon as you get started on local SEO with the tips, tricks, tools, and methods described in this guide.

 

Source: How to get started with local SEO | Search Engine Watch