19 Apr 2018
Posted By: DPadmin

The 8 Things Most People Get Wrong About SEO

I’ve been working in the search engine optimization (SEO) space for years, yet I’m still pleasantly surprised to learn new things about the industry. I’ll discover a new update, or witness a trick used by one of my colleagues, and rush to the drawing board to incorporate it into my running campaigns. SEO is truly an industry of constant evolution and discovery, so I try not to succumb to the illusion that I know everything about it.

But on the other hand, the fundamentals of SEO have remained more or less the same, despite two decades of progression. And, in part because people never bothered to learn how SEO really works and in part because of myths that are still circulated by uninformed writers, most people still don’t fully understand how those fundamentals work.

In my conversations with SEO newcomers (including some people radically opposed to the idea), I’ve discovered there are eight main points that most people get wrong about SEO:

  1. It’s a gimmick, trick, or scheme. The way some people talk about SEO, it’s natural to think it’s some kind of gimmick. It may have been presented to you as a sequence of tricks designed to get your site to rank above others in search results, but this is only partially true. The white-hat search optimizer isn’t trying to deceive Google’s search algorithm or game their way to the top. Instead, they’re trying to figure out what website features and content are most important to users (and search engines), and provide them. Most of the time, this results in organic, well-intentioned website improvements—not spam, hacks, or short-term tricks.
  2. Keyword rankings are all that matter. Yes, one of SEO’s biggest priorities is getting you ranked as high as possible in search engine results pages (SERPs), but this often leads to an error in prioritization, with marketers believing keyword rankings are all that matter. In fact, there are dozens of metrics and key performance indicators (KPIs) you should be measuring to gauge your campaign’s success, and keyword rankings are only one of them.
  3. Google penalties are a major threat. The way some people write about Google penalties, you’d think they were handed out more often than speeding tickets. But the reality is, the most severe Google penalties are the result of a manual action—in response to truly egregious behavior that most webmasters know to stay away from. Automatic penalties, or temporary ranking drops, are more common but far less severe. If you follow best practices, you have nothing to worry about.
  4. The less you spend on SEO, the better. SEO is known for being a cost-effective strategy with a high return on investment (ROI). Accordingly, many newcomers think the best approach to SEO is to spend as little as possible to avoid risk and maximize long-term returns. However, low budgets often come with amateur work and minimal strategic execution; in many cases, it’s better to spend more on better services.
  5. SEO is too technically complex. It’s true that there are many technical components to SEO, and to a first-timer, things like robots.txt file editing and canonical tags can look intimidating. But even without coding experience, it’s possible to learn the basics of areas like these within a few hours. I maintain that SEO is highly learnable—so long as you’re dedicated to mastering it. And to help people learn it, I wrote SEO 101: A Guide for the Technically Challenged.
  6. SEO is easy. That said, I’ve also seen people on the other side of the fence, insisting that SEO is so simple anyone can do it without experience. That isn’t quite true either. You can learn many SEO concepts in an afternoon, but there are so many variables to remember and so many strategic directions you could take, it takes years of practice before you can consider yourself a master. And even then, you need to keep up with the latest industry changes if you want to stay relevant.
  7. Link building is spam. Link building can be spammy—if you execute it poorly or without strategic planning. But capable link builders know that the tactic isn’t about stamping your links on as many off-site pages as possible; it’s about creating relevant, valuable content that people want to read, and including natural, informative links within that content to boost your search relevance. If you’re doing link building right, you’ll be adding value to the web (and boosting your own domain authority as a fortunate side effect).
  8. The process is always the same. This is one of the biggest misconceptions I see; people seem to think the SEO process is always the same. They expect an SEO agency to use a reliable procedure, step by step, and get the same results for client B that they did for client A, within the same timeframe. But the truth is, this is nearly impossible; SEO is an art as much as it is a science, and different clients will require different targeting strategies, execution methods, and investment levels to get comparable results.

If you’ve held any of these beliefs or assumptions, I can’t blame you; with so much content in circulation, and few opportunities to learn the basics of the strategy, it’s natural that you may have a skewed vision of how SEO really works. Of course, even if you do have a grasp of the fundamentals, there’s always something new to learn coming up around the bend.

Hopefully, this article has given you grounds to challenge one of your underlying assumptions, has taught you something new, or has sparked a renewed interest in SEO. There’s much to learn, even from a ground level, and plenty of time to learn it.

Source: The 8 Things Most People Get Wrong About SEO

06 Apr 2018
Posted By: DPadmin

Auditing customer reviews for organic traffic growth without losing speed or attracting penalties 

User-generated content on product or service pages can be key to driving conversions and a fantastic way to add unique content to a page.

If you don’t have the resources to write good content yourself, user-generated content can be especially helpful. However, if your customer review content isn’t optimized for search engines, it can work against you and delay or obstruct your marketing efforts instead of driving more business.

Below are four common issues (and a bonus) I have come across when auditing retailer product pages and the workarounds I’ve used for each.

1. Page speed

This is a much-discussed subject, and page speed officially becomes a mobile search ranking factor in July 2018. It is key to sync with your web developers on the optimal page load speed, since images, related products and review content all impact load times on this critical part of the purchase funnel.

Customer review content works best when you balance how much of it is rendered in the Hypertext Markup Language (HTML) against page speed. Opening the floodgates to 500+ reviews on a product page is not ideal for anyone (it adds content, but it also adds load time). Search engine optimization specialists (SEOs) and developers will agree that most third-party review providers render a standard eight to 10 visible reviews on a page before transitioning to another mechanism for accessing the remaining reviews.

Ask your dev team how many reviews your pages can carry (don’t feel limited to 10) before your desired load time is exceeded, and run tests.

There are a few different ways review content can be exposed to users and search engines:

  • Create a secondary page to “read all reviews.” This page can also host the remaining reviews and can be optimized for “product + reviews” search queries. Examples of this can be seen on both Amazon.com and Bestbuy.com in this framework:
[Image: “read all reviews” page frameworks on Amazon.com and Bestbuy.com — courtesy Ayima]
  • Apply a paginated approach within the main product page, loading the next round of reviews once your determined threshold is hit. For example, after 20 reviews, clicking “next” loads the next 20 reviews, and so on. If your pagination is implemented correctly (rel=next / prev), this content will still be crawled by search engines.
  • As a last resort, you could also consider using JavaScript to load more reviews past your determined threshold, for user experience (UX). However, all the reviews should also be present in the source code, via a noscript fallback, in case JavaScript is disabled. This adds page load time, depending on the volume of reviews on a single page, but it allows search engines to retain access to the content.
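As a rough sketch of the paginated approach above, page 2 of a review series could declare its neighbors in the document head (the URL scheme and parameter name here are illustrative, not prescriptive):

```html
<!-- Page 2 of paginated reviews: declare the previous and next pages -->
<!-- so crawlers can discover the full review series. -->
<head>
  <link rel="prev" href="https://www.example.com/product/reviews?page=1">
  <link rel="next" href="https://www.example.com/product/reviews?page=3">
</head>
```

The first page of the series would carry only a rel=next link, and the last page only a rel=prev link.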

2. Structured data markup: Have you done it right?

Marking up your product pages with structured data, including the aggregate rating and reviews, can generate a rich result in the search engine results pages (SERPs), which can increase your click-through rate (CTR) over competitors and provide more information to the crawlers about the content on your page. You already know the benefit of markup, but has it been done correctly? You may not know you have an error!

[Image: SERP rich result for customer reviews]

If you use a third-party provider for your reviews, they typically supply the markup on those reviews when they are syndicated to your site. We have seen two issues here:

  • Mixing two different structured data syntaxes. You’ve decided to code in JavaScript Object Notation for Linked Data (JSON-LD), but your review provider is using Microdata. The result? Two formats that don’t speak to each other. In order to achieve the rich result, all product properties must be marked up in the same syntax. Ask your review provider to update their feed with the same markup syntax you have on your site.
  • The reviews have been marked up outside the itemscope product. This applies to Microdata markup, not JSON-LD. Your page pulls the customer review content into a separate div that lives outside the div you’ve marked up with your Microdata product schema. Unfortunately, this is like trying to have a conversation with someone on the other side of a door. Search engines can’t make the connection that the marked-up reviews pertain to the same product you’ve outlined in your schema, and therefore they do not assign the ratings and reviews to the rich result.

When testing either of these in the Structured Data Testing Tool, it won’t actually flag as an error or warning since it’s testing to see if you have structured data and the required elements, which you do. If you’re not getting rich results, one of these could be the culprit.
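For reference, here is a minimal sketch of review markup kept in a single consistent syntax (JSON-LD), with the product, its aggregate rating and a review all inside one object so the connection is unambiguous. The product name, rating values and author are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Classic Boot",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  },
  "review": [{
    "@type": "Review",
    "author": {"@type": "Person", "name": "Jane D."},
    "reviewRating": {"@type": "Rating", "ratingValue": "5"},
    "reviewBody": "Warm, comfortable and true to size."
  }]
}
</script>
```

Because the rating and reviews are nested inside the Product object, there is no way for them to drift “outside the door” the way a misplaced Microdata div can.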

3. Shared or syndicated reviews

It’s not uncommon to see retailers pull reviews from a vendor site onto their own site, or sometimes a retailer shares customer reviews across multiple country code Top Level Domains (ccTLDs) where they sell the same product/service. If done incorrectly, both of these scenarios can cause duplication of the user-generated content and dilute the value of the page. Worst case: a penalty for the syndication of reviews!

Sharing vendor review content across multiple sites (typically retailers): Are you aware of how many retailers are getting the same feed for the same product information and reviews? Perhaps you’re the vendor and want to protect your unique content on your site while still sharing it with retailers for increased conversions. One example is a pair of UGG boots for sale on Macys.com whose reviews are pulled from Ugg.com.

Potential solutions for the UGG boots reviews could be to block crawlers from the syndicated review content, or to embargo the reviews on the original source for a determined period of time.

That time should be determined based on the content being crawled and indexed before it is shared with other parties. Check log files and crawl rates to determine an approximate time for your site; and test the indexation of that new content once crawled. This allows the search engines to determine the original source.

Often, retailers want to leverage the reviews from their other domains to help sell the product. This is fine, but the duplicate reviews must be blocked from the crawlers. This will continue to benefit the sale without harming or penalizing your site for duplicating the user-generated content.
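One common pattern for this (a sketch only; the path name is hypothetical, and your review provider’s setup may differ) is to serve the syndicated review fragments from their own URL path and disallow that path in robots.txt, so users still see the reviews while crawlers never fetch the duplicated content:

```text
# robots.txt — keep crawlers out of syndicated review fragments
# (the /syndicated-reviews/ path is illustrative)
User-agent: *
Disallow: /syndicated-reviews/
```

Note that robots.txt prevents crawling rather than indexing; for content that has already been crawled, a noindex directive on the fragment URLs is the surer way to keep it out of the index.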

Amazon is a good example of this. Years ago, they pulled the Amazon.com reviews into the Amazon.ca pages. This practice was later halted in favor of still showing the reviews, but blocking crawler access. Now, they simply provide a link to their Amazon.com site for more reviews.

4. Coding customer reviews

And finally, none of the above matters if the review content isn’t accessible to search engine crawlers. How you code your reviews to be displayed on the page (JavaScript vs HTML) will impact using that content for improved rankings and traffic growth.

This happens all too often: JavaScript is disabled and the customer reviews disappear from the page. You can test this yourself in Google Search Console by running a fetch and render. Bear in mind that cached content is not the same as content in Google’s index; they are separate entities.

This falls into a bigger, much-debated and much-tested subject: Does Google crawl and index content rendered with JavaScript?

There’s no clear handbook on Google crawling and indexing different JavaScript frameworks; it’s something we’ve tested several times over. We’ve found many articles that try to help audit and explain JavaScript rendering for SEO, and while there are some good pointers, you must check everything against your own site, as we have found what works for one does not work for another.

We have performed our own in-house testing where HTML content is converted to JavaScript and seen no change to the page ranking and indexing. Therefore, we know it can work, but will it always?

JavaScript is not going away, yet there is no guarantee that any given implementation will be indexed. While client-side-rendered content can be indexed by Google, the process is not always reliable. Going forward, server-rendered content is the safer choice for SEO-critical elements like reviews.

At the very least, perform the audit on your reviews and ensure Googlebot is crawling and indexing them.

You may need to speak further with your review provider to ensure the content is accessible to Googlebot. Depending on the provider and template, they can help resolve any errors or concerns here.

Bonus: 5. Use your XML sitemaps

Now that you’ve created customer reviews that will drive more crawlable content, let the search engines know! Updating the eXtensible Markup Language (XML) sitemap entries is a strong signal that prompts a recrawl of those specific pages, so changes are picked up sooner. Depending on your crawl rate and the number of pages on your site, it may otherwise be a long time before a crawler gets to all your updated customer review content.
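A sketch of such a sitemap entry, bumping the lastmod date when new reviews land on a product page (the URL and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/product/example-classic-boot</loc>
    <!-- Update lastmod whenever new review content is published -->
    <lastmod>2018-04-06</lastmod>
  </url>
</urlset>
```

Resubmitting the sitemap in Google Search Console after such updates gives crawlers an explicit nudge toward the changed pages.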

Summary

To wrap things up, customer reviews are a fantastic source for growing your organic traffic by means of unique content. Audit your reviews for the following:

  1. Product page load time.
  2. Structured data markup.
  3. Audit of shared reviews.
  4. Crawler access to the review content.
  5. XML sitemap updates.

Following these guidelines can deliver significant organic search growth. Run the audit and take a closer look at the reviews on your site for improvements you could be making.

Source: Auditing customer reviews for organic traffic growth without losing speed or attracting penalties – Search Engine Land