06-Apr-2018
Posted By: DPadmin
38 Views

Search outpaced social for referral traffic last year, driving 35% of site visits vs social’s 26% share of visits 

According to a new referral traffic report from Shareaholic, 2017 was the first time since 2014 that search owned a larger share of visits than social.

After a year fraught with terms like “fake news” and headlines centering on brand safety issues and extreme content, it appears the actions taken by social sites to curb the influx of malicious content are turning out to be a real boon for search referral traffic.

For the first time since 2014, Shareaholic says search outpaced social in the percentage of overall traffic it delivered in 2017. According to the analytics platform’s data, search drove 34.8 percent of site visits in 2017, compared to social networks, which accounted for 25.6 percent of referral traffic.

Chartbeat, an analytics platform for online publishers and media organizations, has witnessed a similar trend with traffic from Google search to publisher websites up more than 25 percent since the start of 2017.

“Google Search has always been the largest referrer to Chartbeat clients,” writes the company’s CEO, John Saroff, on Chartbeat’s blog. “In late August, Chartbeat data scientists noticed that Google Search referrals across our client base were trending up.”

The CEO says his team initially thought the rise in Google referrals was tied to events like last year’s solar eclipse and Hurricane Irma, but traffic continued to rise even after news headlines around the events subsided. Instead of falling back into normal patterns, Chartbeat saw Google search driving even more traffic to publisher sites.

Search beats out social for share of visits

“At a high level, it’s clear that social media’s tenuous grip on being the top referral category is over. After beating out search for the last three years, it’s given back the title, driven by changes to the algorithms behind Facebook’s News Feed,” writes Shareaholic in its latest traffic report.

Shareaholic’s findings are based on traffic to more than 250,000 mobile and desktop sites that have opted-in to the content marketing platform’s publishing tools. The company says it analyzed a variety of traffic sources — direct traffic, social referrals, organic search and paid search — for websites that ranged in size from a thousand monthly unique visitors to one million, and spread across a broad selection of website categories (food, tech, fashion and beauty, marketing, sports, general news, and more).

Google was the top overall traffic referrer for the year, and owned a 36.82 percent share of visits during the second half of 2017. While Google’s share of visits was up more than seven percentage points between the second half of 2016 and the second half of 2017, Facebook’s dropped 12.7 percent during the same time frame.

Even with a double-digit drop, however, Facebook remained the top social network for share of visits in 2017.

Shareaholic notes that the changes Facebook has made to its news feed algorithm (boosting content from “trusted” news sources while penalizing spammy, click-bait headlines) influenced the site’s drop in share of visits: “After a rocky 2016 US election year, Facebook made a number of major changes to what content they display in the news feed and how they display it.”

The two charts below, one from Shareaholic and the other from Parse.ly, convey similar trends with respect to search vs. social referral traffic in 2017, through the third quarter of the year. The Parse.ly data reflects the upward trend in referral traffic from Google (all Google traffic, including AMP, Google’s Accelerated Mobile Pages format) and the declining trend in referral traffic from Facebook specifically (all Facebook traffic, including Instant Articles).

Search vs. Social Referral Traffic – 2017 from Shareaholic

Google Search and Facebook Referral Traffic – 2017 from Parse.ly

Publishers also see continued gains from search driven by AMP

While Shareaholic’s traffic referral report is based on a wide category of websites, Chartbeat’s data is specifically attached to publishers’ web traffic.

As mentioned earlier, Chartbeat saw a 25 percent surge in traffic to publisher sites from Google search over the last year. Josh Schwartz, Chartbeat’s chief of product, engineering and data, told Digiday that Facebook referrals to publishers were down 15 percent in 2017, aligning with Shareaholic’s findings.

Facebook’s news feed algorithm tweaks to curb fake news and spam content are definitely impacting its overall referral traffic numbers, but Chartbeat reports the most significant factor driving traffic to its clients’ sites is AMP content. After investigating whether the rise in traffic was the result of a bug or an “un-darkening” of previously dark social traffic, and finding nothing, Chartbeat turned its attention to mobile versus desktop traffic numbers.

“We then looked specifically at search traffic by device and the answer was clear from our dataset. Mobile Google Search referrals were up significantly while Desktop Google Search referrals were flat,” writes Saroff.

Chartbeat then dug further into its data to evaluate sites using AMP and said it found a “stark” difference between the sites using AMP and those that were not.

“While Mobile Google Search traffic to our AMP-enabled publishers is up 100 percent over the same time-frame, traffic to publishers not using AMP is flat.”

Chartbeat says that, over the last six months, Google Mobile Search referrals have come to outpace both mobile and desktop Facebook referrals.

Source: Search outpaced social for referral traffic last year, driving 35% of site visits vs social’s 26% share of visits – Search Engine Land

26-Jul-2017
Posted By: DPadmin
149 Views

SEO: 7 Reasons to Use a Site Crawler 

No matter how well you think you know your site, a crawler will always turn up something new. In some cases, it’s those things that you don’t know about that can sink your SEO ship.

Search engines use highly developed bots to crawl the web looking for content to index. If a search engine’s crawlers can’t find the content on your site, it won’t rank or drive natural search traffic. Even if it’s findable, if the content on your site isn’t sending the appropriate relevance signals, it still won’t rank or drive natural search traffic.

Since they mimic the actions of more sophisticated search engine crawlers, third-party crawlers, such as DeepCrawl and Screaming Frog’s SEO Spider, can uncover a wide variety of technical and content issues to improve natural search performance.

7 Reasons to Use a Site Crawler

What’s out there? Owners and managers think of their websites as the pieces that customers will (hopefully) see. But search engines find and remember all the obsolete and orphaned areas of sites, as well. A crawler can help catalog the outdated content so that you can determine what to do next. Maybe some of it is still useful if it’s refreshed. Maybe some of it can be 301 redirected so that its link authority can strengthen other areas of the site.

How is this page performing? Some crawlers can pull analytics data in from Google Search Console and Google Analytics. They make it easy to view correlations between the performance of individual pages and the data found on the page itself.

Not enough indexation or way too much? By omission, crawlers can identify what’s potentially not accessible to bots. If your crawl report has holes where you know sections of your site should be, can bots access that content? If not, there might be a problem with disallows, noindex commands, or the way the content is coded that is keeping bots out.

Alternately, a crawler can show you when you have duplicate content. When you’re sifting through the URLs listed, look for telltale signs, such as redundant product ID numbers or duplicate title tags, that the content might be the same across two or more pages.

Keep in mind that the ability to crawl does not equate to indexation, merely the ability to be indexed.
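To make the accessibility question concrete, here is a minimal Python sketch that checks whether a handful of URLs are blocked by a site’s robots.txt — one of the first things to rule out when a section is missing from your crawl report. The site and paths are placeholders, not anything named in the article.

```python
from urllib import robotparser

# Hypothetical site and pages -- substitute your own.
SITE = "https://www.example.com"
PAGES = ["/", "/products/widget-123", "/old-catalog/"]

parser = robotparser.RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # fetch and parse the live robots.txt file

for path in PAGES:
    allowed = parser.can_fetch("*", SITE + path)
    print(f"{path}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```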

What’s that error, and why is that redirecting? Crawlers make finding and reviewing technical issues much faster. A quick crawl of the site automatically returns a server header status code for every page encountered. Simply filter for the 404s and you have a list of errors to track down. Need to test those redirects that just went live? Switch to list mode and specify the old URLs to crawl. Your crawler will tell you which are redirecting and where they’re sending visitors now.
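As a rough illustration of that list-mode check, the sketch below fetches a few old URLs and reports each one’s redirect chain and final destination. It assumes the third-party requests library, and the URLs are placeholders.

```python
import requests

# Hypothetical old URLs whose redirects just went live.
OLD_URLS = [
    "https://www.example.com/old-page",
    "https://www.example.com/retired-category/",
]

for url in OLD_URLS:
    # allow_redirects follows the chain; response.history records each hop.
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history)
    print(url)
    print("  chain:", hops or "no redirect")
    print("  final:", resp.status_code, resp.url)
```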

Is the metadata complete? Without a crawler, it’s too difficult to identify existing metadata and create a plan to optimize it on a larger scale. Use it to quickly gather data about title tags, meta descriptions, meta keywords, heading tags (H1, H2, etc.), language tags, and more.
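Under the hood, that report is just HTML parsing. Here is a minimal single-page sketch, assuming the requests and BeautifulSoup (bs4) libraries and a placeholder URL, that pulls the title tag, meta description and headings:

```python
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-page"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else None
desc = soup.find("meta", attrs={"name": "description"})
description = desc["content"].strip() if desc and desc.has_attr("content") else None
headings = [h.get_text(strip=True) for h in soup.find_all(["h1", "h2", "h3"])]

print("Title:", title)
print("Meta description:", description)
print("Headings:", headings)
```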

Does the site send mixed signals? When not structured correctly, data on individual pages can tie bots into knots. Canonical tags and robots directives, in combination with redirects and disallows affecting the same pages, can send confusing signals to search engines that can mess up your indexation and ability to perform in natural search.

If a key page suddenly has a performance problem, check for a noindex directive and also confirm which page the canonical tag specifies. Does it contradict a redirect sending traffic to the page, or a disallow in the robots.txt file? You never know when something could change accidentally as a result of some other release that developers pushed out.
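A quick way to spot-check a key page for the conflicts described above is to read its meta robots and canonical tags side by side. This is only a minimal sketch, assuming requests and BeautifulSoup and a placeholder URL; a full audit would also compare the live redirects and robots.txt rules.

```python
import requests
from bs4 import BeautifulSoup

page_url = "https://www.example.com/key-landing-page"  # placeholder

soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")

# Meta robots directive, e.g. "noindex, nofollow".
robots_meta = soup.find("meta", attrs={"name": "robots"})
robots_value = robots_meta["content"].lower() if robots_meta and robots_meta.has_attr("content") else ""

# Canonical link element, if present.
canonical = soup.find("link", rel="canonical")
canonical_href = canonical["href"] if canonical and canonical.has_attr("href") else None

if "noindex" in robots_value:
    print("WARNING: page carries a noindex directive")
if canonical_href and canonical_href.rstrip("/") != page_url.rstrip("/"):
    print("WARNING: canonical points elsewhere:", canonical_href)
```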

Is the text correct? Some crawlers also allow you to search for custom bits of text on a page. Maybe your company is rebranding and you want to be sure that you find every instance of the old brand on the site. Or maybe you recently updated schema on a page template and you want to be sure that it’s found on certain pages. If it’s something that involves searching for and reporting on a piece of text within the source code of a group of web pages, your crawler can help.
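The rebranding audit described here is easy to sketch yourself: fetch each page and search its source for the old brand string. The brand name and URLs below are placeholders, and the requests library is assumed.

```python
import requests

OLD_BRAND = "Acme Widgets"  # hypothetical old brand name
URLS = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/services/widgets",
]

for url in URLS:
    html = requests.get(url, timeout=10).text
    if OLD_BRAND.lower() in html.lower():
        print("Old brand still present on:", url)
```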

Plan Crawl Times

It’s important to remember, however, that third-party crawlers can put a heavy burden on your servers. By default, they tend to crawl too quickly, and the rapid-fire requests can stress your servers if they’re already handling a high customer volume. Your development team may even have blocked your crawler in the past, suspecting it was a spammer’s scraper.
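If you do script your own checks, a delay between requests (and an honest User-Agent the development team can recognise in their logs) keeps the load manageable. A minimal sketch with placeholder values:

```python
import time
import requests

DELAY_SECONDS = 2  # hypothetical pace agreed with the development team
HEADERS = {"User-Agent": "marketing-audit-bot (contact: seo@example.com)"}
URLS = ["https://www.example.com/page-1", "https://www.example.com/page-2"]

for url in URLS:
    resp = requests.get(url, headers=HEADERS, timeout=10)
    print(resp.status_code, url)
    time.sleep(DELAY_SECONDS)  # throttle so the crawl doesn't stress the servers
```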

Talk to your developers to explain what you need to accomplish and ask for the best time to do it. They almost certainly have a crawler that they use — they may even be able to give you access to their software license. Or they may volunteer to do the crawl for you and send you the file. At the least, they’ll want to advise you as to the best times of day to crawl and the frequency at which to set the bot’s requests. It’s a small courtesy that helps build respect.

Source: SEO: 7 Reasons to Use a Site Crawler | Practical Ecommerce

29-Jun-2017
Posted By: DPadmin
138 Views

SEO Best Practices For Every Page On Your Site

 

In terms of getting the best search rankings, you can broadly break your SEO efforts into two areas: site-wide optimisation and optimising individual pages. Today we’re going to focus on the second of these two subjects, looking at how to maximise the search ranking of every page you publish.

By following the steps in this guide, the individual pages on your site will earn more exposure, generate a higher volume of leads and contribute to better rankings across the rest of your site.

The challenge of creating ‘quality’ content

The phrase “quality content” is used so much these days that it’s lost all meaning. So, to be clear, for your content to be considered quality by search engines and people it must be two things: valuable and discoverable.

Valuable content provides information people actually need and discoverable content is easily accessible when people need it most. Hitting this sweet spot of providing the right information at the crucial moment is a real challenge but one we need to overcome in the age of micro-moment marketing.


Think with Google: micro moments best practices
Source: Think with Google

The key is to understand the consumer journey of your target audiences and the role each of your pages plays along the way. This tells you the kind of information users need from each page and the kind of conversion goal you should be targeting.

10 steps to follow

Your next big challenge is creating unique content on every page you publish, which can be particularly difficult for services pages. When you have five, ten or any number of services to promote, how do you make every page unique and valuable?

Follow these SEO best-practice steps to get started:

  1. Introduce the service
  2. Differentiate from similar services (eg: SEO vs PPC)
  3. Make the unique benefits and selling points of each service clear
  4. Identify questions users will have and provide answers
  5. Explain which kind of clients use this service and what you’ve done for them
  6. Consider testimonials, case studies and social proof specific to this service
  7. Use visual content to reinforce your message
  8. Have a prominent, compelling call-to-action
  9. Provide access to further information for users who aren’t ready to commit yet
  10. Direct users to another service page if this isn’t the one that meets their needs

Try to be as specific as possible with each of your service pages, otherwise you’ll find they all end up being very similar. You need to make it perfectly clear why this is the service your visitors need and, if it isn’t, make it obvious where they should go next.

Multimedia ranking factors

It’s widely accepted that Google and other search engines take multimedia content into consideration when ranking pages. Humans are visually stimulated creatures and search engines know images, video and other visuals are the perfect way to spice up a page full of text.

Strong visual content is also more engaging than text, which can improve indirect ranking signals like bounce rate, time on page, number of pages visited, etc.


Top Priorities for B2C Content Creators

Source: Content Marketing Institute

So visuals are important to people and search engines alike, but the same old issue of quality/value comes into play. A bunch of naff stock images isn’t going to engage people or improve those indirect ranking signals.

Aside from this, you also need to optimise your visual content so search engines can recognise it, and to reduce the negative impact on performance. This starts with using the right format for images, so make sure you understand the difference between JPEG, PNG and other image file types.

Hopefully, you’re well aware by now that Flash is a no-no and HTML5 video is the way to go. Here are some other things to consider:

  • Relevance is still important for videos
  • Engagement metrics like views, comments, shares, etc. have an impact
  • Metadata tells search engines what your video is about
  • Keywords are believed to also have an impact

With video content there’s always the question of hosting the video on your site or embedding it via YouTube. While embedding YouTube videos can boost engagement metrics (views, shares, etc.), you could be taking ranking points away from your page by hosting your video elsewhere. So, in the case of service pages, it’s probably best to create highly specific videos and host them on that service page only. This way all the SEO points go to that page and nowhere else.

In terms of performance, speed is your biggest enemy with visual content. Optimise your images and videos to reduce file sizes as much as you can without hurting quality too much. Also think about content delivery networks (CDNs) and web caching, and optimise your code for the best possible speed.

Also, don’t underestimate the importance of your hosting provider/package when it comes to speed and performance.
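As a rough illustration of trimming image file sizes without hurting quality too much, here is a minimal sketch using the Pillow imaging library (a common choice, not one the article names), with placeholder file names and settings worth tuning per image:

```python
from PIL import Image

# Placeholder file names -- adjust per image.
src, dst = "hero-original.jpg", "hero-optimised.jpg"

img = Image.open(src)
# Downscale anything wider or taller than 1600px, preserving the aspect ratio.
img.thumbnail((1600, 1600))
# Re-encode as a progressive JPEG at a moderate quality setting.
img.save(dst, "JPEG", quality=80, optimize=True, progressive=True)
```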

Make your visual content discoverable

As mentioned earlier, even the best content is useless until search engines and people are able to find it at the key moment. This is more challenging with visual content because search engines can’t watch videos or see infographics, which means you need to give them a helping hand.

  • Avoid loading content with AJAX (Google still has trouble crawling this)
  • Write descriptive, keyword-relevant descriptions for your images and videos
  • Optimise your titles and meta descriptions where possible (not every image can have a title, of course)
  • Consider transcriptions for your video content
  • Use descriptive captions
  • Avoid infographics with no accompanying written content (the same principle as transcriptions)

The key is to provide context with your visuals so search engines can understand the purpose they serve to users.
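One quick audit of that context is scanning a page for images with missing or empty alt text. A minimal sketch, assuming requests and BeautifulSoup and a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/portfolio"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print("Missing alt text:", img.get("src"))
```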

Write for users, optimise for search engines

We’ve already spoken about creating content that meets user needs, answers their questions and provides value. This is your priority for every page you publish. Write for users first and then optimise for search engines – once again, to make your pages discoverable and prove their relevance.

Here are the SEO essentials for on-page optimisation:

  • Descriptive titles in H1 tags, including your target keyword
  • Descriptive page URL with keyword included
  • Correct formatting with subheadings (in H2, H3 tags, etc.) including keywords if they’re relevant/useful
  • Meta data, Schema and rich snippets where relevant
  • Inbound and outbound links to/from other relevant pages on your site (internal linking)
  • Optimised visuals for performance and discoverability
  • Mobile optimisation
  • Fast loading times

There are a few things on that list that we haven’t covered in-depth yet so let’s go into some more detail about meta data, URLs and the remaining on-page essentials.

Writing effective meta data

Meta data is a subject that causes a lot of confusion because it has little to no impact on how search engines rank your pages. However, users still see much of this information on results pages, meaning it has a direct impact on how many people click through to your site.

Optimising your title tags

The title tag determines what users see as the blue headline text of your search results. Here’s an example of what this looks like in a listing for Search Engine Watch:


Google search result with the title tag highlighted

For this page, the HTML title will look like this:

<title>Title Tags Guide | Good & Bad Examples | Search Engine Watch</title>

This is a common formula for optimising title tags: Keyword #1 | Keyword #2 | Brand name. However, this approach is outdated now because it doesn’t provide the most descriptive title for users trying to find the most relevant result to their query.

Here are some best practices to keep in mind:

  • Be descriptive: Your priority with title tags is to accurately describe the content users will see on the other side. You want the highest number of clicks vs the lowest possible bounce rate – and this means compelling but accurate title tags.
  • Aim for queries, not keywords: Placing keywords in your title tags won’t help you rank higher but matching a user’s search query will tell them your page has what they’re looking for.
  • Include your brand name: Users are more likely to click results from brands they recognise so it’s still good practice to include your brand name in title tags.
  • Be mindful of length: Search engines tend to give you 50-60 characters (or 512 pixels more specifically) and everything after this will be cut off. Ideally, you want your full title to be visible but don’t obsess over this. Be mindful of length but focus on creating titles that will generate the most clicks.

Meta descriptions

Once again, meta descriptions have no impact on where you rank but they give users vital information about what your page contains. Much like your page titles, these only appear in search results, not your actual pages. Their role is simply to give users more information about what they can expect to gain from clicking on your listing.


Google search result with the meta description highlighted

In the listing above, Search Engine Watch aims to get people clicking by matching the questions they have in mind within their meta description. It may not be the most readable of descriptions, but it provides a lot of information about what users can expect to find on the page. They’ve also squeezed a number of potential queries into that description, which will show up in bold when users search for them.

This approach won’t be ideal for all meta descriptions but it’s a good example of the things you need to consider when creating your own:

  • Be descriptive
  • Include search queries
  • Make it readable
  • Get users excited about clicking through
  • Focus on the value your page has to offer
  • Aim for a maximum of 150-160 characters

Think of meta descriptions as a mini sales pitch for why people should click through to your site. Every page you create should have a clear, concise goal, and this is where you get to put that message across to searchers in a short sentence or two.
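Pulling together the length guidance from the title and meta description sections, here is a minimal pre-publish check. The character limits follow the rough figures above, and the sample strings are placeholders:

```python
TITLE_LIMIT = 60         # rough visible length for title tags
DESCRIPTION_LIMIT = 160  # rough visible length for meta descriptions

def check_snippet(title, description):
    """Return warnings for titles or descriptions likely to be truncated."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars; may be cut off after ~{TITLE_LIMIT}")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"Description is {len(description)} chars; may be cut off after ~{DESCRIPTION_LIMIT}")
    return warnings

# Placeholder example values.
print(check_snippet(
    "Title Tags Guide | Good & Bad Examples | Search Engine Watch",
    "Learn what title tags are, why they matter and how to write them well.",
))
```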

Create amazing URLs

The final key element in our trio of meta data essentials is your page URLs. The reason URLs were created in the first place was to provide users with a descriptive version of web addresses – otherwise we’d be typing in a bunch of IP addresses to access everything online.


Google search result with the URL highlighted

This is important because it basically tells you everything you need to know about URLs. Like the rest of your meta data, they should be descriptive for users, and this is something many brands have forgotten over the years.

Generally speaking, the shorter and more descriptive your URLs are, the better experience they provide for users. Here are some things to consider:

  • Cut out unnecessary words: Stay true to your page titles and/or headings with URLs but feel free to cut out unnecessary words.
  • Forget punctuation: There’s no place for question marks, commas or any other punctuation in URLs.
  • Stop words can be ok: Stop words (the, and, or, when, how, etc.) are generally considered unnecessary but it’s fine to use them if you think they make your URL more meaningful.
  • Use hyphens: Separate words in your URLs with hyphens (“-”) as these are considered more readable. Avoid underscores (“_”), spaces and any other special characters to separate words.
  • Target search queries: This one keeps coming up with every piece of meta data we look at – and for good reason.
  • Avoid dynamic parameters: These make URLs incredibly long and unreadable.

That last point is a tricky one, because many brands want to use dynamic parameters to track user journeys across their websites. The problem is that they make a real mess of URLs, and search engine results pages aren’t the only place this causes problems. Users are also left with a mess when they try to bookmark your page or remember the URL of your site or a specific page.
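As a rough illustration of those guidelines, the sketch below turns a page title into a short, hyphenated URL slug, optionally dropping common stop words. The stop-word list and example title are placeholders:

```python
import re

STOP_WORDS = {"the", "and", "or", "a", "an", "of", "to", "in", "for"}  # illustrative list

def make_slug(title, drop_stop_words=True):
    """Lowercase, strip punctuation, and hyphenate a title for use in a URL."""
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    if drop_stop_words:
        words = [w for w in words if w not in STOP_WORDS] or words
    return "-".join(words)

print(make_slug("SEO Best Practices For Every Page On Your Site"))
# -> seo-best-practices-every-page-on-your-site
```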

Bringing it all together

A few years ago, the idea that content marketing was the new SEO became popular in the industry. This was largely due to Google’s Hummingbird update, which put less emphasis on keywords and more on matching context between search queries and content. And, while it’s true that content is the most important part of your SEO strategy, ignoring the more technical side of optimising your pages is a mistake, especially with loading times and other performance factors becoming increasingly influential in search rankings.

As businesses invest more time and money in creating content, it would be a shame if your efforts fell short because your pages aren’t as discoverable as they could be. So pay attention to the smaller aspects of on-page optimisation and give your content the best opportunity to make things happen.

Source: SEO Best Practices For Every Page On Your Site