25 Jan 2018
Posted By: DPadmin
29 Views

Has AI changed the SEO industry for better or worse?

Columnist Jayson DeMers explores the impact of Google’s shift toward machine learning and discusses what the future will look like for search professionals.

With Google turning to artificial intelligence to power its flagship search engine business, has the SEO industry been left in the dust? The old ways of testing and measuring are becoming antiquated, and industry insiders are scrambling to understand something new — something which is more advanced than their backgrounds typically permit.

The fact is, even Google engineers are having a hard time explaining how Google works anymore. With this in mind, is artificial intelligence changing the SEO industry for better or worse? And has Google’s once-understood algorithm become a “runaway algorithm”?

Who was in the driver’s seat?

The old days of Google were much simpler times. Artificial intelligence may have existed back then, but it was used for very narrow issues, like spam filters on Gmail. Google engineers spent most of their time writing preprogrammed “rules” that worked to continuously close the loopholes in their search engine — loopholes that let brands, with the help of SEO professionals, take advantage of a static set of rules that could be identified and then exploited.

However, this struck at the heart of Google’s primary business model: the pay-per-click (PPC) ad business. The easier it was to rank “organically” (in Google’s natural, unpaid rankings), the fewer paid ads were sold. These two distinctly different parts of their search engine have been, and will always be, at odds with one another.

If you doubt that Google sees its primary business as selling ads on its search engine, you haven’t been watching Google over the past two decades. In fact, almost 20 years after it started, Google’s primary business was still PPC. In 2016, PPC revenues still represented 89 percent of its total revenues.

At first glance, it would stand to reason that Google should do everything it can to make its search results both user-friendly and maintainable. I want to focus on this last part: having a code base that is well documented enough (at least internally within Google) that it can be explained to the public as a textbook of how websites should be structured and how professionals should interact with its search engine.

Going up the hill

Throughout the better part of Google’s history, the company has made efforts to ensure that brands and webmasters understood what was expected of them. In fact, they even had a liaison to the search engine optimization (SEO) world, and his name was Matt Cutts, the head of Google’s Webspam Team.

Cutts would go around the SEO conference circuit and often be the keynote or featured session speaker. Any time Google was changing its algorithms or pushing a new update to its search engine, Cutts would be there to explain what that meant for webmasters.

It was quite the spectacle. In one room, you typically had hundreds of SEOs who were attacking every loophole they could find, every slim advantage they could get their hands on. In the very same room, you had Cutts explaining why those techniques were not going to work in the future and what Google actually recommended.

As time went on and loopholes were closed, Cutts became one of the only sources of hope for SEOs. Google was becoming more sophisticated than ever, and with very few loopholes left to exploit, Cutts’s speaking engagements became crucial for SEOs to review and dissect.

The ‘uh-oh’ moment

And then, the faucet of information slowed to a trickle. Cutts’s speaking engagements became rarer, and his guidelines became more generic. Finally, in 2014, Cutts took a leave from Google. This was a shock to insiders who had built an entire revenue model off of selling access to this information.

Then, the worst news for SEOs: He was being replaced by an unnamed Googler. Why unnamed? Because the role of spokesperson was being phased out. No longer would Google be explaining what brands should be doing with each new update of its search engine.

The more convoluted its search engine algorithms were, the more PPC ads Google sold. As a result of this shift, Google capitalized immensely on PPC ad revenue. It even created “Learn with Google,” a gleaming classroom where SEO conference attendees could learn how to maximize PPC spend.

An article by Search Engine Land columnist Kristine Schachinger about the lack of information on a major algorithmic update, and Google’s flippant response from interim spokesman Gary Illyes, summed up the SEO industry’s frustration in a nutshell. What was going on?

Removing the brakes — the switch to an AI-powered search engine

At the same time, Google was experimenting with new machine learning techniques to automate much of the updating process to its search engine. Google’s methodology has always been to automate as much of its technology as it could, and its core search engine was no different.

The pace of Google’s search engine switch to artificial intelligence caught many off-guard. This wasn’t like the 15 years of manual algorithm updates to its index. This felt like a tornado had swept in — and within a few years, it changed the landscape of SEO forever.

The rules were no longer in some blog or speech by Matt Cutts. Here stood a breathtaking question: Were the rules even written down at Google anymore?

Much of the search engine algorithms and their weightings were now controlled by a continuously updating machine-learning system that changed its weightings from one keyword to the next. Marcus Tober, CTO of Searchmetrics, said that “it’s very likely that even Google engineers don’t know the exact composition of their highly complex algorithm.”

The runaway algorithm

Remember Google’s primary revenue stream? PPC represents almost 90 percent of its business. Once you know that, the rest of the story makes sense.

Did Google know beforehand that the switch to an AI-powered search engine would lead to a system that couldn’t be directly explained? Was it a coincidence that Cutts left the spotlight in 2014, and that the position never really came back? Was it that Google didn’t want to explain things to brands anymore, or that they couldn’t?

By 2017, Google CEO Sundar Pichai began to comment publicly on Google’s foray into artificial intelligence. Bob Griffin, CEO of Ayasdi, wrote recently that Pichai made it clear that there should be no abdication of responsibility associated with intelligent technologies. In other words, there should be no excuse like “The machine did x.”

Griffin put it clearly:

Understanding what the machine is doing is paramount. Transparency is knowing what algorithm was used, which parameters were used in the algorithm and, even, why. Justification is an understanding of what it did, and why in a way that you can explain to a reporter, shareholder, congressional committee or regulator. The difference is material and goes beyond some vague promise of explainable AI.

But Google’s own search engineers were seemingly unable to explain how their own search engine worked anymore. This discrepancy had gotten so bad that in late 2017, Google hired longtime SEO journalist Danny Sullivan in an attempt to reestablish its image of transparency.

But why such a move away from transparency in the first place? Could it be that the move to artificial intelligence (something that went way over the heads of even the most experienced digital marketing executives) was the perfect cover? Was Google simply throwing its proverbial hands up in the air and saying, “It’s just too hard to explain”? Or was Google just caught up in the transition to AI, trying to find a way to explain things like Matt Cutts used to do?

Regardless of Sullivan’s hire, the true revenue drivers meant that this wasn’t a top priority. Google had solved some of the most challenging technical problems in history, and they could easily have attempted to define these new technical challenges for brands, but it simply wasn’t their focus.

And, not surprisingly, after a few years of silence, most of the old guard of SEO had accepted that the faucet of transparent communication from Google had been shut off, never to be reopened.

Everyone is an artificial intelligence expert

Most SEO experts’ backgrounds do not lend themselves very well to understanding this new type of Google search. Why? Most SEO professionals and digital marketing consultants have a marketing background, not a technical background.

When asked “How is AI changing Google?,” most answers from industry thought leaders have been generic: “AI really hasn’t changed much,” or “effective SEO still requires the same strategies you’ve pursued in the past.” In some cases, responses simply had nothing to do with AI in the first place.

Many SEO professionals, who know absolutely nothing about how AI works, have been quick to deflect any questions about it. And since very few in the industry had an AI background, the term “artificial intelligence” became something else entirely: just another marketing slogan, rather than an actual technology. And so some SEO and digital marketing companies even began billing themselves as the new “artificial intelligence” solution.

The runaway truck ramp?

As with all industries, whenever there’s a huge shift in technology, there tends to be a changing of the guard. A number of highly trained engineers who are beginning to make the SEO industry their home, and these more technologically savvy folks, are starting to speak out.

And, for every false claim of AI, there are new AI technologies that are starting to become mainstream. And these are not your typical SEO tools and rank trackers.

Competitive industries are now investing heavily in things like genetic algorithms, particle swarm optimization and new approaches that enable advanced SEO teams to model exactly what Google’s RankBrain is attempting to do in each search engine environment.

At the forefront of these technologies is industry veteran and Carnegie Mellon alumnus Scott Stouffer, founder and CTO of MarketBrew.com, who chose to create and patent a statistical search engine modeling tool, based on AI technologies, rather than pursuing a position at Google.

Now, 11 years into building his company, Stouffer has said:

There are a number of reasons why search engine modeling technology, after all these years, is just now becoming so sought-after. For one, Google is now constantly changing its algorithms, from one search query to the next. It doesn’t take a rocket scientist to know that this doesn’t bode well for SEO tools that run off of a static set of pre-programmed rules.

On the flipside, these new search engine models can actually be used to identify what the changes are statistically, to learn the behavior and characteristics of each search engine environment. The models can then be used to review why your rankings shifted: was it on-page, off-page, or a mixture of both? Make an optimization on your site, and rerun the model. You can instantly see if that change will statistically be a positive or negative move.

I asked Stouffer to give me a concrete example. Let’s say you see a major shift in rankings for a particular search result. These search engine modeling tools start with what Stouffer calls a “standard model.” (Think of this as a generic search engine that has been regression-tested to be a “best fit,” with adjustable weightings for each algorithmic family.) This standard model is then run through a process called Particle Swarm Optimization, which locates a stable mixture of algorithmic weightings that will produce similar search results to the real thing.

Here’s the catch: If you do this before and after each algorithmic shift, you can measure the settings on the models between the two. Stouffer says the SEO teams that invest in Market Brew technology do this to determine what Google has done with its algorithm: for instance, did it put more emphasis on the title tags, backlinks, structured data and so on?
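To make that process concrete, here is a toy sketch of the general idea in Python. To be clear, this is not Market Brew’s patented system: the pages, feature scores, observed ranking and PSO constants below are all invented for illustration. The swarm searches for a weighting of algorithmic “families” under which a simple weighted-sum model reproduces an observed ranking:

```python
import random

# Hypothetical feature scores per page for one query:
# [title relevance, backlink strength, structured data]
PAGES = {
    "page_a": [0.9, 0.4, 0.7],
    "page_b": [0.5, 0.9, 0.2],
    "page_c": [0.3, 0.6, 0.9],
}
OBSERVED = ["page_b", "page_a", "page_c"]  # ranking seen in the live SERP

def model_ranking(weights):
    """Rank pages by the weighted sum of their feature scores."""
    return sorted(PAGES, key=lambda p: sum(w * f for w, f in zip(weights, PAGES[p])), reverse=True)

def disagreement(weights):
    """Count page pairs ordered differently than in the observed ranking."""
    pos = {p: i for i, p in enumerate(model_ranking(weights))}
    obs = {p: i for i, p in enumerate(OBSERVED)}
    pairs = [(a, b) for a in PAGES for b in PAGES if obs[a] < obs[b]]
    return sum(1 for a, b in pairs if pos[a] > pos[b])

def pso(fitness, dim=3, n_particles=20, iters=200):
    """Minimal particle swarm: each particle is a candidate weight vector."""
    X = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest, pbest_f = [x[:] for x in X], [fitness(x) for x in X]
    g_i = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g_i][:], pbest_f[g_i]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (0.7 * V[i][d]
                           + 1.4 * r1 * (pbest[i][d] - X[i][d])
                           + 1.4 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(1.0, max(0.0, X[i][d] + V[i][d]))
            f = fitness(X[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = X[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = X[i][:], f
    return gbest

weights = pso(disagreement)
print("fitted weights (title, backlinks, structured data):", weights)
```

Run the fit once on rankings captured before a suspected update and once after; the movement in the fitted weights hints at which family gained or lost emphasis.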

Suffice it to say, there are some really smart people in this industry who are quickly returning the runaway algorithm back to the road.

Chris Dreyer of Rankings.io put it best:

I envision SEO becoming far more technical than it is today. If you think about it, in the beginning, it was super easy to rank well in search. The tactics were extremely straightforward (i.e., keywords in a meta tag, any link placed anywhere from any other website helped, etc.). Fast forward just a decade and SEO has already become much more advanced because search algorithms have become more advanced. As search engines move closer to the realistic human analysis of websites (and beyond), SEOs will have to adapt. We will have to understand how AI works in order to optimize sites to rank well.

As far as Google goes, the hiring of Sullivan should be a very interesting twist to follow. Will Google try to reconcile the highly technical nature of its new AI-based search engine, or will it be more of the same: generic information intended to keep these new technologists at bay, and Google’s top revenue source safe?

Can these new search engine modeling technologies usher in a new understanding of Google? Will the old guard of SEO embrace these new technologies, or is there a seismic shift underway, led by engineers and data scientists, not marketers?

The next decade will certainly be an interesting one for SEO.

Source: Has AI changed the SEO industry for better or worse?

11 Aug 2017
Posted By: DPadmin
124 Views

How to Target Different Cities Without Hurting Your SEO

Done right, city pages can be an integral part of a local SEO strategy. Done poorly, they can get you penalized, or worse. Here’s the right way to do it.

City Pages: Good or Bad for SEO?

A few years ago, Google became critical of pages that do not add value for the visitor. It initially rolled this out algorithmically as the Panda algorithm, which penalized sites for using what Google considered poor content techniques. This was initially targeted at doorway pages, article spinning, and various other nefarious methods.

However, a more common (and generally more legitimate) type of content was caught in this algorithm as well: the city page. The Panda algorithm worked so well that Google integrated it into their main algorithm, and it now evaluates sites in real time.

For nearly a decade, local business owners have created pages around individual cities that they service, in hopes of catching someone looking for items in a particular town, borough, or neighborhood.

So, on the surface, creating city pages isn’t necessarily a bad thing. The tricky part is making sure that each page provides some unique value to the visitor.

And you have to be honest with yourself as to whether you’re providing value. The search engines know if all you change is one or two words here and there. They still consider that duplicate content.

If the content is almost completely identical except for unusual terms like “arms-on” instead of “hands-on” and it’s pretty clear that it was written by a machine, Google will flag it. Pages like that are what will trigger the Panda part of the algorithm.
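If you want a rough sense of whether your own city pages would trip that wire, a quick similarity check is easy to script. Here is a minimal sketch in Python; the page copy is invented, and the 0.75 threshold is an arbitrary starting point, not anything Google has published:

```python
import difflib

def near_duplicate(text_a, text_b, threshold=0.75):
    """Compare two pages word by word; a high ratio suggests spun content."""
    ratio = difflib.SequenceMatcher(None, text_a.split(), text_b.split()).ratio()
    return ratio >= threshold, ratio

atlanta = "Our hands-on plumbing team serves Atlanta homes with care."
marietta = "Our arms-on plumbing team serves Marietta homes with care."
dup, score = near_duplicate(atlanta, marietta)
print(f"similarity={score:.2f}, near-duplicate={dup}")  # ~0.78, flagged
```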

Make Your Most Important City Pages Unique

My recommendation is to focus only on those pages that are most important to you – don’t make a page for every small town in the North Georgia mountains. If you want to list out all the cities in a region, just list them on the page – you don’t need an individual page for each city to rank in most cases.

In terms of making the pages different, write original content for each area or city. Focus on what makes that city unique or different, but give it value beyond what you can get in a census listing.

Too often, I see city pages that have just restated population data that they could have gotten anywhere else.

  • If you offer services in real estate, talk about the way the community in that area differs.
  • If you offer plumbing, mention that a common problem in that area is hard water.
  • If you’re a florist, talk about the climate, or how you source or grow plants in the area.

Find ways to make it different and unique so that you’ll actually add some information for the potential customer. In addition to making Google like the page better, it also gives the consumer confidence that you know the area and can really address their needs.

And finally, these listed locations should appear on all versions of the website – a common problem is that site owners will only have these links appear on a desktop site, but if you access the site on a tablet or mobile device, the links go away.

Summary

Because Google will be moving to a mobile-first algorithm, not having those links on the mobile site may cause those pages to drop out of the index. At worst, it could make it look like you’re trying to hide these pages; that they don’t add value.

Done right, city pages can be an integral part of a local SEO strategy.

Done poorly, city pages can get you penalized – or worse.

Source: How to Target Different Cities Without Hurting Your SEO

18 Feb 2016
Posted By: Guardian Owl
244 Views

SEO is a Long-Term Investment – Marketers Feel Pressure

Even though SEO is a long-term investment, marketers often feel pressured to show progress quickly. Columnist Dan Bagby of Search Engine Land provides some ideas for quick wins that can show value while waiting for your longer-term initiatives to start gaining traction.

When you start at a new company as the SEO specialist or pick up a new client, one thing everyone wants is to see quick results. The fact that SEO takes time can be a struggle as you try to show value while also satisfying your own desire to make an impact.

Here are a few SEO techniques that will let your colleagues or clients know you are the real deal, bringing value with your expertise.

1. Win With Featured Snippets

Winning a featured snippet spot can have a huge impact, bringing organic traffic to a page. Although getting featured in the quick answer box is not guaranteed, there is a pretty simple formula for optimizing your content for it.

Start by going to Google Search Console to find rankings for queries that contain a question — you can do this by filtering for queries containing “how,” “what” or “why.”

Once you have a list of keyword phrases, check search volume and prioritize your list, focusing on the keywords with the highest search volume. If you do not currently rank for any question-related keywords, think of a simple question you can answer, and create the content to answer that question.
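If you would rather script this step, here is a sketch against the Search Console API (the “webmasters” v3 service). The site URL and service-account file are placeholders you would swap for your own:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file; the scope grants read-only Search Console access.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"])
service = build("webmasters", "v3", credentials=creds)

def question_queries(site_url, word):
    """Fetch Search Analytics rows whose query contains a question word."""
    body = {
        "startDate": "2016-01-01", "endDate": "2016-01-31",
        "dimensions": ["query"],
        "dimensionFilterGroups": [{"filters": [
            {"dimension": "query", "operator": "contains", "expression": word}]}],
        "rowLimit": 1000,
    }
    resp = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return resp.get("rows", [])

rows = []
for w in ("how", "what", "why"):
    rows += question_queries("https://www.example.com/", w)
# Impressions stand in for search volume when prioritizing the list.
for row in sorted(rows, key=lambda r: r["impressions"], reverse=True)[:20]:
    print(row["keys"][0], row["impressions"], round(row["position"], 1))
```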

Increase your chances of being featured in the quick answer box by making on-page improvements:

  1. Provide a detailed answer in a bulleted or numbered format that specifically answers the question posed by the search query.
  2. Add a video to the page that answers the question (with transcription).
  3. Add additional information that adds more value to the page for the reader.

Once your content has been revised, submit it to be indexed, and share it on Google Plus, so that the changes are noticed quickly. To learn more about optimizing for featured snippets, check out this article by Eric Enge.

2. Optimize Existing Content

It is much easier to improve a strong existing page’s ranking a few spots in the SERPs than it is to get a new (or poorly ranking) page to show quick results.

Knowing that you see the biggest bumps in traffic when you get into the top three results, target content ranked in position 3 to 10. Improving bounce rates or building on pages that are converting can also be a great way to see big gains from a small time investment.

There are several ways to identify which pages to focus on:

  1. Going back to Search Console, sort keywords by rank to find keywords ranked between 3 and 10 (see the sketch after this list).
  2. Looking in Google Analytics, find pages with a high bounce rate but decent traffic.
  3. Also in Google Analytics, find pages with high conversion rates. Check what keywords are driving traffic through Search Console, and focus on optimizing for those keywords.
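As promised in step 1 above, here is a small sketch of the filtering, assuming you have exported your Search Console query data to a CSV with query, page, impressions and position columns (the file name is a placeholder):

```python
import pandas as pd

# Hypothetical export; column names follow a typical Search Console CSV.
df = pd.read_csv("search_console_export.csv")
striking = df[df["position"].between(3, 10)].sort_values("impressions", ascending=False)
print(striking[["query", "page", "position", "impressions"]].head(20))
```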

What can you do to improve these pages and see results quickly? Here are some ideas:

On-Page Optimization

  • Modify the basic on-page ranking factors to improve search engine optimization.
  • Find internal pages that are related to your target pages, and create new internal links from the related pages to the target pages.
  • Share on Google Plus and submit to Google to be crawled.

Crowdsource Content For Quick Links

One way to quickly improve a page’s content (and possibly gain links) is by reaching out to influencers. Keep it simple by asking influencers to contribute to a page you are trying to improve.

For example, if you have a page you wrote about the best places to eat in Austin, you could reach out to food bloggers in Austin and ask them for their opinion on the best new restaurants.

Even more effective is to ask them if they have a blog post about those specific restaurants that you can link to. They will gladly give you content to link to while you get more content to add to your page.

Once the updates are made, let the influencers know by email and via Twitter. This can result in additional social shares and possibly links for the influencers. You can also use this technique when you are creating a brand-new page.

Optimizing For Search Intent

I often find pages ranking well for queries that do not fit the page. For example, I might see an article ranking for queries related to “finding influencers” that is really more focused on how to reach out to influencers. Fixing this will likely improve rank and lower bounce rate.

  1. If the page does not rank for other keywords, and the keywords currently driving traffic are strategic for your site, rewrite the article completely focusing on those keywords.
  2. If you want to maintain the article, you can add a section to better answer the query that it already ranks for.
  3. If the information that would match the search intent does not belong on the page, write a new page that answers the questions, and link to it from the ranking post with keyword-rich anchor text.

3. Improve Rank For Converting Pages

Look at Google Analytics to find the pages that are converting. Use Search Console to find the keywords driving traffic to that page. You can also look at paid campaigns to see top-converting keywords.

Focus on these keywords and pages to see quick results and really prove the value in SEO.

4. Find Competing Content

Check your site for several pieces of content on the same topic and combine the pages. Make sure to redirect the retired URLs so that only one page remains.

5. Fix 404 Errors

There are several tools that make it easy to find 404 errors. You can fix links by reaching out to site owners that have the broken links or redirect the broken URL to a live page.
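A dedicated crawler or audit tool will do this at scale, but the core check is simple enough to sketch; the URLs below are placeholders:

```python
import requests

def find_dead_links(urls):
    """Return URLs that 404 or fail entirely: candidates for fixes or redirects."""
    dead = []
    for url in urls:
        try:
            resp = requests.head(url, allow_redirects=True, timeout=10)
            if resp.status_code == 404:
                dead.append(url)
        except requests.RequestException:
            dead.append(url)  # unreachable counts as broken, too
    return dead

print(find_dead_links(["https://www.example.com/old-page",
                       "https://www.example.com/"]))
```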

Final Thoughts

While SEO is a long-term investment and can take time to show results, there are always a few things you can do to show quick value. I have included only a few opportunities here, but there are many other techniques, like using other sites to get content ranking quickly. What are some of your techniques to get quick results?


Source: Quick Wins To Beat The SEO Waiting Game

16 Feb 2016
Posted By: Guardian Owl
299 Views

Special Edition Batman v Superman Galaxy S7 Edge Coming? | Digital Trends

 

The Dark Knight and the Man of Steel will finally do battle on March 25, with Samsung possibly cooking up a special-edition Galaxy S7 Edge to commemorate the occasion, reports South Korean outlet Naver.

According to the publication, Samsung will launch a Batman v Superman edition of the Galaxy S7 Edge, which will reportedly be decked out with a wallpaper, ringtone, and design based on the movie. However, Samsung allegedly won’t stop there, as the South Korean outfit will also release other special variants of the Galaxy S7 and its curved-edge counterpart. They include one inspired by the 2016 Olympic Games, while another will reportedly be done in collaboration with a popular South Korean singer.


Samsung has yet to confirm or deny the existence of any of these special-edition smartphones, let alone the Galaxy S7 and the Galaxy S7 Edge themselves. However, it wouldn’t be the first time Samsung ventured into the world of special-edition handsets, as the company released the Iron Man limited edition Galaxy S6 Edge last May.

If the three aforementioned limited editions are anything like the Iron Man smartphone, however, they will be pricey and they will be available in very limited quantities. Not only were there only 1,000 Iron Man Galaxy S6 Edge units made, but they were so expensive that one Amazon reviewer wrote he sold his genitalia, left foot, and wife on the black market just to get one. Grim stuff.

Regardless, this makes us wonder what the Batman v Superman Galaxy S7 Edge would even look like. Even though Avengers: Age of Ultron contained multiple superheroes, Samsung and Marvel opted to go with Iron Man, so it will be interesting to see who Samsung and DC Comics roll with. Our money’s on Batman, since Samsung isn’t exactly known for releasing smartphones with outlandish colors, such as a Superman edition would require, but my personal pick is Wonder Woman, who also has a starring role in the movie.


Samsung reportedly looks to continue its release of limited edition smartphones by releasing a special edition Batman v. Superman Galaxy S7.

Source: Special Edition Batman v Superman Galaxy S7 Edge Coming? | Digital Trends

13 Feb 2016
Posted By: Guardian Owl
308 Views

SEO Doesn’t Have To Be A Shot In The Dark | TechCrunch

 To many startups, search engine optimization (SEO) is a task that sits on their company’s back burner.  Yes, there’s an element of uncertainty with SEO (after all, Google doesn’t publicly reveal the factors they use to rank websites). But according to a new ranking-factor study, SEO doesn’t have to be a shot in the dark. In fact, you can prioritize your SEO tasks based on what’s likely to give you the most bang for your buck.

With features to launch and customers to support, the idea of spending time fiddling with your title tags can seem like a fool’s errand. That’s especially true when there’s no guarantee that your hard work will result in a single additional visitor from Google. That’s one of the reasons a recent study found startups ranking SEO as only their third most important marketing priority (behind social media and content marketing).

Why do startups tend to shy away from SEO? From working with dozens of startups, I’ve found that founders hate the uncertainty that comes from SEO. Indeed, success with SEO can seem like throwing two dice and hoping you roll double sevens.

 

Backlinks, content and page speed are key 

Backlinko recently teamed up with a handful of SEO software companies to evaluate the factors that are most important for success with SEO today. To do this, they analyzed one million Google search results.

Of the 20 potential ranking factors they looked at, five were revealed to be especially important. I’m going to deep-dive into these five important ranking factors, and show you how you can apply them to squeeze more juice out of your SEO efforts.

Content is king?

The study found that the most important ranking factor was the number of different websites linking to your page.


This ranking factor is as old as Google itself.

Despite the fact that so-called “black hat SEOs” manipulate Google with phony links, it appears that this ranking factor remains an integral part of what makes Google tick.

This shouldn’t come as a big surprise. Google’s reliance on backlinks has taken it from two guys in a garage near Stanford to one of the most valuable companies on the planet. And today, Google’s worldwide search market share remains relatively stable. This makes it unlikely Google will completely remove backlinks from their algorithm. This data suggests that, at least for today, backlinks are still heavily relied upon by Big G.


Another interesting wrinkle is that this finding flies in the face of what many SEO consultants recommend: Many SEO agencies preach a “quality over quantity” approach to link building.

While there’s no question certain backlinks provide more benefit than others (for example, a link from TechCrunch is significantly more powerful than a link from your average mommy blog), this study suggests that backlink quantity is also important.

This is an important lesson for founders and startup marketers to learn. As someone who does PR consulting for startups, I notice that many founders shoot for the moon with their link and PR aspirations. In other words, to many founders, it’s “CNN.com or bust.” This new data suggests that this approach may be a mistake. In fact, one of the chief reasons I took Polar to 40+ million pageviews is that I wasn’t overly picky about which sites we got mentions and links from.

If a site looked legit and wanted to cover us, I said, “Let’s do it.” That’s part of the reason I’ve landed 1,300 mentions over the last few years.

A lot of these mentions were on major news sites. But the funny thing is that a good chunk of these major mentions came as a result of a smaller blog or niche news site writing about us. In fact, this is the exact strategy that Ryan Holiday recommends in his PR classic “Trust Me, I’m Lying.”

Not only are mentions from smaller sites beneficial for startups’ PR, but they can significantly boost your Google rankings, as well.

Slow loading site = SEO death

Backlinko’s new study also found a strong tie between site speed and Google rankings. Using site-loading-speed data from Alexa, they discovered that fast-loading websites significantly outperformed slow sites.

This finding shouldn’t come as a shock to anyone who follows SEO. Google has come out and said it uses site speed as a “signal in our search ranking algorithms.” Because users hate slow-loading websites, Google doesn’t want to serve those sites to its users.

Fortunately, taking your site from “tortoise” to “hare” is relatively simple. If you happen to use WordPress, there is no shortage of plug-ins that can boost your site’s loading time. Even if you don’t use WordPress, a few quick steps can typically move the needle for most websites:

  • Upgrade your hosting: Cheap $5/month hosting plans like Bluehost aren’t bad, but their servers aren’t typically optimized for speed.
  • Cut image file sizes: For most websites, images are the No. 1 factor that slow down a page. You can usually compress them or reduce their size without sacrificing much in the way of quality.
  • Hire a coder: If you’re not a coder, hiring a pro to analyze your code with an eye on site speed can be a game changer. Most sites have at least some code bloat that can be easily cleaned up.
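Before making any of these changes, it helps to capture a baseline so you can tell whether they moved the needle. A rough sketch follows; it times only the raw HTML fetch, not images, scripts or rendering, so treat it as a server-speed proxy rather than a true page-load number (the URL is a placeholder):

```python
import time
import requests

def rough_load_time(url, runs=3):
    """Average time to fetch the raw HTML over several runs."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

print(f"{rough_load_time('https://www.example.com/'):.2f}s average HTML fetch")
```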

Long-form content wins the day

Backlinko also found that, when it comes to SEO, content may not be king, but it’s certainly queen. Specifically, their data revealed that long-form content tended to rank above shorter content.


According to their analysis, the average article on Google’s first page boasts 1,890 words.

Does this mean that Google has an inherent preference for long content? Maybe. The study authors pointed out that this finding was simply a correlation, and they couldn’t say for sure. But they hypothesized that Google wants to show its users content that fully answers their query; in other words, long-form content.

However, it may be that longer content generates more shares (in the form of tweets, Facebook likes and backlinks). In fact, BuzzSumo found that longer content tended to generate more social shares.


Considering that shares can lead to higher rankings, long-form content may simply outperform short content in the share department, leading to higher Google rankings.

If you haven’t attempted to publish long-form content because you feel your audience doesn’t have the attention span for it, this finding may give you the impetus to at least give it a shot.

Adding focus to your content may improve rankings

Additionally, the study found that focused content outperformed content that attempted to cover several different topics. Using software called MarketMuse, each article in their database was scored for “topical authority.” A high score represents an article that covered a topic in-depth. A low score indicates that the article skimmed the surface of a given topic.

The authors guessed that Google would prefer comprehensive content. This is because of a fundamental shift in the way Google indexes content. In the last few years, Google has moved away from simply looking at the words on your page to actually understanding what your page is about. This is known as semantic search.

For example, before semantic search, if you Googled “who is the CEO of Starbucks”, Google would look for pages that contained the exact term “who is the CEO of Starbucks” on the page. And they would present 10 links to those pages.

Today, they know the actual answer, and present it to you.


It turns out that Google may prefer in-depth content, as it gives them a deeper understanding of your content. This study found that content rated as having high topical authority ranked above content with a poor rating.


The old writing adage to “go an inch wide and a mile deep” may now apply to SEO as well.

Bounces aren’t just hurting conversions

This research also found a correlation between a high bounce rate and poor rankings in Google.


According to the study, Google may use bounce rate as a proxy measure of content quality. If someone searches for a keyword, clicks on your page and quickly leaves, it sends a message to Google that your page isn’t a good fit for that keyword.

On the other hand, if you stay on the site and browse through several different pages, it implies that that person had a great experience and enjoyed reading your content. That may push Google to show your page to more people.

While this finding is interesting, there is an important caveat: being a correlation study, it’s impossible to say whether Google directly measures or uses bounce rate as a ranking signal. A high bounce rate may simply reflect content that isn’t very good.

Regardless, reducing your bounce rate certainly won’t result in lower rankings — and it can boost conversions, as well.

Source: SEO Doesn’t Have To Be A Shot In The Dark | TechCrunch

19 Jan 2016
Posted By: Guardian Owl
305 Views

Google Core Algorithm Updates Continue

Relax, folks! The Google change this past weekend was just a core update, nothing Penguin-related.

Gary Illyes from Google said on Twitter this morning that the weekend fluctuations were “core algorithm” and “not Penguin.”

Many webmasters are waiting for a Google Penguin update, and we are expecting it to happen early this year. So when we see major fluctuations, some are quick to say it is Penguin.

But Google is telling us this is not Penguin, but rather just routine core ranking algorithm updates.

Read the full article at Search Engine Land!

23 Mar 2015
Posted By: Guardian Owl
79 Views

Biggest Advertisers Are Sending Their Dollars to Digital

By SYDNEY EMBER, NY Times
March 18, 2015

The country’s largest marketers are slashing their advertising budgets as they shift a larger portion of their spending to digital, according to new figures released on Wednesday.

The 10 biggest advertisers cut spending by 4.2 percent in 2014, to $15.3 billion from $16 billion a year earlier, according to the latest report from Kantar Media, a research firm owned by the advertising conglomerate WPP. Procter & Gamble, the top advertiser, lowered its ad spending in 2014 by 14.4 percent, bringing its expenditures to $2.6 billion, the report showed. Read more