
15 SEO Myths That Just Won’t Die – SEO 101

In digital marketing, and specifically search engine optimization (SEO), there are tidbits of information that in their retelling lose context and become what we call in other circles “Zombie Lies” or in this case “Zombie Myths.”

Zombie SEO Myths

Zombie SEO myths are myths that, despite being debunked over and over again, never seem to die. They take on a life of their own and leave site owners confused as to what is true and what is not.

So this chapter is going to look at some of those myths that never seem to die – no matter how hard experts try to kill them.

Mostly, we’re going to focus on Google because that is where most sites get their traffic (and what most of the myths revolve around).


Myth 1: SEO is Voodoo or Snake Oil

There is a low bar to entry into the field of digital marketing, including and especially SEO. There are no real certification processes (because how would you certify something that changes every day?) and Google never publishes the algorithms, so there is no way to test an individual’s knowledge against what they contain.

Basically, when you hire an SEO provider it has to be based on trust.

This is why the myth that SEO is voodoo prevails. It prevails because bad practitioners did bad work and the client is left with no other way to explain their lack of results. In fact, it is often these bad practitioners who use the myth to explain their poor results.

That being said, SEO isn’t voodoo (or magic or “bovine feces”). Real SEO is the process of making sites adhere better to Google’s algorithms, for specific query strings, in order to increase relevant site traffic and/or company revenues.

These algorithms aren’t completely unknowable things.

While Google never publishes the details of that information, informed SEO professionals have a good understanding of what will bring a site in compliance with those algorithms (or, in the case of black hat SEO, how they can game those algorithms). They are after all based on math and processes governed by logic.

A trustworthy SEO professional lives and breathes algorithm changes, which can amount to multiple changes a day. They know why the algorithms do what they do as well as anyone not working at Google can.

This is the opposite of voodoo and magic. It is called earned knowledge. It is also very hard-earned knowledge.

When you pay an SEO pro, you aren’t paying for their time. You are paying for their knowledge and results. Prices are set accordingly.


Myth 2: Content Is All You Need

“Content is KING!”

You will find many articles that make this statement. While it is not completely untrue, content is less a king and more a valuable business partner to links, design, and usability.

Mostly, though, content and links are like the conjoined twins of the SEO world. You must have both. One will not work without the other (at least not well, and not for the long term).

Now, Google will tell you many long-tail queries rank without links. That is likely true. It is also likely that these long-tail queries are so unique that there is no competition for them, so links don’t play an active role the way they do in a competitive query.

If you’re trying to rank for “The Walking Dead,” you had better have links* or don’t expect anyone to find you.

*Good links. Not poor, $99 links bought from a link farm.

So while content is very important, content needs links. Just like links need content.

Bonus Tip: Content is not king. Content is special, but not king. Like peanut butter and jelly, you can have one without the other, but it isn’t as good. Add technical SEO to this duo and you have the triad that is the basis of all good core SEO.


Myth 3: Speed Isn’t That Important

Google said a while back that page speed is only a tie-breaker when all other factors are equal. This is one of those cases where I can say that this is not borne out in real-world testing.

Personally, I had a client increase their traffic by over 200,000 sessions a day when they cut their page load time by 50 percent during a likely Panda update. So while it is true that page speed acts as a tie-breaker when all things are equal, it can also dramatically improve rankings when your site has a severe page speed issue.

Now when I say a page speed issue, I don’t mean you cut your 5-second site load time down to 2 seconds. I mean when you dramatically cut your page load, say a 22-second site load time down to 8 seconds, which is what happened in this case.

Know What is Being Measured

It is also important to know what Google is measuring when it evaluates page speed. While Google looks at overall speed, the issue it is most “critical” of is how long the DOM (Document Object Model) takes to load. The DOM items are the visible items on the page, excluding ads, if you have stacked your load order right.

This means that if you can cut your DOM load from 22 seconds to 8 seconds as in the example, Google will likely reward you for the dramatic decrease in page load because you are now dramatically faster. This is an additional benefit of improving page speed unrelated to breaking a tie on a specific query result.

A faster site is much easier for Googlebot to crawl. When the site is not slowing the crawl down, more of your site is getting indexed either in number of pages or in depth of page crawl.

Note: Google’s PageSpeed Insights tool only measures items in the DOM, so you could have a higher page speed score than another site but still perform more poorly in the rankings because your overall page load is too slow. Page speed is very important and will become even more so as we move into mobile-first indexing. So never discount it.
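
If you want to track this yourself, PageSpeed Insights data is also available programmatically through Google's public PageSpeed Insights API. Here is a minimal Python sketch; the URL, the API key, and the exact response fields are placeholders and assumptions, since field names vary between API versions.

```python
import requests

# Hypothetical page and API key -- replace with your own.
PAGE_URL = "https://www.example.com/"
API_KEY = "YOUR_API_KEY"

ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

resp = requests.get(ENDPOINT, params={
    "url": PAGE_URL,
    "strategy": "mobile",   # mobile matters more as mobile-first indexing rolls out
    "key": API_KEY,
})
resp.raise_for_status()
data = resp.json()

# Overall performance score (0-1 in the v5 response). Exact field names differ
# between API versions, so adjust if your response looks different.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score for {PAGE_URL}: {score:.2f}")
```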


Myth 4: Links Are Dead

I once had a call from a potential client that asked me if I could remove all his links.

“Remove all your links? May I ask why you would want to do that?” I asked.

“Because I heard links were bad and I need to remove them,” he told me.

“Did you buy the links or get them from some nefarious method?”

“No, they are all legit.”

“Then, sir, whatever you do, use me or don’t for other reasons, do not get rid of your links!”

True story.

Links aren’t dead.

Links aren’t close to dead.

If you have the best content in the world and no links, your site won’t get much visibility. Links and content are correlated with rankings. Great content still needs great links (or a lot of mediocre ones).

If you’re buying links for $99 and expecting to get to the top spots in Google, you’re barking up a very dead tree.

Remember, good links require topical relevancy and legitimacy. If it isn’t natural and it comes from an unrelated page or site, it probably won’t help much.

Bonus tip: Reciprocal linking died circa 2007, maybe earlier. Linking to your buddy and them linking to you won’t do you much good.


Myth 5: Keyword Density

There was a time when keyword density had some validity.

Really, if it did not work, why do you think all those people were stuffing white text on white backgrounds for ranking purposes? Then Google got smarter and did away with keyword stuffing as a viable practice, and even people who had gotten good results from applying density testing to more modest keyword placements could no longer count on knowing what density would help.

In both cases, that advantage no longer exists.

While you can still use a word on the page too many times, there is no set range that makes a page rank. In fact, you can find results now where the keyword does not exist in the visible portion of the page. It might be in the links, in the image tagging, or somewhere else that is not part of the content; it might even be a similar term rather than an exact match. This is not typical, but it does exist.

Bottom line: placing a keyword X times per page is no longer something worth spending your time on. There are far better fish to fry.

Bonus Tip: Better to make relevant content that you can link to internally and others can link to externally than to waste time on optimizing keyword counts. That being said, your title tag is still highly relevant. Spend some time adding your query set there. That might give you a boost.


Myth 6: You Must Submit Your Site

At least twice a week I get an email from an SEO site submission company telling me I need to pay them to submit my site to the search engines.

Seriously? No, you do not.

Now, are there times when it is good to submit your site URLs? Sure, when you need the search engines to come back to the site to do things like pick up a new piece of content or re-evaluate a page. However, you never need to submit your site.

Google is advanced enough now – especially with its status as a domain registrar – that it can find your site within minutes of it going live, or even of the domain being registered.

Now, if you’ve been live for a few weeks, you have an inbound link to the site, and Google has not come by (as evidenced by your logs), it can’t hurt to submit it via Fetch and Render in Google Search Console. But never, ever pay someone to submit your site.

Bonus Tip: When in doubt just use Google’s URL submit form or “fetch and render/submit” in Google Search Console.


Myth 7: You Don’t Need a Sitemap

Sitemaps are not just a nice-to-have add-on for sites today. They get even more important as we move to the mobile-first algorithms in 2018.

Why? When Google cannot easily crawl a portion of your site, the sitemap allows the crawler to better find these pages.

Bonus Tip: Google is going to have a harder time finding pages due to the reduced size of navigational elements in mobile-first indexing. Sitemaps – both XML and HTML – will be the best way for them to find all the pages on the site you want indexed and ranked.
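
If your CMS does not generate an XML sitemap for you, building one is simple. A minimal Python sketch (the URLs and change frequency are placeholders; a real sitemap should be generated from your actual page inventory):

```python
from xml.sax.saxutils import escape

# Placeholder URLs -- in practice, pull these from your CMS or database.
urls = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/blog/mobile-first-indexing/",
]

entries = "\n".join(
    f"  <url>\n    <loc>{escape(u)}</loc>\n    <changefreq>weekly</changefreq>\n  </url>"
    for u in urls
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

print(sitemap)
```

Reference the finished file from your robots.txt and submit it in Google Search Console so Google knows where to find it.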



Myth 8: Queries Must Have Freshness

QDF, or Query Deserves Freshness, most certainly applies to queries that need fresh results – for instance, news stories or the most recent Powerball numbers.

That does not mean you have to change every element on your homepage every day, or even very often.

While there are sites that absolutely must have fresh content on their main site pages on a daily or weekly basis, most do not.

Evergreen pages are evergreen for a reason. If you write an article on mobile-first indexing and that information has not changed, you do not need to change that page to give it “freshness”.

You do, however, need to have some fresh content on your site. So a good content strategy is how you address having fresh content without trying to meet some unnatural goal for daily content changes.

Bonus Tip: For smaller sites that have small teams or little money and do not need fresh content daily, you can simply invest in adding pages to the site when needed while keeping an active blog presence. Adding 2-3 blog posts a week will keep the site relevant without adding the demands and costs of continually updating pages.


Myth 9: Because Big Brands Do It, It Must Be Good!

Remember your parents saying to you when you were little, “Would you jump off a bridge just because Johnny told you to?!” Same thing goes here.

There is a long history of sites copying bad website decisions from each other simply because they thought the other site knew something they didn’t.

Don’t be a lemming.

What one site does may work for them, or it may not. What if they tell you it is the best thing since sliced bread? Unless you’re looking at their metrics, don’t believe them. And even if it is the best thing for them, the chances of it being right for you are slim.

Why? Because you’re a different company. Your users have different queries and user intent. Just because Facebook and Twitter use infinite scroll doesn’t mean you should.

In fact, because big brands don’t suffer as much from user and Googlebot discontent when they get it wrong, they are more likely to get it wrong.

Don’t copy big brands. Find what works for your users and stick to that.

Bonus Tip: If you want to try something that you see on another site, find a section of yours that isn’t bringing in a lot of traffic and then A/B test the idea on your own pages. Your data will show you what works best for you. Never assume because a big brand does it, you will benefit from following their path.


Myth 10: Algorithm Devaluations = Penalties

Google has two types of site devaluations.

Penguin, Panda, Pirate, Pigeon, Layout, etc. are all algorithms. Algorithms can giveth and they can taketh away. This means that not every site sees devaluations from the update of these processes. Many sites see positive results. This is called an “algorithmic change,” not a penalty.

What are penalties then?

Penalties are manual actions you can find in Google Search Console. This is when Google took a look at your site, decided it was in violation of the Webmaster Guidelines, and devalued the site. You know this happened by checking your messages in Google Search Console. When it happens, they will tell you.

Penalties also require that you submit a reconsideration request to regain your site’s status and remove the penalty.

Algorithmic devaluations have no such reconsideration process. You fix what you think went wrong. Then you wait to see if Google gives you back your rankings when that algorithm or set of algorithms comes back through and re-evaluates the site.


Myth 11: Duplicate Content Is a Penalty

There is NO duplicate content penalty!

There has never been a duplicate content penalty.

Google does have a duplicate content filter, which simply means that if there is more than one item of content that is the same, Google will not rank both for the same query. It will only rank one.

This makes perfect sense. Why would you want the results for a query to bring back the same content multiple times? It is simply easier to rewrite the duplicated piece than to try to guess which version Google will choose to rank.

All that said, too much duplicate content can affect you with the Panda algorithm, but that is more about site quality than manual actions.

Bonus tip: The duplicate content filter applies to titles and meta descriptions as well. Make sure to make all your titles and descriptions unique.
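
A quick way to audit this is to fetch your own pages and flag any repeated titles or meta descriptions. A rough sketch using the requests and BeautifulSoup libraries (the URL list is a placeholder; in practice, feed in every URL from your sitemap):

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Placeholder list -- in practice, use every URL from your sitemap.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/services/",
]

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    desc = meta.get("content", "").strip() if meta else ""

    titles[title].append(url)
    descriptions[desc].append(url)

for text, pages in titles.items():
    if len(pages) > 1:
        print(f"Duplicate title ({text!r}) on: {pages}")

for text, pages in descriptions.items():
    if len(pages) > 1:
        print(f"Duplicate description on: {pages}")
```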


Myth 12: Social Media Helps You Rank

Social media, done well, will get you exposure. That exposure can get you links and citations. Those links and citations can get you better rankings.

That doesn’t mean that social media postings are inherently helpful in getting you rankings.

Social media doesn’t hand you links directly, but it encourages others to link to you. A social media post may also escape its ecosystem and earn you a true site link. But don’t hold your breath.

Social media is about visibility.

Getting those people to share your content and link to or mention your site in a way that Google counts it as a “link”? That is SEO.


Myth 13: Buying Google Ads Helps with Organic Ranking

No. Just no. Investing in PPC won’t boost your organic search rankings.

These two divisions are in two separate buildings and not allowed to engage with each other about these things.

Personally, I have worked with sites that have had massive budgets in Google AdWords. Their sites still lived and died in organic search by the organic algorithms. They received no bonus placements from buying ads.

Bonus Tip: What buying ads can do is promote brand validation. In user experiments, it has been shown that when a user sees an ad and the site in the organic rankings together, they believe it to have more authority. This can increase click-through rates.


Myth 14: Google Uses AI in All its Algorithms

No. Google doesn’t use AI in the live algorithms except for RankBrain.

Now, Google does use AI to train the algorithms, and in internal ways we are not privy to. However, Google doesn’t use AI in the live algorithms themselves.

Why?

Very simply put, because if it breaks they would not know how to fix it. AI operates on a self-learning model.

If it were to break something on search, and that broken piece hurt Google’s ability to make money, there would be no easy way to fix it. More than 95 percent of Google’s revenue still comes from ads, so it would be extremely dangerous to allow AI to take over without oversight.


Myth 15: RankBrain

So much has been written about RankBrain that is simply incorrect that it would be difficult to state it all as one myth. So, in general, let’s just talk about what RankBrain is and isn’t.

RankBrain is a top ranking factor, yet not one you can directly optimize for.

What does that mean? Basically, when Google went from strings to things (i.e., entity search), it needed better ways to determine what a query meant to the user and how the words in the query set related to each other. By doing this analysis, Google could better match the user’s intent.

To this end, they developed a system of processes to determine relationships between entities. For those queries they understand, they bring back a standard SERP. Hopefully, one that best matches your intent as a user.

However, 15 percent of the queries Google sees every day are new. So Google needed a way to deal with entities whose relationship was unclear or unknown when trying to match user intent.

Enter RankBrain!

RankBrain is a machine-learning algorithm that tries to understand what you mean when Google is unsure. It uses entity match and known relationships to infer meaning/intent from those queries it doesn’t understand.

For instance, back when the drought in California was severe, if you looked up “water rights Las Vegas NV” (we share water), you would get back all sorts of information about water rights and the history of water rights in the Las Vegas area. However, if you put in a much lesser-known area of Nevada, like Mesquite, Google wasn’t sure what you wanted to know.

Why? Because Google understands Las Vegas as a city (entity) in a geographic area (Clark County) and can associate it with water rights, a known topic of interest based on search data. It cannot, however, do the same for Mesquite.

Why? Because likely no one had searched for water rights in Mesquite before, or at least not very often. The query intent was unknown.

To Google, Mesquite is a city in Nevada, but also a tree/charcoal/flavor/BBQ sauce, and it brought back all of these results, ignoring the qualifier “water rights” for all but one result. This is RankBrain.

Google is giving you a “kitchen sink.” Over time, if enough people search for that information or the training Google feeds it tells it differently, it will know that you specifically wanted x, not y.

RankBrain is about using AI to determine intent between entities with unknown or loosely formed relationships. So it is a ranking factor, but not really a ranking factor.
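
To make the entity/intent idea a little more concrete, here is a deliberately tiny toy sketch: an unseen query is mapped onto the closest known intent by vector similarity. Everything here (the vectors, the queries, the matching logic) is invented for illustration; Google's actual RankBrain features and models are not public.

```python
import math

# Toy, hand-made "embeddings" purely for illustration.
known_queries = {
    "water rights las vegas nv":    [0.9, 0.1, 0.0],  # legal/water intent
    "clark county water authority": [0.8, 0.2, 0.1],  # legal/water intent
    "mesquite bbq sauce recipe":    [0.0, 0.1, 0.9],  # cooking intent
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# An unseen query, represented in the same toy vector space.
unseen = ("water rights mesquite nv", [0.7, 0.2, 0.3])

best = max(known_queries.items(), key=lambda kv: cosine(kv[1], unseen[1]))
print(f"Closest known intent to {unseen[0]!r}: {best[0]!r}")
```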

Bonus Tip: While there are a few niche cases where it might make sense to optimize for RankBrain, for most sites it really doesn’t. The query result is a living, dynamic thing that represents Google’s best guess at user intent. You would do far better to simply optimize the site properly than to chase gains from optimizing specifically for RankBrain.

Source: 15 SEO Myths That Just Won’t Die – SEO 101


The technology behind AI in PPC

I believe artificial intelligence (AI) will be a key driver of change in PPC in 2018 as it leads to more and better PPC intelligence.

So far, I’ve discussed the roles humans will play when PPC management becomes nearly fully automated and six strategies agencies can take to future-proof their business. In this final post on the state of AI in PPC, I’ll cover the technology of AI.

Why AI took years to matter to PPC

AI has been around since 1956, and PPC has existed since the late 1990s. So why did it take until now for AI’s role in paid search to become such a hot topic in our industry?

It’s because we’ve recently hit an inflection point where, due to the exponential nature of technological advances, we’re now seeing improvements that used to take years happen in weeks.

What’s driving this is the exponential growth explained by Moore’s Law, the principle that computing power doubles approximately every 18 months. The outcome of exponential growth is hard for humans to grasp, so let me give an example that doesn’t involve computing speeds since those can be a bit too conceptual. Instead, let’s apply this doubling of speed to cars, where we can more easily understand how it impacts the distances we travel and how quickly we get somewhere.

Imagine if the first car, invented by Karl Benz in 1885 with a top speed of about 10 mph, had been doubling its speed every 18 months. In 1885, we could have driven that car across a typical town in an hour. After doubling its speed 27 times (the same number of times the microchip has doubled its speed since it was invented), we could have gone to the sun in about 4 minutes. And less than 18 months later, it would take just about 2 hours to travel to Neptune, the farthest planet in our solar system. (Voyager 2 did that same trip in about 12 years.)
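
The arithmetic behind that comparison is easy to verify (the distances below are rounded approximations):

```python
# Rough check of the doubling-car thought experiment.
base_speed_mph = 10          # Karl Benz's 1885 car, roughly
doublings = 27               # same number of doublings as the microchip

speed = base_speed_mph * 2 ** doublings   # ~1.34 billion mph
sun_miles = 93_000_000                    # Earth to Sun, approximate
neptune_miles = 2_800_000_000             # Earth to Neptune, approximate

print(f"Speed after {doublings} doublings: {speed:,} mph")
print(f"Trip to the Sun: {sun_miles / speed * 60:.1f} minutes")
print(f"Trip to Neptune: {neptune_miles / speed:.1f} hours")
```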

Because computing speed has already doubled 27 times, every extra doubling leads to new capabilities that are beyond imagination.

What exponential growth means for PPC

So, if we’ve reached the point of PPC automation today where humans and computers are about equally good, consider that the pace of technological improvement makes it possible for the machines to leave humans in the dust later this year. That’s why it’s worth thinking about the roles humans will play in the future of PPC.

And just like the first car is not the right vehicle for a flight to Neptune, the tools you used to manage AdWords a few years ago may no longer be the ones that make sense for managing AdWords today. So let’s take a look at what AI is doing to PPC tools.

The technologies driving PPC intelligence

Just like you want to know what your employees are capable of by interviewing them before hiring them, you should understand a technology’s capabilities (and limits) before adding it to your toolkit. So let’s see how artificial intelligence works in PPC.

PPC intelligence through programmed rules

Before the advent of AI as a research field in 1956, you could make a machine appear “intelligent” by programming it to deliver specific responses to a large number of scenarios. But that form of AI is very limited because it can’t deal with edge cases, of which there are invariably many in the real world.

In PPC, this would be akin to using Automated Rules to write rules for every possible scenario an account might encounter. Rules are great for covering the majority of use cases, but the real world is messy, and trying to write rules for every scenario is simply impossible.
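
In code, that rules-only style of "intelligence" looks something like the following sketch. The metrics and thresholds are invented for illustration; the point is that every scenario needs its own hand-written branch, and anything nobody anticipated falls through unhandled.

```python
def adjust_bid(keyword):
    """Hard-coded, rules-only bid logic -- brittle by design."""
    # Each rule only covers a scenario someone thought of in advance.
    if keyword["conversions"] > 10 and keyword["cpa"] < 20:
        return keyword["bid"] * 1.15      # clearly profitable: bid up
    if keyword["clicks"] > 200 and keyword["conversions"] == 0:
        return keyword["bid"] * 0.50      # spending with no return: bid down
    if keyword["impressions"] < 100:
        return keyword["bid"]             # not enough data: leave it alone
    # ...and so on, rule after rule. The real world always produces a
    # combination of metrics none of the rules anticipated.
    return keyword["bid"]

example = {"bid": 1.20, "impressions": 5000, "clicks": 250,
           "conversions": 0, "cpa": 0.0}
print(adjust_bid(example))   # 0.60 -- handled, but only because we wrote that rule
```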

PPC intelligence through symbolic representations

Between the 1950s and 1980s, AI evolved into using symbolic systems to be able to take heuristic shortcuts like humans do. By framing problems in human-readable form, it was believed the machines could make logical deductions.

Here’s a PPC problem: you’re adding a new keyword, but you don’t know the right bid to set because there is no historical data for it. By teaching the machine concepts like campaigns and keywords and how these relate to each other, we are providing it with the same heuristics we use to make reasonable guesses.

So the system can now automate bid management and might set a similar bid to other keywords in the campaign because it knows that campaigns tend to have keywords that have something in common.
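
A minimal sketch of that heuristic, with invented numbers: when a keyword has no history of its own, borrow a starting bid from its siblings in the same campaign.

```python
# Toy campaign data -- in a real account this would come from the ads API.
campaign = {
    "running shoes": {"bid": 1.40, "clicks": 900},
    "trail running shoes": {"bid": 1.10, "clicks": 400},
    "running shoes for women": {"bid": 1.60, "clicks": 700},
}

def suggest_bid_for_new_keyword(campaign_keywords):
    """No history for the new keyword, so borrow from its 'siblings'.

    The heuristic: keywords grouped in one campaign tend to be related,
    so their average bid is a reasonable starting guess.
    """
    bids = [kw["bid"] for kw in campaign_keywords.values()]
    return round(sum(bids) / len(bids), 2)

print(suggest_bid_for_new_keyword(campaign))  # 1.37
```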

PPC intelligence through statistical learning methods

The type of AI that is responsible for a lot of success in PPC today is based on statistics and machine learning to categorize things. Quality Score (QS) is a great example; Google looks at historical click behavior from users and uses machine learning to find correlations that help predict the likelihood of a click or a conversion.

By having a score for how likely it is that each search will translate into a conversion, automated bidding products like those offered inside AdWords can “think” through many more dimensions (like geo-location, hour of day, device, or audience) that might impact the likelihood of a conversion than a person could.

Thanks to the massively increased computing power available today, these systems can also consider interactions across dimensions without getting “overwhelmed” by the combinatorial nature of the problem.
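
Conceptually, the machine-learning piece is a model that turns those dimensions into a predicted conversion probability, which then drives the bid. Here is a hedged sketch using scikit-learn; the features, training data, and bid formula are all invented for illustration and are not how Google's systems actually work.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data: [hour_of_day, is_mobile, distance_to_store_km]
# and whether the click converted. Real systems use far more dimensions
# and far more history.
X = np.array([
    [9, 1, 2], [13, 0, 40], [20, 1, 5], [23, 0, 60],
    [10, 1, 1], [15, 0, 35], [19, 1, 8], [2, 0, 80],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Predict conversion probability for a new click context,
# then bid a fraction of the assumed value of a conversion.
context = np.array([[18, 1, 3]])          # 6pm, mobile, 3km away
p_conversion = model.predict_proba(context)[0, 1]
value_per_conversion = 50.0               # assumed business value
max_cpc = round(p_conversion * value_per_conversion * 0.8, 2)

print(f"p(conversion) = {p_conversion:.2f}, suggested max CPC = {max_cpc}")
```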

What’s next for artificial intelligence

AI systems getting a lot of attention today, like AlphaGo Zero, are no longer dependent on structured data and can become “intelligent” without being “constrained by the limits of human knowledge,” as explained by DeepMind CEO Demis Hassabis.

The team created the AlphaZero algorithm using reinforcement learning so that it could learn to win games other than Go. They claimed that by the end of 2017, this algorithm had learned to best humans in other games like chess and shogi in less than 1 day — a huge leap forward in AI.

Reinforcement learning uses massive computing power to run lots of simulations until it starts to recognize actions that lead to desirable outcomes. It can be applied to games because there is a clear outcome of “winning” or “losing.” When Google figures out what it means to win or lose in the game of AdWords, I bet we’ll see a huge acceleration in improvements of their automation tools.
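
To make the trial-and-error idea concrete, here is a deliberately tiny sketch of reinforcement-learning-style learning: an agent repeatedly tries bid levels against a simulated auction and gravitates toward whichever action pays off most often. The simulator, rewards, and numbers are invented; this shows the flavor of the approach, not how Google or DeepMind implements it.

```python
import random

random.seed(42)

BID_LEVELS = [0.5, 1.0, 1.5, 2.0]

def simulated_auction(bid):
    """Invented environment: higher bids win more often but cost more."""
    win = random.random() < min(bid / 2.5, 0.95)
    profit_if_win = 3.0 - bid              # assumed value of a click minus cost
    return profit_if_win if win else 0.0

# Epsilon-greedy learning: mostly exploit the best-known bid, sometimes explore.
q = {b: 0.0 for b in BID_LEVELS}            # running estimate of each bid's value
counts = {b: 0 for b in BID_LEVELS}

for step in range(10_000):
    if random.random() < 0.1:
        bid = random.choice(BID_LEVELS)     # explore
    else:
        bid = max(q, key=q.get)             # exploit
    reward = simulated_auction(bid)
    counts[bid] += 1
    q[bid] += (reward - q[bid]) / counts[bid]   # incremental average

print({b: round(v, 2) for b, v in q.items()})
print("Learned best bid:", max(q, key=q.get))
```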

Build your own PPC intelligence

There are a lot of tools available to automate your PPC work, and multiple third-party vendors are starting to use AI and ML to provide stronger recommendations. But there are also many free tools from AdWords that are getting better every day thanks to advances in AI, like Portfolio Bid Strategies, Custom Intent Audiences, optimized ad rotation, etc.

For those willing to invest in connecting their own business data to AdWords and AI, I’m a big fan of prototyping solutions with AdWords Scripts because they provide a lot of customizability without requiring a lot of engineering resources. Unfortunately, simple scripts you write will fall into the weakest category of AI, where PPC intelligence is achieved through hard-coded rules.

But when you get a bit more advanced in your scripting abilities, you can use Google Cloud Machine Learning Engine to start enhancing your own automations with modern machine learning techniques.

The benefit of an out-of-the-box solution like this is that you don’t need to learn many different types of models. But that’s also the downside, because you won’t get total control over how you set criteria and thresholds to get results that are usable. Our team at Optmyzr tried several ready-made systems but eventually decided that we needed more power — so we’re building our own AI.

Conclusion

I believe there are three pillars for being a successful PPC marketer in a world where AI takes over, and I’ve now touched on each pillar in my recent posts:

  1. Be ready for the new roles humans will play.
  2. Have a plan for your business, and especially focus on having the best process for leveraging AI.
  3. Understand the technology so you can spot opportunities faster.

Over the coming months, I will share my own experiences with AI so advertisers ready to take the plunge will have a better understanding of what is involved in building successful companies that leverage the latest state of the art in technology, computation, and statistics.

Source: The technology behind AI in PPC – Search Engine Land


Has AI changed the SEO industry for better or worse?

Columnist Jayson DeMers explores the impact of Google’s shift toward machine learning and discusses what the future will look like for search professionals.

With Google turning to artificial intelligence to power its flagship search engine business, has the SEO industry been left in the dust? The old ways of testing and measuring are becoming antiquated, and industry insiders are scrambling to understand something new — something which is more advanced than their backgrounds typically permit.

The fact is, even Google engineers are having a hard time explaining how Google works anymore. With this in mind, is artificial intelligence changing the SEO industry for better or worse? And has Google’s once-understood algorithm become a “runaway algorithm?”

Who was in the driver’s seat?

The old days of Google were much simpler times. Artificial intelligence may have existed back then, but it was used for very narrow issues, like spam filters on Gmail. Google engineers spent most of their time writing preprogrammed “rules” that worked to continuously close the loopholes in their search engine — loopholes that let brands, with the help of SEO professionals, take advantage of a static set of rules that could be identified and then exploited.

However, this struck at the heart of Google’s primary business model: the pay-per-click (PPC) ad business. The easier it was to rank “organically” (in Google’s natural, unpaid rankings), the fewer paid ads were sold. These two distinctly different parts of their search engine have been, and will always be, at odds with one another.

If you doubt that Google sees its primary business as selling ads on its search engine, you haven’t been watching Google over the past few decades. In fact, almost 20 years after it started, Google’s primary business was still PPC. In 2016, PPC revenues still represented 89 percent of its total revenues.

At first glance, it would stand to reason that Google should do everything it can to make its search results both user-friendly and maintainable. I want to focus on this last part — having a code base that is well documented enough (at least, internally within Google) so that it can be explained to the public, as a textbook of how websites should be structured and how professionals should interact with its search engine.

Going up the hill

Throughout the better part of Google’s history, the company has made efforts to ensure that brands and webmasters understood what was expected of them. In fact, they even had a liaison to the search engine optimization (SEO) world, and his name was Matt Cutts, the head of Google’s Webspam Team.

Cutts would go around the SEO conference circuit and often be the keynote or featured session speaker. Any time Google was changing its algorithms or pushing a new update to its search engine, Cutts would be there to explain what that meant for webmasters.

It was quite the spectacle. In one room, you typically had hundreds of SEOs who were attacking every loophole they could find, every slim advantage they could get their hands on. In the very same room, you had Cutts explaining why those techniques were not going to work in the future and what Google actually recommended.

As time went on and loopholes were closed, Cutts became one of the only sources of hope for SEOs. Google was becoming more sophisticated than ever, and with very few loopholes left to exploit, Cutts’s speaking engagements became crucial for SEOs to review and dissect.

The ‘uh-oh’ moment

And then, the faucet of information slowed to a trickle. Cutts’ speaking engagements became rarer, and his guidelines became more generic. Finally, in 2014, Cutts took a leave from Google. This was a shock to insiders who had built an entire revenue model off of selling access to this information.

Then, the worst news for SEOs: He was being replaced by an unnamed Googler. Why unnamed? Because the role of spokesperson was being phased out. No longer would Google be explaining what brands should be doing with each new update of its search engine.

The more convoluted its search engine algorithms were, the more PPC ads Google sold. As a result of this shift, Google capitalized immensely on PPC ad revenue. It even created “Learn with Google,” a gleaming classroom where SEO conference attendees could learn how to maximize PPC spend.

An article by Search Engine Land columnist Kristine Schachinger about the lack of information on a major algorithmic update, and Google’s flippant response by interim spokesman Gary Illyes, had all of the SEO industry’s frustration wrapped up in a nutshell. What was going on?

Removing the brakes — the switch to an AI-powered search engine

At the same time, Google was experimenting with new machine learning techniques to automate much of the updating process to its search engine. Google’s methodology has always been to automate as much of its technology as it could, and its core search engine was no different.

The pace of Google’s search engine switch to artificial intelligence caught many off-guard. This wasn’t like the 15 years of manual algorithm updates to its index. This felt like a tornado had swept in — and within a few years, it changed the landscape of SEO forever.

The rules were no longer in some blog or speech by Matt Cutts. Here stood a breathtaking question: Were the rules even written down at Google anymore?

Many of the search engine algorithms and their weightings were now controlled by a continuously updating machine-learning system that changed its weightings from one keyword to the next. Marcus Tober, CTO of SearchMetrics, said that “it’s very likely that even Google Engineers don’t know the exact composition of their highly complex algorithm.”

The runaway algorithm

Remember Google’s primary revenue stream? PPC represents almost 90 percent of its business. Once you know that, the rest of the story makes sense.

Did Google know beforehand that the switch to an AI-powered search engine would lead to a system that couldn’t be directly explained? Was it a coincidence that Cutts left the spotlight in 2014, and that the position never really came back? Was it that Google didn’t want to explain things to brands anymore, or that they couldn’t?

By 2017, Google CEO Sundar Pichai began to comment publicly on Google’s foray into artificial intelligence. Bob Griffin, CEO of Ayasdi, wrote recently that Pichai made it clear that there should be no abdication of responsibility associated with intelligent technologies. In other words, there should be no excuse like “The machine did x.”

Griffin put it clearly:

Understanding what the machine is doing is paramount. Transparency is knowing what algorithm was used, which parameters were used in the algorithm and, even, why. Justification is an understanding of what it did, and why in a way that you can explain to a reporter, shareholder, congressional committee or regulator. The difference is material and goes beyond some vague promise of explainable AI.

But Google’s own search engineers were seemingly unable to explain how their own search engine worked anymore. This discrepancy had gotten so bad that in late 2017, Google hired longtime SEO journalist Danny Sullivan in an attempt to reestablish its image of transparency.

But why such a move away from transparency in the first place? Could it be that the move to artificial intelligence, something that went way over the heads of even the most experienced digital marketing executives, was the perfect cover? Was Google simply throwing its proverbial hands up in the air and saying, “It’s just too hard to explain”? Or was Google just caught up in the transition to AI, trying to find a way to explain things like Matt Cutts used to do?

Regardless of Sullivan’s hire, the true revenue drivers meant that this wasn’t a top priority. Google had solved some of the most challenging technical problems in history, and they could easily have attempted to define these new technical challenges for brands, but it simply wasn’t their focus.

And, not surprisingly, after a few years of silence, most of the old guard of SEO had accepted that the faucet of true transparent communication with Google was over, never to return again.

Everyone is an artificial intelligence expert

Most SEO experts’ backgrounds do not lend themselves very well to understanding this new type of Google search. Why? Most SEO professionals and digital marketing consultants have a marketing background, not a technical background.

When asked “How is AI changing Google?,” most answers from industry thought leaders have been generic: AI really hasn’t changed much; effective SEO still requires the same strategies you’ve pursued in the past. In some cases, responses simply had nothing to do with AI in the first place.

Many SEO professionals, who know absolutely nothing about how AI works, have been quick to deflect any questions about it. And since very few in the industry had an AI background, the term “artificial intelligence” became almost something entirely different — just another marketing slogan, rather than an actual technology. And so some SEO and digital marketing companies even began positioning themselves as the new “Artificial Intelligence” solution.

The runaway truck ramp?

As with all industries, whenever there’s a huge shift in technology, there tends to be a changing of the guard. There are a number of highly trained engineers who are beginning to make the SEO industry their home, and these more technologically savvy folks are starting to speak out.

And, for every false claim of AI, there are new AI technologies that are starting to become mainstream. And these are not your typical SEO tools and rank trackers.

Competitive industries are now investing heavily in things like genetic algorithms, particle swarm optimization and new approaches that enable advanced SEO teams to model exactly what Google’s RankBrain is attempting to do in each search engine environment.

At the forefront of these technologies is industry veteran and Carnegie Mellon alumnus Scott Stouffer, founder and CTO of MarketBrew.com, who chose to create and patent a statistical search engine modeling tool, based on AI technologies, rather than pursue a position at Google.

Now, 11 years into building his company, Stouffer has said:

There are a number of reasons why search engine modeling technology, after all these years, is just now becoming so sought-after. For one, Google is now constantly changing its algorithms, from one search query to the next. It doesn’t take a rocket scientist to know that this doesn’t bode well for SEO tools that run off of a static set of pre-programmed rules.

On the flipside, these new search engine models can actually be used to identify what the changes are statistically, to learn the behavior and characteristics of each search engine environment. The models can then be used to review why your rankings shifted: was it on-page, off-page, or a mixture of both? Make an optimization on your site, and rerun the model. You can instantly see if that change will statistically be a positive or negative move.

I asked Stouffer to give me a concrete example. Let’s say you see a major shift in rankings for a particular search result. These search engine modeling tools start with what Stouffer calls a “standard model.” (Think of this as a generic search engine that has been regression-tested to be a “best fit,” with adjustable weightings for each algorithmic family.) This standard model is then run through a process called Particle Swarm Optimization, which locates a stable mixture of algorithmic weightings that will produce search results similar to the real thing.

Here’s the catch: If you do this before and after each algorithmic shift, you can measure the settings of the models between the two. Stouffer says the SEO teams that invest in Market Brew technology do this to determine what Google has done with its algorithm: for instance, did it put more emphasis on title tags, backlinks, structured data and so on?
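
This is not Market Brew's actual system, but the general idea can be sketched in a few lines: given factor scores for a handful of pages and the ranking order we observe in the SERP, use particle swarm optimization to find factor weightings that reproduce that order, then re-fit after an algorithm shift and compare the two weight vectors. All of the data below is invented for illustration.

```python
import random

random.seed(0)

# Toy data: each page scored on three factor families
# (title relevance, backlinks, page speed), all made up.
pages = {
    "page_a": [0.9, 0.4, 0.7],
    "page_b": [0.5, 0.9, 0.6],
    "page_c": [0.7, 0.6, 0.3],
}
observed_order = ["page_b", "page_a", "page_c"]   # the ranking we see in the SERP

def predicted_order(weights):
    score = lambda factors: sum(w * x for w, x in zip(weights, factors))
    return sorted(pages, key=lambda p: score(pages[p]), reverse=True)

def loss(weights):
    pred = predicted_order(weights)
    # How far each page lands from its observed position (0 = perfect match).
    return sum(abs(pred.index(p) - observed_order.index(p)) for p in pages)

# Minimal particle swarm optimization over the three weights.
DIM, SWARM, STEPS = 3, 20, 200
particles = [[random.random() for _ in range(DIM)] for _ in range(SWARM)]
velocities = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in particles]
gbest = min(pbest, key=loss)

for _ in range(STEPS):
    for i, p in enumerate(particles):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (0.7 * velocities[i][d]
                                + 1.5 * r1 * (pbest[i][d] - p[d])
                                + 1.5 * r2 * (gbest[d] - p[d]))
            p[d] = min(1.0, max(0.0, p[d] + velocities[i][d]))
        if loss(p) < loss(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest, key=loss)

print("Fitted weights:", [round(w, 2) for w in gbest], "loss:", loss(gbest))
```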

Suffice it to say, there are some really smart people in this industry who are quickly returning the runaway algorithm back to the road.

Chris Dreyer of Rankings.io put it best:

I envision SEO becoming far more technical than it is today. If you think about it, in the beginning, it was super easy to rank well in search. The tactics were extremely straightforward (i.e., keywords in a meta tag, any link placed anywhere from any other website helped, etc.). Fast forward just a decade and SEO has already become much more advanced because search algorithms have become more advanced. As search engines move closer to the realistic human analysis of websites (and beyond), SEOs will have to adapt. We will have to understand how AI works in order to optimize sites to rank well.

As far as Google goes, the hiring of Sullivan should be a very interesting twist to follow. Will Google try to reconcile the highly technical nature of its new AI-based search engine, or will it be more of the same: generic information intended to keep these new technologists at bay and keep Google’s top revenue source safe?

Can these new search engine modeling technologies usher in a new understanding of Google? Will the old guard of SEO embrace these new technologies, or is there a seismic shift underway, led by engineers and data scientists, not marketers?

The next decade will certainly be an interesting one for SEO.

Source: Has AI changed the SEO industry for better or worse?