I believe artificial intelligence (AI) will be a key driver of change in PPC in 2018 as it leads to more and better PPC intelligence.
So far, I’ve discussed the roles humans will play when PPC management becomes nearly fully automated and six strategies agencies can take to future-proof their business. In this final post on the state of AI in PPC, I’ll cover the technology of AI.
Why AI took years to matter to PPC
AI has been around since 1956, and PPC has existed since the late 1990s. So why did it take until now for AI’s role in paid search to become such a hot topic in our industry?
It’s because we’ve recently hit an inflection point where, due to the exponential nature of technological advances, we’re now seeing improvements that used to take years happen in weeks.
What’s driving this is the exponential growth explained by Moore’s Law, the principle that computing power doubles approximately every 18 months. The outcome of exponential growth is hard for humans to grasp, so let me give an example that doesn’t involve computing speeds since those can be a bit too conceptual. Instead, let’s apply this doubling of speed to cars, where we can more easily understand how it impacts the distances we travel and how quickly we get somewhere.
Imagine if the first car, invented by Karl Benz in 1885 with a top speed of about 10 mph, had doubled its speed every 18 months. In 1885, we could have driven that car across a typical town in an hour. After doubling its speed 27 times (the same number of times the microchip has doubled its speed since it was invented), we could have reached the sun in about 4 minutes. And less than 18 months later, it would take just about 2 hours to travel to Neptune, the farthest planet in our solar system. (Voyager 2 made that same trip in about 12 years.)
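The arithmetic behind this thought experiment is easy to verify (the distances are rough averages, so the travel times are approximate):

```python
# Sanity-check the exponential-car thought experiment.
TOP_SPEED_MPH = 10   # Karl Benz's 1885 car
DOUBLINGS = 27       # roughly how many times the microchip has doubled

speed_after_27 = TOP_SPEED_MPH * 2 ** DOUBLINGS  # ~1.34 billion mph

SUN_MILES = 93_000_000          # average Earth-sun distance
NEPTUNE_MILES = 2_800_000_000   # rough Earth-Neptune distance

minutes_to_sun = SUN_MILES / speed_after_27 * 60
hours_to_neptune = NEPTUNE_MILES / speed_after_27

print(f"Speed after 27 doublings: {speed_after_27:,} mph")
print(f"To the sun: ~{minutes_to_sun:.1f} minutes")
print(f"To Neptune: ~{hours_to_neptune:.1f} hours")
```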
Because computing speed has already doubled 27 times, every extra doubling leads to new capabilities that are beyond imagination.
What exponential growth means for PPC
So, if we’ve reached the point of PPC automation today where humans and computers are about equally good, consider that the pace of technological improvement makes it possible for the machines to leave humans in the dust later this year. That’s why it’s worth thinking about the roles humans will play in the future of PPC.
And just like the first car is not the right vehicle for a flight to Neptune, the tools you used to manage AdWords a few years ago may no longer be the ones that make sense for managing AdWords today. So let’s take a look at what AI is doing to PPC tools.
The technologies driving PPC intelligence
Just like you want to know what your employees are capable of by interviewing them before hiring them, you should understand a technology’s capabilities (and limits) before adding it to your toolkit. So let’s see how artificial intelligence works in PPC.
PPC intelligence through programmed rules
Before the advent of AI as a research field in 1956, you could make a machine appear “intelligent” by programming it to deliver specific responses to a large number of scenarios. But that form of AI is very limited because it can’t deal with edge cases, of which there are invariably many in the real world.
In PPC, this would be akin to using Automated Rules to write rules for every possible scenario an account might encounter. Rules are great for covering the majority use cases, but the real world is messy, and trying to write rules for every scenario is simply impossible.
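As an illustration, here is a minimal sketch of what such hard-coded rule logic looks like; the function name, target CPA and percentages are all made up. Note how the "no conversions yet" edge case defeats the rules — they can't tell a bad keyword from a new one:

```python
# A minimal sketch of rule-based bid management (all names and numbers hypothetical).
def rule_based_bid(bid, cost, conversions, target_cpa=50.0):
    """Hard-coded rules: nudge the bid based on cost per acquisition (CPA)."""
    if conversions == 0:
        # Edge case the rules never anticipated: no conversions yet.
        # Is the keyword bad, or just new? A static rule can't tell.
        return bid  # punt: leave the bid unchanged
    cpa = cost / conversions
    if cpa > target_cpa:
        return round(bid * 0.9, 2)   # conversions too expensive: bid down 10%
    if cpa < target_cpa * 0.8:
        return round(bid * 1.1, 2)   # cheap conversions: bid up 10%
    return bid

print(rule_based_bid(2.00, 300.0, 4))   # CPA $75 > $50 target
print(rule_based_bid(2.00, 300.0, 10))  # CPA $30, comfortably under target
print(rule_based_bid(2.00, 300.0, 0))   # the messy edge case
```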
PPC intelligence through symbolic representations
Between the 1950s and 1980s, AI evolved toward symbolic systems that could take heuristic shortcuts the way humans do. By framing problems in human-readable form, it was believed, machines could make logical deductions.
Here’s a PPC problem: you’re adding a new keyword, but you don’t know the right bid to set because there is no historical data for it. By teaching the machine concepts like campaigns and keywords and how these relate to each other, we are providing it with the same heuristics we use to make reasonable guesses.
So the system can now automate bid management and might set a similar bid to other keywords in the campaign because it knows that campaigns tend to have keywords that have something in common.
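That campaign-level heuristic can be sketched in a few lines (the keyword names, bids and fallback default below are all invented for illustration):

```python
# Hypothetical sketch: seed a new keyword's bid from its campaign siblings,
# mimicking the heuristic that keywords in a campaign tend to be related.
from statistics import mean

campaign_bids = {
    "running shoes": 1.40,
    "trail shoes": 1.10,
    "marathon shoes": 1.50,
}

def seed_bid(campaign_bids, default=1.00):
    """No history for the new keyword, so fall back to the campaign average."""
    return round(mean(campaign_bids.values()), 2) if campaign_bids else default

print(seed_bid(campaign_bids))  # average of the sibling bids
```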
PPC intelligence through statistical learning methods
The type of AI that is responsible for a lot of success in PPC today is based on statistics and machine learning to categorize things. Quality Score (QS) is a great example; Google looks at historical click behavior from users and uses machine learning to find correlations that help predict the likelihood of a click or a conversion.
By having a score for how likely it is that each search will translate into a conversion, automated bidding products like those offered inside AdWords can “think” through many more dimensions (like geo-location, hour of day, device, or audience) that might impact the likelihood of a conversion than a person could.
Thanks to the massively increased computing power available today, these systems can also consider interactions across dimensions without getting “overwhelmed” by the combinatorial nature of the problem.
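As a toy illustration of this statistical approach (not Google's actual system), here is a tiny logistic regression, trained on synthetic click data, that learns to score conversion likelihood across two made-up dimensions:

```python
# Toy illustration of statistical learning in PPC: fit a logistic regression
# on historical clicks so it can score the conversion likelihood of new ones.
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic click log: features are [bias, is_mobile, is_evening];
# the hidden truth is that mobile and evening clicks convert more often.
clicks = []
for _ in range(500):
    mobile, evening = random.random() < 0.5, random.random() < 0.5
    p_true = 0.05 + 0.10 * mobile + 0.15 * evening
    clicks.append(([1.0, float(mobile), float(evening)], random.random() < p_true))

# Full-batch gradient descent on the log-loss.
w = [0.0, 0.0, 0.0]
for _ in range(1000):
    grad = [0.0, 0.0, 0.0]
    for x, converted in clicks:
        err = sigmoid(sum(wi * xi for wi, xi in zip(w, x))) - converted
        for i, xi in enumerate(x):
            grad[i] += err * xi
    w = [wi - 0.5 * g / len(clicks) for wi, g in zip(w, grad)]

def p_convert(is_mobile, is_evening):
    return sigmoid(w[0] + w[1] * is_mobile + w[2] * is_evening)

print(f"desktop, daytime: {p_convert(0, 0):.3f}")
print(f"mobile, evening:  {p_convert(1, 1):.3f}")
```

The model has never been told "mobile converts better"; it finds that correlation in the data, which is exactly what lets automated bidding weigh many dimensions at once.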
What’s next for artificial intelligence
AI systems getting a lot of attention today, like AlphaGo Zero, are no longer dependent on structured data and can become “intelligent” without being “constrained by the limits of human knowledge,” as explained by DeepMind CEO Demis Hassabis.
The team created the AlphaZero algorithm using reinforcement learning so that it could learn to master games beyond Go. They claimed that by the end of 2017, this algorithm had learned to beat humans at other games like chess and shogi in less than a day of training — a huge leap forward in AI.
Reinforcement learning uses massive computing power to run lots of simulations until it starts to recognize actions that lead to desirable outcomes. It can be applied to games because there is a clear outcome of “winning” or “losing.” When Google figures out what it means to win or lose in the game of AdWords, I bet we’ll see a huge acceleration in improvements of their automation tools.
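To make the simulate-until-you-learn idea concrete, here is a hedged sketch: an epsilon-greedy bandit that plays a made-up "bidding game" thousands of times and gradually discovers which bid pays off. (The bids and payoffs are invented, and real reinforcement learning systems are far more elaborate.)

```python
# Sketch of learning by simulation: run many rounds of a hypothetical game
# (picking a bid level) and discover which action "wins" (earns most profit).
import random

random.seed(42)

BIDS = [0.5, 1.0, 2.0, 4.0]                              # candidate actions
TRUE_PROFIT = {0.5: 0.2, 1.0: 0.8, 2.0: 1.1, 4.0: 0.3}   # hidden payoffs

counts = {b: 0 for b in BIDS}
value = {b: 0.0 for b in BIDS}   # running average reward per bid

for _ in range(5000):
    # Epsilon-greedy: mostly exploit the best-known bid, sometimes explore.
    if random.random() < 0.1:
        bid = random.choice(BIDS)
    else:
        bid = max(BIDS, key=lambda b: value[b])
    reward = TRUE_PROFIT[bid] + random.gauss(0, 0.5)  # noisy outcome
    counts[bid] += 1
    value[bid] += (reward - value[bid]) / counts[bid]

print("learned best bid:", max(BIDS, key=lambda b: value[b]))
```

The agent is never told the payoff table; it reconstructs it from thousands of noisy simulated rounds, which is why a clear definition of "winning" matters so much.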
Build your own PPC intelligence
There are a lot of tools available to automate your PPC work, and multiple third-party vendors are starting to use AI and ML to provide stronger recommendations. But there are also many free tools from AdWords that are getting better every day thanks to advances in AI, like Portfolio Bid Strategies, Custom Intent Audiences, optimized ad rotation, etc.
For those willing to invest in connecting their own business data to AdWords and AI, I’m a big fan of prototyping solutions with AdWords Scripts because they provide a lot of customizability without requiring a lot of engineering resources. Unfortunately, simple scripts you write will fall into the weakest category of AI, where PPC intelligence is achieved through hard-coded rules.
But when you get a bit more advanced in your scripting abilities, you can use Google Cloud Machine Learning Engine to start enhancing your own automations with modern machine learning techniques.
The benefit of an out-of-the-box solution like this is that you don’t need to learn many different types of models. But that’s also the downside, because you won’t get total control over how you set the criteria and thresholds needed to produce usable results. Our team at Optmyzr tried several ready-made systems but eventually decided that we needed more power — so we’re building our own AI.
I believe there are three pillars for being a successful PPC marketer in a world where AI takes over, and I’ve now touched on each pillar in my recent posts:
- Be ready for the new roles humans will play.
- Have a plan for your business, and especially focus on having the best process for leveraging AI.
- Understand the technology so you can spot opportunities faster.
Over the coming months, I will share my own experiences with AI so advertisers ready to take the plunge will have a better understanding of what is involved in building successful companies that leverage the latest state of the art in technology, computation, and statistics.
Columnist Jayson DeMers explores the impact of Google’s shift toward machine learning and discusses what the future will look like for search professionals.
With Google turning to artificial intelligence to power its flagship search engine business, has the SEO industry been left in the dust? The old ways of testing and measuring are becoming antiquated, and industry insiders are scrambling to understand something new — something which is more advanced than their backgrounds typically permit.
The fact is, even Google engineers are having a hard time explaining how Google works anymore. With this in mind, is artificial intelligence changing the SEO industry for better or worse? And has Google’s once-understood algorithm become a “runaway algorithm?”
Who was in the driver’s seat?
The old days of Google were much simpler times. Artificial intelligence may have existed back then, but it was used for very narrow issues, like spam filters on Gmail. Google engineers spent most of their time writing preprogrammed “rules” that worked to continuously close the loopholes in their search engine — loopholes that let brands, with the help of SEO professionals, take advantage of a static set of rules that could be identified and then exploited.
However, this struck at the heart of Google’s primary business model: the pay-per-click (PPC) ad business. The easier it was to rank “organically” (in Google’s natural, unpaid rankings), the fewer paid ads were sold. These two distinctly different parts of their search engine have been, and will always be, at odds with one another.
If you doubt that Google sees its primary business as selling ads on its search engine, you haven’t been watching Google over the past few decades. In fact, almost 20 years after it started, Google’s primary business was still PPC. In 2016, PPC revenues still represented 89 percent of its total revenues.
At first glance, it would stand to reason that Google should do everything it can to make its search results both user-friendly and maintainable. I want to focus on this last part — having a code base that is well documented enough (at least, internally within Google) so that it can be explained to the public, as a textbook of how websites should be structured and how professionals should interact with its search engine.
Going up the hill
Throughout the better part of Google’s history, the company has made efforts to ensure that brands and webmasters understood what was expected of them. In fact, they even had a liaison to the search engine optimization (SEO) world, and his name was Matt Cutts, the head of Google’s Webspam Team.
Cutts would go around the SEO conference circuit and often be the keynote or featured session speaker. Any time Google was changing its algorithms or pushing a new update to its search engine, Cutts would be there to explain what that meant for webmasters.
It was quite the spectacle. In one room, you typically had hundreds of SEOs who were attacking every loophole they could find, every slim advantage they could get their hands on. In the very same room, you had Cutts explaining why those techniques were not going to work in the future and what Google actually recommended.
As time went on and loopholes were closed, Cutts became one of the only sources of hope for SEOs. Google was becoming more sophisticated than ever, and with very few loopholes left to exploit, Cutts’s speaking engagements became crucial for SEOs to review and dissect.
The ‘uh-oh’ moment
And then, the faucet of information slowed to a trickle. Cutts’s speaking engagements became rarer, and his guidelines became more generic. Finally, in 2014, Cutts took a leave from Google. This was a shock to insiders who had built an entire revenue model off of selling access to this information.
Then, the worst news for SEOs: He was being replaced by an unnamed Googler. Why unnamed? Because the role of spokesperson was being phased out. No longer would Google be explaining what brands should be doing with each new update of its search engine.
The more convoluted its search engine algorithms were, the more PPC ads Google sold. As a result of this shift, Google capitalized immensely on PPC ad revenue. It even created “Learn with Google,” a gleaming classroom where SEO conference attendees could learn how to maximize PPC spend.
An article by Search Engine Land columnist Kristine Schachinger about the lack of information on a major algorithmic update, and Google’s flippant response via interim spokesman Gary Illyes, captured the SEO industry’s frustration in a nutshell. What was going on?
Removing the brakes — the switch to an AI-powered search engine
At the same time, Google was experimenting with new machine learning techniques to automate much of the updating process to its search engine. Google’s methodology has always been to automate as much of its technology as it could, and its core search engine was no different.
The pace of Google’s search engine switch to artificial intelligence caught many off-guard. This wasn’t like the 15 years of manual algorithm updates to its index. This felt like a tornado had swept in — and within a few years, it changed the landscape of SEO forever.
The rules were no longer in some blog or speech by Matt Cutts. Here stood a breathtaking question: Were the rules even written down at Google anymore?
Much of the search engine algorithms and their weightings were now controlled by a continuously updating machine-learning system that changed its weightings from one keyword to the next. Marcus Tober, CTO of SearchMetrics, said that “it’s very likely that even Google Engineers don’t know the exact composition of their highly complex algorithm.”
The runaway algorithm
Remember Google’s primary revenue stream? PPC represents almost 90 percent of its business. Once you know that, the rest of the story makes sense.
Did Google know beforehand that the switch to an AI-powered search engine would lead to a system that couldn’t be directly explained? Was it a coincidence that Cutts left the spotlight in 2014, and that the position never really came back? Was it that Google didn’t want to explain things to brands anymore, or that they couldn’t?
By 2017, Google CEO Sundar Pichai began to comment publicly on Google’s foray into artificial intelligence. Bob Griffin, CEO of Ayasdi, wrote recently that Pichai made it clear that there should be no abdication of responsibility associated with intelligent technologies. In other words, there should be no excuse like “The machine did x.”
Griffin put it clearly:
Understanding what the machine is doing is paramount. Transparency is knowing what algorithm was used, which parameters were used in the algorithm and, even, why. Justification is an understanding of what it did, and why in a way that you can explain to a reporter, shareholder, congressional committee or regulator. The difference is material and goes beyond some vague promise of explainable AI.
But Google’s own search engineers were seemingly unable to explain how their own search engine worked anymore. This discrepancy had gotten so bad that in late 2017, Google hired longtime SEO journalist Danny Sullivan in an attempt to reestablish its image of transparency.
But why such a move away from transparency in the first place? Could it be that the move to artificial intelligence, something that went way over the heads of even the most experienced digital marketing executives, was the perfect cover? Was Google simply throwing its proverbial hands up in the air and saying, “It’s just too hard to explain”? Or was Google just caught up in the transition to AI, trying to find a way to explain things like Matt Cutts used to do?
Regardless of Sullivan’s hire, the true revenue drivers meant that this wasn’t a top priority. Google had solved some of the most challenging technical problems in history, and they could easily have attempted to define these new technical challenges for brands, but it simply wasn’t their focus.
And, not surprisingly, after a few years of silence, most of the old guard of SEO had accepted that the faucet of truly transparent communication from Google had been shut off, never to be turned on again.
Everyone is an artificial intelligence expert
Most SEO experts’ backgrounds do not lend themselves very well to understanding this new type of Google search. Why? Most SEO professionals and digital marketing consultants have a marketing background, not a technical background.
When asked “How is AI changing Google?,” most answers from industry thought leaders have been generic: AI really hasn’t changed much; effective SEO still requires the same strategies you’ve always pursued. In some cases, the responses had nothing to do with AI in the first place.
Many SEO professionals who know absolutely nothing about how AI works have been quick to deflect any questions about it. And since very few in the industry had an AI background, the term “artificial intelligence” became something else entirely — just another marketing slogan rather than an actual technology. Some SEO and digital marketing companies even began positioning themselves as the new “artificial intelligence” solution.
The runaway truck ramp?
As with all industries, whenever there’s a huge shift in technology, there tends to be a changing of the guard. A number of highly trained engineers are beginning to make the SEO industry their home, and these more technologically savvy folks are starting to speak out.
And, for every false claim of AI, there are new AI technologies that are starting to become mainstream. And these are not your typical SEO tools and rank trackers.
Competitive industries are now investing heavily in things like genetic algorithms, particle swarm optimization and new approaches that enable advanced SEO teams to model exactly what Google’s RankBrain is attempting to do in each search engine environment.
At the forefront of these technologies is industry veteran and Carnegie Mellon alumnus Scott Stouffer, founder and CTO of MarketBrew.com, who chose to create and patent a statistical search engine modeling tool, based on AI technologies, rather than pursue a position at Google.
Now, 11 years into building his company, Stouffer has said:
There are a number of reasons why search engine modeling technology, after all these years, is just now becoming so sought-after. For one, Google is now constantly changing its algorithms, from one search query to the next. It doesn’t take a rocket scientist to know that this doesn’t bode well for SEO tools that run off of a static set of pre-programmed rules.
On the flipside, these new search engine models can actually be used to identify what the changes are statistically, to learn the behavior and characteristics of each search engine environment. The models can then be used to review why your rankings shifted: was it on-page, off-page, or a mixture of both? Make an optimization on your site, and rerun the model. You can instantly see if that change will statistically be a positive or negative move.
I asked Stouffer to give me a concrete example. Let’s say you see a major shift in rankings for a particular search result. These search engine modeling tools start with what Stouffer calls a “standard model.” (Think of this as a generic search engine that has been regression-tested to be a “best fit,” with adjustable weightings for each algorithmic family.) This standard model is then run through a process called Particle Swarm Optimization, which locates a stable mixture of algorithmic weightings that produces search results similar to the real thing.
Here’s the catch: If you do this before and after each algorithmic shift, you can compare the settings of the two models. Stouffer says the SEO teams that invest in Market Brew technology do this to determine what Google has done with its algorithm: For instance, did it put more emphasis on title tags, backlinks, structured data and so on?
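A stripped-down sketch of that fitting step (an illustration of the general PSO technique, not MarketBrew’s actual code or data): a small swarm searches for the two algorithm weightings whose blended scores best reproduce a set of observed search scores.

```python
# Hypothetical Particle Swarm Optimization run: recover the algorithm
# weightings (here just two: title vs. backlinks) that best reproduce
# an observed set of search scores.
import random

random.seed(1)

# Per-page (title_score, backlink_score) and the observed blended score.
pages = [(0.9, 0.2), (0.4, 0.8), (0.6, 0.6)]
observed = [0.41, 0.68, 0.60]   # secretly generated with weights (0.3, 0.7)

def error(w):
    """Squared error between the model's blended scores and the observed ones."""
    return sum((w[0] * t + w[1] * b - o) ** 2 for (t, b), o in zip(pages, observed))

# Initialize a small swarm of candidate weightings.
particles = [[random.random(), random.random()] for _ in range(20)]
velocity = [[0.0, 0.0] for _ in particles]
pbest = [p[:] for p in particles]          # each particle's best-known position
gbest = min(pbest, key=error)              # the swarm's best-known position

for _ in range(200):
    for i, p in enumerate(particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            # Standard PSO update: inertia + pull toward personal and global bests.
            velocity[i][d] = (0.7 * velocity[i][d]
                              + 1.5 * r1 * (pbest[i][d] - p[d])
                              + 1.5 * r2 * (gbest[d] - p[d]))
            p[d] += velocity[i][d]
        if error(p) < error(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest, key=error)

print(f"recovered weights: {gbest[0]:.2f}, {gbest[1]:.2f}")  # ideally near (0.3, 0.7)
```

Run the fit on scores captured before and after an update, and the difference between the two recovered weight vectors hints at which factors Google reweighted.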
Suffice it to say, there are some really smart people in this industry who are quickly returning the runaway algorithm back to the road.
Chris Dreyer of Rankings.io put it best:
I envision SEO becoming far more technical than it is today. If you think about it, in the beginning, it was super easy to rank well in search. The tactics were extremely straightforward (i.e., keywords in a meta tag, any link placed anywhere from any other website helped, etc.). Fast forward just a decade and SEO has already become much more advanced because search algorithms have become more advanced. As search engines move closer to the realistic human analysis of websites (and beyond), SEOs will have to adapt. We will have to understand how AI works in order to optimize sites to rank well.
As far as Google goes, the hiring of Sullivan should be a very interesting twist to follow. Will Google try to reconcile the highly technical nature of its new AI-based search engine, or will it be more of the same: generic information intended to keep these new technologists at bay and Google’s top revenue source safe?
Can these new search engine modeling technologies usher in a new understanding of Google? Will the old guard of SEO embrace these new technologies, or is there a seismic shift underway, led by engineers and data scientists, not marketers?
The next decade will certainly be an interesting one for SEO.
Thanks to Google, a new artificial intelligence system is outperforming humans in spotting the origins of images.
Google has unveiled a new system to identify where photos are taken. The task, simple when images contain famous landmarks or unique architecture, goes beyond the overt to examine small clues hidden in the pixels.
The program, named PlaNet, uses a deep-learning neural network, which means the more images PlaNet sees, the smarter it gets.
“PlaNet is able to localize 3.6% of the images at street-level accuracy and 10.1% at city-level accuracy. 28.4% of the photos are correctly localized at country level and 48.0% at continent level,” wrote the research team.
That’s still a long way from a reliable level of accuracy – but PlaNet already outperforms even the most well-traveled humans.
To compare PlaNet with human accuracy, the researchers pitted their program against 10 well-traveled people in Geoguessr, a game that presents a random street-view photo and asks players to identify where they believe it was taken.
PlaNet and its human challengers played 50 rounds in total.
“PlaNet won 28 of the 50 rounds with a median localization error of 1131.7 km, while the median human localization error was 2320.75 km,” according to the paper.
Other computer programs are tackling image location as well. Im2GPS has achieved high accuracy by relying on image retrieval to identify location. For example, if Im2GPS were trying to identify where a picture of a forest was taken, it would browse the internet’s millions of forest photos. When it found one that looked almost identical, it would conclude the two were taken in the same place. With enough data, this method can achieve high accuracy, according to the paper.
The researchers trained the neural network using 29.7 million public photos from Google+. The neural network relies on clues and features from photos it has already seen to help identify the most likely whereabouts of a new image.
The program has some limitations. Because it depends on internet images, PlaNet is at a disadvantage when confronted with rural countrysides and other rarely photographed locales. The team also left out large swaths of the Earth, including oceans and the polar caps.
Tobias Weyand, the lead author on the project, noted that supplementing internet photos with satellite images resolved some of these weaknesses. PlaNet also focuses on landscapes and other factors besides landmarks, making it more accurate than other programs at identifying non-city images.