DISCLAIMER: This post is written as a live blog from MozCon. There may be typos and grammar to make my high school English teachers weep. Please excuse those … it’s a fast-paced conference with back-to-back sessions and no time for proofing or even proper writing.
After the final break of the day – and of the event – Rand Fishkin takes the stage to talk onsite SEO in 2015.
Rand asks us to remember a time when the goal was to create perfect, well-optimized pages. He also reminds us of how links gained power, and that SEOs in 2007 had pretty much bought their way to the top.
Even in 2012, Wil Reynolds claimed that Google was making liars out of white hat SEOs.
But now Google is doing a much better job of understanding not just links but language and thematic relevance.
Amit Singhal noted years ago that Google had a bias against machine learning, but by 2012 they were converting on the PPC side. Their SmartASS system uses machine learning to determine ad quality.
In 2013 Matt Cutts announced that machine learning is part of organic as well – that machines are determining factors and functions of the algorithm. This means the underpinnings of rankings are changing.
Rand brings up the gorilla labeling incident from a week prior to this writing and the racism it implied. That’s machine learning at its worst.
Required reading: Jeff Dean’s slides on deep learning. I’ll be going through them on my way home.
Machine learning in search could work like …
Training Data – a good set of search results.
Compare with …
Bad Data – a bad set of results
Then let the machines figure out the difference.
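To make the idea concrete, here’s a toy sketch (in no way Google’s actual system) of letting a machine figure out the difference: given feature vectors for “good” and “bad” results, a simple perceptron learns a weighting that separates them – the machine, not a human, decides which features matter. The features and numbers below are invented for illustration.

```python
def train_perceptron(good, bad, epochs=100, lr=0.1):
    """Learn weights separating good (label +1) from bad (label -1) results."""
    n = len(good[0])
    w = [0.0] * n
    b = 0.0
    data = [(x, 1) for x in good] + [(x, -1) for x in bad]
    for _ in range(epochs):
        for x, y in data:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:  # misclassified -> nudge weights toward the label
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    """Classify a result: +1 looks like the good set, -1 like the bad set."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Hypothetical features per result: [click-through rate, minutes on page]
good_results = [[0.30, 3.0], [0.25, 4.0], [0.40, 2.5]]
bad_results  = [[0.05, 0.5], [0.10, 0.3], [0.08, 1.0]]

w, b = train_perceptron(good_results, bad_results)
print(predict(w, b, [0.35, 3.5]))  # a high-engagement result -> 1
```

The point of the sketch: nobody hand-writes the rule “high CTR plus long dwell is good” – it falls out of the training data.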
What Does Deep Learning Mean?
It means Google won’t know what’s causing a ranking. The only things that matter are the query success metrics (CTR, share amplification, user engagement, etc.).
So we’ll be ranking less by inputs (links, etc.) and more by searcher outputs (low bounce rate, high CTR, etc.), essentially turning SEO on its head.
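To make those searcher outputs concrete, here’s a minimal sketch of how query-success metrics might be computed from a search log. The record fields and the 30-second long-click threshold are illustrative assumptions, not any real API or known cutoff.

```python
def query_success_metrics(records, long_click_threshold=30):
    """Compute CTR and long-click rate from hypothetical impression records."""
    impressions = len(records)
    clicks = [r for r in records if r["clicked"]]
    long_clicks = [r for r in clicks if r["dwell_seconds"] >= long_click_threshold]
    ctr = len(clicks) / impressions if impressions else 0.0
    long_click_rate = len(long_clicks) / len(clicks) if clicks else 0.0
    return {"ctr": round(ctr, 2), "long_click_rate": round(long_click_rate, 2)}

# One result's impressions: did the searcher click, and how long did
# they stay before coming back to the SERP?
log = [
    {"clicked": True,  "dwell_seconds": 95},   # long click: task likely done
    {"clicked": True,  "dwell_seconds": 4},    # short click: pogo-sticking
    {"clicked": False, "dwell_seconds": 0},
    {"clicked": True,  "dwell_seconds": 240},
]
print(query_success_metrics(log))  # {'ctr': 0.75, 'long_click_rate': 0.67}
```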
But What About Today?
To illustrate, Rand shows us a test from 2014. During MozCon 2014 the group pushed a result from #7 to #1 in just a few hours with clicks.
Google’s response was that this wouldn’t make sense – the times it worked must have been a fluke.
So they tried short and long clicks.
Rand requests that people short-click one result and long-click another. 70 minutes later the long-click site went to #1. Time on page matters. 12 hours later it fell to #13, then climbed back to #4 about an hour later.
FYI – this is hard to replicate. It took 600 real searchers.
We need to optimize for two algorithms, and we need to optimize for the future – that is, a user-driven one.
We need to pay attention to the inputs of old (typical onsite, snippet optimization, UX, quality, keywords, etc.) and the new onsite (CTR, short vs long clicks, amplification and loyalty, task completion, etc.)
#1 – CTR
Optimization of the title and description will skew more to clicks and less to keywords. Of course, the keywords will help make the result seem more valid but they won’t be the critical element – more the means to the end.
Fresh content indicators, domain and other factors will all count.
Fortunately this means we need to focus on … wait for it … visitors.
If something doesn’t work the first time, we can try again later with a different take on the subject.
#2 – Engagement
Increase engagement by:
- Fulfilling conscious and unconscious needs
- Delivering a positive UX
- Compelling visitors to visit more pages
- Avoiding features users hate
#3 – Fulfilling Gaps in Knowledge
Machine learning may note that the prevalence of certain words, phrases, and topics predicts more successful pages. Talking about NYC but not the Bronx (for example) may indicate thin content on the city.
The Moz team is working on a tool to model this. Until it launches, Alchemy API and MonkeyLearn both do this.
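A naive sketch of the gap-detection idea, using Rand’s NYC example: compare the terms a page actually uses against subtopics that tend to co-occur in content on the theme. The subtopic list here is invented for illustration – real tools like the ones above derive it statistically rather than from a hand-written list.

```python
import re

def topical_gaps(page_text, expected_subtopics):
    """Return expected subtopic terms that never appear in the page."""
    words = set(re.findall(r"[a-z]+", page_text.lower()))
    return [t for t in expected_subtopics if t not in words]

# Hypothetical subtopics that strong NYC content tends to cover
nyc_subtopics = ["manhattan", "brooklyn", "queens", "bronx", "harlem"]

page = """New York City guide: where to stay in Manhattan, what to eat
in Brooklyn, and a day trip through Queens."""

print(topical_gaps(page, nyc_subtopics))  # ['bronx', 'harlem']
```

Missing terms don’t automatically mean thin content, but they flag where a page may be lighter than what already ranks.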
#4 – Earning More Shares & Links
Google denies using social signals directly, and Rand believes them, but sites ranking quickly with little more than social traction make SEOs suspicious.
Rand suspects that Google isn’t using the share itself but rather the engagement it drives as a more legitimate metric.
Either way it doesn’t matter in the real world – what matters is that there appears to be a correlation.
So raw shares and links may be great metrics, but the question of what your competitors are doing is key. And it’s not just the number of shares – it’s what visitors do on your site.
If we know what the audience and influencers share, then we’ll rank. This isn’t just great content … this is 10x content. The question isn’t how to make content as good as the top-ranking stuff; it’s how to create content 10x better.
Only new SEO will do this, the old strategies will not.
#5 – Fulfilling The Searcher’s Task
Google wants to get users the information they need to complete their task faster. So if a result solves the user’s need, it will impact results.
The old algorithm is still valid and you can do very well with it but the future is user interactions.
Rand calls this terrible advice: “Make pages for people, not engines.”
He asserts that we need to take both into account.
10x content – bit.ly/10Xcontent
Topical modeling – bit.ly/monkeylearnseo
And we’re running a click test to see how things go. I’ll keep you posted.