There is so much to search engine optimization that any time you ponder the subject or want to write about it, you have to ask yourself: which piece should I address?
Well, in this post, I wish to address how to keep your site fresh in Google’s eyes and how to stay on the bright side of life. I’m not talking Monty Python here, but I do have the song going through my head as I write this.
Watch the video, and then continue reading.
Let’s Ponder the Evolution of Search Engine Optimization
So, what is search engine optimization if not analyzing your site and your pages to make sure you are within the guidelines set by the most popular search engine, Google? You don’t want to be in violation of their many rules, which dictate whether your site ranks well or gets penalized and either slapped or blacklisted. That’s the heart of what I am getting at here.
You want to stay on Google’s good side.
You want them to see your site as useful so that it ends up high in the SERPs and then they give you a star and serve you some pie.
Okay, maybe they’ll just give you some Kit Kat… That was a Google joke. You have to know a bit more about them to get it. As of this article, the current Android version is “KitKat”, and Google builds Android on the Linux kernel… but I digress…
Google Search Algorithm Updates
So what happened in 2013?
Google’s search algorithm updates named Panda and Penguin each saw further minor releases: Panda was introduced first, then Penguin, and each has gone through its various refreshes.
Google also rolled out a brand-new algorithm update, Hummingbird.
What did all this do? It took many sites that were gaming the system and getting high search rankings through what Google considers fraudulent methods, and either slapped or deindexed them, depending on how badly they were violating the algorithm. The algorithm rolls in a whole bunch of factors. Some of these deal with where a site’s backlinks come from: whether the site is paying for links, whether the links look natural enough, etc.
That sorta thing.
There are other things too. Is the site useful? What’s the visitor bounce rate? Are people sharing it? Are they spending lots of time there and are they repeat visitors? Is the actual content good? How is the grammar and spelling on the site? (within reason… I guess they allow for some natural writing and slang)
One of the big things was that Google went to encrypted search, which no longer supplies sites with keyword data for incoming visitors. This was a big blow to SEO, and many SEOs now consider their careers dead in the water with so little data at their fingertips.
Google doesn’t care what SEOs think. They don’t care what webmasters think.
Google wants its search to provide the best possible results and user experience, so they are on a crusade to weed out what they call useless sites. They are on a mission to eliminate web spam, gray hat tactics that shade into black hat, unnatural linking, irrelevant backlinking, and that sort of thing.
Webmasters must now think ahead and anticipate where Google is going and what is likely to keep their sites on the bright side of life.
What is search engine optimization if not this? You are optimizing your site to do well and learn what it takes to keep it safe.
Some of the qualities of a good site are:
- Regularly published content – fresh content released regularly.
- Content that your target niche finds useful.
- Content that gets shared socially on social networks – sharing means caring, and if they share it logic dictates that they must like it and that it is therefore useful and of high quality.
When factors like these come together, the author of the content gains authority, and so does the entire body of work. This is what you want.
Social Butterflies – Don’t be a Caterpillar
Having your content shared on at least seven quality, popular social networks will be good for both the content and the content creator.
When people share the content and share it across popular social networks, those social signals are going to matter more and more in the future. It’s not just back links anymore but how real people are using the content and approving of it.
Also, when content is shared this way it can generate buzz and two-way communication, so Google expects to see questions and comments answered, with conversation flowing in both directions.
Google didn’t spend money and resources creating Google+ for nothing. Between it being their social network and being tied to other Google services, you had better believe it’s important and matters for search ranking, at least in Google’s search engine. The plus-one signals add up, and it’s tied to Google Authorship, so get that in order.
Sites are going to need reasonable mobile versions, since the mobile landscape is exploding. Everything from smartphones to tablets and other devices will keep coming online, and many people may switch away from their computers to these devices. If your site is not accessible and easily navigable from them, it will suffer; for example, it may not show up in the mobile SERPs, and do you really want that to happen?
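One rough first check of mobile readiness is whether a page even declares a responsive viewport meta tag. This is a little sketch of my own (the `ViewportChecker` class and the sample page are made up for illustration; it is not any official Google test), using only Python’s standard library:

```python
from html.parser import HTMLParser


class ViewportChecker(HTMLParser):
    """Scans HTML for a <meta name="viewport"> tag, a basic
    signal that a page is styled for mobile screens."""

    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True


# Hypothetical sample page for demonstration.
page = """<html><head>
<meta name="viewport" content="width=device-width, initial-scale=1">
</head><body>Hello</body></html>"""

checker = ViewportChecker()
checker.feed(page)
print(checker.has_viewport)  # True
```

Passing this check obviously doesn’t make a site mobile-friendly by itself, but failing it is a quick sign the mobile version needs attention.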
Article Length Starts to Matter
Some studies show that massive 2,000+ word articles do well in the rankings, and the trend is definitely up: lengthy, well-written, well-structured articles matter and tend to do well.
The only possible caveat is that this may conflict with the mobile strategy, where mobile search customers may not want huge articles on their devices. So does this mean a large article for computers and an abridged version for mobile devices? Only time will tell, but it pays to ponder this.
All this being said, I still see some very thin articles and sites on page 1, but this may really start to change as the algorithm evolves.
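If you want to see where your own drafts fall relative to that 2,000-word figure, a quick word count is easy to script. This is just a sketch; the helper names are mine, and the threshold is simply the number discussed above, not an official cutoff:

```python
import re

# The 2,000-word figure from the studies mentioned above;
# an assumption for illustration, not a Google-published rule.
LONG_FORM_THRESHOLD = 2000


def word_count(text: str) -> int:
    """Count word tokens (runs of letters/digits) in plain text."""
    return len(re.findall(r"\b\w+\b", text))


def classify_article(text: str) -> str:
    """Label a draft 'long-form' or 'short' against the threshold."""
    if word_count(text) >= LONG_FORM_THRESHOLD:
        return "long-form"
    return "short"


draft = "word " * 2500  # stand-in for an article body
print(word_count(draft))        # 2500
print(classify_article(draft))  # long-form
```

A count like this says nothing about quality, of course; it only tells you which side of the long-form line a piece sits on.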
Search engine page ranking will depend more and more on the “stuff” inside the search engine ranking algorithm. Panda and then Penguin got this new ball rolling, and it currently rolls with Hummingbird, but that is not where it will stop. It will never stop, and the Big G will keep evolving it using their bottomless well of genius computer-scientist minds coupled with their distributed computing model, which is essentially a wide-area supercomputer.
If you think that you are smarter than that bunch then you really ought to rethink that position.