For those of us who can remember life before Google, we might recall search engines like Yahoo! and AltaVista helping us navigate the rapidly growing number of websites online. Back then, search engines ranked web pages based solely on content. In other words, web pages with the most relevant content and matching keywords for a search query were shown first in the search results.
Google drastically changed that approach in the late 1990s by introducing a multifaceted algorithm for selecting relevant pages for a given search. This algorithm is ever-evolving, remains somewhat elusive, and is a main factor in Google's immense popularity worldwide.
Google's algorithm has grown even more complex since the introduction of RankBrain in 2015, an artificial intelligence system that generates and refines ranking algorithms, effectively learning how to better serve Google's 1 billion monthly users. Arguably the most sophisticated search engine technology to date, RankBrain ensures that the underlying algorithms directing Google's search results are changing nearly every day.
So how can we be sure that we’re taking all the right steps for effective SEO if the standards are constantly changing?
The answer lies in the foundation of Google’s approach to page ranking. Despite the rapidly changing details of how Google ranks a page in their search results, there remain a few core elements of their approach that continue to heavily influence page ranking. While there are many resources that professionals can use to keep their websites on Google’s “good side” for SEO, even someone with average skills can improve their website’s SEO amidst this new RankBrain landscape of constant algorithm updates.
One of the most reliable and effective ways to boost your website's page ranking is to increase and improve the links directing users to your website. Known as backlinks (or inbound links), these links have long been a major element of how Google evaluates the quality of your site's content. The logic is simple (even if its algorithms are not): if popular, high-quality websites are linking back to your content, then your website likely has relevant content worth showing to other users online.
Backlinking is an effective SEO strategy for improving your visibility online, but it needs to be done correctly. Beware of online services promising to garner you hundreds of backlinks for $9.99. The key element of Google's backlinking algorithm is that it evaluates the quality of those links. That is, it rewards backlinks from quality websites.
The opposite is also true: websites with backlinks from spammy or low-quality sites are penalized in the search results. Quantity is important, but never at the expense of quality. Not according to Google.
So how does Google define a "quality website"? This was once somewhat vague, but Google clarified its designation of "quality" by releasing its search quality rating guidelines at the end of 2015. These guidelines are what Google's human raters use to evaluate websites. Their evaluations are then compared against the rankings produced by Google's algorithms, and where the two don't match, the algorithms are improved.
It turns out that what creates a “quality website” in the eyes of Google is actually fairly straightforward:
High Quality Content

Pages that contain "a satisfying amount of high quality main content" are considered high quality pages. What counts as high quality depends on the topic. Scientific content is expected to be clear, accurate, detailed, and complete with references. Shopping content, on the other hand, is expected to be briefer but to provide links to relevant products. Suffice it to say that creating high quality content is intentional and takes time.
A Functional, Well-Maintained Website

A website with broken links, images or pages that don't load (or take too long to load), or tools and widgets that aren't working properly will quickly diminish your site's quality rating in the eyes of Google. Its page raters want to see a website that is well-maintained and regularly updated.
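As a sketch of how a site owner might audit a page for broken links before Google's raters find them, here is a minimal Python example using only the standard library. It extracts every `href` and `src` URL from a page's HTML; each extracted URL could then be requested and any error responses flagged. The HTTP check itself is omitted to keep the sketch self-contained, and the function names are illustrative, not part of any Google tooling:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href/src URLs so they can later be checked for dead links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

sample = '<a href="/about">About</a><img src="logo.png">'
print(extract_links(sample))  # ['/about', 'logo.png']
```

From here, each URL could be resolved against the site's base address and fetched periodically as part of routine maintenance; anything returning a 404 or timing out is exactly the kind of defect Google's guidelines penalize.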
A Positive Online Reputation

Google's raters are encouraged to review the online reputation of the business behind each website they evaluate. User reviews in articles, review sites like Yelp, and even statements made in public forums are considered, as well as more formal ratings from places like the BBB. The bottom line is that Google aims to give users of its search engine the best experience with its results, and that includes shielding them from potentially negative interactions with businesses or individuals that have a bad reputation online.
Well Designed Page Layouts
This one is fairly simple. A website with clear, straightforward page organization will be rated higher quality than one without it. The main content should be prominently displayed, and any ads or secondary content should not distract from it.
Helpful Supplementary Information

Google likes to see "About Us" pages, as well as contact information and customer support details, if applicable. Again, this goes back to Google's emphasis on the user's experience on your site and with your services.
Those are some of the most salient elements of a high quality website from Google’s perspective. Of course, there’s much more for SEO experts to comb through as the full report is over 150 pages long.
There's no question that Google revolutionized the way users search for and discover information online. More surprising is how it has influenced the way website developers and owners create and deliver their content. In the vast landscape of online information, sophisticated search tools are imperative. We want our sites to be found, and we want to deliver our content widely. Our page rankings demand that we stay rooted in the fundamentals of SEO as we endeavor to keep up with Google's rapidly changing protocols.