Search has become the dominant method by which people find what they’re looking for online. People misunderstand the web. They think that having a website, social media account or any kind of presence online automatically means everyone is looking at it. In reality, you could put up the snazziest, most interesting website ever, and it would be akin to building a beautiful billboard in the middle of a remote forest.
True, there are a ton of factors that influence how well-known a brand is online, but having findable content is a cornerstone of web marketing. It is estimated that more than 70% of Internet users rely on search to find what they’re looking for. If you can’t be found there, you probably aren’t relevant in their decision-making process.
So, What Is Google’s Definition of “High-Quality Content”?
While the company does not disclose the ranking signals used in its algorithms, several official sources do outline what Google views as high-quality web content. The following factors can help content marketers distinguish between high-quality articles and low-quality articles.
The future of search isn’t going to be about how articles are constructed or how many links webmasters can point at their pages. It’s going to be about high-quality, well-researched, useful content that people like. In a way, Google is attempting to shed the unprofessional side of the web and surface only those shining gems of content that are worthy of a mainstream audience. According to Google’s guidance, high-quality content should:
- Be written by an expert in the field
- Be well researched and comprehensive
- Be useful to the person looking for it
- Be worthy of publication in books, journals, encyclopedias or other traditional publications
- Be so valuable people can’t believe it’s free
- Be well-rounded
- Be free from error
The impact of a quality piece of content can be diluted by external factors such as:
- Excessive advertising on the page
- Keyword stuffing (overusing even relevant keywords)
- Improper use of pop-ups
- Factual errors
All of this speaks to the notion that, above all else, online publishers and content creators should be thinking of their users and how to help them. Optimization is great and necessary, but it should not be the core focus.
Yes, writing great stuff is paramount, but there are also some technical and structural components that can’t be ignored. Search engine spiders still have a method by which they find and index content. What’s more, humans still need to know they are in the right place online. With these things being true, the following best practices still apply to your carefully crafted pages.
Keywords in your page title add to the overall relevance of the page. They also show up in search for users and help them determine if a listing in a SERP is relevant to their query. You should front-load your title tags, placing your target keyword for the page first, followed by either a variant or your brand name.
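As a sketch (the domain, keyword, and brand name here are hypothetical), a front-loaded title tag might look like this:

```html
<head>
  <!-- Target keyword first, brand name last -->
  <title>Content Marketing Strategy | Example Co.</title>
</head>
```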
This is your one chance to market your web page in search. Meta descriptions don’t affect rankings, but they do influence people. Users scan SERPs from left to right, reading titles and snippets. Your target keyword for the page should also appear here, placed near the beginning of the snippet. More importantly, your snippet should be enticing enough to get people to click through to your webpage.
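A meta description following that pattern (keyword near the front, copy written to earn the click; the wording is only an illustration) could look like:

```html
<meta name="description" content="Content marketing strategy, explained step by step:
how to plan, write and promote articles that actually get read.">
```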
Search spiders crawl web pages to interpret the content contained on them. Ideally, one of your headings should contain your target keyword for the page, and others might contain variants. The content underneath those headers should be about the keyword and about what the header alludes to.
Headers also help human users scan web pages. Huge blocks of text cause anxiety, making it much harder to decipher where relevant information is located on the page.
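Putting those two points together, a page’s heading structure might be sketched like this (topic and wording are hypothetical):

```html
<h1>Content Marketing Strategy</h1>

<h2>Why a Content Marketing Strategy Matters</h2>
<p>Copy under each header stays on the topic the header promises…</p>

<h2>Building Your First Editorial Calendar</h2>
<p>A keyword variant in the header, supporting copy beneath it…</p>
```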
Images play an important role in contributing to the overall relevance of a web page. They contain components such as their file names as well as their alt attributes that indicate to search engines what they are about.
Be careful not to abuse these. Over-optimization penalties can result when webmasters stuff irrelevant keywords into alt attributes.
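A descriptive (not stuffed) image tag might look like the following; the file name and alt text are illustrative:

```html
<!-- Descriptive file name plus an alt attribute that says what the image shows -->
<img src="/images/content-marketing-funnel.png"
     alt="Diagram of a content marketing funnel">
```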
Your site should have an overarching keyword strategy. A list of keywords that relate to the content of your site should be generated. Once you have keywords selected that you would like to rank for in search, target pages where they can be used. Ideally, each page should have its own “theme” and be about a specific topic.
The keyword strategy should be reflected in different components of your site’s pages. For instance, a page’s target keyword phrase should appear in the title, headers, meta description, copy, images, and in links pointing at the page from outside sources.
Google’s PageSpeed Insights is a good tool for measuring the speed of pages. Google also provides tips on how to fix the issues it finds.
Keywords should be present in the URLs of relevant pages. Like other factors, this element alone isn’t going to make or break your ability to rank, but it is one more small piece of the puzzle. Beyond ranking, URL paths and file names also have a user experience component.
Google put out a webmaster video discussing the user experience implications of keywords in URL paths and file names. This may matter most to educated users who know search rankings can be manipulated; to them, a URL whose file name is just a target keyword can look spammy.
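To illustrate the contrast (all of these URLs are made up), compare:

```text
Readable, keyword-relevant:  https://www.example.com/blog/content-marketing-strategy
Opaque to humans:            https://www.example.com/p?id=48291
Spammy-looking:              https://www.example.com/content-marketing-content-marketing-tips.html
```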
You might not be able to pronounce it, but you should configure your URLs with canonicalization in mind. Even though some URLs might seem interchangeable (e.g. www.example.com vs. example.com), they are, in fact, different locations. So, what does “canonical” mean as it relates to search? Simply put, a canonical URL is the one a webmaster has indicated as the preferred version to index.
Identifying canonical pages on your website helps you avoid duplicate-content issues (which run the risk of pages not being indexed) and keeps legitimate URL variations from being excluded from the index altogether. Google has an excellent post on proper use of the rel=canonical tag.
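In practice, the tag goes in the head of every variant of a page and points at the preferred URL (example.com is a stand-in):

```html
<!-- Placed on variants such as http://example.com/page?ref=promo,
     this tells search engines which URL to index -->
<link rel="canonical" href="https://www.example.com/page">
```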
You’ll hear on-site and on-page used interchangeably. While there may be other definitions you’ve heard, to me, on-page means factors that have to do with pages, while on-site means factors that impact all the pages of a site as a whole. The header says “considerations” because there are some things you can do to a website as a whole that may not make it rank any better, but still affect how and when it appears in search.
There are a lot of misconceptions about the robots.txt file. I’ve seen people say it’s required, and I’ve seen people write about how not having one means you won’t rank well or won’t be indexed in search.
So should you have one at all? According to Google’s Matt Cutts, it’s good to have one that is configured very specifically, just so nothing is left to chance. If you haven’t had one for some time, chances are pretty low that it has negatively impacted your site. The risk is that a web host may serve something on your behalf when Googlebot (or another spider) goes looking for the file. In other words, Google is going to request the file either way – would you rather tell it what to do or leave that to chance?
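A minimal, explicit robots.txt along those lines might read (the disallowed directory is hypothetical):

```text
# Served from https://www.example.com/robots.txt
# Explicitly allow everything except one private directory,
# and advertise where the sitemap lives
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
```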
Rel and Meta No Follow
Search engines began recognizing the nofollow attribute in 2005. Google and other search providers stated that they would not pass PageRank (or its equivalent) to target pages when rel=nofollow was present on a link. Before that, only the page-level robots meta tag was available; it instructed search engines to ignore all links on the page where it appeared.
No-follow plays a couple of important roles. It keeps sites protected from other domains that may be spammy or untrustworthy in nature. It also helps webmasters shape the flow of PageRank through their websites.
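Both forms look like this in markup (the linked domain is a placeholder):

```html
<!-- Per-link: don't pass PageRank to this one destination -->
<a href="https://untrusted.example.net/" rel="nofollow">a link I can’t vouch for</a>

<!-- Page-level (the older, blunter approach): ignore every link on this page -->
<meta name="robots" content="nofollow">
```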
Matt Cutts wrote a good (but dated) post on the topic of PageRank flow. While the content is old, the concepts are pretty much still the same. There is also a disclaimer (written at the time of posting) stating that the company has become far more advanced in its ability to identify and analyze links of value.
A sitemap is important for indexability of your site in search. It’s essentially a road map for Google and other search engines to follow when indexing your site. Typically, search engines will find new pages and add them to their index without you having to do anything.
When you submit a sitemap, though, you can ensure that all of your URLs get found quickly. Sitemaps also provide a ton of other useful information about your pages to Google. For example, you can include markup about certain content or let search engines know how often a resource changes. Google has good documentation on when sitemaps are necessary.
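A bare-bones XML sitemap entry, per the sitemaps.org protocol, looks like this (URL and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/content-marketing-strategy</loc>
    <lastmod>2014-03-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```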
The linking strategy and architecture of your site play an important role in its search performance. Mainly, they help establish a hierarchy on your website. For instance, when building content silos, you can do so through a directory structure or through the way pages are linked together.
You should have clear information architecture on your site where there is one main theme and several sub-themes. The main page (typically the home page) should be linked to sub-pages, which, in turn, link to other sub-pages. If you are interested in learning more about content silos, Bruce Clay is an authority on the topic.
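A silo built through directory structure might be sketched like this (themes and paths are hypothetical):

```text
example.com/                      <- main theme (home page)
├── gardening/                    <- silo landing page, linked from home
│   ├── gardening/raised-beds/    <- sub-pages link back to the landing page
│   └── gardening/composting/        and sideways to each other
└── cooking/                      <- second silo
    ├── cooking/knife-skills/
    └── cooking/meal-prep/
```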
Google Authorship will be a factor in how the future of search shapes up. With each change that comes to the algorithm, the company tries to make it think and behave more like a human would. Authorship relies on the notion that publishers are creating good content that is linked to their profiles. Those connections are then used to deliver potential results to people looking for content that might be relevant.
If you haven’t already configured authorship for your site, you should. It is a very simple process, and it gives publishers all sorts of benefits in search.
Search engines have become pretty savvy at deciphering content on web pages. That doesn’t mean you should make it hard for them to do. Schema markup is a general standard recognized by browsers and search engines. It allows webmasters to tailor how content is displayed in search. Marking up content is also important for mobile applications like maps, where structured data is very important.
You aren’t going to rank any better in search by implementing schema markup, but there are some other benefits to using it. For example, in SERPs, there are areas where additional information is displayed, such as address information, movie listings, reviews and other data that won’t show up if Google doesn’t know how to interpret it. In this regard, you can be showing up in search where before you might not have otherwise.
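For instance, review data can be exposed with schema.org microdata like the following (the product, rating, and author are invented for illustration):

```html
<!-- Microdata telling search engines this block is a review with a rating -->
<div itemscope itemtype="https://schema.org/Review">
  <span itemprop="name">Example Widget Review</span>
  <div itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
    Rated <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
  By <span itemprop="author">Jane Doe</span>
</div>
```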
Longer pieces tend to outperform shorter, superficial articles. Some of the top-ranking websites feature content of 2,000 words or more on many of their pages. Because people view longer articles as more valuable, they are more likely to link to them – there’s a direct correlation between an article’s length and the number of people linking to it.
Does that mean that 2,000 words of nonsensical gibberish will make your web page rank well? Absolutely not. A lot of shallow content with no real substance will not outrank a shorter, well-written counterpart. Content should be concise: avoid fluff words and phrases that add no meaning to a sentence. The overall length should be proportional to the value the article delivers.
Making content findable also means using tactics that operate off of your website’s domain. These are, by nature, more difficult to control or influence. Just as with on-page and on-site components, writing quality content is paramount, but these elements still play a role in search visibility.
Backlinks remain a very important signal of a web page’s popularity. When numerous links point in from high-quality, trustworthy websites, the page being linked to is treated as more trustworthy as well. Building backlinks can be one of the most impactful SEO activities for ranking a site well in search.
Having a presence on some sites is still beneficial. For example, business listings, directories or even press releases are great for more exposure on the web, but as far as links go, they aren’t high impact. Make inbound linking to your website part of your strategy, and focus on obtaining high-quality links. This is key to an inbound marketing strategy. These could come from:
- Networking with other website owners
- Guest blogging
- Creating an embeddable infographic
- Making really useful content that others want to link to
If you are building links manually, keep a steady pattern and avoid drastic changes in your behavior. For example, don’t build one link a week for several months and then suddenly acquire 200 links in a day. A spike like that will draw attention to you; if it does happen organically, so be it, but make sure the activity is legitimately natural.
Social can’t be ignored any longer, and many professionals feel it has a strong influence over organic search – especially on sites run by the search engine companies themselves. Moz has done a number of correlation studies, and (while, as they would say, causation cannot be proven) the results are intriguing, to say the least. Cutts, however, has come right out and said that Google does not use social indicators (namely +1s) in its algorithm.
The general theory is that good content also happens to be shared and liked a lot on social media; it isn’t the social interaction itself that makes it rank well. Authorship, however, is a ranking factor, and it can be confused with a Google Plus signal, since users have to go to that platform to initiate the connection, and a Google Plus profile is what associates content with its publisher.
If you are doing business online, or just have an interest in being found there, you cannot ignore search. It is one of the easiest methods for finding content. Now that search companies have gotten popular and more sophisticated at stopping web spam, users have a wealth of information literally at their fingertips. You have to know how to be findable online to be successful at marketing yourself in search. You can have a beautiful and very functional website, but if no one knows it’s there, it might as well be a billboard standing in a forest.
Shawn Manaher is the founder and CEO of The Content Authority. He’s one part content manager, one part writing ninja organizer, and two parts leader of top content creators. You don’t even want to know what he calls pancakes.