Top 5 SEO myths

With a lot of search engine optimization (SEO) tips out there comes a lot of misinformation.

Even when that information originally comes from the search engines themselves, like Google’s Developer Blog, it gets twisted the more hands it passes through.

Sometimes, people interpret algorithm announcements in their own way, or they attribute third-party metrics to SEO because they got the news from an SEO software company that wants you to keep using its product.

Which, fair. Anytime a well-known marketing strategy exists, myths are going to follow.

1. Myth: Increasing your Domain Authority is good for SEO.

Domain Authority (DA) is often associated with SEO because SEO software company Moz created it to predict how likely a site is to rank well in search, specifically on Google.

However, domain ranking/authority metrics have nothing to do with SEO or with how search engines crawl, index, or rank your site.

John Mueller, Google’s Senior Search Analyst, tweeted:

“Just to be clear, Google doesn’t use Domain Authority at all when it comes to Search crawling, indexing, or ranking.”

In 2019, he tweeted in response to someone, “We don’t use domain authority; that’s a metric from an SEO company.”

If an SEO company had collaborated with a major search engine to help people manipulate their algorithm, that SEO company would be advertising the partnership because it’d drive more business.

Instead, Moz quietly acknowledges that DA isn’t a Google ranking factor. You’ll find that admission tucked away in their learning center:

Domain Authority is not a Google ranking factor and has no effect on the SERPs.

Since the Domain Authority 2.0 update in early 2019, a domain’s DA score has come from a machine learning algorithm that predicts how often Google will show that domain in its search results.

2. Myth: You have to repeat the title and keyword as an H2 header.

H2 headers are commonly mistaken for the only legitimate headers because the title is wrapped in H1. While a page should never have more than one H1, multiple H2 headers are recommended if the content calls for them.

The header hierarchy is six levels deep, from H1 to H6. If you’re struggling to format your posts with H2, H3, etc. properly, notice how Wikipedia uses multiple second-level headers under the single first-level header.

Outlines should not look like this:

    1. Blog Post Title
        1. Reworded Blog Post Title
            1. Sub-level post header

You don’t need to reword your blog titles just to create headers. The headers should come naturally as a result of formatting your blog post so it’s easier to read.
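If you’d like a quick way to sanity-check a post’s heading structure, here’s a minimal sketch in Python using the standard library’s html.parser. The helper name (heading_issues) and the two rules it enforces (one H1, no skipped levels) are my own assumptions about good structure, not an official checklist from Google:

    # Minimal sketch (my own helper, not an official tool): flags more than one H1
    # and skipped heading levels (e.g. an H3 appearing directly under an H1).
    from html.parser import HTMLParser

    class HeadingCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.levels = []  # heading levels in document order, e.g. [1, 2, 3, 2]

        def handle_starttag(self, tag, attrs):
            if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
                self.levels.append(int(tag[1]))

    def heading_issues(html):
        collector = HeadingCollector()
        collector.feed(html)
        issues = []
        if collector.levels.count(1) > 1:
            issues.append("More than one H1 on the page.")
        for prev, curr in zip(collector.levels, collector.levels[1:]):
            if curr > prev + 1:
                issues.append(f"Skipped level: H{prev} followed by H{curr}.")
        return issues

    sample = "<h1>Blog Post Title</h1><h3>Sub-level post header</h3>"
    print(heading_issues(sample))  # flags the H1 -> H3 jump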

While keywords, key phrases, and supporting key terms should appear in your headers, they shouldn’t appear in every single one unless you want your content to sound robotic.

Keyword density is how often your target keywords, key phrases, and related (semantic) search terms appear relative to your total word count. Using your headers to manipulate search engine traffic by keyword stuffing is black hat SEO. It might work temporarily, but not long-term.
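If you like numbers, keyword density is just a ratio. Here’s a rough sketch using my own simplified formula (exact-phrase occurrences divided by total word count, ignoring the semantic variants that real SEO tools also weigh):

    # Rough sketch of keyword density as a simple ratio: words belonging to
    # exact matches of the target phrase, divided by total word count.
    import re

    def keyword_density(text, phrase):
        words = re.findall(r"[\w'-]+", text.lower())
        target = phrase.lower().split()
        if not words or not target:
            return 0.0
        hits = sum(
            words[i:i + len(target)] == target
            for i in range(len(words) - len(target) + 1)
        )
        return hits * len(target) / len(words)

    post = "SEO myths spread fast. Debunking SEO myths takes longer."
    print(f"{keyword_density(post, 'SEO myths'):.1%}")  # roughly 44.4%, clearly stuffed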

Introducing your topic under the title, then restating the title as a header and introducing the topic all over again, counts as “summarizing without adding value”. It also implies you don’t trust your readers to understand what they’re reading without you reiterating it.

3. Myth: Any related external link is fine.

Do you have a set definition for a “reputable” site to link out to?

If you find yourself “doing everything right” but not benefiting from SEO, the reason could be your external linking strategy.

Linking to articles on other sites is fine if you authored them (guest posts, for example). Otherwise, link only to:

  • Sites ending in .gov or .edu
  • Research/Science/Medical journal sites
  • Wikipedia

The exception is when you’re linking to locations, affiliate products, or services. If you find an article that superbly explains what you want your readers to know, then link there even if it’s not one of the above three.

Establishing yourself as a reputable source of information in the eyes of your audience and Google means outlinking carefully.

Directly linking to government, educational, or research sites removes the personal prejudice readers attach to certain media outlets.

Everyone has their own bias about mainstream media, so linking to a Fox or Times article about a study could destroy whatever good opinion a reader had of you.

When you link directly to research instead of to articles that cover it, you position yourself as an expert who

  • understands the research,
  • keeps up with it, and
  • informs your audience when it’s relevant.

Instead of hearsay, your claims are backed up with science.

When you’re stricter about the sites you link out to, you’ll find your site has fewer broken links and higher search rankings.

This is because you’re linking to more static, credible sites that are less likely to change in quality.

4. Myth: The Google Sandbox

The entire concept of the Google Sandbox is built on SEO mythology.

As a result, new websites get deemed “unicorns” for ranking on Google within three months, when really they just published enough content for Google to crawl that matched the search intent they were targeting.

The Google Sandbox concept also deters people from changing their domain name, because they’ve falsely been led to believe that their SEO will be negatively affected.

In reality, Google Search Console has a Change of Address tool for site moves that transfers your old site’s rankings to the new site.

As Google processes the move, Search Console will show several duplicate content/canonical errors.

However, these errors work themselves out around the 30-60 day mark. Most site moves are complete by six months (180 days), with the new site ranking in the old one’s place within 90 days.

While Google does run programs with “sandbox” in the name, it has repeatedly spoken out against the existence of the Google Sandbox.

Rather, new sites need to earn Google’s trust, as does any website wanting to rank on any search engine.

Without quality content, Google has no way of knowing your website isn’t spam just because you say it isn’t. That trust-earning process can feel like a sandbox, but that doesn’t make it one.

5. Myth: Content has to be at least 1,000 words.

Google has no preferred word count. They’ve never had a preference for content length, as long as your content meets the search intent of the people searching for it.

Long-form content is often favored because

  • It lowers bounce rate by keeping people on your site longer.
  • Attracting more search traffic is easier, since keyword density is more manageable across 1500+ words than 100-500 words.
  • Ad networks can display more ads, thereby allowing the content creator to make more money.

While long-form content performs better than short-form content for some search results, this isn’t the case for all of them. Moreover, longer content is not a ranking factor.

Per their Helpful Content Update, Google prefers concise content that meets the needs of the people who searched for it so well that they shouldn’t need to go back and search again.

In other words, if another site manages to meet that need in fewer words, it might rank higher precisely because it’s less verbose.

Anyone claiming otherwise — or giving you several other reasons instead of admitting the truth — is gaslighting you.


Falling for SEO myths isn’t a reflection of you, but a reflection of how much misinformation exists online.

(You might also be disappointed to learn that there is no perfect posting frequency. Beyond training Google how often to crawl your site, it doesn’t matter much, unless you’re only emailing your new blog posts to subscribers.)

These SEO myths DON’T mean that SEO doesn’t matter, because it does — just not in the way you think. When in doubt, check Google (or Bing) out.

Go straight to the source to confirm any SEO news you hear from SEO software companies, because they’re going to word it in ways that help keep their business going. 💁‍♀️
