Common SEO problems

Common SEO problems are often overlooked, causing major crawling, indexing, and ranking trouble for a website. Most of them are trivial and easy to fix once you know to look for them. Let us go through the list of common SEO issues and the solutions to fix them.

Common SEO problems

  1. Missing Title tag
  2. Missing Meta Description
  3. Missing h1 tag
  4. Multiple h1 tags
  5. No keywords in title, meta description, and h1
  6. Terrible page speed
  7. Using CSR apps for landing page
  8. Missing canonical URLs
  9. Missing or Invalid robots.txt
  10. Missing sitemap.xml
  11. Missing Meta Keywords

1. Missing Title tag

The <title> tag is the basic building block of SEO: it tells search engine crawlers/bots what a page (URL) is about. Every page should have a title tag explaining the purpose or gist of the page.

It is ideal to include the keywords you wish to rank for, but do not stuff it with search queries. The recommended length is 50-60 characters.
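For example, a title tag for a hypothetical handmade goods store (the brand and keywords here are made up) that fits within the recommended length:

    <title>Handmade Leather Wallets | Acme Crafts</title>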

2. Missing Meta Description

The meta description tag explains the purpose of the page, product, or service. It gives search engine bots/crawlers more context to index and rank your page/URL.

Again, it is ideal to include the keywords you wish to rank for, but do not stuff it with search queries. The recommended length should not exceed 160 characters.
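For example, a description for the same hypothetical store, comfortably under 160 characters:

    <meta name="description" content="Shop handmade leather wallets crafted in small batches. Free shipping and 30-day returns.">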

3. Missing h1 tag

The <h1> tag in the body of the page helps search engines understand the structure of the page; it is like telling them "here's what my page is about." Unlike the title and meta description, which stay outside the body tag (in the <head>), the h1 sits in the visible content. H1 tags are often styled to stand out, so they give clear context to the page visitor as well.
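Here is a minimal sketch of where each of these tags lives (the content is illustrative):

    <!DOCTYPE html>
    <html>
      <head>
        <title>Handmade Leather Wallets | Acme Crafts</title>
        <meta name="description" content="Shop handmade leather wallets crafted in small batches.">
      </head>
      <body>
        <h1>Handmade Leather Wallets</h1>
        <!-- rest of the page content -->
      </body>
    </html>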

4. Multiple h1 tags

This is a very common mistake committed by rookie web developers: they design a landing page and use an h1 tag for every section on it. Multiple h1 tags compete with each other and dilute the context the page is trying to rank for.

There should be only ONE h1 tag on any page.
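Use h2 (and h3, h4, and so on) for the individual sections instead. For example:

    <h1>Handmade Leather Wallets</h1>  <!-- exactly one h1 per page -->
    <h2>Bestsellers</h2>               <!-- section headings demoted to h2 -->
    <h2>Our Story</h2>
    <h2>Customer Reviews</h2>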

5. No keywords in title, meta description, and h1

Everything you write in the title, meta description, and h1 should be consistent. All three should emphasize the same context so that search engines can easily grasp it and index your page for it. Typically, that context is expressed with the keywords you want to rank for. But be careful not to stuff too many keywords and search queries into these tags.
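For example, in the hypothetical store above, all three tags repeat the same key phrase once, without stuffing:

    <title>Handmade Leather Wallets | Acme Crafts</title>
    <meta name="description" content="Shop handmade leather wallets crafted in small batches.">
    <!-- ... and in the body: -->
    <h1>Handmade Leather Wallets</h1>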

6. Terrible page speed

Page speed is a ranking factor. Yes, speed is a direct ranking factor. Most website owners don't pay attention to how fast their pages load, but Google has repeatedly said that it factors load speed into its ranking algorithm because it directly affects user experience.

Google has also published research showing that more than half of mobile visitors leave a page if it takes longer than 3 seconds to load. Make sure your website loads as fast as possible. You can measure your site's performance, SEO, and more with Google's official tool here: https://web.dev/measure/
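If you prefer the command line, the same kind of check can be run locally with the Lighthouse CLI (a sketch, assuming Node.js is installed; replace the URL with your own):

    npx lighthouse https://example.com --only-categories=performance,seo --view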

7. Using CSR apps for landing page

With the rise of frameworks like React, Vue, and Angular, many developers tend to use them for landing pages. The problem is that such pages ship as JavaScript apps with almost no HTML: the content you "see" is generated only after the page loads. This paradigm is called client-side rendering (CSR). Landing pages and marketing pages should instead be built with server-side rendering (SSR) or static site generation (SSG). You can read more about them here: CSR vs SSR vs SSG
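To illustrate the problem, this is roughly the initial HTML a crawler receives from a purely client-side rendered app (the file names are made up); there is nothing to index until the JavaScript runs:

    <html>
      <head>
        <title>Acme Crafts</title>
      </head>
      <body>
        <div id="root"></div>  <!-- stays empty until bundle.js executes -->
        <script src="/bundle.js"></script>
      </body>
    </html>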

8. Missing canonical URLs

Every page should declare a canonical URL in its HTML, which is simply the original URL of the content displayed on the page. It tells search engines which URL to index when the same content is reachable at several addresses. You can read more about canonicalization on Moz's website.
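The tag goes in the <head> of the page; for example (the URL here is a placeholder):

    <link rel="canonical" href="https://example.com/blog/common-seo-problems/" />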

9. Missing or Invalid robots.txt

The robots.txt file is the first place search engines look to learn how to crawl your website. It specifies which pages crawlers may visit, which to exclude, and where your sitemaps are located.

So, without a robots.txt, or even worse, with a misconfigured one, a website's crawling and indexing get messed up. Meaning: potential customers would not find your website when searching on Google.
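A minimal valid robots.txt, served at the root of your domain, looks like this (the disallowed path and domain are placeholders):

    User-agent: *
    Disallow: /admin/

    Sitemap: https://example.com/sitemap.xml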

10. Missing sitemap.xml

sitemap.xml contains a structured list of all the pages/content on your website. A search engine crawler will fetch this file and follow the URLs mentioned in it to crawl your site quickly.

If no sitemap.xml is found on your site, then crawlers have NO idea how many pages or what content your website has. They will simply crawl through your pages hunting for internal links in a brute-force way, and quite possibly your internal linking structure is not thorough enough for them to find everything. Always have a clear sitemap.xml for your website, blog, and resources.
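A minimal sitemap.xml looks like this (the URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
      </url>
      <url>
        <loc>https://example.com/blog/</loc>
      </url>
    </urlset>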

11. Missing Meta Keywords

The meta keywords tag used to be a ranking factor in Google around a decade ago. It is NO longer a ranking factor, which is why some developers tend to ignore or forget it. But this tag can still provide context and relevance information about a page to search engines. So always add the meta keywords tag.
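The tag goes in the <head>, alongside the meta description; for example (the keyword list is illustrative):

    <meta name="keywords" content="common seo problems, seo issues, technical seo" />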

How to identify common SEO problems?

Identifying common SEO problems by hand is a tedious, manual process. Instead, you can use one of the many SEO audit/analyzer tools. One such tool is provided for free by Superblog here: SEO Analyzer tool.

If you have a blog, taking care of all of the above consumes a lot of time. Check out Superblog, a blogging platform that takes care of all such factors automatically. You can focus on writing instead of spending time and money fixing common SEO problems and technical SEO issues. Superblog also makes sure that your blog scores high in Google Lighthouse, Core Web Vitals, and common SEO audits, automatically.