Google stopped indexing pages

I have around 6,000 pages that should be indexed according to my sitemap and robots.txt, but only about 1,340 are currently indexed and the rest never seem to get indexed. That number has dropped from 2,000–3,000 indexed pages. I have tried publishing more content and waiting several weeks, but it doesn't help. All of the pages have been crawled by Google but not indexed. I have already re-submitted the sitemap and confirmed that the robots.txt is correct.

Link to image of Google index page: here

Sitemap link: here

Does anyone have any suggestions or insights on this issue?

The issue may be related to the quality of your content and your website’s overall authority. Google prioritizes indexing high-quality, engaging content that provides value to users, and it will deliberately leave low-value pages out of the index even after crawling them.

Here are some potential factors and solutions based on common experiences:

  • Content Quality:

    • Thin Content: Check for pages with little unique text or near-duplicate content. Google often crawls such pages but declines to index them because they add little value.
    • Low-Quality Content: Ensure that your content is well-written, informative, and relevant to your target audience.
    • Lack of Internal Linking: Building strong internal linking structures can help Google understand the connections between your pages and improve crawling.
  • Website Authority:

    • Backlinks: A lack of high-quality backlinks from authoritative websites can signal to Google that your website is not trustworthy. Focus on earning natural, relevant backlinks.
    • Domain Age: Newer websites often have lower authority and may take time to establish themselves with Google.
  • Technical Issues:

    • Crawl Errors: Review the Page indexing report in Google Search Console. The exclusion reasons listed there (e.g. “Crawled – currently not indexed” or “Duplicate without user-selected canonical”) often explain exactly why pages are being left out.
    • Sitemap Issues: While you’ve already re-submitted the sitemap, double-check that it lists only canonical, indexable URLs — no redirects, 404s, or pages carrying a noindex tag.
    • Robots.txt Errors: Confirm that your robots.txt file is correctly set up to allow Google to crawl your sitemap and the pages you want to be indexed.
  • Other Factors:

    • Server Issues: Check that pages return HTTP 200 quickly and reliably; slow responses or 5xx errors reduce Google’s crawl rate and can delay indexing.
    • Mobile-Friendliness: Google prioritizes mobile-friendly websites. Make sure your website is optimized for mobile devices.
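As a quick sanity check on the robots.txt point above, Python’s standard library can parse the file and report whether a given URL is fetchable by a given crawler. This is a minimal sketch — the domain and rules below are placeholders; paste in your own robots.txt body and real URLs.

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules -- replace with the actual body of your robots.txt.
robots_body = """\
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_body.splitlines())

# Googlebot falls under the "*" group here, so ordinary pages
# should be fetchable and /admin/ should be blocked.
print(rp.can_fetch("Googlebot", "https://example.com/some-page"))    # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/panel"))  # False
```

Running this across a sample of URLs from your sitemap quickly confirms whether robots.txt is actually blocking anything you expect to be indexed.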

Actionable Steps:

  1. Analyze your content: Conduct a thorough review of your content, focusing on its quality, relevance, and originality.
  2. Improve internal linking: Make sure every page is reachable through links from other pages on your site; orphan pages that appear only in the sitemap are frequently crawled but not indexed.
  3. Review Google Search Console: Look for crawl errors, indexed pages, and any other relevant data that might shed light on the issue.
  4. Check backlinks: Use tools like Ahrefs or Moz to assess the quality of your backlinks and identify potential areas for improvement.
  5. Monitor your website performance: Keep an eye on your website’s overall health and ensure it meets Google’s technical guidelines.
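Part of step 1 can be automated. The sketch below, using only Python’s standard library, scans a page’s HTML for two signals that commonly leave pages crawled but unindexed: a robots `noindex` meta tag, and a canonical tag pointing at a different URL. The URLs and markup here are hypothetical; in practice you would fetch each URL from your sitemap and feed its HTML through this check.

```python
from html.parser import HTMLParser

class IndexabilityChecker(HTMLParser):
    """Collect indexing-blocking signals from a page's HTML:
    a robots meta noindex tag, or a canonical pointing elsewhere."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

def check(page_url, html):
    """Return a list of human-readable issues found in the HTML."""
    checker = IndexabilityChecker(page_url)
    checker.feed(html)
    issues = []
    if checker.noindex:
        issues.append("meta robots noindex")
    if checker.canonical and checker.canonical != page_url:
        issues.append(f"canonical points to {checker.canonical}")
    return issues

# Hypothetical page that declares noindex and canonicalizes elsewhere.
html = ('<head><meta name="robots" content="noindex,follow">'
        '<link rel="canonical" href="https://example.com/other"></head>')
print(check("https://example.com/page", html))
# -> ['meta robots noindex', 'canonical points to https://example.com/other']
```

Either signal tells Google not to index the URL in your sitemap, so surfacing them across all 6,000 pages is a fast way to separate content-quality problems from outright technical blocks.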

Remember, indexing can be a slow process. Give Google time to crawl and index your new or updated content. Consistent effort and attention to your website’s content and technical factors will help improve your chances of getting your pages indexed.