We have a Nuxt.js project which is an ecommerce site. We load the products page with the first 100 products via SSR. After that, we load further products as needed from the client side using an API call. The problem is that when search engine crawlers visit our website, they can only see the server-rendered 100 products. They have no idea about the other products, and this will hurt our SEO.

How can we pre-render all the products (more than 1000) on the server side for crawlers to find? Please note that we can't remove our client API call due to performance issues, and we don't have pagination either.

How do we overcome this challenge with Nuxt.js?

3 Replies

Do you have individual product pages for each of those products, and is your main goal to get those indexed? If so, put the URLs of those individual product pages into a sitemap for crawlers to find, and serve each of these pages pre-rendered when it is requested directly.
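The sitemap route above can be automated with the @nuxtjs/sitemap community module, which accepts an async `routes` function. A sketch of the config, where the API URL and the `slug` field are assumptions about your data:

```javascript
// nuxt.config.js (sketch, assuming the @nuxtjs/sitemap module is installed
// and that each product exposes a `slug` used in its page URL)
export default {
  modules: ['@nuxtjs/sitemap'],
  sitemap: {
    routes: async () => {
      // Build one sitemap entry per product from the same API the site uses
      const products = await fetch('https://api.example.com/products').then(r => r.json())
      return products.map(p => `/products/${p.slug}`)
    },
  },
}
```

With this in place, crawlers discover every product page from /sitemap.xml even though the listing page only server-renders the first 100.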

Or are you talking about getting your "main" products page indexed? IMHO that doesn't make much sense in the scenario you described. Even if crawlers could find and index your product at "position" 999, a SERP leading me to that page would do me no good as a user: since you only show the first X products when the page loads, the product I am interested in wouldn't even be among the ones I see.

You cannot rely on client-side API calls for SEO, because crawlers generally do not wait for asynchronous data to finish loading. To make all products crawlable, you must move the data fetching to the server.

The correct solution is to fetch all products during SSR or pre-render them at build time:

1. SSR with asyncData - fetch all products on the server:

export default {
  // Nuxt 2: asyncData runs on the server for the first request,
  // so the rendered HTML already contains every product
  async asyncData({ $axios }) {
    const products = await $axios.$get('/products?limit=1000')
    return { products }
  }
}

Search engines will receive full HTML with all products.

2. Pre-render (static generation) - in nuxt.config.js, generate all product pages:

generate: {
  // Runs at build time; returns one route per product to pre-render
  routes: async () => {
    const products = await fetch('https://api.example.com/products').then(r => r.json())
    return products.map(p => `/products/${p.slug}`)
  }
}

This produces fully static, SEO-friendly pages.

Content that is only fetched client-side is unlikely to be indexed reliably, so the way to solve the problem is to load all product data during SSR or generate static product pages at build time.

You can mix your infinite scroll with real pagination.

Approach 1: Include both infinite scroll and pagination

Include both infinite scroll and pagination on your page. The user can choose what they want, and Google can index page 2, page 3, and so on to discover all your products.
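Approach 1 can be sketched with a small helper (all names here are illustrative, not part of your codebase) that turns a `?page=` query value into the product slice to server-render plus the plain pagination hrefs crawlers can follow:

```javascript
// Hypothetical helper: given the catalogue size, page size, and the
// requested ?page= value, compute what to SSR and which links to emit.
function paginate(totalProducts, perPage, page) {
  const totalPages = Math.max(1, Math.ceil(totalProducts / perPage))
  // Clamp out-of-range or bogus page numbers to a valid page
  const current = Math.min(Math.max(1, page || 1), totalPages)
  return {
    offset: (current - 1) * perPage, // where this page's slice starts
    limit: perPage,
    // Plain <a href> targets; crawlers follow these without any JS
    links: Array.from({ length: totalPages }, (_, i) => `/products?page=${i + 1}`),
  }
}
```

For example, with 1000 products and 100 per page, page 2 server-renders products 100-199 while the user can still infinite-scroll past them.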

Approach 2: Hidden pagination

Use an <a> tag for your Load More button with its href pointing to the next page, but prevent the default navigation with JS. When the user clicks Load More, fetch more products with JS; when Google crawls your page, it still sees the link and can discover your paginated pages. Something like this:

  <a
    :href="`/products?page=${nextPage}`"
    @click.prevent="loadMore"
    class="load-more"
  >
    Load more
  </a>
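For the link above to pay off, /products?page=2 must itself be server-rendered. A small shared helper (a sketch; the function name, perPage value, and API shape are assumptions) that both asyncData and the loadMore click handler could use to map a page number onto the API query:

```javascript
// Hypothetical shared helper: map a ?page= value to the API query used
// during SSR (asyncData) and to the href the Load More anchor points at.
function productQuery(page, perPage = 100) {
  // Guard against missing or malformed query strings
  const p = Number.isInteger(page) && page > 0 ? page : 1
  return {
    url: `/products?limit=${perPage}&offset=${(p - 1) * perPage}`,
    nextHref: `/products?page=${p + 1}`, // what the <a> should point at
  }
}
```

In the page component, asyncData would call `productQuery(parseInt(query.page, 10))` so a crawler requesting page 3 gets that slice in the HTML, while `loadMore` reuses the same URL builder on the client.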
