We're working on a new e-commerce store (on a subdomain) and have noticed that Google has already crawled many of the store subdomain's pages.
The issue is that the indexed results include a ton of URL query-string parameters, like this:
https://store.oursite.com/en-us/widget/accessories?sort=Price
I've gone into Google Webmaster Tools and added the following parameters for Google not to index:
- p
- pageSize
- sort
- sortorder
- categorybase
- redirecturl
Would that be enough to block something like that example URL above? In other words, I don't need to add sort=Price as a parameter too, do I?
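For what it's worth, my understanding is that the parameter *name* is what matters, not the value. Parsing the example URL (a quick sketch in Python, just to illustrate) shows the query string contains only the key `sort`, with `Price` as its value:

```python
from urllib.parse import urlparse, parse_qs

url = "https://store.oursite.com/en-us/widget/accessories?sort=Price"

# parse_qs splits the query string into {parameter_name: [values]}
params = parse_qs(urlparse(url).query)
print(params)  # {'sort': ['Price']}
```

So `sort=Price` isn't a separate parameter; it's just the `sort` parameter with a particular value.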
Also, after doing this, will Google eventually remove those URLs from the SERPs?