Preventing crawlers (and people) from accessing content in different languages - OpenCart

Question

I've been working on an e-commerce site based on the OpenCart CMS. The site is basically done, but there are still some issues to solve. One of them is SEO related. In this post I will focus on the robots.txt file and related matters.

1.) Is it safe to disallow Google's crawlers from crawling the non-SEO-friendly URLs (for example www.example.com/?route=something, www.example.com/?limit=50, or www.example.com/?product_id=1)? I wouldn't want the crawler to start ignoring my store, but the site only exposes SEO-friendly URLs (via .htaccess redirects). Can disallowing the alternate non-SEO-friendly paths (which point to the SEO-friendly ones) harm my website?
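For illustration, a minimal robots.txt sketch that blocks the query-string variants mentioned above might look like the following (the parameter names are just the examples from this question, and the `*` wildcard in paths is supported by Google and Bing but not guaranteed for every crawler):

```text
User-agent: *
# Block the dynamic, non-SEO-friendly URL variants
Disallow: /*?route=
Disallow: /*?limit=
Disallow: /*?product_id=
```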

2.) The store is multilingual and multi-store. I have 4 different stores sharing the same folder and database. The stores are in my native language, and one of them uses a different (non-English) language. There is a possibility that all of them will use all the languages in the future, but for now I want each of them to focus on a local audience. I've added

Disallow: /en-gb
Disallow: /hr-hr

to my robots.txt file, but people are still visiting the site in English and Croatian. The content there is incomplete (and the user experience there is very poor). Keep in mind that one page is completely in Croatian. I've deleted all the hreflang metadata and added some lines to robots.txt. What else can I do? What are good practices?
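For context, the Disallow rules above only ask crawlers not to fetch those paths; they do not stop human visitors, and pages that were already indexed can still appear in search results. A sketch of a per-language noindex tag, assuming the store's header template can be edited to emit it only on the incomplete /en-gb and /hr-hr versions, would be:

```html
<!-- Emit only on the incomplete language versions, e.g. /en-gb and /hr-hr -->
<meta name="robots" content="noindex, follow">
```

Note that for noindex to be seen, the page must remain crawlable, so this approach would conflict with keeping those paths in robots.txt.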

Thank you all for your answers.

PS: Sorry for the bad title, but I ran out of ideas for it.


Tags: seo, .htaccess, google-search, robots.txt, google-crawlers (posted 2016-09-14 16:09, 0 answers)
