Disallow Sub-domain in robots.txt
I am using shared hosting and I have 2 domains.
My primary domain is travelersmeeting.com.
What I notice is that Google indexes pages from my sub-domain thenewagetraveler.com under my site travelersmeeting.com as well, so it looks like I have duplicated content.
Example:
normal page: thenewagetraveler.com/Thailand-Chiang-Mai..html
duplicated page:
travelersmeeting.com/thenewagetraveler/Thailand-Chiang-Mai..html
I tried to stop this by adding these lines to the robots.txt file located under the public_html folder of my primary domain:
Disallow: /thenewagetraveler/
User-agent: Googlebot
Noindex: /thenewagetraveler/
Is that correct?
I just want Google to index my sub-domain thenewagetraveler.com, but not crawl and index the same URLs within travelersmeeting.com.
Will that work, or do I have to do something more?
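
From what I have read, a Disallow line only takes effect when it is grouped under a User-agent line, and Google has said it does not support a Noindex directive in robots.txt, so I am guessing the file should look more like this (assuming the addon folder really is public_html/thenewagetraveler):

User-agent: Googlebot
Disallow: /thenewagetraveler/

As far as I understand, robots.txt works per-host, so this file in travelersmeeting.com's public_html folder would only block the travelersmeeting.com/thenewagetraveler/... URLs; thenewagetraveler.com serves its own robots.txt from its own folder and should stay crawlable.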
Thanks,
mlao