
Robots.txt issue

Lafaso870

New Member
My host has a main domain pointing at the root and several addon domains assigned to inner folders. I want to disallow everything for the root domain. If I set the file in the root to 'Disallow: /', will that affect the inner domain folders? Would I then need to set the files in each inner domain folder to 'Allow: /' first and then disallow anything else? Or would they be unaffected by the file in the root?
 

LSComputers

Well-Known Member
NLC
Hello Lafaso870,

Hopefully I can shed some light on your question. Please see below, and check out my website if you're looking for further marketing support:

A robots.txt record has two parts. The first tells each search engine whether the record applies to it, via a "User-agent:" line: use "*" to match every crawler, or name a specific crawler to target only that one. The second part is the rules themselves: an "Allow:" or "Disallow:" directive followed by a URL path such as "/" or "/images/1".

Examples
The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/", or the URL "/foo.html":
# robots.txt for http://www.example.com/
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
Disallow: /foo.html
This example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/", except the robot called "cybermapper":
# robots.txt for http://www.example.com/

User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space

# Cybermapper knows where to go.
User-agent: cybermapper
Disallow:
This example indicates that no robots should visit this site further:
# go away
User-agent: *
Disallow: /
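If you want to sanity-check how those rules evaluate, Python's standard library ships a robots.txt parser. A quick sketch, feeding the first example above into it (www.example.com is just the spec's placeholder host):

```python
from urllib import robotparser

# The first example record from above, as a list of lines.
rules = [
    "User-agent: *",
    "Disallow: /cyberworld/map/",
    "Disallow: /tmp/",
    "Disallow: /foo.html",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Ask whether a generic crawler ("*") may fetch each URL.
print(rp.can_fetch("*", "http://www.example.com/index.html"))        # True
print(rp.can_fetch("*", "http://www.example.com/cyberworld/map/x"))  # False
print(rp.can_fetch("*", "http://www.example.com/foo.html"))          # False
```

Handy for testing a file before you upload it.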
 

WebbyMouse

New Member
Lafaso, I think the way you phrased your question is a bit confusing. If I understand you correctly, you want the addon domains (the ones mapped to your inner folders, e.g. www.domain.com/folder1) indexed, but not the main domain (www.domain.com). Correct? Two things to keep in mind. First, "Disallow: /" in the root robots.txt blocks everything reached through the main domain's hostname, including www.domain.com/folder1. But robots.txt is fetched per host: when a crawler visits an addon domain, it requests that domain's own /robots.txt, which your host serves from the inner folder, so the root file doesn't apply there. Second, you can't put a domain name in a rule; Disallow and Allow only take URL paths relative to the host, so something like "disallow:domain.com" won't work. Put "Disallow: /" in the root file, and in each inner folder either leave robots.txt out entirely or use a record with an empty "Disallow:" (which allows everything).
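To illustrate the per-host point, here is a sketch with hypothetical hostnames: www.domain.com is the main domain whose root robots.txt blocks everything, and www.addon.com is an addon domain mapped to /folder1, whose own robots.txt (served from that folder) allows everything:

```python
from urllib import robotparser

# public_html/robots.txt on the main domain: block everything.
root_rules = ["User-agent: *", "Disallow: /"]

# public_html/folder1/robots.txt, served as www.addon.com/robots.txt:
# an empty Disallow allows everything.
addon_rules = ["User-agent: *", "Disallow:"]

root = robotparser.RobotFileParser()
root.parse(root_rules)

addon = robotparser.RobotFileParser()
addon.parse(addon_rules)

print(root.can_fetch("*", "http://www.domain.com/page.html"))   # False (blocked)
print(addon.can_fetch("*", "http://www.addon.com/page.html"))   # True (allowed)
```

Each hostname is judged only by the file it serves itself.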
 