Applicable Packages
Lite, Plus, Max, Shop


The robots.txt file is part of the Robots exclusion standard and lets you instruct search engines not to index certain pages on your site. This can be useful if you have pages consisting largely of content duplicated from elsewhere (such as a product specification from a manufacturer site), which may otherwise cause your site to be penalised by search engines. Note that robots.txt is advisory only, and does not prevent automated software from accessing disallowed pages. In addition, the presence of a page in your robots.txt file may draw attention to its existence.
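

For reference, a robots.txt file uses a simple plain-text format: a User-agent line names which crawlers a group of rules applies to, and each Disallow line below it lists a path that should not be indexed. A minimal example (the paths shown here are hypothetical) looks like this:

    User-agent: *
    Disallow: /manufacturer-spec-sheet
    Disallow: /duplicate-page

The asterisk applies the rules to all crawlers, and each Disallow line excludes one page.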


To edit the robots.txt file, open the settings panel and click Robots.txt in the menu on the left to display the disallowed pages list, as shown in Figure 1.


Figure 1. The disallowed pages list


The drop-down field at the top of the disallowed pages list lets you disallow a page: select the page and click Disallow. Below this is a list of the pages that are currently disallowed; to remove a page from the list, click Remove.
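

Behind the scenes, each entry in the disallowed pages list corresponds to a Disallow rule in your site's robots.txt file. The exact paths depend on your site's URLs, but disallowing a page at, say, /old-catalogue (a hypothetical address) would typically add an entry such as:

    Disallow: /old-catalogue

Removing the page from the list removes the corresponding rule.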