I went to my provider's local site and typed the following into the address bar:

http://luga.lan/robots.txt 

In response I received:

User-agent: Yandex
Disallow: /engine
Host: luga.net.ua

  • What does robots.txt have to do with the parameters of an HTTP request? - Alex Kapustin
  • Yeah, robots.txt is for search engines; it has nothing to do with HTTP requests! - metazet
  • What optional parameters? The ones that go into a GET request after the ? sign, and into the body of a POST request? Or do you mean the request/response header fields? If the latter, they are described in the protocol standard RFC 2616 (which supersedes the outdated RFC 2068). A Russian translation of RFC 2068 is easy to find on the Internet. - alexlz
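To illustrate the last comment, here is a small sketch of where GET and POST parameters live, using only Python's standard urllib.parse; the URL and its parameters are made up for the example:

```python
from urllib.parse import urlsplit, parse_qs, urlencode

# GET parameters sit in the URL after the "?" sign.
# example.com and the q/page parameters are placeholders.
url = "http://example.com/search?q=robots&page=2"
params = parse_qs(urlsplit(url).query)
print(params)  # {'q': ['robots'], 'page': ['2']}

# In a POST request the same key=value pairs would instead be
# encoded into the request body, leaving the URL without a query.
body = urlencode({"q": "robots", "page": "2"})
print(body)  # q=robots&page=2
```

Header fields, by contrast, travel neither in the query string nor in the body, but in their own section of the request and response, as described in RFC 2616.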

1 answer

The question has been renamed rather strangely. This is a robots.txt file: it lies in the root of the site and states what may be indexed and what may not, plus separate rules.

In this case:

  • An instruction for the Yandex robot.
  • A rule forbidding the indexing of documents (pages, files) in the /engine folder and in all folders inside it (a trailing *, meaning any sequence of characters, is substituted automatically).
  • A directive to display the site in search results as luga.net.ua.
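These rules can also be checked programmatically. A minimal sketch using Python's standard urllib.robotparser; the rules from the question are fed in directly rather than fetched from luga.lan, so the example is self-contained (note that robotparser does not understand the Yandex-specific Host directive, so it is omitted):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Feed in the two portable lines from the site's robots.txt.
rp.parse([
    "User-agent: Yandex",
    "Disallow: /engine",
])

# "Disallow: /engine" acts as a prefix rule: any path starting
# with /engine is closed to the Yandex robot, everything else
# remains open.
print(rp.can_fetch("Yandex", "http://luga.lan/engine/index.php"))  # False
print(rp.can_fetch("Yandex", "http://luga.lan/index.html"))        # True
```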

Using robots.txt on Yandex.Webmaster