I am checking robots.txt with the Yandex validator and it reports a bunch of errors like this: "Правило начинается не с символа '/' и не с символа '*'" ("The rule does not start with a '/' or '*' character"). Here is the file itself:

    User-agent: Yandex
    Crawl-delay: 2
    Disallow: loca.kg/user/login
    Disallow: http://loca.kg/result
    Disallow: http://loca.kg/cabinet
    Disallow: http://loca.kg/user/login
    Disallow: http://loca.kg/user/userlogin
    Disallow: http://loca.kg/user/usercabinet
    Disallow: http://loca.kg/user/register
    Disallow: http://loca.kg/center
    Disallow: http://loca.kg/centerlist
    Sitemap: http://loca.kg

    1 answer

    In Disallow rules you do not need to specify the full URL with the protocol and host, only the path relative to the site root:

     User-agent: Yandex
     Crawl-delay: 2
     Disallow: /user/login
     Disallow: /result
     Disallow: /cabinet
     Disallow: /user/login
     Disallow: /user/userlogin
     Disallow: /user/usercabinet
     Disallow: /user/register
     Disallow: /center
     Disallow: /centerlist

    For the Sitemap directive, however, you do need to specify the full URL to the sitemap file, for example:

     Sitemap: http://loca.kg/sitemap.xml
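
    If you want to sanity-check the rewritten rules locally before re-running the Yandex validator, a minimal sketch with Python's standard urllib.robotparser could look like this (the loca.kg URLs come from the question, and http://loca.kg/sitemap.xml is just the example filename from above, not a confirmed path on the site):

     from urllib.robotparser import RobotFileParser

     # The corrected robots.txt from the answer above, one directive per line.
     lines = [
         "User-agent: Yandex",
         "Crawl-delay: 2",
         "Disallow: /user/login",
         "Disallow: /result",
         "Disallow: /cabinet",
         "Disallow: /user/userlogin",
         "Disallow: /user/usercabinet",
         "Disallow: /user/register",
         "Disallow: /center",
         "Disallow: /centerlist",
         "Sitemap: http://loca.kg/sitemap.xml",
     ]

     parser = RobotFileParser()
     parser.parse(lines)

     # With relative paths, the Disallow rules now match URLs on the host as intended.
     print(parser.can_fetch("Yandex", "http://loca.kg/user/login"))  # False: blocked
     print(parser.can_fetch("Yandex", "http://loca.kg/"))            # True: still crawlable
     print(parser.crawl_delay("Yandex"))                             # 2

    This only checks that the rules parse and match the way you expect; the check in Yandex.Webmaster remains the authoritative one.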