In the .htaccess file at the site root:

 RewriteEngine On
 RewriteCond %{REQUEST_URI} ^.*.txt$ [NC]
 RewriteRule .* - [L,R=404]

Here a 404 error is returned for any request for a .txt file. The regular expression needs to be changed:

 ^.*.txt$ 

So that it matches all .txt files except robots.txt.

  • As I understand it, the second dot should match a literal dot, not any character, so it should be escaped. - cpp questions
  • I tried it like this: ^(?!robots).txt$ and like this: ^(?!robots).*.txt$ - neither one works (a corrected variant is sketched after these comments). - Ruport
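The lookahead attempts above fail because %{REQUEST_URI} always begins with a slash, so ^(?!robots) never actually tests the file name. A minimal sketch of a lookahead variant that should work, assuming only the site-root robots.txt has to stay reachable (the dot is escaped so it matches a literal dot only):

 # Block /anything.txt except exactly /robots.txt
 RewriteCond %{REQUEST_URI} ^/(?!robots\.txt$).+\.txt$ [NC]
 RewriteRule .* - [L,R=404]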

2 answers

Your mistake is that the expression is put together incorrectly. Try this:

 ^.*(?<!robots).txt$ 

I do not think it is necessary to explain simple details such as ^ (beginning of the string) and .* (captures zero or more characters); I will only cover (?<!robots).txt: this negative lookbehind matches .txt only at a position that is not immediately preceded by robots, followed by the end of the string ($).
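Plugged back into the original rule, a sketch of how this might look (one caveat: a name that merely ends in robots, such as myrobots.txt, would also slip past the lookbehind; escaping the dot as \.txt is optional but stricter):

 RewriteEngine On
 # 404 for every .txt whose base name does not end in "robots"
 RewriteCond %{REQUEST_URI} ^.*(?<!robots)\.txt$ [NC]
 RewriteRule .* - [L,R=404]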

    As a temporary solution, something like this:

     RewriteCond %{REQUEST_URI} ^.*.txt$ [NC]
     RewriteCond %{REQUEST_FILENAME} -f
     RewriteRule .* - [L,R=404]

    Since my robots.txt is generated dynamically (in PHP), this approach works: all files with the .txt extension are inaccessible from the browser, while the dynamically generated robots.txt remains available.
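    For completeness, a sketch of how the dynamic robots.txt might be wired up together with the rule above; the script name robots.php is an assumption, not something taken from the answer:

     # Hypothetical: serve /robots.txt from a PHP script (robots.php is an assumed name)
     RewriteRule ^robots\.txt$ robots.php [L]

     # From the rule above: 404 only when the requested .txt exists as a real file on disk
     RewriteCond %{REQUEST_URI} ^.*.txt$ [NC]
     RewriteCond %{REQUEST_FILENAME} -f
     RewriteRule .* - [L,R=404]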