I am writing a small CMS; its settings and some data are stored in XML files. I would like to hide these files from unauthorized visitors (authorized scripts will still be able to read them).

So far my solution is this: I created the directory site.ru/admin/storage with `Deny from all` in its .htaccess. Is this enough to protect the data from common attacks?

Storing the data above the document root does not work for me, unfortunately.
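To illustrate the point about authorized scripts: a minimal sketch (Python purely for illustration; the actual CMS may be written in another language, and the file name `settings.xml` is a hypothetical example) showing that server-side code reads the XML directly from the filesystem, so an HTTP-level `Deny from all` does not affect it.

```python
# Sketch: server-side code reads protected XML straight from disk;
# the web server's "Deny from all" only blocks HTTP requests, not
# local filesystem access. Paths and file names here are hypothetical.
import xml.etree.ElementTree as ET
from pathlib import Path

storage = Path("admin/storage")
storage.mkdir(parents=True, exist_ok=True)

# Write a sample settings file (stands in for the CMS data).
(storage / "settings.xml").write_text(
    "<settings><title>My CMS</title></settings>", encoding="utf-8"
)

# An "authorized script" simply opens and parses the file.
tree = ET.parse(storage / "settings.xml")
print(tree.getroot().find("title").text)  # → My CMS
```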

  • This is enough as long as the engine has no other vulnerabilities that allow executing arbitrary code on the server or performing path injection. – naym
  • This will only help if the web server is Apache. – Ipatiev

1 answer

If the server is Apache, you can do this in .htaccess:

 <Files ~ "\.xml$">
     <IfModule !mod_authz_core.c>
         Deny from all
     </IfModule>
     <IfModule mod_access_compat.c>
         Deny from all
     </IfModule>
     <IfModule mod_authz_core.c>
         <IfModule !mod_access_compat.c>
             Require all denied
         </IfModule>
     </IfModule>
 </Files>

 <Files sitemap.xml>
     <IfModule !mod_authz_core.c>
         Allow from all
     </IfModule>
     <IfModule mod_access_compat.c>
         Allow from all
     </IfModule>
     <IfModule mod_authz_core.c>
         <IfModule !mod_access_compat.c>
             Require all granted
         </IfModule>
     </IfModule>
 </Files>

If it is Nginx, you can do this in nginx.conf (the exact-match `location =` takes priority over the regex location, so sitemap.xml stays accessible):

 location ~* \.xml$ {
     deny all;
 }

 location = /sitemap.xml {
     allow all;
 }
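As a side note not from the original answer: if you would rather not reveal that the protected files exist at all, a common variation for Nginx is to return 404 instead of the 403 that `deny all` produces. A sketch of that variation:

 location ~* \.xml$ {
     return 404;
 }

 location = /sitemap.xml {
     allow all;
 }

Either way, only HTTP access is blocked; server-side scripts still read the files from disk.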