Suppose we have a website where users can submit news posts.

Two scenarios concern me:

1. What if a user attaches images (uploads them to the server via AJAX) and then closes the form?

  • The images end up orphaned, i.e. not attached to any post.

2. How do we avoid overload? A user can open the form, upload photos, abandon it, and repeat the whole thing, i.e. upload images to the server endlessly.

  • What should be done in this case?

I think you understand the situation. I just want to understand the logic of how this should work and how to organize it correctly. What methods of protection and filtering are there? The only thing that came to mind was creating a buffer zone with a limit on the number of files in it... but I don't think that is the best option, since someone could fill that zone to its limit.

I would like advice on how to organize this properly and, most importantly, simply.

  • A question about submitting the form itself: say we uploaded 3 files to the server and put their names into hidden fields in the form. When the form is submitted, PHP reads the names from the hidden fields and starts working with the files. Is it safe to expose file names in hidden fields when submitting a form? - user199432

1 answer

I haven't heard of any clever solutions to this problem, nor encountered one in practice. All the options are fairly obvious:

  • Limiting the total amount of data transferred per user over a period of time. It can be quite generous; 1 GB per hour, for example, is more than enough in your case.
  • Limiting the number of files transferred per user per unit of time.
  • Setting a TTL (lifetime) for "free" (unattached) files and deleting them once it expires. Base the TTL on the likely time needed to fill out your form, with all possible delays; that said, I would set it to at least a day anyway.
  • The last option is very custom and, in my opinion, only needed if the problem actually materializes. Based on your application's logic, write code that tracks the number and rate of appearance of "free" files per user (taking into account whether they complete the form or not) and, once a certain threshold is reached, start returning an upload error and block the user for a while. Again, the block duration depends on how low the limit is: if users may upload 5 files per 10 minutes, up to 100 MB, block for 10 minutes; if 20 files per hour, up to 200 MB, block for an hour.
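The first two limits above can be sketched as a rolling-window quota check. This is a minimal in-memory sketch; all names and limit values are illustrative, not from the original answer:

```python
import time
from collections import defaultdict, deque

# Illustrative limits: at most MAX_FILES uploads and MAX_BYTES total
# per user within a rolling WINDOW seconds.
MAX_FILES = 5
MAX_BYTES = 100 * 1024 * 1024   # 100 MB
WINDOW = 10 * 60                # 10 minutes

# user_id -> deque of (timestamp, size_in_bytes) for recent uploads
_recent = defaultdict(deque)

def allow_upload(user_id, size, now=None):
    """Return True if this upload fits the user's quota, else False."""
    now = time.time() if now is None else now
    events = _recent[user_id]
    # Drop events that have fallen out of the rolling window.
    while events and events[0][0] <= now - WINDOW:
        events.popleft()
    total = sum(s for _, s in events)
    if len(events) >= MAX_FILES or total + size > MAX_BYTES:
        return False    # caller should respond with an upload error
    events.append((now, size))
    return True
```

On shared hosting, where no long-lived process is available, the same bookkeeping could live in a database table keyed by user instead of in memory.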

The most important thing is to understand how acute the problem really is. In my experience, excessive "forethought" only wastes time and complicates the project. If you don't yet see even a hint of this problem, it's better not to build any filters at all. Just make sure files are cleaned up once a day, and add logging on interesting criteria: the number of files, their total size, and so on. Also, for a serious project it is essential to connect a monitoring system with metrics and notifications (such as Zabbix) and configure alerts on the parameters of interest when they grow critically. That way, if the number of files suddenly spikes, you won't miss it: you can take action and then think about how to filter in the future.

I have used a similar pattern in several recent projects. Those projects were closed, so attackers were not a concern; you don't face them yet either, so only preventive measures are needed. So: every uploaded file gets a record in the database. When the form is saved, those records become linked to the form's entities (or whatever else). Once a day, a cron script walks the file system and checks, for each file, whether it has a corresponding database record and whether that record has a meaningful link in the database. If there is no link, the file is garbage and gets deleted. Note that walking the file system puts a heavy load on the disk, and on top of that generates many database queries. Therefore all of this happens in the middle of the night, and the walk is done in parts: a full pass takes a set number of days, say 10, and each day only 1/10 of all files are checked. At the end of each run the current position is saved, and the next run continues from that point.
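The sharded nightly cleanup described above might look like this minimal sketch. The `uploads(filename, post_id)` table (where `post_id` stays NULL until the form is saved), the shard count, and all names are assumptions for illustration:

```python
import os
import sqlite3
import zlib

SHARDS = 10  # a full pass over all files takes SHARDS nights

def shard_of(name):
    # Stable mapping of a filename to one of SHARDS buckets.
    return zlib.crc32(name.encode("utf-8")) % SHARDS

def cleanup_shard(conn, upload_dir, shard):
    """Delete files in `shard` with no useful DB link; return names removed."""
    removed = []
    for name in sorted(os.listdir(upload_dir)):
        if shard_of(name) != shard:
            continue  # not this night's portion of the file system
        row = conn.execute(
            "SELECT post_id FROM uploads WHERE filename = ?", (name,)
        ).fetchone()
        # Garbage if the file has no record at all, or a record
        # that was never linked to a saved post.
        if row is None or row[0] is None:
            os.remove(os.path.join(upload_dir, name))
            removed.append(name)
    return removed
```

A nightly cron entry would then call `cleanup_shard` with `shard = night_number % SHARDS`, persisting the counter between runs so each run continues where the last one stopped.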

PS: Having given my opinion, I nevertheless join the author: I will be very glad if someone describes elegant ways to solve this problem.

  • I forgot to say that the project will live on shared hosting, though with the ability to create subdomains. Security-wise, I'm thinking of creating a subdomain img.site.com, uploading there, and configuring .htaccess. The problem is that your methods only work well when you have full control over the server; then there are naturally more options. But I need an option for shared hosting, and an optimal one. I'm still leaning toward the buffer zone, which would limit the total number of files. - user199432
  • I.e., if the form is submitted successfully, the files are moved from the buffer zone to the shared images folder. They are then considered valid, because the form has passed validation, which means the data has been written to the database and the pictures have been moved to their permanent folder. But if someone starts clogging the buffer zone with uploads and hits the limit, users won't be able to attach images. Here I want to come up with something clever, like detecting someone mass-uploading photos. I'd rather prepare for such a problem in advance than have to figure out a solution only once it happens. - user199432
  • Wow, shared hosting is tough. Limiting the total number of files... Maybe I don't fully understand your idea, but it doesn't sound great. I.e., I (a legitimate user) fill out the form, an attacker comes along and starts spamming files, the buffer zone fills up, and my files (being the oldest) get deleted too? In my opinion, you can't get by without tracking specific users if you want a proper implementation. - Ivan Pshenitsyn
  • No, nothing gets deleted; it just becomes impossible to upload images. The buffer zone also shields against an upload flood: for example, if 20 users each upload 2 pictures at once, that's 40 files (just an example), and we set the buffer zone's limit at 41 files. Then if a 21st user tries to attach a picture, a message pops up: sorry, the service is having problems, try again later. The buffer zone frees itself up as legitimate users work, i.e. submit the form normally. P.S. The error won't appear on the site page itself, but above the Browse button. - user199432
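  The buffer-zone gate and the move-on-submit step from these comments could be sketched roughly like this; the directory names and the limit are the comments' example values, not a recommendation:

```python
import os
import shutil

BUFFER_LIMIT = 40  # e.g. room for 20 users x 2 pictures, as in the comment

def buffer_has_room(buffer_dir, limit=BUFFER_LIMIT):
    """True if another upload may enter the staging (buffer) directory.
    When False, the caller shows the 'sorry, try again later' message."""
    return len(os.listdir(buffer_dir)) < limit

def promote(buffer_dir, images_dir, filenames):
    """On successful form submit, move the user's files out of the buffer
    into the permanent images folder, freeing buffer slots."""
    for name in filenames:
        shutil.move(os.path.join(buffer_dir, name),
                    os.path.join(images_dir, name))
```

  A count-based gate like this runs fine on shared hosting (no root access needed), but as the next comment notes, it punishes all users at once when an attacker fills the buffer.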
  • 1 - how do you clear the buffer? You can't avoid clearing it, because unattached files are normal and in small quantities inevitable. And does shared hosting even have cron jobs? I don't know. 2 - doesn't it bother you that if an attacker acts, all the other users suffer because uploads are blocked for them too? Overall, the idea deserves to live, especially under your constraints, but the details definitely need more thought. - Ivan Pshenitsyn