Hello! A question for developers who may have run into this. The idea: write code in a modular way (each module in its own file) and include only the required modules in the main file (say, 5 out of all 30). The alternative: once the modules are written, merge them all into one file and make a single include from the main file. For a clearer picture: include 5 required files of 100 KB each and work with those, or include one 3000 KB file with all the modules glued together at once — which is faster?

  • Whoa, you have 100 KB modules...? - mountpoint
  • That was just a figure of speech =)) So that there are numbers to compare! - axejko
  • Next thing you'll be building a compiler for PHP. - zb '
  • Class autoloading is the way to go. - nolka
  • "Class autoloading is the way to go" - I considered that option. Quoting the PHP documentation: "You may define an __autoload() function which is automatically called in case you are trying to use a class/interface which hasn't been defined yet." I don't think that's the best variant. If I'm not mistaken, the function scans the directory of files every time and then, if needed, does the include anyway. That's slow, and in my opinion even worse! Please don't judge the idea itself; if you have experience, tell me which is faster: including many small files, or one big file that is nevertheless several times smaller than the combined size of all the small ones? - axejko

4 answers

Are you thinking about optimizing the includes? Then specify the full path and there will be no lookup through the include path at all. In general, PHP bottlenecks are usually not in loading files: they are first of all in the application logic, second (and often first) in exchanges with the database, and only third in interpreting the code — regardless of whether it sits in one file or in several. Moreover, as you noted yourself, not every module is needed on every request, so with the "one big file" scheme you will, at minimum, pay for parsing that entire large file on every run.

  • I'm also leaning toward "better several useful includes than one big file with a bunch of extra code"! Thanks! - axejko
  • Yes, and by the way, my answer doesn't exclude autoload — it even welcomes it. - zb '
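The two pieces of advice above — full paths and autoloading — combine naturally. A minimal sketch, assuming a `modules/` directory and a class-name-to-file-name convention (both are illustrative assumptions, not from the thread), using `spl_autoload_register()`, the modern replacement for the `__autoload()` function quoted earlier:

    <?php
    // Sketch of a path-based autoloader: class name maps to a file path
    // by plain string concatenation, so no directory scanning happens.
    // MODULE_DIR and the naming convention are assumptions for illustration.

    define('MODULE_DIR', __DIR__ . '/modules');

    spl_autoload_register(function ($class) {
        // e.g. "Cart" -> "/path/to/modules/Cart.php"
        $file = MODULE_DIR . '/' . str_replace('\\', '/', $class) . '.php';
        if (is_file($file)) {
            require $file; // full path, so PHP does not search the include_path
        }
    });

    // From here on, only the classes actually used on this request get loaded:
    // $cart = new Cart(); // triggers exactly one require of modules/Cart.php

With this scheme the "which files to include" question answers itself: each request loads only the modules it touches, and each load is a single full-path require.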

In general, such a trick does give a performance boost (especially in combination with an opcode accelerator). require/include spends time opening and reading the file in any case, and that time depends on the HDD and the file system — not everyone has SSD drives. For example, take Zend and see here; so it does make sense. But not every project architecture lets you implement it.

  • "Such a trick does give a performance boost" — which one? Including several small files, or one large one? - axejko
  • One big one. - wendel841
  • In any normal operating system, frequently accessed files are cached anyway. - zb '

That's nonsense. In fact, you only add load.

With the include (where 5.php contains `$o = 2 + 1; echo $o . '<br>';`):

    <?php
    include('5.php'); // 5.php contains: $o = 2 + 1; echo $o . '<br>';
    echo memory_get_usage() . "\n"; // 50632
    ?>

Without the include:

    <?php
    $o = 2 + 1;
    echo $o . '<br>';
    echo memory_get_usage() . "\n"; // 50192
    ?>

It's like shopping: you can carry everything from the store in one big bag, or split it into several — the total load only increases (packing time plus the weight of the bags themselves). It's a different matter if you only carry home the small bags you actually need, i.e. keep separate files like rega.php and lk.php. Then again, handling one large bag is easier than many small ones; merging really only makes sense for files of the same kind, such as config.php.

But better not to agonize over this stuff — write whichever way is more convenient for you; introducing new variables costs far more anyway.

    Check out my article for an exhaustive study of the subject:

    Performance comparison of autoloading and merging classes into one file