I have a site about games. Each page contains a link to buy the game from the store that currently offers the lowest price.
Once a day I download 5 XML files from different stores; some of them are several megabytes. To build a page, I read every file, find the game and its price in each one, pick the lowest price, and insert it into the page.
So for every page I have to read these 5 multi-megabyte XML files. There are a lot of pages, and around 2000 visitors per day. This does not seem efficient, and it consumes a lot of disk and CPU resources. How should I organize this process properly?
Resolved! I will store the data in a database.
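Since the feeds only change once a day, each file can be parsed a single time at download and only the fields the pages need cached in the database. A minimal sketch using Python's standard library (the `<offer>`/`<name>`/`<price>` tag names are placeholders, since each store's XML layout differs and will need its own small mapping); `iterparse` streams the file, so a multi-megabyte feed is never held in memory all at once:

```python
import sqlite3
import xml.etree.ElementTree as ET

def open_db(path="prices.db"):
    """Open the price cache; one row per (store, game)."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS prices (store TEXT, game TEXT, price REAL)"
    )
    return db

def import_feed(db, store_name, xml_path):
    """Stream one store's feed into the prices table.

    Tag names <offer>, <name>, <price> are assumptions -- adjust
    them per store's actual XML schema.
    """
    cur = db.cursor()
    # Replace yesterday's rows for this store
    cur.execute("DELETE FROM prices WHERE store = ?", (store_name,))
    for event, elem in ET.iterparse(xml_path, events=("end",)):
        if elem.tag == "offer":
            name = elem.findtext("name")
            price = elem.findtext("price")
            if name and price:
                cur.execute(
                    "INSERT INTO prices (store, game, price) VALUES (?, ?, ?)",
                    (store_name, name, float(price)),
                )
            elem.clear()  # free the subtree we just processed
    db.commit()
```

Run `import_feed` once per store right after the daily download; page rendering then only ever touches the database, never the XML.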
What do you think is better: store each feed in a separate table, or merge everything into one? The number and names of the fields differ across the XML files.
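Since every feed presumably has at least a store, a game name, and a price, one common approach is a single table holding only those shared fields, ignoring store-specific extras during import. A sketch with illustrative table and column names, using SQLite:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute(
    """
    CREATE TABLE prices (
        store TEXT NOT NULL,
        game  TEXT NOT NULL,
        price REAL NOT NULL,
        url   TEXT              -- link to the offer in that store
    )
    """
)
# Index so the per-page lookup does not scan the whole table
db.execute("CREATE INDEX idx_prices_game ON prices (game)")

# In practice these rows come from the daily XML import; sample data here
db.executemany(
    "INSERT INTO prices VALUES (?, ?, ?, ?)",
    [
        ("store_a", "Doom", 9.99, "http://a.example/doom"),
        ("store_b", "Doom", 7.49, "http://b.example/doom"),
        ("store_a", "Quake", 4.99, "http://a.example/quake"),
    ],
)

# Page query: the cheapest offer for one game.
# SQLite guarantees that bare columns next to MIN() come from
# the row that held the minimum.
store, price, url = db.execute(
    "SELECT store, MIN(price), url FROM prices WHERE game = ?",
    ("Doom",),
).fetchone()
```

With one table the "lowest price" lookup is a single indexed query, and adding a sixth store later means adding rows, not a new table.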