This is not a holy war, just a concrete task. There is a PostgreSQL database, and lately it has been performing rather poorly. The current setup: every 5 minutes cron runs a Python script that pulls the last day's worth of data for the graphs. In total the text file on the server holds 250k records (graph id, value, time). On the PHP side I fetch that data via AJAX and parse it, picking which data to display depending on the graph. Looks lovely. Firefox loads the data in about 1 second, but IE takes 10 seconds to process it, so the performance gain is small.
So, experts, the question is: how do I optimize this? It seems to me that Python should be faster at parsing the values by id, but can a PHP page run the script, and how would I pass it the id values?
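For the "can PHP run the script and pass it the ids" part, here is a minimal sketch of what the Python side could look like. The file name `data.txt` and the whitespace-separated `graph_id value timestamp` line format are assumptions, not the real layout:

```python
#!/usr/bin/env python3
"""Sketch: filter the cron-generated data file down to one graph's rows.

Assumptions: the file is "data.txt" and each line is
"graph_id value timestamp", whitespace-separated.
"""
import sys


def filter_by_graph(path, graph_id):
    """Return only the (id, value, time) rows belonging to one graph."""
    rows = []
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) == 3 and fields[0] == graph_id:
                rows.append(tuple(fields))
    return rows


if __name__ == "__main__":
    # A PHP page can run this with shell_exec("python3 filter.py 42")
    # and echo the output straight back to the browser.
    for row in filter_by_graph("data.txt", sys.argv[1]):
        print("\t".join(row))
```

The id travels as a plain command-line argument, so PHP's `shell_exec()` (or `escapeshellarg()` + `exec()`) is enough to wire the two together.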
And if I do this with PHP, then passing the id values and the time with GET or POST is a trivial matter. But then the server parses it all over again, which feels like pointless busywork. Has anyone done something like this?
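Alternatively, the GET parameters could reach a Python script directly over HTTP, with no PHP relay in the middle. A minimal WSGI sketch, assuming the parameter is called `id` and a plain-text response is acceptable:

```python
# Minimal WSGI sketch: read the graph id from the GET query string and
# answer with only that graph's data. The parameter name "id" and the
# plain-text response body are assumptions for illustration.
from urllib.parse import parse_qs


def application(environ, start_response):
    params = parse_qs(environ.get("QUERY_STRING", ""))
    graph_id = params.get("id", [""])[0]
    # In the real script the rows would come from the data file or the DB;
    # here we just echo the requested id back.
    body = "requested graph: {}".format(graph_id).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]
```

Served with any WSGI server (even the stdlib `wsgiref.simple_server`), the AJAX call could then hit it directly with `?id=42`, and only the pre-filtered rows would travel to the browser.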
UPD
For simplicity, I want to implement this scheme: a Python script queries the database -> writes a file with ALL the values -> PHP/Python parses the data -> the end user, from the PHP page, gets only the data for their graphs. I'm stuck on the tool for parsing the text file. PHP seems easy, but what about performance? Python seems like a nice language, but how do I pass it the ids, and is that even possible?
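On the parsing-performance question: the file only needs to be scanned once if the rows are grouped by graph id up front. A sketch, again assuming the whitespace-separated three-field format:

```python
# Sketch: one pass over the file builds an index keyed by graph id,
# so every per-graph request afterwards is a dictionary lookup rather
# than a rescan of all 250k lines. The line format is an assumption.
from collections import defaultdict


def index_by_graph(lines):
    """Group (value, time) pairs by graph id in a single pass."""
    index = defaultdict(list)
    for line in lines:
        fields = line.split()
        if len(fields) == 3:
            graph_id, value, ts = fields
            index[graph_id].append((value, ts))
    return index
```

For a file of this size the single pass typically takes a fraction of a second, and lookups like `index["42"]` are effectively free afterwards, so the end user's page never pays for the other graphs' data.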
Also interested in: the difference between reading the file and querying the database directly.