Hello!

I have a list of, say, 5,000 items that I build with a for loop or a list comprehension. Once the values are filled in, I can access any of them, but how do I avoid re-running the script and rebuilding the list every time? I'd like to create it once, then add to or edit it, and be able to import it into other modules, because running the script and building a list of n elements takes a long time. Maybe I need a database for this? I apologize if I explained the problem vaguely. :)

  • 3
    See the answer about pickle. - BOPOH
  • 1
    A database is also an option: you can dump the list to a file after creating it and then read it from there in any module... 5k records will be read very quickly. It depends on your needs. - Isaev
  • Thanks, the pickle module suits my tasks well. ) Do I understand correctly that it can also be used with dictionaries, and that entries can then be appended by writing at the end of the file? - Piroru

1 answer

If creating a list of 5,000 items (a small number) takes too long when the program starts, you can try to optimize the code that builds it, read the already computed data from disk, the network, etc. (if that takes a reasonable time), or simply keep the program running all the time (why not).

You can use different formats to save:

  • json - a simple format, readable by both humans and machines, understood by many languages;
  • sqlite3 - a relational database in a single file;
  • pickle - convenient, supports many Python objects as-is, which can lead to problems across different versions of the program (if the classes have changed). Not portable across languages and unsafe (loading untrusted data may lead to execution of arbitrary code);
  • shelve - works like a dictionary, but stored on disk (uses pickle internally).
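For example, a minimal sketch with pickle (the filename and the list contents here are made up for illustration; the point is that later runs load the data instead of recomputing it):

```python
import pickle

# Build the list once (a toy stand-in for the expensive computation).
items = [i * i for i in range(5000)]

# Save it to disk once...
with open("items.pickle", "wb") as f:
    pickle.dump(items, f)

# ...and in later runs (or other modules) load it instead of recomputing.
with open("items.pickle", "rb") as f:
    loaded = pickle.load(f)

print(loaded[:3])  # → [0, 1, 4]
```

json.dump/json.load would work the same way for plain lists of numbers and strings, with the bonus of a human-readable file.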

to be able to import into other modules

You can simply import the global object (either the list itself or a proxy with a restricted interface, for example, to prevent modification by other modules):

from source_module import your_list_proxy
item = your_list_proxy[index]
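A minimal sketch of such a read-only proxy (the class name ReadOnlyList and the sample data are made up for illustration; in source_module you would wrap the real list):

```python
from collections.abc import Sequence

class ReadOnlyList(Sequence):
    """Wraps a list and exposes only read access (no __setitem__)."""
    def __init__(self, items):
        self._items = list(items)

    def __getitem__(self, index):
        return self._items[index]

    def __len__(self):
        return len(self._items)

your_list_proxy = ReadOnlyList([10, 20, 30])
item = your_list_proxy[1]  # reading works: 20
# your_list_proxy[1] = 99 would raise TypeError, since no __setitem__ is defined
```

Inheriting from Sequence also gives the proxy iteration, `in`, `index()` and `count()` for free.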


@Piroru: it is possible. There are several ways: for example, each element of the list in its own row in a table (or even across several tables), or the entire list as a single json column. Which option to choose deserves a separate question (as does which ORM to use, or whether to use one at all). What would help answer it: what kind of list it is, what operations are performed on it and how often, and what the requirements are for performance, portability to different systems, and the possibility of installing Python packages from PyPI.
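To illustrate the "each element in its own row" option, here is a sketch using the stdlib sqlite3 module as a stand-in (the table name and sample IDs are made up; with PostgreSQL the same SQL would go through a driver such as psycopg2, using %s placeholders instead of ?):

```python
import sqlite3

ids = [101, 102, 103]  # e.g. IDs pulled from an API

conn = sqlite3.connect(":memory:")  # use a file path to persist the data
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO items (id) VALUES (?)", [(i,) for i in ids])
conn.commit()

# Any other module can now read the list back from the database.
restored = [row[0] for row in conn.execute("SELECT id FROM items ORDER BY id")]
```

One row per element makes it easy to later add related tables (articles, messages, likes) with foreign keys pointing at `items.id`.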

  • Tell me more, please: is it possible to write lists into PostgreSQL tables? If so, please give a link to a good article. - Piroru
  • The list consists of IDs pulled via the Facebook API; dictionaries are then built from this list (they should also be stored in the database), consisting mainly of a key (id) and the text of an article. After writing to the database, the plan is to create one-to-many relationships between tables and, by querying by id, retrieve the rest of the information associated with an article (messages, likes, etc.). The operations will run once a month (records will be added and tallied), performance doesn't matter much, nor does portability to other systems, and installing packages is possible. - Piroru
  • 1
    @Piroru: given the liberal requirements, you can use the first, simplest thing that works and be ready to rewrite it once a month (change the db schema and the libraries used) as needed. In that case, lean toward a normalized db to reduce the likelihood of some classes of errors. Reading articles won't be enough; you'll need to read a book's worth of material (most of it will then be forgotten and become just "common sense"). - jfs
  • Thanks, I'll look into this topic. ) - Piroru