If building the list of ~5,000 items (a small number) takes too long at program start, you have a few options: optimize the computation itself, load the already-computed data from disk or the network if that takes acceptable time, or simply keep the program running all the time (why not).
You can save the data in different formats:
- json — a simple format, readable by both humans and machines and understood by many languages;
- sqlite3 — a relational database in a single file;
- pickle — convenient, stores many Python objects as-is, which can cause problems across program versions (if the classes have changed); not portable and unsafe (loading untrusted data can lead to arbitrary code execution);
- shelve — like a dictionary, but on disk (uses pickle internally).
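For instance, caching the computed list as json might look like this (the file name and the `build_items` function are illustrative, standing in for your expensive computation):

```python
import json
import os

CACHE_FILE = "items_cache.json"  # hypothetical cache location

def build_items():
    # placeholder for the slow computation done at program start
    return [i * i for i in range(5000)]

def load_items():
    # reuse the cached result if it exists, otherwise compute and save it
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, "r", encoding="utf-8") as f:
            return json.load(f)
    items = build_items()
    with open(CACHE_FILE, "w", encoding="utf-8") as f:
        json.dump(items, f)
    return items
```

The first call computes and writes the cache; subsequent calls (even in later runs of the program) just read the file.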
To make the data available for import into other modules:
You can simply import the global object (either the list itself, or a proxy with a restricted interface — for example, to prevent modification by other modules):

```python
from source_module import your_list_proxy

item = your_list_proxy[index]
```
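A minimal read-only proxy could be sketched like this (the class name, module layout, and sample data are made up for illustration):

```python
class ReadOnlyListProxy:
    """Wraps a list and exposes only read operations."""

    def __init__(self, data):
        self._data = data

    def __getitem__(self, index):
        return self._data[index]

    def __len__(self):
        return len(self._data)

    def __iter__(self):
        return iter(self._data)

# the module keeps the mutable list private and exports only the proxy
_items = [10, 20, 30]
your_list_proxy = ReadOnlyListProxy(_items)
```

Other modules can index, iterate, and measure the list through the proxy, but since the proxy defines no `__setitem__` or `append`, attempts to modify it fail.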
Tell me more, please — and is it possible to store lists in PostgreSQL tables?
@Piroru: it is possible. There are several approaches: for example, each list element in its own row of a table (or even spread across several tables), or the entire list as a single json column. Which option to choose deserves a separate question (as does which ORM to use, or whether to use one at all). What would help answer it: what kind of list it is, what operations are performed on it and how often, the performance requirements, portability across systems, and whether packages from PyPI can be installed.
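As a sketch of the "one element per row" approach — shown here with the stdlib sqlite3 module so it runs without a database server; the SQL is essentially the same in PostgreSQL (where the whole-list variant could instead use a json/jsonb column):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a real database connection
conn.execute("CREATE TABLE items (pos INTEGER PRIMARY KEY, value TEXT)")

data = ["alpha", "beta", "gamma"]
# store each list element as its own row, keyed by its position
conn.executemany(
    "INSERT INTO items (pos, value) VALUES (?, ?)",
    list(enumerate(data)),
)
conn.commit()

# read the list back, preserving the original order
restored = [row[0] for row in conn.execute("SELECT value FROM items ORDER BY pos")]
```

With a PostgreSQL driver such as psycopg2 the placeholder syntax would be `%s` instead of `?`, but the structure is the same.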