I want to save a file in a separate process. Is that safe? After all, separate processes shouldn't suffer from the same problems as multiple threads, right?

    import uuid
    from multiprocessing import Process

    def update(self, request):
        uid = str(uuid.uuid4())
        # Hand the actual write off to a separate process and return immediately.
        Process(target=self.save_file, args=(uid, request)).start()
        return uid
  • If several processes write to one file at the same time, the result will be a garbled mess. - Arnial
  • In my case everything goes to a single log. I haven't tested it in production, but locally it all works fine. :) - faoxis
  • Which multithreading problems do you mean? logging is thread-safe by default, but writing to a single log from several processes requires additional synchronization. See the documentation on logging to a single file from multiple processes (a sketch of that pattern follows these comments). - jfs
  • Is your question "Is it possible to write to one log file from several processes?" or something else? Try to write more informative titles; you have more than 100 questions. - jfs
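
For reference, here is a minimal sketch of the QueueHandler/QueueListener pattern that jfs's comment points to (available since Python 3.2); the logger name 'app' and file name 'app.log' are placeholders, not anything from the question:

    import logging
    import logging.handlers
    import multiprocessing

    def worker(queue):
        # Each worker process sends log records to the shared queue
        # instead of touching the file itself.
        logger = logging.getLogger('app')
        logger.addHandler(logging.handlers.QueueHandler(queue))
        logger.setLevel(logging.INFO)
        logger.info('message from pid %s', multiprocessing.current_process().pid)

    if __name__ == '__main__':
        queue = multiprocessing.Queue()
        # A single listener thread in the parent is the only writer to the file.
        listener = logging.handlers.QueueListener(
            queue, logging.FileHandler('app.log'))
        listener.start()
        procs = [multiprocessing.Process(target=worker, args=(queue,))
                 for _ in range(3)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        listener.stop()

The key design point is that only the listener ever opens the log file; worker processes merely put records on the queue, so no two processes compete for the file.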

1 answer

Writing to a single file from several processes is not safe. In the example below the file should end up with 2000 lines, but it actually contains 1850:

    import multiprocessing

    file = '1.tmp'

    def write(c):
        # Append c numbered lines, reopening the file for each one.
        for r in range(c):
            with open(file, 'a') as f:
                f.write('%s\n' % r)
        print('count {}, r {}'.format(c, r))

    if __name__ == '__main__':
        # Truncate the file once, in the parent only, so that spawn-based
        # platforms do not re-truncate it when each worker imports the module.
        open(file, 'w').write('')
        with multiprocessing.Pool(processes=3) as pool:
            pool.map(write, [1000] * 2)
        print('len', len(list(open(file))))

out:

     count 1000, r 999
     count 1000, r 999
     len 1850
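
For completeness, a minimal sketch of one way to fix the demo: serialize the appends with a lock. It assumes a multiprocessing.Manager lock, which, unlike a plain multiprocessing.Lock, can be pickled and passed through pool.map; packing the lock and the count into a tuple argument is purely illustrative:

    import multiprocessing

    file = '1.tmp'

    def write(args):
        lock, c = args
        for r in range(c):
            # Only one process may append at a time.
            with lock:
                with open(file, 'a') as f:
                    f.write('%s\n' % r)

    if __name__ == '__main__':
        manager = multiprocessing.Manager()
        lock = manager.Lock()  # a picklable proxy, safe to send to pool workers
        open(file, 'w').write('')
        with multiprocessing.Pool(processes=3) as pool:
            pool.map(write, [(lock, 1000)] * 2)
        print('len', len(list(open(file))))  # expected: len 2000

With every append guarded by the lock, none of the 2000 writes is lost, at the cost of making the writes fully sequential.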