I use Parallel::ForkManager for parallel processing in Perl (it forks processes rather than threads). The script itself is simple: a file is opened, each line is read in a while loop, and Parallel::ForkManager hands the line off to a function for processing. The problem is that if the input should produce 15 function calls, the output can contain 15 + n records (sometimes 1 extra, sometimes 10), i.e. one of the calls in the loop gets duplicated.
Is it possible to check for this somehow so that the data is not duplicated? The duplicated records differ slightly from the originals (each record includes a timestamp, and the times are off by one second), so the duplicate cannot simply be filtered out of the array.
The script itself:
use Parallel::ForkManager;
use Text::ParseWords;
use IPC::Shareable;

my @wPrint = 0;
my $handle = tie @wPrint, 'IPC::Shareable', { destroy => 'Yes' };
my $fileLog = 'test.txt';
$pm = new Parallel::ForkManager(5);

sub myFunc {
    my ($s) = @_;
    my @arr = quotewords(":", 0, $s);
    $handle->shlock();
    push(@wPrint, $arr[0].":".$arr[1].":".$arr[2].":".$arr[3].":".$arr[4].":\n");
    $handle->shunlock();
}

open(my $file, '<:encoding(UTF-8)', $fileLog);
while (my $row = <$file>) {
    my $pid = $pm->start and next;
    myFunc($row);
    $pm->finish;
}
close $file;

$pm->wait_all_children;
print @wPrint;
IPC::Shareable->clean_up_all;
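For what it's worth, one way I have tried to make the duplicates visible (a sketch of a diagnostic, not a confirmed fix; the tagging with the input line number `$.` and the `%seen` counting are my own additions, not part of the original script) is to record which input line produced each output record, then count occurrences after `wait_all_children` — any line with a count above 1 was processed twice:

```perl
use strict;
use warnings;
use Parallel::ForkManager;
use IPC::Shareable;

# Shared array, tied the same way as in the script above.
tie my @results, 'IPC::Shareable', { destroy => 'Yes' } or die "tie failed";
my $handle = tied @results;

my $pm = Parallel::ForkManager->new(5);

open my $fh, '<:encoding(UTF-8)', 'test.txt' or die "open: $!";
while (my $row = <$fh>) {
    my $line_no = $.;        # remember the input line number before forking
    $pm->start and next;
    chomp $row;
    $handle->shlock();
    push @results, "$line_no:$row";   # tag each record with its source line
    $handle->shunlock();
    $pm->finish;
}
close $fh;
$pm->wait_all_children;

# Count how many records each input line produced; > 1 means a duplicate.
my %seen;
$seen{ (split /:/, $_, 2)[0] }++ for grep { defined && length } @results;
for my $line (sort { $a <=> $b } keys %seen) {
    print "input line $line produced $seen{$line} record(s)\n";
}
IPC::Shareable->clean_up_all;
```

This does not remove the duplicates by itself, but it shows exactly which input lines are being processed more than once, which the one-second timestamp difference otherwise hides.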
The data in the file in this form:
10.0.0.1:Name 1:0:0:0:
10.0.0.2:Name 2:0:0:0:
10.0.0.3:Name 3:0:0:0: