Hello. I have several files with numeric data; each file contains a different number of lines. The approximate structure is:
2015 3 1 0 7 20 796.00 27 1
2015 3 1 0 7 20 796.00 27 1
2015 3 1 0 7 20 796.00 27 1
I need to read the data files and collect their contents into one array for calculations. The problem is that the total number of rows in the combined array is not known in advance, so I implemented a nested ArrayList:

ArrayList<ArrayList<Double>> massivData = new ArrayList<ArrayList<Double>>();

This approach works well only for small amounts of data. However, when I need to read, say, 100 files (each averaging 50,000 lines), Java simply exhausts the RAM. When working with a static array, the speed is quite acceptable.
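For illustration, here is a minimal sketch of how I read the data into the nested ArrayList (file I/O replaced by an in-memory list of sample lines; the class and method names are my own, only massivData is from my real code):

```java
import java.util.ArrayList;
import java.util.List;

public class NestedListDemo {
    // Parse one whitespace-separated line of numbers into a row of Doubles.
    static ArrayList<Double> parseRow(String line) {
        ArrayList<Double> row = new ArrayList<>();
        for (String tok : line.trim().split("\\s+")) {
            row.add(Double.parseDouble(tok));
        }
        return row;
    }

    public static void main(String[] args) {
        // Simulated file contents (structure as shown above).
        List<String> lines = List.of(
            "2015 3 1 0 7 20 796.00 27 1",
            "2015 3 1 0 7 20 796.00 27 1");

        ArrayList<ArrayList<Double>> massivData = new ArrayList<>();
        for (String line : lines) {
            massivData.add(parseRow(line));
        }
        System.out.println(massivData.size());        // 2
        System.out.println(massivData.get(0).get(6)); // 796.0
    }
}
```

Every value ends up as a boxed Double inside a per-row ArrayList, which I suspect is where the memory overhead comes from.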
I do not know the best way to combine the data from all the files into one array. One option I am considering: while reading the files, write the data from each file into a temporary merged file, and then build an array of the same shape as that merged file.
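Roughly, what I have in mind is something like this (just a sketch of the idea; all names here are hypothetical): first concatenate the files into one merged file while counting lines, and then, since the total row count is known, read the merged file into a fixed-size primitive double[][] instead of boxed Doubles.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class MergeDemo {
    // Pass 1: append every line of each input file to one merged file,
    // returning the total line count so a fixed-size array can be allocated.
    static long mergeFiles(List<Path> inputs, Path merged) throws IOException {
        long total = 0;
        try (BufferedWriter out = Files.newBufferedWriter(merged)) {
            for (Path p : inputs) {
                try (BufferedReader in = Files.newBufferedReader(p)) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        out.write(line);
                        out.newLine();
                        total++;
                    }
                }
            }
        }
        return total;
    }

    // Pass 2: the row count is now known, so read the merged file into a
    // primitive double[][] (no per-value Double boxing, no list resizing).
    static double[][] readMerged(Path merged, long rows, int cols) throws IOException {
        double[][] data = new double[(int) rows][cols];
        try (BufferedReader in = Files.newBufferedReader(merged)) {
            String line;
            int r = 0;
            while ((line = in.readLine()) != null) {
                String[] tok = line.trim().split("\\s+");
                for (int c = 0; c < cols; c++) {
                    data[r][c] = Double.parseDouble(tok[c]);
                }
                r++;
            }
        }
        return data;
    }

    public static void main(String[] args) throws IOException {
        // Two small sample files with the structure shown above.
        Path a = Files.createTempFile("a", ".txt");
        Path b = Files.createTempFile("b", ".txt");
        Files.write(a, List.of("2015 3 1 0 7 20 796.00 27 1"));
        Files.write(b, List.of("2015 3 1 0 7 20 796.00 27 1",
                               "2015 3 1 0 7 20 796.00 27 1"));

        Path merged = Files.createTempFile("merged", ".txt");
        long rows = mergeFiles(List.of(a, b), merged);
        double[][] data = readMerged(merged, rows, 9);
        System.out.println(rows);       // 3
        System.out.println(data[2][6]); // 796.0
    }
}
```

I am not sure whether the extra disk pass is worth it compared to some other way of getting the row count up front.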
What do you advise?