MSSQL Server. There is a huge table (>15,000,000 records). Each record contains a unique identifier (the table key), a non-unique identifier (the software instance that writes to the database) stored as a string, a timestamp, and other less interesting data. In a CLR stored procedure I first want to collect all the non-unique identifiers and then, iterating over them, fetch the data of interest:
// Pseudocode
list_of_serials = SELECT [SerialNo] FROM [Table] GROUP BY [SerialNo];

foreach (serial in list_of_serials)
{
    rowset = SELECT * FROM [Table]
             WHERE [SerialNo] = @serial
               AND [Timestamp] BETWEEN @startDate AND @endDate
             ORDER BY [Timestamp];
    // Process the results
}
The trouble is that each query takes at least 10 minutes, even against a local database. I have been experimenting with indexes, but so far without any tangible results. How can I tackle this problem? Adding extra metadata tables and triggers is too late at this point =) I would appreciate any ideas.
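One direction worth trying is to make the per-serial lookup an index seek instead of a table scan, and then to collapse the N+1 query loop into a single ordered pass. A minimal sketch, assuming the column names from the question (the index name is made up for illustration):

```sql
-- A composite index on (SerialNo, Timestamp) lets SQL Server seek
-- directly to one serial's rows, already sorted by Timestamp,
-- instead of scanning all 15M+ records on every iteration:
CREATE INDEX IX_Table_SerialNo_Timestamp
    ON [Table] ([SerialNo], [Timestamp]);

-- With that index in place, the foreach loop can often be replaced
-- by a single query; all rows for one serial arrive contiguously,
-- so the CLR SP can process them in one pass over the reader:
SELECT *
FROM [Table]
WHERE [Timestamp] BETWEEN @startDate AND @endDate
ORDER BY [SerialNo], [Timestamp];
```

In the single-pass variant the procedure just watches for the `SerialNo` value to change while reading the result set, which replaces thousands of round trips with one. Note that `SELECT *` prevents the index from covering the query; listing only the needed columns (or adding them via `INCLUDE`) may help further.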
Along the way, another .NET question: does it make sense to first collect the data into an IEnumerable<...> and then process it with LINQ, or is it better to query the database each time? (Again, this is inside a CLR SP.)