Good day. There are a dozen scripts on different servers, each of which generates a file every minute; the content is dynamic, but the structure is roughly the same. The file format is non-standard (not XML or JSON), but the data inside is structured. Files range from 1 MB to 5 MB. All of these files need to be consolidated in one system and served to a whole zoo of other systems on request, with the ability for each consumer to take only the data it needs. The system that generates the files cannot be touched (legacy straight from the Mesozoic). The data must be stored somewhere for 24 hours. Right now everything is passed around as files between a bunch of programs over an ESB, and we are not satisfied with its speed or fault tolerance.
So the question is: what would be the best way to implement this?
I have the following chain in mind: server with the file -> service layer on each server (for file transformation and communication with Kafka) -> Kafka -> client systems. A rough sketch of what I imagine the service layer doing is below.
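This is only a minimal sketch of the per-server service layer, assuming Python with the confluent-kafka client; the broker addresses, topic name, watched directory, file extension and the whole `parse_file()` body are placeholders, since the real parsing depends on the legacy format:

```python
# Sketch of the per-server service: watch for new files from the legacy script,
# turn them into structured records, and publish the records to Kafka.
import json
import time
from pathlib import Path

from confluent_kafka import Producer

BROKERS = "kafka1:9092,kafka2:9092"   # assumption: your Kafka cluster
TOPIC = "legacy-files"                # assumption: one topic for all servers
WATCH_DIR = Path("/var/data/out")     # assumption: where the legacy script drops files


def parse_file(path: Path):
    """Placeholder parser: turn the non-standard file into structured records.
    Yielding small records lets consumers pick only the fields they need."""
    for line in path.read_text(encoding="utf-8").splitlines():
        # Real parsing of the legacy format goes here.
        yield {"source": path.name, "raw": line}


def main():
    producer = Producer({"bootstrap.servers": BROKERS})
    processed = set()
    while True:
        for path in WATCH_DIR.glob("*.dat"):      # assumption: file extension
            if path in processed:
                continue
            for record in parse_file(path):
                # Key by source file/server so one server's data stays ordered
                # within a single partition.
                producer.produce(
                    TOPIC,
                    key=path.name,
                    value=json.dumps(record).encode("utf-8"),
                )
            producer.flush()
            processed.add(path)
        time.sleep(5)


if __name__ == "__main__":
    main()
```

In a real service the processed files would probably be moved or deleted instead of tracked in memory, but the idea is that the 1-5 MB file gets split into many small Kafka messages rather than pushed as one blob.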
But maybe I misunderstand how Kafka works: can this actually be implemented on it?
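In particular, if I understand Kafka's retention correctly, the "store for 24 hours" requirement would be handled by the topic's `retention.ms` setting rather than by the clients; something like the sketch below (topic name, partition count and replication factor are my guesses):

```python
# Sketch: create the topic with 24-hour retention so Kafka itself keeps the
# data for a day. Topic name, partitions and replication factor are assumptions.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "kafka1:9092,kafka2:9092"})
topic = NewTopic(
    "legacy-files",
    num_partitions=12,            # e.g. one per producing server
    replication_factor=3,         # for fault tolerance
    config={"retention.ms": str(24 * 60 * 60 * 1000)},  # 86400000 ms = 24 h
)
# create_topics() returns a dict of futures keyed by topic name.
for name, future in admin.create_topics([topic]).items():
    future.result()  # raises if creation failed
    print(f"created topic {name}")
```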