Hello!
Please help me solve a problem. I have a program that implements the AES algorithm, but it shows some unexpected behavior:
- decryption takes about twice as long as encryption;
- the result differs between computers: on one machine the source and decrypted files end up the same size, while on another they differ by several bytes.
What could be the reason? AES is a symmetric block cipher, so decrypting a file should take about the same time as encrypting it, or differ only slightly, but not by a factor of two. And why, on one computer, does the data written to the decrypted file match the source file exactly, while on the other computer extra empty bytes are appended to it, so the size of the decrypted file does not match the size of the original file?
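For context on the size mismatch: one common cause is block-cipher padding that is added before encryption but not stripped after decryption. Below is a minimal stdlib-only sketch of PKCS#7 padding (the scheme commonly used with AES in CBC mode). This is an illustration of the general mechanism, not the code of the program in question:

```python
BLOCK_SIZE = 16  # AES block size in bytes

def pkcs7_pad(data: bytes, block_size: int = BLOCK_SIZE) -> bytes:
    """Append 1..block_size bytes, each equal to the pad length."""
    pad_len = block_size - len(data) % block_size
    return data + bytes([pad_len]) * pad_len

def pkcs7_unpad(data: bytes) -> bytes:
    """Strip the padding added by pkcs7_pad; raise if it is malformed."""
    pad_len = data[-1]
    if not 1 <= pad_len <= len(data) or data[-pad_len:] != bytes([pad_len]) * pad_len:
        raise ValueError("invalid PKCS#7 padding")
    return data[:-pad_len]

plaintext = b"hello, AES padding"      # 18 bytes
padded = pkcs7_pad(plaintext)          # grown to the next 16-byte multiple
print(len(plaintext), len(padded))     # 18 32

# If the decrypting side forgets to strip the padding, the "decrypted"
# file keeps the extra pad bytes and its size no longer matches the source.
assert pkcs7_unpad(padded) == plaintext
```

If one of the two computers runs a build of the program that strips the padding and the other does not (or the two builds pad differently), the decrypted files would differ in size by exactly this kind of few-byte tail.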