Here is the situation. The program needs a stripped-down image editor with sliders that adjust gamma, contrast, and so on. The images being processed are about 6000 by 4000 pixels. Everything would be fine, except that applying, for example, the gamma correction takes about half a second; with a bit of cunning that is cut roughly in half. For example, gamma:
QTime time;
time.start();

double gamma = 0.5;
double G = 1 / gamma;
double preCalc[256];
for (int i = 0; i <= 255; i++) {
    // precompute all values up front to speed things up
    // (so this calculation is done 256 times instead of millions of times)
    preCalc[i] = pow(i / 255.0, G) * 255;
}

for (int y = 0; y < tempImg.rows; y++) {
    for (int x = 0; x < tempImg.cols; x++) {
        for (int c = 0; c < 3; c++) {
            tempImg.at<cv::Vec3b>(y, x)[c] =
                cv::saturate_cast<uchar>(preCalc[tempImg.at<cv::Vec3b>(y, x)[c]]);
        }
    }
}

qDebug() << time.elapsed();

As a result, it takes about a third of a second. Now imagine adding brightness and contrast adjustments on top of that... Yet Photoshop processes this image in real time as the slider moves, regardless of zoom level. Of course, I don't forget that the folks at Adobe can do a lot. The question is: what trick did they use? This happens on a laptop, and I'm not sure the GPU is even being used.
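For comparison, here is a minimal sketch of the same lookup-table idea expressed with OpenCV's built-in cv::LUT, which applies the table in optimized internal loops instead of per-pixel at<>() calls. This is not the original code above and not a claim about how Photoshop does it; it assumes tempImg is a CV_8UC3 cv::Mat, and the function names are made up for illustration.

#include <cmath>
#include <opencv2/core.hpp>

// Build a 256-entry lookup table for the gamma curve once per slider change.
static cv::Mat makeGammaLUT(double gamma)
{
    double G = 1.0 / gamma;
    cv::Mat lut(1, 256, CV_8U);
    uchar* p = lut.ptr();
    for (int i = 0; i < 256; ++i)
        p[i] = cv::saturate_cast<uchar>(std::pow(i / 255.0, G) * 255.0);
    return lut;
}

// Apply the table to the whole image; with a single-channel LUT,
// cv::LUT uses the same table for all three channels.
static void applyGamma(cv::Mat& img, double gamma)
{
    cv::LUT(img, makeGammaLUT(gamma), img);
}

Usage would be applyGamma(tempImg, 0.5); in place of the nested loops, with the timing code around it unchanged.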