My approach is to compute each affected pixel's color as a weighted average of the background color and the brush color, so that the brush stroke fades smoothly (feathers) into the background. But this takes a lot of CPU time.
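The averaging described above is just a linear interpolation between the two colors, weighted by how much of the pixel the brush covers. A minimal sketch (the function name and the 0..1 coverage parameter are illustrative, not from the original code):

```python
def blend(bg, brush, alpha):
    """Linearly interpolate between background and brush colors.

    alpha is the brush coverage of this pixel: 0.0 means pure background,
    1.0 means pure brush; values in between produce the feathered edge.
    Colors are (r, g, b) tuples of 0..255 integers.
    """
    return tuple(int(b + (f - b) * alpha + 0.5) for b, f in zip(bg, brush))
```

At the stroke's edge the coverage falls off gradually, which is what produces the feathering effect the question asks about.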



There is such a thing as a Gaussian blur. Not the most beautiful result, in my view, but fairly economical in terms of resources.

G(u, v) = 1 / (2πσ²) · e^(−(u² + v²) / (2σ²))
  • So what do I do? I understand that I need to draw the line into a buffer and apply a Gaussian blur to that buffer, but then the buffer has to be applied to the original image. How is that done? - ololo
  • It is done by "adding" matrices. Adding is in quotes because different overlay filters use different functions for combining two colors. The size of the matrix equals the brush diameter; their number depends on the length of the line, of course. CUDA support will most likely not appear in the near future, so either way you will be computing on the CPU. Then again, even an ARM processor normalizes 40,000 integers (a line 1000 pixels long, 10 pixels wide, 32 bits per pixel) in an extremely short time. And yes, of course the blurred line is computed first and then superimposed onto the drawing; that is faster. - knes
  • And wouldn't using a shader to render each pixel be more effective? - ololo
  • for example, HLSL - ololo
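The compositing step discussed in the comments (render the blurred line into its own buffer, then superimpose it on the image) can be sketched as follows, treating the blurred buffer as per-pixel brush coverage. All names and the data layout are my own illustration:

```python
def composite_stroke(image, stroke_alpha, brush_color):
    """Overlay a pre-blurred stroke buffer onto an image.

    image: 2D list of (r, g, b) tuples (0..255).
    stroke_alpha: 2D list of floats in [0, 1], the blurred line rendered
                  into its own buffer first, as the comments suggest.
    brush_color: (r, g, b) tuple for the brush.
    """
    out = []
    for row_px, row_a in zip(image, stroke_alpha):
        out_row = []
        for (r, g, b), a in zip(row_px, row_a):
            # Blend brush over background, weighted by blurred coverage.
            out_row.append((
                int(r + (brush_color[0] - r) * a + 0.5),
                int(g + (brush_color[1] - g) * a + 0.5),
                int(b + (brush_color[2] - b) * a + 0.5),
            ))
        out.append(out_row)
    return out
```

Blurring the small stroke buffer and compositing once is cheaper than blurring the whole image, which is why the comment recommends that order.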

Now I understand your question.
For these purposes I call the external program ImageMagick.
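One way to shell out to ImageMagick from code is shown below. The file names are placeholders, and this assumes ImageMagick's `convert` tool is on the PATH; `-blur 0xSIGMA` is its standard Gaussian-blur option where the radius is chosen automatically for the given sigma:

```python
import subprocess

def magick_blur_cmd(src, dst, sigma=2.0):
    """Build the ImageMagick command line for a Gaussian blur.
    "0x{sigma}" means: auto radius, the given standard deviation."""
    return ["convert", src, "-blur", f"0x{sigma}", dst]

def blur_with_imagemagick(src, dst, sigma=2.0):
    # Runs the external tool; raises CalledProcessError on failure.
    subprocess.run(magick_blur_cmd(src, dst, sigma), check=True)
```

Calling an external process per stroke is simple but adds latency, so this fits batch processing better than interactive painting.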