
Is there a way to do free drawing of a mask for image filters? #6465

Closed
codingdudecom opened this issue Jul 17, 2020 · 21 comments
Labels
stale Issue marked as stale by the stale bot

Comments

@codingdudecom

Is there a way to do free drawing of a mask for image filters?

I'm thinking of a situation in which you apply an image filter, but you don't want it to affect the whole image. So, using a brush it would be great if you could add/remove areas of the image that get affected by the filter.

I believe that the BlendImage filter has a "mask" mode that allows using an alphaMask to show/hide parts of the image. For me, it would be sufficient if there was some way in which one would be able to draw or erase on the alphaMask in the BlendImage filter.

Any thoughts about how this could be achieved?

thanks!

@asturur
Member

asturur commented Jul 18, 2020

This is a feature I wanted to build.
I did not have time yet.

Ideally you add a mask of effect to filters.
Then you build this mask as you want, with a brush or something else.

Building a mask of effect for filters is the first step I would try; it requires knowing a bit of the fabricJS internals.
If you want to try, I can help you.

@codingdudecom
Author

Thanks for answering. I do know a bit about the fabricJS inner workings, but any help would be highly appreciated.
What approach would you recommend for building the mask effect?
Should I do that by adding it as an extra texture to the filter, and then combine the filter result with the original image texture based on the mask's alpha? Or did you have something else in mind?

@asturur
Member

asturur commented Jul 19, 2020

Well, to work in all cases we do not need the original picture (which we have available somewhere during the whole process) but the current image we are going to filter, which in a filter chain can be an intermediate step.

Now, what would the mask do?

If the mask is just on/off:

if (mask === 0) {
  return unchangedColor;
} else {
  return executeFilterLogic();
}

If the mask can have different levels of opacity:

const newColor = executeFilterLogic();
return newColor * alpha + oldColor * (1 - alpha);

Or something like that...

Now, adding that filter by filter is really boring; we can probably wrap each filter in a higher-order function that does it, probably at the cost of some performance.

I'm not entirely sure how the code should look. Here is a theoretical example:

This is the Contrast filter today:

WebGL:

      precision highp float;
      uniform sampler2D uTexture;
      uniform float uContrast;
      varying vec2 vTexCoord;
      void main() {
        vec4 color = texture2D(uTexture, vTexCoord);
        float contrastF = 1.015 * (uContrast + 1.0) / (1.0 * (1.015 - uContrast));
        color.rgb = contrastF * (color.rgb - 0.5) + 0.5;
        gl_FragColor = color;
      }

could become

      precision highp float;
      uniform sampler2D uTexture;
      uniform sampler2D uMaskTexture;
      uniform float uContrast;
      varying vec2 vTexCoord;
      vec4 interpolate(vec4 filtered, vec4 original, float mask) {
        return filtered * mask + original * (1.0 - mask);
      }
      void main() {
        vec4 original = texture2D(uTexture, vTexCoord);
        vec4 mask = texture2D(uMaskTexture, vTexCoord);
        float contrastF = 1.015 * (uContrast + 1.0) / (1.0 * (1.015 - uContrast));
        vec4 filtered = original;
        filtered.rgb = contrastF * (filtered.rgb - 0.5) + 0.5;
        gl_FragColor = interpolate(filtered, original, mask.a);
      }

with the interpolate function being injected only when needed, and the final line also being added only when needed.

JavaScript:

    applyTo2d: function(options) {
      if (this.contrast === 0) {
        return;
      }
      var imageData = options.imageData, i,
          data = imageData.data, len = data.length,
          contrast = Math.floor(this.contrast * 255),
          contrastF = 259 * (contrast + 255) / (255 * (259 - contrast));

      for (i = 0; i < len; i += 4) {
        data[i] = contrastF * (data[i] - 128) + 128;
        data[i + 1] = contrastF * (data[i + 1] - 128) + 128;
        data[i + 2] = contrastF * (data[i + 2] - 128) + 128;
      }
    },

later:

    applyTo2d: function(options) {
      if (this.contrast === 0) {
        return;
      }
      var imageData = options.imageData, i,
          data = imageData.data, len = data.length,
          contrast = Math.floor(this.contrast * 255),
          contrastF = 259 * (contrast + 255) / (255 * (259 - contrast)),
          destinationImageData = options.destinationImageData.data,
          maskData = options.maskData.data,
          maskAlpha;

      for (i = 0; i < len; i += 4) {
        destinationImageData[i] = contrastF * (data[i] - 128) + 128;
        destinationImageData[i + 1] = contrastF * (data[i + 1] - 128) + 128;
        destinationImageData[i + 2] = contrastF * (data[i + 2] - 128) + 128;
        if (maskData[i + 3] !== 255) {
          maskAlpha = maskData[i + 3] / 255;
          destinationImageData[i] = destinationImageData[i] * maskAlpha + data[i] * (1 - maskAlpha);
          // ... repeat for i + 1 and i + 2 ...
        }
      }
    },

This is considerably slower.
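
A rough sketch of the "wrap each filter in a higher-order function" idea for the 2D backend, assuming a maskImageData of the same size as the image being filtered (the name and the way it gets passed in are hypothetical, not part of fabric's API):

    // sketch only: wraps an existing applyTo2d and blends old/new pixels by mask alpha
    function addMaskSupport(FilterClass, maskImageData) {
      var originalApplyTo2d = FilterClass.prototype.applyTo2d;
      FilterClass.prototype.applyTo2d = function(options) {
        var data = options.imageData.data,
            before = new Uint8ClampedArray(data), // snapshot of the unfiltered pixels
            mask = maskImageData.data,
            i, alpha;
        originalApplyTo2d.call(this, options); // run the real filter in place
        for (i = 0; i < data.length; i += 4) {
          alpha = mask[i + 3] / 255; // mask alpha as 0..1
          data[i] = data[i] * alpha + before[i] * (1 - alpha);
          data[i + 1] = data[i + 1] * alpha + before[i + 1] * (1 - alpha);
          data[i + 2] = data[i + 2] * alpha + before[i + 2] * (1 - alpha);
        }
      };
    }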

@asturur
Member

asturur commented Jul 19, 2020

As for the most performant way to load a mask texture, and a comfortable way for the developer to define it, I have no idea. I would imagine a monochrome image would be enough, but there is no such thing in canvas; you need to instantiate a full image of which you read only the alpha channel.
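
For example, a rough sketch (names hypothetical) of drawing the mask on an offscreen 2D canvas and wrapping it in a fabric.Image whose alpha channel is the only part that would matter:

    // Hypothetical helper: the mask is a normal RGBA canvas, but only its
    // alpha channel would be read by the filter.
    function createMaskImage(width, height) {
      var maskCanvas = fabric.util.createCanvasElement();
      maskCanvas.width = width;
      maskCanvas.height = height;
      var ctx = maskCanvas.getContext('2d');
      // start fully transparent = filter disabled everywhere;
      // paint opaque strokes where the filter should apply
      ctx.lineCap = 'round';
      ctx.lineWidth = 40;
      ctx.strokeStyle = 'rgba(0, 0, 0, 1)';
      ctx.beginPath();
      ctx.moveTo(10, 10);
      ctx.lineTo(width - 10, height - 10);
      ctx.stroke();
      return new fabric.Image(maskCanvas);
    }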

@codingdudecom
Author

codingdudecom commented Jul 20, 2020

I understand, I think that what you described is similar to what I imagined.

Regarding the boring part of re-writing the code for each filter, I already used an unorthodox solution/hack for a different purpose (adding watermarks) and I think it could also be used here. JS purists will probably have my scalp for a solution like this, but in my case it worked :-)

The solution I used was to convert some methods to strings, do a string replace, and then add the methods back in.
Something like this:

  • enumerate over the filters and override the applyToWebGL method to add an extra texture binding:

    filter.prototype.applyToWebGL = function(options) {
      // newTex is the fabric.Image used as the extra texture (defined elsewhere)
      function createTexture(backend, image) {
        return backend.getCachedTexture(image.cacheKey, image._element);
      }
      // load texture to blend.
      var gl = options.context,
          texture = createTexture(options.filterBackend, newTex);
      this.bindAdditionalTexture(gl, texture, gl.TEXTURE1);
      this.callSuper('applyToWebGL', options);
      this.unbindAdditionalTexture(gl, gl.TEXTURE1);
    }

  • then the hack, which is a string replace on the body of the function, like this:

    var sendUniformData_fnStr = filter.prototype.sendUniformData.toString()
      .replace(/\}$/g, "\tgl.uniform1i(uniformLocations.newTex, 1);\n}");
    eval("filter.prototype.sendUniformData = " + sendUniformData_fnStr);

  • and a similar replace in the fragment source:

    var fragSrc = filter.prototype.fragmentSource;
    fragSrc = fragSrc.replace(/sampler2D\s+uTexture;/g, "sampler2D uTexture;\nuniform sampler2D newTex;");
    fragSrc = fragSrc.replace(/}\s*$/g, "\nvec4 colTex = texture2D(newTex,vTexCoord);\ngl_FragColor = mix(gl_FragColor, colTex, colTex.a * 0.25);\n}");
    filter.prototype.fragmentSource = fragSrc;

Do you think something like this would also work?

If so, what would be the best way to create the mask texture. To be honest that's my biggest challenge right now, to retrieve image info (width/height) from inside the code of the filter.

@asturur
Member

asturur commented Jul 20, 2020

The original image is not always what you are looking for.

For the mask, it depends on what it should cover and how. Gradients? Free drawing?

@codingdudecom
Author

Sorry if I annoy you, but I just got a different idea that I want to run by you.

I'm also thinking of a different solution that could be applied in the applyToWebGL function of each filter.
As I understand it, if you have multiple filters applied to an image, you have a source (which can be the result of a previous pass) and a target.

I think that if you apply the filter to get the result, then combine the source with the mask and write the result of that over the previous result, you should end up with the filter applied only to the unmasked areas.

Since the WebGL filters work with a WebGL canvas (and I don't have much experience with that), I think that this:

http://mrdoob.github.io/webgl-blendfunctions/blendfunc.html

is the equivalent of globalCompositeOperation in the 2D context.

I think "source-in" is the right globalCompositeOperation for 2D; for WebGL I am still trying to figure it out.

@asturur
Member

asturur commented Jul 21, 2020

      vec4 interpolate(vec4 filtered, vec4 original, float mask) {
        return filtered * mask + original * (1.0 - mask);
      }

This is the blending function, isn't it?

@codingdudecom
Author

Yes, but I was thinking of doing the masking in the base_filter.applyToWebGL() function, to avoid having to re-write all the filters.

@asturur
Member

asturur commented Jul 21, 2020

You can do that too, if you have source and destination handy.
The only difference would be to skip the possibly expensive computations for pixels where the mask is at 0; not that my example takes that into consideration.

@stale

stale bot commented Aug 4, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale Issue marked as stale by the stale bot label Aug 4, 2020
@stale stale bot closed this as completed Aug 11, 2020
@Bariskau

@codingdudecom @asturur is there any news about it?

@codingdudecom
Author

Actually yes, I've managed to do this using 2 things:

  1. the BlendImage image filter
  2. a modified version of a PencilBrush for drawing the actual mask

Basically the BlendImage filter takes 2 images and applies an operation to them according to the mode. I added an extra mode to the BlendImage filter, called "mask", which simply multiplies the alpha channel of the image by the alpha channel of the mask.

This is the actual fragmentSource:

		precision highp float;
		uniform sampler2D uTexture;
		uniform sampler2D uImage;
		uniform vec4 uColor;
		varying vec2 vTexCoord;
		varying vec2 vTexCoord2;

		void main() {
			vec4 color = texture2D(uTexture, vTexCoord);
			vec4 color2 = texture2D(uImage, vTexCoord2);
			color.a *= color2.a;
			gl_FragColor = color;
		}

The mask is drawn with a free drawing brush which I extended from the PencilBrush. It basically draws on a black canvas with either the source-over or destination-over composite mode to remove from or add to the mask.
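
As a rough usage sketch (assuming the extra "mask" mode is registered on BlendImage, and maskImage is a fabric.Image built from the drawn mask canvas):

    var filter = new fabric.Image.filters.BlendImage({
      image: maskImage, // fabric.Image wrapping the drawn mask
      mode: 'mask'
    });
    image.filters.push(filter);
    image.applyFilters();
    canvas.requestRenderAll();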

Hope this makes sense

@Bariskau

Bariskau commented May 16, 2022

Thank you @codingdudecom. I understand, but I'm not very knowledgeable about WebGL. Can you help me with the topic in the title here? I can apply an alpha mask, but I couldn't make it dynamic with a brush.

@codingdudecom
Author

I've made a Gist for this: https://gist.github.com/codingdudecom/ba183221d705a23962fcfcd3cae0c63f
and added my reply on StackOverflow. Hope this helps

@Bariskau

Bariskau commented May 16, 2022

Thx again @codingdudecom, your approach makes sense. I wanted to try your code, but I couldn't. Is my usage correct?

https://jsfiddle.net/mkfuw726/5/

@codingdudecom
Author

You're almost there. Made a few adjustments:
https://jsfiddle.net/codingdude/sk84xLh2/

You need to initialize the brush like this:

        canvas.freeDrawingBrush = new fabric.MaskBrush( {
            canvas:canvas,
            target:image,
            width:20,
            mode: 'source-out',
            targetMaskFilter: filter
        });

Also, in my example I used Fabric@3.4.0. It seems that it also works with the latest version, but there's a small offset in coordinates that you'll have to debug yourself if you need to use the latest version.

@humoyun91

humoyun91 commented Feb 1, 2023

@codingdudecom
First of all, thank you for providing useful info with code samples for alpha masking. I just wanted to tweak your solution a little bit: currently your brush draws with 100% opacity; how can I change the transparency? I want the background image to also be visible a bit.

Here is what I want to achieve:

[Screenshot: Screen Shot 2023-02-01 at 10 31 59 PM]

@codingdudecom
Author

aha, so somebody did find this useful 😄

I think what you want is to play with the fragment shader. Try this:

precision highp float;
uniform sampler2D uTexture;
uniform sampler2D uImage;
uniform vec4 uColor;
varying vec2 vTexCoord;
varying vec2 vTexCoord2;

void main() {
	float maskOpacity = 0.5;
	vec4 color = texture2D(uTexture, vTexCoord);
	vec4 color2 = texture2D(uImage, vTexCoord2);
	color.a = mix(color.a, color.a * color2.a, maskOpacity);
	gl_FragColor = color;	
}

You could also make the maskOpacity a uniform that you can later pass to the BlendImage filter to make it adjustable. Anyway, you gave me an interesting idea, as this might be a good way of letting the user see where they are working when masking parts of the image.
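
A rough sketch of wiring that up (the uniform and property names here are hypothetical), assuming the shader declares "uniform float uMaskOpacity;" in place of the constant:

    filter.maskOpacity = 0.5; // hypothetical property read by sendUniformData below

    var originalGetUniformLocations = filter.getUniformLocations.bind(filter);
    filter.getUniformLocations = function(gl, program) {
      var locations = originalGetUniformLocations(gl, program);
      locations.uMaskOpacity = gl.getUniformLocation(program, 'uMaskOpacity');
      return locations;
    };

    var originalSendUniformData = filter.sendUniformData.bind(filter);
    filter.sendUniformData = function(gl, uniformLocations) {
      originalSendUniformData(gl, uniformLocations);
      gl.uniform1f(uniformLocations.uMaskOpacity, this.maskOpacity);
    };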

let me know if this helps

@humoyun91

humoyun91 commented Feb 1, 2023

@codingdudecom
Wow, that works as I expected. Thank you very much for the quick response (usually I don't receive any reply, or receive one very late, in GitHub issue threads). Let me explain what I am trying to achieve and how I ended up using alpha mask filters.
I am working on a segmentation tool where users (annotators) can freely draw over parts of an image to label them as some object, like the donut I showed in the previous comment. I used fabric's PencilBrush:

this.canvas.freeDrawingBrush = new fabric.PencilBrush(this.canvas);

But the problem is that the drawn paths overlap each other and make the segment (all the drawn parts) look as if they are not related. I could just set the opacity to 1, but in that case the UX is not that good. So I came to this thread 😄

[Screenshot: Screen Shot 2023-02-02 at 1 40 15 AM]

Remaining issue

I have no experience with shaders, so I am struggling with this task. Can you please also help (guide) me on how to handle the following problems:

  • how can I control the color of a mask in the shader (how to pass a color from the app (userland) to a shader program)? I need to provide any number of segments so that users can label any number of different objects in the image.
  • should I create a separate mask for each segment, or can it be handled with just one mask?
  • is it possible to convert this kind of mask into fabric objects (maybe as an SVG or a path)?
  • most important one: how can I save these masks (segments) so that I can extract them again from the saved image when users come back to modify already created segments?

I would be really grateful if you can help on these problems!

@codingdudecom
Author

wow, that's a pretty complex thing you want to achieve and starting with this masking feature might not be the way to go.

If I understand correctly, you want to draw over the image using several colors for image segmentation (e.g. green is foreground, red is background, etc.)?

My code simply produces an opacity mask. Of course, you might be able to use the same code for drawing over the image, and there are ways to pass the color to the shader using uniforms. Basically, passing uniforms is done via the getUniformLocations and sendUniformData methods in the filter (see the code for the BlendImage filter here: https://github.com/fabricjs/fabric.js/blob/master/src/filters/BlendImage.ts)

You'd need to override them to include your extra colors or maybe use different textures (one for each segment/area) to be able to retrieve them later.
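
A rough sketch of overriding them to pass a color (the class, uniform, and property names are hypothetical, using the ES5-style fabric seen elsewhere in this thread; the shader would need a matching "uniform vec4 uSegmentColor;"):

    var filters = fabric.Image.filters;

    // hypothetical subclass that forwards a configurable segment color to the shader
    filters.SegmentMask = fabric.util.createClass(filters.BlendImage, {
      type: 'SegmentMask',
      segmentColor: [1, 0, 0, 0.5], // RGBA in the 0..1 range

      getUniformLocations: function(gl, program) {
        var locations = this.callSuper('getUniformLocations', gl, program);
        locations.uSegmentColor = gl.getUniformLocation(program, 'uSegmentColor');
        return locations;
      },

      sendUniformData: function(gl, uniformLocations) {
        this.callSuper('sendUniformData', gl, uniformLocations);
        gl.uniform4fv(uniformLocations.uSegmentColor, this.segmentColor);
      }
    });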
