[question] should the shouldComponentUpdate hook be considered a hint? #728

Closed
rosenfeld opened this Issue Jun 11, 2017 · 46 comments · 4 participants

rosenfeld commented Jun 11, 2017

Hi, I'd like to apologize upfront if this is not the proper place to ask questions, but I couldn't find any discussion group or similar. Please let me know where I should ask such questions.

This is what React says about shouldComponentUpdate:

https://facebook.github.io/react/docs/react-component.html#shouldcomponentupdate

Currently, if shouldComponentUpdate() returns false, then componentWillUpdate(), render(), and componentDidUpdate() will not be invoked. Note that in the future React may treat shouldComponentUpdate() as a hint rather than a strict directive, and returning false may still result in a re-rendering of the component.

I'd like to know if the same is valid for Preact as well. If this behavior changes, it could affect components that rely on the current behavior, so I feel quite scared about the possibility that it could change in the future. I'd like to know what Preact and Inferno think about this, so that I can feel confident relying on that behavior and simply drop React support once they decide to ignore the false return value.

I've been thinking a lot about an autocomplete solution for our application that supports multiple selections, so let me explain why I think this is relevant. One of the approaches I've been considering is to use a structure such as:

render() {
  return <div className={uniqueComponentClassName}>
    <InputComponent className={uniqueInputClassName} />
    <SelectionDisplay selected={selectedItems} />
    <OptionsMenu onShouldUpdateComponent={_ => false} className={uniqueOptionsMenuClassName} />
  </div>
}

In componentWillMount I would register some global event handlers that would compare event.target against some known unique CSS class names. Then I'd like to be able to populate the OptionsMenu with whatever I want, knowing that the component would remain untouched by Preact. There are many reasons why this would be interesting. For example, I noticed with our current autocomplete custom component, based on jQuery UI, that when we have to render thousands of options, using cloneNode instead of createElement can be much faster, especially in some browsers. So I could decide to use that to generate the menu items, for example.

Then, as items are selected, I'd like to update the state of the autocomplete component so that SelectionDisplay is updated correctly, as well as other parts of the application, but I don't want Preact to touch the content of OptionsMenu. In that case, if the onShouldUpdateComponent result were ignored, the component wouldn't just become slower; it wouldn't work correctly at all. That's why I'm concerned about whether Preact would ever consider ignoring the return value from shouldComponentUpdate. It's a valuable escape hatch, especially when integrating with third-party components not managed by the vdom implementation. Does that make sense to you?
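To make the contract I'm relying on concrete, here is a minimal framework-free sketch (plain JavaScript, not Preact's actual scheduler) of the behavior in question: shouldComponentUpdate returning false skips rendering, while state still advances:

```javascript
// Illustrative stand-in for the lifecycle contract under discussion;
// this is NOT Preact's real implementation.
class GatedComponent {
  constructor() {
    this.state = { open: false };
    this.renderCount = 0;
  }
  // Returning false here is the escape hatch: the render below is skipped.
  shouldComponentUpdate(nextProps, nextState) {
    return !this.state.open; // freeze the component while the menu is open
  }
  setState(partial) {
    const nextState = Object.assign({}, this.state, partial);
    if (this.shouldComponentUpdate(this.props, nextState)) {
      this.state = nextState;
      this.render();
    } else {
      this.state = nextState; // state still advances; only rendering is skipped
    }
  }
  render() {
    this.renderCount++;
  }
}

const c = new GatedComponent();
c.setState({ open: true });   // renders once; the menu is now "open"
c.setState({ selected: 1 });  // skipped: shouldComponentUpdate returned false
console.log(c.renderCount);   // 1
```

If shouldComponentUpdate ever became a hint, the second setState could trigger a render anyway, which is exactly the breakage I'm worried about.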

robertknight (Collaborator) commented Jun 11, 2017
The short answer is that I think you can assume the contracts for lifecycle hooks in Preact are the same as React unless explicitly documented otherwise.

For example, I noticed with our current autocomplete custom component, based on jQuery UI, that when we have to render thousands of options, using cloneNode instead of createElement can be much faster, especially in some browsers

Have you looked into virtualizing the list so that you only render the options that the user can actually see? This is how most native UI toolkits and many React applications that deal with large lists of items achieve good performance even on very slow devices.

In that case, if the onShouldUpdateComponent result were ignored, the component wouldn't just become slower; it wouldn't work correctly at all

In the setup you describe, it sounds like OptionsMenu breaks a basic contract of React components, which is that the UI is a function of the inputs (props) and current state. The runtime can therefore update a component whenever it likes.

It's a valuable escape hatch, especially when integrating with third-party components not managed by the vdom implementation

The way I usually handle integration with non-React APIs is to use the componentWillUpdate / componentDidUpdate lifecycle hooks to compare the current / previous inputs & state, determine what changed and apply the changes to the third-party component. This is much the same pattern as React itself uses to apply changes to DOM nodes. Would this not work for your use case?
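For fixed-height items, the core of the virtualization I'm describing is a small calculation; a sketch (names are illustrative, not from any particular library):

```javascript
// Compute which slice of a fixed-height list is visible, plus an overscan
// margin so scrolling doesn't show blank rows. Illustrative only; libraries
// like react-virtualized implement this (and much more) for you.
function visibleRange(scrollTop, viewportHeight, itemHeight, itemCount, overscan = 3) {
  const first = Math.floor(scrollTop / itemHeight);
  const last = Math.ceil((scrollTop + viewportHeight) / itemHeight) - 1;
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(itemCount - 1, last + overscan),
  };
}

// 10,000 items of 20px each, in a 400px viewport scrolled to 1,000px:
console.log(visibleRange(1000, 400, 20, 10000));
// → { start: 47, end: 72 }
```

Only those ~26 rows get rendered instead of all 10,000, which is why this stays fast even on slow devices.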


rosenfeld commented Jun 11, 2017

Regarding rendering only the visible options, it's easier said than done. For example, if you imagine that for a given component the options are allowed to wrap, then you don't know the size of each item, which means you can't know unless you render them. But it's certainly a good option for fixed-height items.

But this misses the main point, which I failed to state accurately. For me to be confident in largely adopting some technology, I think it's important that it integrates well with third-party components. So, it's not only important to decide the start point (which is decided when calling render) but also the end point, which is currently possible by returning false from shouldComponentUpdate. However, if this becomes only a hint rather than a guarantee, then we can't count on this behavior, which is pretty important when we want to be able to define the boundaries.

Your suggestion to use componentWillUpdate/DidUpdate doesn't work in all situations, while shouldComponentUpdate does. For example, imagine you have some menu open in the multi-autocomplete component. If the vdom implementation removes it upon diffing, even if it communicates that it will do so beforehand, it won't always be possible to restore the previous state (current scroll position, selected item in the menu and so on). It will largely depend on what the third-party component allows us to do through its public API. With shouldComponentUpdate, on the other hand, we have full control over the situation: we can delay any updates by returning false while the menu is open, for example.


robertknight (Collaborator) commented Jun 12, 2017

For example, if you imagine that for a given component the options are allowed to wrap, then you don't know the size of each item, which means you can't know unless you render them.

True, virtualizing a list is more complex if it has items whose height is variable and hard to measure cheaply beforehand. One approach I've used is to start with a guesstimate of the height for each item, which gives a good enough approximation of the true scrollbar range, then refine those heights with actual values when they become known.

Fortunately you don't have to implement this yourself because react-virtualized can do it. See the "CellMeasurer" example on the website.
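The estimate-then-refine idea can be sketched in a few lines (illustrative plain JavaScript; the names are made up, and react-virtualized's CellMeasurer does the real work):

```javascript
// Offsets are computed from estimated heights, then refined as real
// measurements come in. Illustrative sketch only.
class HeightCache {
  constructor(itemCount, estimatedHeight) {
    // Seed every item with the guesstimate.
    this.heights = new Array(itemCount).fill(estimatedHeight);
  }
  // Replace the estimate with the measured height once it's known.
  measure(index, actualHeight) {
    this.heights[index] = actualHeight;
  }
  // Pixel offset of an item = sum of the (possibly estimated) heights above it.
  offsetOf(index) {
    let offset = 0;
    for (let i = 0; i < index; i++) offset += this.heights[i];
    return offset;
  }
  totalHeight() {
    return this.heights.reduce((a, b) => a + b, 0);
  }
}

const cache = new HeightCache(1000, 30); // guess: 30px per item
console.log(cache.totalHeight());        // 30000 — approximate scroll range
cache.measure(0, 55);                    // item 0 turned out taller
console.log(cache.offsetOf(2));          // 85 = 55 + 30
```

The scrollbar range starts out approximately right and converges on the truth as items are rendered and measured.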

So, it's not only important to decide the start point (which is decided when calling render) but also the end point

What do you mean by "start point" and "end point" here?

The way I would expect a (P)react wrapper around a third-party component to work is that the render() function just returns a placeholder element (e.g. render() { return <div ref={(el) => this._el = el}/>; }) and then all the actual work of creating, updating or tearing down the third-party component happens in lifecycle hooks. The VDOM diffing won't touch the third-party component.


rosenfeld commented Jun 12, 2017

What do you mean by "start point" and "end point" here?

Calling render(vdom, root) sets root as the element start point. Its siblings are not necessarily affected by the vdom engine. However, there's no way to tell the vdom engine to stop at some descendant of root if shouldComponentUpdate is just a hint.

The way I would expect a (P)react wrapper around a third-party component to work is that the render() function just returns a placeholder element

This doesn't work well because once the parent component's state changes, it would be re-rendered, and while comparing the vdom with the actual DOM, as Preact does, it would detect that any children in the placeholder have to be removed, for example...


developit (Owner) commented Jun 12, 2017

@rosenfeld Preact will never remove elements it didn't create:

class Foo extends Component {
  componentDidMount() {
    this.base.appendChild(document.createElement('x-hello'));
    this.forceUpdate();
    console.log(this.base.firstChild);  // <x-hello />
  }
  render() {
    return <div />
  }
}
developit (Owner) commented Jun 12, 2017

Oh, also, you mentioned a discussion group - we have an active Slack chat with about 360 people in it! https://preact-slack.now.sh


rosenfeld commented Jun 12, 2017

Preact will never remove elements it didn't create

Good to know. Is this also valid for other implementations such as React and Inferno? Is this stated anywhere in the Preact documentation?


developit (Owner) commented Jun 12, 2017

I believe it would be true for most VDOM implementations. Preact does it because that is how React behaves, which I believe is to accommodate the exact use case you described. I don't think there's anything in Preact's current docs that notes this behavior, since it's the same as in the other implementations; it was actually only fully implemented in 6.x.


rosenfeld commented Jun 12, 2017

But my point is: is this behavior documented in React, for example? Or is it just an implementation detail that could change at any time?


developit (Owner) commented Jun 12, 2017

I'm actually not sure it's documented for React either, but it's something that is normally intrinsic to VDOM - since most VDOM libraries diff a new tree against the previous tree, they are entirely unaware of externally created nodes. Preact diffs the VDOM tree against the DOM itself, so more work was needed there to ignore external elements.

Based on what I know of VDOM, I don't see a reason to worry about this detail changing - it's one of the things that makes VDOM a reasonable competitor to Shadow DOM.


rosenfeld commented Jun 12, 2017

I remember reading once in the React documentation that the diff algorithm should be considered an implementation detail, that it could change, and that re-rendering everything should lead to the same effect. I'm worried because if I implement things based on assumptions about how things currently work, and then React decides to change its implementation details, and Preact decides to follow React, I would be in a really bad position: unable to upgrade, because it would break our application, since I was counting on things I shouldn't have counted on. This makes all the difference. That's why I'm trying to get an idea of what I can take for granted and what I shouldn't rely on. If I can count on some behavior where I can say specifically where the VDOM starts and where it ends, then I'm good to come on board. However, if I can't count on that, then it would be too risky to adopt a technology that doesn't play nice with others by design. I'm not sure you understand my real concerns...


rosenfeld commented Jun 12, 2017

Here it is:

https://facebook.github.io/react/docs/reconciliation.html

It is important to remember that the reconciliation algorithm is an implementation detail. React could rerender the whole app on every action; the end result would be the same.

Except in the cases where it wouldn't.


rosenfeld commented Jun 12, 2017

I just asked the React team about this subject: facebook/react#9926


developit (Owner) commented Jun 14, 2017

FWIW I don't think the excerpt was implying "re-render, bypassing React's API constraints" - re-rendering a whole app from the root would still skip shouldComponentUpdate():false components. The only way around this I'm aware of would be deep-force-update, as that uses Component#forceUpdate(), which bypasses shouldComponentUpdate() entirely.

Even if your component were to re-render with shouldComponentUpdate():false, any elements you create outside the scope of VDOM would be preserved. The only effect would be that the Preact-created elements would be re-rendered. I guess the recommendation there (though I have no plans to turn shouldComponentUpdate() into a hint) would be to make sure a re-render will accurately reflect the state of your component. If your component is just a div with attributes that has a bunch of things added to it (children, events, etc.), then you'd just need to make sure re-rendering will not destroy attributes you've manually added to the element.


rosenfeld commented Jun 14, 2017

Even if your component were to re-render with shouldComponentUpdate():false, any elements you create outside the scope of VDOM would be preserved.

My point is that this is not enough of a guarantee.

I guess the recommendation there would be to make sure a re-render will accurately reflect the state of your component.

The problem is that this is not always possible with third-party stateful components. Let me give you an example:

class MyComponent extends Component {
  render() {
    // The ref exposes a callback that external autocomplete code can invoke
    // to push the selected description back into component state.
    // (dataset can only hold strings, so attach the function to the element itself.)
    return <div>
      <input type='text' className='autocomplete-input'
        ref={ input => { if (input) input.setValueDescription = d => this.setState({ valueDescription: d }); } }
      />
      <button tabIndex='-1' data-autocomplete-role='button'>&#9660;</button>
      <div>{ this.state.valueDescription }</div>
    </div>;
  }
}

So, suppose you have some global listeners that dynamically create a new autocomplete options container when the input has the 'autocomplete-input' class. Unless it has already been initialized, it would create a new container inside the input's parent (the React component root), holding the options with a checkbox next to each item. As the checkboxes are checked, the React component would have its state changed, which should update the value description div.

My point is that if there are no guarantees provided by React, then, in theory, a future version of React could decide to simply remove the root node (which would also remove the autocomplete menu with its state, such as scroll position, currently selected item and so on) and re-create it, right? Why not? But in that case the autocomplete menu would have been closed and would not be restored upon re-rendering. That's what I'm concerned about. I'd like stronger guarantees from React regarding what it will and will not touch. If it's just an internal implementation detail, then it doesn't work for me when integrating with third-party stateful components. Do you get my point?


robertknight (Collaborator) commented Jun 16, 2017

My point is that if there are no guarantees provided by React, then, in theory, a future version of React could decide to simply remove the root node (which would also remove the autocomplete menu with its state, such as scroll position, currently selected item and so on) and re-create it, right?

In theory, you are correct. In practice, there is a lot of code out there which depends on React/Preact etc. not touching the root element of a component once created, provided that its type has not changed. I think React implementations probably could commit to that as a guarantee for stateful components. On the other hand, what happens with inner elements returned by render is harder to commit to. The whole point of React is that the view is a pure function of the input. That freedom is very valuable for implementations - witness the recent complete rewrite of React's core (React Fiber), which happened without changing the public API. Substantial changes have also been made to the way Preact works over time with minimal breakage for users.

My recommendation would be that if you have any third-party components, you extract them into a dedicated wrapper component which renders only that third-party component. That way you have a clear place to handle initialization and cleanup via the wrapper's lifecycle hooks.
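The wrapper pattern can be sketched without any framework at all; below, the lifecycle hooks are plain methods and a few lines at the bottom simulate the framework calling them (FakeWidget and the prop names are hypothetical, standing in for e.g. a jQuery UI widget):

```javascript
// A hypothetical third-party stateful widget. The wrapper below never lets
// the VDOM touch its internals; all interaction goes through lifecycle hooks.
class FakeWidget {
  constructor(options) { this.options = options; this.destroyed = false; }
  setOptions(options) { this.options = options; }
  destroy() { this.destroyed = true; }
}

// Lifecycle hooks as a (P)react wrapper component would implement them.
class WidgetWrapper {
  componentDidMount() {
    this.widget = new FakeWidget(this.props); // create once, outside the VDOM
  }
  componentDidUpdate(prevProps) {
    if (prevProps.items !== this.props.items) {
      this.widget.setOptions(this.props);     // apply only what changed
    }
  }
  componentWillUnmount() {
    this.widget.destroy();                    // tear down explicitly
  }
}

// Simulated framework driving the hooks:
const w = new WidgetWrapper();
w.props = { items: ['a'] };
w.componentDidMount();
const prev = w.props;
w.props = { items: ['a', 'b'] };
w.componentDidUpdate(prev);
console.log(w.widget.options.items.length); // 2
w.componentWillUnmount();
console.log(w.widget.destroyed);            // true
```

In a real wrapper, render() would only ever return the placeholder element, so re-renders never conflict with the widget's own DOM.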


rosenfeld commented Jun 16, 2017

In theory, you are correct. In practice, there is a lot of code out there which depends upon React/Preact etc. not touching the root element for a component once created provided that its type has not changed.

There were many applications relying on the Angular 1 API, and that didn't prevent Angular 2 from becoming a completely separate product. Here's my concern with these kinds of libraries: Rails seems to be driven by Basecamp's needs, Angular by Google's needs, React by Facebook's needs. That's why I find the existence of compatible API implementations such as Preact and Inferno very important at this point. If Facebook decides their needs have changed, I could still count on Preact, Inferno and so on, so that I'm not tied to a specific implementation suited to one company's needs.

When React was created, it was important for Facebook to make it interoperable with existing components so that they could move to React slowly. Maybe they have already moved everything to React, and interoperability may no longer be an important thing to support, so they might change their implementation to focus on other things. This seems to be the case, since they don't want to commit to any constraints in their documentation that would support third-party components.

With server-side code one can always switch slowly to another language or framework/library, moving one request at a time behind a properly configured reverse proxy such as nginx. This is how I moved from Grails to Rails long ago. Through the Rack abstraction I could recently move from Rails to Roda incrementally. So I find it important to be able to plan on moving my stack slowly at any time. Today React/Preact/Inferno seems like an interesting approach for writing an SPA, but I want to be able to move out of this stack slowly if a new stack comes up that seems more interesting to me. The only way I can imagine an incremental rewrite being possible is for components to provide some means of specifying their boundaries. I must be able to tell a React component that it has control from node X, but that its control should stop at nodes Y and Z because I'm delegating them to third-party components.

Also, how am I supposed to move small parts of the application to React incrementally, without having to rewrite everything at once, if I can't reliably delegate parts of a component to existing third-party components? I can start doing that today based on the current implementation, but then maybe I wouldn't be able to upgrade React if the implementation suddenly changed in a way that no longer matched my expectations, even without breaking any public API or documented behavior, since the old behavior is considered an implementation detail rather than a specification. This is certainly not a good position to be in, unless I get such guarantees from React alternatives. In that case, I know I can simply drop React support if that happens.

I really think React, Angular and Knockout, to name a few, should actively support third-party components by providing hard constraints, rather than by chance based on the current implementation. Third-party component support is too important to be left as an implementation detail.

My recommendation would be that if you have any third-party components, you extract them into a dedicated wrapper component which renders only that third-party component.

Components usually act on the application state, so a component that renders nothing but a third-party widget is not much of a realistic scenario. But even for those cases, React provides no guarantee it won't touch that component-dedicated node. It's currently handled as an implementation detail, which could change without further notice.

That way you have a clear place to handle initialization and cleanup via the wrapper's lifecycle hooks.

As I said previously, initialization and cleanup are not enough for keeping a component's state. For example, consider a very simple datepicker component that doesn't allow you to set the currently selected date. If you open the datepicker, you're supposed to be able to navigate through the dates. Now, while you're navigating in the datepicker, your application gets a websockets notification which triggers a state change in another part of the application. Since there are no documented guarantees regarding the implementation, React could re-render the whole thing. Even if you get notified that your datepicker will be unmounted and remounted later, all you can do through the datepicker's public API is clean it up and recreate it; any state of the current date-selection navigation would be lost and the user experience would be terrible. It's like browsing Facebook mobile: while you're reading a friend's post, Facebook suddenly scrolls to the top with new content, and now you have to scroll through many more posts to find what you were previously reading, if you can find it at all. This is a terrible user experience.

developit Jun 16, 2017

Owner

This is the issue tracker for Preact though, which is not React and not a Facebook project. Preact does what it does today and is largely led by consensus, not by a company with its own motivations. It would be very unlikely for a project of Preact's nature to make decisions that have widespread negative impacts, because the users of the library are also its authors.

@robertknight's suggestion of wrapping third-party components is wise: it's how these things work today, and it works quite well. Like any marriage of two technologies, an interface needs to be defined in order to maintain an appropriate boundary between the two.

For your datepicker example, I think you're jumping to conclusions about what was supposed to be a theoretical example in the React docs. No VDOM renderer would ever suddenly re-render your entire application for no reason, since that would make it a very slow and unreliable VDOM renderer.
In reality, an application that re-renders from the root for no reason would have much larger problems to worry about than the state of a date picker being reset.

The point of the sentence you pointed out was what @robertknight described: you can make your components more resilient by not relying on shouldComponentUpdate. There are ways for you (the programmer) to ask React/Preact to bypass shouldComponentUpdate() returning false, so you should at least plan for your components to handle that case gracefully should it occur. The docs were not making the point that shouldComponentUpdate could fundamentally change in the future; change is possible in any library, and scaring people would be an odd way to comment on versioned APIs.

I'm a little confused about where the frontend/backend thing is coming from. What about front-end development prevents you from setting up the same type of boundaries you would in Rails? Instead of migrating requests, you migrate pieces of the UI hierarchy; all of the other concepts seem to apply quite naturally. I know many companies who have done this, including the company I work for, which did so over the past year (a massive PHP website now partially built with Preact, converted over the course of a year on a per-component basis).


Also - for what it's worth, Preact's docs explicitly call out the exact use-case you described (a date picker) as an example of proper shouldComponentUpdate() usage. As far as I'm concerned that makes it an API constraint and not an implementation detail, at least in Preact:
https://preactjs.com/guide/external-dom-mutations

rosenfeld Jun 16, 2017

That was exactly what I was looking for when I asked whether or not Preact would honor the shouldComponentUpdate() return value: since React doesn't want to commit to it, I now know I would be able to use Preact rather than React if React indeed decides to implement it as just a hint.

However, the issue goes far beyond that, as shouldComponentUpdate only prevents the component from being updated; if it's removed, any state is lost. This is not a problem when the removal is exactly what the developer was expecting, but removals can also be caused by inefficient diffing, as in issue #725 which I filed a few days ago. In that case, several nodes are removed and re-rendered in Preact while this is not what the developer would expect (React and Inferno both behave as I was expecting). If the diffing algorithm can't provide some guarantees either, then an inefficient diff algorithm could lead to a bad user experience, which is far more important than any slowdown caused by the inefficiency.

In our application, we must render a dynamic tree that will often contain some dynamic third-party component for each node, such as datepickers, autocompletes and sliders. As some values are being set, for example while checking a checkbox in an autocomplete menu, other unrelated items in the tree should be either added or removed. If the diffing algorithm is not smart enough, it could cause some ancestor of the autocomplete component to be re-rendered, in which case the menu would be removed and recreated, losing its current state, such as scroll position, the currently focused item and so on.
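The reuse guarantee being asked for here can be stated concretely: a keyed diff should reuse an existing child (and therefore its backing DOM node and state) whenever its key survives the update. A minimal sketch of that matching step, assuming a vnode-like object shape and not reflecting Preact's actual algorithm:

```javascript
// Minimal sketch of key-based child matching (not Preact's real diff):
// children whose keys survive the update are reused, so their state and
// backing DOM nodes are preserved; everything else is mounted fresh.
function reconcileChildren(oldChildren, newChildren) {
  const byKey = new Map(oldChildren.map(child => [child.key, child]));
  return newChildren.map(next => {
    const existing = byKey.get(next.key);
    if (existing) {
      existing.props = next.props; // update in place, keep state/DOM
      return existing;
    }
    return { ...next, freshlyMounted: true }; // no key match: new mount
  });
}

const oldKids = [{ key: 'a', state: 'scrolled' }, { key: 'b' }, { key: 'c' }];
const newKids = [{ key: 'a' }, { key: 'c' }, { key: 'd' }];
const result = reconcileChildren(oldKids, newKids);
console.log(result[0].state);          // 'scrolled': child 'a' was reused
console.log(result[2].freshlyMounted); // true: child 'd' is a new mount
```

Under a guarantee like this, the autocomplete menu keyed `'a'` would keep its scroll position across unrelated insertions and removals elsewhere in the list.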

So it's important to have such constraints well defined as a project goal. If the project says those components shouldn't have been recreated, then this should be treated as an urgent bug which must be fixed as soon as possible. Otherwise, it's just a performance bug which is not that important to fix immediately.

Every video and article I've read about integrating a third-party component, such as Google Maps, and every question on StackOverflow, Reddit and similar forums, talks about shouldComponentUpdate as if they depended on that behavior. So it seems like a mandatory API, far too important to be treated as just a hint: if that false return value is ignored, most of those videos' and articles' conclusions are no longer valid. And, as I told you, unless we have some expectations of how the diff algorithm must act upon certain state changes, even shouldComponentUpdate isn't enough of a guarantee. That concerns me a lot; it's not just a theoretical problem to me when I think I'll have to maintain this application for many more years.

I'm a little confused where the frontend/backend thing is coming from

I wasn't concerned with frontend vs backend specifically. I was just explaining that it's important to me to know that I can replace libraries one little part at a time, and I mentioned some examples of how I achieved that a few times on the server side. Ideally the same should be possible on the front end as well.

Instead of migrating requests, you migrate pieces of the UI hierarchy

That's my goal, but I need to be sure that it will be possible, because the current lack of hard constraints seems to indicate the opposite to me. Since regular HTTP requests are stateless, that's a hard constraint with the benefit that it allows us to easily map requests to separate applications, as no state is stored by each application, as long as both applications can read the same cookies and belong to the same domain. If it's a complex websockets implementation, things get harder, and we would probably have to replace the websockets implementation all at once if it relies on in-memory state.

That's why I prefer to work with solutions that are as much standard as possible and which provide enough constraints for me to become confident I can always change my mind on which technologies to adopt.

React seems to follow some sort of standard, since there are at least two libraries implementing it (Preact and Inferno), so this is a good thing. However I'm worried about being able to integrate with third-party components that would result in changing the state of some ancestor component.

If the project can make bold statements, such as: shouldComponentUpdate will always be honored, and, provided the developer gives the library unique keys, any node will be kept unless the node with that key is no longer in the VDOM, then I can work with those constraints to build my application while feeling confident that it's a safe bet and that I can later change my mind and move to something else if I want.

However, my feeling is that VDOM authors find those statements too hard to commit to, and that becomes a real concern to me, as someone who has to support this application for many years, and it's a huge one.

developit Jun 16, 2017

Owner

If you have a huge project you need to support for many years and require a firm commitment from the open source libraries it depends on, that seems like a difficult position to be in regardless of what libraries you select.

It sounds like you need to build a prototype and verify for yourself whether Virtual DOM is an environment that could support the needs of your application. Right now this is a highly circular discussion, because nobody but you has the context that makes these things so much more important than they have been for the rest of the community so far.

Regarding the diffing bug you linked to, I consider that a high priority bug. However, I would also suggest that using keys to collapse parts of a list in that manner will lead to some fairly poor DOM performance. It would be better to wrap the conditionally rendered children in an element and toggle that entire tree.

rosenfeld Jun 16, 2017

that seems like a difficult position to be in regardless of what libraries you select.

It indeed is.

It sounds like you need to build a prototype and verify for yourself whether Virtual DOM is an environment that could support the needs of your application.

Unfortunately this isn't enough. I can learn new technologies pretty quickly, but I can't understand how some library's community thinks about a subject without asking. One can't simply test against the current state of a library when adopting a technology. We must also distinguish whether a behavior is part of the spec or just an implementation detail, because this makes all the difference. In the Ruby land, to give you an example, one of the main reasons I use Sequel rather than ActiveRecord is that Arel, used by ActiveRecord, is considered an implementation detail, which means you shouldn't count on it being used by ActiveRecord. That means you're at your own risk when you take advantage of Arel to build some SQL query which is not supported through the ActiveRecord public API.

So, if I know the community thinks that supporting third-party components is a top requirement and provides the means to support that, then I'm okay with that. As long as I have a working version, even if a future release breaks this requirement I know it would be considered a bug and fixed in the next release, so it would just be a matter of waiting for the next fixed release. However, if the approach is "it is what it is; if it suits your needs, great, otherwise, I'm sorry", then it doesn't work for me, unless I were using the technology to build something with a relatively short lifetime, such as an event site. For permanent large code bases, I need something more consistent to rely on.

I've been looking at many Virtual DOM based libraries as I liked the idea of not having to worry about manual updates to an existing DOM when working on the higher level components. I'm convinced VDOM can be fast enough to allow me to only be concerned about the initial rendering. However, I also need to be able to fully support any third-party component I want to use. And, of course, I don't want to rely on some technology that could be abandoned tomorrow.

I can write code that works without any changes on React, Preact, Inferno and Dio for example. That's awesome because the chances that all of them would be suddenly discontinued are very low. But I want a small bundle, so React would be my last choice, while Preact would be my favorite one as long as it works as I expect it to. Otherwise it would be the next smaller alternative. Anyway, React isn't even a choice right now until they change their documentation to provide some more guarantees I need. That's why I tend to avoid something like Mithril or Maquette. I'm not aware of any other Mithril compatible implementation. Also, even though I appreciate its XHR support, I'd prefer to handle XHR with a separate library.

Regarding the diffing bug you linked to, I consider that a high priority bug. However, I would also suggest that using keys to collapse parts of a list in that manner will lead to some fairly poor DOM performance.

It seems fast enough for me.

It would be better to wrap the conditionally rendered children in an element and toggle that entire tree.

I agree, if that were an option, but it isn't really. We actually use a table element in our real tree because it makes it easier to align some items while allowing the browser to dynamically calculate a proper size for the cells. It's a single table rather than a table inside another table's row, so we can't really wrap the subtree. Currently, in some cases we toggle a "hidden" class instead of removing the elements, and that might perform better; however, it also uses more memory than required, and if removing and re-adding is fast enough, which seems to be the case, then I have no problem with that approach.

rosenfeld commented Jun 16, 2017

that seems like a difficult position to be in regardless of what libraries you select.

It indeed is.

It sounds like you need to build a prototype and verify for yourself whether Virtual DOM is an environment that could support the needs of your application.

Unfortunately this isn't enough. I can learn new technologies pretty quickly but I can't understand how some library's community think about some subject without asking. One can't simply test against the current state of some library when they are adopting some technology. We must also make the distinction on whether or not that implementation is part of the spec or if it's just an implementation detail because this makes all the difference. In the Ruby land, to give you an example, one of the main reasons why I use Sequel rather than ActiveRecord is because Arel, used by ActiveRecord, is considered an implementation detail, which means you shouldn't count on it being used by ActiveRecord. That means you're on your own risk when you're taking advantage of Arel to build some SQL query which is not supported through the ActiveRecord public API.

So, if I know the community thinks that supporting third-party components is a top requirement and provide the means to support that, then I'm okay with that. As long a I have a working version, even if a future release would break this requirement I know it would be considered a bug and fixed in the next release, so it would be just a matter of waiting for the next fixed release. However, if the approach is "it is what it is, if it suites your needs, great, otherwise, I'm sorry" then it doesn't work for me, unless I would be using that technology to build something with a relatively short duration, such as an event site. But for permanent large code bases, I need something more consistent to rely on.

I've been looking at many Virtual DOM based libraries as I liked the idea of not having to worry about manual updates to an existing DOM when working on the higher level components. I'm convinced VDOM can be fast enough to allow me to only be concerned about the initial rendering. However, I also need to be able to fully support any third-party component I want to use. And, of course, I don't want to rely on some technology that could be abandoned tomorrow.

I can write code that works without any changes on React, Preact, Inferno and Dio, for example. That's awesome, because the chances that all of them would suddenly be discontinued are very low. But I want a small bundle, so React would be my last choice, while Preact would be my favorite as long as it works as I expect it to; otherwise it would be the next smaller alternative. Anyway, React isn't even a choice right now until they change their documentation to provide the extra guarantees I need. That's also why I tend to avoid something like Mithril or Maquette: I'm not aware of any other Mithril-compatible implementation. And even though I appreciate Mithril's XHR support, I'd prefer to handle XHR with a separate library.

Regarding the diffing bug you linked to, I consider that a high priority bug. However, I would also suggest that using keys to collapse parts of a list in that manner will lead to some fairly poor DOM performance.

It seems fast enough for me.

It would be better to wrap the conditionally rendered children in an element and toggle that entire tree.

I agree, if that were an option, but it isn't really. We actually use a table element in our real tree because it makes it easier to align some items while allowing the browser to dynamically calculate a proper size for the cells. It's a single table rather than a table inside another table's row, so we can't really wrap the subtree. Currently, in some cases we toggle a "hidden" class instead of removing the elements, and that might perform better; however, it also uses more memory than required, and if removing and re-adding is fast enough, which seems to be the case, then I have no problem with that approach.

robertknight (Collaborator) commented Jun 17, 2017

Hello @rosenfeld - I do appreciate where you are coming from. I also try to be careful to avoid building long-lived software that relies on implementation details rather than on the contracts of the libraries it uses.

Personally, I think you have a number of implicit, good-enough guarantees against all the React-like libraries disappearing "tomorrow" or otherwise changing in ways that would break your software: a) The core concepts and APIs of React are simple. The existence of such small libraries that implement them is proof of that, and empirically, simple things (Unix) stand the test of time. b) Facebook, Twitter and other large corporations have huge amounts of code relying on the current behaviour of the libraries we've discussed here. Any significant changes to those behaviours, even if they are not currently documented in a "proper" spec, would have to be clearly announced.

Having said that, I do think we are probably at a point where writing a specification for React-like libraries, with an accompanying suite of tests to ensure interoperability, would be a valuable contribution to the community.

rosenfeld commented Jun 17, 2017

It seems I just misinterpreted what "re-render" meant in the Reconciliation documentation. Please re-read the issue in the React project:

facebook/react#9926

When the documentation says "re-render", it refers to calling the render method and reconciling the changes, not unmounting and mounting components. React never unmounts components unless the key or type changes.

While we can't guarantee that React won't be calling render in more cases, we can pretty much guarantee that:

If the component key and type don't change, React won't unmount the component (and destroy its state). At most, it will attempt to reconcile the parts of the DOM managed by React. And you can always short-circuit that by returning just <div /> and managing its content yourself.

If you manually touch DOM that React doesn't manage (e.g. imperatively change an attribute that doesn't exist on the React side), React won't blow away your changes on re-rendering.

These are enough guarantees for me to feel safe about using React or some of its alternatives. Thanks for your patience :)
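The "return just `<div />` and manage its content yourself" guarantee above can be sketched without any framework. This is a deliberately tiny, hypothetical model of the lifecycle contract, not React's or Preact's actual implementation: when shouldComponentUpdate returns false, render() is skipped and the subtree is left alone.

```javascript
// Toy model of the shouldComponentUpdate contract (hypothetical,
// framework-free): setState consults the hook before re-rendering.
class MiniComponent {
  constructor() {
    this.renderCount = 0;
  }
  shouldComponentUpdate() {
    return true; // default: always re-render
  }
  setState() {
    if (this.shouldComponentUpdate()) {
      this.render();
    }
  }
  render() {
    this.renderCount += 1;
  }
}

// An "opaque" wrapper: renders once, then blocks all re-renders so a
// third-party widget can own everything inside its root element.
class OpaqueWrapper extends MiniComponent {
  shouldComponentUpdate() {
    return false;
  }
}

const normal = new MiniComponent();
normal.render();   // initial mount
normal.setState(); // re-renders
console.log(normal.renderCount); // 2

const opaque = new OpaqueWrapper();
opaque.render();   // initial mount
opaque.setState(); // skipped: the hook returned false
console.log(opaque.renderCount); // 1
```

In a real component, render() would return the single empty `<div />` and a mount-time lifecycle method would hand the underlying DOM node to the third-party library.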

@rosenfeld rosenfeld closed this Jun 17, 2017

developit (Owner) commented Jun 22, 2017

Ah yep - that's a super important distinction. Appreciate you raising the issue on both sides @rosenfeld 🙌

rosenfeld commented Jun 22, 2017

I'm still worried about Preact, though, since it compares against the actual DOM rather than against the previous tree like Dio, Inferno and React do. That doesn't seem to play well with third-party stateful components that interact directly with the DOM. I know you suggest using an empty element and doing everything inside it from the third-party component, but this is a bit limiting, and not really required by the other alternatives, which don't compare against the real DOM.

For example, if you want to store some arbitrary data in a third-party library, it's a common pattern to assign it to some unique attribute on the element being enhanced. This is great because you don't usually leak memory when that element is removed from the DOM, as any associated data references are removed as well. For example, when using jQuery(element).data(data), behind the scenes jQuery adds a new unique attribute to element to hold that data. This is usually a problem for Preact, as far as I understand what happens there: it seems that, when reconciling, Preact will notice the extra attribute and remount that element, losing any state set by the third-party component. At least I think this is probably what's happening in this example (it's still a work in progress, but you can see that it works in all the other alternatives except Preact):

https://codesandbox.io/s/R6m9xVpZw

Try to check some item in the first dropdown and you'll notice that, with Preact, it closes the dropdown once the selection changes. That doesn't happen with Dio, Inferno and React, for example.
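The per-element data pattern described above can be sketched in a few lines without jQuery (all names here are hypothetical): the data hangs off one unique "expando" key on the element, so it is released together with the element.

```javascript
// Hypothetical sketch of the per-element data pattern (conceptually
// what jQuery(element).data(...) provides): all data lives under a
// single unique key, so dropping the element drops the data with it.
const EXPANDO = '__widgetData'; // real libraries make this unique per page load

function setData(element, data) {
  // merge new data into the element's private data store
  element[EXPANDO] = Object.assign(element[EXPANDO] || {}, data);
}

function getData(element, key) {
  return element[EXPANDO] ? element[EXPANDO][key] : undefined;
}

// A plain object stands in for a DOM node in this sketch.
const el = { tagName: 'INPUT' };
setData(el, { selected: ['foo', 'bar'] });
console.log(getData(el, 'selected')); // [ 'foo', 'bar' ]
```

The catch discussed in this thread: the extra key is visible on the element, so a reconciler that inspects the live DOM may treat it as something to clean up.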

developit (Owner) commented Jun 22, 2017

As with elements, Preact will not remove attributes it did not create. In the demo, Preact is actually behaving very predictably - it does not intercept and cancel events originating within components (by design).

rosenfeld commented Jun 22, 2017

Do you know the reason why the dropdown menu is closing upon option selection, while it works as I expected with the other alternatives?

developit (Owner) commented Jun 22, 2017

As I said, all of those other libraries intercept browser events and cancel event propagation. You have a focus listener on document that is fired any time you click, because Preact does not mess with event bubbling in any way. In any case, running setup() on focus is strange behavior - it would make much more sense to invoke setup for an element based on its introduction into the DOM via a lifecycle method.

rosenfeld commented Jun 22, 2017

The reason I lazily initialize the autocomplete components is so that the application loads faster.

I suspect the reason has nothing to do with this focus listener, or with whether those libraries mess with those events. If I don't call setState after selecting an option, then it works just as expected with Preact too. But as soon as I tell Preact that the state has changed, we can see in the dev tools that the input element is being re-rendered... This is probably what's causing the menu to close.

developit (Owner) commented Jun 22, 2017

If you're able to pare down that demo at all that would be useful - right now it's quite difficult to actually grasp what is happening since there are so many events and mutations at play.

rosenfeld commented Jun 22, 2017

Okay, but that will probably take a long time: I noticed a bug in our current autocomplete component, and I suspect it will be faster to finish the new component and replace the old one than to try to fix the old one, so that's currently my priority. If I can get some spare time to replicate the issue with simpler code, I'll let you know. The reason I think it has nothing to do with the event handlers is that I don't use any event handlers from the "React" component - all event handlers were registered manually using addEventListener, so in my opinion there's no way this could be related to how each library handles events.

rosenfeld commented Jun 22, 2017

But if you want to be sure that it's not my code closing the suggestions menu, but Preact, just take a look at this stack trace:

[screenshot: stack trace captured at the suggestions element removal]

I put a breakpoint on the suggestions element removal and took a screenshot of the stack trace. You don't need to understand what the code does; if you simply set this breakpoint, you'll be able to see why Preact thinks that element should be removed on re-render, even though Preact didn't create it - it was created by the third-party library. It seems quite tricky to figure out whether or not an element was created by Preact when comparing against the real DOM. When comparing against the previous tree, one doesn't have to worry about this.

rosenfeld commented Jun 22, 2017

Also, I replaced the custom property with a WeakMap and I can confirm the behavior didn't change, which means those extra added properties are not related to this bug.
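The WeakMap variant mentioned above keeps the element itself completely untouched, so there is nothing extra for a DOM-inspecting reconciler to notice, while the data is still reclaimed together with the element. A minimal sketch (hypothetical helper names):

```javascript
// Per-element data via WeakMap: the element object is the key, so no
// extra property or attribute ever appears on the node, and the
// entry becomes collectable once the element itself is
// garbage-collected.
const widgetData = new WeakMap();

function attach(element, data) {
  widgetData.set(element, data);
}

function retrieve(element) {
  return widgetData.get(element);
}

// A plain object stands in for a DOM node in this sketch.
const input = { tagName: 'INPUT' };
attach(input, { open: true });
console.log(retrieve(input).open);   // true
console.log(Object.keys(input));     // [ 'tagName' ] - element untouched
```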

trueadm commented Jun 26, 2017

This might be related to how Preact deals with setState compared to other libraries? It handles setState synchronously when triggered from an event listener, as forms can break if you don't do this.

rosenfeld commented Jun 26, 2017

This is definitely not the issue with this example. If you change the example above to add a setTimeout before calling setState and perform it 1s later, the only change is that the dropdown menu will close 1s after the change :)

trueadm commented Jun 26, 2017

@rosenfeld Sorry, I was confused, as I'm seeing multiple input issues in your example above compared to the same code in React (the dropdown rarely even opens up for me, but always opens in React as intended). Your problem is most likely due to Preact diffing against the DOM rather than against the previous vDOM, as the other libraries do.

I'm sure @developit can write an alternative version for you that works more with the Preact ecosystem's way of dealing with state/diffing, so your application will work as expected :)
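The distinction being discussed can be illustrated with a deliberately oversimplified toy (nothing like either library's real reconciler): a diff that walks the live children sees nodes a third-party library appended and removes them, while a diff that walks the previous virtual tree never examines them.

```javascript
// Toy reconcilers (hypothetical, not real Preact/React code).
// vnodes are { tag, children } objects; "DOM" nodes are plain
// objects too, for the sake of a runnable sketch.

// Diffing against the live DOM: any real child beyond what the new
// vnode describes gets removed - including third-party additions.
function diffAgainstDom(parentDom, nextVnode) {
  while (parentDom.children.length > nextVnode.children.length) {
    parentDom.children.pop();
  }
}

// Diffing against the previous virtual tree: only nodes the library
// itself rendered are ever compared; foreign DOM is invisible here.
function diffAgainstVdom(prevVnode, nextVnode) {
  const ops = [];
  for (let i = 0; i < nextVnode.children.length; i++) {
    const prevChild = prevVnode.children[i];
    if (!prevChild || prevChild.tag !== nextVnode.children[i].tag) {
      ops.push(['replace', i]);
    }
  }
  return ops;
}

// The library rendered one <input>; a third-party widget then
// appended a <ul> suggestions menu directly to the DOM.
const parentDom = { children: [{ tag: 'input' }, { tag: 'ul' }] };
const prev = { tag: 'div', children: [{ tag: 'input' }] };
const next = { tag: 'div', children: [{ tag: 'input' }] };

diffAgainstDom(parentDom, next);
console.log(parentDom.children.length); // 1 - the <ul> menu was dropped

console.log(diffAgainstVdom(prev, next).length); // 0 - the <ul> was never seen
```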

rosenfeld commented Jun 26, 2017

Yes, @trueadm, exactly. I also believe it's related to the reconciler comparing against the DOM while the others compare against the previous VDOM, which means they won't touch any elements created by the third-party library. The autocompleter always opens for me with Preact, though; I'll try to update that code to a more recent version once I'm finished with the basic component and see if that helps...

I initialize the autocomplete lazily and took advantage of some behaviors I've observed, but maybe they don't apply to all browsers. For example, I noticed that the focus event is triggered when the dropdown button is clicked, since it also gets the focus, so I didn't mind creating another click event handler to lazily evaluate the component. But maybe not all browsers behave this way. Another thing I noticed is that, when clicking on the dropdown, after the focus event initialized the component, the component would register a click handler that was able to process the same click that caused the component to be initialized. But, again, maybe I shouldn't count on that if it doesn't work in every browser...

rosenfeld commented Jun 26, 2017

But the main problem with Preact is not related to this. I could work around this with more explicit event handlers. The problem is that, when I call setState, Preact removes an element added by the third-party component when it shouldn't touch it. And I can't work around that.

trueadm commented Jun 26, 2017

@rosenfeld Obvious question, sorry: could you not move your manually added DOM node into the vDOM?

rosenfeld commented Jun 26, 2017

The whole point of this exercise is to test how well those alternatives play with third-party components, as this is pretty important to me. Read the "Old content" section of this article if you want to understand the reason: https://rosenfeld.herokuapp.com/en/articles/2017-06-16-adopting-react-js-seems-risky-for-long-term-projects

trueadm commented Jun 26, 2017

@rosenfeld That makes sense – given what you're aiming to do here. Sorry I couldn't help more!

rosenfeld commented Jun 26, 2017

No problem, this is part of my evaluation of VDOM tech as a whole, to understand which VDOM implementations I should count on when adopting it :)

trueadm commented Jun 26, 2017

@rosenfeld To be honest, in the future I think we'll see far less of vDOM and more compiled strategies that avoid vDOM completely (potentially in combination with WASM). It'll be interesting for sure :)

rosenfeld commented Jun 26, 2017

My current concern with WASM is that it has the potential to create a bigger bundle size, which affects load performance negatively - something I'm particularly worried about due to an SLA we've committed to. So it could be the case that, for slow clients, the benefits of using WASM would not compensate for the extra time spent downloading the bundle on the first non-cached access after a new deploy...

trueadm commented Jun 26, 2017

@rosenfeld That's not strictly true in my experience – it entirely depends on what you're trying to do. I've seen binary sizes that are far smaller than similar logic written in JS. It does depend on how the original code was written and what dependencies it pulls in with it.

I don't think there's a one-trick pony for all of this; once we have GC interop in WASM, we should be able to mix strategies depending on the trade-offs needed for certain scenarios.

rosenfeld commented Jun 26, 2017

Let's watch how things go :) Several years ago, when I worked for another company, we used to build map-based applications that were supposed to run well in Firefox. Chrome didn't exist yet, and we didn't bother to support IE. While I was working there, Chrome was born and the application ran just fine in it too, but IE still wouldn't. Sometimes there were opportunities to use our map-based infrastructure in other products, but the new products would have to work with IE as well... It was hard to make it support IE too, and at some point our manager asked the team leader whether he would consider moving from JS to Flash, since it should work the same in all browsers without having to write code for multiple engines. The team leader at the time wasn't convinced about the future of Flash, was betting on the future of JS, and said no to the manager. Some years later, Flash was becoming obsolete.

This is kind of how I see the situation with WASM. Currently we can't know whether it is something browsers will always support, something that's here to stay. That means there's a high risk involved in adopting it, while JavaScript is certainly going nowhere. So, currently, staying with JavaScript is the safe choice. I can't take the risk of adopting something like WASM, because I won't be able to rewrite everything later if browser vendors decide they are no longer interested in it.

That's also the reason why I preferred to give myself some time to watch React, Angular and the other alternatives before deciding to use one of them. I try to lower the risks involved in adopting a library that completely changes the way one writes code... Currently I'm keeping an eye on WASM the way I was keeping an eye on React, Angular and other libraries. It's very likely I won't be moving to WASM in the next few years, so VDOM is what I'll be investing my time in for the next few years. :)
