particle system improvements and features #385

Closed
insidiator opened this issue Jan 19, 2014 · 16 comments

@insidiator
Contributor

If it is okay with you, I would like to use this ticket for discussing which features and improvements to the particle system could and should be implemented.
The implementation of a feature should then be discussed in a separate pull request.

So here are some ideas I gathered so far. They are in no particular order.

  • particle effect editor
    • The example could be extended to allow the creation of custom effects, while the current effects are just presets. An export button would then let the user copy and paste the necessary source code into their game.
  • arbitrary shapes for start region
    • Instead of only a rectangle, it would be nice to set the start region to any shape (circle, polygon, maybe even text?). Allowing multiple shapes at once would also be useful.
  • custom timer
    • That way the particle system could be paused independently of the rest of the game, or be slowed down for a slow motion effect. This might be something that could be used for all game objects (timer groups).
  • explicit path calculation
    • This would allow reversing the timer and letting the particles flow back into the start region.
  • arbitrary interpolation functions for all values (linear as default)
    • The tween.js easing functions could be used to interpolate opacity, size, etc. (see the sketch after this list).
  • multiple particle types in one system
  • arbitrary particles
  • correct handling of floating and onlyInViewport
    • Currently there are no boundary checks when the emitter is floating and onlyInViewport is true.
  • cached drawing
    • By drawing the particle container to a dedicated canvas, the render output could be reused while skipFrames is in effect or the particle effect is paused. This might also improve performance when globalCompositeOperation is set to lighter.
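
To make the interpolation idea concrete, here is a minimal sketch of how a tween.js-style easing function could drive a particle property. The particle fields (age, lifetime, alpha) and function names are hypothetical; only the easing shape matches tween.js:

    // Hedged sketch: replace hard-coded linear interpolation with a
    // pluggable easing function. The easing maps normalized time
    // t in [0, 1] to [0, 1], like the tween.js easing functions.
    function interpolate(start, end, t, easing) {
        return start + (end - start) * easing(t);
    }

    // Quadratic ease-out (same formula as tween.js Quadratic.Out).
    function quadOut(t) {
        return t * (2 - t);
    }

    // Example: fade a particle out over its lifetime.
    function updateAlpha(particle) {
        var t = particle.age / particle.lifetime;
        particle.alpha = interpolate(1, 0, t, quadOut);
    }
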
@parasyte
Collaborator

Before I forget about it, I had a vision of a great particle editor interface where the emitter properties were controlled directly with a mouse:

  • Drag and drop emitters to move them.
  • Adjust the angle and magnitude ranges with two vector lines originating from the emitter; the angle difference between the two lines defines the total angle range, and the length difference defines the total magnitude range. The range can be drawn with a filled arc (see the sketch after this list):
  • [mockup image: "Vector Ranges"]
  • Particle scaling and rotation ranges can also use similar widgets.
  • And of course, everything gets a number input field (at least).
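
To make the widget geometry concrete, here is a hedged sketch of how the two vector handles could be reduced to the emitter ranges. All names are hypothetical, and each handle is taken to be a point relative to the emitter position:

    // Hedged sketch: derive the angle and magnitude ranges from the
    // two vector lines of the proposed widget.
    function rangesFromHandles(a, b) {
        var angleA = Math.atan2(a.y, a.x);
        var angleB = Math.atan2(b.y, b.x);
        var magA = Math.sqrt(a.x * a.x + a.y * a.y);
        var magB = Math.sqrt(b.x * b.x + b.y * b.y);
        return {
            // the difference between the line angles is the total angle range
            minAngle: Math.min(angleA, angleB),
            maxAngle: Math.max(angleA, angleB),
            // the difference between the line lengths is the total magnitude range
            minSpeed: Math.min(magA, magB),
            maxSpeed: Math.max(magA, magB)
        };
    }

    // Usage: handles dragged to (50, 0) and (0, 80) give an angle range
    // of [0, PI/2] and a magnitude range of [50, 80].
    var ranges = rangesFromHandles({ x: 50, y: 0 }, { x: 0, y: 80 });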

@insidiator
Contributor Author

I like the idea of controlling the properties directly, but we should add this after we have everything else finalized.
It will take a lot of work to implement such an interface properly and updating it with every new API change will be a pain ;)
For the first version of the editor I think it would be easiest to use jQuery UI sliders.

@agmcleod
Collaborator

I agree. Otherwise, I think it's a great idea.

@parasyte
Collaborator

The description also illustrates another point I made about changing the emitter angle and rotation range properties into vectors. The proposed UI for it is also very simple (approx. an hour to implement all of this?)

I ask that you not be so quick to dismiss it as a "nice to have but not a priority". Often, a UI will lead to API design decisions. Not the other way around. (Eat your own dog food to discover which parts need work. Do this early and often.)

Secondly, presentation is just as important as the rest of the particle manager. Why have a particle effect example in the first place? Eye candy. Demonstration of capabilities.

Third, the process will be identical to maintaining the melonJS engine and melonJS examples; every change to the engine requires changes to the examples, and these are maintained in parallel. So we just maintain the particle manager and its example (the editor) in parallel.

@insidiator
Contributor Author

I didn't mean to dismiss the idea at all, I am completely on your side with having an awesome particle effect editor interface to demonstrate the capabilities of this awesome engine.

I just think it will save us a lot of time if we do some of the other changes first, as it will heavily influence what options the UI will have to provide and how they are going to work.

Specifically changing from the current model to having a particle manager which contains multiple emitters is something I think should be done first, as it will have a big impact on how particles work.

Also I am sure that we will need some iterations to make the UI feel alright after we have a first prototype, so an hour is IMHO a very optimistic guess for the whole working and shining interface ;)

And last but not least I think it is a good idea to do the fallback interface first (sliders, input fields, check-boxes), because you can then verify that everything works as you expect, which in turn makes it easier to develop an experimental interface.

@parasyte
Collaborator

The hour estimate was for the widget alone... the one I mocked up in GIMP (image above). 😉 Adding form controls and styling them will be the brunt of the work, and I share no optimism for the pace at which that can be developed. Also, manual inputs will always be necessary (last point in my first post). The widget provides visualization and natural manipulation, but a number field is the only way to enter exact values (sliders can be inaccurate, and are just a different kind of visualization/manipulation widget).

But again, I recommend doing the work in parallel; time will be saved overall by adding multiple emitters and multiple emitter editors to the UI simultaneously. If you procrastinate on the UI, you'll just spend more time manipulating properties in code. The tradeoff makes an early UI look only minimally advantageous for the added effort, but in the long run it will save a great deal of time for testing and iterating on the internal moving parts. In other words, you will hit diminishing returns far sooner without a decent UI.

@insidiator
Contributor Author

Well, you are right.
But I also want to remind you that the particle effect system should be something that is used in games and not just a part of the particle effect editor. So when implementing a UI widget like the above, we should also keep in mind that it might limit the possibilities of the system if we change the available properties based on the needs of the widget.

With that said, I would like to start implementing the editor then. I will first add some input fields to the example and then open a pull request so you can have a look at it and we can continue the discussion there.

@parasyte
Collaborator

You make a good point, and just to be clear: I don't advocate changing the APIs just to match a fancy UI. I think of the process more as an exercise in solving a problem.

The problem in this case was having multiple numbers represent range properties for the emitter. The numbers are not really connected to one another except through naming conventions, so it's hard to treat them as anything but individual numbers.

When represented as vectors, the problem of describing the range properties is resolved; a vector ties the coordinates tightly together. But it's not a simple concept to grasp without seeing it in action. That's what led to the widget design and mockup.

There you have it: The API designed the UI, not the other way around.
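
A hedged before/after illustration of that point: the "before" property names mirror the min/max style discussed above, while the "after" shape is only an assumption:

    // Before: four loosely related numbers, connected only by naming.
    var emitterBefore = {
        minAngle: 0,
        maxAngle: Math.PI / 4,
        minSpeed: 2,
        maxSpeed: 5
    };

    // After: two vectors, each encoding one edge of the combined
    // angle/magnitude range. The widget above can manipulate these
    // directly, and the separate numbers fall out as derived values.
    var emitterAfter = {
        rangeStart: { x: 2, y: 0 },  // angle 0, magnitude 2
        rangeEnd: {                  // angle PI/4, magnitude 5
            x: 5 * Math.cos(Math.PI / 4),
            y: 5 * Math.sin(Math.PI / 4)
        }
    };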

@insidiator
Contributor Author

Some new ideas I had while working on the editor today:

  • physics enabled particles
    • bounce off the surface of walls etc.
  • attraction and repulsion points
  • multiple images for one emitter
    • particles choose one of them at random when they spawn
  • change min/max speed to speed and speed variation
  • change min/max angle to angle and angle variation
  • change wind and gravity to force vector
  • range class which handles value ranges (see the sketch after this list)
    • min is always smaller than max
    • get random value in range
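
A minimal sketch of the proposed range class; the name Range and its methods are assumptions:

    // Hedged sketch of the proposed range helper.
    function Range(a, b) {
        // min is always the smaller bound, whatever order is passed in
        this.min = Math.min(a, b);
        this.max = Math.max(a, b);
    }

    // Return a uniformly distributed random value within the range.
    Range.prototype.random = function () {
        return this.min + Math.random() * (this.max - this.min);
    };

    // Usage: "speed and speed variation" maps naturally onto a range.
    var speed = new Range(2, 5);
    var particleSpeed = speed.random();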

@aaschmitz
Contributor

Hi @insidiator

I would like to help implement this item:

  • Multiple images for one emitter: particles choose one of them at random when they spawn

It would also be interesting to load the images from a texture atlas. Any idea how to pass a texture atlas to the particles instead of a single image?

Thanks!

@insidiator
Contributor Author

Hi @ciangames
Using a texture atlas should be rather easy if we implement arbitrary particles.

In order to keep the draw method small, it would probably be best to create different particle classes for the current form and for particles based on renderables. We could also add additional classes for primitive particles based on shape objects or text.

Then we can just pass any object that we support into the emitter.

If you want to use all images inside the texture atlas you would probably iterate over the atlas, call createSpriteFromName for every name and put everything in an array. Then you just pass that to the emitter.
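
For example, something along these lines. This is a hedged sketch: the atlas JSON layout, resource names, and emitter property are assumptions, with only createSpriteFromName taken from the discussion:

    // Hedged sketch: build one sprite per frame name in the atlas
    // and hand the whole array to the emitter.
    var atlasData = me.loader.getJSON("particles"); // assumed atlas JSON resource
    var texture = new me.TextureAtlas(atlasData, me.loader.getImage("particles"));

    var sprites = [];
    Object.keys(atlasData.frames).forEach(function (name) {
        sprites.push(texture.createSpriteFromName(name));
    });

    // assumed emitter API: an array of images/renderables instead of one image
    emitter.image = sprites;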

@aaschmitz
Contributor

Hi @insidiator

Currently the emitter accepts an image parameter for a single image:

me.loader.getImage("particle");

And to use a texture atlas we would need something like:

 game.createSpriteFromName("particle");

We would need to create new classes, such as:

  • me.ParticleImage derived from me.Renderable (for use with single images)
  • me.ParticleTexture derived from me.AnimationSheet (for use with a texture atlas)
  • me.ParticleCanvas derived from me.Renderable (for use with native canvas drawing, like rect)

Is it something like this? What do you guys think?

Thanks!

@insidiator
Contributor Author

I think the emitter should accept either a single image like now, or an array of images. The required particle class should then be decided inside addParticles when an image is selected.

The new classes can be based on the same parent class, because the update function should be the same for all particles.
I personally would prefer me.ImageParticle, me.TextureParticle and me.CanvasParticle, but I think that is something @obiot, @parasyte and @agmcleod should decide ;)
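
A hedged sketch of how that selection inside addParticles might look; the particle class names follow the suggestion above, and everything else is an assumption:

    // Hedged sketch: pick a particle class per spawn based on the
    // kind of object the emitter was given.
    function createParticle(emitter) {
        var source = emitter.image;
        // if an array was supplied, pick one entry at random per spawn
        if (Array.isArray(source)) {
            source = source[Math.floor(Math.random() * source.length)];
        }
        // renderables (e.g. sprites from a texture atlas) get their own class
        if (source instanceof me.Renderable) {
            return new me.TextureParticle(emitter, source);
        }
        // plain images (e.g. from me.loader.getImage) fall back to this one
        return new me.ImageParticle(emitter, source);
    }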

@obiot
Member

obiot commented Feb 27, 2014

From my point of view the emitter should only accept a renderable object. Sure, there will be a small overhead compared to just an image, but it is useful even for a single image, at least because it then becomes possible to get a single sprite out of a packed texture.

@parasyte
Collaborator

The fact is that we need to optimize the renderable process anyway, just like we need to optimize the container. There's no point in writing custom code for the sake of speed if it is not going to help other parts of the engine as well.

So I agree with @obiot - particles should be any class that extends me.Renderable

@obiot
Member

obiot commented Jul 28, 2014

Some improvements might land in 1.1.0 (see #531).

@obiot obiot added this to the 1.1.0 milestone Jul 28, 2014
@obiot obiot modified the milestones: 1.1.0, 1.2.0 Aug 11, 2014
@parasyte parasyte modified the milestones: 1.2.0, Future Oct 15, 2014
@melonjs melonjs locked and limited conversation to collaborators Sep 17, 2021
@obiot obiot closed this as completed Sep 17, 2021

This issue was moved to a discussion.

You can continue the conversation there.
