
Add support for particle recording and playback #7085

Open
QbieShay opened this issue Jun 14, 2023 · 3 comments
Describe the project you are working on

The Godot Engine and VFX

Describe the problem or limitation you are having in your project

  1. The art workflow is very limiting (e.g. a particle effect simply plays when testing animations). Particles need to be pausable together with animations to ensure they are properly synced.

  2. To develop advanced particle effects with multiple Particles nodes, you need to be able to scrub through the effect, going backward and forward and pausing as necessary, across all particles at the same time.

  3. Running CPU particles is often needed for performance reasons and to target low-end devices, but falling back to CPUParticles is only possible when using the built-in ParticleProcessMaterial, which severely limits art workflows.

Describe the feature / enhancement and how it helps to overcome the problem or limitation

We will create a new Resource Type called "BakedParticlesData" which will contain prerecorded data for a GPUParticles node.

The GPUParticles nodes will have an extra property for the BakedParticlesData. When the BakedParticlesData is set, the particles will update from the baked data instead of running the ProcessMaterial. This will allow the GPUParticles to trivially update on either the GPU or the CPU.

Using BakedParticlesData also allows the artist to trivially scrub through the particle effect as they work on the design of the particle system.

The editor needs a button to bake the particle effect into a BakedParticlesData resource. This would work the same way as baking a VoxelGI or LightmapGI node. To support this button, a new rendering server function will need to be added that allows running and recording a particle system with specified frames and frame timing.

Describe how your proposal will work, with code, pseudo-code, mock-ups, and/or diagrams

The BakedParticlesData resource will contain 3 arrays: 1) transform, 2) color, 3) custom. This data will be expressed in terms of the particles' lifetime, with positions expressed as offsets from the initial position (i.e. deltas). This allows compatibility with local/global space and offers flexibility in the initial spawn position (there are many cases where you want to keep the same particle motion but change the emission size; color could be separated out in the baking as well).
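A minimal sketch of what such a resource could look like, using std types instead of engine types; all names here are hypothetical, not part of the proposal's API. Positions are stored as deltas from the spawn position, as described above:

```cpp
#include <cassert>
#include <vector>

// Hypothetical layout for the proposed BakedParticlesData resource:
// one entry per particle per captured frame. Positions are stored as
// deltas from the particle's spawn position, so the bake stays valid
// in both local and global space and under a resized emission shape.
struct BakedParticleFrame {
    float position_delta[3]; // offset from the initial spawn position
    float rotation[4];       // orientation as a quaternion
    float scale[3];
    float color[4];          // RGBA
    float custom[4];         // CUSTOM channel passed to the shader
};

struct BakedParticlesData {
    int particle_count = 0;
    int frame_count = 0; // capture length in frames (fixed FPS)
    float fps = 30.0f;
    // Flattened [frame][particle] array covering all proposed channels.
    std::vector<BakedParticleFrame> frames;

    const BakedParticleFrame &at(int frame, int particle) const {
        return frames[frame * particle_count + particle];
    }
};
```

The flat [frame][particle] layout keeps each frame's particles contiguous, which matches how a playback pass would read them.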

The GPUParticles will have new added properties:

  • baked_particle_data

  • process_time (for scrubbing through the effect)

  • playback_mode (simulate, playback)

  • process_time_mode (automatic, manual). This will allow particles to be scrubbed back and forth, or to simply advance their process time automatically while emitting is on. This will be useful when creating editor tools to work with the node.

  • target_mode (GPU, CPU)

When using the CPU backend, GPUParticles will read from those arrays and interpolate between them based on the current process time. Internally, a MultiMesh will be updated with the relevant transform, color, and custom values.

When using the GPU backend, the arrays will be uploaded as storage buffers, and the particle data can be interpolated and set entirely on the GPU, which will be much faster on modern devices.

The interpolation can happen within the normal particle copy step (usually this is only done if we need to adjust the space of the particle, or sort the particles) to avoid any extra compute shader passes.
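The CPU playback path described above can be sketched as follows; this is an illustrative reduction (positions and colors only, linear blending), not the proposal's implementation, and all names are hypothetical:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

struct Frame { float pos[3]; float color[4]; };

// Linear interpolation between the two baked frames that bracket the
// current process time. A real implementation would also slerp the
// rotation and write the result into a MultiMesh instance buffer.
Frame interpolate(const Frame &a, const Frame &b, float t) {
    Frame out;
    for (int i = 0; i < 3; i++)
        out.pos[i] = a.pos[i] + (b.pos[i] - a.pos[i]) * t;
    for (int i = 0; i < 4; i++)
        out.color[i] = a.color[i] + (b.color[i] - a.color[i]) * t;
    return out;
}

// Map a process time in seconds to a frame pair and a blend factor,
// given the fixed capture rate (e.g. 30 Hz).
void time_to_frames(double time_sec, double fps, int frame_count,
                    int &frame_a, int &frame_b, float &t) {
    double f = time_sec * fps;
    frame_a = std::min((int)f, frame_count - 1);
    frame_b = std::min(frame_a + 1, frame_count - 1);
    t = (float)(f - frame_a);
}
```

The same mapping works for the GPU path: the shader computes the bracketing frame indices from process time and fetches both rows of the storage buffer.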

Editor tooling:

Particles will be represented in the animation player in a similar way to audio, marking each loop of the emitter (see mock-up image).

  1. When scrubbing the timeline in the animation player, particles will play back baked data. If no data has been baked, the particles will bake on the fly and keep the baked data until a parameter that affects baked properties changes.

  2. It should be possible to enable particles in one animation and have them continue to emit even after the animation is over.

  3. Looping in the animation player should not cause the particles to be cut off in their playback. If the animation finishes at half of the emitter's loop, the animation looping should not cause the particles' playback time to snap back. However ..

  4. It should be possible to keyframe process time on its own. This should auto-set the particle system to playback + manual time mode.

  5. None of the editor's internal workings should compromise the saved scene. If I am scrubbing the timeline and hit save while my particles are in simulate mode, they should not be saved in manual mode just because I was scrubbing the timeline at save time.

Considerations on scene interaction
GPUParticles can collide with objects, spawn sub-emitters, and interact with attractors. Collisions and sub-emitters should be left out of the scope of this initial work. Attractors will not be supported at any point. Alternatively, collision and attraction could be baked in, but there will be no

For collision, only destroy-on-collision or stop-on-collision should be supported. Baking should allow baking both with and without collisions.

If this enhancement will not be used often, can it be worked around with a few lines of script?

This cannot be worked around with a script. However, the full CPU-only implementation could be prototyped as an extension/addon.

Is there a reason why this should be core and not an add-on in the asset library?

This proposal requires modifications to the GPUParticles node and the internal particle logic, so it can't be an addon. It could, however, be prototyped as an addon.

@Calinou Calinou changed the title Particle recording and playback Add support for particle recording and playback Jun 14, 2023
reduz (Member) commented Jul 21, 2023

Here are some ideas for implementation:

1st: Implement particle recording and playback in RenderingServer

The first thing to note is that we have to store the particle capture in a format that is efficient to compress to disk and efficient to read into memory. If we use raw floats, the amount of particle information we can capture will be severely limited and memory usage will be enormous.

So, some small FAQ:

Q: Do we need to capture just the particle transform or the whole particle state?
A: We need to capture the whole particle state. The only things that are not sent to rendering are velocity and flags, BUT if our particle system has a sub-emitter or trails, we still need to keep the state of every frame and play it back in order for the emission or trail logic to take place. Since it's only one more field, I suggest we just capture the whole state anyway.

Q: Particles run independent of framerate in Godot, do we need to capture them at a fixed FPS?
A: Yes, a fixed framerate must be used during both capture and playback. We can capture at, say, 30 Hz and play back at a fixed frame rate. Godot will already interpolate the particles to the full framerate.

Q: How will these be saved to disk?
A: While compression will ensure that these captures are small (read more below), we still need to ensure users save them to disk as binary.
I think the simplest solution here is to, before baking any animation, ask the user to move the AnimationLibrary to an external file and save it, then allow the bake.

Proposed capture format:

For memory (replay):

The idea here is to save normalized values in a range of 0-1 and encode each channel to 16 bits for position, scale, velocity and color for each particle lifetime track.
Separately, the max/min values are saved for each track. This should efficiently capture the particle state with enough resolution.
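The normalize-then-quantize scheme above can be sketched in a few lines; this is a generic CPU illustration of the idea (names are not from the proposal), with the per-track min/max stored separately as described:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Encode a float into 16 bits, normalized against the per-track
// min/max that the proposal stores separately (the RGBA32F ranges
// texture). Decoding reverses the mapping; the error is at most
// half of (max - min) / 65535 per channel.
uint16_t encode16(float v, float min_v, float max_v) {
    if (max_v <= min_v)
        return 0;
    float n = (v - min_v) / (max_v - min_v);
    if (n < 0.0f) n = 0.0f;
    if (n > 1.0f) n = 1.0f;
    return (uint16_t)(n * 65535.0f + 0.5f);
}

float decode16(uint16_t e, float min_v, float max_v) {
    return min_v + (e / 65535.0f) * (max_v - min_v);
}
```

For a position track spanning 200 units, 16 bits give roughly 0.003-unit resolution, which supports the "enough resolution" claim for typical particle motion.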

Format in detail: RGBA16UI

  • Position: RGB (XYZ), normalized, A: rotation angle
  • Scale: RG (normalized, encoded as 11:10:11), Rotation axis: BA (octahedral encoded)
  • Velocity: RG (normalized encoded as 11:10:11), BA flags (uint32)
  • Color: RGBA (normalized)
  • Custom : RGBAx2 (float encoded, can't compress)
  • Userdata: RGBAx2 (float encoded) x [as many userdatas as used]

(Note: for 2D particles, Scale can be just 32:32 uncompressed, and Velocity can be just 16:16.)

Add to this the normalization ranges (min/max of particle position, scale, velocity and color); this is an RGBA32F of size 2 × [number of particles]. This means that a particle recording is these two textures.

So, without userdata, a particle snapshot in memory takes up 8+8+8+8+16 = 48 bytes per particle.
As such, considering an extreme case, a particle system with 1024 particles that runs for 10 seconds at 30 Hz (60 should be unnecessary since we can interpolate) takes up 48 × 1024 × 30 × 10 bytes = ~15 MB in memory, which is very decent.
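The arithmetic above, spelled out (the per-channel byte counts come from the RGBA16UI layout listed earlier):

```cpp
#include <cassert>
#include <cstddef>

// Per-snapshot cost from the proposed RGBA16UI layout, without
// userdata: position (8) + scale/axis (8) + velocity/flags (8)
// + color (8) + custom (16) = 48 bytes per particle per frame.
constexpr size_t bytes_per_particle_frame = 8 + 8 + 8 + 8 + 16;

constexpr size_t capture_size_bytes(size_t particles, size_t hz,
                                    size_t seconds) {
    return bytes_per_particle_frame * particles * hz * seconds;
}

// 1024 particles at 30 Hz for 10 s -> 14,745,600 bytes (~14.7 MB),
// i.e. the "~15 MB" estimate in the comment.
```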

One doubt that may still remain is how many particles to capture (the Y height of the texture). We know the total number of particles spawned per lifetime, so we can see how many lifetimes fit in the requested capture time and estimate the particle count from that (allocating a bit more just in case).

For saving to disk, bitwidth delta compression can be used, which results in very large savings (almost 10x). This code already exists in the animation compressor, so it just needs to be copied over. This means that a 15 MB capture will be only about 1.5 MB.
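To make the bitwidth/delta idea concrete, here is a toy sketch (not the animation compressor's actual code): store each track as deltas from the previous frame and count how many bits the widest delta needs. Slowly-changing tracks, which most particle channels are, then pack into far fewer than 16 bits per sample:

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>
#include <vector>

// Smallest signed bit width able to represent every frame-to-frame
// delta in a quantized track. A real encoder would then bit-pack the
// deltas at that width instead of storing full 16-bit samples.
int bits_needed_for_deltas(const std::vector<uint16_t> &track) {
    int max_delta = 0;
    for (size_t i = 1; i < track.size(); i++) {
        int d = std::abs((int)track[i] - (int)track[i - 1]);
        if (d > max_delta)
            max_delta = d;
    }
    int bits = 1; // start with the sign bit
    while ((1 << (bits - 1)) <= max_delta)
        bits++;
    return bits;
}
```

A near-constant track compresses to ~1 bit per sample versus 16, which is where the "almost 10x" average saving comes from.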

APIs to be added:

// Begin recording particles; returns an RID. This RID is a texture that contains recording information: on the Y axis the particle index, on the X axis the time. The format is RGBA32F and it contains:

RID RenderingServer::particles_recording_create();
void RenderingServer::particles_recording_set_data(RID p_particles_recording, const Vector<uint8_t>& p_compressed_buffer);
Vector<uint8_t> RenderingServer::particles_recording_get_data(RID p_particles_recording) const;

void RenderingServer::particles_recording_begin(RID p_particles_recording,RID p_particles_instance,double p_length_sec,int p_hz);
void RenderingServer::particles_recording_end(RID p_particles_recording);
////
void RenderingServer::instance_play_particles_recording(RID p_particles_instance, RID p_particles_recording,float p_time_scale);
void RenderingServer::instance_pause_particles_recording(RID p_particles_instance, bool p_paused);
void RenderingServer::instance_seek_particles_recording(RID p_particles_instance, double p_time);
void RenderingServer::instance_stop_particles_recording(RID p_particles_instance);

// (Note: a similar API will have to be implemented for 2D particles)

The capture process should be more or less straightforward: capture happens uncompressed to an RGBA32F texture (we know the length in seconds and the Hz, so the texture can be preallocated). Once finished (particles_recording_end), what was recorded gets compressed using a compute shader.

The compute shader basically does something like this:

  • Compute minimums/maximums of each particle track using a reduction algorithm like the one in luminance, store them to the min/max texture.
  • Copy the capture texture to a compressed texture, normalize and compress based on min/max and encode rotation as compressed axis/angle.
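The two compute steps above can be illustrated with a CPU reference for a single scalar track; on the GPU this would be a parallel reduction (like the luminance auto-exposure pass) followed by a normalize-and-copy pass. Names here are illustrative:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

// CPU reference for the per-track work the proposed compute shader
// would do: reduce to min/max, then normalize every sample into 0..1
// and quantize to 16 bits. The min/max pair is what gets written to
// the separate RGBA32F ranges texture.
std::vector<uint16_t> compress_track(const std::vector<float> &samples,
                                     float &out_min, float &out_max) {
    auto [lo, hi] = std::minmax_element(samples.begin(), samples.end());
    out_min = *lo;
    out_max = *hi;
    float range = (out_max > out_min) ? (out_max - out_min) : 1.0f;
    std::vector<uint16_t> out;
    out.reserve(samples.size());
    for (float s : samples)
        out.push_back((uint16_t)((s - out_min) / range * 65535.0f + 0.5f));
    return out;
}
```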

Bitwidth compression/decompression can happen when getting/setting data from the RenderingDevice. Initially this can be left unimplemented for a first PR, as long as the format contemplates enabling it later on.

2nd: Recording resource

class ParticlesRecording : public Resource {
	// Creates the RID internally.
	// These expose a serialized hidden property.
	void set_recording_data(const Vector<uint8_t> &p_data);
	Vector<uint8_t> get_recording_data() const;
};
...

3rd: Implement particle recording in Animation

The Animation resource will need an extra type of track: Particles.
It points to a GPUParticles2D or GPUParticles3D node.
Every keyframe in the track will have a length in seconds and an optional Ref<ParticlesRecording>.

During playback, if no recording exists, the track will start and stop the particle emission. If a recording exists, the track uses instance_play_particles_recording.

Seeking will not work on this track until there is a recording, in which case the recording will be seeked.
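The track behavior described in the last two paragraphs reduces to a small decision table; this is a sketch of that logic (the enum and function names are hypothetical, not proposed API):

```cpp
#include <cassert>

// Decision logic for the proposed Particles track type: without a
// recording the track can only toggle emission, so seeking is a
// no-op; with a recording, playback and seeking both go through the
// instance_*_particles_recording API.
enum class TrackAction { ToggleEmission, PlayRecording, SeekRecording, Ignore };

TrackAction particles_track_action(bool has_recording, bool is_seek) {
    if (!has_recording)
        return is_seek ? TrackAction::Ignore : TrackAction::ToggleEmission;
    return is_seek ? TrackAction::SeekRecording : TrackAction::PlayRecording;
}
```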

4th: Editor

The editor will allow you to create particle tracks and insert keyframes (which define a playback range during which the particle system is active).
The keyframe segments will have an option to be baked (maybe a hover button, I don't know). While baking, the animation segment will be played back in real time (the editor will not let the user interact, except for a dialog saying it is recording, with an option to cancel). When baked, the animation should show a locked icon or similar, and editing the length of the keyframe should no longer be allowed. I suppose clicking it will ask the user if they want to remove the baked data.

As mentioned before, when baking, if it is not already the case, the editor will ask the user to save the AnimationLibrary to a separate file, similar to when baking lightmaps.

QbieShay (Author) commented:

I can't comment on the technical implementation part, but I want to comment on the final workflow goals:

  1. Seeking in the animation player should be possible even for particles that are, in the end, simulated at runtime. This feature should be the groundwork for making it easy to scrub the timeline, not a reason to convert all particles to baked particles just to be able to scrub. Games require a lot of texture space, and allocating extra for particles just because I want to be able to scrub the timeline at editor time is really not ideal.
  2. Keyframing some properties of particles doesn't work because the initial state is recalculated every frame, so changing, for example, the scale value doesn't work at all. It should be supported later down the line with some emitter lifecycle curves, which will also conveniently save GPU memory compared to snapshotting the emitter properties when the particles were spawned, especially considering we support custom particle shaders.

Status: Needs implementer
3 participants