Collecting Parameters

What are parameters?

Parameters are the inputs that a machine learning model operates on, learning to associate different sets of these values with each output it is meant to recognise. Parameters are just a list of numbers, but together they form an information stream in which the machine learning algorithms are trained to recognise patterns.

See Machine Learning for more on this 👉

Where do parameters come from?

Parameters can come from a variety of sources, including player position and orientation, controller values, key presses, mouse position, orientation and velocity information, external serial ports, processed audio and video, and many other input devices.

Pre-processing

Parameters can be pre-processed before being passed into the ML systems to make them better suited to training and recognition. Examples of this might be (a short code sketch of a few of these steps follows the list):

  • Normalisation - Although most of the algorithms are scale- and range-agnostic, you may need to remap different sources into a comparable range.
  • Re-framing - Removing absolute values helps make recognition more objective (e.g. a VR controller position should be relative to the head rather than in world space).
  • Projection - Values with wrapping issues (such as angles, particularly directions) don't work well because of discontinuities; they are better projected into another form (e.g. a Cartesian direction vector) and projected back after the ML system.
  • Filtering - Some sources are too noisy for direct training and are better filtered to remove high-frequency components (e.g. head or joystick position).
  • Reduction - Some sources have too high a bandwidth and should be reduced to distil out the fundamental properties to match against (e.g. representing an audio waveform as its frequency spectrum).
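
To make these concrete, here is a minimal standalone C++ sketch of three of these steps (normalisation, re-framing, and angle projection). The function names are hypothetical, and in practice the equivalent maths would live in your Blueprint graph or game code:

```cpp
#include <array>
#include <cmath>

// Normalisation: remap a raw value from a known source range into 0..1.
float Normalise(float Value, float SourceMin, float SourceMax)
{
    return (Value - SourceMin) / (SourceMax - SourceMin);
}

// Re-framing: express a controller position relative to the head so the
// parameter no longer depends on where the player stands in world space.
std::array<float, 3> RelativeToHead(const std::array<float, 3>& ControllerWorld,
                                    const std::array<float, 3>& HeadWorld)
{
    return { ControllerWorld[0] - HeadWorld[0],
             ControllerWorld[1] - HeadWorld[1],
             ControllerWorld[2] - HeadWorld[2] };
}

// Projection: an angle wraps around (e.g. at +/-180 degrees), so feed the ML
// system a continuous 2D direction vector (cos, sin) instead of the raw angle.
std::array<float, 2> AngleToDirection(float AngleDegrees)
{
    const float Radians = AngleDegrees * 3.14159265f / 180.0f;
    return { std::cos(Radians), std::sin(Radians) };
}
```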

Parameter Data Types

Input parameters are handled internally as an array of numerical values (floats), but for convenience this is hidden and parameters can be provided as a wide variety of data types (a sketch of how these flatten into floats follows the note below). The following types are supported:

  • Integer - whole numbers (-1, 0, 1, 2, 3, etc.)
  • Float - real numbers (1.0, 3.1415, -98.76, etc.)
  • Boolean - logical values (false or true)
  • 2D Vector - two floating point numbers (e.g. a point, size, or direction in two dimensions)
  • 3D Vector - three floating point numbers (e.g. a point, size, or direction in three dimensions)
  • Rotation - rotation information
  • Colour - Red, Green, Blue (and alpha) values in floating point form (0.0 to 1.0 for each channel)

💡The node now supports Arrays of values for added flexibility. All the above types can be collected into an array and plugged into a single input. NOTE: Changing the quantity of items in an array could invalidate any stored training data, especially if other parameters are collected after the array.
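
As a purely illustrative sketch (not the plugin's actual internals), this is roughly how differently typed parameters flatten into the single float array: a Boolean contributes one 0.0/1.0 value, a 3D vector three values, a colour four, and an array simply appends each element's floats in order.

```cpp
#include <vector>

// Illustrative only: flattening typed parameters into the internal float array.
void AppendBool(std::vector<float>& Params, bool bValue)
{
    Params.push_back(bValue ? 1.0f : 0.0f);        // 1 float
}

void AppendVector3(std::vector<float>& Params, float X, float Y, float Z)
{
    Params.insert(Params.end(), { X, Y, Z });      // 3 floats
}

void AppendColour(std::vector<float>& Params, float R, float G, float B, float A)
{
    Params.insert(Params.end(), { R, G, B, A });   // 4 floats, each 0.0..1.0
}
```

Viewed this way, it is clear why resizing an array can invalidate stored training data: the total float count changes, shifting the position of any parameters collected after it.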

Collect All The Things!

A Blueprint node called "Collect All The Things!" is provided to capture a set of parameters into the internal numerical form, ready to be passed around node graphs and into the recording and running nodes.

See Blueprint-Nodes#parameter-collection-node for full connection details 👉

This can be wired up to your input sources (after any pre-processing) and produces a single connection that can be routed to the nodes that record into a Training Set or run a trained Model.

Here, the actor's position (a 3D vector) and bearing (as a 2D direction vector) are combined into a parameter set. This parameter set equates to a total of 5 float values (see the node title text). It is then fed into a classification model to produce a single numerical output value each time it is run.
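
Expressed as a standalone C++ sketch (illustrative only; the real wiring is done with the Blueprint nodes described above), that parameter set would be built like this:

```cpp
#include <array>
#include <cmath>
#include <vector>

// Illustrative only: collect an actor's position (3 floats) and bearing
// (projected to a 2D direction vector, 2 floats) into a 5-float parameter set.
std::vector<float> CollectParameters(const std::array<float, 3>& ActorPosition,
                                     float BearingDegrees)
{
    const float Radians = BearingDegrees * 3.14159265f / 180.0f;
    return {
        ActorPosition[0], ActorPosition[1], ActorPosition[2],  // 3D position
        std::cos(Radians), std::sin(Radians)                   // 2D bearing direction
    };
}
// A trained classification model would then map each 5-float parameter set to
// a single numerical output value every time it is run.
```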

Collection Functions

Recording and running both require input parameters, but it is likely that these two processes will be performed in different parts of the project, in different Blueprints, and possibly on different Actors. Since the parameters usually come from the same sources and pass through the same pre-processing, it makes sense to wrap this up in a Blueprint Library function.

This makes using them as simple as dropping a single node into each place they're needed.
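
As a rough analogy in plain C++ (hypothetical names, not the plugin's API; in the project this would be a Blueprint Function Library), the point is simply that recording and running share one collection routine:

```cpp
#include <vector>

// Hypothetical shared routine: both recording and running call the same
// function, so training and recognition always see identically prepared
// parameters.
std::vector<float> CollectMyParameters()
{
    // ... gather input sources and apply any pre-processing here ...
    return {};
}

void RecordExample(float ExpectedOutput)
{
    std::vector<float> Params = CollectMyParameters();
    // ... pass Params and ExpectedOutput to the Training Set recording node ...
}

float RunModel()
{
    std::vector<float> Params = CollectMyParameters();
    // ... pass Params to the trained Model running node and return its output ...
    return 0.0f; // placeholder
}
```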

💡This requires enabling and hooking up the extra Actor input on the parameter collection node, because the Actor that normally provides context for the node to run in isn't present once it's abstracted out of an Actor's own script into a function library. See Utility Blueprints for details 👉

Learn more about 🔗Blueprint Macro & Function Libraries 👉


👈 Blueprint Nodes | 🏠 Home | Recording Examples 👉
