Scott C Gray edited this page Feb 26, 2017 · 2 revisions

Variable

precision - Controls the number of digits displayed in floating-point values

Description

When displaying floating-point values, ${precision} determines how many total digits of the value are displayed. ${scale} is used in conjunction with this value to indicate how many of those digits are shown after the decimal point.

This property applies only to values of type real, float, double, and Oracle NUMERIC (when declared with no precision or scale).
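As a sketch of how the two variables interact, the following sqsh session fragment sets a total width of 8 digits, 2 of which fall after the decimal point (the variable values here are illustrative, not defaults):

```
1> \set precision=8
1> \set scale=2
```

With these settings, a floating-point result would be rendered with up to 8 significant digits, 2 of them as the fractional part.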

See also

scale
