precisionDefault

Precision used to round decimal operation results. Every result is adjusted to fit the specified precision. Use DecimalControl to query or set the context precision.
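A minimal sketch of querying and setting the context precision. The module path `decimal`, the static `DecimalControl.precision` property, and the direct import of `precisionDefault` are assumptions based on this page; if `precisionDefault` is a member of a named enum, qualify it accordingly.

---
import std.stdio : writeln;

// Assumed module path and symbol names; adjust to the actual package layout.
import decimal : DecimalControl, precisionDefault;

void main()
{
    // Query the current context precision; precisionDefault (0) means
    // every result is rounded to the native precision of its type.
    writeln(DecimalControl.precision);

    // Round all subsequent results to 7 significant digits,
    // whatever the operand type.
    DecimalControl.precision = 7;

    // Restore the default per-type behavior.
    DecimalControl.precision = precisionDefault;
}
---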

Values

precisionDefault = 0
    use the default precision of the current type (7 digits for
    decimal32, 16 digits for decimal64, or 34 digits for decimal128)

precision32 = Decimal!32.PRECISION
    use 32-bit precision (7 digits)

precision64 = Decimal!64.PRECISION
    use 64-bit precision (16 digits)

precision128 = Decimal!128.PRECISION
    use 128-bit precision (34 digits)
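For illustration, the PRECISION constants referenced in the table can be checked at compile time; the `decimal` import path is an assumption, while the digit counts come from the table above.

---
// Assumed module path; the PRECISION members match the table above.
import decimal : Decimal;

static assert(Decimal!32.PRECISION == 7);
static assert(Decimal!64.PRECISION == 16);
static assert(Decimal!128.PRECISION == 34);
---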
