ValueReal.DECIMAL_PRECISION misused when converting type info from DOUBLE to DECFLOAT #3585

@lic-hen

Description

I have a column FOO of type DOUBLE. When I select FOO/100.0 from the table, the result is shorter than expected. For example:

sql> create table t_test(foo double);
(Update count: 0, 44 ms)
sql> insert into t_test values (1234567890123.4);
(Update count: 1, 2 ms)
sql> select foo/100.0 from t_test;
FOO / 100.0
1.2345679E+10

DOUBLE has 15 to 17 significant decimal digits of precision, but the result above retains only 8.
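For comparison, plain Java double arithmetic keeps the full quotient. A minimal standalone check (the class name is illustrative, not part of H2):

```java
public class DoubleDivisionDemo {
    public static void main(String[] args) {
        double foo = 1234567890123.4; // the inserted row value
        double result = foo / 100.0;  // plain double division

        // A double carries 15 to 17 significant decimal digits, so the
        // quotient keeps far more precision than the truncated value
        // H2 returned for FOO/100.0.
        System.out.println(result);

        // 12345678901.234 is representable to well within this tolerance.
        assert Math.abs(result - 12345678901.234) < 1e-3;
    }
}
```

This shows the precision loss happens in H2's type conversion, not in the underlying floating-point arithmetic.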

I debugged the H2 source code and found the cause:
when a DOUBLE is divided by a NUMERIC, both operands are converted to the common higher type DECFLOAT.
Reading org.h2.value.TypeInfo#toDecfloatType, I found that the "case Value.REAL" and "case Value.DOUBLE" branches call the inner method getTypeInfo with the same arguments, and in both the second argument is ValueReal.DECIMAL_PRECISION,
as below:
[screenshot of TypeInfo#toDecfloatType: both the REAL and DOUBLE branches pass ValueReal.DECIMAL_PRECISION to getTypeInfo]

It seems that the second argument to getTypeInfo in the "case Value.DOUBLE" branch mistakenly uses ValueReal's DECIMAL_PRECISION, because ValueDouble also has a constant field named DECIMAL_PRECISION, whose value is 17.
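A simplified sketch of the suspected fix. This is not the actual H2 source: the helper name decfloatPrecisionFor, the type tags, and the REAL precision constant are illustrative placeholders; only ValueDouble.DECIMAL_PRECISION = 17 is stated in the source above.

```java
public class DecfloatPrecisionSketch {
    // Placeholder constants standing in for ValueReal.DECIMAL_PRECISION
    // and ValueDouble.DECIMAL_PRECISION (the latter is 17 per the report;
    // the REAL value here is an assumption for illustration only).
    static final int REAL_DECIMAL_PRECISION = 7;
    static final int DOUBLE_DECIMAL_PRECISION = 17;

    static final int TYPE_REAL = 0;
    static final int TYPE_DOUBLE = 1;

    // Sketch of what TypeInfo#toDecfloatType should pass as the
    // precision argument for each floating-point value type.
    static int decfloatPrecisionFor(int valueType) {
        switch (valueType) {
            case TYPE_REAL:
                return REAL_DECIMAL_PRECISION;
            case TYPE_DOUBLE:
                // The reported bug: this branch passed
                // ValueReal.DECIMAL_PRECISION instead of
                // ValueDouble.DECIMAL_PRECISION.
                return DOUBLE_DECIMAL_PRECISION;
            default:
                throw new IllegalArgumentException("not a float type: " + valueType);
        }
    }
}
```

With the DOUBLE branch returning 17, the DECFLOAT produced for FOO/100.0 would be wide enough to hold all significant digits of the double operand.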
