In schema/defs.json, there are definitions for 'int64' and 'uint64' types:
"int64": {
"type": "integer",
"minimum": -9223372036854776000,
"maximum": 9223372036854776000
},
"uint64": {
"type": "integer",
"minimum": 0,
"maximum": 18446744073709552000
},
These minimum and maximum values fall outside the range of a signed/unsigned 64-bit integer. Is there a reason these definitions don't match the usual INT64_MIN/INT64_MAX/UINT64_MAX values? The other sized integer type definitions in the schema have the min/max values I would expect.
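For what it's worth, the specific literals look like they came from an IEEE-754 double (e.g. a JavaScript Number) rather than being typos: they are the shortest decimal strings that round-trip to 2^63 and 2^64 as doubles, which is what you get if the bounds were computed and serialized through double-precision floats. A quick sketch checking that guess:

```python
# Hypothesis (my assumption, not from the schema itself): the bounds were
# emitted via IEEE-754 doubles, so INT64_MAX/UINT64_MAX rounded up to the
# nearest representable double, 2**63 and 2**64.
INT64_MAX = 2**63 - 1    # 9223372036854775807
UINT64_MAX = 2**64 - 1   # 18446744073709551615

# Neither max is exactly representable as a double; both round up to the
# next power of two.
assert float(INT64_MAX) == 2.0**63
assert float(UINT64_MAX) == 2.0**64

# The schema's decimal literals parse back to exactly those doubles:
assert float(9223372036854776000) == 2.0**63
assert float(-9223372036854776000) == -2.0**63
assert float(18446744073709552000) == 2.0**64
```

If that's the cause, the practical consequence is that a conforming validator using doubles would accept values like 2^63 that overflow a real int64.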