Description
Currently, Spark only supports inferring IntegerType, LongType, DoubleType and StringType for partition column values.
DecimalType is also tried, but it seems the type is never actually inferred as DecimalType because DoubleType is tried first.
In addition, DateType and TimestampType could be inferred; it is pretty common to use either of them for a partition column.
It would be great if such values could be inferred as DateType or TimestampType rather than just StringType.
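For illustration, here is a minimal sketch of the behaviour described above (the output path and column names are hypothetical, and the SparkSession API is used for brevity): a partition directory written from a date-formatted string column is currently read back with the partition column inferred as StringType rather than DateType.

{code:scala}
import org.apache.spark.sql.SparkSession

object PartitionTypeInferenceExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("partition-type-inference")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Write a small dataset partitioned by a date-formatted string column,
    // producing directories such as date=2016-02-01 (hypothetical path).
    Seq(("a", "2016-02-01"), ("b", "2016-02-02"))
      .toDF("value", "date")
      .write
      .partitionBy("date")
      .mode("overwrite")
      .parquet("/tmp/partition_inference_example")

    // Read it back: the partition column "date" is currently inferred as
    // StringType, whereas this issue asks for DateType to be inferred.
    val df = spark.read.parquet("/tmp/partition_inference_example")
    df.printSchema()

    spark.stop()
  }
}
{code}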
Issue Links
- is duplicated by SPARK-11995 Partitioning Parquet by DateType (Closed)
- links to