Commit 94c44ba

DjvuLee authored and committed
[SPARK-19239][PYSPARK] Check whether lowerBound and upperBound are None in the jdbc API
The ``jdbc`` API does not check ``lowerBound`` and ``upperBound`` when ``column`` is specified, and just throws the following exception: ```int() argument must be a string or a number, not 'NoneType'```. Checking the parameters up front lets us give a friendlier suggestion.
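The guard this commit adds can be sketched outside Spark. `check_jdbc_partition_args` below is a hypothetical standalone helper (not Spark's actual API) that mirrors the validation: when ``column`` is given, both bounds must be present before they are coerced with ``int()``.

```python
def check_jdbc_partition_args(column=None, lowerBound=None, upperBound=None):
    """Sketch of the validation added by this commit.

    When ``column`` is specified, ``lowerBound`` and ``upperBound`` must
    also be specified, so the caller sees a clear message instead of
    "int() argument must be a string or a number, not 'NoneType'".
    """
    if column is not None:
        # Fail early with a friendly message, as the commit does.
        assert lowerBound is not None, \
            "lowerBound can not be None when ``column`` is specified"
        assert upperBound is not None, \
            "upperBound can not be None when ``column`` is specified"
        # Only now is it safe to coerce the bounds to int.
        return int(lowerBound), int(upperBound)
    # No partition column: the bounds are not needed.
    return None
```

For example, `check_jdbc_partition_args("id", 0, 100)` succeeds, while omitting a bound raises an `AssertionError` with the descriptive message rather than a cryptic `TypeError`.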
1 parent: 61e48f5

File tree: 1 file changed (+4, -1 lines)


python/pyspark/sql/readwriter.py (4 additions, 1 deletion)

@@ -399,7 +399,8 @@ def jdbc(self, url, table, column=None, lowerBound=None, upperBound=None, numPar
         accessible via JDBC URL ``url`` and connection ``properties``.

         Partitions of the table will be retrieved in parallel if either ``column`` or
-        ``predicates`` is specified.
+        ``predicates`` is specified. ``lowerBound` and ``upperBound`` is needed when ``column``
+        is specified.

         If both ``column`` and ``predicates`` are specified, ``column`` will be used.

@@ -431,6 +432,8 @@ def jdbc(self, url, table, column=None, lowerBound=None, upperBound=None, numPar
         if column is not None:
             if numPartitions is None:
                 numPartitions = self._spark._sc.defaultParallelism
+            assert lowerBound != None, "lowerBound can not be None when ``column`` is specified"
+            assert upperBound != None, "upperBound can not be None when ``column`` is specified"
             return self._df(self._jreader.jdbc(url, table, column, int(lowerBound), int(upperBound),
                                                int(numPartitions), jprop))
         if predicates is not None:
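As a minimal illustration of why the new asserts sit before the ``int()`` coercion: with the bounds omitted, the old code handed ``None`` straight to ``int()``, which raises the opaque ``TypeError`` quoted in the commit message (the exact wording varies slightly across Python versions, but it always names ``'NoneType'``).

```python
def reproduce_pre_fix_error():
    """Reproduce the failure mode this commit guards against:
    coercing an omitted bound with int() raises a cryptic TypeError."""
    lowerBound = None  # what the old code saw when the bound was omitted
    try:
        int(lowerBound)  # the coercion performed inside jdbc()
    except TypeError as e:
        return str(e)  # e.g. "int() argument must be a string ... not 'NoneType'"
```

Calling `reproduce_pre_fix_error()` returns the message that prompted this change, showing nothing in it points the user at the missing ``lowerBound`` argument.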
