
Magical limit of failure between 70 million and 100 million rows in one table #2592

@esrat

Description


Details for the issue

What did you do?

I imported a table from a .csv file with many lines.

What did you expect to see?

I thought I could browse / filter / sort the table data after the import.

What did you see instead?

The whole DB Browser application froze when opening the "Browse data" tab (endless loop?).

Useful extra information

It's quite simple to reproduce; a small script for generating both test files follows below.

  • Successful case: create a .csv file with a single integer column, e.g. column "A" with values 0 to 69999999. This table imports without problems and can be browsed afterwards.
  • Failure case: create the same .csv file, but with values up to 99999999. The imported table can no longer be browsed, filtered, or sorted, because the program gets stuck (full single-threaded CPU utilisation!).
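
For convenience, here is a minimal sketch that generates both test files (assuming Python 3; the file names ok_70m.csv and fail_100m.csv are made up for illustration):

```python
# Sketch: generate the two .csv files described above.
# Column name "A" and the row counts follow the report; the file names are hypothetical.
import csv

def write_test_csv(path, row_count):
    """Write a CSV with a single integer column "A" holding the values 0 .. row_count - 1."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["A"])          # header row
        for value in range(row_count):
            writer.writerow([value])

# Successful case: values 0 to 69999999 (70 million rows)
write_test_csv("ok_70m.csv", 70_000_000)

# Failure case: values 0 to 99999999 (100 million rows)
write_test_csv("fail_100m.csv", 100_000_000)
```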

What operating system are you using?

  • Windows: ( version: Windows 7, 64-bit )
  • Linux: ( distro: ___ )
  • macOS: ( version: ___ )
  • Other: ___

What is your DB4S version?

  • 3.12.1
  • 3.12.0
  • 3.11.x
  • Other: nightly build of 13th February 2021


Metadata

    Labels

    Qt, browse data, bug (Confirmed bugs or reports that are very likely to be bugs.)
