Welcome to DBTest

About DBTest

Over the last few years, we have seen increasing academic and industry work on benchmarking and performance evaluation in novel cloud settings, driven both by growing and increasingly diverse workloads and by new hardware such as FPGAs and GPUs. Moreover, new classes of data-driven applications (e.g., machine learning and big data scenarios) must now be considered, while SQL and NoSQL engines alike continue to evolve and new engine concepts, such as unified engines, emerge. Consequently, dedicated testing efforts and rigor are crucial to ensure that these novel system architectures and designs retain classical database strengths such as reliability, integrity, and performance.

The goal of DBTest 2026 is to bring researchers and practitioners from academia and industry together to discuss challenges, approaches and open problems around these issues.

Topics Of Interest

  • Reproducibility of database research (new!)
  • Testing and benchmarking of learning-based database systems (new!)
  • Testing of database systems, storage services, and database applications
  • Testing of database systems using novel hardware and software technology (non-volatile memory, hardware transactional memory, …)
  • Testing heterogeneous systems with hardware accelerators (GPUs, FPGAs, ASICs, …)
  • Testing distributed and big data systems
  • Testing machine learning systems
  • Specific challenges of testing and quality assurance for cloud-based systems
  • War stories and lessons learned
  • Performance and scalability testing
  • Testing the reliability and availability of database systems
  • Algorithms and techniques for automatic program verification
  • Maximizing code coverage during testing of database systems and applications
  • Generation of synthetic data for test databases
  • Testing the effectiveness of adaptive policies and components
  • Tools for analyzing database management systems (e.g., profilers, debuggers)
  • Workload characterization with respect to performance metrics and engine components
  • Metrics for test quality, robustness, efficiency, and effectiveness
  • Operational aspects such as continuous integration and delivery pipelines
  • Security and vulnerability testing
  • Experimental reproduction of benchmark results
  • Functional and performance testing of interactive data exploration systems
  • Traceability, reproducibility, and reasoning for ML-based systems

Keynote Speakers
To Be Announced


Call for Contributions

Research or Experience Papers

Authors are invited to submit original, unpublished research papers or experience papers that describe the implementation of testing-related solutions in applications and products. Submissions must pertain to DBTest's topics of interest listed above and must not be under consideration for publication in any other forum.

Submissions are expected to be 4 to 6 pages, excluding references and appendix.

Accepted research submissions will be published in the ACM DL.

Talks

Authors can submit a talk proposal, either on previously published but relevant work or on new ideas within the scope of DBTest's topics of interest, on which they would like feedback from the community.

Talk proposals should be 1 page, including references and appendix, and must be marked with the suffix [Talk Proposal] in the submission's title.

Accepted talk proposals will be listed on the DBTest homepage.



Guidelines

Authors are required to follow the current ACM Proceedings Format.

Submissions will be handled through HotCRP.


Timeline

March 13, 2026 / 11:59PM AoE

Paper Submission

April 17, 2026 / 11:59PM AoE

Notification Of Outcome

April 24, 2026 / 11:59PM AoE

Camera-Ready Copy
Schedule

Program Schedule


Organization

Program Committee

Workshop Co-Chairs

Anja Gruenheid
Microsoft, Switzerland

Anupam Sanghi
IIT Hyderabad, India

Steering Committee

Carsten Binnig
TU Darmstadt, Germany

Alexander Böhm
SAP, Germany

Tilmann Rabl
HPI, Germany