Over the last few years, we have seen an increase in academic and industry work on benchmarking and performance evaluation in novel cloud settings, driven both by the growing number and diversity of workloads and by new hardware such as FPGAs and GPUs. Moreover, new classes of data-driven applications (e.g., machine learning and big data scenarios) now need to be considered, while at the same time SQL and NoSQL engines alike continue to evolve and new engine concepts such as unified engines are emerging. Consequently, special testing efforts and rigor are crucial to ensure that these novel system architectures and designs retain classical database strengths such as reliability, integrity, and performance.
The goal of DBTest 2026 is to bring together researchers and practitioners from academia and industry to discuss challenges, approaches, and open problems around these issues.
Authors are invited to submit original, unpublished research papers, as well as experience papers describing the implementation of testing-related solutions in applications and products. Submissions must pertain to DBTest's topics of interest listed above and must not be under consideration for publication in any other forum.
Submissions are expected to be 4 to 6 pages long, excluding references and appendix.
Accepted research submissions will be published in the ACM DL.
Authors can also submit a talk proposal, either for previously published but relevant content or for new ideas within the scope of DBTest's topics of interest on which they want feedback from the community.
Talk proposals are limited to 1 page, including references and appendix, and must be marked with the suffix [Talk Proposal] in the submission's title.
Accepted talk proposals will be listed on the DBTest homepage.
Authors are required to follow the current ACM Proceedings Format.
Submissions will be handled through HotCRP.
Submission deadline: March 13, 2026, 11:59 PM AoE
Notification: April 17, 2026, 11:59 PM AoE
Camera-ready deadline: April 24, 2026, 11:59 PM AoE