Prepare for release 1.1.0, with Databricks support #441
Merged
janblom merged 75 commits into OHDSI:master on Oct 5, 2025
Conversation
…ense validation config
… version for Java 1.8
The project now requires Java 17 to build, but should still produce Java 8 (1.8)-compatible artifacts.
License compliance
Bumps org.apache.avro:avro from 1.11.2 to 1.11.3.

updated-dependencies:
- dependency-name: org.apache.avro:avro
  dependency-type: direct:production

Signed-off-by: dependabot[bot] <[email protected]>
…apache.avro-avro-1.11.3 Bump org.apache.avro:avro from 1.11.2 to 1.11.3 in /rabbit-core
Distribution integration test
Without this change, the table panel height is always greater than needed when using a stem table, because the stem table is counted as one of the items in the components list. It is, however, shown separately at the top, which is already accounted for by the stem table margin.
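The fix can be illustrated with a small sketch. All class, method, and parameter names here are invented for illustration and do not appear in the WhiteRabbit source:

```java
// Hypothetical sketch of the height fix described above (names invented):
// a stem table shown separately at the top must not be counted among the
// regular items when computing the panel height.
public class TablePanelLayout {
    public static int panelHeight(int componentCount, boolean hasStemTable,
                                  int itemHeight, int stemTableMargin) {
        // Exclude the stem table from the per-item calculation; its space
        // is already reserved by the stem table margin.
        int regularItems = hasStemTable ? componentCount - 1 : componentCount;
        return regularItems * itemHeight + (hasStemTable ? stemTableMargin : 0);
    }
}
```

With five components of which one is a stem table, the panel is sized for four items plus the margin, instead of five items plus the margin as before.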
* Refactor RichConnection into separate classes, and add an abstraction for the JDBC connection. Implement a Snowflake connection with this abstraction
* Add unit tests for SnowflakeConnector
* Added Snowflake support for SourceDataScan; added a minimal test for it; some refactorings to move database responsibility to rabbit-core/databases
* Move more database details to rabbit-core/databases
* Clearer name for method
* Ignore snowflake.env
* Create PostgreSQL container in the TestContainers way
* Refactored Snowflake tests + a bit of documentation
* Fix Snowflake test for Java 17, and make it into an automated integration test instead of a unit test
* Remove duplicate PostgreSQL test
* Make TestContainers-based database tests into automated integration tests
* Suppress some warnings when generating fat jars
* Let automatic integration tests fail when Docker is not available
* Allow explicit skipping of Snowflake integration tests
* Added tests for Snowflake, delimited text files
* Switch to fully verifying the scan results against a reference version (v0.10.7)
* Working integration test for Snowflake, and some refactorings
* Some proper logging, small code improvements and cleanup
* Remove unused interface
* Added tests, some changes to support testing
* Make automated test work reliably (way too many changes, sorry)
* Rudimentary support for Snowflake authenticator parameter (untested)
* Review xmlbeans dependencies, remove conflict
* Extend integration test for distribution
* Restructuring database configuration, parts 1/x through 7/x. Still work in progress at each step, but unit and integration tests all OK
* Intermezzo: get rid of the package naming error (upper case R in whiteRabbit)
* Intermezzo: code cleanup
* Snowflake is now working from the GUI. Also many small refactorings, like logging instead of printing to stdout/stderr
* Refactor DbType into an enum, get rid of DBChoice
* Move DbType and DbSettings classes into the configuration subpackage
* Avoid using a manually destructured DbSettings object when creating a RichConnection object
* Code cleanup, remove unneeded Snowflake references
* Refactoring, code cleanup
* More refactoring, code cleanup
* More refactoring, code cleanup and documentation
* Make sure that the order of databases in the pick list in the GUI is the same as before, and enforce completeness of that list in a test
* Add/update copyright headers
* Add a line to verify that a tooltip is shown for a DBConnectionInterface-implementing class
* Test distribution for Snowflake JDBC issue with Java 17
* Cleanup of build files
* Add verification that all JDBC drivers are in the distributed package
* Add/improve error reporting for Snowflake
* Disable the screenshot taker in GuiTestExtension, hoping that that is what blocks the build on GitHub. Fingers crossed
* Better(?) naming for database interface and implementing class
* Use our own GUITestExtension class

Co-authored-by: Jan Blom <[email protected]>
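The "Refactor DbType into an enum" commit points at centralising per-database details in one place. A minimal sketch of that idea, with invented enum values and standard JDBC URL templates; the real DbType in rabbit-core certainly carries more than this (drivers, quoting rules, and so on):

```java
// Hypothetical sketch: an enum carrying per-database JDBC URL templates,
// in the spirit of the DbType refactoring above. Values and templates are
// illustrative, not taken from the project.
public enum DbType {
    POSTGRESQL("jdbc:postgresql://%s:%d/%s"),
    MYSQL("jdbc:mysql://%s:%d/%s");

    private final String urlTemplate;

    DbType(String urlTemplate) {
        this.urlTemplate = urlTemplate;
    }

    public String jdbcUrl(String host, int port, String database) {
        return String.format(urlTemplate, host, port, database);
    }
}
```

The design win of an enum over string-typed database choices (the removed DBChoice) is that the compiler can enforce completeness, which is exactly what the "enforce completeness of that list in a test" commit does for the GUI pick list.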
Update stem table image
Fix image crop when using stem table
* Fixed a bug in the comparison for sort; let the comparison report all differences before failing
* Allow the user to specify the port for a MySQL server
* Add tests for a MySQL source database
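Letting the user specify the port usually means accepting either `host` or `host:port` in the server field. A hedged sketch of that parsing, with MySQL's standard default port; the class and field names are invented, not WhiteRabbit's:

```java
// Hypothetical sketch of host[:port] parsing with MySQL's default port 3306.
public class MySqlServer {
    public static final int DEFAULT_PORT = 3306;
    public final String host;
    public final int port;

    public MySqlServer(String server) {
        int colon = server.lastIndexOf(':');
        if (colon >= 0) {
            // Explicit port given after the last colon
            host = server.substring(0, colon);
            port = Integer.parseInt(server.substring(colon + 1));
        } else {
            host = server;
            port = DEFAULT_PORT;
        }
    }
}
```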
Since the documentation won't let me find out if it is possible.
…tch case is implemented
…is hidden and added a test for hiding tables
…erent separators.
…test into the existing test for RabbitInAHat
hiding tables - Issue 274
Bumps org.apache.avro:avro from 1.11.3 to 1.11.4.

updated-dependencies:
- dependency-name: org.apache.avro:avro
  dependency-type: direct:production

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
#60)
* All versions now in parent pom (dependencyManagement section), all minor upgrades done, mvn verify passes including Snowflake
* Add log4j-api, hoping to fix build problems in CI
* Force protobuf dependency to 3.25.5
* Add BSD-3-Clause to accepted licenses
* Added manual config for dependency-check, with optional suppression(s), currently jackson-databind
* Add documentation on checking vulnerabilities and allowed licences
* Clarified vulnerabilities checking
* Added info on checking dependency versions
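Pinning all versions in the parent pom's dependencyManagement section looks roughly like the fragment below. This is an illustrative sketch following standard Maven conventions, not a copy of this repository's pom; only the forced protobuf version is taken from the commit message above:

```xml
<!-- Illustrative parent-pom fragment; structure is the standard Maven
     dependencyManagement convention, not this repo's actual pom -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>3.25.5</version> <!-- forced version, per the commit above -->
    </dependency>
  </dependencies>
</dependencyManagement>
```

Child modules then declare the dependency without a version and inherit the managed one, which is what makes a single-point version bump (for security fixes like this one) possible.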
Issue OHDSI#324 - RiaH export highlight required and unmapped fields
Issue 254
Issue 313
Added a sentence about required fields to documentation
* Fix postgres tests: use newer version of TestContainers for postgres
* First steps with Databricks. No configuration yet, so clearly a W.I.P.
* Add configuration handling to DatabricksHandler
* Working integration test based on ini file (command line usage of WhiteRabbit)
* Exclude old log4j dependency
* Use catalog and schema in table specification. Integration test for command line now OK.
* Add browser flow for Databricks authentication
* Add GUI-based scan test for Databricks
* Rename 2 base classes (now both abstract) to clarify their intent
* Add some documentation here and there
* Move Databricks tests from the maven phase to the phase
* Add a remark about manually testing browser flow authentication
* Scripts, data and additional documentation for the Databricks test data
* Attempt to fix build on GitHub
* Test if latest version of maven-resources-plugin also works
* Add ini file example for Databricks
* Ignore python version file
* Databricks: add LIMIT x to TABLESAMPLE statement; ignore commented column names
* Documentation for Databricks
* Dependency updates
* Disable distribution verification (almost obsolete anyway, as we will move to Java 17 as the minimum version)
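The "add LIMIT x to TABLESAMPLE statement" and "use catalog and schema in table specification" commits suggest a sampling query along the lines below. This is a hypothetical sketch: the class and method names are invented, and the exact SQL WhiteRabbit emits may differ, though `TABLESAMPLE (n ROWS)` is valid Databricks SQL:

```java
// Hypothetical sketch of the sampling query described in the commits above:
// TABLESAMPLE (n ROWS) plus an explicit LIMIT, on a fully qualified
// catalog.schema.table name. Names invented for illustration.
public class DatabricksSampler {
    public static String sampleQuery(String catalog, String schema, String table, int rows) {
        return String.format(
                "SELECT * FROM %s.%s.%s TABLESAMPLE (%d ROWS) LIMIT %d",
                catalog, schema, table, rows, rows);
    }
}
```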
….0 to 1.28.0 (OHDSI#438) (#63)
* fix: rabbit-core/pom.xml to reduce vulnerabilities. The following vulnerabilities are fixed with an upgrade: https://snyk.io/vuln/SNYK-JAVA-ORGAPACHECOMMONS-10734078
* Resolve dependencies on old version of commons-io

Co-authored-by: snyk-bot <[email protected]>
No description provided.