Using the library in Databricks environment #8

Description

@MrPowers

I did a bit of experimentation, and it looks like it's tricky to use this library in a Databricks environment.

Is there any way we can provide an interface that doesn't require the user to set a configuration option?

Perhaps we could let the user run an import statement like import org.apache.spark.sql.itachi.postgres._ to get all the functions? The function registration process is still a little fuzzy to me. Let me know if you think this would be possible!
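One possible shape for this, sketched below. This is only an illustration, not the library's actual code: the package path, the `postgres` object, and the empty `functions` list are assumptions, and `sessionState.functionRegistry` is a Spark internal whose API has shifted between versions.

```scala
package org.apache.spark.sql.itachi

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.FunctionIdentifier
import org.apache.spark.sql.catalyst.analysis.FunctionRegistry.FunctionBuilder
import org.apache.spark.sql.catalyst.expressions.ExpressionInfo

// Hypothetical sketch: register the library's functions against an
// already-running session, so no spark.sql.extensions config needs to be
// set at cluster startup (the part that is awkward on Databricks).
object postgres {

  // Placeholder: in the real library this would hold the same
  // (identifier, info, builder) triples that the
  // SparkSessionExtensions-based setup injects.
  private val functions: Seq[(FunctionIdentifier, ExpressionInfo, FunctionBuilder)] =
    Seq.empty

  /** Register every function against a live session, e.g. from a notebook cell. */
  def registerFunctions(spark: SparkSession): Unit =
    functions.foreach { case (name, info, builder) =>
      spark.sessionState.functionRegistry.registerFunction(name, info, builder)
    }
}
```

One caveat on the import idea itself: a bare `import org.apache.spark.sql.itachi.postgres._` runs no side effects in Scala (objects initialize lazily, on first member access), so registration would still need an explicit trigger such as `postgres.registerFunctions(spark)` in a notebook cell. Alternatively, the object could expose Column-based wrapper functions (in the style of `org.apache.spark.sql.functions`) that need no registration at all.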
