Change splink default dbfs locations #1825

Answered by RobinL
eric6204 asked this question in Q&A
Unfortunately there isn't a way within Splink to do that. I'm not a Databricks user, so some of this is outside my expertise.

Whilst it's not supported by the released versions of Splink, if you manually edit the source code here:

checkpoint_dir = self._get_checkpoint_dir_path(spark_df)

then you could hard-code a path wherever you wanted, and presumably it should work ('presumably' because I don't really know anything about abfss and dbfs, so I'm guessing a bit).
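If editing the installed source isn't practical, the same effect can usually be had by overriding the method at runtime (monkey-patching) before running any linkage. The sketch below is hypothetical: the `Linker` class here is a stand-in written for illustration, not Splink's real linker, and the abfss path is a placeholder; only the method name `_get_checkpoint_dir_path` comes from the source line quoted above.

```python
# Hypothetical sketch: redirect the checkpoint directory by overriding the
# private helper at runtime instead of editing the installed source.
# `Linker` below is a stand-in for Splink's actual linker class.

class Linker:
    """Stand-in with the same method name as the quoted Splink source."""

    def _get_checkpoint_dir_path(self, spark_df=None):
        # Default location the library would otherwise choose.
        return "/tmp/splink_checkpoints"


# Placeholder abfss path -- substitute your own container/account.
CUSTOM_CHECKPOINT_DIR = (
    "abfss://container@account.dfs.core.windows.net/splink/checkpoints"
)


def _custom_checkpoint_dir(self, spark_df=None):
    # Ignore the default logic and always return the hard-coded location.
    return CUSTOM_CHECKPOINT_DIR


# Patch the class before any linkage work runs, so every instance
# (and every internal call site) picks up the new path.
Linker._get_checkpoint_dir_path = _custom_checkpoint_dir

linker = Linker()
print(linker._get_checkpoint_dir_path())
```

Patching the class rather than a single instance means any internal code that calls `self._get_checkpoint_dir_path(...)` picks up the override too; whether Spark can actually checkpoint to an abfss location depends on your Databricks/storage configuration, which I can't vouch for.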

Answer selected by eric6204