

Cannot write to table with computed column #25

Closed
chenfeiw opened this issue Jul 27, 2020 · 6 comments
Assignees
Labels
enhancement New feature or request

Comments

@chenfeiw

I have a table like this:
Create table Test
(
    Id int,
    Year nvarchar(4),
    Month nvarchar(2),
    Date As (Year + '-' + Month)
)

Because Date is a computed column, my DataFrame doesn't include it, and I get the exception 'Spark Dataframe and SQL Server table have different numbers of columns'.
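To illustrate why the write fails: the connector compares the DataFrame's column count against the table's, and the computed Date column makes the counts differ. A paraphrase of that strict check in plain Python (an illustrative sketch, not the connector's actual code):

```python
def assert_same_width(df_cols, table_cols):
    """Paraphrase (not the connector's actual code) of the strict check
    that rejects a write when column counts differ."""
    if len(df_cols) != len(table_cols):
        raise ValueError(
            "Spark Dataframe and SQL Server table have different numbers of columns"
        )

# The DataFrame lacks the computed Date column, so the check fails:
try:
    assert_same_width(["Id", "Year", "Month"], ["Id", "Year", "Month", "Date"])
except ValueError as err:
    print(err)  # Spark Dataframe and SQL Server table have different numbers of columns
```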

@shivsood shivsood added the enhancement New feature or request label Jul 29, 2020
@shivsood
Collaborator

shivsood commented Jul 29, 2020

Does this work with the default JDBC connector?
Here's what needs to be done to support this:

  1. Support an option to skip the strict schema check
  2. Utilize SqlBulkCopyColumnMapping during bulk copy to map specific columns
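The mapping step above can be sketched in plain Python. `build_column_mappings` is a hypothetical helper (not part of the connector or of SqlBulkCopyColumnMapping itself) that pairs each DataFrame column's ordinal with the matching destination column, skipping computed columns so SQL Server can derive them:

```python
def build_column_mappings(df_cols, table_cols, computed_cols):
    """Hypothetical helper: build (source_ordinal, destination_name) pairs
    in the spirit of SqlBulkCopyColumnMapping, skipping computed columns."""
    return [
        (ordinal, name)
        for ordinal, name in enumerate(df_cols)
        if name in table_cols and name not in computed_cols
    ]

# For the issue's table, only Id, Year and Month are mapped; the computed
# Date column is left for SQL Server to derive:
print(build_column_mappings(
    ["Id", "Year", "Month"],          # DataFrame columns
    ["Id", "Year", "Month", "Date"],  # table columns
    ["Date"],                         # computed columns
))  # [(0, 'Id'), (1, 'Year'), (2, 'Month')]
```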

@chenfeiw
Author

Yes, it works with the default JDBC connector.

@shivsood
Collaborator

A fix would be required to support a non-strict option, and to utilize SqlBulkCopyColumnMapping during bulk copy to map specific columns.

@shivsood
Collaborator

Looks the same as #14.

@pmooij

pmooij commented Oct 7, 2020

Does this work with the default JDBC connector?
Here's what needs to be done to support this:

  1. Support an option to skip the strict schema check
  2. Utilize SqlBulkCopyColumnMapping during bulk copy to map specific columns

Indeed, these two features are needed to deal with Identity, Computed, and Default constraints.
It is quite cumbersome to do the workarounds now.

@rajmera3
Contributor

Solved in #52.
