
Violation of PRIMARY KEY constraint 'PK_WorkloadDetails' #133

Open
regis671 opened this issue Jul 11, 2023 · 2 comments

Comments

@regis671

Hello, when analyzing an extended events trace converted into a SQLite file, I get this error:

Info - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : Summary info written (15 rows)
Warn - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : Unable to write workload analysis.
Warn - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : Violation of PRIMARY KEY constraint 'PK_WorkloadDetails'. Cannot insert duplicate key in object 'Prod_1007.WorkloadDetails'. The duplicate key value is (742302267, -4258734915548242222, 0, 0, 6, 0).
The statement has been terminated.
Error - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : Unable to write workload analysis info to the destination database.
Error - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
at System.Data.SqlClient.SqlBulkCopy.RunParser(BulkCopySimpleResultSet bulkCopyHandler)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinuedOnSuccess(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DbDataReader reader)
at WorkloadTools.Consumer.Analysis.WorkloadAnalyzer.WriteExecutionDetails(SqlConnection conn, SqlTransaction tran, Int32 current_interval_id) in C:\github\WorkloadTools\WorkloadTools\Consumer\Analysis\WorkloadAnalyzer.cs:line 760
at WorkloadTools.Consumer.Analysis.WorkloadAnalyzer.WriteToServer(DateTime intervalTime) in C:\github\WorkloadTools\WorkloadTools\Consumer\Analysis\WorkloadAnalyzer.cs:line 443
at WorkloadTools.Consumer.Analysis.WorkloadAnalyzer.CloseInterval() in C:\github\WorkloadTools\WorkloadTools\Consumer\Analysis\WorkloadAnalyzer.cs:line 103

Here is the config JSON:

{
    "Controller": {

    "Listener":
    {
        "__type": "FileWorkloadListener",
        "Source": "d:\\sqlite\\workload1.sqlite"
    },

    "Consumers":
    [
        {
            "__type": "AnalysisConsumer",
            "ConnectionInfo": 
            {
                "ServerName": "1.1.1.1",
                "DatabaseName": "replay_analysis_20G",
                "SchemaName": "Prod_1007",
                "UserName": "sa",
                "Password": "********"
            }
        }
    ]
}

}

@8trackbattlecat

I was able to resolve this duplicate key error by adding the "UploadIntervalSeconds" property to the Consumer.

Example Consumers Json snippet with the fix:

    "Consumers":
    [
        {
            "__type": "AnalysisConsumer",
            "ConnectionInfo": 
            {
                "ServerName": "1.1.1.1",
                "DatabaseName": "replay_analysis_20G",
                "SchemaName": "Prod_1007",
                "UserName": "sa",
                "Password": "********"
            },
            "UploadIntervalSeconds": 60
        }
    ]
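
For reference, merging this property into the original post's config gives something like the following (a sketch: the server, database, schema, and file path are copied from the original report, and 60 is just the example value above, not a required setting):

```json
{
    "Controller": {

        "Listener":
        {
            "__type": "FileWorkloadListener",
            "Source": "d:\\sqlite\\workload1.sqlite"
        },

        "Consumers":
        [
            {
                "__type": "AnalysisConsumer",
                "ConnectionInfo":
                {
                    "ServerName": "1.1.1.1",
                    "DatabaseName": "replay_analysis_20G",
                    "SchemaName": "Prod_1007",
                    "UserName": "sa",
                    "Password": "********"
                },
                "UploadIntervalSeconds": 60
            }
        ]
    }
}
```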

@spaghettidba
Owner

Thanks for reporting this. I'll have a look at the code.
