
Error in Importing SAS Key #97

Closed
leofrederiksen opened this issue Jan 18, 2022 · 3 comments

@leofrederiksen

Hi Team,

We have set up a landscape where we use SAP's SLT system (a real-time table replication system) to replicate data into Azure Event Hub and store it in storage as Avro files for further processing.
SAP SLT tracks all changes made to tables (INSERT, UPDATE, and DELETE statements) and runs on the SAP application layer. Many of our customers are eager to get their SAP data into Azure in real time and run various analytics and ML on it afterwards.
This is actually working pretty well for all the tables we have worked on from the SAP ECC system.

However, we have an issue when using parallel jobs, either parallel jobs for a single table or parallel jobs for multiple tables.
With SLT we receive all the table changes and then generate the relevant JSON payloads, which are sent to the Event Hub.
Multiple tables are transferred into the same event hub (we normally have 200-600 tables to replicate), as there are changes to many tables in SAP all the time.
If I force SLT to use only 1 job to move table changes to the Event Hub, it works fine, but then we see a delay in processing as the system only handles 1 table at a time.
However, as soon as we define e.g. 4 parallel jobs to move changes from 50 tables to the Event Hub, we get errors when retrieving the SAS token.

The error message is: Error in Importing SAS Key

We think the issue comes from some of the programs you deliver.
We are sending data using method send of zcl_adf_service_eventhub:
"Sending Converted SAP data to Azure Eventhub
CALL METHOD lr_eventhub->send
EXPORTING
request = <l_json>-json "Input XSTRING of SAP Business Event data
"request = lv_json_xstring "Input XSTRING of SAP Business Event data
it_headers = lt_response_headers "Header attributes
IMPORTING
response = lv_response "Response from EventHub
ev_http_status = lv_http_status. "Status
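For context, this is roughly how we obtain the client instance beforehand. A minimal sketch assuming the SDK's standard factory call; the interface ID and business identifier shown are placeholders, not our real configuration:

* Sketch: obtaining the Event Hub client via the SDK factory
* (interface ID 'EVENTHUB' and business identifier are placeholders)
DATA: lr_service  TYPE REF TO zcl_adf_service,
      lr_eventhub TYPE REF TO zcl_adf_service_eventhub.

lr_service = zcl_adf_service_factory=>create(
               iv_interface_id        = 'EVENTHUB'
               iv_business_identifier = 'DEMO_EVENTS' ).
lr_eventhub ?= lr_service.   " downcast to the Event Hub service class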

Here it tries to get the token (CLASS ZCL_ADF_SERVICE_EVENTHUB, METHOD send):

TRY.
    get_sas_token( EXPORTING iv_baseaddress = gv_uri
                   RECEIVING rv_sas_token   = lv_sas_token ).
  CATCH zcx_adf_service INTO lcx_adf_service.
    lv_msg = lcx_adf_service->get_text( ).
    MESSAGE lv_msg TYPE 'I'.
ENDTRY.

The first step is to decode the sign (CLASS ZCL_ADF_SERVICE_EVENTHUB, METHOD get_sas_token; abridged excerpt):

DEFINE encrypt_key.
  decode_sign( RECEIVING rv_secret = lv_sas_key ).
* ... (excerpt truncated)

It gets the connection encryption from memory (CLASS ZCL_ADF_SERVICE, METHOD decode_sign):

DEFINE decode_key.
* Import internal table as a cluster from the INDX-type table
  IMPORT tab = lt_enveloped_data[]
    FROM DATABASE zadf_con_indx(zd)
    TO lw_indx
    ID lv_srtfd.

  IF NOT lt_enveloped_data[] IS INITIAL.
*   ... decryption proceeds normally ...
  ELSE.
*   Raise exception: Error in Importing SAS Key
    RAISE EXCEPTION TYPE zcx_adf_service
      EXPORTING
        textid       = zcx_adf_service=>error_import_sas_key
        interface_id = gv_interface_id.
  ENDIF.

So the issue seems to be that lt_enveloped_data[] has not been retrieved from memory.
Either it is being updated from another process with an incompatible memory area and then cannot be found, or it is a timing issue where the entry is momentarily being updated.
When running a table in 4 parallel processes, we sometimes get very far into the process before it fails, e.g. 750 portions with 3 failed portions, with this error appearing after around 500 packages.
I guess this suggests it is some sort of timing issue.
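As a stop-gap we are considering simply retrying the send when this exception is raised. This is only a hypothetical sketch we have not tested, and it assumes the token import failure is transient:

* Hypothetical, untested retry wrapper around send
DO 3 TIMES.
  TRY.
      lr_eventhub->send(
        EXPORTING
          request    = <l_json>-json
          it_headers = lt_response_headers
        IMPORTING
          response       = lv_response
          ev_http_status = lv_http_status ).
      EXIT.                                " success, leave the retry loop
    CATCH zcx_adf_service INTO lcx_adf_service.
      IF sy-index >= 3.
        RAISE EXCEPTION lcx_adf_service.   " give up after 3 attempts
      ENDIF.
*     Note: WAIT triggers an implicit database commit
      WAIT UP TO 1 SECONDS.
  ENDTRY.
ENDDO.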

Do you know how we might be able to work around this issue?

I hope this is the correct location to raise this issue.

Thanks a lot for the great tools we can use to move SAP data to Azure!

@vikasbansal2022
Contributor

Hi @leofrederiksen
The index table ZADF_CON_INDX is filled with the encrypted key when the entry in table ZADF_CONFIG is maintained, via TMG (Table Maintenance Generator) events, based on the interface ID.
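To verify that the encrypted key cluster was actually written for your interface, a quick check like the following should work. This is only an illustrative sketch: the RELID 'ZD' is taken from the IMPORT statement you quoted, and I am assuming SRTFD holds the interface ID:

* Illustrative check: does the key cluster exist for the interface?
DATA lv_count TYPE i.

SELECT COUNT(*) FROM zadf_con_indx
  INTO lv_count
  WHERE relid = 'ZD'
    AND srtfd = lv_interface_id.   " lv_interface_id: your interface ID

IF lv_count = 0.
* No cluster entry: re-maintain the row in ZADF_CONFIG so the
* TMG events write the encrypted key again.
ENDIF.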

Please let us know if you are still facing this problem; we can have a look at it.

Regards,
Vikas

@leofrederiksen
Author

Hi,
The problem we are facing occurs when running parallel jobs while replicating data, as described in the original post above: a single job works fine, but with e.g. 4 parallel jobs moving changes from 50 tables we get the SAS token errors.


We have not been able to find a solution for this, as the SDK tries to keep the key in memory/in a buffer.
Regards,
Leo

@vikasbansal2022
Contributor

vikasbansal2022 commented Nov 28, 2022

Hi @leofrederiksen

Thanks for the response.
Please check table ZADF_CONFIG: the 'Re-Process' flag should be blank. If you still face the issue, please share your email ID; we can have a call to understand this issue.
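If useful, the flag can also be checked programmatically. An illustrative sketch only; the field names INTERFACE_ID and REPROCESS in ZADF_CONFIG are assumptions, so please verify them in SE11:

* Illustrative check of the Re-Process flag
* (field names INTERFACE_ID / REPROCESS are assumptions)
DATA lv_reprocess TYPE char1.

SELECT SINGLE reprocess FROM zadf_config
  INTO lv_reprocess
  WHERE interface_id = lv_interface_id.

IF lv_reprocess IS NOT INITIAL.
* Flag is set: clear it in the maintenance view before retrying.
ENDIF.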

[Screenshot: ZADF_CONFIG entry with the 'Re-Process' flag]

Regards,
Vikas
