User Guide: Mojaloop Testing Toolkit

Table of Contents

  1. At first glance
  2. Welcome Page
  3. Monitoring
  4. Rules
    4.1 Sync Response Rules
    4.2 Validation Rules (Error Callbacks)
    4.3 Callback Rules (Success Callbacks)
    4.4 Building your own Rules File
    4.5 Inbound Scripting
  5. Outbound Request
    5.1 Collection Manager
    5.2 Import Environment
    5.3 Test Cases
      5.3.1 Request
      5.3.2 Editor
      5.3.3 Scripts
      5.3.4 Tests
    5.4 Download Report
    5.5 New Template
    5.6 Show Template
    5.7 Save
    5.8 Send
    5.9 Download Testcase Definition
  6. Settings

1 At first glance

When you open the Mojaloop Testing Toolkit in your browser, you will be welcomed by the Dashboard display. Currently this is still under development and only displays static data. The static display provides a fair representation of the intended Dashboard functionality.

Opening view

Take note of the navigation bar on the left. The navigation items are:

  • Welcome Page
  • Monitoring
  • Sync Response Rules
  • Validation Rules (Error Callbacks)
  • Callback Rules (Success Callbacks)
  • Outbound Request
  • Settings

We will work through each one of the items and provide you with a fair understanding of the current functionality.

2 Welcome Page

The Welcome page is the default opening window.

Opening view

3 Monitoring

The Monitoring navigation tab allows you to monitor incoming and outgoing requests to / from the Testing Toolkit.

Monitoring Initial State

If you followed the Docker installation instructions in the README document, you should have the Mojaloop Simulator UI open in a browser tab. In the Mojaloop Simulator UI, go to the navigation bar on the left and click on the Outbound Send tab.

Press the Send Transfer button on the main window to send sample test data from the Mojaloop Simulator UI to the Mojaloop Testing Toolkit.

Send Transfer

You should receive a response on the Mojaloop Simulator UI as indicated below. Note that the variable data is generated randomly, so it may differ from the information displayed in the images provided below.

Simulator response

Go back to the Mojaloop Testing Toolkit UI in your browser and select the Monitoring tab from the navigation bar on the left. You will notice the three operations associated with the transfer sent above. The most recent request and its associated operations appear at the top of the main page, as can be verified by the date/time stamp on each operation.

  • GET /parties/{Type}/{ID}
  • POST /quotes
  • POST /transfers

Monitoring messages

To view detailed information on any one of the callbacks, click on the operation - in this example GET /parties/MSISDN/0001, presented as a blue button.

If you recall the Architecture Diagram mentioned earlier, under messages you'll notice Version negotiation, Schema Validation and Additional validations.

Click on the + to the left of an item in the expanded list on the main window to view detail related to that specific timeline message.

Expanded monitoring messages

As an example for this document, the messages in the image above have been expanded to provide better insight.

  • Request: get/parties/MSISDN/000111
    • This contains the body of the request
  • Version negotiation succeeded, picked up the version 1.0
    • Confirms the API version that was used
  • Callback rules are matched
    • The callback rule used in this process. This can be customized, and will be covered later in this document
  • Received callback response 200 OK
    • The HTTP response for the previous step, "Sending callback put /parties/MSISDN/000111"

When you send more transfers from the Mojaloop Simulator UI, these transactions will be added to the monitoring view.

Additional transfers

4 Rules

4.1 Sync Response Rules

Validation and synchronous responses based on the API schema

The Sync Response Rules navigation tab on the left of the Mojaloop Testing Toolkit allows you to set up fixed or mock responses. Take note of the default.json file in the right-hand window. It contains the list of operations and sample content for mock or fixed responses for the operations listed in the center window. These can be tested by running the collection in Postman (import the collection and environment files into the Postman testing tool).

Opening  Sync Response Rules

Below is a sample MOCK_RESPONSE

Sample - Mock Response

Below is a sample FIXED_RESPONSE

Sample - Fixed Response

4.2 Validation Rules

Error Callbacks

This Rules section is to simulate asynchronous error callbacks.

The setup and functionality are similar to 4.1 Sync Response Rules, with the ability to create your own rules file, as explained in 4.4 Building your own Rules File.

Validation Rules Screen

4.3 Callback Rules

Success Callbacks

This Rules section is to simulate asynchronous success callbacks.

The same applies for this section: the functionality is similar to 4.1 Sync Response Rules, with the ability to create your own rules file, as explained in 4.4 Building your own Rules File.

Callback Rules Screen

4.4 Building your own Rules File

The toolkit allows you to create your own file with a collection of rules to suit your specific testing requirements. A rule can be created entirely in the web UI, but for advanced users and a comprehensive insight, please see the Rules Engine document for more information related to the Rules Engine.

The following section will provide an overview into building your own Rules file.

In the left window, click the New Rules File button in the top left and provide an appropriate file name. Remember to save it with a .json extension. Click on the check mark to create/save the file.

Creating New Rule File

Click on the Add a new Rule button at the top of the center window. The Rule Builder window will pop up.

Building New Rules File

Firstly, select your desired API from the drop-down list provided. All of the supported APIs are included in this list.

Select API

Next, select the operationId you require for the new rule in the Resource dropdown list. All the operationIds are listed in the dropdown; click on the down arrow to display the selection. The list is as per the Swagger definition of the selected API.

Resource selection

Add Condition Button

You will be presented with 4 boxes, each with a dropdown list to select from:

  • Fact Type
  • Fact
  • Operation
  • Value

For each of the above items, click on the down arrow in the box to list the available options. These options are as per the Swagger definition for the API selected above.

Sample Condition

You can use a configurable parameter in the Value field. Select one by clicking on the Add Configuration Params button.

Sample Condition add configurable params

The next step is to select the EVENT Type below. Click on the down arrow in the box to list the options, then select either Mock Response or Fixed Response.

Event Response Options

For normal use cases you can select Mock Response, where the testing toolkit generates mock data based on the schema in the swagger file. By selecting Mock Response, you have the option to override the standard mock response by ticking the Override some parameters box. This opens the Headers section, with the options Add Header, Add Required Header and Add All Headers.

Header Selection

A sample body can also be created. The data is generated from the swagger definition pertaining to the specified operationId. The sample data has no real values, but it can be edited in the editing console to be more meaningful for testing purposes.

Click on the Populate with sample body button. The editing window will be filled with the sample data for the body of the selected Resource operationId.

Populate with sample body

Update the sample body if so required.

Updated sample body data

The body of the response can be configured. Click on Add Configurable Params. The Select a Configurable Parameter window will pop up with a list of valid selections. Once completed, the configured parameter, for example {$request.body.state}, is copied to the clipboard by clicking on it. It can then be used in the response body, as can be seen in the sample rules image below.

Using Configurable Parameter

Once completed, save the newly created rule by selecting the Save button at the top of the Rule Builder window.

This will provide a summarized view of the newly created rule, similar to the samples provided.

Summarized view of Rule

Going forward, the option exists to update the rule or delete it by selecting the appropriate button - Edit or Delete.

Lastly, set the new rule file as active by selecting it in the right-hand window and clicking the Set as active button. You will see a check mark next to it to indicate it is active.

You have the option to include one or more CONDITIONS for the new rule. Select the Add Condition button should you wish to add an additional condition to your rule.
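Internally, a rules file is a JSON array of rule objects. The authoritative schema is described in the Rules Engine document; the sketch below only illustrates the overall shape a rule built in the Rule Builder takes, and the field names and values shown are illustrative assumptions, not a definitive specification:

```json
[
  {
    "ruleId": 1,
    "priority": 1,
    "description": "Illustrative rule: fixed response for get /parties",
    "conditions": {
      "all": [
        {
          "fact": "operationPath",
          "operator": "equal",
          "value": "/parties/{Type}/{ID}"
        }
      ]
    },
    "event": {
      "type": "FIXED_RESPONSE",
      "params": {
        "statusCode": 200
      }
    }
  }
]
```

Each CONDITION you add in the UI becomes an entry under conditions, and the EVENT Type selection determines the event section.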

4.5 Inbound Scripting

You have the option to enable or disable inbound request scripts. You can use the Scripts tab to set DFSP positions and do validations later. In the given example we are updating ttkdfspTotalAmount if the transfer request is successful for the testingtoolkitdfsp DFSP. You could execute p2p-happy-path with fromFspId equal to testingtoolkitdfsp and go back to the Environment tab to see the total amount.

Inbound Requests Scripts

In the Environment tab you can observe the current environment state.

Inbound Requests Environment
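A minimal sketch of what such an inbound script could look like, assuming the JavaScript script format described below. The request shape and the stubbed request/environment objects are illustrative assumptions added so the sketch is self-contained; inside the toolkit these variables are injected for you:

```javascript
// Stand-ins for the `request` and `environment` variables the toolkit
// injects into inbound scripts (stubbed here so the sketch runs standalone)
const environment = { ttkdfspTotalAmount: 0 }
const request = {
  body: {
    payerFsp: 'testingtoolkitdfsp',            // illustrative payer DFSP
    amount: { amount: '100', currency: 'USD' } // illustrative transfer amount
  }
}

// Accumulate the total transferred amount for the testingtoolkitdfsp DFSP
if (request.body.payerFsp === 'testingtoolkitdfsp') {
  environment.ttkdfspTotalAmount =
    Number(environment.ttkdfspTotalAmount) + Number(request.body.amount.amount)
}

console.log(environment.ttkdfspTotalAmount) // → 100
```

After the script runs, the updated total would be visible as ttkdfspTotalAmount in the Environment tab.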

You can write scripts in two formats.

  • Postman Script:

    If you select the postman script option, you can use the same functions, like pm.sendRequest, as in Postman. (The Postman script format in the Testing Toolkit is deprecated and not encouraged; please use the JavaScript format.)

  • Java Script:

    If you want advanced features and flexibility, you can select the javascript option. This option enables you to write scripts in JavaScript and use the following functions.

    • console.log - function

    • request - variable

    • environment - variable

    • axios - library

      With the axios library, you can use various functions like axios.get, axios.post, etc. Please note these functions are async and you may need to use await before the call.

      const resp = await axios.get('http://someurl')
      

      You can find the axios documentation at https://github.com/axios/axios#axios-api

5 Outbound Request

This section enables you to initiate requests from the testing toolkit to your DFSP / HUB implementations.

You can create a collection of operations and add a number of assertions to these operations. The assertions can be set up and customized to support your testing requirements and to verify both positive and negative requests and responses.

When you select the Outbound Request navigation tab on the left side, the following opens in the main display window.

Outbound display opening

At the top of the screen, you will notice the following buttons on the main window, starting from the left.

  • Collections Manager
  • Load Sample
  • Show Current Template
  • Iteration Runner
  • Send

You can see two tabs 'Test Cases' and 'Input Values'.

Template Window

5.1 Collection Manager

Selecting the Collection Manager button opens a drawer on the left containing a number of file operations that allow you to import, export and modify a collection. For your exploration, sample collections are available in the /examples/collections sub-directory under the project root directory. To select one of the sample files, in the file explorer window that pops up when you select the Import File button, navigate to /examples/collections/dfsp under the project root directory. Select the p2p_happy_path.json file to import it into the Mojaloop Testing Toolkit application. Select the file in the collection manager and observe that the test cases are loaded in the main screen. You can add more test cases by clicking on the Add Test Case button.

Collection Manager

P2P Transfer Happy Path

  • get/parties/{Type}/{ID} - Get party information

  • post/quotes - Get quote

  • post/transfers - Send transfer

Opened Imported Template

5.2 Import Environment

Selecting the Import Environment button in the Input Values tab allows you to import input values. For your exploration, sample environments are available in the /examples/environments sub-directory under the project root directory. To select one of the sample files, in the file explorer window that pops up when you select the Import Environment button, navigate to /examples/environments under the project root directory. Select the dfsp_local_environment.json file to import it into the Mojaloop Testing Toolkit application. This sample file consists of a couple of input values.

It is possible to update the Input Values in the right-hand window. Additional input values can be added by selecting the Add Input Value button at the top right of this window. These values relate to the specified variables that you can select and use in any request that you create.

Add Additional Input Values

The process is straightforward. Provide a name for the new input value and click on the Add button.

Add New Input Variable

To update the values of existing variables in the Input Values, simply select the value next to the variable and type in a new value. It will then be available for use in the test cases imported earlier.

Add New Input Value
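An environment file like dfsp_local_environment.json is a JSON document holding the input values. As a rough illustration of the shape such a file takes (the specific keys and values below are assumptions for this sketch, not the contents of the sample file):

```json
{
  "inputValues": {
    "currency": "USD",
    "fromIdValue": "44123456789",
    "toIdValue": "27713803912"
  }
}
```

Each key under inputValues becomes a variable you can reference in test cases, for example as {$inputs.currency}.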

5.3 Test Cases

Click on the Edit button to open the Test Case in edit mode.

Test-Case Editor

5.3.1 Request

The Request tab reflects both the appropriate Header and Body that make up the request as per the selected API swagger specification. These values can be changed in the Editor tab.

Sample Request

5.3.2 Editor

The Editor tab displays the request content, which can be updated manually in this window. Depending on how the request was defined, you will be able to see the selected API, the operationId, and the Header and Body content. Options to Duplicate, Delete or Rename are also available as buttons above. There are some additional options like Override with Custom URL or Ignore Callbacks. You can also build your own new request by selecting the Add New Request button at the top right of this window. The process to build a new request is similar to the one explained in 4.1 Sync Response Rules.

Sample Scripts

5.3.3 Scripts

The Scripts tab allows you to use Postman-like pre-request and test scripts. Make sure that the advanced features option is enabled.

You can write scripts in two formats.

  • Postman Script:

    If you select the postman script option, you can use the same functions, like pm.sendRequest, as in Postman. This option is useful when you want to convert your existing Postman tests to the testing toolkit format.

    • pm.test - not supported - use Testing Toolkit Tests for this purpose. In Tests you can use values stored in the environment; to access those values use environment.'key'
    • pm.response - to get the response object outside pm.sendRequest, use pm.response.body, not pm.response.json()
    • everything else should work the same way as in Postman
  • Java Script:

    If you want advanced features and flexibility, you can select the javascript option. This option enables you to write scripts in JavaScript and use the following functions.

    • console.log - function

    • response - variable

    • environment - variable

    • axios - library

      With the axios library, you can use various functions like axios.get, axios.post, etc. Please note these functions are async and you may need to use await before the call.

      const resp = await axios.get('http://someurl')
      

      You can find the axios documentation at https://github.com/axios/axios#axios-api

    • websocket - library

      With websocket library, you can connect to a websocket server and get the first message from the server.

      Functions supported:

      • websocket.connect - To connect to a websocket URL and listen for messages
      • websocket.getMessage - To get the message that arrived. This function can also wait some time for the message. The session will be disconnected automatically after returning the message
      • websocket.disconnect - To disconnect a particular session
      • websocket.disconnectAll - To disconnect all the sessions

      This can be used to assert on the payee-side data from the sdk-scheme-adapter in test cases. You may need to enable websocket capabilities in the sdk-scheme-adapter.

      Example:

      In Pre-request

      await websocket.connect('ws://localhost:4002/requests/{$inputs.toIdValue}', 'payeeRequest')
      

      In Post-request

      environment.payeeRequest = await websocket.getMessage('payeeRequest')
      

      Then you can use the above environment variable in assertions.

      environment.payeeRequest.headers['content-type']
      
    • inboundEvent - library

      With the inboundEvent library, you can listen for an inbound request to the testing toolkit.

      Functions supported:

      • inboundEvent.addListener(clientName, method, path, [conditionFn], [timeout])

        To start listening for inbound messages to the TTK

        Parameters:

        • clientName - Client name to be referred later in postrequest script

          Type: [String]

        • method - Http method to match

          Type: [String]

        • path - Http path to match

          Type: [String]

        • conditionFn (optional) - An optional function to call for additional matching logic

          Type: [Function]

          Parameters passed: (request.headers, request.body)

          Return Value: [Boolean]

        • timeout - Time in ms to wait for the inbound request

          Type: [integer]

        Example usage:

        await inboundEvent.addListener('quote1', 'post', '/quotes', (headers, body) => {
          return body.quoteId === '<SOME_ID_HERE>'
        })
      • inboundEvent.getMessage(clientName, [timeout])

        To get the message that arrived. This function can also wait some time for the message. The session will be disconnected automatically after returning the message

        Parameters:

        • clientName - Client name to get the message from. The name should match with the name provided in the addListener call.

          Type: [String]

        • timeout - Time in ms to wait for the inbound request

          Type: [integer]

        Example usage:

        await inboundEvent.getMessage('quote1')
    • custom.setRequestTimeout - To set a specific timeout value for the request in milliseconds (Default: 3000ms)

    • custom.sleep - To wait for a particular number of milliseconds

    • custom.jws - library

      With custom.jws library, you can sign and validate an FSPIOP request using JWS

      Functions supported:

      • custom.jws.signRequest(<PRIVATE_KEY>) - To sign the outgoing request using the private key
      • custom.jws.validateCallback(<callback.headers>, <callback.body>, <PUBLIC_CERTIFICATE>) - To validate the incoming callback using public certificate. This will validate protected headers too.
      • custom.jws.validateCallbackProtectedHeaders(<callback.headers>) - To validate only protected headers in the FSPIOP-Signature header
    • custom.skipRequest - function

      By using this function in the pre-request script, you can skip the current request including post-request-scripts and assertions. You can see the request and assertions as skipped in the UI and in the report as well.

    • custom.pushMessage(message, [sessionID])

      By using this function in the scripts in rules for inbound requests, you can push a websocket message to the clients listening on the websocket server of TTK on the topic 'pushMessage'. There is an optional sessionID that can be passed as the second argument to this function.

      Parameters:

      • message - The message object to emit to the clients.

        Type: [Object]

      • sessionID - Optional sessionID to send the message to targeted clients who are listening on the topic 'pushMessage/'

        Type: [string]

      Example usage:

      await custom.pushMessage({ name: 'Sample Name' })
      await custom.pushMessage({ name: 'Sample Name' }, 'client1')
    • custom.appendRequestBody(requestBody)

      By using this function in the outbound pre-request script, we can mutate the request and override some values in the request.

      Parameters:

      • requestBody - The request body to override.

        Type: [Object]

      Example usage:

      await custom.appendRequestBody({ sampleKey: 'Sample Value' })
    • custom.appendEventBody(eventBody)

      By using this function in the inbound script, we can modify the event body specified in the rule.

      Parameters:

      • eventBody - The event body to override.

        Type: [Object]

      Example usage:

      await custom.appendEventBody({ sampleKey: 'Sample Value' })

Sample Pre Request and Post Request Scripts

After executing the test case you will also see the Console Log and Environment State.

  • Console Log

Sample Scripts - Console Log

  • Environment State

Sample Scripts - Environment State

5.3.4 Tests

The Tests tab contains the assertions that were set up for the specified operationId. These are similar to Postman tests and will evaluate the request and/or the response based on the requirements of the assertion. Below is an assertion from the sample template imported earlier; it validates the callback and expects response.status to equal 202.

Sample Test Assertion
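As a rough illustration of what such an assertion line checks, the sketch below stubs a minimal expect-style helper so the check can run outside the toolkit. The stub and the response object are assumptions made for this example, not the toolkit's actual implementation:

```javascript
// Minimal stand-in for the expect-style assertion syntax (illustrative stub)
const expect = (actual) => ({
  to: {
    equal: (expected) => {
      if (actual !== expected) throw new Error(`expected ${expected}, got ${actual}`)
    }
  }
})

// Illustrative callback response, as it would be received by the toolkit
const response = { status: 202 }

// The assertion as it would appear in the Tests tab
expect(response.status).to.equal(202)
console.log('assertion passed')
```

Inside the toolkit you only write the assertion line itself; the surrounding helper and response object are provided by the test runner.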

To create a new assertion, you can either add to an existing assertion or set up a new one. Apart from naming the new assertion, the rest of the steps are the same in both cases. We will only cover the basic setup in this document. The options to Rename or Delete the assertion are also available.

To create a new assertion, select the Add New Assertion button on the top right. Provide an appropriate name and click on the Add button.

Add New Assertion

The new assertion will be available at the bottom of the existing assertions for that operation.

Navigate to the newly created assertion, and click on the arrow on the left to expand the dropdown list.

New Empty Assertion

Include a new expectation by selecting the Add Expectation button at the bottom left. This launches the Expectation window. Selecting the first box gives you the option to assess either a synchronous response or a callback.

Assess Request or Response

We have opted for the Response. Next select the field to be assessed - Status was selected for this demo.

Assess Response Status

Select the comparison operator from the middle box. We have opted for Not Equal to.

Assess Response Equation

Add the required value in the last box and click on the Save button. Congratulations - you have successfully created an assertion.

Assess Response Equation Save

It is also possible to compare values from the Input Values or parameters from the request. Select the Configurable Parameter button at the bottom right to launch the Configurable Parameter window. Click on the dropdown box to display the list of possible options. We have selected Input Values for this demo.

Configurable Parameter

Click on the Input Values box below to display the dropdown list. Select one of the options listed. We have chosen currency for this demo.

Configurable Parameter Currency

The Configurable Parameter {$inputs.currency} is now available for use with the option to Copy to clipboard or Insert into editor.

To get a better understanding of how this works, please refer to the example below, which is part of the samples provided. This assertion contains 2 expectations. You will notice it is possible to refer to values from a previous operationId ($prev.2.callback.body.*) and compare them to values from the current operationId ($request.body.*). Due to the technical nature of these assertions, they will be explained in detail in a separate document.
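To make the $prev / $request referencing concrete, the following self-contained sketch mimics such a comparison with plain JavaScript objects. The data shapes and values are illustrative assumptions, not the toolkit's internal representation:

```javascript
// Illustrative stand-ins for the values an assertion can reference:
// `prev` holds results of earlier requests in the test case, keyed by sequence number
const prev = {
  2: { callback: { body: { transferAmount: { amount: '100', currency: 'USD' } } } }
}
// `request` stands for the current operation's outgoing request
const request = {
  body: { amount: { amount: '100', currency: 'USD' } }
}

// Equivalent of comparing $prev.2.callback.body.transferAmount.currency
// with $request.body.amount.currency
const currencyMatches =
  prev[2].callback.body.transferAmount.currency === request.body.amount.currency

console.log(currencyMatches) // → true
```

An expectation along these lines lets a later transfer request be validated against the quote callback that preceded it.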

Configurable Parameter Assertion

5.4 Download Report

You can download a report in several formats by clicking on Download Report in the top right corner:

  • JSON
  • HTML
  • Printer Friendly HTML

Download Report

5.5 New Template

You can create a new template by clicking on the New Template button in the top right corner.

5.6 Show Template

You can view the template by clicking on the Show Template button in the top right corner.

5.7 Save

You can save the collection or the environment by clicking on the Save button in the top right corner.

5.8 Send

You can execute the whole template by clicking on the Send button in the top right corner. Please ensure you have added a user with the {MSISDN} value found in the Input Values on the simulator (see Frequently Asked Questions section 4.2 Generic ID not found). (/documents/Mojaloop-Testing-Toolkit.md#7-architecture)

Sending Test Cases

  • Select the additional options and the Send this test case button at the top right.

Sending Test Cases

  • Select Edit and then the Send button on the right top.

Sending Test Cases

If you select the Edit button now, you will notice the addition of a Response tab. Select it to view all the responses for the operation.

View Response

5.9 Download Testcase Definition

You can download a report with all the definitions of the test cases (like descriptions for all the requests in a test case, expected results, etc.) by clicking on Download Definition at the right side of the beginning of the test cases. Currently only HTML format is supported.

Download Testcase Definition

You can also edit the meta information about each test case and request using the option provided.

Edit Meta Info

6 Settings

The SETTINGS navigation tab opens the SETTINGS window. Below is the default view of this page.

Opening Default Settings

  • The SETTINGS window consists of the following two panels:

    • On the left, Runtime Global Configuration displays the actual configuration that is effectively active in the Mojaloop Testing Toolkit service.
    • On the right, Edit Global Configuration, amongst a couple of other options, allows you to edit the values manually. Should you elect to use other environment values, you can disable the default values by selecting the Override with Environment Variable option.
  • In a default Docker deployment, the environment values are provided in the local.env file in the project root directory.

Override with Environment Variable