

Mockup Loader for ABAP unit testing


Version: 2.1.5 (history of changes)

Mockup loader is a tool that simplifies data preparation for SAP ABAP unit tests. Create unit test data in Excel, easily convert it into a MIME object that travels with the ABAP package, and easily consume the data from your unit test code. The tool was created with the following high-level goals in mind:

  • simplify the communication process between a developer and a business analyst (from the client side in particular)
  • simplify test data preparation and maintenance - create and store it in Excel (and commit the Excel file to git)
  • simplify test data consumption - in particular, reduce the volume of code required for complex data tests


Features

  • Interface stubbing - dynamically create double implementations of data accessor interfaces and connect their methods to files in mocks. Does not depend on the ABAP test double framework. See the 'Data delivery' section
  • Singleton memory storage to substitute data selections in legacy code without data accessor interfaces. See the 'Store/Retrieve' section below
  • Strict and non-strict parsing - skip irrelevant table fields in your test data
  • "Deep" data loading - load master-detail structures in one step. See the 'Deep data loading' section
  • Load source redirection - a convenient utility to temporarily redirect loading from the in-system MIME object to a local file while you are working on or enhancing your test dataset. See the 'Load source redirection' section
  • Tools to simplify conversion from Excel to a MIME object inside the system, as well as in CI flows - mockup compiler, mockup compiler JS, mockup editor (alpha, view-only at the moment)
  • Utilities for table data filtering



The tool was created to simplify data preparation/loading for SAP ABAP unit tests. In one of our projects we had to prepare a lot of table data for unit tests - for example, a set of content from the BKPF, BSEG and BSET tables (an FI document). The output of the methods under test is also often a table or a complex structure.

Hard-coding all of that data was not an option - too much to code, difficult to maintain, and terrible for code readability. So we decided to write a tool which would get the data from TAB-delimited .txt files, which, in turn, would be conveniently prepared in Excel. Certain objectives were set:

  • all the test data should be combined together in one file (zip)
  • ... and uploaded to SAP - test data should be a part of the dev package (W3MI binary object would fit)
  • the loading routine should identify the file structure (fields) automatically and verify its compatibility with a target container (structure or table)
  • it should also be able to safely skip fields missing in the .txt file, if required (non-strict mode), e.g. when processing structures (like an FI document) with too many fields, most of which are irrelevant to a specific test.
" Test class (o_ml is mockup_loader instance)
o_ml->load_data( " Load test data (structure) from mockup
    i_obj       = 'TEST1/bkpf'
    e_container = ls_bkpf ).

o_ml->load_data( " Load test data (table) from mockup
    i_obj       = 'TEST1/bseg'
    i_strict    = abap_false
    e_container = lt_bseg ).

" Call to the code-under-test
    i_bkpf   = ls_bkpf
    it_bseg  = lt_bseg ).


The second part of the code takes the TAB-delimited text file bseg.txt from the TEST1 directory of the ZIP file, uploaded as a binary object via the SMW0 transaction...

BUKRS BELNR GJAHR BUZEI BSCHL SHKZG ...
1000  10    2015  1     40    S     ...
1000  10    2015  2     50    S     ...

... and puts it (applying the proper ALPHA conversion exits etc.) into an internal table with the BSEG line type.

On-the-fly data filtering is also supported - see the reference documentation for more information.

Data delivery

Interface stubbing

Since 2.0.0, mockup loader supports generating interface stubs. 🎉

It creates an instance object which implements the given interface, where one or more methods retrieve their data from the mockup. Optional filtering is supported: one of the method parameters is treated as the value to filter the mockup data by, using the given key field.

  data lo_factory type ref to zcl_mockup_loader_stub_factory.
  data lo_ml      type ref to zcl_mockup_loader.
  lo_ml = zcl_mockup_loader=>create(
    i_type = 'MIME'
    i_path = 'Z_MOCKUP_EXAMPLE' ). " <YOUR MIME OBJECT NAME>

  create object lo_factory
    exporting
      io_ml_instance   = lo_ml
      i_interface_name = 'ZIF_MOCKUP_LOADER_STUB_DUMMY'. " <INTERFACE TO STUB>

  " Connect one or MANY methods to respective mockups
  lo_factory->connect_method(
    i_method_name     = 'TAB_RETURN'         " <METHOD TO STUB>
    i_mock_name       = 'EXAMPLE/sflight' ). " <MOCK PATH>

  data li_ifstub type ref to zif_mockup_loader_stub_dummy.
  li_ifstub ?= lo_factory->generate_stub( ).

  " Pass the stub to code-under-test, the effect is:
  data lt_res type flighttab.
  lt_res = li_ifstub->tab_return( i_connid = '1000' ).
  " lt_res contains the mock data ...

... and with filtering

  lo_factory->connect_method(
    i_method_name     = 'TAB_RETURN'         " <METHOD TO STUB>
    i_sift_param      = 'I_CONNID'           " <FILTERING PARAM>
    i_mock_tab_key    = 'CONNID'             " <MOCK HEADER FIELD>
    i_mock_name       = 'EXAMPLE/sflight' ). " <MOCK PATH>

This results in a data set where the key field CONNID equals the I_CONNID parameter actually passed to the interface call.

  • Structured addressing is also supported, e.g. IS_PARAMS-CONNID.
  • Ranges are also supported - I_CONNID above can be a range parameter.
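For illustration, a structured sift parameter might be connected like this sketch (the parameter values here are illustrative assumptions, reusing the connect call pattern from the filtering example above):

```abap
" Sketch: filter by a component of a structured parameter
" (IS_PARAMS-CONNID is an assumed parameter of the stubbed method)
lo_factory->connect_method(
  i_method_name  = 'TAB_RETURN'
  i_sift_param   = 'IS_PARAMS-CONNID'  " <COMPONENT OF A STRUCTURED PARAM>
  i_mock_tab_key = 'CONNID'            " <MOCK HEADER FIELD>
  i_mock_name    = 'EXAMPLE/sflight' ).
```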

Returning, exporting and changing parameters are all supported. For more information, see the reference documentation.

In addition, forwarding calls to another object (implementing the same interface) is supported - for example, when some accessor methods must be connected to mocks while others are implemented manually in a supporting test (or real production) class. See the reference documentation.

Finally, it is possible to return just one field of the first matching record - e.g. the document type of a document selected by its number. For this, specify the field to return in the I_FIELD_ONLY param. See the reference documentation.

accessor pattern
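For illustration, such a single-field connection might look like the following sketch (the method name GET_DOC_TYPE, the mock path and the field names are illustrative assumptions):

```abap
" Sketch: return only the BLART field (document type) of the first
" BKPF record matching the document number passed in I_BELNR
lo_factory->connect_method(
  i_method_name  = 'GET_DOC_TYPE'   " <METHOD TO STUB>
  i_sift_param   = 'I_BELNR'        " <FILTERING PARAM>
  i_mock_tab_key = 'BELNR'          " <MOCK HEADER FIELD>
  i_field_only   = 'BLART'          " <SINGLE FIELD TO RETURN>
  i_mock_name    = 'EXAMPLE/bkpf' ).
```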

Stub control

The generated stub instance also implements the ZIF_MOCKUP_LOADER_STUB_CONTROL interface, which allows:

  • temporarily enabling/disabling individual or all stubbed methods, which can be useful in specific testing situations
  • accessing call counters (how many times a method was called)
  • planned: potentially, caching call parameters
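For illustration, using the control interface might look like this sketch (the method names disable, enable and get_call_count are assumptions based on the list above - check the interface definition for the actual signatures):

```abap
" The generated stub can be cast to the control interface
data li_control type ref to zif_mockup_loader_stub_control.
li_control ?= li_ifstub.

li_control->disable( 'TAB_RETURN' ). " temporarily switch one stubbed method off
" ... exercise the code-under-test without the mock data ...
li_control->enable( 'TAB_RETURN' ).  " ... and switch it back on

" Inspect how many times the method was called
data l_count type i.
l_count = li_control->get_call_count( 'TAB_RETURN' ).
```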

Deep data loading

Available since v2.1.0.

If your target data has deep fields - tables or structures - it is possible to fill them in one run. Let's consider a simple example: assume you have 2 linked tables - header and lines - represented by separate files in the zip.

ID   DATE   ...
1    ...
2    ...

DOCID   LINEID   AMOUNT   ...
1       1        100.00   ...
1       2        123.00   ...
2       1        990.00   ...

The target structure is:

types:
  begin of ty_line,
    docid  type numc10,
    lineid type numc3,
    " ...
  end of ty_line,
  tt_line type table of ty_line,
  begin of ty_document,
    id    type numc10,
    " ...
    lines type tt_line, " <<< DEEP FIELD, supposed to be filled with the lines of the document
  end of ty_document,
  tt_documents type table of ty_document.

The following code will load this kind of structure:

  o_ml->load_data(
    exporting
      i_obj  = 'path_to_head_file'
      i_deep = abap_true            " <<< ENABLE DEEP LOADING
    importing
      e_container = lt_docs ).      " <<< type tt_documents

To instruct mockup loader how to find the data for deep components, you have to fill these components in the text file in a special format: <source_path>[<source_id_field>=<value|@reference_field>], which means "go find the source_path file, parse it, extract the lines, and keep those where source_id_field equals value (or the reference_field value of the current header record)". For example:

ID   DATE   ...   LINES
1    ...          path_to_lines_file[docid=@id]
2    ...          path_to_lines_file[docid=12345]

For the first record, mockup loader will find the file path_to_lines_file.txt and load the lines with docid = 1 (the value of the id field of the first record). For the second record, the explicit value 12345 will be used as the filter.


Store/Retrieve

Disclaimer: there is an opinion that adding test-related code to production code is a 'code smell'. In general, I sincerely agree. If the code was designed to use e.g. accessor interfaces from the beginning, this is good. Still, the 'store' functionality can be useful for older pieces of code that need to be tested without much refactoring.

data flow

Some code is quite difficult to test when it has a DB select in the middle. Of course, good code design would isolate DB operations from the business logic, but that is not always possible (or was not done in proper time). So we needed a way to substitute selects in the code with a simple call which would take prepared test data instead, if a test environment was identified. We came up with a solution we called the store.

" Test class (o_mls is mockup_loader_STORE instance)
o_mls->store( " Store some data with 'BKPF' label
    i_name = 'BKPF'
    i_data = ls_bkpf ). " One line structure

" Working class method
if is_test_env = abap_false. " Production environment detected
  select ... from db ...

else.                        " Test environment detected
    exporting i_name  = 'BKPF'
    importing e_data  = ls_fi_doc_header
    exceptions others = 4 ).

if sy-subrc is not initial.
  " Data not selected -> do error handling

With multiple test cases, it can also be convenient to store a number of table records and then filter them by some key field available in the working code. This is also possible:

" Test class
o_mls->store( " Store some data with 'BKPF' label
    i_name   = 'BKPF'
    i_tabkey = 'BELNR'    " Key field for the stored table
    i_data   = lt_bkpf ). " Table with MANY different documents

" Working class method
if is_test_env = abap_false. " Production environment detected
  " Do DB selects here 

else.                        " Test environment detected
      i_name  = 'BKPF'
      i_sift  = l_document_number " <<< Filter key from real local variable
      e_data  = ls_fi_doc_header  " Still a flat structure here
    exceptions others = 4 ).

if sy-subrc is not initial.
  " Data not selected -> error handling

As a result, we can run completely dynamic unit tests, covering most of the code - including DB-select-related code - without actually accessing the database. Of course, the mockup loader alone does not ensure that: it requires accurate design of the project code, separating DB selection from processing. The mockup loader and the 'store' functionality just make this more convenient.

The zcl_mockup_loader class has a shortcut method load_and_store to load data into the store directly, without intermediate technical variables. For more information, see the reference documentation.
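A minimal sketch of how the shortcut might be called (the exact parameter set, including i_type for the target type name, is an assumption - see the reference for the actual signature):

```abap
" Sketch: load the 'TEST1/bkpf' mockup and put it into the store
" under the 'BKPF' label, parsed as the BKPF structure type
o_ml->load_and_store(
  i_obj  = 'TEST1/bkpf'
  i_name = 'BKPF'
  i_type = 'BKPF' ). " <TARGET TYPE NAME> (assumed parameter)
```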

Some design facts about the store:

  • The store class ZCL_MOCKUP_LOADER_STORE is designed as a singleton, so it is instantiated once in a test class and exists in one instance only.
  • The RETRIEVE method, which takes data from the store, is static. It is supposed to be called from "production" code instead of DB selects. It acquires the singleton instance internally and throws a non-class-based exception on error. This avoids the need to handle test-related exceptions that are irrelevant to the main code, and also makes the exception catchable as a SY-SUBRC value, which can then be checked like after a regular DB select. So the interference with the main code is minimal.


Installation

The most convenient way to install the package is to use abapGit - it is easily installed itself, and then it takes a couple of clicks to clone the repo into the system. There is also an option for offline installation - download the repo as a zip file and import it with abapGit. Running the unit tests after installation is always recommended.

Dependencies (to install before mockup loader):

  • text2tab - tab-delimited text parser (was a part of mockup loader but is now a separate reusable tool). Mandatory prerequisite.
  • abap_w3mi_poller - optional - enables 'Upload to MIME' button in ZMOCKUP_LOADER_SWSRC. The mockup loader can be compiled without this package (the call is dynamic).

P.S. APACK manifest implementation is under consideration.

Load source redirection

The zipped mockups slug is supposed to be uploaded as a MIME object via SMW0. However, during data or test creation, it is more convenient (and faster) to read a local file - in particular, to avoid uploading 'draft' test data to the system.

i_type and i_path are the parameters of the create method that define the 'normal' mockup source. To temporarily switch to another source, use the transaction ZMOCKUP_LOADER_SWSRC. It initializes the SET/GET parameters ZMOCKUP_LOADER_STYPE and ZMOCKUP_LOADER_SPATH (MIME), which override the defaults for the current session only.

switch source


  • A type change on the selection screen immediately changes the parameters in session memory; no run is required (though 'Enter' must be pressed after manually changing a text field to trigger the on-screen update)
  • 'Get SU3' reads the parameter values from the user master record (useful when you work on the same project for some time)
  • 'Upload to MIME' uploads the file to MIME storage directly, without going through SMW0 (the MIME object must already exist)
  • Saving variants is also convenient ;)

Conversion from Excel

You may have a lot of data prepared in Excel files - many files, with many sheets in each. Although Ctrl+C in Excel actually copies TAB-delimited text, which greatly simplifies matters for minor cases, it is tedious and time-consuming to copy all the test cases to text by hand. Here are special tools to simplify this workflow. Briefly: they take a directory of Excel files with mockup data and convert them into a format compatible with mockup loader.

  • mockup compiler - ABAP implementation, requires abap2xlsx installed.

    compile zip slug

  • mockup compiler JS - a JavaScript implementation, requires a nodejs environment on the developer's machine. This tool also makes it possible to compile the target zip in continuous integration flows, so the test data in Excel can be part of the source repository. (I'm planning a dedicated publication on this subject).

See the respective repositories for more info.

Examples and Reference


Contribution

You are welcome to suggest ideas and code improvements! :) Let's make ABAP development more convenient. Please kindly respect the code conventions.



License

The code is licensed under the MIT License. Please see the LICENSE file for details.
