Custom (user defined) pipeline stages
The `convert` command processes its input files, each of which contains an array of JSON objects extracted from the third-party platform, through a series of pipeline stages. These stages, and the order in which they are processed, are defined in the `config.yml` file.
Any custom conversion logic can be written in additional custom pipeline stages.
To define your own custom pipeline stage, run the following command:
```console
$ shopify_transporter generate YourCustomStageName --object customer
create lib/custom_pipeline_stages/your_custom_stage_name.rb
Updated config.yml with the new pipeline stage
```
The above command generates a new file named `your_custom_stage_name.rb` that contains a stubbed-out version of the class; you need to provide the body of its `convert` method.
The `convert` method receives a `Hash` representing the JSON object currently being processed, as well as the current state of the corresponding Shopify object being converted (as determined by the `record_key` defined in `config.yml`). The method's responsibility is to inject the relevant attributes from the input object into the Shopify object.
The newly-created pipeline stage is automatically added to `config.yml` by the `generate` sub-command.

Add all your customized conversion logic in the file created in the last step. When you run the `convert` sub-command, the input will be passed through each pipeline stage, including any custom pipeline stages listed in `config.yml`.
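The overall conversion loop can be sketched roughly as follows. This is a simplified illustration, not the actual `shopify_transporter` internals; the input rows, the `record_key` value, and the two stand-in stages are all hypothetical:

```ruby
# Hypothetical input rows, as parsed from one input file. Two rows share the
# same email, so they contribute to the same Shopify record.
input_rows = [
  { 'email' => 'john@example.com', 'firstname' => 'John' },
  { 'email' => 'john@example.com', 'telephone' => '555-1234' },
]

record_key = 'email' # as configured in config.yml

# If a Shopify object doesn't exist yet for a key, it defaults to an empty hash.
records = Hash.new { |hash, key| hash[key] = {} }

# Stand-ins for pipeline stages; each examines the input row and populates
# attributes on the Shopify record.
pipeline_stages = [
  lambda do |input, record|
    record['first_name'] = input['firstname'] if input['firstname']
  end,
  lambda do |input, record|
    record['phone'] = input['telephone'] if input['telephone']
  end,
]

input_rows.each do |input|
  record = records[input[record_key]]
  pipeline_stages.each { |stage| stage.call(input, record) }
end

# records['john@example.com'] now holds the merged Shopify record.
```

Because every row is routed through every stage, a record accumulates attributes across rows that share the same `record_key`.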
Here's what your custom pipeline stage will look like when it's generated:
```ruby
module CustomPipeline
  module Customer
    class YourCustomStageName < ShopifyTransporter::Pipeline::Stage
      def convert(input, record)
        # The convert command reads the input files one-by-one, line-by-line.
        #
        # For each row, the value of the record_key column is used to lookup
        # the Shopify object being built.
        #
        # If the Shopify object doesn't exist, it's created as a default empty hash.
        #
        # It's the role of a pipeline stage to examine the input rows and
        # populate attributes on the Shopify object.
        #
        # For example, the TopLevelAttributes stage of a Magento customer
        # migration would look for a column called firstname on the input,
        # and then populate the Shopify object accordingly:
        #
        #   record['first_name'] = input['firstname']
        #
        # Any modifications to the record within a pipeline stage are permanent
        # to the Shopify record associated with the record_key.
        #
        # The next pipeline stage to receive the record will receive the same
        # input and the existing record, which would consist of:
        #
        #   {
        #     'first_name' => 'John',
        #   }
      end
    end
  end
end
```
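To see the `convert` contract in action, here is a runnable version of the example from the stub's comments: a stage that copies Magento's `firstname` column onto the Shopify record. The empty `Stage` class below is a stand-in for `ShopifyTransporter::Pipeline::Stage` so that the snippet runs outside a transporter project, and the input row is made up:

```ruby
module ShopifyTransporter
  module Pipeline
    class Stage; end # stand-in for the real base class
  end
end

module CustomPipeline
  module Customer
    class YourCustomStageName < ShopifyTransporter::Pipeline::Stage
      def convert(input, record)
        # Copy the Magento firstname column onto the Shopify record.
        record['first_name'] = input['firstname']
      end
    end
  end
end

input = { 'email' => 'john@example.com', 'firstname' => 'John' }
record = {} # the default empty hash created for a new record_key
CustomPipeline::Customer::YourCustomStageName.new.convert(input, record)
# record now contains 'first_name' => 'John'
```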
Suppose the reason we wanted this custom pipeline stage was to translate a field from Magento called `favorite_color` into Shopify somehow. However, Shopify's Customer objects don't have a corresponding attribute that tracks the customer's favorite color. In cases like this, the recommended approach is to bring that data over as a metafield.
When importing metafields on a customer via the Transporter App, there are four required fields:

- `metafield_namespace`
- `metafield_key`
- `metafield_value`
- `metafield_value_type`
We should read the `favorite_color` attribute from the Magento input and populate the four `metafield_` keys on the Shopify record.
```ruby
module CustomPipeline
  module Customer
    class YourCustomStageName < ShopifyTransporter::Pipeline::Stage
      def convert(input, record)
        record['metafield_namespace'] = 'converted_fields_from_magento'
        record['metafield_key'] = 'favorite_color'
        record['metafield_value'] = input['favorite_color']
        record['metafield_value_type'] = 'string'
      end
    end
  end
end
```
This will create a metafield called `favorite_color` in the `converted_fields_from_magento` namespace. The value of that metafield will be taken from the Magento input object. The metafield value type should be set to one of: `string`, `integer`, or `json_string`.
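You can exercise this stage in isolation the same way. As before, the empty `Stage` class is a stand-in for `ShopifyTransporter::Pipeline::Stage`, and the sample input row is hypothetical:

```ruby
module ShopifyTransporter
  module Pipeline
    class Stage; end # stand-in for the real base class
  end
end

module CustomPipeline
  module Customer
    class YourCustomStageName < ShopifyTransporter::Pipeline::Stage
      def convert(input, record)
        record['metafield_namespace'] = 'converted_fields_from_magento'
        record['metafield_key'] = 'favorite_color'
        record['metafield_value'] = input['favorite_color']
        record['metafield_value_type'] = 'string'
      end
    end
  end
end

# A made-up Magento input row with a favorite_color attribute:
input = { 'email' => 'jane@example.com', 'favorite_color' => 'teal' }
record = {}
CustomPipeline::Customer::YourCustomStageName.new.convert(input, record)
# record now carries the four metafield_ keys, with 'teal' as the value.
```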
This pipeline stage will then automatically do its conversion as part of the `convert` command. It also passes the newly modified record to the next custom pipeline stage if there are multiple, so your custom pipeline stages can build on each other to create complex behaviour.
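That chaining can be sketched with two hypothetical stages (modelled as lambdas for brevity): the first writes the metafield keys, and the second builds on the first's output by normalizing the value's casing. This is an illustration of the ordering behaviour, not actual transporter code:

```ruby
stages = [
  # First hypothetical stage: record the favorite_color metafield.
  lambda do |input, record|
    record['metafield_namespace'] = 'converted_fields_from_magento'
    record['metafield_key'] = 'favorite_color'
    record['metafield_value'] = input['favorite_color']
    record['metafield_value_type'] = 'string'
  end,
  # Second hypothetical stage: it sees the record as left by the first stage,
  # so it can refine the value that stage wrote.
  lambda do |input, record|
    record['metafield_value'] = record['metafield_value'].to_s.downcase
  end,
]

input = { 'favorite_color' => 'Teal' }
record = {}
stages.each { |stage| stage.call(input, record) }
# record['metafield_value'] is now 'teal'
```

Because each stage receives the record as modified by all earlier stages, ordering in `config.yml` matters when stages depend on each other.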