Merge pull request #2125 from hansva/gh-issues
[DOC] fix all asciidoc errors & warnings
hansva committed Jan 3, 2023
2 parents 181bb17 + 8b8b940 commit 1555d53
Showing 19 changed files with 54 additions and 36 deletions.
4 changes: 1 addition & 3 deletions .github/pr-rules.yml
Original file line number Diff line number Diff line change
@@ -1,19 +1,17 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Please keep the entries sorted lexicographically in each category.

@@ -15,6 +15,8 @@ specific language governing permissions and limitations
under the License.
////
:description: RAP is multi user framework by its nature. Every user session is associated with a display. In RAP, Display#getDefault() will not create a new display when it's called from non-UI thread - read Display#getDefault JavaDoc. When you execute a code in a background thread, RAP needs to know for which UI session (display) it belongs. That's why you have to provide the correct UISession/display from outside.
:openvar: ${
:closevar: }
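These `:openvar:`/`:closevar:` attributes are the device used throughout this commit: they emit a literal `${`...`}` pair without tripping Asciidoctor's missing-attribute warning, which the `{HOP_CONFIG_FOLDER}` part of a raw `${HOP_CONFIG_FOLDER}` would otherwise cause. A minimal sketch of the pattern:

```asciidoc
:openvar: ${
:closevar: }

// renders as: ${HOP_CONFIG_FOLDER} — with no attribute-missing warning
The config folder is `{openvar}HOP_CONFIG_FOLDER{closevar}`.
```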

= Developer Guide

@@ -109,7 +111,7 @@ To then run it simply execute:

== Configuring Hop Web

The main configuration of Hop is done through a single configuration file called `hop-config.json` and it is found in folder `${HOP_CONFIG_FOLDER}`
The main configuration of Hop is done through a single configuration file called `hop-config.json` and it is found in folder `{openvar}HOP_CONFIG_FOLDER{closevar}`

It is possible to pass this standard Hop environment variable `HOP_CONFIG_FOLDER` to the docker container.
You can point it to a mounted volume for example:
6 changes: 4 additions & 2 deletions docs/hop-dev-manual/modules/ROOT/pages/hopweb/index.adoc
@@ -15,6 +15,8 @@ specific language governing permissions and limitations
under the License.
////
:description: Building and setting up your own Hop Web environment is straightforward. The steps to set up the default Docker image are included in a helper script docker/create_hop_web_container.sh in the Hop code base. This should get you started to make modifications or create your own version entirely.
:openvar: ${
:closevar: }

= Hop Web Development Guide

@@ -82,11 +84,11 @@ export CATALINA_OPTS='${HOP_OPTIONS} -DHOP_AES_ENCODER_KEY="${HOP_AES_ENCODER_KE
----

If you want to run Hop Web with the `default` and `samples` projects, make sure the project root path in `hop-config.json` is set to `${HOP_CONFIG_FOLDER}.
If you want to run Hop Web with the `default` and `samples` projects, make sure the project root path in `hop-config.json` is set to `{openvar}HOP_CONFIG_FOLDER{closevar}`.

On Linux or Mac, use the following sed command to fix this in one line:

`sed -i 's/config\/projects/${HOP_CONFIG_FOLDER}\/projects/g' webapps/hop/config/hop-config.json`
`sed -i 's/config\/projects/{openvar}HOP_CONFIG_FOLDER{closevar}\/projects/g' webapps/hop/config/hop-config.json`
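The sed one-liner rewrites the relative `config/projects` path into a variable-based one. A minimal sketch of the same substitution applied to a sample line (the JSON key name here is illustrative, not necessarily Hop's exact schema):

```shell
# Apply the documented substitution to a sample string.
# Single quotes keep ${HOP_CONFIG_FOLDER} literal so Hop resolves it at runtime.
before='"projectsFolder": "config/projects"'
after=$(printf '%s' "$before" | sed 's/config\/projects/${HOP_CONFIG_FOLDER}\/projects/g')
echo "$after"   # → "projectsFolder": "${HOP_CONFIG_FOLDER}/projects"
```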

On Windows, modify `hop-config.json` to make sure `projectsConf` looks like the one below:

@@ -16,14 +16,16 @@ under the License.
////
[[database-plugins]]
:imagesdir: ../../assets/images
:openvar: ${
:closevar: }
:description: Hop supports tens of databases out of the box. If your preferred database has no specific support, you can probably still connect through a generic database connection.
= Database Plugins

Creating a database connection in HOP is done using one of the many database types available, or you can create a generic connection.
To create a database connection go to file -> New and select Database connection.

The connection is saved in a central location and can then be used by all pipelines and workflows.
If you have set your project to work with Hop, the database information will be in the `${PROJECT_HOME}/metadata/rdbms` folder.
If you have set your project to work with Hop, the database information will be in the `{openvar}PROJECT_HOME{closevar}/metadata/rdbms` folder.
Each connection created will generate a .json file in this folder with the name of the connection, containing the connection information.

If the license allowed it, a jdbc driver is included in the distribution, in a folder specific for each driver, in the general path: `Installation directory/plugins/databases/Database type/lib`.
@@ -32,7 +32,7 @@ under the License.
|Driver folder | Hop Installation/plugins/databases/mssqlnative/lib
|===

= Integrated Authentication / Windows Based Authentication
== Integrated Authentication / Windows Based Authentication

The native Microsoft SQL JDBC driver ships with extra files that enables authentication using your current MS Windows credentials.
When you download the JDBC drivers from Microsoft's site and unzip them, there will be a directory structure like the following:
9 changes: 2 additions & 7 deletions docs/hop-user-manual/modules/ROOT/pages/hop-server/index.adoc
@@ -182,6 +182,7 @@ Listen to all interfaces on the server:
[source,shell]
hop-server.sh 0.0.0.0 8080
--
====


@@ -232,9 +233,7 @@ The syntax of this configuration file is fairly simple:
</hop-server-config>
----

Example startup commands with a configuration file are:

&nbsp; +
Example startup commands with a configuration file are: +

[tabs]
====
@@ -260,16 +259,12 @@ Linux, macOS::
+
--
[source,shell]
----
hop-server.sh /foo/bar/hop-server-config.xml
----
Or with a remote configuration file:
[source,shell]
----
hop-server.sh http://www.example.com/hop-server-config.xml
----
You can also enable a project lifecycle environment for the Hop server:
@@ -52,3 +52,5 @@ Linux, macOS::
----
Expected output: the Hop Translator tool starts.
--
====
@@ -16,6 +16,8 @@ under the License.
////
[[HopServer]]
:imagesdir: ../../assets/images
:openvar: ${
:closevar: }
:description: This tutorial explains how to run Apache Hop web services from a Docker container

= Web Services in Apache Hop
@@ -72,15 +74,15 @@ In Apache Hop it is possible to set up runtime environments for different enviro

Connection details (e.g. DB server URL) for database connections, etc. can be stored as variables in configuration files.

For example, to set up a new database connection whose connection details may differ depending on the environment, enter the name of the environment variable (e.g. `${DB_HOST}`) instead of a concrete server URL. +
For example, to set up a new database connection whose connection details may differ depending on the environment, enter the name of the environment variable (e.g. `{openvar}DB_HOST{closevar}`) instead of a concrete server URL. +

As soon as you select an environment and the variable is contained in its configuration file, the variable in the DB configuration is replaced by the value from the environment configuration.

This functionality is very helpful, for example, to test the pipeline against different environments before you start the deployment on the Hop Server.

This functionality is also essential for multi-container applications (see above: full-stack architecture), whose service results from the interaction of different and externally isolated containers.

A separate environment configuration is therefore necessary if you want to start your application with Docker Compose in addition to your development environment (in this example, the `${DB_HOST}` variable would have the service name of the DB Container instead of the IP address of the DB Server).
A separate environment configuration is therefore necessary if you want to start your application with Docker Compose in addition to your development environment (in this example, the `{openvar}DB_HOST{closevar}` variable would have the service name of the DB Container instead of the IP address of the DB Server).
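As an illustration, a hypothetical Docker Compose sketch (service and image names are assumptions for this example, not taken from the Hop docs):

```yaml
# Hypothetical two-service stack: in the compose environment's config file,
# DB_HOST would be set to "db" (the service name) instead of an IP address.
services:
  hop:
    image: apache/hop-web      # illustrative image name
    ports:
      - "8080:8080"
  db:
    image: postgres:15
```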

=== Step 4: Set up and start Docker Container

@@ -16,6 +16,8 @@ under the License.
////
[[BeamFlinkPipelineEngine]]
:imagesdir: ../assets/images
:openvar: ${
:closevar: }
:description: Apache Hop supports running pipelines on Apache Flink using an Apache Beam Flink runner. This page describes how to configure this runner.
= Apache Beam Flink Pipeline Engine

@@ -130,7 +132,7 @@ In the meantime pass variables to the JVM by setting these in the conf/flink-con
env.java.opts: -DPROJECT_HOME=/path/to/project-home
----

In general, it is better not to use relative paths like `${Internal.Entry.Current.Folder}` when specifying filenames when executing pipelines remotely.
In general, it is better not to use relative paths like `{openvar}Internal.Entry.Current.Folder{closevar}` when specifying filenames when executing pipelines remotely.
It's usually better to pick a few root folders as variables.
PROJECT_HOME is as good as any variable to use.

@@ -16,6 +16,8 @@ under the License.
////
[[BeamSparkPipelineEngine]]
:imagesdir: ../assets/images
:openvar: ${
:closevar: }
:description: Apache Hop supports running pipelines on Apache Spark over Apache Beam. The Apache Spark Runner can be used to execute Beam pipelines using Apache Spark.

= Apache Beam Spark Pipeline Engine
@@ -118,7 +120,7 @@ In the meantime pass variables to the JVM with the option:
--driver-java-options '-DPROJECT_HOME=/path/to/project-home'
----

In general, it is better not to use relative paths like `${Internal.Entry.Current.Folder}` when specifying filenames when executing pipelines remotely.
In general, it is better not to use relative paths like `{openvar}Internal.Entry.Current.Folder{closevar}` when specifying filenames when executing pipelines remotely.
It's usually better to pick a few root folders as variables.
PROJECT_HOME is as good as any variable to use.

@@ -16,6 +16,8 @@ under the License.
////
[[RemotePipelineEngine]]
:imagesdir: ../assets/images
:openvar: ${
:closevar: }
:description: The remote run configuration runs Hop pipelines on a remote Hop Server. This run configuration requires little configuration, but requires a Hop server and a Hop Server metadata definition.

= Remote Pipeline Engine
@@ -55,7 +57,7 @@ If you don't specify a value this defaults to 2000ms (2 seconds)
See below for detailed information

|Named resources reference source folder
|This is the reference source folder for the named resources that are being used (e.g. `${PROJECT_HOME}`).
|This is the reference source folder for the named resources that are being used (e.g. `{openvar}PROJECT_HOME{closevar}`).
See below for detailed information.

|Named resources reference target folder
@@ -71,16 +73,16 @@ For example if you have mapping pipelines or workflows referenced they will be s
All the used pipelines and workflows together with the XML presentation of the pipeline execution configuration will be sent over to the server in the form of a ZIP archive.
The server receives this archive and without unzipping runs the pipeline.
To make this function correctly, Hop changes the references as well as references to filenames.
For example `${PROJECT_HOME}/mapping.hpl` will be changed to `${Internal.Entry.Current.Folder}/mapping.hpl`.
For example `{openvar}PROJECT_HOME{closevar}/mapping.hpl` will be changed to `{openvar}Internal.Entry.Current.Folder{closevar}/mapping.hpl`.
This means that it will try to use a relative path to the parent file.

If you are using data files then those file names will be renamed as well.
For example, you might be reading a file called `${PROJECT_HOME}/files/bigfile.csv` in a `CSV File Input` transform.
During the export the referenced filename will be changed to `${DATA_PATH_1}/bigfile.csv`.
For example, you might be reading a file called `{openvar}PROJECT_HOME{closevar}/files/bigfile.csv` in a `CSV File Input` transform.
During the export the referenced filename will be changed to `{openvar}DATA_PATH_1{closevar}/bigfile.csv`.
For every folder that is referenced a new variable will be defined and set in the execution configuration.
By default, the path set for this variable will be the same as on the executing (local) machine.
On the server this might not make a lot of sense.
For this reason you can specify a reference source folder like `${PROJECT_HOME}` in combination with a target folder like `/server/`.
For this reason you can specify a reference source folder like `{openvar}PROJECT_HOME{closevar}` in combination with a target folder like `/server/`.
In that example variable `DATA_PATH_1` will get value `/server/files/`.
This in turn allows you to transfer required files in advance or map a folder into a docker container and so on.
It gives you flexibility when executing remotely while having ease of development on your client.
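A small sketch of the folder rewrite described above, assuming a reference source folder of `{openvar}PROJECT_HOME{closevar}` resolved to `/home/alice/myproject` and a target folder of `/server` (all paths are illustrative):

```shell
# Folder referenced by the pipeline on the local machine:
src="/home/alice/myproject/files"
ref="/home/alice/myproject"   # resolved reference source folder (PROJECT_HOME)
target="/server"              # reference target folder configured for the server
# Swap the source prefix for the target, as the export step does:
DATA_PATH_1="$(printf '%s' "$src" | sed "s|^$ref|$target|")/"
echo "$DATA_PATH_1"   # → /server/files/
```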
@@ -80,4 +80,5 @@ Delete based on document paths and stream fields (`use JSON query` disabled):
|===

Delete based on JSON query (`use JSON query` enabled):
[source]
`{$or: [{"name": "${NAME1}"},{"name": "${NAME2}"}, {"name": "${NAME3}"} ]}`
@@ -16,6 +16,8 @@ under the License.
////
:documentationPath: /pipeline/transforms/
:language: en_US
:openvar: ${
:closevar: }
:description: The Token Replacement transform replaces tokens in an input string or file.

= image:transforms/icons/token.svg[Token Replacement transform Icon, role="image-doc-icon"] Token Replacement
@@ -31,7 +33,7 @@ The transform can then output this data either to a file or a field on the strea

A token contains a start string, a name, and an end string.

For example ${my_token} could be a token.
For example {openvar}my_token{closevar} could be a token.

The start and end strings are configurable and can be any series of characters.
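For a quick intuition, the replacement can be mimicked with sed, here using `${`...`}`-style token markers (the transform itself lets you configure the markers freely):

```shell
# Replace the token ${my_token} in a template string with a value:
template='Hello, ${my_token}!'
printf '%s\n' "$template" | sed 's/\${my_token}/world/'
# → Hello, world!
```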

@@ -47,15 +47,15 @@ During this process the XML metadata is converted to the appropriate Hop format.
During this process the XML metadata is converted to the appropriate Hop format.

|kettle.properties
|The Kettle properties file in your .kettle folder (typically found in the home directory or `${KETTLE_HOME}`) often contains variables and values regarding your environment.
|The Kettle properties file in your .kettle folder (typically found in the home directory or `{openvar}KETTLE_HOME{closevar}`) often contains variables and values regarding your environment.
These variables and values are converted into an environment configuration file if you specified the `-c` or `--target-config-file` option.
When you create an environment in Hop you can simply add this file to it to make everything work.
If the configuration file already exists it will be updated, not overwritten.
The description of the newly imported variables is set to `Imported from Kettle` to indicate that it's new.
Values of existing variables are overwritten and the existing description is kept.

|shared.xml
|The shared.xml file in your .kettle folder (typically found in the home directory or `${KETTLE_HOME}`) often contains connections which are shared across many transformations and jobs.
|The shared.xml file in your .kettle folder (typically found in the home directory or `{openvar}KETTLE_HOME{closevar}`) often contains connections which are shared across many transformations and jobs.
These connections are imported as Relational Database Connection metadata stored in the target folder `metadata/rdbms` folder.

|jdbc.properties
@@ -45,7 +45,7 @@ To match standard development best practices you would check all these files int

TIP: project variables should only be used when you need variables on the project level. All variables to connect to infrastructure, e.g. database connection parameters, mail servers etc that take different values in different environments should be created at the environment level.

TIP: Project configurations are stored in hop-config.json, which is read from `hop/config` by default. Use the `${HOP_CONFIG_FOLDER}` operating system variable to store your Hop configuration in a folder outside your Hop folder. This will let you keep your project list if you switch Hop installations or upgrade to a newer Hop version.
TIP: Project configurations are stored in hop-config.json, which is read from `hop/config` by default. Use the `{openvar}HOP_CONFIG_FOLDER{closevar}` operating system variable to store your Hop configuration in a folder outside your Hop folder. This will let you keep your project list if you switch Hop installations or upgrade to a newer Hop version.
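For example, on Linux or macOS the variable can be set before launching Hop (the path is illustrative):

```shell
# Point Hop at a config folder that survives upgrades and reinstalls:
export HOP_CONFIG_FOLDER="$HOME/hop-config"
echo "$HOP_CONFIG_FOLDER"
```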

Projects can inherit metadata and variables from a parent project.

@@ -16,13 +16,15 @@ under the License.
////

[[Variables]]
:openvar: ${
:closevar: }
:imagesdir: ../../assets/images

xref:variables.adoc[Variables] provide an easy way to avoid hard-coding all sorts of things in your system, environment or project.
Here is some best practices advice on the subject:

* Put environment specific settings in an environment (Duh!) configuration file.
Create an environment for this.
* When referencing file locations, prefer `${PROJECT_HOME}` over expressions like `${Internal.Entry.Current.Directory}` or `${Internal.Pipeline.Filename.Directory}`
* When referencing file locations, prefer `{openvar}PROJECT_HOME{closevar}` over expressions like `{openvar}Internal.Entry.Current.Directory{closevar}` or `{openvar}Internal.Pipeline.Filename.Directory{closevar}`
* Configure transform copies with variables to allow for easy transition between differently sized environments.
@@ -15,6 +15,8 @@ specific language governing permissions and limitations
under the License.
////
:documentationPath: /workflow/actions/
:openvar: ${
:closevar: }
:language: en_US
:description:

@@ -50,10 +52,10 @@ You can pass command line arguments and set up logging for the Shell workflow ac
This is also useful, when you want to execute operating system commands like dir, ls or ipconfig without giving a specific path.
This option creates a temporary script in the working directory and executes it.
Note: Variables are resolved within the script when given.
|Script file name|The filename of the shell script to start, should include full path else ${user.dir} is used as path.
|Script file name|The filename of the shell script to start, should include full path else {openvar}user.dir{closevar} is used as path.
|Working directory|The directory that will be used as working directory for the shell script.
The working directory only becomes active when the shell script starts so "Filename" should still include the full path to the script.
When the field is left empty or the working directory is invalid ${user.dir} will be used as working directory.
When the field is left empty or the working directory is invalid {openvar}user.dir{closevar} will be used as working directory.
|Specify log file|Enable to specify a separate logging file for the execution of this workflow.
|Append logfile|Enable to append to the logfile as opposed to creating a new one
|Name of log file|The directory and base name of the log file (for example C:\logs)
