
Commit

doc update
antonymayi committed Nov 27, 2020
1 parent 9d046f0 commit 8ef58b9
Showing 6 changed files with 7 additions and 7 deletions.
2 changes: 1 addition & 1 deletion docs/concept.rst
@@ -69,7 +69,7 @@ Data Source DSL
.where(student.score < 2) \
.orderby(student.level, student.score)

-Full guide and the DSL references can be found in the :doc:`dsl/index` sections.
+Full guide and the DSL references can be found in the :doc:`dsl` sections.

.. _concept-workflow:

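The concept.rst hunk above catches only the tail of the DSL query example (the .where/.orderby calls). For orientation, a fuller query of that shape might read as below — a minimal sketch only: the Student/School schema declarations and their field names are assumptions for illustration, and the dsl.Schema/dsl.Field import path reflects one ForML API revision rather than this exact commit:

    from forml.io import dsl  # import path may differ across ForML versions

    class Student(dsl.Schema):  # hypothetical catalogized schema
        surname = dsl.Field(dsl.String())
        school = dsl.Field(dsl.Integer())
        level = dsl.Field(dsl.Integer())
        score = dsl.Field(dsl.Float())

    class School(dsl.Schema):  # hypothetical catalogized schema
        sid = dsl.Field(dsl.Integer())
        name = dsl.Field(dsl.String())

    # aliases matching the lowercase names used in the documentation example
    student, school = Student, School

    QUERY = (
        student.join(school, student.school == school.sid)  # join on the school id
        .select(student.surname, school.name, student.score)
        .where(student.score < 2)                 # the filter shown in the hunk
        .orderby(student.level, student.score)    # the ordering shown in the hunk
    )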
File renamed without changes.
2 changes: 1 addition & 1 deletion docs/feed.rst
@@ -17,7 +17,7 @@ Source Feed
===========

Feed is a :doc:`runtime platform <platform>` component responsible for resolving the :doc:`project defined <project>`
-:doc:`ETL query <dsl/index>` providing the requested data.
+:doc:`ETL query <dsl>` providing the requested data.

.. autosummary::

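Since the feed description in the hunk above is terse, here is a conceptual plain-Python sketch of what "resolving" means — an illustration of the idea only, not the actual ForML Feed API, with made-up catalog and table names:

    # Conceptual sketch -- not the real ForML Feed API.
    # A feed owns a mapping of abstract catalogized schemas to the physical
    # tables available in its particular runtime environment.
    SOURCES = {
        'education.Student': 'warehouse.public.student',  # hypothetical mapping
        'education.School': 'warehouse.public.school',
    }

    def resolve(schema: str) -> str:
        """Return the physical table backing the given abstract schema."""
        try:
            return SOURCES[schema]
        except KeyError as err:
            raise LookupError(f'no feed source for schema {schema}') from err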
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -81,7 +81,7 @@ Content
lifecycle
workflow
io
-dsl/index
+dsl
interactive
operator
testing
6 changes: 3 additions & 3 deletions docs/io.rst
@@ -35,7 +35,7 @@ Catalogized Schemas
-------------------

To achieve data access independence, ForML introduces a concept of *catalogized schemas*. Instead of implementing
-direct operations on specific data source instances, the :doc:`dsl/index` used to define the input data ETL refers only
+direct operations on specific data source instances, the :doc:`dsl` used to define the input data ETL refers only
to abstract data *schemas*. It is then the responsibility of the *platform* to resolve the requested schemas (and the
whole ETL queries specified on top of that) mapping them to actual datasources hosted in the particular runtime
environment.
@@ -58,14 +58,14 @@ producers. For private first-party datasets (ie. internal company data) this is
ForML) would just maintain a (private) package with schemas of their data sources. For public datasets (whose authors
don't endorse ForML yet) this leaves it to some (not yet established) community maintained schema catalogs.

-See the :doc:`dsl/index` for a schema implementation guide.
+See the :doc:`dsl` for a schema implementation guide.

.. _io-source-descriptor:

Source Descriptor
-----------------

-ForML projects specify their input data requirements (mainly the ETL :doc:`DSL <dsl/index>` query optionally composed
+ForML projects specify their input data requirements (mainly the ETL :doc:`DSL <dsl>` query optionally composed
with other transforming operators) in form of a *source descriptor* (supplied within the :doc:`project structure
<project>` using the :ref:`source.py <project-source>` component).

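As a companion to the catalogized-schemas prose in the hunk above: a schema catalog can be nothing more than an importable package of schema declarations. A sketch under the same assumptions as before (dsl.Schema/dsl.Field per one ForML API revision; package and field names hypothetical):

    # education.py -- one module of a (hypothetical) schema catalog package
    # published by the dataset owner. Note it carries no physical location,
    # only the abstract structure that feeds resolve at runtime.
    from forml.io import dsl

    class Student(dsl.Schema):
        surname = dsl.Field(dsl.String())
        level = dsl.Field(dsl.Integer())
        score = dsl.Field(dsl.Float())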
2 changes: 1 addition & 1 deletion docs/project.rst
@@ -150,7 +150,7 @@ Source (``source.py``)
''''''''''''''''''''''

This component is a fundamental part of the :doc:`IO concept<io>`. A project can define the ETL process of sourcing
-data into the pipeline using the :doc:`DSL <dsl/index>` referring to some :ref:`catalogized schemas
+data into the pipeline using the :doc:`DSL <dsl>` referring to some :ref:`catalogized schemas
<io-catalogized-schemas>` that are at runtime resolved via the available :doc:`feeds <feed>`.

The source component is provided in form of a descriptor that's created using the ``.query()`` method as shown in the
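To round off the source-descriptor passage above, a project's source.py might look roughly like this — a sketch only: project.Source.query() and project.setup() reflect one ForML API revision, and the education schema package is hypothetical:

    # source.py -- sketch of a project source component.
    from forml import project  # import path may differ across ForML versions

    from education import Student  # hypothetical catalogized schema package

    # ETL query expressed purely against the abstract schema
    FEATURES = Student.select(Student.surname, Student.level).where(Student.score > 0)

    # the source descriptor created using the .query() method
    SOURCE = project.Source.query(FEATURES)

    # register the descriptor as this project's source component
    project.setup(SOURCE)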
