From d3afebeb728ba35f5c99f4b55c5808e451dbed65 Mon Sep 17 00:00:00 2001 From: Kathy Pippert Date: Wed, 24 Aug 2022 15:08:48 -0400 Subject: [PATCH] Round 1 of standardizing DPF-Core doc --- docs/source/_static/simple_example.rst | 4 +- docs/source/contributing.rst | 428 ++---------------- docs/source/getting_started/compatibility.rst | 67 +++ docs/source/getting_started/dependencies.rst | 30 ++ docs/source/getting_started/docker.rst | 96 ++-- docs/source/getting_started/index.rst | 146 +----- docs/source/getting_started/install.rst | 94 +++- docs/source/index.rst | 57 ++- docs/source/user_guide/how_to.rst | 4 +- docs/source/user_guide/index.rst | 14 +- docs/source/user_guide/main_entities.rst | 2 +- 11 files changed, 294 insertions(+), 648 deletions(-) create mode 100644 docs/source/getting_started/compatibility.rst create mode 100644 docs/source/getting_started/dependencies.rst diff --git a/docs/source/_static/simple_example.rst b/docs/source/_static/simple_example.rst index 7e145b5605a..f5db9a4d493 100644 --- a/docs/source/_static/simple_example.rst +++ b/docs/source/_static/simple_example.rst @@ -1,5 +1,5 @@ -Opening a result file generated from MAPDL (or another ANSYS solver) and -extracting results from it is easy: +Here's how you would open a result file generated by MAPDL (or another ANSYS solver) and +extract results: .. code-block:: default diff --git a/docs/source/contributing.rst b/docs/source/contributing.rst index 370e8200654..63eada455cd 100644 --- a/docs/source/contributing.rst +++ b/docs/source/contributing.rst @@ -1,412 +1,38 @@ .. _contributing: -============ -Contributing -============ -We absolutely welcome any code contributions and we hope that this -guide will facilitate an understanding of the DPF-Core code -repository. 
It is important to note that while the DPF-Core software -package is maintained by Ansys and any submissions will be reviewed -thoroughly before merging, we still seek to foster a community that -can support user questions and develop new features to make this -software a useful tool for all users. As such, we welcome and -encourage any questions or submissions to this repository. +========== +Contribute +========== +Overall guidance on contributing to a PyAnsys repository appears in +`Contribute `_ +in the *PyAnsys Developer's Guide*. Ensure that you are thoroughly familiar +with this guide, paying particular attention to `Guidelines and Best Practices +`_, before attempting +to contribute to PyDPF-Core. + +The following contribution information is specific to PyDPF-Core. -Cloning the Source Repository ------------------------------ - -You can clone the source repository from `DPF-Core -GitHub `_ -and install the latest version in development mode by running: - -.. include:: pydpf-core_clone_install.rst - - -Questions ---------- -For general or technical questions about the project, its -applications, or about software usage, please create an issue at -`DPF-Core Issues `_ where the -community or DPF-Core developers can collectively address your -questions. To reach the project support team, -email `pyansys.support@ansys.com `_. - -By posting on the issues page, your question can be addressed by -community members with the needed expertise and the knowledge gained -will remain available on the issues page for other users. - - -Reporting Bugs --------------- -If you encounter any bugs or crashes while using DPF-Core, please -report it at `DPF-Core Issues `_ -with an appropriate label so we can promptly address it. When -reporting an issue, please be overly descriptive so that we may -reproduce it. Whenever possible, please provide tracebacks, -screenshots, and sample files to help us address the issue. 
- - -Feature Requests ----------------- -We encourage users to submit ideas for improvements to DPF-Core! -Please create an issue on the `DPF-Core Issues `_ -with a **Feature Request** label to suggest an improvement. -Please use a descriptive title and provide ample background information to help -the community implement that functionality. For example, if you would like a -reader for a specific file format, please provide a link to documentation of -that file format and possibly provide some sample files with screenshots to work -with. We will use the issue thread as a place to discuss and provide feedback. - - -Contributing New Code ---------------------- -If you have an idea for how to improve DPF-Core, consider first -creating an issue as a feature request which we can use as a -discussion thread to work through how to implement the contribution. - -Once you are ready to start coding, please see the `Development -Practices <#development-practices>`__ section for more details. - - -Licensing ---------- -All contributed code will be licensed under The MIT License found in -the repository. If you did not write the code yourself, it is your -responsibility to ensure that the existing license is compatible and -included in the contributed files or you can obtain permission from -the original author to relicense the code. - --------------- - -Development Practices ---------------------- -This section provides a guide to how we conduct development in the -DPF-Core repository. Please follow the practices outlined here when -contributing directly to this repository. - -Guidelines -~~~~~~~~~~ - -Consider the following general coding paradigms when contributing: - -1. Follow the `Zen of Python `__. As - silly as the core Python developers are sometimes, there's much to - be gained by following the basic guidelines listed in PEP 20. - Without repeating them here, focus on making your additions - intuitive, novel, and helpful for DPF-Core and its users. 
- - When in doubt, ``import this`` - -2. **Document it**. Include a docstring for any function, method, or - class added. Follow the `numpydocs docstring - `_ - guidelines, and always provide a for simple use cases for the new - features. - -3. **Test it**. Since Python is an interperted language, if it's not - tested, it's probably broken. At the minimum, include unit tests - for each new feature within the ``tests`` directory. Ensure that - each new method, class, or function has reasonable (>90%) coverage. - -Additionally, please do not include any data sets for which a license -is not available or commercial use is prohibited. - -Finally, please take a look at our `Code of Conduct `_ - - -Contributing to DPF-Core through GitHub ---------------------------------------- -To submit new code to DPF-Core, first fork the `DPF-Core GitHub Repo -`_ and then clone the forked -repository to your computer. Next, create a new branch based on the -`Branch Naming Conventions Section <#branch-naming-conventions>`__ in -your local repository. - -Next, add your new feature and commit it locally. Be sure to commit -often as it is often helpful to revert to past commits, especially if -your change is complex. Also, be sure to test often. See the `Testing -Section <#testing>`__ below for automating testing. - -When you are ready to submit your code, create a pull request by -following the steps in the `Creating a New Pull Request -section <#creating-a-new-pull-request>`__. - - -Creating a New Pull Request -~~~~~~~~~~~~~~~~~~~~~~~~~~~ -Once you have tested your branch locally, create a pull request on -`DPF-Core `_ and target your -merge to `master`. This will automatically run continuous -integration (CI) testing and verify your changes will work across all -supported platforms. - -For code verification, someone from the pyansys developers team will -review your code to verify your code meets our our standards. Once -approved, if you have write permission you may merge the branch. 
If -you don't have write permission, the reviewer or someone else with -write permission will merge the branch and delete the PR branch. - -Since it may be necessary to merge your branch with the current -release branch (see below), please do not delete your branch if it -is a ``fix/`` branch. - - -Branch Naming Conventions -~~~~~~~~~~~~~~~~~~~~~~~~~ -To streamline development, we have the following requirements for -naming branches. These requirements help the core developers know what -kind of changes any given branch is introducing before looking at the -code. - -- ``fix/``: any bug fixes, patches, or experimental changes that are - minor -- ``feat/``: any changes that introduce a new feature or significant - addition -- ``junk/``: for any experimental changes that can be deleted if gone - stale -- ``maint/``: for general maintenance of the repository or CI routines -- ``doc/``: for any changes only pertaining to documentation -- ``no-ci/``: for low impact activity that should NOT trigger the CI - routines -- ``testing/``: improvements or changes to testing -- ``release/``: releases (see below) - -Testing -~~~~~~~ -Periodically when making changes, be sure to test locally before -creating a pull request. The following tests will be executed after -any commit or pull request, so we ask that you perform the following -sequence locally to track down any new issues from your changes. - -To test the core API, be sure to have ANSYS 2021R1 or newer -installed. Next, install the testing requirements with: - -.. code:: - - pip install -r requirements/requirements_test.txt - -Run the primary test suite and generate a coverage report with: - -.. code:: - - pytest -v --cov ansys-dpf-core - -If you do not have DPF-Core installed locally, setup the following -environment variables to connect to a remote server. - -.. code:: - - export DPF_START_SERVER=False - export DPF_PORT=50054 - export DPF_IP= - -Or on windows: - -.. 
code:: - - set DPF_START_SERVER=False - set DPF_PORT=50054 - set DPF_IP= - -This will tell `ansys.dpf.core` to attempt to connect to the existing -DPF service by default rather than launching a new service. - - -Spelling and Code Style -~~~~~~~~~~~~~~~~~~~~~~~ - -If you are using Linux or Mac OS, run check spelling and coding style -with: - -.. code:: - - make - -Any misspelled words will be reported. You can add words to be -ignored to ``ignore_words.txt`` - - -Documentation -------------- -Documentation for DPF-Core is generated from three sources: - -- Docstrings from the classes, functions, and modules of ``ansys.dpf.core`` using `sphinx.ext.autodoc `_. -- Restructured test from `docs/` -- Examples from `examples/` - -General usage and API descriptions should be placed within `docs/` and -the docstrings. Full examples should be placed in `examples`. - - -Documentation Style and Organization -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -Docstrings should follow the `numpydocs docstring -`_ guidelines. -Documentation from `docs` use reStructuredText format. Examples -within the `examples/` directory should be PEP8 compliant and will be -compiled dynamically during the build process; ensure they run -properly locally as they will be verified through the continuous -integration performed on GitHub Actions. - - -Building the Documentation Locally -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -Documentation for DPF-Core is hosted at docs.pyansys.com and is -automatically built and deployed using the GitHub Actions. You can -build and verify the html documentation locally by install ``sphinx`` -and the other documentation build dependencies by running the -following from the DPF-Core source directory: +Clone the repository +-------------------- +To clone and install the latest version of PyDPF-Core in +development mode, run: .. code:: - pip install -r requirements/requirements_docs.txt - - -Next, if running Linux/Mac OS, build the documentation with - -.. 
code:: - - make -C docs html - -Otherwise, if running Windows, build the documentation by running - -.. code:: - - cd docs - make.bat html - -Upon the successful build of the documentation, you can open the local -build by opening ``index.html`` at ``docs/build/html/`` with -your browser. - -If you are running DPF remotely or through docker, see the `Testing Section <#testing>`__ for setting up the correct environment variables. - - -Continuous Integration and Continuous Delivery ----------------------------------------------- -The DPF-Core project uses continuous integration and delivery (CI/CD) -to automate the building, testing, and deployment tasks. The CI -Pipeline is deployed on both GitHub Actions and Azure Pipelines and -performs following tasks: - -- Module wheel build -- Core API testing -- Spelling and style verification -- Documentation build - - -Branching Model -~~~~~~~~~~~~~~~ -This project has a branching model that enables rapid development of -features without sacrificing stability, and closely follows the -`Trunk Based Development `_ approach. - -The main features of our branching model are: - -- The `master` branch is the main development branch. All features, - patches, and other branches should be merged here. While all PRs - should pass all applicable CI checks, this branch may be - functionally unstable as changes might have introduced unintended - side-effects or bugs that were not caught through unit testing. -- There will be one or many `release/` branches based on minor - releases (for example `release/0.2`) which contain a stable version - of the code base that is also reflected on PyPi/. Hotfixes from - `fix/` branches should be merged both to master and to these - branches. When necessary to create a new patch release these - release branches will have their `__version__.py` updated and be - tagged with a patched semantic version (e.g. `0.2.1`). 
This - triggers CI to push to PyPi, and allow us to rapidly push hotfixes - for past versions of ``ansys.dpf.core`` without having to worry about - untested features. -- When a minor release candidate is ready, a new `release` branch will - be created from `master` with the next incremented minor version - (e.g. `release/0.2`), which will be thoroughly tested. When deemed - stable, the release branch will be tagged with the version (`0.2.0` - in this case), and if necessary merged with master if any changes - were pushed to it. Feature development then continues on `master` - and any hotfixes will now be merged with this release. Older - release branches should not be deleted so they can be patched as - needed. - - -Minor Release Steps -~~~~~~~~~~~~~~~~~~~ -Minor releases are feature and bug releases that improve the -functionality and stability of ``DPF-Core``. Before a minor release is -created the following will occur: - -1. Create a new branch from the ``master`` branch with name - ``release/MAJOR.MINOR`` (e.g. `release/0.2`). - -2. Locally run all tests as outlined in the `Testing Section <#testing>`__ -and ensure all are passing. - -3. Locally test and build the documentation with link checking to make sure -no links are outdated. Be sure to run `make clean` to ensure no results are -cached. - - .. code:: - - cd docs - make clean # deletes the sphinx-gallery cache - make html -b linkcheck - -4. After building the documentation, open the local build and examine - the examples gallery for any obvious issues. - -5. Update the version numbers in ``ansys/dpf/core/_version.py`` and commit it. - Push the branch to GitHub and create a new PR for this release that - merges it to master. Development to master should be limited at - this point while effort is focused on the release. - -6. It is now the responsibility of the `DPF-Core` community and - developers to functionally test the new release. It is best to - locally install this branch and use it in production. 
Any bugs - identified should have their hotfixes pushed to this release - branch. - -7. When the branch is deemed as stable for public release, the PR will - be merged to master and the `master` branch will be tagged with a - `MAJOR.MINOR.0` release. The release branch will not be deleted. - Tag the release with: - - .. code:: - - git tag - git push origin --tags - - -8. Create a list of all changes for the release. It is often helpful - to leverage `GitHub's compare feature - `_ to see the - differences from the last tag and the `master` branch. Be sure to - acknowledge new contributors by their GitHub username and place - mentions where appropriate if a specific contributor is to thank - for a new feature. - -9. Place your release notes from step 8 in the description within - `DPF-Core Releases `_ - - -Patch Release Steps -~~~~~~~~~~~~~~~~~~~ -Patch releases are for critical and important bugfixes that can not or -should not wait until a minor release. The steps for a patch release + git clone https://github.com/pyansys/pydpf-core + cd pydpf-core + pip install -e . -1. Push the necessary bugfix(es) to the applicable release branch. - This will generally be the latest release branch - (e.g. `release/0.2`). -2. Update `__version__.py` with the next patch increment - (e.g. `0.2.1`), commit it, and open a PR that merge with the - release branch. This gives the `DPF-Core` developers and community - a chance to validate and approve the bugfix release. Any - additional hotfixes should be outside of this PR. +Post issues +----------- +Use the `PyDPF-Core Issues `_ +page to submit questions, report bugs, and request new features. -3. When approved, merge with the release branch, but not `master` as - there is no reason to increment the version of the `master` branch. - Then create a tag from the release branch with the applicable - version number (see above for the correct steps). +To reach the PyAnsys support team, email `pyansys.support@ansys.com `_. -4. 
If deemed necessary a release notes page.
+View documentation
+------------------
+Documentation for the latest stable release of PyDPF-Core is hosted at
+`PyDPF-Core Documentation `_.
diff --git a/docs/source/getting_started/compatibility.rst b/docs/source/getting_started/compatibility.rst
new file mode 100644
index 00000000000..acfed09d262
--- /dev/null
+++ b/docs/source/getting_started/compatibility.rst
@@ -0,0 +1,67 @@
+.. _ref_compatibility:
+
+=============
+Compatibility
+=============
+
+Operating system
+----------------
+
+DPF supports Windows 10 and CentOS 7 and later. For
+more information, see `Ansys Platform Support `_.
+
+Client-server
+-------------
+
+The DPF server version depends on your installed Ansys version.
+The following table shows client-server compatibility for supported
+Ansys versions. With Ansys 2021 R2 and later, you can use PyDPF-Core
+version 0.3.0 or later. With Ansys 2021 R1, you must use a PyDPF-Core 0.2
+version.
+
+As new features are developed, every attempt is made to ensure backward
+compatibility from the client to the server.
+
+The `ansys.grpc.dpf `_ package
+should also be synchronized with the server version.
+
+.. list-table:: Client-server compatibility
+   :widths: 20 20 20 20 20
+   :header-rows: 1
+
+   * - Ans.Dpf.Grpc.exe server version
+     - ansys.dpf.gatebin binaries Python module version
+     - ansys.dpf.gate Python module version
+     - ansys.grpc.dpf Python module version
+     - ansys.dpf.core Python module version
+   * - 4.0 (Ansys 2022 R2)
+     - 0.1.1
+     - 0.1.1
+     - 0.5.1
+     - 0.5.0 and later
+   * - 3.0 (Ansys 2022 R1)
+     - None
+     - None
+     - 0.4.0
+     - 0.4.0 and later
+   * - 2.0 (Ansys 2021 R2)
+     - None
+     - None
+     - 0.3.0
+     - 0.3.0 and later
+   * - 1.0 (Ansys 2021 R1)
+     - None
+     - None
+     - 0.2.2
+     - 0.2.*
+
+
+Environment variable
+--------------------
+
+The ``start_local_server`` method uses the ``Ans.Dpf.Grpc.bat`` file or
+``Ans.Dpf.Grpc.sh`` file to start the server.
Ensure that the ``AWP_ROOT{VER}``
+environment variable is set to your installed Ansys version. For example, if
+Ansys 2022 R2 is installed, ensure that the ``AWP_ROOT222`` environment
+variable is set to the path for this Ansys installation.
+
diff --git a/docs/source/getting_started/dependencies.rst b/docs/source/getting_started/dependencies.rst
new file mode 100644
index 00000000000..e94283486f5
--- /dev/null
+++ b/docs/source/getting_started/dependencies.rst
@@ -0,0 +1,30 @@
+.. _ref_dependencies:
+
+============
+Dependencies
+============
+
+Package dependencies
+--------------------
+
+DPF-Core dependencies are automatically checked when packages are
+installed. Package dependencies follow:
+
+- `ansys.dpf.gate `_, which is the gate
+  to the DPF C API or Python gRPC API. The gate depends on the server configuration:
+
+  - `ansys.grpc.dpf `_ is the gRPC code
+    generated from protobuf files.
+  - `ansys.dpf.gatebin `_ contains the
+    operating system-specific binaries with DPF C APIs.
+
+- `psutil `_
+- `tqdm `_
+- `packaging `_
+- `numpy `_
+
+Optional dependencies
+~~~~~~~~~~~~~~~~~~~~~
+
+For plotting, you can install these optional Python packages:
+
+- `matplotlib `_ for chart plotting
+- `pyvista `_ for 3D plotting
diff --git a/docs/source/getting_started/docker.rst b/docs/source/getting_started/docker.rst
index b2327bd5434..0e21eaef81c 100644
--- a/docs/source/getting_started/docker.rst
+++ b/docs/source/getting_started/docker.rst
@@ -1,70 +1,71 @@
 .. _ref_docker:

-************************
-Using DPF Through Docker
-************************
+=============
+DPF in Docker
+=============

-You can run DPF within a container on any OS using `Docker `_.
+On any operating system, you can run DPF in a containerized environment
+such as `Docker `_ or `Singularity `_.
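The ``AWP_ROOT{VER}`` environment variable convention described above can be checked with a short sketch. This is illustrative only; the ``find_awp_roots`` helper is hypothetical and not part of PyDPF-Core:

```python
import os

def find_awp_roots():
    """Return every AWP_ROOT* environment variable and its path.

    Unified Ansys installations set AWP_ROOT{VER} variables, for example
    AWP_ROOT222 for Ansys 2022 R2. The start_local_server method relies
    on one of these being present to locate Ans.Dpf.Grpc.bat or .sh.
    """
    return {k: v for k, v in os.environ.items() if k.startswith("AWP_ROOT")}

if __name__ == "__main__":
    print(find_awp_roots() or "No AWP_ROOT environment variable is set")
```

If the dictionary is empty, ``start_local_server`` has no installation to launch, and you must either set the variable or connect to a remote server.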
-Advantages of running DPF in a containerized environment, such -as Docker or `Singularity `_, include: +Advantages of using a containerized environment include: - Running in a consistent environment regardless of the host operating system - Offering portability and ease of install - Supporting large-scale cluster deployment using `Kubernetes `_ - Providing genuine application isolation through containerization -Installing the DPF Image ------------------------- -Using your GitHub credentials, you can download the Docker image in the -`DPF-Core GitHub `_ repository. +The following sections assume that you are going to run DPF in Docker. -If you have Docker installed, you can get started by authorizing Docker to -access this repository using a GitHub personal access token with -``packages read`` permissions. For more information, see -`_. +Install the DPF image +--------------------- -Save the token to a file: +#. Using your GitHub credentials, download the Docker image in the + `DPF-Core GitHub `_ repository. +#. If you have Docker installed, use a GitHub personal access token (PAT) with + ``packages read`` permission to authorize Docker to access this repository. + For more information, see `Creating a personal access token + `_. +#. Save the token to a file: -.. code:: + .. code:: - echo XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX > GH_TOKEN.txt + echo XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX > GH_TOKEN.txt -This lets you send the token to Docker without leaving the token value -in your history. + This lets you send the token to Docker without leaving the token value + in your history. -Next, authorize Docker to access the repository: +#. Authorize Docker to access the repository: -.. code:: - GH_USERNAME= - cat GH_TOKEN.txt | docker login docker.pkg.github.com -u $GH_USERNAME --password-stdin + .. 
code::
+   GH_USERNAME=
+   cat GH_TOKEN.txt | docker login docker.pkg.github.com -u $GH_USERNAME --password-stdin
-You can now launch DPF directly from Docker with a short script or
-directly from the command line.
-.. code::
+#. Launch DPF directly from Docker with a short script or directly from the command line:
-   docker run -it --rm -v `pwd`:/dpf -p 50054:50054 docker.pkg.github.com/pyansys/dpf-core/dpf:v2021.1
+   .. code::
+      docker run -it --rm -v `pwd`:/dpf -p 50054:50054 docker.pkg.github.com/pyansys/dpf-core/dpf:v2021.1
+
+Note that the preceding command shares the current directory with the ``/dpf``
 directory contained within the image. This is necessary as the DPF
 binary within the image must access the files within the image
-itself. Any files that you want to have DPF read must be placed in
-the ``pwd``. You can map other directories as needed, but these
+itself. Any files that you want to have DPF read must be placed in
+``pwd``. You can map other directories as needed, but these
 directories must be mapped to the ``/dpf`` directory for the server to
-see the files you want it to read.
+see the files that you want it to read.
+Use the DPF container from Python
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Using the DPF Container from Python
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Normally ``ansys.dpf.core`` attempts to start the DPF server at the first
-usage of a DPF class. If you do not have Ansys installed and simply want
+Normally PyDPF-Core attempts to start the DPF server on the first
+use of a DPF class. If you do not have Ansys installed and simply want
 to use the Docker image, you can override this behavior by connecting to the
-DPF server on the port you mapped:
+DPF server on a specified port:

.. 
code:: python
@@ -74,9 +75,10 @@ DPF server on the port you mapped:
     dpf_core.connect_to_server()
-If you want to avoid having to run ``connect_to_server`` at the start of
-every script, you can tell ``ansys.dpf.core`` to always attempt to
-connect to DPF running within the Docker image by setting environment variables:
+If you want to avoid having to run the ``connect_to_server()`` method
+at the start of every script, you can set environment variables to tell
+PyDPF-Core to always attempt to connect to DPF running within the Docker
+image:
On Linux:
@@ -93,11 +95,11 @@ On Windows:
     set DPF_PORT=50054
-The environment variable ``DPF_PORT`` is the port exposed from the
-DPF container. It should match the first value within the ``-p 50054:50054`` pair.
-
-The environment variable ``DPF_START_SERVER`` tells ``ansys.dpf.core`` not to start an
-instance but rather look for the service running at ``DPF_IP`` and
-``DPF_PORT``. If these environment variables are undefined, they
-default to 127.0.0.1 and 50054 for ``DPF_IP`` and ``DPF_PORT``
-respectively.
+- The ``DPF_PORT`` environment variable is the port exposed from the
+  DPF container. It should match the first value within the ``-p 50054:50054``
+  pair.
+- The ``DPF_START_SERVER`` environment variable tells PyDPF-Core not to start
+  an instance but rather use ``DPF_IP`` and ``DPF_PORT`` environment variables
+  to look for a running instance of the service. If the ``DPF_IP`` and ``DPF_PORT``
+  environment variables are undefined, they default to ``127.0.0.1`` and ``50054``,
+  respectively.
diff --git a/docs/source/getting_started/index.rst b/docs/source/getting_started/index.rst
index 2c7a8719f0b..68d74b40398 100755
--- a/docs/source/getting_started/index.rst
+++ b/docs/source/getting_started/index.rst
@@ -1,156 +1,20 @@
 .. _ref_getting_started:

===============
-Getting Started
+Getting started
===============

-Compatibility
-~~~~~~~~~~~~~
-DPF supports Windows 10 and CentOS 7 and later. 
For -more information, see `Ansys Platform Support `_. - -*************************** -Client Server Compatibility -*************************** - -The DPF server version depends on the Ansys installation version. -The PyDPF-Core client used must be compatible with it according to the table below. -Notice that starting with Ansys 2021 R2 one can use any PyDPF-Core >= 3.0. -Only Ansys 2021 R1 requires a specific version of PyDPF-Core (0.2.*). - -Future development will always try to ensure backward compatibility from the client to the server. - -The `ansys.grpc.dpf `_ module should also be synchronized -with the server version. - -.. list-table:: Client-Server Compatibility - :widths: 20 20 20 20 20 - :header-rows: 1 - - * - Ans.Dpf.Grpc.exe server version - - ansys.dpf.gatebin binaries python module version - - ansys.dpf.gate python module version - - ansys.grpc.dpf python module version - - ansys.dpf.core python module version - * - 4.0 (Ansys 2022R2) - - 0.1.1 - - 0.1.1 - - 0.5.1 - - >=0.5.0 - * - 3.0 (Ansys 2022R1) - - None - - None - - 0.4.0 - - >=0.4.0 - * - 2.0 (Ansys 2021R2) - - None - - None - - 0.3.0 - - >=0.3.0 - * - 1.0 (Ansys 2021R1) - - None - - None - - 0.2.2 - - 0.2.* - -To start a server with Ans.Dpf.Grpc.bat or Ans.Dpf.Grpc.sh (used in the `start_local_server` function), -please make sure that the environment variable `AWP_ROOT{VER}` with (VER=212, 221, ...) is set. - -Architecture -~~~~~~~~~~~~~ - DPF-Core is a Python gRPC client communicating with the ``Ans.Dpf.Grpc`` server. To use the native DPF server, you must have a local installation of -Ansys 2021 R1 or higher. For more information on getting a licensed copy of Ansys, +Ansys 2021 R1 or later. For more information on getting a licensed copy of Ansys, visit the `Ansys website `_. - -.. _getting_started: - -Installation -~~~~~~~~~~~~ - -.. include:: install.rst - - .. 
toctree:: :hidden: :maxdepth: 2 + compatibility + install + dependencies docker - - -Tryout Installation -~~~~~~~~~~~~~~~~~~~ - -For a quick tryout installation, use: - -.. code-block:: default - - from ansys.dpf.core import Model - from ansys.dpf.core import examples - model = Model(examples.simple_bar) - print(model) - - - -.. rst-class:: sphx-glr-script-out - - Out: - - .. code-block:: none - - DPF Model - ------------------------------ - Static analysis - Unit system: Metric (m, kg, N, s, V, A) - Physics Type: Mecanic - Available results: - - displacement: Nodal Displacement - - element_nodal_forces: ElementalNodal Element nodal Forces - - elemental_volume: Elemental Volume - - stiffness_matrix_energy: Elemental Energy-stiffness matrix - - artificial_hourglass_energy: Elemental Hourglass Energy - - thermal_dissipation_energy: Elemental thermal dissipation energy - - kinetic_energy: Elemental Kinetic Energy - - co_energy: Elemental co-energy - - incremental_energy: Elemental incremental energy - - structural_temperature: ElementalNodal Temperature - ------------------------------ - DPF Meshed Region: - 3751 nodes - 3000 elements - Unit: m - With solid (3D) elements - ------------------------------ - DPF Time/Freq Support: - Number of sets: 1 - Cumulative Time (s) LoadStep Substep - 1 1.000000 1 1 - - - -Dependencies -~~~~~~~~~~~~~ - -DPF-Core dependencies are automatically checked when packages are -installed. The package dependencies are: - -- `ansys.dpf.gate `_ (Gate to DPF C API or python - grpc API). 
Dependencies of gate are (and/or depending on the server configuration):
-  - `ansys.grpc.dpf `_ (gRPC code generated from
-    protobufs)
-  - `ansys.dpf.gatebin `_ (os specific binaries
-    with DPF C APIs)
-- `psutil `_
-- `tqdm `_
-- `packaging `_
-- `numpy `_
-
-Optional Dependencies
-~~~~~~~~~~~~~~~~~~~~~
-
-Optional package dependencies can be installed for specific usage:
-- `Matplotlib `_ for chart plotting
-- `PyVista `_ for 3D plotting
diff --git a/docs/source/getting_started/install.rst b/docs/source/getting_started/install.rst
index 450f871747a..549d609b5e7 100644
--- a/docs/source/getting_started/install.rst
+++ b/docs/source/getting_started/install.rst
@@ -1,37 +1,95 @@
 .. _installation:

-*********************
-Installation with PIP
-*********************
-Once Ansys 2021 R2 or later is installed, you can install PyDPF-Core with:
+************
+Installation
+************
+
+PIP installation
+----------------
+
+To use PyDPF-Core with Ansys 2021 R2 or later, install the latest version
+of PyDPF-Core with:

.. code::

    pip install ansys-dpf-core

-This installs the latest version of PyDPF-Core and all necessary
-dependencies.
-
-To use PyDPF-Core with Ansys 2021 R1, you must install PyDPF-Core with:
+To use PyDPF-Core with Ansys 2021 R1, install a PyDPF-Core version earlier
+than 0.3.0 with:

.. code::

    pip install ansys-dpf-core<0.3.0

-If you are unable to install this module on the host machine due to
-network isolation, download the latest or a specific release wheel at `PyDPF-Core
-GitHub `_ or from PyPi at
-`PyDPF-Core PyPi `_
+
+Wheel file installation
+-----------------------
+If you are unable to install PyDPF-Core on the host machine due to
+network isolation, download the latest wheel file or the wheel file
+for a specific release from `PyDPF-Core
+GitHub `_ or
+`PyDPF-Core PyPi `_.
+
+Tryout installation
+-------------------
+
+For a quick tryout installation, use:
+
+.. 
code-block:: default + + from ansys.dpf.core import Model + from ansys.dpf.core import examples + model = Model(examples.simple_bar) + print(model) -**************************************** -Editable Installation (Development Mode) -**************************************** -If you want to edit and potentially contribute to the DPF-Core -module, clone the repository and install it using pip with the ``-e`` +.. rst-class:: sphx-glr-script-out + + Out: + + .. code-block:: none + + DPF Model + ------------------------------ + Static analysis + Unit system: Metric (m, kg, N, s, V, A) + Physics Type: Mecanic + Available results: + - displacement: Nodal Displacement + - element_nodal_forces: ElementalNodal Element nodal Forces + - elemental_volume: Elemental Volume + - stiffness_matrix_energy: Elemental Energy-stiffness matrix + - artificial_hourglass_energy: Elemental Hourglass Energy + - thermal_dissipation_energy: Elemental thermal dissipation energy + - kinetic_energy: Elemental Kinetic Energy + - co_energy: Elemental co-energy + - incremental_energy: Elemental incremental energy + - structural_temperature: ElementalNodal Temperature + ------------------------------ + DPF Meshed Region: + 3751 nodes + 3000 elements + Unit: m + With solid (3D) elements + ------------------------------ + DPF Time/Freq Support: + Number of sets: 1 + Cumulative Time (s) LoadStep Substep + 1 1.000000 1 1 + + + +Development mode installation +----------------------------- + +If you want to edit and potentially contribute to DPF-Core, +clone the repository and install it using pip with the ``-e`` development flag: -.. include:: ../pydpf-core_clone_install.rst +.. code:: + + git clone https://github.com/pyansys/pydpf-core + cd pydpf-core + pip install -e . 
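The ``Wheel file installation`` section added above can also be scripted with standard ``pip`` commands. The following sketch is not part of the patch itself, and the ``./wheels`` directory name is an arbitrary choice:

```shell
# On a machine with internet access, download ansys-dpf-core
# and all of its dependencies as wheel files.
pip download ansys-dpf-core -d ./wheels

# Copy the ./wheels directory to the network-isolated host,
# then install from it without contacting PyPI.
pip install --no-index --find-links ./wheels ansys-dpf-core
```

`pip download` resolves and fetches the same set of wheels that `pip install` would, so the isolated host needs no further network access.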
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 1978e1460ca..9cf99227c9d 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -1,62 +1,61 @@
-================
-PyAnsys DPF-Core
-================
+==========
+PyDPF-Core
+==========
 The Data Processing Framework (**DPF**) provides numerical simulation
-users/engineers with a toolbox for accessing and transforming simulation
-data. It is used to handle complex pre- or post-processing of simulation
-data within a simulation workflow.
+users and engineers with a toolbox for accessing and transforming simulation
+data. With DPF, you can perform complex preprocessing or postprocessing of
+large amounts of simulation data within a simulation workflow.

-DPF is an independent, physics-agnostic tool that can be plugged into many
-applications for both data input and data output
-(result plots, visualization, and so on).
+DPF is an independent, physics-agnostic tool that you can plug into many
+apps for both data input and data output, including visualization and
+result plots. It can access data from solver result files and other neutral
+formats, such as CSV, HDF5, and VTK files.

-DPF can access data from solver result files and other neutral formats
-(for example, CSV, HDF5, and VTK). Various operators are available,
-allowing you to manipulate and transform this data.
-You can chain operators together to create simple or complex data-processing
-workflows that can be reused for repeated or future evaluations.
+Using the many DPF operators that are available, you can manipulate and
+transform this data. You can also chain operators together to create simple
+or complex data-processing workflows that you can reuse for repeated or
+future evaluations.

 The data in DPF is defined based on physics-agnostic mathematical quantities
-described in self-sufficient entities called fields. This allows DPF to be
+described in self-sufficient entities called *fields*. This allows DPF to be
 a modular and easy-to-use tool with a large range of capabilities.
-It is designed to handle large amounts of data.

 .. image:: images/drawings/dpf-flow.png
    :width: 670
    :alt: DPF FLow

-The module ``ansys.dpf.core`` provides a Python interface to the powerful
-DPF framework, enabling rapid postprocessing of a variety of Ansys file
-formats and physics solutions without ever leaving the Python environment.
+The ``ansys.dpf.core`` package provides a Python interface to DPF, enabling
+rapid postprocessing of a variety of Ansys file formats and physics solutions
+without ever leaving the Python environment.

-Brief Demo
+Brief demo
 ~~~~~~~~~~

 .. include:: _static/simple_example.rst

-See the :ref:`gallery` for detailed examples.
+For comprehensive demos, see :ref:`gallery`.

-Key Features
+Key features
 ~~~~~~~~~~~~

-**Computation Efficiency**
+**Computation efficiency**
 DPF is a modern framework based on new hardware architectures. Thanks to
 continued development, new capabilities are frequently added.

-**Generic Interface**
+**Generic interface**
 DPF is physics-agnostic, which means that its use is not limited to a
 particular field, physics solution, or file format.

-**Extensibility and Customization**
+**Extensibility and customization**
 DPF is developed around two core entities:

-- Data represented as a ``field``
-- An ``operator`` to act upon this data
+- Data represented as a *field*
+- An *operator* to act upon this data

 Each DPF capability is developed through operators that allow for
 componentization of the framework. Because DPF is plugin-based, new
@@ -64,9 +63,9 @@ features or formats can be easily added.

 .. toctree::
-   :maxdepth: 2
-   :caption: Getting Started
    :hidden:
+   :maxdepth: 2
+
    getting_started/index
    user_guide/index
diff --git a/docs/source/user_guide/how_to.rst b/docs/source/user_guide/how_to.rst
index 9ab4cff1766..bf68cf8cfb8 100644
--- a/docs/source/user_guide/how_to.rst
+++ b/docs/source/user_guide/how_to.rst
@@ -1,7 +1,7 @@
 .. _ref_how_to:

-How to
-~~~~~~
+How-tos
+~~~~~~~

 .. toctree::
    :hidden:
diff --git a/docs/source/user_guide/index.rst b/docs/source/user_guide/index.rst
index dff34940c96..f781b0de287 100644
--- a/docs/source/user_guide/index.rst
+++ b/docs/source/user_guide/index.rst
@@ -1,22 +1,22 @@
 .. _ref_user_guide:

 ==========
-User Guide
+User guide
 ==========
-PyDPF-Core is a Python client API for easily accessing DPF (Data Processing Framework)
+PyDPF-Core is a Python client API for accessing DPF (Data Processing Framework)
 postprocessing capabilities. The ``ansys.dpf.core`` package makes highly efficient
 computation, customization, and remote postprocessing accessible in Python.

-The purpose of this User Guide is:
+This section has the following goals:

- - To describe basic DPF concepts and terminology (``DPF concepts``).
+ - Describe basic DPF concepts, including terminology

- - To describe the most common DPF entities and how they can help you to access and modify solver data (``DPF Most used entities``).
+ - Describe the most-used DPF entities and how they can help you to access and modify solver data

- - To provide you simple ways to tackle the most common use cases (``How to``).
+ - Provide simple how-tos for tackling the most common use cases

-For more advanced API usage, refer to the :ref:`ref_api_section` section and to the :ref:`sphx_glr_examples` section.
+Other sections include :ref:`ref_api_section` and :ref:`sphx_glr_examples`.

 .. include:: dpf_concepts.rst
diff --git a/docs/source/user_guide/main_entities.rst b/docs/source/user_guide/main_entities.rst
index 3b206f81fef..bb3b6f0f4d1 100644
--- a/docs/source/user_guide/main_entities.rst
+++ b/docs/source/user_guide/main_entities.rst
@@ -1,6 +1,6 @@
 .. _ref_main_entities:

-DPF Most used entities
+DPF most-used entities
 ~~~~~~~~~~~~~~~~~~~~~~

 .. toctree::