diff --git a/doc/source/api_reference.rst b/doc/source/api_reference.rst index 06255eb1..7ddb3f8e 100644 --- a/doc/source/api_reference.rst +++ b/doc/source/api_reference.rst @@ -1,6 +1,6 @@ .. _api_reference: -API Reference +API reference ============= .. toctree:: diff --git a/doc/source/api_reference/data_types.rst b/doc/source/api_reference/data_types.rst index aee770b8..48cdf72f 100644 --- a/doc/source/api_reference/data_types.rst +++ b/doc/source/api_reference/data_types.rst @@ -1,4 +1,4 @@ -Data Types +Data types ========== .. automodule:: ansys.simai.core.data.types diff --git a/doc/source/api_reference/design_of_experiments.rst b/doc/source/api_reference/design_of_experiments.rst index ff07485f..d1c13a31 100644 --- a/doc/source/api_reference/design_of_experiments.rst +++ b/doc/source/api_reference/design_of_experiments.rst @@ -1,11 +1,12 @@ .. _design_of_experiments: -Design of Experiments +Design of experiments ===================== .. py:module:: ansys.simai.core.data.design_of_experiments -This collection contain methods allowing to export your Design of Experiments in different formats. +This module contains methods for exporting design of experiments +in different formats. .. autoclass:: DesignOfExperimentsCollection() :members: diff --git a/doc/source/api_reference/geometries.rst b/doc/source/api_reference/geometries.rst index d172cf53..689e024c 100644 --- a/doc/source/api_reference/geometries.rst +++ b/doc/source/api_reference/geometries.rst @@ -5,8 +5,8 @@ Geometries .. py:module:: ansys.simai.core.data.geometries -Geometries are the core of *SimAI Deep Learning powered predictions*. -A geometry is a 3D model and associated metadata managed by the SimAI platform. +Geometries are the core of *SimAI deep learning-powered predictions*. +A geometry is a 3D model and the associated metadata managed by the SimAI platform. .. 
_geometry_format: @@ -14,9 +14,11 @@ File format ----------- The input format for your workspace is described by the model manifest. -You can access that information for a specific workspace through :attr:`workspace.model.geometry` +You use the :attr:`workspace.model.geometry` +attribute to access the information for a specific workspace. -If you have a problem converting to the expected format, please contact us for more information at support-simai@ansys.com +If you have a problem converting to the expected format, contact the SimAI team +at `support-simai@ansys.com <mailto:support-simai@ansys.com>`_. Directory --------- diff --git a/doc/source/api_reference/optimizations.rst b/doc/source/api_reference/optimizations.rst index 0b2d7b7e..df1038a5 100644 --- a/doc/source/api_reference/optimizations.rst +++ b/doc/source/api_reference/optimizations.rst @@ -18,10 +18,10 @@ Model :members: :inherited-members: -TrialRun -======== +TrialRuns +========= -Trial runs are a single step of the optimization process +Each trial run is a single step of the optimization process. Directory --------- diff --git a/doc/source/api_reference/post_processings.rst b/doc/source/api_reference/post_processings.rst index 8b83b9f9..5eaf0cfe 100644 --- a/doc/source/api_reference/post_processings.rst +++ b/doc/source/api_reference/post_processings.rst @@ -1,7 +1,7 @@ .. _post_processings: -Post-Processings -================ +Postprocessings +=============== .. py:module:: ansys.simai.core.data.post_processings @@ -20,7 +20,7 @@ Model .. _pp_methods: -Nested Prediction Namespace +Nested prediction namespace --------------------------- .. autoclass:: PredictionPostProcessings() @@ -28,11 +28,14 @@ Nested Prediction Namespace .. _available_pp: -Available post-processings +Available postprocessings -------------------------- -.. note:: Depending on the capabilities of your model, some of these may not be available in your workspace. 
- You can check which ones are available through the :meth:`~ansys.simai.core.data.post_processings.PostProcessingDirectory.info` method +.. note:: + Depending on the capabilities of your model, some of these objects may not + be available in your workspace. You can use the + :meth:`~ansys.simai.core.data.post_processings.PostProcessingDirectory.info` method + to see which ones are available. .. autoclass:: GlobalCoefficients() :members: diff --git a/doc/source/api_reference/predictions.rst b/doc/source/api_reference/predictions.rst index 943db3aa..6b6e32b2 100644 --- a/doc/source/api_reference/predictions.rst +++ b/doc/source/api_reference/predictions.rst @@ -1,13 +1,17 @@ +.. _predictions: + Predictions =========== .. py:module:: ansys.simai.core.data.predictions -The Prediction module is in charge of running the *SimAI-powered -predictions* on the :py:class:`Geometries` you have uploaded. +The ``Prediction`` module is in charge of running the *SimAI-powered +predictions* on the :py:class:`geometries` +that you have uploaded. -It represents a numerical prediction with geometry and boundary conditions. -The arguments to :py:func:`predictions.run()` depend on your model. +A prediction represents a numerical prediction with geometry and boundary conditions. +The arguments to the :py:meth:`predictions.run()` method +depend on your model. .. code-block:: python @@ -15,13 +19,6 @@ The arguments to :py:func:`predictions.run()` depend on velocity = 10.0 prediction = geometry.run_prediction(Vx=velocity) -.. warning:: - In order to better describe different physical constraints, - the SimAI SDK has been updated to describe boundary conditions with a dict, - replacing the previous tuple of 3 numbers. - Please make sure to update your existing scripts, for instance from ``(3.4, 0, 0)`` to ``dict(Vx=3.4)``; - there is no need to put ``Vy=0`` and ``Vz=0`` anymore if your project has not been trained on those velocities. 
- In method calls, boundary conditions can be passed directly as arguments: ``run_prediction(Vx=3.4)``. Directory --------- diff --git a/doc/source/api_reference/selections.rst b/doc/source/api_reference/selections.rst index 5b61a587..34b904bc 100644 --- a/doc/source/api_reference/selections.rst +++ b/doc/source/api_reference/selections.rst @@ -1,21 +1,20 @@ .. _selections: Selections -=========== +========== -Selections Basics ------------------ +Selection basics +---------------- .. py:module:: ansys.simai.core.data.selections -The Selection class allows you to run a large number of operations in parallel, -by manipulating whole collections of SimAI models +The :class:`Selection` class allows you +to run a large number of operations in parallel by manipulating whole collections of SimAI models (:class:`Geometries `, -:class:`Predictions `, -:py:class:`Post-Processings `). +:class:`Predictions `, and +:py:class:`Post-Processings ` instances). -A Selection is created by combining a list of -:class:`Geometries ` with a list of -:class:`~ansys.simai.core.data.types.BoundaryConditions`. +You create a selection by combining a list of :class:`Geometry ` +instances with a list of :class:`~ansys.simai.core.data.types.BoundaryConditions` instances: .. code-block:: python @@ -26,12 +25,13 @@ A Selection is created by combining a list of selection = Selection(geometries, boundary_conditions) -The resulting selection contains all possible combinations between -the geometries and boundary conditions. Each of those combinations is a -:class:`~ansys.simai.core.data.selections.Point`, which can be viewed as a potential -:class:`~ansys.simai.core.data.predictions.Prediction`. +The resulting selection contains all possible combinations between the geometries and +boundary conditions. Each of those combinations is a :class:`~ansys.simai.core.data.selections.Point` +instance, which can be viewed as a potential :class:`~ansys.simai.core.data.predictions.Prediction` +instance. 
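The combination behavior described above — every geometry paired with every set of boundary conditions, each pair being one potential prediction — can be sketched in plain Python. This is an illustrative stand-in using dicts and `itertools.product`, not the SDK's actual `Point` implementation:

```python
from itertools import product

# Hypothetical stand-ins for SDK objects: two geometries and two
# sets of boundary conditions.
geometries = [{"name": "cube"}, {"name": "sphere"}]
boundary_conditions = [{"Vx": 10.0}, {"Vx": 12.5}]

# A selection contains every (geometry, boundary conditions) pair;
# each pair is one "point", that is, one potential prediction.
points = [
    {"geometry": g, "boundary_conditions": bc}
    for g, bc in product(geometries, boundary_conditions)
]

print(len(points))  # 2 geometries x 2 boundary conditions = 4 points
```

Running predictions over such a selection then amounts to iterating over `points`, which is why the number of predictions grows multiplicatively with the sizes of the two input lists.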
-At first all predictions may not all exist, they can be run with the :func:`~Selection.run_predictions` method: +At first, all predictions may not exist. However, you can use the :meth:`~Selection.run_predictions` +method to run them: .. code-block:: python @@ -41,10 +41,11 @@ At first all predictions may not all exist, they can be run with the :func:`~Sel all_predictions = selection.predictions -Selections API Reference +Selection API reference ------------------------ -In essence a :class:`~ansys.simai.core.data.selections.Selection` is a collection of :class:`points `. +In essence, a :class:`~ansys.simai.core.data.selections.Selection` instance is a +collection of :class:`points ` instances. .. autoclass:: Point() :members: @@ -56,12 +57,13 @@ In essence a :class:`~ansys.simai.core.data.selections.Selection` is a collectio :inherited-members: -Post-processing Basics +Postprocessing basics ---------------------- -The :attr:`~Selection.post` namespace allows you to run and access all post-processings for existing predictions. -Please see :py:class:`~ansys.simai.core.data.selection_post_processings.SelectionPostProcessingsMethods` -for available post-processings. +The :attr:`~Selection.post` namespace allows you to run and access all postprocessings +for existing predictions. For available postprocessings, see the +:py:class:`~ansys.simai.core.data.selection_post_processings.SelectionPostProcessingsMethods` +class. .. code-block:: python @@ -70,10 +72,10 @@ for available post-processings. coeffs.data # is a list of results of each post-processings. 
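The CSV export discussed in this section arrives as a ZIP archive containing several CSV files. As a self-contained sketch — built around an in-memory archive with a hypothetical file name rather than a real SimAI export — reading one of those files back with Python's ``zipfile`` and ``csv`` modules looks like this:

```python
import csv
import io
import zipfile

# Build a small in-memory ZIP archive standing in for a SimAI CSV export.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr("Geometries.csv", "name,SINK\ncube,-5.0\nsphere,-4.9\n")

# Read a CSV file straight out of the archive, without extracting it to disk.
with zipfile.ZipFile(buffer) as archive:
    with archive.open("Geometries.csv") as f:
        rows = list(csv.DictReader(io.TextIOWrapper(f, encoding="utf-8")))

print(rows[0]["name"])  # -> cube
```

The same `archive.open(...)` handle can be passed to `pandas.read_csv` directly, which is what the documented example does.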
-Results for exportable post-processings +You can use the :meth:`~ansys.simai.core.data.lists.ExportablePPList.export()` +method to export results in batch for exportable postprocessings (:py:class:`~ansys.simai.core.data.post_processings.GlobalCoefficients` -and :py:class:`~ansys.simai.core.data.post_processings.SurfaceEvol`) -can be exported in batch with the :func:`~ansys.simai.core.data.lists.ExportablePPList.export()` method: +and :py:class:`~ansys.simai.core.data.post_processings.SurfaceEvol` instances): .. code-block:: python @@ -81,8 +83,9 @@ can be exported in batch with the :func:`~ansys.simai.core.data.lists.Exportable "/path/to/file.xlsx" ) -Please note that the ``csv`` export generates a zip archive containing multiple csv files. -They can be read directly with python by using zipfile: +Note that a CSV export generates a ZIP file containing multiple CSV files. +You can read them directly using Python's ``zipfile`` +module: .. code-block:: python @@ -101,7 +104,7 @@ They can be read directly with python by using zipfile: df_geom = pd.read_csv(archive.open("Geometries.csv")) -Binary post-processings results can be downloaded by looping on the list, for instance: +You can download binary postprocessing results by looping over the list: .. code-block:: python @@ -109,7 +112,7 @@ Binary post-processings results can be downloaded by looping on the list, for in vtu.data.download(f"/path/to/vtu_{vtu.id}") -Post-processing API Reference +Postprocessing API reference ----------------------------- .. py:module:: ansys.simai.core.data.selection_post_processings diff --git a/doc/source/api_reference/training_data.rst b/doc/source/api_reference/training_data.rst index 50055e66..2d5c3c39 100644 --- a/doc/source/api_reference/training_data.rst +++ b/doc/source/api_reference/training_data.rst @@ -1,12 +1,13 @@ .. _training_data: -Training Data -============= +TrainingData +============ .. 
py:module:: ansys.simai.core.data.training_data -A TrainingData is a collection of :class:`parts` representing -a prediction that can then be used as input for the training of models. +A :class:`TrainingData` instance is a +collection of :class:`TrainingDataPart` +instances representing a prediction that can be used as input for the training of models. Directory --------- diff --git a/doc/source/api_reference/training_data_parts.rst b/doc/source/api_reference/training_data_parts.rst index ca2ca43e..6df79dc1 100644 --- a/doc/source/api_reference/training_data_parts.rst +++ b/doc/source/api_reference/training_data_parts.rst @@ -1,11 +1,13 @@ .. _training_data_parts: -Training Data Part -================== +TrainingDataParts +================= .. py:module:: ansys.simai.core.data.training_data_parts -A TrainingDataPart is a singular file, part of a :class:`~ansys.simai.core.data.training_data.TrainingData`. +A :class:`TrainingDataPart` instance +is a single file that is part of a :class:`~ansys.simai.core.data.training_data.TrainingData` +instance. Directory --------- diff --git a/doc/source/api_reference/workspaces.rst b/doc/source/api_reference/workspaces.rst index 6ddc1997..1756682b 100644 --- a/doc/source/api_reference/workspaces.rst +++ b/doc/source/api_reference/workspaces.rst @@ -5,11 +5,11 @@ Workspaces .. py:module:: ansys.simai.core.data.workspaces -Workspaces are a set of specific geometries, predictions and post-processings. +A workspace is a set of specific geometries, predictions, and postprocessings. Each workspace uses a specific kernel. -To set which workspace the client is configured for, please refer to -:meth:`SimAIClient.set_current_workspace() method` +You use the :meth:`SimAIClient.set_current_workspace()` +method to set the workspace that the client is configured for. 
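The training-data structure described in this section — a ``TrainingData`` record made up of one ``TrainingDataPart`` per file — can be sketched with plain ``pathlib``. All names here are illustrative stand-ins, not the SDK's classes:

```python
import tempfile
from pathlib import Path

# A throwaway folder with a few files standing in for simulation outputs.
folder = Path(tempfile.mkdtemp())
for name in ("volume.vtu", "surface.vtp", "metadata.json"):
    (folder / name).write_text("placeholder")

# A hypothetical TrainingData-like record: one "part" per file in the folder.
training_data = {
    "name": "my-first-data",
    "parts": sorted(p.name for p in folder.iterdir() if p.is_file()),
}

print(training_data["parts"])  # -> ['metadata.json', 'surface.vtp', 'volume.vtu']
```

This mirrors what a folder upload conceptually produces: each file in the folder becomes one part of the training-data record.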
Directory --------- diff --git a/doc/source/index.rst b/doc/source/index.rst index ead8b911..ea5af0b2 100644 --- a/doc/source/index.rst +++ b/doc/source/index.rst @@ -12,34 +12,34 @@ PySimAI documentation Release v\ |version| (:ref:`Changelog `) -The PySimAI library is a Python library for the Ansys SimAI API. -With it you can manage and access your data on the platform from within Python applications and scripts. +PySimAI is part of the `PyAnsys `_ ecosystem that allows you to use SimAI within +a Python environment of your choice in conjunction with other PyAnsys libraries and external Python +libraries. With PySimAI, you can manage and access your data on the platform from within Python apps and +scripts. -What is PySimAI ? -================= +Requirements +============ -PySimAI is part of the `PyAnsys `_ ecosystem that let's you use SimAI within a Python environment of your choice in conjunction with other PyAnsys libraries and external Python libraries. +PySimAI requires Python 3.9 or later. -Install -======= +Installation +================ -PySimAI requires **Python >= 3.9** - -SDK Installation -++++++++++++++++ - -Install the SDK with the following command: +Install PySimAI with this command: .. code-block:: bash pip install ansys-simai-core --upgrade -This same command can be used every time you want to update the PySimAI library. +Use this same command every time you want to update PySimAI. + +.. _getting_started: Getting started =============== -The :class:`~ansys.simai.core.client.SimAIClient` is the core of the SDK, all operations are made through it. +The :class:`~ansys.simai.core.client.SimAIClient` class is the core of PySimAI. +All operations are made through it. .. code-block:: python @@ -47,11 +47,15 @@ The :class:`~ansys.simai.core.client.SimAIClient` is the core of the SDK, all op simai = SimAIClient() -You will be prompted for your credentials. -Alternative ways to authenticate are described in the :ref:`configuration section`. 
+You are prompted for your credentials. + +.. note:: + You can also start a :class:`~ansys.simai.core.client.SimAIClient` instance + from a configuration file. For more information, see :ref:`configuration`. -Using the :class:`~ansys.simai.core.client.SimAIClient`, -you can now :meth:`~ansys.simai.core.data.geometries.GeometryDirectory.upload` your first geometry. +Once the :class:`~ansys.simai.core.client.SimAIClient` instance is created, +you use the :meth:`~ansys.simai.core.data.geometries.GeometryDirectory.upload` +method to upload a geometry: .. code-block:: python @@ -64,34 +68,39 @@ you can now :meth:`~ansys.simai.core.data.geometries.GeometryDirectory.upload` y }, ) -To learn more about what geometries are and how they should be formatted, see the :ref:`geometry_format` section. +To learn more about what geometries are and how they should be formatted, see +:ref:`geometries`. -You can then run a prediction on your geometry: +You use the geometry's :meth:`~ansys.simai.core.data.geometries.Geometry.run_prediction` +method to run a prediction on it: .. code-block:: python prediction = geometry.run_prediction(boundary_conditions=dict(Vx=10.0)) -You can now analyse the prediction by :class:`post-processing` it. +The :class:`PredictionPostProcessing` +class provides for postprocessing and analyzing your prediction data. -Run or get a post-processing through the :attr:`~ansys.simai.core.data.predictions.Prediction.post` attribute of the prediction. +You use the :attr:`~ansys.simai.core.data.predictions.Prediction.post` +attribute of the prediction to run the postprocessing and access its data: .. code-block:: python - # Run the post-processing + # Run postprocessing global_coefficients = prediction.post.global_coefficients() # Access its data print(global_coefficients.data) ..
note:: - Depending on the post-processing :attr:`~ansys.simai.core.data.post_processings.PostProcessing.data` - will return a dict or a :class:`~ansys.simai.core.data.post_processings.DownloadableResult`. + Depending on the postprocessing, the :attr:`~ansys.simai.core.data.post_processings.PostProcessing.data` + attribute returns either a dictionary or a :class:`~ansys.simai.core.data.post_processings.DownloadableResult` + object. +For more information, see :ref:`post_processings`. -You can learn more about the available post-processings :ref:`here`. +You're all set. You can now learn about more advanced concepts, such as starting the +SimAI client from a :ref:`configuration file`, :ref:`exploring your data`, +and :ref:`best practices`. -You're all set: you can now look into more advanced concepts like :ref:`configuring your SDK`, -:ref:`data exploration` or :ref:`best practices`. -If you want to explore the functions and methods available to you in the SDK, -you can head over to the :ref:`API reference` section. +To explore the functions and methods available to you, see :ref:`api_reference`. diff --git a/doc/source/user_guide/best_practices.rst b/doc/source/user_guide/best_practices.rst index 5e42827e..f275fdcb 100644 --- a/doc/source/user_guide/best_practices.rst +++ b/doc/source/user_guide/best_practices.rst @@ -1,24 +1,28 @@ .. _best_practices: -Best Practices +Best practices ============== Asynchronicity -------------- -While the SDK doesn't use async/await mechanics, it is somewhat asynchronous in nature: -uploading geometries is a blocking method but running a prediction or a post-processing will -return the created object immediately before the result is computed on the servers or available locally. -This behavior makes it possible to request multiple computations to be ran on the -SimAI platform without waiting for any of the data to be available. +While the SimAI client doesn't use async/await mechanics, it is somewhat asynchronous in nature. 
+While uploading geometries is a blocking method, running a prediction or a postprocessing returns +the created object immediately, before the result is computed on the servers or available locally. +This behavior makes it possible to request that multiple computations be run on the SimAI platform +without waiting for any of the data to be available. -To wait for an object to be fully available, you can call the ``wait()`` method on the object -(for example :meth:`Prediction.wait()`) or you can call the global -:meth:`SimAIClient.wait()` method to wait for all requests to be complete. -Alternatively you can try to access the object's data in which case the SDK will automatically wait for the data to be ready if needed. +To wait for an object to be fully available, you can call the ``wait()`` method on the object. +For example, you can call the :meth:`Prediction.wait()` +method on a prediction. Or, you can call the global :meth:`SimAIClient.wait()` +method to wait for all requests to complete. -Because of this behavior, it is recommended when running a large number of computations to send all the -requests before accessing any of the data. +Alternatively, you can try to access the object's data, in which case the SimAI client automatically +waits for the data to be ready if needed. Because of this behavior, when running a large number of +computations, you should send all requests before accessing any of the data. + +This example requests the predictions and postprocessings sequentially, which requires waiting +for the data to be available and used before requesting the next one. .. code-block:: python :name: sequential-way @@ -33,14 +37,14 @@ requests before accessing any of the data. 
# Run prediction pred = geom.run_prediction(Vx=vx) # Request global coefficients postprocessing - # Since we're accessing the data, this will wait for the computation to finish + # Because you are accessing the data, you must wait for the computation to finish coeffs = pred.post.global_coefficients().data # do something with the data print(coeffs) -In the previous example, the predictions and post-processings will be requested sequentially, waiting for the data -to be available and used before requesting the next one. -Thus a more efficient way would be as follows: + +This more efficient example requests all the predictions and postprocessings right away +and then processes the data once they are all available. .. code-block:: python :name: requests-first @@ -56,7 +60,7 @@ Thus a more efficient way would be as follows: # Run prediction pred = geom.run_prediction(Vx=vx) # Request global coefficients postprocessing - # Since we're not accessing the data, this will not block + # Because you are not accessing the data, you are not blocked pred.post.global_coefficients() predictions.append(pred) @@ -65,5 +69,3 @@ Thus a more efficient way would be as follows: # do something with the data print(pred.post.global_coefficients().data) -In this example, all the predictions and post-processings are requested right away and the -data crunching will happen once all of it is available. diff --git a/doc/source/user_guide/config_file.rst b/doc/source/user_guide/config_file.rst index 11c2ac35..16cd5271 100644 --- a/doc/source/user_guide/config_file.rst +++ b/doc/source/user_guide/config_file.rst @@ -1,10 +1,11 @@ .. 
_config_file: -Configuration File +Configuration file ================== -To create a client from a configuration file, you can use the -:py:meth:`~ansys.simai.core.client.SimAIClient.from_config` function : +To create a :class:`~ansys.simai.core.client.SimAIClient` +instance from a configuration file, you use the +:py:meth:`~ansys.simai.core.client.SimAIClient.from_config` method: .. code-block:: python @@ -13,35 +14,34 @@ To create a client from a configuration file, you can use the Location -------- -If no ``path`` is given, the ``SimAIClient`` will look at default locations. -These locations differ according to your operating system: +If no path is given, the :class:`~ansys.simai.core.client.SimAIClient` +instance looks at default locations. These locations differ according to +your operating system. -* Linux/MacOS: +**Linux/MacOS** - For UNIX systems the default locations are, in order : +For UNIX systems, the default locations are, in order: - * ``$XDG_CONFIG_HOME/ansys_simai.conf`` - * ``$XDG_CONFIG_HOME/ansys/simai.conf`` - * ``~/.ansys_simai.conf`` - * ``~/.ansys/simai.conf`` - * ``/etc/ansys_simai.conf`` - * ``/etc/ansys/simai.conf`` +* ``$XDG_CONFIG_HOME/ansys_simai.conf`` +* ``$XDG_CONFIG_HOME/ansys/simai.conf`` +* ``~/.ansys_simai.conf`` +* ``~/.ansys/simai.conf`` +* ``/etc/ansys_simai.conf`` +* ``/etc/ansys/simai.conf`` - .. note :: +.. note :: - Only the first one found will be used. + The first location found is used. ``$XDG_CONFIG_HOME`` defaults to ``~/.config``. - ``$XDG_CONFIG_HOME`` defaults to ``~/.config``. 
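The first-match-wins lookup over the UNIX candidate locations listed above is easy to express directly. This sketch is a simplified stand-in for whatever the client actually does internally; only the search order is taken from the documentation:

```python
import os
from pathlib import Path


def find_config(candidates=None):
    """Return the first existing configuration file, or None if none exists."""
    xdg = Path(os.environ.get("XDG_CONFIG_HOME", Path.home() / ".config"))
    if candidates is None:
        candidates = [
            xdg / "ansys_simai.conf",
            xdg / "ansys" / "simai.conf",
            Path.home() / ".ansys_simai.conf",
            Path.home() / ".ansys" / "simai.conf",
            Path("/etc/ansys_simai.conf"),
            Path("/etc/ansys/simai.conf"),
        ]
    # Only the first location found is used.
    for path in candidates:
        if path.is_file():
            return path
    return None
```

Note how ``$XDG_CONFIG_HOME`` falls back to ``~/.config`` when the environment variable is unset, matching the documented default.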
+**Windows XP** -* For Windows XP : +* ``C:\Documents and Settings\\Local Settings\Application Data\Ansys\simai.conf`` - * ``C:\Documents and Settings\\Local Settings\Application Data\Ansys\simai.conf`` +**Windows 7 to 11** -* For Windows 7 to 11: +* ``C:\Users\\AppData\Roaming\Ansys\simai.conf`` - * ``C:\Users\\AppData\Roaming\Ansys\simai.conf`` - -Optionally you can specify the path yourself: +Optionally, you can specify the path yourself: .. code-block:: python @@ -50,13 +50,13 @@ Optionally you can specify the path yourself: Content ------- -The configuration file is written in `TOML `_. -Any parameter used to configure the :class:`~ansys.simai.core.client.SimAIClient` can -be be passed from the configuration file. +You write the configuration file in `TOML `_. +From this file, you can pass parameters for configuring +the :class:`~ansys.simai.core.client.SimAIClient` instance. -Example : -""""""""" +Example +""""""" .. code-block:: TOML @@ -69,11 +69,11 @@ Example : totp_enabled = true -Proxy : -""""""" +Proxy +""""" -If your network is situated behind a proxy, then you will need to add its address -in a `https_proxy` key in the `[default]` block: +If your network is situated behind a proxy, you must add its address +in a ``https_proxy`` key in the ``[default]`` block: .. code-block:: TOML @@ -84,9 +84,8 @@ in a `https_proxy` key in the `[default]` block: Profiles -------- -The SDK supports having multiple configurations in a single file through profiles. - -Profiles can be loaded like so : +The :class:`~ansys.simai.core.client.SimAIClient` instance supports having multiple +configurations in a single file through profiles, which are loaded like this: .. code-block:: TOML diff --git a/doc/source/user_guide/configuration.rst b/doc/source/user_guide/configuration.rst index 23c76f58..8c2a0fdb 100644 --- a/doc/source/user_guide/configuration.rst +++ b/doc/source/user_guide/configuration.rst @@ -1,17 +1,14 @@ .. _configuration: .. 
py:module:: ansys.simai.core.utils.configuration -Client Configuration +Client configuration ==================== Where to start -------------- -You can start by creating an `SimAIClient`, you will -be prompted for any missing parameter (see :ref:`getting started`). - -You can then start configuring the :class:`~ansys.simai.core.client.SimAIClient` -by passing the required parameters on client creation, like so: +You start by creating a :class:`~ansys.simai.core.client.SimAIClient` +instance: .. code-block:: python @@ -19,15 +16,19 @@ by passing the required parameters on client creation, like so: simai = ansys.simai.core.SimAIClient(organization="my-company") -Once you understand how this works, we recommend looking into the SimAI -:ref:`configuration file`. +As demonstrated in the preceding code, you configure the instance by +passing the required parameters on client creation. You are prompted +for any missing parameters. + +Once you understand how creating an instance works, you can look into using a +:ref:`configuration file` for creating a client instance. -Available options ------------------ +Configuration options +--------------------- -All of the configuration variables for :class:`~ansys.simai.core.client.SimAIClient` -are documented in the following class: +Descriptions follow of all configuration options for the :class:`~ansys.simai.core.client.SimAIClient` +class: .. autopydantic_model:: ClientConfig :model-show-config-summary: False @@ -37,14 +38,12 @@ are documented in the following class: Credentials -+++++++++++ - -To use SimAI API your SDK needs to be authenticated. +----------- -By default, :class:`~ansys.simai.core.client.SimAIClient` will prompt you to log in -via your web browser. - -You can also pass your credentials as parameters on client creation, like so: +To use the SimAI API, your :class:`~ansys.simai.core.client.SimAIClient` +instance must be authenticated. By default, you are prompted to log in +via your web browser. 
However, you can pass your credentials as parameters +on client creation: .. code-block:: python @@ -58,6 +57,12 @@ You can also pass your credentials as parameters on client creation, like so: }, ) +Credential options +------------------ + +Descriptions follow of all credential options for the :class:`~ansys.simai.core.client.SimAIClient` +class: + .. autopydantic_model:: Credentials :model-show-config-summary: False :model-show-validator-summary: False diff --git a/doc/source/user_guide/data_exploration.rst b/doc/source/user_guide/data_exploration.rst index b2aba6c5..46dade8f 100644 --- a/doc/source/user_guide/data_exploration.rst +++ b/doc/source/user_guide/data_exploration.rst @@ -1,19 +1,18 @@ .. _data_exploration: -Data Exploration +Data exploration ================ -The SimAI SDK provides utilities to help you run a large amount of predictions and post-processings, -explore your data and gather insights from it. +The SimAI client provides utilities to help you run a large number of predictions and +postprocessings, explore your data, and gather insights from it. Selections ---------- -:class:`Selections` enable you to manipulate a large number of -geometries and boundary conditions simultaneously. -It allows you to easily run many predictions or post-processings in parallel. +:ref:`selections` enable you to manipulate a large number of geometries and boundary conditions +simultaneously. They also allow you to easily run many predictions or postprocessings in parallel. -A Selection is created by combining a list of geometries with a list of boundary conditions. +You create a selection by combining a list of geometries with a list of boundary conditions: .. 
code-block:: python @@ -37,16 +36,26 @@ A Selection is created by combining a list of geometries with a list of boundary for global_coefficients in selection.post.global_coefficients() ] -You can dive deeper in the :ref:`selections page` +To help build selections, the SimAI client exposes two methods that are useful for +different strategies: + +- The :meth:`geometry.sweep` method, which + is described in :ref:`sweeping`. +- The :meth:`GeometryDirectory.list` + method, which is described in :ref:`filtering_geometries`. + +For more information on selections and geometry exploration methods, see :ref:`selections` +and :ref:`geometries`. + +.. _sweeping: Sweeping -------- -To help building selections, the SimAIClient exposes two methods that are useful for different strategies: -The :meth:`geometry.sweep` method aims to explore the surroundings -of a given geometry and can help with local optimization or gradient descent. -It finds geometries which have metadata closest to the candidate geometry -(only for numerical metadata). +The :meth:`geometry.sweep` method allows you +to explore the surroundings of a given geometry, which can help with local optimization or +gradient descent. This method, which works only with numerical metadata, finds geometries whose +metadata is closest to the candidate geometry. .. code-block:: python @@ -56,11 +65,14 @@ It finds geometries which have metadata closest to the candidate geometry # with which a selection can be built: selection = Selection(neighbour_geometries, [dict(Vx=13.4)]) +.. _filtering_geometries: + Filtering geometries -------------------- -The :meth:`GeometryDirectory.list` method enables to take a -more brute-force approach, by allowing to select large swaths of geometries with range filters. +The :meth:`GeometryDirectory.list` method +allows you to take a more brute-force approach. With this method, you can select large swaths of +geometries with range filters. ..
code-block:: python @@ -68,4 +80,3 @@ more brute-force approach, by allowing to select large swaths of geometries with geometries = simai.geometries.list(filters={"SINK": Range(-5.1, -4.8)}) -Geometry exploration methods are described in the :ref:`geometries page` diff --git a/doc/source/user_guide/proxy.rst b/doc/source/user_guide/proxy.rst index b11630d8..dc7a1f11 100644 --- a/doc/source/user_guide/proxy.rst +++ b/doc/source/user_guide/proxy.rst @@ -3,12 +3,13 @@ Working behind a proxy ====================== -By default, the SDK will attempt to get your proxy configuration from your system if there is any. +By default, the SimAI client attempts to get your proxy configuration, if any, from your system. -SDK configuration -+++++++++++++++++ +SimAI client configuration +++++++++++++++++++++++++++ -You can manually set a proxy for the SDK when creating an :ref:`SimAIClient`, like so : +You can manually set a proxy when creating the :ref:`SimAIClient` +instance: .. code-block:: python @@ -18,28 +19,30 @@ You can manually set a proxy for the SDK when creating an :ref:`SimAIClient`. -Note that setting this parameter will override the default configuration retrieved from your system. +.. note:: + Setting this parameter overrides the default configuration retrieved from your system. Troubleshooting ~~~~~~~~~~~~~~~ -In case you get an error or the type ``ProxyError([...], SSLCertVerificationError([...]``, -it is likely that your proxy setup looks like ``|computer|<-https->|proxy|<-https->|internet|``, -but the proxy is not trusted by your computer (your web browser uses a -`special configuration `__). +If you get an error of the type ``ProxyError([...], SSLCertVerificationError([...]``, +it is likely that your proxy setup looks like ``|computer|<-https->|proxy|<-https->|internet|`` +but the proxy is not trusted by your computer. Your web browser works because it uses a special +`proxy auto-configuration `_ file. -To fix this: +To fix the issue: -1. 
Extract the certificates used by your company-configured browser on ``https://simai.ansys.com`` +1. Extract the certificates used by your company-configured browser on ``https://simai.ansys.com``. 2. Set the ``REQUESTS_CA_BUNDLE`` environment variable: - .. code:: python + .. code:: python - import os - from pathlib import Path + import os + from pathlib import Path - os.environ["REQUESTS_CA_BUNDLE"] = Path( - "~/Downloads/ansys-simai-chain.pem" - ).expanduser() - client = ansys.simai.core.from_config() + os.environ["REQUESTS_CA_BUNDLE"] = Path( + "~/Downloads/ansys-simai-chain.pem" + ).expanduser() + client = ansys.simai.core.from_config() diff --git a/doc/source/user_guide/training.rst b/doc/source/user_guide/training.rst index ef831734..7703b1e0 100644 --- a/doc/source/user_guide/training.rst +++ b/doc/source/user_guide/training.rst @@ -5,39 +5,48 @@ Training .. note:: - The training section is still experimental and subject to API changes. + Training is still experimental and subject to API changes. -In order to use the solver, the SimAI solution must first be trained on your prediction data. -Your prediction data is uploaded onto a global pool of :class:`training data` and can then be assigned to different :class:`projects` where you can configure how to train your model. +Before you can use the solver, you must train the SimAI solution on your prediction +data. You first upload your prediction data into a global pool of +:class:`training data` instances +and then assign this data to different :class:`Project` +instances, which you configure for training your model. -Getting started -=============== +Train on prediction data +======================== -Create an :class:`~ansys.simai.core.client.SimAIClient` object:: +#. 
Create a :class:`~ansys.simai.core.client.SimAIClient` instance:: - import ansys.simai.core + import ansys.simai.core - simai = ansys.simai.core.SimAIClient() + simai = ansys.simai.core.SimAIClient() -You will be prompted for your credentials and for the name of the workspace you want to use. -Alternative ways to authenticate are described :ref:`here`. + You are prompted for your credentials. -Start by uploading your prediction data by creating a training data and uploading your files into it:: + If desired, you can create an instance using a configuration file. For more + information, see :ref:`configuration`. - td = simai.training_data.create("my-first-data") - td.upload_folder("/path/to/folder/where/files/are/stored") +#. Upload your prediction data by creating a + :class:`TrainingData` instance + and then loading your files into it:: -You can then create a project:: + td = simai.training_data.create("my-first-data") + td.upload_folder("/path/to/folder/where/files/are/stored") - project = simai.projects.create("my-first-project") +#. Create a project:: -And assign the created training data to this project:: + project = simai.projects.create("my-first-project") - td.add_to_project(project) +#. Assign the created training data to your project:: -You can start training a model from the web-app once you have a few training data in your project. + td.add_to_project(project) + +Once you have training data in your project, you can use the web app to +train a model. Learn more ========== -Check out the API references for :ref:`training_data`, :ref:`training_data_parts` and :ref:`projects` to learn more about the actions available to you. +For more information on the actions available to you, see :ref:`training_data`, +:ref:`training_data_parts`, and :ref:`projects`. 
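Editorial aside: the numbered training workflow above may be easier to review as one self-contained sequence. The sketch below uses local stand-in classes defined here for illustration only; they mimic the shape of the documented ``training_data.create``, ``upload_folder``, ``projects.create``, and ``add_to_project`` calls without contacting a SimAI server, so the sequence can be run and checked offline.

```python
# Illustration only: local stand-ins that mirror the call sequence used with
# the real SimAI client (create training data, upload files, create a project,
# assign the data to the project). These classes are NOT part of the SDK.
from dataclasses import dataclass, field
from typing import List


@dataclass
class TrainingData:
    name: str
    folders: List[str] = field(default_factory=list)

    def upload_folder(self, path: str) -> None:
        # The real SDK uploads every file in the folder; here we only record it.
        self.folders.append(path)

    def add_to_project(self, project: "Project") -> None:
        project.training_data.append(self)


@dataclass
class Project:
    name: str
    training_data: List[TrainingData] = field(default_factory=list)


# Same sequence as the documented steps, minus authentication:
td = TrainingData("my-first-data")
td.upload_folder("/path/to/folder/where/files/are/stored")
project = Project("my-first-project")
td.add_to_project(project)

print([d.name for d in project.training_data])  # ['my-first-data']
```

With the real client, the only structural difference is that each call also performs a server round trip, which is why the documented steps require credentials first.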
diff --git a/doc/styles/Vocab/ANSYS/accept.txt b/doc/styles/Vocab/ANSYS/accept.txt index 16a437df..61b85d79 100644 --- a/doc/styles/Vocab/ANSYS/accept.txt +++ b/doc/styles/Vocab/ANSYS/accept.txt @@ -1,7 +1,10 @@ (?i)ansys +(?i)simai [Aa]synchronicity csv +DOE https_proxy [Nn]amespace -(?i)simai +[Pp]ostprocessing +PySimAI zipfile diff --git a/src/ansys/simai/core/api/client.py b/src/ansys/simai/core/api/client.py index a4f35e8d..b90fbd5e 100644 --- a/src/ansys/simai/core/api/client.py +++ b/src/ansys/simai/core/api/client.py @@ -44,4 +44,4 @@ class ApiClient( TrainingDataPartClientMixin, WorkspaceClientMixin, ): - """Low-level client that handles direct communication with the server.""" + """Provides the low-level client that handles direct communication with the server.""" diff --git a/src/ansys/simai/core/api/design_of_experiments.py b/src/ansys/simai/core/api/design_of_experiments.py index 09c52318..24ce658a 100644 --- a/src/ansys/simai/core/api/design_of_experiments.py +++ b/src/ansys/simai/core/api/design_of_experiments.py @@ -30,7 +30,7 @@ class DesignOfExperimentsMixin(ApiClientMixin): - """Client for the design of experiments ("/design-of-experiments/") part of the API.""" + """Provides the client for the design of experiments ("/design-of-experiments/") part of the API.""" def download_design_of_experiments( self, @@ -41,14 +41,15 @@ def download_design_of_experiments( """Downloads the design of experiments into the file at the given path. Args: - file: A binary file-object or the path of the file to put the content into. - format: the format to download, ``xlsx`` or ``csv`` - workspace_id: id of the workspace for which to download the DoE + file: Binary file-object or the path of the file to put the content into. + format: Format to download. Options are ``'xlsx'`` or ``'csv'``. + workspace_id: ID of the workspace to download the design of experiments for. 
Return: - None if a file is provided, a BytesIO with the design of experiments's content otherwise + ``None`` if a file is provided or a ``BytesIO`` object with the content for + the design of experiments otherwise. """ - logger.debug("Attempting to download design of experiments") + logger.debug("Attempting to download design of experiments.") return self.download_file( f"design-of-experiments/export?format={format}&workspace={workspace_id}", file, diff --git a/src/ansys/simai/core/api/geometry.py b/src/ansys/simai/core/api/geometry.py index c0262902..58ec2225 100644 --- a/src/ansys/simai/core/api/geometry.py +++ b/src/ansys/simai/core/api/geometry.py @@ -32,7 +32,7 @@ class GeometryClientMixin(ApiClientMixin): - """Client for the Geometry ("/geometries/") part of the API.""" + """Provides the client for the Geometry ("/geometries/") part of the API.""" def geometries(self, workspace_id: str, filters: Optional[Dict[str, Any]] = None): """Get list of all geometries.""" @@ -46,26 +46,26 @@ def get_geometry(self, geometry_id: str): """Get information on a single geometry. Args: - geometry_id: The id of the geometry to get + geometry_id: ID of the geometry. """ return self._get(f"geometries/{geometry_id}") def get_geometry_by_name(self, name: str, workspace_id: str): - """Get information on a single geometry, by name instead of id. + """Get information on a single geometry by name instead of ID. Args: - name: The name of the geometry to get - workspace_id: The id of the workspace the geometry belongs to + name: Name of the geometry. + workspace_id: ID of the workspace that the geometry belongs to. """ return self._get(f"geometries/name/{quote(name)}", params={"workspace": workspace_id}) def delete_geometry(self, geometry_id: str): """Delete a single geometry. - All objects associated to that geometry are also deleted. + All objects associated with that geometry are also deleted. Args: - geometry_id: The id of the geometry to delete + geometry_id: ID of the geometry. 
""" # TODO: Have user confirm or delete confirmation from API ? return self._delete( @@ -80,12 +80,12 @@ def update_geometry( name: Optional[str] = None, metadata: Optional[dict] = None, ): - """Update a geometry information. + """Update the information for a given geometry. Args: - geometry_id: The id of the geometry to update - name: The new name to give to the geometries - metadata: The metadata to update the geometry with + geometry_id: ID of the geometry. + name: New name to give to the geometries. + metadata: Metadata to update the geometry with. """ request_json = {} if name is not None: @@ -101,16 +101,18 @@ def create_geometry( extension: Optional[str] = None, metadata: Optional[Dict[str, Any]] = None, ): - """Create a new geometry, without pushing the data. + """Create a geometry without pushing the data. Args: - workspace_id: The id of the workspace the geometry should belong to. - name: The name to give to the geometry - extension: The extension to give to the file - metadata: Metadata to apply to the geometry on creation + workspace_id: ID of the workspace to assign the geometry to. + name: Name to give to the geometry. + extension: Extension to give to the file. + metadata: Metadata to apply to the geometry on creation. Returns: - A tuple containing the geometry object and a 'presigned post' dictionary containing the url on which to upload the data and fields to include into the request + Tuple containing the geometry object and a 'presigned post' + dictionary, which contains the URL to upload the data and fields to + that was included in the request. """ post_data = { "name": name, @@ -134,13 +136,13 @@ def download_geometry( file: Optional[File] = None, monitor_callback: Optional[MonitorCallback] = None, ) -> Union[None, BinaryIO]: - """Downloads the input geometry into the file at the given path. + """Download the input geometry into the file at the given path. 
         Args:
-            geometry_id: The id of the geometry to download
-            file: A binary file-object or the path of the file to put the content into
-            monitor_callback: Function or method that will be passed the bytes_read
-                delta. Can be used to monitor progress.
+            geometry_id: ID of the geometry to download.
+            file: Binary file-object or the path of the file to put the content into.
+            monitor_callback: Function or method to pass the ``bytes_read`` delta to.
+                This delta can be used to monitor progress.
         """
         return self.download_file(f"geometries/{geometry_id}/download", file, monitor_callback)
 
@@ -148,6 +150,6 @@ def get_geometry_predictions(self, geometry_id: str):
         """Get predictions associated with a geometry.
 
         Args:
-            geometry_id: The id of the geometry whose predictions to get
+            geometry_id: ID of the geometry.
         """
         return self._get(f"geometries/{geometry_id}/predictions")
diff --git a/src/ansys/simai/core/api/mixin.py b/src/ansys/simai/core/api/mixin.py
index 9e2c7058..856d9707 100644
--- a/src/ansys/simai/core/api/mixin.py
+++ b/src/ansys/simai/core/api/mixin.py
@@ -44,7 +44,7 @@ class ApiClientMixin:
-    """The core on which all the mixins and the ApiClient are built."""
+    """Provides the core that all mixins and the API client are built on."""
 
     def __init__(self, *args, config: ClientConfig):  # noqa: D107
        self._session = requests.Session()
@@ -103,19 +103,19 @@ def _request(
     ) -> APIResponse:
         """Wrap around :py:meth:`requests.Session.request`.
 
-        By default this method expects a json response. If you call an endpoint that does
-        not return a json, specify return_json=False
+        By default, this method expects a JSON response. If you call an endpoint that does
+        not return a JSON response, specify ``return_json=False``.
 
         Args:
-            method: The HTTP verb of the request
-            url: The url of the request
-            *args: Additional args for the request
-            return_json: Whether the expected response is a json. If yes returns
-                directly the json, otherwise the Response is returned
-            **kwargs: Additional kwargs for request
+            method: HTTP verb of the request.
+            url: URL of the request.
+            *args: Additional arguments for the request.
+            return_json: Whether the expected response is JSON. If ``True``, the JSON
+                is returned directly. Otherwise, the response is returned.
+            **kwargs: Additional keyword arguments for the request.
 
         Returns:
-            The json dict of the response if :py:args:`return_json` is True. The raw
+            JSON dictionary of the response if ``return_json`` is ``True``. The raw
             :py:class:`requests.Response` otherwise.
         """
         logger.debug(f"Request {method} on {url}")
@@ -136,21 +136,23 @@ def download_file(
         request_json_body: Optional[Dict[str, Any]] = None,
         request_method: str = "GET",
     ) -> Union[None, BinaryIO]:
-        """Download a file from the given URL into the given file or a :class:`BytesIO`.
+        """Download a file from a URL into a file or a :class:`BytesIO` object.
 
         Args:
-            download_url: url to GET the file
-            file: Optional binary file or path onto which to put the downloaded file
-            monitor_callback: An optional callback to monitor the progress of the download.
-                See :obj:`~ansys.simai.core.data.types.MonitorCallback` for details.
-            request_json_body: Optional JSON to include in the request
-            request_method: The HTTP verb
+            download_url: URL for getting the file.
+            file: Optional binary file or path for the downloaded file.
+            monitor_callback: Optional callback to monitor the progress of the download.
+                For more information, see the :obj:`~ansys.simai.core.data.types.MonitorCallback`
+                object.
+            request_json_body: Optional JSON to include in the request.
+            request_method: HTTP verb of the request.
 
         Raises:
-            ConnectionError: If an error occurred during the download
+            ConnectionError: If an error occurred during the download.
Returns: - None if a file is provided, a BytesIO with the file's content otherwise + None if a file is provided or a ``BytesIO`` object with the file's + content otherwise. """ if file is None: output_file = BytesIO() @@ -227,10 +229,10 @@ def upload_parts( part_size: int = int(100e6), monitor_callback: Optional[MonitorCallback] = None, ) -> List[Dict[str, Any]]: - """Upload parts using the given endpoints to get presigned PUT urls. + """Upload parts using the given endpoints to get presigned ``PUT`` URLs. Returns: - The list of parts, with their id and their etag + List of parts with their IDs and HTTP ETags. """ part_number = 1 parts = [] diff --git a/src/ansys/simai/core/api/post_processing.py b/src/ansys/simai/core/api/post_processing.py index 5f0891ed..cd56701e 100644 --- a/src/ansys/simai/core/api/post_processing.py +++ b/src/ansys/simai/core/api/post_processing.py @@ -36,17 +36,18 @@ def run_post_processing( post_processing_type: str, params: Optional[Dict[str, Any]] = None, ) -> Dict[str, Any]: - """Run a post-processing on the given prediction. + """Run a postprocessing on the given prediction. - If the result of the requested post-processing already exists it will not be rerun. + If the result of the requested postprocessing already exists, the postprocessing + is not rerun. Args: - prediction_id: id of the prediction on which to run the post-processing - post_processing_type: the type of post-processing to run on the prediction - params: Additional json parameters as dict if required + prediction_id: ID of the prediction. + post_processing_type: Type of postprocessing to run on the prediction. + params: Additional JSON parameters as dictionary if required. Returns: - Json with the created or existing post-processing + JSON with the created or existing postprocessing. 
""" return self._post( f"predictions/{prediction_id}/post-processings/{post_processing_type}", @@ -54,24 +55,24 @@ def run_post_processing( ) def get_post_processing_result(self, post_processing_id: str) -> Dict[str, Any]: - """Get the result of a post-processing. + """Get the result of a postprocessing. Args: - post_processing_id: id of the post-processing + post_processing_id: ID of the postprocessing. Returns: - Json with the result of the post-processing + JSON with the result of the postprocessing. """ return self._get(f"post-processings/{post_processing_id}") def delete_post_processing(self, post_processing_id: str): - """Delete the post-processing. + """Delete a postprocessing. Args: - post_processing_id: id of the post-processing to delete + post_processing_id: ID of the postprocessing. Raises: - NotFoundError: if a post-processing with this id is not found on the server. + NotFoundError: If a postprocessing with this ID is not found on the server. """ return self._delete( f"post-processings/{post_processing_id}", @@ -81,11 +82,12 @@ def delete_post_processing(self, post_processing_id: str): def get_post_processings_in_workspace( self, workspace_id: str, pp_type: Optional[str] ) -> List[Dict[str, Any]]: - """Get all the post-processings in the given workspace. + """Get all postprocessings in the given workspace. Args: - workspace_id: the id of the target workspace - pp_type: Specify a type of post-processing to return, returns all of them if empty + workspace_id: ID of the target workspace. + pp_type: Type of postprocessings to return. If this parameter is empty, all + postprocessings are returned. """ endpoint = f"post-processings/type/{pp_type}" if pp_type else "post-processings/" # The content of this endpoint can be paginated @@ -104,12 +106,13 @@ def get_post_processings_for_prediction( pp_type: Optional[str], filters: Optional[Dict[str, Any]] = None, ) -> List[Dict[str, Any]]: - """Get all the post-processings belonging to the given prediction. 
+ """Get all postprocessings belonging to the given prediction. Args: - prediction_id: the id of the target prediction - pp_type: Specify a type of post-processing to return, returns all of them if empty - filters: the filters to apply to the query, if any + prediction_id: ID of the target prediction. + pp_type: Type of postprocessings to return. If this parameter is empty, all + postprocessings are returned. + filters: Filters to apply to the query, if any. """ endpoint = f"predictions/{prediction_id}/post-processings/" if pp_type: diff --git a/src/ansys/simai/core/api/prediction.py b/src/ansys/simai/core/api/prediction.py index b84b311b..17123f34 100644 --- a/src/ansys/simai/core/api/prediction.py +++ b/src/ansys/simai/core/api/prediction.py @@ -34,17 +34,17 @@ class PredictionClientMixin(ApiClientMixin): - """Client for the Prediction ("/predictions") part of the API.""" + """Provides the client for the Prediction ("/predictions") part of the API.""" def predictions(self, workspace_id: str): - """Get list of all predictions.""" + """Get a list of all predictions.""" return self._get("predictions/", params={"workspace": workspace_id}) def get_prediction(self, prediction_id: str): """Get information on a single prediction. Args: - prediction_id: The id of the prediction to get + prediction_id: ID of the prediction. """ return self._get(f"predictions/{prediction_id}") @@ -52,7 +52,7 @@ def delete_prediction(self, prediction_id: str): """Delete a single prediction. Args: - prediction_id: The id of the prediction to delete + prediction_id: ID of the prediction. """ return self._delete( f"predictions/{prediction_id}", @@ -61,14 +61,15 @@ def delete_prediction(self, prediction_id: str): ) def run_prediction(self, geometry_id: str, **kwargs): # noqa: D417 - """Run a prediction on the given geometry. + """Run a prediction on a given geometry. Args: - geometry_id: The id of the target geometry + geometry_id: ID of the target geometry. 
Keyword Arguments: - boundary_conditions dict: The contrainsts of the problem in dictionary form. - tolerance float: The delta under which two boundary condition components are considered equal, default is 10**-6 + boundary_conditions dict: Constraints of the problem in dictionary form. + tolerance float: Delta under which two boundary condition components + are considered equal. The default is ``10**-6``. """ return self._post(f"geometries/{geometry_id}/predictions", json=kwargs) @@ -80,16 +81,15 @@ def send_prediction_feedback( solution: Optional[Union[BinaryIO, str, Path]] = None, monitor_callback: Optional[Callable[[int], None]] = None, ): - """Send feedback on your prediction. + """Send feedback on your prediction so improvements can be made. Args: - prediction_id: Id of the target prediction - rating: A rating from 0 to 4 - comment: Additional comment - solution: The client - solution to the prediction - monitor_callback: Function or method that will be passed - a :py:class:`~requests_toolbelt.multipart.encoder.MultipartEncoderMonitor` + prediction_id: ID of the target prediction. + rating: Rating from 0 to 4. + comment: Additional comment. + solution: Client solution to the prediction. + monitor_callback: Function or method to pass the + :py:class:`~requests_toolbelt.multipart.encoder.MultipartEncoderMonitor` to. """ if solution is None: with_solution = False @@ -103,7 +103,7 @@ def send_prediction_feedback( close_file = False else: raise ValueError( - "Could not handle the provided solution." " Please use a path or binary file." + "Could not handle the provided solution." " Use a path or binary file." 
            )
        with_solution = True
        upload_form = {"rating": str(rating), "comment": comment}
diff --git a/src/ansys/simai/core/api/project.py b/src/ansys/simai/core/api/project.py
index f3e62777..f5700a60 100644
--- a/src/ansys/simai/core/api/project.py
+++ b/src/ansys/simai/core/api/project.py
@@ -43,8 +43,8 @@ def update_project(self, project_id: str, name: str):
         """Update a project name.
 
         Args:
-            project_id: The id of the project to update
-            name: The new name to give to the project
+            project_id: ID of the project.
+            name: New name to give to the project.
         """
         request_json = {}
         request_json["name"] = name
diff --git a/src/ansys/simai/core/api/sse.py b/src/ansys/simai/core/api/sse.py
index 2fafb528..a4bf6cec 100644
--- a/src/ansys/simai/core/api/sse.py
+++ b/src/ansys/simai/core/api/sse.py
@@ -38,7 +38,7 @@ class SSEMixin(ApiClientMixin):
-    """Client for the server-sent-events ("/sessions/events")."""
+    """Provides the client for the server-sent-events ("/sessions/events")."""
 
     def __init__(self, config: ClientConfig, simai_client=None):
         super().__init__(config=config)
@@ -46,7 +46,7 @@ def __init__(self, config: ClientConfig, simai_client=None):
         if simai_client:
             self.simai_client = simai_client
         else:
-            logger.warning("SSEMixin has no simai_client")
+            logger.warning("SSEMixin has no SimAI client.")
 
         # Disable sse thread in unit tests
         if config.no_sse_connection:
@@ -54,12 +54,12 @@ def __init__(self, config: ClientConfig, simai_client=None):
             return
 
         # Flag for stopping the threads when this object is destroyed.
-        # We have to use a flag, as Python's threading lib does not provide
+        # A flag is used because Python's threading library does not provide
         # a "stop" command.
         # The _stop_sse_threads is to allow unit tests to kill the thread
         # immediately without needing another event.
         self._stop_sse_threads = getattr(config, "_stop_sse_threads", False)
 
-        logger.debug("Connecting to SSE")
+        logger.debug("Connecting to SSE.")
 
         def sse_connection_factory(last_event_id: Optional[str]):
             headers = {"Accept": "text/event-stream"}
@@ -75,12 +75,12 @@ def sse_connection_factory(last_event_id: Optional[str]):
         try:
             self.sse_client = ReconnectingSSERequestsClient(sse_connection_factory)
         except Exception as e:
-            raise ConnectionError("Impossible to connect to events endpoint.") from e
+            raise ConnectionError("Unable to connect to the events endpoint.") from e
         logger.debug("SSEMixin is connected to SSE endpoint.")
-        logger.debug("Starting listener thread")
+        logger.debug("Starting listener thread.")
         self.listener_thread = threading.Thread(target=self._sse_thread_loop, daemon=True)
         self.listener_thread.start()
-        logger.debug("Started listener thread")
+        logger.debug("Started listener thread.")
 
     def __del__(self):
         self._stop_sse_threads = True
@@ -92,7 +92,7 @@ def _sse_thread_loop(self):
                     break
                 self._handle_sse_event(event)
         except Exception:
-            logger.critical("Unhandled exception in SSE Thread: ", exc_info=True)
+            logger.critical("Unhandled exception in SSE thread: ", exc_info=True)
             # if object has been garbage collected, ignore exceptions
             if not self._stop_sse_threads:
                 os._exit(1)
@@ -106,7 +106,7 @@ def _handle_sse_event(self, event):
         logger.debug(f"received {data}")
 
         if "type" not in data:
-            raise ValueError("No type for SSE Event")
+            raise ValueError("No type is given for SSE event.")
         msg_type = data["type"]
         if msg_type == "session":
             self._handle_session_event(data)
@@ -134,7 +134,7 @@ def _handle_data_model_event(self, data):
             self.simai_client._optimization_trial_run_directory._handle_sse_event(data)
         else:
             logger.debug(
-                f"Unknown type {target['type']} received for job or resource event. Ignoring"
+                f"Unknown type {target['type']} received for job or resource event. Ignoring."
) def _handle_session_event(self, data): diff --git a/src/ansys/simai/core/api/training_data_part.py b/src/ansys/simai/core/api/training_data_part.py index fd40e272..251c74a8 100644 --- a/src/ansys/simai/core/api/training_data_part.py +++ b/src/ansys/simai/core/api/training_data_part.py @@ -29,15 +29,16 @@ class TrainingDataPartClientMixin(ApiClientMixin): def create_training_data_part( self, training_data_id: str, name: str, extension: str ) -> Tuple[Dict[str, Any], Dict[str, Any]]: - """Creates a new part under the given training data, without uploading the data. + """Create a part under the given training data without uploading the data. Args: - training_data_id: The parent TrainingData - name: The name of the part to create - extension: The extension of the file/part + training_data_id: ID of the parent training data. + name: Name of the part to create. + extension: Extension of the file or part. Returns: - A tuple containing the TrainingDataPart object and the upload id to use for further requests. + Tuple containing the ``TrainingDataPart`` object and the upload ID + to use for further requests. """ post_data = {"name": name, "file_extension": extension} response = self._post(f"training_data/{training_data_id}/parts/", json=post_data) diff --git a/src/ansys/simai/core/api/workspace.py b/src/ansys/simai/core/api/workspace.py index 7bbe6413..387ea211 100644 --- a/src/ansys/simai/core/api/workspace.py +++ b/src/ansys/simai/core/api/workspace.py @@ -28,32 +28,32 @@ class WorkspaceClientMixin(ApiClientMixin): - """Client for the Workspace ("/workspaces") part of the API.""" + """Provides the client for the Workspace ("/workspaces") part of the API.""" def workspaces(self): """List all workspaces.""" return self._get("workspaces/") def get_workspace(self, workspace_id: str) -> Dict[str, Any]: - """Gets information on a single workspace. + """Get information on a single workspace. 
Args: - workspace_id: The id of the workspace to get + workspace_id: ID of the workspace. Returns: - The workspace json representation. + JSON representation of the workspace. Raises: - ansys.simai.core.errors.NotFoundError: If no workspace with that id exists. - ansys.simai.core.errors.ApiClientError: On other HTTP errors + ansys.simai.core.errors.NotFoundError: If no workspace with that ID exists. + ansys.simai.core.errors.ApiClientError: On other HTTP errors. """ return self._get(f"workspaces/{workspace_id}") def get_workspace_by_name(self, name: str): - """Get information on a single workspace, by name instead of id. + """Get information on a single workspace by name instead of ID. Args: - name: The name of the workspace to get + name: Name of the workspace. """ return self._get(f"workspaces/name/{quote(name)}") @@ -61,36 +61,36 @@ def get_workspace_model_manifest(self, workspace_id): """Get the public part of the manifest for the given workspace. Args: - workspace_id: The id of the workspace whose's manifest to get + workspace_id: ID of the workspace. Raises: - ansys.simai.core.errors.NotFoundError: If no workspace with that id exists. - ansys.simai.core.errors.ApiClientError: On other HTTP errors + ansys.simai.core.errors.NotFoundError: If no workspace with that ID exists. + ansys.simai.core.errors.ApiClientError: On other HTTP errors. """ return self._get(f"workspaces/{workspace_id}/model/manifest/public") def create_workspace(self, name: str, model_id: str, **kwargs): - """Creates a new workspace. + """Create a workspace. Args: - name: The name to give to the new workspace - model_id: id of the model that the workspace will use - **kwargs: Additional arguments for the workspace creation + name: Name to give to the new workspace. + model_id: ID of the model that the workspace is to use. + **kwargs: Additional arguments for the workspace creation. Returns: - The new workspace's json representation. + JSON representation of the new workspace. 
""" return self._post("workspaces/", json={"name": name, "model": model_id, **kwargs}) def delete_workspace(self, workspace_id: str): - """Deletes workspace. + """Delete a workspace. Args: - workspace_id: The id of the workspace to delete. + workspace_id: ID of the workspace. Raises: - ansys.simai.core.errors.NotFoundError: If no workspace with that id exists. - ansys.simai.core.errors.ApiClientError: On other HTTP errors + ansys.simai.core.errors.NotFoundError: If no workspace with that ID exists. + ansys.simai.core.errors.ApiClientError: On other HTTP errors. """ return self._delete(f"workspaces/{workspace_id}") diff --git a/src/ansys/simai/core/client.py b/src/ansys/simai/core/client.py index a7864401..5b325e0e 100644 --- a/src/ansys/simai/core/client.py +++ b/src/ansys/simai/core/client.py @@ -52,9 +52,9 @@ class SimAIClient: - """A client to communicate with SimAI API. + """Provides the client for communicating with the SimAI API. - Keyword Args: see :class:`~ansys.simai.core.utils.configuration.ClientConfig` + For keyword arguments, see the :class:`~ansys.simai.core.utils.configuration.ClientConfig` class. Example: .. code-block:: python @@ -94,11 +94,11 @@ def __init__(self, **kwargs): @property def current_workspace(self) -> Workspace: - """The workspace currently used in the SDK session. + """Workspace currently used by the SimAI client. Note: - It is recommended not set this directly. Instead use the :meth:`set_current_workspace` - method which uses the workspace name and also ensures the workspace exists. + You should not set the workspace directly. Instead, use the :meth:`set_current_workspace` + method, which uses the workspace name and ensures that the workspace exists. 
""" if self._current_workspace is None: raise InvalidClientStateError("Current workspace is not set.") @@ -109,10 +109,10 @@ def current_workspace(self, workspace: Workspace): self._current_workspace = workspace def set_current_workspace(self, name: str): - """Set the current workspace for the SimAIClient. + """Set the current workspace for the SimAI client. Args: - name: The name of the workspace the client should switch to. + name: Name of the workspace that the client should switch to. Example: .. code-block:: python @@ -127,7 +127,7 @@ def set_current_workspace(self, name: str): try: # Ensure the workspace exists workspace = self.workspaces.get(name=name) - logger.info(f"Workspace set to {name}") + logger.info(f"Workspace set to {name}.") except NotFoundError: raise InvalidConfigurationError( f"""Configured workspace {name} does not exist on the server. @@ -142,11 +142,11 @@ def _available_workspaces_string(self): @property def current_project(self) -> Project: - """The project currently used in the SDK session. + """Project currently used by the SimAPI client. Note: - It is recommended not set this directly. Instead use the :meth:`set_current_project` - method which uses the project name and also ensures the project exists. + You should not set the project directly. Instead, use the :meth:`set_current_project` + method, which uses the project name and ensures that the project exists. """ if self._current_project is None: raise InvalidClientStateError("Current project is not set.") @@ -157,12 +157,13 @@ def current_project(self, project: Project): self._current_project = project def set_current_project(self, name: str): - """Sets the current project for the SimAIClient. + """Set the current project for the SimAI client. - Affects how some methods related to projects or associated data will behave. + This method affects how some methods related to projects or associated + data behave. Args: - name: The name of the project the client should switch to. 
+ name: Name of the project that the client should switch to. """ try: # Ensure the project exists @@ -183,72 +184,63 @@ def _available_projects_string(self): @property def geometries(self): """Representation of all geometries on the server. - - More details in the :doc:`geometries documentation `. + For more information, see :ref:`geometries`. """ return self._geometry_directory @property def optimizations(self): """Representation of all optimizations on the server. - - More details in the :doc:`optimization documentation `. + For more information, see :ref:`optimizations`. """ return self._optimization_directory @property def training_data(self): """Representation of all training data on the server. - - More details in the :doc:`training data documentation`. + For more information, see :ref:`training_data`. """ return self._training_data_directory @property def training_data_parts(self): """Representation of all training data parts on the server. - - More details in the :doc:`training data parts documentation`. + For more information, see :ref:`training_data_parts`. """ return self._training_data_part_directory @property def predictions(self): """Representation of all predictions on the server. - - More details in the :doc:`predictions documentation `. + For more information, see :ref:`predictions`. """ return self._prediction_directory @property def post_processings(self): - """Representation of all post-processings on the server. - - More details in the :doc:`post-processings documentation ` + """Representation of all postprocessings on the server. + For more information, see :ref:`post_processings`. """ return self._post_processing_directory @property def projects(self): """Representation of all projects on the server. - - More details in the :doc:`projects documentations ` + For more information, see :ref:`projects`. """ return self._project_directory @property def design_of_experiments(self): - """Methods allowing to export design of experiments. 
- - More details in the :doc:`design of experiments documentation ` + """Methods for exporting design of experiments. + For more information, see :ref:`design_of_experiments`. """ return self._doe_collection @property def workspaces(self): """Representation of all workspaces on the server. - - More details in the :doc:`workspaces documentation `. + For more information, see :ref:`workspaces`. """ return self._workspace_directory @@ -259,31 +251,31 @@ def from_config( path: Optional[Path] = None, **kwargs, ) -> "SimAIClient": - """Initialize a `SimAIClient` by reading a configuration file. + """Initialize a SimAI client by reading a configuration file. - You can provide the path of the config to load. If no path is given it will look - at default locations. + You can provide the path of the configuration file to load. If no path is + given, this method looks at the default locations. - For more information on the configuration, see :ref:`Configuration File`. + For more information, see :ref:`Configuration file`. - ``kwargs`` can be used to override part of the configuration. + You can use ``kwargs`` to override part of the configuration. Args: - profile: The profile to load from the configuration, the `default` profile - will be loaded if not provided - path: The path at which the configuration is located. - **kwargs: Additional arguments to pass to the `SimAiClient` + profile: Profile to load from the configuration file. The default profile + is loaded if no profile is provided. + path: Path for the configuration file. + **kwargs: Additional arguments to pass to the SimAI client. Returns: - A configured client. + Configured client. Raises: ConfigurationNotFoundError: No configuration file was found at the given location - or defaults location if no path was given. - InvalidConfigurationError: The configuration is invalid or incomplete. + or in the default locations if no path was given. + InvalidConfigurationError: Configuration is invalid or incomplete.
Example: - After setting up your :ref:`configuration file.` + Create the client after setting up your :ref:`configuration file.` .. code-block:: python @@ -291,19 +283,19 @@ def from_config( simai = ansys.simai.core.from_config() - .. note:: The default paths are only supported on Unix systems. + Note: + The default paths are only supported on Unix systems. """ return cls(**get_config(path, profile, **kwargs)) def wait(self) -> None: - """Wait for all the ongoing operations - on locally known predictions and post-processings + """Wait for all ongoing operations on locally known predictions and postprocessings to finish. Raises: - :exception:`SimAIError`: if something went wrong on an operation. - :exception:`MultipleErrors`: if things when wrong on multiple operations + :exception:`SimAIError`: If something went wrong on an operation. + :exception:`MultipleErrors`: If things went wrong on multiple operations. """ errors: List[Exception] = [] for directory in [self.predictions, self.post_processings]: @@ -330,7 +322,7 @@ def _check_for_new_version(self, client_name="ansys.simai.core", current_version if version_current < version_latest: warn_template = ( f"A new version of {client_name} is %s. " - "Please upgrade to get new features and ensure compatibility with the server." + "Upgrade to get new features and ensure compatibility with the server." ) if ( version_current.major < version_latest.major diff --git a/src/ansys/simai/core/data/base.py b/src/ansys/simai/core/data/base.py index f698145d..cbff4b16 100644 --- a/src/ansys/simai/core/data/base.py +++ b/src/ansys/simai/core/data/base.py @@ -71,14 +71,14 @@ def __repr__(self) -> str: @property def id(self) -> str: - """The id of the object on the server.""" + """ID of the object on the server.""" # Uses the :py:attr:`id_key` to determine the name of the id.
# :py:attr:`id_key` defaults to id return self._fields[self.id_key] @property def fields(self) -> dict: - """A dictionary containing the raw object representation.""" + """Dictionary containing the raw object representation.""" return self._fields @fields.setter @@ -96,7 +96,7 @@ def _classname(self): class ComputableDataModel(DataModel): - """Base class for all computable models whose creation eventually succeeds or fails.""" + """Provides the base class for all computable models whose creation eventually succeeds or fails.""" def __init__( self, @@ -134,8 +134,9 @@ def _set_is_pending(self): @property def is_pending(self): - """Boolean indicating the object is still in creation. - Becomes False once it is either successful or has failed. + """Boolean indicating if the object is still in creation. + The value becomes ``False`` once object creation is either successful + or has failed. See Also: - :meth:`~wait` @@ -162,7 +163,7 @@ def has_failed(self): @property def is_ready(self): - """Boolean indicating if the object is finished creating without error. + """Boolean indicating if the object has finished creating without error. See Also: - :meth:`~wait` @@ -173,8 +174,8 @@ def is_ready(self): @property def failure_reason(self): - """Optional message that can detail the causes of the failure - of the creation of the object. + """Optional message giving the causes for why the + creation of the object failed. See Also: - :attr:`~has_failed` @@ -187,9 +188,9 @@ def _failure_message(self): return f"{self._classname} id {self.id} failed {reason_str}" def _set_is_over(self): - """Sets the object as idle, that is without loading, + """Set the object as idle, that is without loading (either because loading finished or failed), - meaning a wait() will return immediately. + meaning a wait() returns immediately. """ self._is_over.set() @@ -198,10 +199,11 @@ def wait(self, timeout: Optional[float] = None) -> bool: or fail. 
Args: - timeout: maximum amount of time to wait in seconds (defaults to unlimited) + timeout: Maximum amount of time in seconds to wait. The default is + ``None``, which means that there is no maximum on the time to wait. Returns: - True if the computation is over. False if the operation timed out. + ``True`` if the computation has finished, ``False`` if the operation timed out. """ is_done = self._is_over.wait(timeout) if self.has_failed: @@ -243,12 +245,12 @@ def __init__(self, client: "ansys.simai.core.client.SimAIClient"): self._client = client # Registry for known objects of this type that have been created - # locally. It is used to ensure only 1 instance of a particular class - # exists with a given id. + # locally. It is used to ensure that only one instance of a particular class + # exists with a given ID. # Note that this dictionary must retain all models created locally # without relying on the user to keep a reference on them - # (meaning e.g. WeakValueDictionary is not appropriate), - # as he will expect simai.wait() to work on any object, even those + # (meaning for example that WeakValueDictionary is not appropriate), + # as the user expects simai.wait() to work on any object, even those # with no explicit reference. self._registry: Dict[str, DataModel] = {} @@ -261,13 +263,13 @@ def _model_from( ) -> DataModel: # _model_from overrides object data (fields), # thus it is, and should, only be called with - # complete data from the server, i.e. from a GET request. + # complete data from the server, such as from a GET request. if "id" not in data: raise ValueError("Cannot instantiate data object without an id") item_id = data["id"] if item_id in self._registry: - # An instance with this id already exists: update and return it, - # so we don't have two objects with different data + # An instance with this ID already exists: Update and return it, + # so there are not two objects with different data.
item = self._registry[item_id] # update with new data item.fields = data @@ -279,13 +281,13 @@ def _model_from( return item def _unregister_item(self, item: DataModel): - """Removes the item from the internal registry, + """Remove the item from the internal registry, mainly after a deletion. """ self._unregister_item_with_id(item.id) def _unregister_item_with_id(self, item_id: str): - """Removes the item from the internal registry, + """Remove the item from the internal registry, mainly after a deletion. """ item = self._registry.pop(item_id, None) @@ -307,14 +309,14 @@ def _handle_sse_event(self, data): elif data["type"] == "resource" and isinstance(item, UploadableResourceMixin): item._handle_resource_sse_event(data) else: - logger.error("Received a server update that could not be interpreted") + logger.error("Received a server update that could not be interpreted.") def _all_objects(self): return self._registry.values() class UploadableResourceMixin: - """Class used to expand DataModel with support for resource type message from SSE.""" + """Provides the class used to expand ``DataModel`` with support for a resource type message from SSE.""" def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) @@ -335,4 +337,4 @@ def _handle_resource_sse_event(self, data): f"Could not complete upload because: {data.get('reason', 'Upload failed')}" ) else: - logger.error("Invalid resource state") + logger.error("Invalid resource state.") diff --git a/src/ansys/simai/core/data/design_of_experiments.py b/src/ansys/simai/core/data/design_of_experiments.py index 57c71ece..220a11c7 100644 --- a/src/ansys/simai/core/data/design_of_experiments.py +++ b/src/ansys/simai/core/data/design_of_experiments.py @@ -29,20 +29,21 @@ class DesignOfExperimentsCollection: - """Collection of methods related to the whole Design of Experiments. + """Provides a collection of methods related to design of experiments. - Accessed through ``client.design_of_experiments``. 
+ This class is accessed through ``client.design_of_experiments``. """ def __init__(self, client: "ansys.simai.core.client.SimAIClient"): self._client = client def download(self, file: Union[str, Path], format: str = "xlsx") -> None: - """Downloads the design of experiments data to the specified file or path. + """Download the design of experiments data to the specified file or path. Args: - file: the path of the file to put the content into - format: the expected format, either ``xlsx`` for excel or ``csv`` (by default ``xlsx``). + file: Path of the file to put the content into. + format: Expected format. The default is ``'xlsx'``. Options are ``'xlsx'`` + and ``'csv'``. Example: .. code-block:: python @@ -57,14 +58,14 @@ def download(self, file: Union[str, Path], format: str = "xlsx") -> None: ) def in_memory(self, format: Optional[str] = "csv") -> io.BytesIO: - """Loads the design of experiments data in memory. + """Load the design of experiments data in memory. Args: - file: the path of the file to put the content into - format: the expected format, either ``xlsx`` for excel or ``csv`` (by default ``csv``). + format: Expected format. The default is ``'csv'``. Options are ``'xlsx'`` + and ``'csv'``. Returns: - A :class:`~io.BytesIO` object containing the design of experiments data. + :class:`~io.BytesIO` object containing the design of experiments data. Example: .. code-block:: python @@ -73,7 +74,7 @@ def in_memory(self, format: Optional[str] = "csv") -> io.BytesIO: simai = ansys.simai.core.from_config() data = simai.design_of_experiments.in_memory(format="csv") - # Read data with CSV Reader, ... + # Read data with CSV reader, ...
""" return self._client._api.download_design_of_experiments( None, format, self._client.current_workspace.id diff --git a/src/ansys/simai/core/data/downloads.py b/src/ansys/simai/core/data/downloads.py index 74f0ae46..02bae203 100644 --- a/src/ansys/simai/core/data/downloads.py +++ b/src/ansys/simai/core/data/downloads.py @@ -30,7 +30,7 @@ class DownloadableResult: - """Object representing a result data for a post-processing in binary format.""" + """Provides the object representing a result data for a postprocessing in binary format.""" def __init__( self, @@ -45,18 +45,18 @@ def __init__( self._request_json_body = request_json_body def download(self, file: File) -> None: - """Downloads the post-processing data to the specified file or path. + """Download the postprocessing data to the specified file or path. Args: - file: Binary file-object or path of file to download the data into. + file: Binary file-object or path of the file to download the data into. """ self._download_file(self.url, file) def in_memory(self) -> BytesIO: - """Loads the post-processing data in memory. + """Load the postprocessing data in memory. Returns: - A :class:`io.BytesIO` object containing the post-processing data + :class:`io.BytesIO` object containing the postprocessing data. 
""" return self._download_file(self.url) diff --git a/src/ansys/simai/core/data/geometries.py b/src/ansys/simai/core/data/geometries.py index 540d0bf4..b3257397 100644 --- a/src/ansys/simai/core/data/geometries.py +++ b/src/ansys/simai/core/data/geometries.py @@ -48,59 +48,61 @@ class Geometry(UploadableResourceMixin, ComputableDataModel): - """Local representation of a geometry object.""" + """Provides the local representation of a geometry object.""" def __repr__(self) -> str: return f"" @property def name(self) -> str: - """The name of the geometry.""" + """Name of the geometry.""" return self.fields["name"] @property def metadata(self) -> Dict[str, Any]: - """User-given key-value associated to the geometry.""" + """User-given key-value associated with the geometry.""" return self.fields["metadata"] @property def creation_time(self) -> str: - """Time when the geometry was created, in an UTC ISO8601 format string.""" + """Time when the geometry was created in a UTC ISO8601 format string.""" return self.fields["creation_time"] def rename(self, name: str) -> None: - """Change the name of a geometry. + """Change the name of the geometry. Args: - name: the new name to give to the geometry + name: New name to give to the geometry. Note: - Only the stem part will be modified, the file extension is immutable. + Only the stem part is modified. The file extension is immutable. If a file extension is provided, it must be the same as the original one. - If your new filename already contains dots other than for the extension, + If the new filename already contains dots other than for the extension, the extension must be provided. """ self._client._api.update_geometry(self.id, name=name) self.reload() def update_metadata(self, metadata: Dict[str, Union[str, Number, bool]]): - """Change the metadata of a geometry. + """Change the metadata of the geometry. - - New keys-values will be added. - - Existing keys-values will be overwritten. 
- - Other key-values are not changed, to delete a metadata set it to None explicitly. + - New keys-values are added. + - Existing keys-values are overwritten. + - Other key-values are not changed. + + To delete a metadata, set it to ``None`` explicitly. Args: - metadata: dictionary with the new data + metadata: Dictionary with the new data. Examples: - Add or update a metadata + Add or update a metadata. .. code-block:: python geometry.update_metadata({"new_metadata": "value", "existing_metadata": 10}) - Remove all metadatas + Remove all metadata. .. code-block:: python @@ -112,7 +114,7 @@ def update_metadata(self, metadata: Dict[str, Union[str, Number, bool]]): def delete(self) -> None: """Delete the geometry and its data from the server. - All the objects associated to this geometry (predictions and post-processings) + All the objects associated with this geometry (predictions and postprocessings) are also deleted. See Also: @@ -123,24 +125,24 @@ def delete(self) -> None: def run_prediction( self, boundary_conditions: Optional[BoundaryConditions] = None, **kwargs ) -> "Prediction": - """Run a new prediction or return existing prediction. + """Run a new prediction or return an existing prediction. - This is a non-blocking method. The prediction object will be returned. - This prediction may be incomplete if its computation is not finished, - in which case the information will be filled once computation is complete. - State of the computation can be monitored with the prediction's is_ready - attribute, or waited upon with its wait() method. + This is a non-blocking method. The prediction object is returned. + This object may be incomplete if its computation is not finished, + in which case the information is filled once the computation is complete. + The state of the computation can be monitored with the prediction's ``is_ready`` + attribute or waited upon with its ``wait()`` method. 
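The non-blocking lifecycle described above (an ``is_ready`` flag plus a blocking ``wait()``) rests on a ``threading.Event``, as the ``ComputableDataModel`` hunks earlier in this diff show (``_set_is_over()`` sets the event; ``wait()`` blocks on it). Here is a minimal self-contained sketch of that pattern; ``FakePrediction`` is a hypothetical stand-in, not the SDK's real prediction class:

```python
import threading

# Illustrative sketch only: FakePrediction mimics the readiness/wait
# semantics of the SDK's ComputableDataModel-based prediction objects.
class FakePrediction:
    def __init__(self):
        self._is_over = threading.Event()
        self.has_failed = False

    def _set_is_over(self):
        # Called when the server reports that the computation succeeded or failed.
        self._is_over.set()

    @property
    def is_ready(self):
        # Finished creating, without error.
        return self._is_over.is_set() and not self.has_failed

    def wait(self, timeout=None):
        # Returns True if the computation finished, False if the wait timed out.
        return self._is_over.wait(timeout)

pred = FakePrediction()
assert not pred.is_ready                 # still "computing" on the server
assert pred.wait(timeout=0.01) is False  # times out: nothing has finished yet

# Simulate the server finishing the computation from another thread.
threading.Timer(0.05, pred._set_is_over).start()
assert pred.wait(timeout=5.0) is True    # unblocks once the event is set
assert pred.is_ready
```

The real SDK objects additionally raise on failure inside ``wait()``; this sketch only models the timeout and readiness semantics.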
- To learn more about the expected boundary conditions in your workspace you can do - ``simai.current_workspace.model.boundary_conditions`` or ``simai.predictions.boundary_conditions`` + To learn more about the expected boundary conditions in your workspace, you can use the + ``simai.current_workspace.model.boundary_conditions`` or ``simai.predictions.boundary_conditions`` attributes, where ``simai`` is your `~ansys.simai.core.client.SimAIClient` object. Args: - boundary_conditions: The boundary conditions to apply, as a dictionary - **kwargs: Boundary conditions can also be passed as kwargs + boundary_conditions: Boundary conditions to apply as a dictionary. + **kwargs: Boundary conditions to pass as keyword arguments. Returns: - The created prediction object + Created prediction object. Raises: ProcessingError: If the server failed to process the request. @@ -152,7 +154,7 @@ def run_prediction( geometry = simai.geometries.list()[0] geometry.run_prediction(dict(Vx=10.5, Vy=2)) - Using kwargs: + Use kwargs: .. code-block:: python @@ -163,23 +165,24 @@ def run_prediction( return self._client.predictions._model_from(prediction_response) def get_predictions(self) -> List["Prediction"]: - """Get the predictions objects associated to the geometry.""" + """Get the prediction objects associated with the geometry.""" predictions_data = self._client._api.get_geometry_predictions(self.id) return [self._client.predictions._model_from(pred_data) for pred_data in predictions_data] def download( self, file: Optional[File] = None, monitor_callback: Optional[MonitorCallback] = None ) -> Union[None, BinaryIO]: - """Downloads the geometry into the provided file, or in memory if no file is provided. + """Download the geometry into the provided file or in memory if no file is provided. Args: - file: An optional binary file-object or the path of the file to put the + file: Optional binary file-object or the path of the file to put the content into.
- monitor_callback: An optional callback to monitor the progress of the download. - See :obj:`~ansys.simai.core.data.types.MonitorCallback` for details. + monitor_callback: Optional callback to monitor the progress of the download. + For more information, see the :obj:`~ansys.simai.core.data.types.MonitorCallback` + object. Returns: - None if a file is provided, a :class:`~io.BytesIO` with the geometry's content otherwise. + ``None`` if a file is provided or the :class:`~io.BytesIO` object with the geometry's content otherwise. """ return self._client._api.download_geometry(self.id, file, monitor_callback) @@ -193,64 +196,60 @@ def sweep( include_diagonals: Optional[bool] = None, tolerance: Optional[float] = None, ) -> List["Geometry"]: - """Returns geometries whose metadata are closest to the candidate geometry. + """Return geometries whose metadata are closest to the candidate geometry. - Sweep returns geometries which have the values closest to the candidate - geometry, for each considered metadata variable. For instance, if - sweeping along "length" and "width" metadata variables, the method - will return geometries which have identical width, and closest smaller - and bigger length; as well as identical length, and closest smaller - and bigger width. + This method returns geometries that have the values closest to the candidate + geometry for each considered metadata variable. For example, if + sweeping along ``length`` and ``width`` metadata variables, the method + returns geometries that have identical width and the closest smaller + and larger length, as well as identical length and the closest smaller + and larger width. - The ``fixed_metadata`` array allows to fix one or several variables: - for each fixed variable, the resulting geometries must have + The ``fixed_metadata`` array allows you to fix one or several variables. + For each fixed variable, the resulting geometries must have a metadata value equal to the considered geometry. 
For example, if - ``fixed_metadata`` is ``["xbow"]``, every result ``geometry.metadata["xbow"]`` - must be equal to the ``candidate_geometry.metadata["xbow"]`` + ``fixed_metadata`` is ``["xbow"]``, every ``geometry.metadata["xbow"]`` + result must be equal to the ``candidate_geometry.metadata["xbow"]``. Metadata passed neither in ``swept_metadata`` nor in ``fixed_metadata`` are ignored and can have any value (or absence thereof). Args: - swept_metadata: Optional metadata name, or list of metadata names, - which should be considered. Only variables containing numerical - values are supported. - If not passed, all metadata containing numerical values are - taken into account. - fixed_metadata: Optional list of metadata variables which should + swept_metadata: Optional metadata name or a list of metadata names + to consider. Only variables containing numerical values are + supported. If no metadata names are passed, all metadata containing + numerical values are taken into account. + fixed_metadata: Optional list of metadata variables that should be fixed, meaning that all the resulting geometries - will have those values equal to the candidate geometry. - geometries: Optional list of Geometry objects to consider for sweeping. - If not passed, all geometries are used. - tolerance: Optional delta, if the difference between two numbers - is lower than tolerance, they are considered as equal - (default 10**-6). - order: Optional depth of sweep, defaults to 1. Determines the number - of returned groups of equal smaller and bigger values for each - swept variable. For instance if sweeping on a space with - lengths [1, 2.1, 2.1, 3.5, 3.5, 4, 4] - from the candidate with length=1, - order=2 will return the geometries with lengths 2.1, 2.1, 3.5, 3.5. - include_center: Optional boolean (defaults to False) - determining if geometries with values equal to the candidate - geometry (including the candidate itself) should be returned - among the result. 
- include_diagonals: Optional boolean (defaults to False) - determining when sweeping on more than 1 variable, - if "diagonals" should be included. - For instance, if sweeping on 2 variables from point (0, 0) - and with order 1, - in addition to (0, 1) and (1, 0), geometry (1, 1) will be returned + have those values equal to the candidate geometry. + geometries: Optional list of ``Geometry`` objects to consider for sweeping. + If no ``Geometry`` objects are passed, all geometries are used. + tolerance: Optional delta. If the difference between two numbers + is lower than the tolerance, they are considered as equal. + The default is ``10**-6``. + order: Optional depth of the sweep. The default is ``1``. This parameter + determines the number of returned groups of equal smaller and + larger values for each swept variable. For example, if sweeping + on a space with lengths ``[1, 2.1, 2.1, 3.5, 3.5, 4, 4]`` + from the candidate with ``length=1``, ``order=2`` returns + the geometries with lengths ``2.1, 2.1, 3.5, 3.5``. + include_center: Optional Boolean indicating whether geometries with values + equal to the candidate geometry (including the candidate itself) are + to be returned among the result. The default is ``False``. + include_diagonals: Optional Boolean indicating whether to include diagonals + when sweeping on more than one variable. The default is ``False``. + For example, if sweeping on two variables from point ``(0, 0)`` + and with ``order=1``, in addition to ``(0, 1)`` and ``(1, 0)``, + geometry ``(1, 1)`` is returned. Returns: - A list of Geometry objects, - neighboring the candidate geometry for each metadata + List of ``Geometry`` objects neighboring the candidate geometry for each metadata. 
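The ``order`` semantics documented above can be sketched on plain lists of values. This is an illustrative sketch under stated assumptions: ``sweep_values`` is a hypothetical helper (not the SDK's implementation), it works on raw numbers rather than ``Geometry`` objects, and it ignores tolerance handling for brevity:

```python
import collections

# Hypothetical sketch of the sweep's "order" logic: bucket equal values
# together, then keep `order` groups of smaller values and `order` groups
# of larger values than the candidate.
def sweep_values(values, candidate, order=1, include_center=False):
    # Group equal values together, as the sweep does with metadata buckets.
    buckets = collections.defaultdict(list)
    for v in values:
        buckets[v].append(v)
    # A bounded deque keeps only the `order` closest groups of smaller values.
    smaller = collections.deque([], order)
    center_and_bigger = []
    bigger_count = 0
    for value in sorted(buckets):
        group = buckets[value]
        if value < candidate:
            smaller.append(group)
        elif value == candidate:
            if include_center:
                center_and_bigger.append(group)
        elif bigger_count < order:
            center_and_bigger.append(group)
            bigger_count += 1
    return [v for group in [*smaller, *center_and_bigger] for v in group]

# The docstring's example: order=2 from the candidate with length=1.
lengths = [1, 2.1, 2.1, 3.5, 3.5, 4, 4]
print(sweep_values(lengths, candidate=1, order=2))  # [2.1, 2.1, 3.5, 3.5]
```

This reproduces the worked example in the docstring: from ``length=1`` with ``order=2``, the two nearest larger groups (``2.1`` and ``3.5``) are kept and ``4`` is dropped.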
Raises: - ValueError: if a passed variable doesn't exist in the - candidate geometry - ValueError: if the condidered metadata contains non-numerical values - or mixed numerical and non numerical values + ValueError: If a passed variable doesn't exist in the + candidate geometry. + ValueError: If the considered metadata contains non-numerical values + or mixed numerical and non-numerical values. Example: .. code-block:: python @@ -276,9 +275,9 @@ def sweep( class GeometryDirectory(Directory[Geometry]): - """Collection of methods related to geometries. + """Provides a collection of methods related to geometries. - Accessed through ``client.geometries``. + This class is accessed through ``client.geometries``. Example: .. code-block:: python @@ -299,20 +298,21 @@ def list( """List geometries from the server that belong to the currently set workspace or the specified one. Args: - workspace: The id or :class:`model <.workspaces.Workspace>` of the workspace to list geometries for. - Necessary if no global workspace is set for the client. + workspace: ID or :class:`model <.workspaces.Workspace>` of the workspace to list geometries for. + This parameter is required if no global workspace is set for the client. filters: Optional filters. Only the elements with matching key-values in - their metadata will be returned. Each filter can be either: + their metadata are returned. Each filter can be one of the following data types: - - a string - - a numerical value (int or float) - - a :class:`Range`, to filter values matching a given numerical range of values + - A string + - A numerical value (int or float) + - A :py:class:`Range` condition for filtering values matching a + given numerical range of values Returns: - The list of all or filtered geometries on the server. + List of all or filtered geometries on the server.
Raises: - TypeError: if a Range condition is applied on non-numerical metadata + TypeError: If a :py:class:`Range` condition is applied on non-numerical metadata. """ workspace_id = get_id_from_identifiable(workspace, default=self._client._current_workspace) @@ -341,16 +341,18 @@ def filter(self, **kwargs: Dict[str, Union[str, float, Range]]) -> List[Geometry Args: kwargs: Filters to apply. Only the elements with matching key-values in - their metadata will be returned. Each filter can be either: - - a string - - a numerical value (int or float) - - a :py:class:`Range`, to filter values matching a given numerical range of values + their metadata are returned. Each filter can be one of the following data types: + + - A string + - A numerical value (int or float) + - A :py:class:`Range` condition for filtering values matching a + given numerical range of values Returns: - The list of filtered geometries on the server. + List of filtered geometries on the server. Raises: - TypeError: if a Range condition is applied on non-numerical metadata + TypeError: If a :py:class:`Range` condition is applied on non-numerical metadata. """ return self.list(filters=kwargs) @@ -360,24 +362,26 @@ def get( id: Optional[str] = None, workspace: Optional[Identifiable[Workspace]] = None, ) -> Geometry: - """Get a specific geometry object from the server, by name or by id. + """Get a specific geometry object from the server either by name or ID. + + You can specify either the ID or the name, not both. Args: - name: The name of the geometry to get, optional. - id: The id of the geometry to get, optional - workspace: The id or :class:`model <.workspaces.Workspace>` of the workspace containing the geometry. - Necessary when using name and no global workspace is set for the client + name: Name of the geometry. + id: ID of the geometry. + workspace: ID or :class:`model <.workspaces.Workspace>` of the workspace containing the geometry. 
+ This parameter is necessary if providing a name and no global workspace is set for the client. Returns: - A :py:class:`Geometry` + :py:class:`Geometry`. Raises: - InvalidArguments: If neither name nor id is given - NotFoundError: No geometry with the given name or id exists + InvalidArguments: If neither a name nor an ID is given. + NotFoundError: If no geometry with the given name or ID exists. Examples: - Get a geometry by name: + Get a geometry by name. .. code-block:: python @@ -387,14 +391,14 @@ def get( geometry = simai.geometries.get("my_geometry.stl") # geometry = simai.geometries.get(name="my_geometry.stl") # is equivalent - Get a geometry by id: + Get a geometry by ID. .. code-block:: python geometry = simai.geometries.get(id="abcdef12") """ if name and id: - raise InvalidArguments("Name and Id cannot be both specified.") + raise InvalidArguments("Name and ID cannot both be specified.") if name: return self._model_from( self._client._api.get_geometry_by_name( @@ -404,19 +408,19 @@ def get( ) if id: return self._model_from(self._client._api.get_geometry(id)) - raise InvalidArguments("Either the name or the id must be specified.") + raise InvalidArguments("Either the name or the ID must be specified.") def delete(self, geometry: Identifiable[Geometry]) -> None: """Delete a specific geometry and its data from the server. - All the objects associated to this geometry (predictions and post-processings) + All the objects associated with this geometry (predictions and postprocessings) are also deleted. Args: - geometry: The id or :class:`model ` of the geometry to delete + geometry: ID or :class:`model ` of the geometry. Raises: - NotFoundError: No geometry with the given id exists + NotFoundError: No geometry with the given ID exists. See Also: :func:`Geometry.delete` @@ -431,21 +435,22 @@ def upload( # noqa: D417 monitor_callback: Optional[MonitorCallback] = None, **kwargs, ) -> Geometry: - """Upload a geometry to SimAI's platform. 
+ """Upload a geometry to the SimAI platform. Args: - file: A binary file-object or the path of the file to open. - See :class:`~ansys.simai.core.data.types.NamedFile` for more details. - metadata: Optional metadatas to add to the geometry, simple key-value store. + file: Binary file-object or the path of the geometry to open. + For more information, see the :class:`~ansys.simai.core.data.types.NamedFile` class. + metadata: Optional metadata to add to the geometry's simple key-value store. Lists and nested objects are not supported. - workspace: The id or :class:`model <.workspaces.Workspace>` in which the geometry will be uploaded. - Necessary if no workspace is set for the client. - monitor_callback: An optional callback to monitor the progress of the download. - See :obj:`~ansys.simai.core.data.types.MonitorCallback` for details. - + workspace_id: ID or :class:`model <.workspaces.Workspace>` of the workspace to + upload the geometry to. This parameter is only necessary if no workspace + is set for the client. + monitor_callback: Optional callback for monitoring the progress of the download. + For more information, see the :obj:`~ansys.simai.core.data.types.MonitorCallback` + object. Returns: - The created :py:class:`Geometry` object + Created :py:class:`Geometry` object. """ workspace_id = get_id_from_identifiable(workspace, default=self._client._current_workspace) workspace = self._client.workspaces.get(workspace_id) @@ -473,17 +478,18 @@ def download( file: Optional[File] = None, monitor_callback: Optional[MonitorCallback] = None, ) -> Union[None, BinaryIO]: - """Downloads the geometry with the given id into the file at the given path. + """Download the geometry with the given ID into the file at the given path. Args: - geometry: The id or :class:`model ` of the geometry to download - file: An optional binary file-object or the path of the file to put the - content into - monitor_callback: An optional callback to monitor the progress of the download. 
- See :obj:`~ansys.simai.core.data.types.MonitorCallback` for details. + geometry: ID or :class:`model ` of the geometry. + file: Optional binary file-object or the path of the file to put the + content into. + monitor_callback: Optional callback for monitoring the progress of the download. + For more information, see the :obj:`~ansys.simai.core.data.types.MonitorCallback` + object. Returns: - None if a file is provided, a :class:`~io.BytesIO` with the geometry's content otherwise + ``None`` if a file is provided or a :class:`~io.BytesIO` object with the geometry's content otherwise. See Also: :func:`Geometry.download` @@ -503,9 +509,9 @@ def sweep( include_diagonals: Optional[bool] = None, tolerance: Optional[float] = None, ) -> List["Geometry"]: - """Returns geometries whose metadata are closest to the candidate geometry. + """Get the geometries whose metadata are closest to the candidate geometry. - See :func:`Geometry.sweep` for complete description and usage + For more information, see the :func:`Geometry.sweep` method. Example: .. 
code-block:: python diff --git a/src/ansys/simai/core/data/geometry_utils.py b/src/ansys/simai/core/data/geometry_utils.py index 02a1dea7..a8b1a7d8 100644 --- a/src/ansys/simai/core/data/geometry_utils.py +++ b/src/ansys/simai/core/data/geometry_utils.py @@ -52,9 +52,9 @@ def _sweep( if order is None: order = 1 if not isinstance(order, int): - raise TypeError(f"The order argument must be a positive integer (passed {order})") + raise TypeError(f"The 'order' argument must be a positive integer (passed {order}).") if order < 1: - raise ValueError(f"The order argument must be a positive integer (passed {order})") + raise ValueError(f"The 'order' argument must be a positive integer (passed {order}).") if tolerance is None: tolerance = DEFAULT_COMPARISON_EPSILON validate_tolerance_parameter(tolerance) @@ -68,12 +68,12 @@ def _sweep( swept_metadata = _enforce_as_list_passing_predicate( swept_metadata, lambda s: isinstance(s, str), - """The swept_metadata argument must be a geometry metadata name - or a list of metadata names + """The 'swept_metadata' argument must be a geometry metadata name + or a list of metadata names. """, ) - # Make sure all passed variables exist in geometry metadata, + # Make sure all passed variables exist in geometry metadata # and that the data is numerical for variable in swept_metadata: if variable not in candidate_geometry.metadata: @@ -100,7 +100,7 @@ def _sweep( # via multiple variables neighboring_geometries: Set["Geometry"] = set() - # if include_diagonal, we collect boundaries + # if include_diagonal, collect boundaries if include_diagonals: boundaries: Dict[str, Tuple[float, float]] = {} @@ -142,11 +142,11 @@ def _sweep( # 3. Now keep only as many smaller and bigger buckets as `order`. # Keep the central values if include_center is True. 
-        # Smaller neighbors will selected by a deque of max size=`order`:
+        # Smaller neighbors are selected by a deque of max size=`order`:
         # we push in it all the smaller values; the deque
-        # will keep only as many as we want (nb=`order`)
+        # keeps only as many as we want (nb=`order`)
         smaller_groups = collections.deque([], order)
-        # Bigger_geometries are collected in a list, by counting directly
+        # Bigger_geometries are collected in a list by counting directly
        equal_and_bigger_groups = []
        found_bigger_buckets = 0
        for bucket, geometry_group in buckets:
@@ -178,7 +178,7 @@ def _sweep(
             boundaries[checked_variable] = (min_boundary, max_boundary)
     if include_diagonals:
         # build the list of ranges matching all boundaries.
-        # selected geometries will need to match all those ranges.
+        # selected geometries must match all those ranges.
         all_ranges: Dict[str, Range] = {}
         for variable, (min_boundary, max_boundary) in boundaries.items():
             # If not include_center, we want the geometries
@@ -220,14 +221,15 @@ def _are_geometries_metadata_equal(
     tolerance: Optional[float] = None,
 ) -> bool:
     """Test if all compared metadata are equal between two geometries.
+    If a metadata value is absent from either geometry, the comparison is considered false.

     Returns:
-        True if metadata are equal for considered columns
+        ``True`` if the metadata are equal for considered columns.

     Raises:
-        ValueError: if one considered column contains numerical
-        and the other non-numerical data.
+        ValueError: If one considered column contains numerical
+            data and the other column contains non-numerical data.
""" for variable in compared_variables: if variable not in left_geometry.metadata or variable not in right_geometry.metadata: @@ -235,7 +236,7 @@ def _are_geometries_metadata_equal( left_val: float = left_geometry.metadata.get(variable) # type: ignore right_val: float = right_geometry.metadata.get(variable) # type: ignore if is_number(left_val) != is_number(right_val): - raise ValueError("Mixed numerical and non-numerical values in metadata {variable}") + raise ValueError("Mixed numerical and non-numerical values in metadata {variable}.") if is_number(left_val): if not is_equal_with_tolerance(left_val, right_val, tolerance=tolerance): return False @@ -260,7 +261,7 @@ def _geometry_matches_range_constraints( raise TypeError( """ Range constraint can only be used - on numerical metadata""" + on numerical metadata.""" ) if not range.match_value(value): return False diff --git a/src/ansys/simai/core/data/lists.py b/src/ansys/simai/core/data/lists.py index fbf46210..2ca92378 100644 --- a/src/ansys/simai/core/data/lists.py +++ b/src/ansys/simai/core/data/lists.py @@ -36,10 +36,10 @@ class PPList(List, Generic[T]): - """A subclass of :class:`list` for storing post-processings, adding a few shortcut methods. + """Provides a subclass of the :class:`list` class for storing postprocessings and adding a few shortcut methods. - As a :class:`list` subclass, PPList support any list operation: - its elements can be iterated on and accessed by index. + As a :class:`list` subclass, the ``PPList`` class supports any list operation. + Its elements can be iterated on and accessed by index. """ def __init__(self, selection: "Selection", post: Callable[["Prediction"], PostProcessing]): # noqa: D107 @@ -51,53 +51,52 @@ def __init__(self, selection: "Selection", post: Callable[["Prediction"], PostPr @property def data(self) -> Union[List[Dict[str, List]], List[DownloadableResult]]: - """Returns a list containing the data of the underlying post-processings. 
+ """List containing the data of the underlying postprocessings. - This is a blocking method, which will return once the data of all - post-processings is ready. + This is a blocking method, which returns once the data of all + postprocessings is ready. """ return [pp.data for pp in self] def wait(self): - """Wait for all concerned post-processings to finish.""" + """Wait for all concerned postprocessings to finish.""" _foreach_despite_errors(lambda pp: pp.wait(), self) class ExportablePPList(PPList, Generic[T]): - """A subclass of :class:`PPList` allowing to download the results of a group of post-processings. + """Provides a subclass of the :class:`PPList` class for downloading the results of a group of postprocessings. - As a :class:`list` subclass, ExportablePPList support any list operation: - its elements can be iterated on and accessed by index. + As a :class:`list` subclass, the ``ExportablePPList`` class supports any list operation. + Its elements can be iterated on and accessed by index. """ def export(self, format: Optional[str] = "json") -> DownloadableResult: - """Export the post-processings results in the desired format. + """Export the postprocessing results in the desired format. - Accessing this property will block until the data is ready. + Accessing this property blocks until the data is ready. Args: - format: format in which the data is to be exported: - one of ``json``, ``csv.zip`` or ``xlsx``, defaults to ``json``. - Please note that ``csv.zip`` exports a zip archive containing - multiple csv sheets. + format: format to exported data in. The default is ``'json'``. + Options are ``'csv.zip'``, ``'json'``, and ``'xlsx'``. + Note that the ``'csv.zip'`` option exports a ZIP file containing + multiple CSV sheets. Returns: - A :class:`~ansys.simai.core.data.downloads.DownloadableResult` object allowing - to download the exported data into a file - or access it in memory. 
+            :class:`~ansys.simai.core.data.downloads.DownloadableResult` object for
+            downloading the exported data into a file or accessing it in memory.
         """
         if format not in ["json", "csv.zip", "xlsx", "csv"]:
             raise InvalidArguments(
-                f"Export format must be among json, csv.zip, xlsx (passed {format})."
+                f"Export format must be json, csv.zip, or xlsx (passed {format})."
             )
         if format == "csv":
             warnings.warn(
-                "csv format will be deprecated, use csv.zip instead",
+                "The 'csv' format is being deprecated. Use 'csv.zip' instead.",
                 PendingDeprecationWarning,
                 stacklevel=1,
             )
         if len(self) < 1:
-            raise InvalidArguments("Selection contains no exportable post-processing.")
+            raise InvalidArguments("Selection contains no exportable postprocessing.")
         # Wait for all concerned post-processings to finish (and raise if errors)
         self.wait()
         client = self[0]._client
diff --git a/src/ansys/simai/core/data/optimizations.py b/src/ansys/simai/core/data/optimizations.py
index 972b490e..de2988e4 100644
--- a/src/ansys/simai/core/data/optimizations.py
+++ b/src/ansys/simai/core/data/optimizations.py
@@ -36,7 +36,7 @@
 class Optimization(ComputableDataModel):
-    """Local representation of an optimization definition object."""
+    """Provides the local representation of an optimization definition object."""

     def _try_geometry(
         self, geometry: Identifiable[Geometry], geometry_parameters: Dict
@@ -47,7 +47,7 @@
 class OptimizationTrialRun(ComputableDataModel):
-    """Local representation of an optimization trial run object.
+    """Provides the local representation of an optimization trial run object.

     The optimization trial run is an iteration of the optimization process.
     Each trial run tests a geometry and returns new parameters for the next geometry to try.
@@ -55,9 +55,9 @@
 class OptimizationDirectory(Directory[Optimization]):
-    """Collection of methods related to optimizations.
+ """Provides a collection of methods related to optimizations. - Accessed through ``client.optimizations``. + This class is accessed through ``client.optimizations``. Example: .. code-block:: python @@ -74,10 +74,10 @@ def get(self, optimization_id: str) -> Optimization: """Get a specific optimization object from the server. Args: - optimization_id: The id of the optimization to get + optimization_id: ID of the optimization. Returns: - A :py:class:`Optimization` + :py:class:`Optimization`. """ return self._model_from(self._client._api.get_optimization(optimization_id)) @@ -96,33 +96,37 @@ def run( """Run an optimization process. Args: - geometry_generation_fn: The function that will be called to generate a new geometry - with the generated parameters. - Should return a :obj:`~ansys.simai.core.data.types.NamedFile` - geometry_parameters: Specifies the name of the geometry parameters and their bounds or possible values (choices) - boundary_conditions: The values of the boundary conditions at which the optimization is performed. - They should map to existing boundary conditions in your project/workspace + geometry_generation_fn: Function to call to generate a new geometry + with the generated parameters. This parameter should return a + :obj:`~ansys.simai.core.data.types.NamedFile` object. + geometry_parameters: Name of the geometry parameters and their bounds or possible values (choices). + boundary_conditions: Values of the boundary conditions to perform the optimization at. + The values should map to existing boundary conditions in your project/workspace. minimize: List of global coefficients to minimize. - They should map to existing coefficients in your project/workspace + The global coefficients should map to existing coefficients in your project/workspace. maximize: List of global coefficients to maximize. 
-                They should map to existing coefficients in your project/workspace
-            outcome_constraints:
-                List of string representing a linear inequality constraint
-                on a global coefficient.
-                Outcome constraint should be of form ``gc >= x``,
-                where gc is a valid global coefficient name,
-                x is a float bound and comparison operator is ``>=`` or ``<=``
-            n_iters: Number of iterations of the optimization loop
-            show_progress: Whether to print progress on stdout
-            workspace: The workspace in which to run the optimization. Defaults to the configured workspace if not specified
+                The global coefficients should map to existing coefficients in your project/workspace.
+            outcome_constraints: List of strings representing a linear inequality constraint
+                on a global coefficient. The outcome constraint should be in the form ``gc >= x``,
+                where:
+
+                - ``gc`` is a valid global coefficient name.
+                - ``x`` is a float bound.
+                - The comparison operator is ``>=`` or ``<=``.
+
+            n_iters: Number of iterations of the optimization loop.
+            show_progress: Whether to print progress on stdout.
+            workspace: Workspace to run the optimization in. If a workspace is
+                not specified, the default is the configured workspace.

         Returns:
-            A list of dictionaries representing the result of each iterations. The list can be shorter
-            than the number of iterations when constraints are specified.
+            List of dictionaries representing the result of each iteration. When constraints
+            are specified, the list can be shorter than the number of iterations.

         Warning:
-            This is a long running process and your computer needs to be powered on to generate the iterations.
-            This method will attempt to prevent your computer from sleeping but please keep your computer open during the process.
+            This is a long-running process, and your computer must be powered on to generate the iterations.
+            This method attempts to prevent your computer from sleeping, but you should keep
+            your computer awake during the process.

         Example:
             ..
code-block:: python @@ -154,7 +158,7 @@ def my_geometry_generation_function(param_a, param_b): """ workspace_id = get_id_from_identifiable(workspace, False, self._client._current_workspace) if not minimize and not maximize: - raise InvalidArguments("No global coefficient to optimize") + raise InvalidArguments("No global coefficient to optimize.") objective = {} if minimize: for global_coefficient in minimize: @@ -170,32 +174,32 @@ def my_geometry_generation_function(param_a, param_b): "outcome_constraints": outcome_constraints or [], } with tqdm(total=n_iters, disable=not show_progress) as progress_bar: - progress_bar.set_description("Creating optimization definition") + progress_bar.set_description("Creating optimization definition.") optimization = self._model_from( self._client._api.define_optimization(workspace_id, optimization_parameters) ) optimization.wait() geometry_parameters = optimization.fields["initial_geometry_parameters"] - logger.debug("Optimization defined, starting optimization loop") + logger.debug("Optimization defined. Starting optimization loop.") iterations_results: List[Dict] = [] with keep.running() as k: if not k.success: logger.info("Failed to get sleep inhibition lock.") while geometry_parameters: - logger.debug(f"Generating geometry with parameters {geometry_parameters}") - progress_bar.set_description("Generating geometry") + logger.debug(f"Generating geometry with parameters {geometry_parameters}.") + progress_bar.set_description("Generating geometry.") # TODO: Somehow keep session alive for long geometry generation generated_geometry = geometry_generation_fn(**geometry_parameters) - logger.debug("Uploading geometry") - progress_bar.set_description("Uploading geometry") + logger.debug("Uploading geometry.") + progress_bar.set_description("Uploading geometry.") # TODO: Name geometry ourselves ? 
Then we need to know the output format geometry = self._client.geometries.upload( generated_geometry, metadata=geometry_parameters, workspace_id=workspace_id, ) - logger.debug("Running trial") - progress_bar.set_description("Running trial") + logger.debug("Running trial.") + progress_bar.set_description("Running trial.") trial_run = optimization.try_geometry(geometry, geometry_parameters) trial_run.wait() iteration_result = { @@ -208,10 +212,10 @@ def my_geometry_generation_function(param_a, param_b): else: logger.debug("Trial run results did not match constraints. Skipping.") geometry_parameters = trial_run.fields["next_geometry_parameters"] - logger.debug("Trial completed") + logger.debug("Trial complete.") progress_bar.update(1) - logger.debug("Optimization complete") - progress_bar.set_description("Optimization complete") + logger.debug("Optimization complete.") + progress_bar.set_description("Optimization complete.") return iterations_results @@ -220,7 +224,7 @@ class OptimizationTrialRunDirectory(Directory[OptimizationTrialRun]): _data_model = OptimizationTrialRun def get(self, trial_run_id: str): - """Get a specific trial run object from the server.""" + """Get a specific trial run from the server.""" return self._model_from(self._client._api.get_optimization_trial_run(trial_run_id)) def _try_geometry( diff --git a/src/ansys/simai/core/data/post_processings.py b/src/ansys/simai/core/data/post_processings.py index 32211d8a..6aa1ea77 100644 --- a/src/ansys/simai/core/data/post_processings.py +++ b/src/ansys/simai/core/data/post_processings.py @@ -42,13 +42,13 @@ class PostProcessing(ComputableDataModel, ABC): - """Local representation of a PostProcessing object. + """Provides the local representation of a ``PostProcessing`` object. - This is an abstract class, depending on the post-processing a different implementation - will be returned, see `Available post-processings`_. + This is an abstract class. 
Depending on the postprocessing, a different implementation
+    is returned. For more information, see :ref:`available_pp`.
     """

-    # NOTE for developers: New post-processings must be added to the root __init__.py file
+    # NOTE for developers: New postprocessings must be added to the root __init__.py file.

     def __init__(self, *args, **kwargs):
         super().__init__(*args, **kwargs)
@@ -58,26 +58,26 @@
     @property
     def parameters(self) -> Optional[Dict[str, Any]]:
-        """The parameters with which this post-processing was ran."""
+        """Parameters used to run the postprocessing."""
         return self.fields["location"]

     @property
     def prediction_id(self) -> str:
-        """The parent prediction's id.
+        """Parent prediction's ID.

         See Also:
-            - :attr:`prediction`: Get the parent prediction
+            - :attr:`prediction`: Get the parent prediction.
         """
         return self.fields["prediction_id"]

     @property
     def prediction(self) -> "Prediction":
-        """The parent prediction.
+        """Parent prediction.

-        It will be queried if not already known by the current SDK session.
+        The parent prediction is queried if it is not already known by the current SimAI client session.

         See Also:
-            - :attr:`prediction_id`: Return the parent prediction's id without query
+            - :attr:`prediction_id`: Get the parent prediction's ID without a query.
         """
         if self._prediction is None:
             if self.prediction_id in self._client.predictions._registry:
@@ -88,32 +88,32 @@
     @property
     def type(self) -> str:
-        """The type of PostProcessing that this object represents."""
+        """Type of postprocessing that this object represents."""
         return self._fields["type"]

     @property
     @abstractmethod
     def data(self):
-        """Get the data generated by the post-processing.
+        """Get the data generated by the postprocessing.
-        Return type may vary depending on the post-processing, it can be a ``dict``
-        or if the data is binary it will be a :class:`DownloadableResult` which provides
-        helpers to download the data into a file or into memory.
+        The return type may vary depending on the postprocessing. It can be a dictionary
+        or, if the data is binary, a :class:`DownloadableResult` object, which provides
+        helpers to download the data into a file or read it in memory.
        """

    @classmethod
    def _api_name(cls) -> str:
-        # The name of the post-processing in API calls. Override if different from the class name.
+        # Name of the postprocessing in API calls. Override if different from the class name.
        return cls.__name__

    def _get_results(self, cache=True):
-        """Internal method to retrieve the results of this post-processing.
+        """Internal method for getting the results of this postprocessing.

        Args:
            cache:
-                True by default to save results to cache
-                False for download links which expire,
-                thus must be queried just before download.
+                Whether to save results to the cache. The default is ``True``.
+                Use ``False`` for download links, which expire and thus must
+                be queried just before download.
        """
        if cache and self._results:
            return self._results
@@ -123,10 +123,10 @@
        return res

    def delete(self):
-        """Delete the post-processing and its result data.
+        """Delete the postprocessing and its result data.

        Raises:
-            NotFoundError: if the post-processing has already been deleted.
+            NotFoundError: If the postprocessing has already been deleted.
        """
        self._client._api.delete_post_processing(self.id)
        self._unregister()
@@ -136,20 +136,19 @@
 class ExportablePostProcessing(PostProcessing, ABC):
     def export(self, format: Optional[str] = "json") -> DownloadableResult:
-        """Export the post-processing results in the desired format.
+        """Export the postprocessing results in the desired format.
-        Accessing this property will block until the data is ready.
+        Calling this method blocks until the data is ready.

         Args:
-            format: format in which the data is to be exported:
-                one of ``json``, ``csv.zip`` or ``xlsx``, defaults to ``json``.
-                Please note that ``csv.zip`` exports a zip archive containing
-                multiple csv sheets.
+            format: Format to export the data in. The default is ``'json'``.
+                Options are ``'csv.zip'``, ``'json'``, and ``'xlsx'``. Note that
+                the ``'csv.zip'`` option exports a ZIP file containing
+                multiple CSV sheets.

         Returns:
-            A :class:`DownloadableResult` object allowing
-            to download the exported data into a file
-            or access it in memory.
+            :class:`DownloadableResult` object for downloading the exported
+            data into a file or accessing it in memory.
         """
         self.wait()
         return DownloadableResult(
@@ -161,20 +160,21 @@
 class GlobalCoefficients(ExportablePostProcessing):
-    """Representation of the global coefficients of a prediction.
+    """Provides the representation of the global coefficients of a prediction.

-    The data attribute contains a dictionary representing the GlobalCoefficients
+    The data attribute contains a dictionary representing the global coefficients
     with its pressure and velocity components.

-    Generated through :meth:`PredictionPostProcessings.global_coefficients()`
+    This class is generated through the :meth:`PredictionPostProcessings.global_coefficients()`
+    method.
     """

     @property
     def data(self) -> Dict[str, List]:
-        """A dictionary containing the GlobalCoefficients including pressure
+        """Dictionary containing the global coefficients, including pressure
         and velocity components.

-        Accessing this property will block until the data is ready.
+        Accessing this property blocks until the data is ready.
         """
         self.wait()

@@ -183,43 +183,44 @@
 class SurfaceEvol(ExportablePostProcessing):
-    """Representation of the SurfaceEvol.
+ """Provides the representation of the ``SurfaceEvol`` object. - Generated through :meth:`PredictionPostProcessings.surface_evol()` + This class is generated through :meth:`PredictionPostProcessings.surface_evol()` """ @property def data(self) -> DownloadableResult: - """A :class:`DownloadableResult` object allowing access to the - SurfaceEvol json data both directly in memory or to download it into a file. + """:class:`DownloadableResult` object that allows access to the + ``SurfaceEvol`` JSON data, both directly in memory any by downloading it + into a file. - Accessing this property will block until the data is ready. + Accessing this property blocks until the data is ready. """ self.wait() results = self._get_results(cache=False) return DownloadableResult(results["data"]["resources"]["json"], self._client) def as_dict(self) -> Dict[str, Any]: - """Helper to download the SurfaceEvol json data and load it as python dictionary. + """Download the SurfaceEvol JSON data and load it as a Python dictionary. - Accessing this method will block until the data is ready. + Accessing this help method blocks until the data is ready. """ return json.load(self.data.in_memory()) class Slice(PostProcessing): - """Representation of a slice from the prediction, in PNG or VTP format. + """Provides a representation of a slice from the prediction in PNG or VTP format. - Generated through :meth:`PredictionPostProcessings.slice()` + This class is generated through the :meth:`PredictionPostProcessings.slice` method. """ @property def data(self) -> DownloadableResult: - """A :class:`DownloadableResult` object allowing - to access the slice data both directly in memory - or to download it into a file. + """:class:`DownloadableResult` object that allows + access to slice data, both directly in memory + and by downloading it into a file. - Accessing this property will block until the data is ready. + Accessing this property blocks until the data is ready. 
Returns: A :class:`DownloadableResult` @@ -233,15 +234,15 @@ def data(self) -> DownloadableResult: class _PostProcessingVTKExport(PostProcessing, ABC): - """Representation of the result of the prediction in a format of the VTK family.""" + """Provides the representation of the result of the prediction in a format of the VTK family.""" @property def data(self) -> DownloadableResult: - """A :class:`DownloadableResult` object allowing - access to the VTK data either directly in memory - or downloaded it into a file. + """:class:`DownloadableResult` object that allows + access to the VTK data, either directly in memory + or by downloading it into a file. - Accessing this property will block until the data is ready. + Accessing this property blocks until the data is ready. """ self.wait() results = self._get_results(cache=False) @@ -249,23 +250,23 @@ def data(self) -> DownloadableResult: class VolumeVTU(_PostProcessingVTKExport): - """Export of the volume of the prediction in VTU format. + """Provides for exporting the volume of the prediction in VTU format. - Generated through :meth:`PredictionPostProcessings.volume_vtu()` + This class is generated through the :meth:`PredictionPostProcessings.volume_vtu()` method. """ class SurfaceVTP(_PostProcessingVTKExport): - """Export of the surface of the prediction in VTP format. + """Provides for exporting the surface of the prediction in VTP format. - Generated through :meth:`~PredictionPostProcessings.surface_vtp()` + This class is generated through the :meth:`~PredictionPostProcessings.surface_vtp()` method. """ class PredictionPostProcessings: - """Class acting as namespace inside :py:class:`~ansys.simai.core.data.predictions.Prediction` objects. + """Acts as the namespace inside :py:class:`~ansys.simai.core.data.predictions.Prediction` objects. - Allows to analyse the results of a prediction. + This class allows you to analyze the results of a prediction. 
It can be accessed from any prediction object through its :attr:`~ansys.simai.core.data.predictions.Prediction.post` property: @@ -288,88 +289,93 @@ def __init__(self, prediction: "Prediction"): def global_coefficients(self, run: bool = True) -> Optional[GlobalCoefficients]: """Compute or get the global coefficients of the prediction. - This is a non-blocking method. It will return the GlobalCoefficients + This is a non-blocking method. It returns the ``GlobalCoefficients`` object without waiting. This object may not have data right away - if computation is still in progress. Data will be filled - asynchronously once computation is finished. - State of computation can be monitored with the is_ready flag, - or waited upon with the wait() method. + if the computation is still in progress. Data is filled + asynchronously once the computation is finished. + The state of computation can be monitored with the ``is_ready`` flag + or waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method. - Subsequent calls will not relaunch it. + Computation is launched only on first call of this method. + Subsequent calls do not relaunch it. Args: - run: When False, the post-processing will not be computed and None will be - returned if it does not exist yet. + run: Boolean indicating whether to compute or get the postprocessing. + The default is ``True``. If ``False``, the postprocessing is not + computed, and ``None`` is returned if it does not exist yet. Returns: - A GlobalCoefficients object, that will eventually contain + ``GlobalCoefficients`` object that eventually contains the global coefficients with its pressure and velocity components. - `None` if `run` is `False` and the post-processing does not exist. + Returns ``None`` if ``run=False`` and the postprocessing does not exist. 
""" return self._get_or_run(GlobalCoefficients, {}, run) def surface_evol(self, axis: str, delta: float, run: bool = True) -> Optional[SurfaceEvol]: """Compute or get the SurfaceEvol for specific parameters. - This is a non-blocking method. It will return the SurfaceEvol + This is a non-blocking method. It returns the ``SurfaceEvol`` object without waiting. This object may not have data right away - if computation is still in progress. Data will be filled - asynchronously once computation is finished. - State of computation can be monitored with the is_ready flag, - or waited upon with the wait() method. + if computation is still in progress. Data is filled + asynchronously once the computation is finished. + The state of computation can be monitored with the ``is_ready`` flag + or waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method - with a specific set of parameters. - Subsequent calls with the same parameters will not relaunch it. + The computation is launched only on first call of this method + with a specific set of parameters. Subsequent calls with the + same parameters do not relaunch it. Args: - axis: For which axis the surface evol should be computed - delta: Increment of the abscissa in meters - run: When False, the post-processing will not be computed and None will be - returned if it does not exist yet. + axis: Axis to compute the surface evol for. + delta: Increment of the abscissa in meters. + run: Boolean indicating whether to compute or get the postprocessing. + The default is ``True``. If ``False``, the postprocessing is not + computed, and ``None`` is returned if it does not exist yet. Returns: - A SurfaceEvol allowing to access the values. - `None` if `run` is `False` and the post-processing does not exist. + ``SurfaceEvol`` that allows access to the values. + Returns ``None`` if ``run=False`` and the postprocessing does not exist. 
""" if axis not in ["x", "y", "z"]: - raise TypeError("axis must be x, y or z") + raise TypeError("Axis must be x, y, or z.") if not isinstance(delta, numbers.Number) or not (delta > 0): - raise TypeError(f"delta must be a positive number (got: {delta})") + raise TypeError(f"Delta must be a positive number (got: {delta}).") return self._get_or_run(SurfaceEvol, {"axis": axis, "delta": delta}, run) def slice( self, axis: str, coordinate: float, format: str = "png", run: bool = True ) -> Optional[Slice]: - """Compute or get a Slice for specific plane parameters. + """Compute or get a slice for specific plane parameters. - This is a non-blocking method. It will return the Slice + This is a non-blocking method. It returns the ``Slice`` object without waiting. This object may not have data right away - if computation is still in progress. Data will be filled - asynchronously once computation is finished. - State of computation can be monitored with the is_ready flag, - or waited upon with the wait() method. + if computation is still in progress. Data is filled + asynchronously once the computation is finished. + The state of computation can be monitored with the ``is_ready`` flag + or waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method - with a specific set of parameters. - Subsequent calls with the same parameters will not relaunch it. + The computation is launched only on first call of this method + with a specific set of parameters. Subsequent calls with the same + parameters do not relaunch it. The slice is in the NPZ format. Args: - axis: The axis to slice - coordinate: Coordinate along the given axis to slice at - format: The format of the output, "png" or "vtp", defaults to "png" - run: When False, the post-processing will not be computed and None will be - returned if it does not exist yet. + axis: Axis to slice. + coordinate: Coordinate along the given axis to slice at. + format: Format of the output. 
The default is ``'png'``. Options + are ``'png'`` and ``'vtp'``. + run: Boolean indicating whether to compute or get the postprocessing. + The default is ``True``. If ``False``, the postprocessing is not + computed, and ``None`` is returned if it does not exist yet. Returns: - A Slice object allowing to download the binary data. - `None` if `run` is `False` and the post-processing does not exist. + ``Slice`` object that allows downloading the binary data. + Returns ``None`` if ``run=False`` and the postprocessing does not exist. Example: - Make a slice and open it in a new window using the pillow library + Make a slice and open it in a new window using the `Pillow `_ + library. .. code-block:: python @@ -384,35 +390,36 @@ def slice( """ if axis not in ["x", "y", "z"]: - raise InvalidArguments(f"{axis} is not a valid axis, should be one of x, y, z") + raise InvalidArguments(f"{axis} is not a valid axis. It should be x, y, or z.") if format not in ["png", "vtp"]: - raise InvalidArguments(f"{format} is not a valid format, should be one of png, vtp") + raise InvalidArguments(f"{format} is not a valid format. It should be png or vtp.") plane = convert_axis_and_coordinate_to_plane_eq_coeffs(axis, coordinate) return self._get_or_run(Slice, {"plane": plane, "output_format": format}, run) def surface_vtp(self, run: bool = True) -> Optional[SurfaceVTP]: """Compute or get the result of the prediction's surface in VTP format. - This is a non-blocking method. It will return the PostProcessingVTP + This is a non-blocking method. It returns the ``PostProcessingVTP`` object without waiting. This object may not have data right away - if computation is still in progress. Data will be filled - asynchronously once computation is finished. - State of computation can be monitored with the is_ready flag, - or waited upon with the wait() method. + if the computation is still in progress. Data is filled + asynchronously once the computation is finished. 
+ The state of computation can be monitored with the ``is_ready`` flag + or waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method. - Subsequent calls will not relaunch it. + The computation is launched only on first call of this method. + Subsequent calls do not relaunch it. Args: - run: When False, the post-processing will not be computed and None will be - returned if it does not exist yet. + run: Boolean indicating whether to compute or get the postprocessing. + The default is ``True``. If ``False``, the postprocessing is not + computed, and ``None`` is returned if it does not exist yet. Returns: - A :class:`SurfaceVTP` object allowing to download the binary data. - `None` if `run` is `False` and the post-processing does not exist. + :class:`SurfaceVTP` object that allows downloading the binary data. + Returns ``None`` if ``run=False`` and the postprocessing does not exist. Examples: - Run and download a surface VTP + Run and download a surface VTP. .. code-block:: python @@ -423,7 +430,7 @@ def surface_vtp(self, run: bool = True) -> Optional[SurfaceVTP]: surface_vtp = prediction.post.surface_vtp().data.download("/tmp/simai.vtp") - Run a surface VTP and open a plot using pyvista + Run a surface VTP and open a plot using PyVista. .. code-block:: python @@ -446,22 +453,23 @@ def surface_vtp(self, run: bool = True) -> Optional[SurfaceVTP]: def volume_vtu(self, run: bool = True) -> Optional[VolumeVTU]: """Compute or get the result of the prediction's volume in VTU format. - This is a non-blocking method. It will return the PostProcessingVTU + This is a non-blocking method. It returns the ``PostProcessingVTU`` object without waiting. This object may not have data right away - if computation is still in progress. Data will be filled - asynchronously once computation is finished. - State of computation can be monitored with the is_ready flag, - or waited upon with the wait() method. 
+ if the computation is still in progress. Data is filled + asynchronously once the computation is finished. + The state of computation can be monitored with the ``is_ready`` flag + or waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method. - Subsequent calls will not relaunch it. + The computation is launched only on first call of this method. + Subsequent calls do not relaunch it. Args: - run: When False, the post-processing will not be computed and None will be - returned if it does not exist yet. + run: Boolean indicating whether to compute or get the postprocessing. + The default is ``True``. If ``False``, the postprocessing is not + computed, and ``None`` is returned if it does not exist yet. Returns: - A :class:`VolumeVTU` object allowing to download the binary data. + :class:`VolumeVTU` object that allows downloading the binary data. Examples: Run and download a volume VTU @@ -475,7 +483,7 @@ def volume_vtu(self, run: bool = True) -> Optional[VolumeVTU]: volume_vtu = prediction.post.volume_vtu().data.download("/tmp/simai.vtu") - Run a volume VTU and open a plot using pyvista + Run a volume VTU and open a plot using PyVista. .. code-block:: python @@ -497,9 +505,7 @@ def volume_vtu(self, run: bool = True) -> Optional[VolumeVTU]: @property def _local_post_processings(self): - """Returns the post processings launched by the local session, - thus which will be waited upon. - """ + """Postprocessings launched by the local session, which are waited upon.""" local_post_processings = [] for pp_by_type in self._post_processings.values(): local_post_processings.extend(pp_by_type.values()) @@ -513,22 +519,24 @@ def _delete_local_post_processing(self, post_processing: PostProcessing): def _get_or_run( self, pp_class: Type[PostProcessing], params: Dict[str, Any], run: bool ) -> Optional["PostProcessing"]: - """Get existing post-processing or run new one if it doesn't exist yet. 
+ """Get the existing postprocessing or run one if it doesn't exist yet. Args: - pp_class: Type of post processing - params: The params of the post-processing - run: if false, only gets existing post-processing - Nonblocking method. Runs (if not already ran/running) the post-processing of given type, - with given parameters or gets it if the run arg is False. - If already existing, will return the existing PostProcessing + pp_class: Type of postprocessing. + params: Parameters of the postprocessing. + run: Boolean indicating whether to compute or get the postprocessing. + If ``False``, this method only gets an existing postprocessing. + + This is a non-blocking method. It runs (if not already run orrunning) the postprocessing + of given type with the given parameters. If ``run=False``, if a preprocessing already + exits, it gets it. """ # FIXME frozenset(params.items()) works as long as there are no # collision between params (axis and delta for surface evol, param for slice) - # but will be broken if a new type of post-processings can have - # 2 params with the same value. + # but will be broken if a new type of postprocessings can have + # two params with the same value. params_frozen = frozenset(params.items()) - # If post-processing of this type and with those params already exist locally, return it + # If a postprocessing of this type and with those params already exists locally, return it if pp_class in self._post_processings and params_frozen in self._post_processings[pp_class]: return self._post_processings[pp_class][params_frozen] if run: @@ -547,8 +555,8 @@ def _get_or_run( raise InvalidServerStateError( cleandoc( f""" - Multiple post-processings where found when only one should have been. - Please contact us at support-simai@ansys.com with that message to help us fix this issue. + Multiple postprocessings where found when only one should be found. + Contact us at support-simai@ansys.com with this message to help us fix the issue. 
{[pp['id'] for pp in api_response]} """ ) @@ -585,14 +593,14 @@ def _model_from( else: constructor = self._data_model_for_type_name(data["type"]) if not constructor: - raise ValueError(f"""Received unknown post-processing type {data['type']}.""") + raise ValueError(f"""Received unknown postprocessing type {data['type']}.""") post_processing = super()._model_from(data, data_model=constructor) post_processing._prediction = prediction return post_processing @property def info(self) -> List[Dict[str, Any]]: - """Return a dictionary containing information about the available post-processings and their parameters. + """List of dictionaries containing information about the available postprocessings and their parameters. Example: .. code-block:: python @@ -607,16 +615,16 @@ def info(self) -> List[Dict[str, Any]]: return self._client.current_workspace.model.post_processings def get(self, id: str) -> PostProcessing: - """Get a specific post-processing object from the server. + """Get a specific postprocessing object from the server. Args: - id: The id of the post-processing to get + id: ID of the postprocessing. Returns: - The :py:class:`PostProcessing` with the given id if it exists + :py:class:`PostProcessing` with the given ID if it exists. Raises: - NotFoundError: No post-processing with the given id exists + NotFoundError: No postprocessing with the given ID exists. """ data = self._client._api.get_post_processing_result(id) return self._model_from(data) @@ -626,19 +634,20 @@ def list( post_processing_type: Optional[Type[PostProcessing]] = None, prediction: Optional[Identifiable["Prediction"]] = None, ) -> List[PostProcessing]: - """List the post-processings in the current workspace or associated to a prediction. + """List the postprocessings in the current workspace or associated with a prediction. - Optionally you choose to list only post-processings of a specific type.
- For the name of the available post-processings, please refer to :ref:`available_pp` + Optionally, you can choose to list only postprocessings of a specific type. + For the name of the available postprocessings, see :ref:`available_pp`. Note that the case of the names must be respected. Args: - post_processing_type: The type of post-processing to list - prediction: The id or :class:`model <.predictions.Prediction>` of a prediction, - if given the method will only return post-processings associated to it + post_processing_type: Type of postprocessing to list. + prediction: ID or :class:`model <.predictions.Prediction>` of a prediction. + If a value is specified, only postprocessings associated with this prediction + are returned. Raises: - NotFoundError: The post-processing type and/or the prediction id are incorrect. + NotFoundError: Postprocessing type and/or the prediction ID are incorrect. Example: .. code-block:: python @@ -670,17 +679,20 @@ def run( parameters: Optional[Dict[str, Any]] = None, **kwargs, ) -> PostProcessing: - """Run a post-processing on a prediction. + """Run a postprocessing on a prediction. - For the name and the parameters expected by the post-processings, - please refer to :ref:`available_pp` and :ref`pp_methods` sections. - Note that the case of the class names must be respected. + For the name and the parameters expected by the postprocessings, + see :ref:`available_pp` and :ref:`pp_methods`. Note that the case + of the class names must be respected. Args: - post_processing_type: The type of post-processing to run, as a string or as the class itself - prediction: The id or :class:`model <.predictions.Prediction>` of the prediction for which to run the post-processing. - parameters: The parameters to apply to the post-processing, if needed. Alternatively, parameters can be passed as kwargs. - **kwargs: Unpacked parameters for the post-processing + post_processing_type: Type of postprocessing to run, as a string + or as the class itself.
+ prediction: ID or :class:`model <.predictions.Prediction>` of the prediction + to run the postprocessing for. + parameters: Parameters to apply to the postprocessing, if needed. + Alternatively, parameters can be passed as kwargs. + **kwargs: Unpacked parameters for the postprocessing. Examples: .. code-block:: python @@ -711,8 +723,9 @@ def run( else: raise InvalidArguments( cleandoc( - f""""{post_processing_type}" is not a valid post-processing type. - You can find the available post-processings by accessing the .post_processings.info attribute of your SimAIClient + f""""{post_processing_type}" is not a valid postprocessing type. + You can find the available postprocessings by accessing the + ``.post_processings.info`` attribute of your SimAI client. """ ) ) @@ -723,10 +736,10 @@ def run( return prediction.post._get_or_run(pp_class, parameters, True) def delete(self, post_processing: Identifiable[PostProcessing]): - """Delete a post-processing. + """Delete a postprocessing. Args: - post_processing: The id or :class:`model ` of the post-processing to delete + post_processing: ID or :class:`model ` of the postprocessing. """ # FIXME?: This won't update the post_processings of the prediction's PredictionPostProcessings if any. # Doing so would require an extra call to get the prediction info and I'm not sure there's really a point diff --git a/src/ansys/simai/core/data/predictions.py b/src/ansys/simai/core/data/predictions.py index 9b56e6fd..d235e304 100644 --- a/src/ansys/simai/core/data/predictions.py +++ b/src/ansys/simai/core/data/predictions.py @@ -38,7 +38,7 @@ class Prediction(ComputableDataModel): - """Local representation of a prediction object.""" + """Provides the local representation of a prediction object.""" def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) @@ -47,21 +47,22 @@ def __init__(self, *args, **kwargs): @property def geometry_id(self) -> str: - """The id of the parent geometry. + """ID of the parent geometry. 
See Also: - - :attr:`geometry`: Get the parent geometry + - :attr:`geometry`: Get the parent geometry. """ return self.fields["geometry_id"] @property def geometry(self) -> Geometry: - """The parent geometry. + """Parent geometry. - It will be queried if not already known by the current SDK session. + The parent geometry is queried if it is not already known by the current + SimAI client session. See Also: - - :attr:`geometry_id`: Return the parent geometry's id without query + - :attr:`geometry_id`: Get the parent geometry's ID without a query. """ if self._geometry is None: if self.geometry_id in self._client.geometries._registry: @@ -72,22 +73,23 @@ def geometry(self) -> Geometry: @property def boundary_conditions(self) -> BoundaryConditions: - """The boundary conditions of the prediction.""" + """Boundary conditions of the prediction.""" return self.fields["boundary_conditions"] @property def post(self) -> PredictionPostProcessings: - """Namespace containing methods to post-process the result of a prediction. + """Namespace containing methods for postprocessing the result of a prediction. - See :py:class:`~ansys.simai.core.data.post_processings.PredictionPostProcessings` for more information + For more information, see the :py:class:`~ansys.simai.core.data.post_processings.PredictionPostProcessings` + class. """ return self._post_processings @property def confidence_score(self) -> str: - """The confidence score. Either *high* or *low*. + """Confidence score, which is either ``high`` or ``low``. - This method will block until the confidence score is computed. + Accessing this property blocks until the confidence score is computed. """ self.wait() return self.fields["confidence_score"] @@ -98,30 +100,28 @@ def delete(self) -> None: self._unregister() def feedback(self, **kwargs): # noqa: D417 - """Give us your feedback on a prediction to help us improve. + """Provide feedback on a prediction so improvements might be made.
This method enables you to give a rating (from 0 to 4) and a comment on a prediction. - Moreover you can upload your computed solution. - This feedback will help us make our predictions more accurate for you. + Moreover, you can upload your computed solution. + Your feedback is used to try to make predictions more accurate. Keyword Args: - rating (int): A rating from 0 to 4 - comment (str): Additional comment - solution (Optional[File]): Your solution to the - prediction + rating (int): Rating from 0 to 4. + comment (str): Additional comment. + solution (Optional[File]): Your solution to the prediction. """ self._client._api.send_prediction_feedback(self.id, **kwargs) def _wait_all(self): - """Wait until both this prediction, and any post-processing launched on it + """Wait until both this prediction and any postprocessing launched on it have finished processing. - Blocking method, which once called, blocks until both the prediction, - and any post-processing launched locally, have either finished processing, - or have failed. + This method blocks until both the prediction and any postprocessing launched + locally have either finished processing or have failed. - Post-processing launched by other SDK sessions or on the front-end + Postprocessings launched by other SimAI client sessions or on the front end are not waited upon. """ # wait for own creation Event @@ -131,7 +131,7 @@ def _wait_all(self): return # Wait for its post-processings if any if self.post._local_post_processings: - logger.debug("prediction: waiting for post-processings loading") + logger.debug("prediction: waiting for postprocessings loading") for post_processing in self.post._local_post_processings: post_processing.wait() @@ -145,9 +145,9 @@ def _merge_fields_from_results(self, results: dict): class PredictionDirectory(Directory[Prediction]): - """Collection of methods related to model predictions. + """Provides a collection of methods related to model predictions.
- Accessed through ``client.prediction``. + This class is accessed through ``client.prediction``. Example: .. code-block:: python @@ -162,17 +162,21 @@ class PredictionDirectory(Directory[Prediction]): @property def boundary_conditions(self) -> Dict[str, Any]: - """Information on the boundary conditions expected by the model of the current workspace, i.e. the prediction's input.""" + """Information on the boundary conditions expected by the model of the current workspace. + That is, the prediction's input. + """ return self._client.current_workspace.model.boundary_conditions @property def physical_quantities(self) -> Dict[str, Any]: - """Information on the physical quantities generated by the model, i.e. the prediction's output.""" + """Information on the physical quantities generated by the model. That is, the + prediction's output. + """ return self._client.current_workspace.model.physical_quantities @property def info(self): - """Information on the predictions inputs and outputs. + """Information on the prediction's inputs and outputs. Example: .. code-block:: python @@ -190,11 +194,11 @@ def info(self): } def list(self, workspace: Optional[Identifiable[Workspace]] = None) -> List[Prediction]: - """List all predictions on the server that belong to the specified workspace or configured one. + """List all predictions on the server that belong to the specified workspace or the configured one. Args: - workspace: The id or :class:`model <.workspaces.Workspace>` of the workspace for which to list the predictions, - necessary if no workspace is set for the client. + workspace: ID or :class:`model <.workspaces.Workspace>` of the workspace to list the predictions for. + This parameter is necessary if no workspace is set for the client.
""" return [ self._model_from(prediction) @@ -204,16 +208,16 @@ def list(self, workspace: Optional[Identifiable[Workspace]] = None) -> List[Pred ] def get(self, id: str) -> Prediction: - """Get a specific prediction object from the server. + """Get a specific prediction object from the server by ID. Args: - id: The id of the prediction to get + id: ID of the prediction. Returns: - The :class:`Prediction` with the given id if it exists + :class:`Prediction` instance with the given ID if it exists. Raises: - :class:`NotFoundError`: No prediction with the given id exists + :class:`NotFoundError`: No prediction with the given ID exists. """ return self._model_from(self._client._api.get_prediction(id)) @@ -221,10 +225,10 @@ def delete(self, prediction: Identifiable[Prediction]) -> None: """Delete a specific prediction from the server. Args: - prediction: The id or :class:`model ` of the prediction to delete + prediction: ID or :class:`model ` of the prediction. Raises: - :py:class:`ansys.simai.core.errors.NotFoundError`: No prediction with the given id exists + :py:class:`ansys.simai.core.errors.NotFoundError`: No prediction with the given ID exists. """ prediction_id = get_id_from_identifiable(prediction) self._client._api.delete_prediction(prediction_id) @@ -236,20 +240,20 @@ def run( # noqa: D417 boundary_conditions: Optional[BoundaryConditions] = None, **kwargs, ) -> Prediction: - """Run a SimAI prediction on the given geometry with the given boundary conditions. + """Run a prediction on a given geometry with a given boundary conditions. Boundary conditions can be passed as a dictionary or as kwargs. - To learn more about the expected boundary conditions in your workspace you can do + To learn more about the expected boundary conditions in your workspace, you can use the ``simai.current_workspace.model.boundary_conditions`` or ``simai.predictions.boundary_conditions`` - where ``ex`` is your `~ansys.simai.core.client.SimAIClient` object. 
+ attribute, where ``simai`` is your :class:`~ansys.simai.core.client.SimAIClient` object. Args: - geometry: The id or :class:`model <.geometries.Geometry>` of the target geometry - boundary_conditions: The boundary conditions to apply, in dictionary form + geometry: ID or :class:`model <.geometries.Geometry>` of the target geometry. + boundary_conditions: Boundary conditions to apply in dictionary form. Returns: - The created prediction object + Created prediction object. Raises: ProcessingError: If the server failed to process the request. @@ -275,20 +279,19 @@ def run( # noqa: D417 return prediction def feedback(self, prediction: Identifiable[Prediction], **kwargs) -> None: # noqa: D417 - """Give us your feedback on a prediction to help us improve. + """Provide feedback on a prediction so improvements might be made. This method enables you to give a rating (from 0 to 4) and a comment on a prediction. - Moreover you can upload your computed solution. - This feedback will help us make our predictions more accurate for you. + Moreover, you can upload your computed solution. + Your feedback is used to try to make predictions more accurate. Args: - prediction: The id or :class:`model ` of the prediction to give feedback for + prediction: ID or :class:`model ` of the prediction. Keyword Args: - rating (int): A rating from 0 to 4, required - comment (str): Additional comment, required - solution (typing.Optional[File]): Your solution to the - prediction, optional + rating (int): Rating from 0 to 4. + comment (str): Additional comment. + solution (typing.Optional[File]): Your solution to the prediction.
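The feedback contract documented above (an integer rating from 0 to 4, a comment, and an optional solution file) can be made concrete with a small client-side sketch. The helper name and payload layout are hypothetical, not part of the SDK.

```python
# Hypothetical helper (not part of the SimAI SDK) illustrating the
# feedback contract: rating is an integer from 0 to 4, comment is a
# string, and the solution file is optional.
from typing import Any, Dict, Optional


def build_feedback_payload(
    rating: int, comment: str, solution: Optional[Any] = None
) -> Dict[str, Any]:
    # Reject ratings outside the documented 0-4 range.
    if not isinstance(rating, int) or not 0 <= rating <= 4:
        raise ValueError(f"rating must be an integer from 0 to 4 (got: {rating!r})")
    payload: Dict[str, Any] = {"rating": rating, "comment": comment}
    if solution is not None:
        # The solution is only included when one is actually provided.
        payload["solution"] = solution
    return payload


payload = build_feedback_payload(rating=3, comment="Close to our reference run")
```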
""" self._client._api.send_prediction_feedback(get_id_from_identifiable(prediction), **kwargs) diff --git a/src/ansys/simai/core/data/projects.py b/src/ansys/simai/core/data/projects.py index 881c9aa9..6237a5dc 100644 --- a/src/ansys/simai/core/data/projects.py +++ b/src/ansys/simai/core/data/projects.py @@ -31,14 +31,14 @@ class Project(DataModel): - """Local representation of a Project object.""" + """Provides the local representation of a project object.""" def __repr__(self) -> str: return f"" @property def name(self) -> str: - """The name of project.""" + """Name of project.""" return self.fields["name"] @name.setter @@ -46,14 +46,14 @@ def name(self, new_name: str): """Rename the project. Args: - new_name: the new name to give to the project + new_name: New name to give to the project. """ self._client._api.update_project(self.id, name=new_name) self.reload() @property def data(self) -> List["TrainingData"]: - """Lists all the :class:`~ansys.simai.core.data.training_data.TrainingData` in this project.""" + """List of all :class:`~ansys.simai.core.data.training_data.TrainingData` instances in the project.""" raw_td_list = self._client._api.iter_training_data_in_project(self.id) return [ self._client.training_data._model_from(training_data) for training_data in raw_td_list @@ -61,10 +61,7 @@ def data(self) -> List["TrainingData"]: @property def sample(self) -> Optional["TrainingData"]: - """The sample of the project. - - The sample determines what variable and settings are available during model configuration. - """ + """Sample of the project. 
The sample determines which variables and settings are available during model configuration.""" raw_sample = self.fields["sample"] if raw_sample is None: return None @@ -77,17 +74,17 @@ def sample(self, new_sample: Identifiable["TrainingData"]): self.reload() def delete(self) -> None: - """Deletes the project.""" + """Delete the project.""" self._client._api.delete_project(self.id) class ProjectDirectory(Directory[Project]): - """Collection of methods related to projects. + """Provides a collection of methods related to projects. - Accessed through ``client.projects`` + This class is accessed through ``client.projects``. Example: - Listing all the projects:: + List all projects::
""" if name and id: - raise InvalidArguments("Only one of name and id arguments should be specified") + raise InvalidArguments("Only the 'id' or 'name' argument should be specified.") elif name: return self._model_from(self._client._api.get_project_by_name(name)) elif id: return self._model_from(self._client._api.get_project(id)) else: - raise InvalidArguments("Either name or id argument should be specified") + raise InvalidArguments("Either the 'id' or 'name' argument should be specified.") def delete(self, project: Identifiable[Project]) -> None: - """Deletes a project. + """Delete a project. Args: - project: The id or :class:`model ` of the project to delete + project: ID or :class:`model ` of the project. """ self._client._api.delete_project(get_id_from_identifiable(project)) diff --git a/src/ansys/simai/core/data/selection_post_processings.py b/src/ansys/simai/core/data/selection_post_processings.py index 823d0c98..9ddaba78 100644 --- a/src/ansys/simai/core/data/selection_post_processings.py +++ b/src/ansys/simai/core/data/selection_post_processings.py @@ -36,8 +36,8 @@ class SelectionPostProcessingsMethods: - """Class acting as namespace inside :py:class:`~ansys.simai.core.data.selections.Selection` objects, - allowing to access or run post-processings on whole selections. + """Acts as a namespace inside :py:class:`~ansys.simai.core.data.selections.Selection` objects, + allowing you to access or run postprocessings on whole selections. """ def __init__(self, selection: "Selection"): @@ -46,18 +46,20 @@ def __init__(self, selection: "Selection"): def global_coefficients(self) -> ExportablePPList[GlobalCoefficients]: """Compute or get the global coefficients of the selected predictions. - This is a non-blocking method. It will return a - :py:class:`~ansys.simai.core.data.lists.ExportablePPList` of :py:class:`~ansys.simai.core.data.post_processings.GlobalCoefficients` - objects without waiting. 
Those PostProcessing objects may not have - data right away if computation is still in progress. Data will be filled - asynchronously once computation is finished. - State of computation can be waited upon with the wait() method. + This is a non-blocking method. It returns an + :py:class:`~ansys.simai.core.data.lists.ExportablePPList` instance + of :py:class:`~ansys.simai.core.data.post_processings.GlobalCoefficients` + objects without waiting. Those ``PostProcessing`` objects may not have + data right away if the computation is still in progress. Data is filled + asynchronously once the computation is finished. + The state of computation can be waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method. - Subsequent calls will not relaunch it. + The computation is launched only on the first call of this method. + Subsequent calls do not relaunch it. Returns: - A :py:class:`~ansys.simai.core.data.lists.ExportablePPList` of :py:class:`~ansys.simai.core.data.post_processings.GlobalCoefficients` + :py:class:`~ansys.simai.core.data.lists.ExportablePPList` instance + of :py:class:`~ansys.simai.core.data.post_processings.GlobalCoefficients` objects. """ return ExportablePPList( @@ -65,26 +67,27 @@ def global_coefficients(self) -> ExportablePPList[GlobalCoefficients]: ) def surface_evol(self, axis: str, delta: float) -> ExportablePPList[SurfaceEvol]: - """Compute or get the SurfaceEvol of the predictions, for specific parameters. + """Compute or get the SurfaceEvol of the predictions for specific parameters. - This is a non-blocking method. It will return a - :py:class:`~ansys.simai.core.data.lists.ExportablePPList` of :py:class:`~ansys.simai.core.data.post_processings.SurfaceEvol` - objects without waiting. Those PostProcessing objects may not have - data right away if computation is still in progress. Data will be filled - asynchronously once computation is finished. 
- State of computation can be waited upon with the wait() method. + This is a non-blocking method. It returns an + :py:class:`~ansys.simai.core.data.lists.ExportablePPList` instance + of :py:class:`~ansys.simai.core.data.post_processings.SurfaceEvol` + objects without waiting. Those ``PostProcessing`` objects may not have + data right away if the computation is still in progress. Data is filled + asynchronously once the computation is finished. + The state of computation can be waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method + The computation is launched only on the first call of this method with a specific set of parameters. - Subsequent calls with the same parameters will not relaunch it. + Subsequent calls with the same parameters do not relaunch it. Args: - axis: For which axis the SurfaceEvol should be computed - delta: Increment of the abscissa in meters + axis: Axis to compute the SurfaceEvol on. + delta: Increment of the abscissa in meters. Returns: - A :py:class:`~ansys.simai.core.data.lists.ExportablePPList` of :py:class:`~ansys.simai.core.data.post_processings.SurfaceEvol` - objects. + :py:class:`~ansys.simai.core.data.lists.ExportablePPList` instance of + :py:class:`~ansys.simai.core.data.post_processings.SurfaceEvol` objects. """ return ExportablePPList( selection=self._selection, @@ -94,26 +97,27 @@ def surface_evol(self, axis: str, delta: float) -> ExportablePPList[SurfaceEvol] def slice(self, axis: str, coordinate: float) -> PPList[Slice]: """Compute or get a slice from each prediction in the selection. - This is a non-blocking method. It will return a - :py:class:`~ansys.simai.core.data.lists.PPList` of :py:class:`~ansys.simai.core.data.post_processings.Slice` - objects without waiting. Those PostProcessing objects may not have - data right away if computation is still in progress. Data will be filled - asynchronously once computation is finished.
- State of computation can be waited upon with the wait() method. + This is a non-blocking method. It returns a + :py:class:`~ansys.simai.core.data.lists.PPList` instance of + :py:class:`~ansys.simai.core.data.post_processings.Slice` + objects without waiting. Those ``PostProcessing`` objects may not have + data right away if the computation is still in progress. Data is filled + asynchronously once the computation is finished. + The state of computation can be waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method + The computation is launched only on the first call of this method with a specific set of parameters. - Subsequent calls with the same parameters will not relaunch it. + Subsequent calls with the same parameters do not relaunch it. - The slices will be in the NPZ format. + The slices are in the NPZ format. Args: - axis: The axis to slice - coordinate: Coordinate along the given axis to slice at + axis: Axis to slice. + coordinate: Coordinate along the given axis to slice at. Returns: - A :py:class:`~ansys.simai.core.data.lists.PPList` of :py:class:`~ansys.simai.core.data.post_processings.Slice` - objects. + :py:class:`~ansys.simai.core.data.lists.PPList` instance of + :py:class:`~ansys.simai.core.data.post_processings.Slice` objects. """ return PPList( selection=self._selection, @@ -123,38 +127,40 @@ def slice(self, axis: str, coordinate: float) -> PPList[Slice]: def volume_vtu(self) -> PPList[VolumeVTU]: """Compute or get the result of each prediction's volume in the VTU format. - This is a non-blocking method. It will return a - :py:class:`~ansys.simai.core.data.lists.PPList` of :py:class:`~ansys.simai.core.data.post_processings.VolumeVTU` - objects without waiting. Those PostProcessing objects may not have - data right away if computation is still in progress. Data will be filled - asynchronously once computation is finished. - State of computation can be waited upon with the wait() method.
+ This is a non-blocking method. It returns a + :py:class:`~ansys.simai.core.data.lists.PPList` instance of + :py:class:`~ansys.simai.core.data.post_processings.VolumeVTU` + objects without waiting. Those ``PostProcessing`` objects may not have + data right away if the computation is still in progress. Data is filled + asynchronously once the computation is finished. + The state of computation can be waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method. - Subsequent calls will not relaunch it. + The computation is launched only on the first call of this method. + Subsequent calls do not relaunch it. Returns: - A :py:class:`~ansys.simai.core.data.lists.PPList` of :py:class:`~ansys.simai.core.data.post_processings.VolumeVTU` - objects. + :py:class:`~ansys.simai.core.data.lists.PPList` instance of + :py:class:`~ansys.simai.core.data.post_processings.VolumeVTU` objects. """ return PPList(selection=self._selection, post=lambda pred: pred.post.volume_vtu()) def surface_vtp(self) -> PPList[SurfaceVTP]: """Compute or get the result of each prediction's surface in the VTP format. - This is a non-blocking method. It will return a - :py:class:`~ansys.simai.core.data.lists.PPList` of :py:class:`~ansys.simai.core.data.post_processings.SurfaceVTP` - objects without waiting. Those PostProcessing objects may not have - data right away if computation is still in progress. Data will be filled - asynchronously once computation is finished. - State of computation can be waited upon with the wait() method. + This is a non-blocking method. It returns a + :py:class:`~ansys.simai.core.data.lists.PPList` instance of + :py:class:`~ansys.simai.core.data.post_processings.SurfaceVTP` + objects without waiting. Those ``PostProcessing`` objects may not have + data right away if the computation is still in progress. Data is filled + asynchronously once the computation is finished. 
+ The state of computation can be waited upon with the ``wait()`` method. - Computation will be launched only on first call of this method. - Subsequent calls will not relaunch it. + The computation is launched only on the first call of this method. + Subsequent calls do not relaunch it. Returns: - A :py:class:`~ansys.simai.core.data.lists.PPList` of :py:class:`~ansys.simai.core.data.post_processings.SurfaceVTP` - objects. + :py:class:`~ansys.simai.core.data.lists.PPList` instance of + :py:class:`~ansys.simai.core.data.post_processings.SurfaceVTP` objects. """ return PPList(selection=self._selection, post=lambda pred: pred.post.surface_vtp()) diff --git a/src/ansys/simai/core/data/selections.py b/src/ansys/simai/core/data/selections.py index 3e9f9544..27cc6f82 100644 --- a/src/ansys/simai/core/data/selections.py +++ b/src/ansys/simai/core/data/selections.py @@ -39,9 +39,10 @@ class Point: - """A Point object, where a Prediction can be run. + """Provides a ``Point`` object, where a prediction can be run. - A Point is at the intersection of a :class:`~ansys.simai.core.data.geometries.Geometry` and :class:`~ansys.simai.core.data.types.BoundaryConditions`. + A point is at the intersection of a :class:`~ansys.simai.core.data.geometries.Geometry` + instance and :class:`~ansys.simai.core.data.types.BoundaryConditions` instance.
""" def __init__(self, geometry: Geometry, boundary_conditions: BoundaryConditions): @@ -51,23 +52,25 @@ def __init__(self, geometry: Geometry, boundary_conditions: BoundaryConditions): @property def geometry(self) -> Geometry: - """Returns the :class:`~ansys.simai.core.data.geometries.Geometry` object for this :class:`Point`.""" + """:class:`~ansys.simai.core.data.geometries.Geometry` object for the :class:`Point` instance.""" return self._geometry @property def boundary_conditions(self) -> BoundaryConditions: - """Returns the :class:`~ansys.simai.core.data.types.BoundaryConditions` for this :class:`Point`.""" + """:class:`~ansys.simai.core.data.types.BoundaryConditions` object for the :class:`Point` + instance. + """ return self._boundary_conditions @property def prediction(self) -> Union[Prediction, None]: - """Returns the :class:`~ansys.simai.core.data.predictions.Prediction` - corresponding to this Point, or None if no prediction has yet been ran. + """:class:`~ansys.simai.core.data.predictions.Prediction` instance + corresponding to the point or ``None`` if no prediction has yet been run. """ return self._prediction def run_prediction(self, boundary_conditions: BoundaryConditions): - """Runs the prediction on this Geometry for this boundary condition.""" + """Run the prediction on the geometry for this boundary condition.""" self._prediction = self._geometry.run_prediction(boundary_conditions=boundary_conditions) def __repr__(self): @@ -77,21 +80,21 @@ def __repr__(self): class Selection: - """A Selection object, which is a collection of :class:`Points `. + """Provides a ``Selection`` object, which is a collection of :class:`Points ` instances. Selections are built from a list of :class:`Geometries ` - and a list of :class:`~ansys.simai.core.data.types.BoundaryConditions`. + instances and a list of :class:`~ansys.simai.core.data.types.BoundaryConditions` instances. 
- The resulting Selection contains all combinations between the geometries + The resulting selection contains all combinations between the geometries and the boundary conditions. Args: - geometries: the geometries to include in the selection - boundary_conditions: the boundary conditions to include in the selection - tolerance: Optional delta applied to boundary condition equality; - if the difference between two boundary conditions - is lower than tolerance, they are considered as equal - (default 10**-6). + geometries: Geometries to include in the selection. + boundary_conditions: Boundary conditions to include in the selection. + tolerance: Optional delta to apply to boundary condition equality. + The default is ``10**-6``. If the difference between two boundary + conditions is lower than the tolerance, the two boundary conditions + are considered as equal. """ def __init__( @@ -104,12 +107,12 @@ def __init__( geometries = _enforce_as_list_passing_predicate( geometries, lambda g: isinstance(g, Geometry), - "geometries must be a Geometry or a list of Geometry objects", + "'geometries' must be a geometry or a list of 'Geometry' objects.", ) boundary_conditions = _enforce_as_list_passing_predicate( boundary_conditions, lambda bc: is_boundary_conditions(bc), - "boundary_conditions must be a dict of numbers", + "'boundary_conditions' must be a dictionary of numbers.", ) if tolerance is None: tolerance = DEFAULT_COMPARISON_EPSILON @@ -128,64 +131,71 @@ def __init__( @property def points(self) -> List[Point]: - """Returns a list of all the :class:`Points ` composing this Selection.""" + """List of all :class:`Points ` instances in the selection.""" return self._points @property def predictions(self) -> List[Prediction]: - """Returns a list of all the existing :class:`Predictions ` in this selection.""" + """List of all existing :class:`Prediction ` + instances in the selection. 
+ """ return self.get_predictions() @property def geometries(self) -> List[Geometry]: - """Returns a list of all the existing :class:`Geometries ` in this selection.""" + """List of all existing :class:`Geometries ` + instances in the selection. + """ return self._geometries @property def boundary_conditions(self) -> List[BoundaryConditions]: - """Returns a list of all the existing :class:`BoundaryConditions ` in this selection.""" + """List of all existing :class:`BoundaryConditions ` + instances in the selection. + """ return self._boundary_conditions @property def points_with_prediction(self) -> List[Optional[Point]]: - """Returns a list of all the points :class:`Points ` in this selection for which a prediction exists.""" + """List of all :class:`Points ` instances in the selection where predictions exist.""" return [(point if point.prediction else None) for point in self.points] @property def points_without_prediction(self) -> List[Optional[Point]]: - """Returns a list of all the points :class:`Points ` in this selection for which no prediction exists.""" + """List of all :class:`Points ` instances in the selection where predictions don't exist.""" return [(point if point.prediction is None else None) for point in self.points] def get_predictions(self) -> List[Prediction]: # noqa D102 return [point.prediction for point in self.points if point.prediction is not None] def get_runnable_predictions(self) -> List[Point]: - """Return a list of :class:`Points ` in this selection - for which predictions haven't been ran yet. + """Get a list of all :class:`Points ` instances in the selection where predictions haven't + been run yet.
""" return [point for point in self.points if point.prediction is None] def run_predictions(self) -> None: - """Run all the missing predictions in this selection.""" + """Run all missing predictions in the selection.""" _foreach_despite_errors( lambda point: point.run_prediction(boundary_conditions=point.boundary_conditions), self.get_runnable_predictions(), ) def wait(self) -> None: - """Wait for all the ongoing operations (predictions, post-processings) - in this selection to finish. + """Wait for all ongoing operations (predictions and postprocessings) + in the selection to finish. Raises: - ansys.simai.core.errors.SimAIError: if a single error occurred during computing this selection's operations - ansys.simai.core.errors.MultipleErrors: if multiple exceptions occurred when computing this selection's operations + ansys.simai.core.errors.SimAIError: If a single error occurred when computing this selection's operations. + ansys.simai.core.errors.MultipleErrors: If multiple exceptions occurred when computing this selection's operations. """ _foreach_despite_errors(lambda prediction: prediction._wait_all(), self.get_predictions()) def reload(self) -> None: - """Refreshes the predictions in this selection. - Loads any prediction ran from another session, - or removes possible deleted predictions. + """Refresh the predictions in the selection. + + This method loads any predictions run from another session and + removes possible deleted predictions. """ _predictions_by_geometry_id: Dict[str, List[Prediction]] = {} for point in self.points: @@ -210,10 +220,11 @@ def reload(self) -> None: @property def post(self) -> SelectionPostProcessingsMethods: - """Namespace containing methods to access and run post-processings - for predictions in this selection. + """Namespace containing methods to access and run postprocessings + for the predictions in the selection.
+ + For more information, see the :py:class:`~ansys.simai.core.data.selection_post_processings.SelectionPostProcessingsMethods` + class. - See :py:class:`~ansys.simai.core.data.selection_post_processings.SelectionPostProcessingsMethods` - for more information. """ return self._post_processings diff --git a/src/ansys/simai/core/data/training_data.py b/src/ansys/simai/core/data/training_data.py index f72519f8..b96db697 100644 --- a/src/ansys/simai/core/data/training_data.py +++ b/src/ansys/simai/core/data/training_data.py @@ -56,32 +56,35 @@ def _upload_training_data_part(id, named_part, client, monitor_callback): class TrainingData(ComputableDataModel): - """Local representation of a TrainingData object.""" + """Provides the local representation of a training data object.""" def __repr__(self) -> str: return f"" @property def name(self) -> str: - """The name of the training data.""" + """Name of the training data.""" return self.fields["name"] @property def parts(self) -> List["TrainingDataPart"]: - """Lists the :class:`parts` in that TrainingData.""" + """List of all :class:`parts` + objects in the training data. + """ return [ self._client.training_data_parts._model_from(training_data_part) for training_data_part in self.fields["parts"] ] def get_subset(self, project: Identifiable["Project"]) -> Optional[str]: - """Indicates which subset this training data belongs to, in relation to the given project. + """Get the subset that the training data belongs to, in relation to the given project. Args: - project: The id or :class:`model <.projects.Project>` of the project to check for :class:`~.projects.Project` to check for, or its id + project: ID or :class:`model <.projects.Project>` of the project to check + the training data against. Returns: - The name of the subset this training data belongs to in the given project + Name of the subset that the training data belongs to in the given project.
""" project_model = get_object_from_identifiable( project, self._client.projects, default=self._client.current_project @@ -92,34 +95,35 @@ def get_subset(self, project: Identifiable["Project"]) -> Optional[str]: @property def extracted_metadata(self) -> Optional[Dict]: - """The metadata extracted from the training data.""" + """Metadata extracted from the training data.""" return self.fields["extracted_metadata"] def compute(self) -> None: - """Requests to compute or recompute the training data. + """Compute or recompute the training data. - Training data should be computed once all its parts have been fully uploaded + Training data should be computed once all its parts have been fully uploaded. - Recomputation can only be requested if it previously failed or if new data has been added. + Training data can only be recomputed if computation previously failed or if new data has been added. """ self._client._api.compute_training_data(self.id) def delete(self) -> None: - """Deletes the training data on the server.""" + """Delete the training data on the server.""" self._client._api.delete_training_data(self.id) def upload_part( self, file: NamedFile, monitor_callback: Optional[MonitorCallback] = None ) -> "TrainingDataPart": - """Adds a part to a training data. + """Add a part to the training data. Args: - file: A :obj:`~ansys.simai.core.data.types.NamedFile` to upload. - monitor_callback: An optional callback to monitor the progress of the download. - See :obj:`~ansys.simai.core.data.types.MonitorCallback` for details. + file: :obj:`~ansys.simai.core.data.types.NamedFile` to upload. + monitor_callback: Optional callback for monitoring the progress of the upload. + For more information, see the :obj:`~ansys.simai.core.data.types.MonitorCallback` + object. Returns: - The created :class:`~ansys.simai.core.data.training_data_parts.TrainingDataPart` + Created :class:`~ansys.simai.core.data.training_data_parts.TrainingDataPart`.
""" return _upload_training_data_part(self.id, file, self._client, monitor_callback) @@ -129,52 +133,53 @@ def upload_folder( compute: bool = True, monitor_callback: Optional[MonitorCallback] = None, ) -> List["TrainingDataPart"]: - """Uploads all the parts contained in a folder to a :class:`~ansys.simai.core.data.training_data.TrainingData`. + """Upload all the parts contained in a folder to a :class:`~ansys.simai.core.data.training_data.TrainingData` instance. - Automatically requests computation of the training data + This method automatically requests computation of the training data once the upload is complete unless specified otherwise. Args: - folder_path: Path to the folder which contains the files to upload - compute: Whether to compute the training data after upload, defaults to True - monitor_callback: An optional callback to monitor the progress of the download. - See :obj:`~ansys.simai.core.data.types.MonitorCallback` for details. + folder_path: Path to the folder with the files to upload. + compute: Whether to compute the training data after upload. The default is ``True``. + monitor_callback: Optional callback for monitoring the progress of the upload. + For more information, see the :obj:`~ansys.simai.core.data.types.MonitorCallback` + object. Returns: - The list of created training data parts + List of uploaded training data parts. """ return self._directory.upload_folder(self.id, folder_path, compute, monitor_callback) def add_to_project(self, project: Identifiable["Project"]): - """Adds the training data into a :class:`~ansys.simai.core.data.projects.Project`. + """Add the training data to a :class:`~ansys.simai.core.data.projects.Project` object. Args: - project: The id or :class:`model <.projects.Project>` of the project into which the data is to be added + project: ID or :class:`model <.projects.Project>` object of the project to add the data to. 
""" project_id = get_id_from_identifiable(project) self._client._api.add_training_data_to_project(self.id, project_id) def remove_from_project(self, project: Identifiable["Project"]): - """Removes the training data from a :class:`~ansys.simai.core.data.projects.Project`. + """Remove the training data from a :class:`~ansys.simai.core.data.projects.Project` object. Args: - project: The id or :class:`model <.projects.Project>` of the project from which the data is to be removed. + project: ID or :class:`model <.projects.Project>` of the project to remove data from. Raises: - ansys.simai.core.errors.ApiClientError: if the data is the project's sample - ansys.simai.core.errors.ApiClientError: if the project is in training + ansys.simai.core.errors.ApiClientError: If the data is the project's sample. + ansys.simai.core.errors.ApiClientError: If the project is in training. """ project_id = get_id_from_identifiable(project) self._client._api.remove_training_data_from_project(self.id, project_id) class TrainingDataDirectory(Directory[TrainingData]): - """Collection of methods related to training data. + """Provides a collection of methods related to training data. - Accessed through ``client.training_data``. + This class is accessed through ``client.training_data``. Example: - Listing all the training data:: + List all of the training data:: import ansys.simai.core @@ -185,10 +190,10 @@ class TrainingDataDirectory(Directory[TrainingData]): _data_model = TrainingData def list(self) -> List[TrainingData]: - """Lists :class:`TrainingData` from the server. + """List all :class:`TrainingData` objects on the server. Returns: - The list of all TrainingData records on the server. + List of all :class:`TrainingData` objects on the server. 
""" return [ self._model_from(training_data) @@ -196,26 +201,26 @@ def list(self) -> List[TrainingData]: ] def get(self, id) -> TrainingData: - """Gets a specific :class:`TrainingData` from the server.""" + """Get a specific :class:`TrainingData` object from the server.""" return self._model_from(self._client._api.get_training_data(id)) def delete(self, training_data: Identifiable[TrainingData]) -> None: - """Deletes a TrainingData and it's associated parts from the server. + """Delete a :class:`TrainingData` object and its associated parts from the server. Args: - training_data: The id or :class:`model ` of the training data to delete + training_data: ID or :class:`model ` object of the :class:`TrainingData` object to delete. """ return self._client._api.delete_training_data(get_id_from_identifiable(training_data)) def create(self, name: str, project: Optional[Identifiable["Project"]] = None) -> TrainingData: - """Creates a new :class:`TrainingData` object. + """Create a :class:`TrainingData` object. Args: - name: The name given to the new :class:`TrainingData`. - project: Associate the data with a :class:`~.projects.Project`. + name: Name to give the new :class:`TrainingData` object. + project: :class:`~.projects.Project` object to associate the data with. Returns: - The created TrainingData + Created :class:`TrainingData` object. """ project_id = get_id_from_identifiable(project, required=False) return self._model_from(self._client._api.create_training_data(name, project_id)) @@ -226,16 +231,18 @@ def upload_part( self, training_data: Identifiable[TrainingData], file: NamedFile, monitor_callback: Optional[MonitorCallback], ) -> "TrainingDataPart": - """Adds a part to a training data. + """Add a part to a :class:`TrainingData` object. Args: - training_data: The id or :class:`model ` of the training data that will contain the part - file: A :obj:`~ansys.simai.core.data.types.NamedFile` to upload - monitor_callback: An optional callback to monitor the progress of the download.
- See :obj:`~ansys.simai.core.data.types.MonitorCallback` for details. + training_data: ID or :class:`model ` object of the training data to + add the part to. + file: :obj:`~ansys.simai.core.data.types.NamedFile` to upload. + monitor_callback: Optional callback for monitoring the progress of the upload. + For more information, see the :obj:`~ansys.simai.core.data.types.MonitorCallback` + object. Returns: - The created :class:`~ansys.simai.core.data.training_data_parts.TrainingDataPart` + Added :class:`~ansys.simai.core.data.training_data_parts.TrainingDataPart` object. """ return _upload_training_data_part( get_id_from_identifiable(training_data), file, self._client, monitor_callback @@ -244,19 +251,19 @@ def upload_part( def upload_folder( self, training_data: Identifiable[TrainingData], folder_path: Path, compute: bool = True ) -> List["TrainingDataPart"]: - """Uploads all the files contained in a folder to a :class:`~ansys.simai.core.data.training_data.TrainingData`. + """Upload all files in a folder to a :class:`~ansys.simai.core.data.training_data.TrainingData` object. - Automatically requests computation of the training data - once the upload is complete unless specified otherwise + This method automatically requests computation of the training data once the upload is complete + unless specified otherwise. Args: - training_data: The id or :class:`model ` of the training data that will contain the parts - folder_path: Path to the folder which contains the files to upload - compute: Whether to compute the training data after upload, defaults to True + training_data: ID or :class:`model ` object of the training data to upload parts to. + folder_path: Path to the folder that contains the files to upload. + compute: Whether to compute the training data after upload. The default is ``True``. 
""" path = pathlib.Path(folder_path) if not path.is_dir(): - raise InvalidArguments("Provided path is not a folder") + raise InvalidArguments("Provided path is not a folder.") path_content = path.glob("[!.]*") files = (obj for obj in path_content if obj.is_file()) uploaded_parts = [] diff --git a/src/ansys/simai/core/data/training_data_parts.py b/src/ansys/simai/core/data/training_data_parts.py index 1ca0b679..f78fa25e 100644 --- a/src/ansys/simai/core/data/training_data_parts.py +++ b/src/ansys/simai/core/data/training_data_parts.py @@ -24,26 +24,26 @@ class TrainingDataPart(UploadableResourceMixin, DataModel): - """Local representation of a training data part object.""" + """Provides the local representation of a training data part object.""" def __repr__(self) -> str: return f"" @property def name(self) -> str: - """The name of the file.""" + """Name of the file.""" return self.fields["name"] @property def size(self) -> int: - """The size of the file, in bytes.""" + """Size of the file in bytes.""" return self.fields["size"] class TrainingDataPartDirectory(Directory[TrainingDataPart]): - """Collection of methods related to training data parts. + """Provides the collection of methods related to training data parts. - Accessed through ``client.training_data_parts`` + This class is accessed through ``client.training_data_parts``. """ _data_model = TrainingDataPart diff --git a/src/ansys/simai/core/data/types.py b/src/ansys/simai/core/data/types.py index 6a2bad2d..088ec4b8 100644 --- a/src/ansys/simai/core/data/types.py +++ b/src/ansys/simai/core/data/types.py @@ -47,7 +47,7 @@ Path = Union[pathlib.Path, str, os.PathLike] """ -Path to a file or folder, as a :obj:`pathlib.Path` or a format supported by pathlib. +Path to a file or folder as a :obj:`pathlib.Path` object or a format supported by ``pathlib``.
""" File = Union[BinaryIO, io.RawIOBase, io.BufferedIOBase, Path] @@ -57,8 +57,8 @@ NamedFile = Union[Path, Tuple[File, str]] """ -A named file is either a `FilePath`, from which a name can be inferred, or a tuple with a `File` and a name. -To be valid the name needs to contain an extension. +A named file is either a ``Path``, from which a name can be inferred, or a tuple with a file and a name. +To be valid, the name needs to contain an extension. Example: .. code-block:: python @@ -76,27 +76,25 @@ APIResponse = Union[Response, Dict[str, Any], List[Dict[str, Any]]] MonitorCallback = Callable[[int], None] -"""A callback used to monitor the download or upload of a file. +"""Callback used to monitor the download or upload of a file. -For downloads: -It will be called one time with the total size of the download. -Subsequent calls will pass the amount bytes read this iteration. +For downloads, the callback is called one time with the total size of the download. +Subsequent calls are passed the number of bytes read in each iteration. -For uploads: -The callback will receive the amount of bytes written each iteration. +For uploads, the callback receives the number of bytes written each iteration.
""" Identifiable = Union[DataModelType, str] -"""Either a Model or the string id of an object of the same type""" +"""Either a model or the string ID of an object of the same type.""" def build_boundary_conditions(boundary_conditions: Optional[Dict[str, Number]] = None, **kwargs): bc = boundary_conditions if boundary_conditions else {} bc.update(**kwargs) if bc is None: - raise ValueError("No boundary condition was specified") + raise ValueError("No boundary condition was specified.") if not is_boundary_conditions(bc): - raise ValueError("Boundary conditions needs to be a dictionary with numbers as values") + raise ValueError("Boundary conditions must be in a dictionary with numbers as values.") return bc @@ -127,23 +125,23 @@ def are_boundary_conditions_equal( class Range: - """Describes a numerical range, used for filtering geometries. + """Describes a numerical range used for filtering geometries. - Range objects describe a numerical range between a minimal and - a maximal boundary. Both are optional, thus if no maximal boundary - is passed, the range describes values >= the min boundary. - Not that ranges are inclusive, thus both min and max boundaries - will match if equal to the passed value - (as opposed for instance to python's range() method). + Range objects describe a numerical range between a minimum and + a maximum boundary. Both are optional. Thus, if no maximum boundary + is passed, the range describes values greater than or equal to the + minimum boundary. Note that ranges are inclusive. Thus, both minimum + and maximum boundaries match if they are equal to the passed value + (as opposed to Python's ``range()`` method). - Ranges can be used as a filter in + Ranges can be used as a filter in the :func:`geometries.list` method. Args: - min: the minimal boundary - max: the maximal boundary - tolerance: a tolerance delta; two values whose difference is smaller - than tolerance are considered as equal. + min: Minimum boundary. + max: Maximum boundary. 
+ tolerance: Tolerance delta. Two values whose difference is smaller + than the tolerance are considered as equal. """ def __init__( self, @@ -157,7 +155,7 @@ def __init__( self.tolerance = tolerance def match_value(self, value: float) -> bool: - """Checks whether the given value belongs to the :class:`Range`.""" + """Determine whether the given value belongs to the range.""" if not is_number(value): return False # if min, value >= min @@ -174,7 +172,7 @@ def match_value(self, value: float) -> bool: class _HollowRange(Range): - """_HollowRange is a Range which excludes a value in its center.""" + """_HollowRange is a range that excludes a value in its center.""" def __init__( self, @@ -200,7 +198,7 @@ def match_value(self, value: float): def unpack_named_file( named_file: NamedFile, ) -> Generator[Tuple[BinaryIO, str, str], None, None]: - """Unpack a NamedFile by providing a readable file, its name and an extension.""" + """Unpack a named file by providing a readable file, its name, and an extension.""" if ( isinstance(named_file, Tuple) and len(named_file) == 2 @@ -214,14 +212,14 @@ def unpack_named_file( file = named_file filename = pathlib.Path(named_file).name else: - raise InvalidArguments("Did not receive a valid NamedFile type") + raise InvalidArguments("Did not receive a valid named file type.") # Parse name and extension try: obj_name, file_ext = filename.rsplit(".", 1) assert file_ext except (ValueError, AssertionError): - raise AttributeError(f"Could not determine file extension for {named_file}") from None + raise AttributeError(f"Could not determine file extension for {named_file}.") from None # Open the file if needed close_file = False @@ -248,7 +246,7 @@ def get_id_from_identifiable( elif default: return get_id_from_identifiable(default, required) elif required: - raise InvalidArguments(f"Argument {identifiable} is neither a data model nor an id string.") + raise InvalidArguments(f"Argument {identifiable} is neither a data model nor an ID string.") def get_object_from_identifiable( @@ -263,4 +261,4 @@ def get_object_from_identifiable( elif default: return get_object_from_identifiable(default, directory) else: - raise InvalidArguments(f"Argument {identifiable} is neither a data model nor an id string.") + raise InvalidArguments(f"Argument {identifiable} is neither a data model nor an ID string.") diff --git a/src/ansys/simai/core/data/workspaces.py b/src/ansys/simai/core/data/workspaces.py index 80c4e983..bb2751c8 100644 --- a/src/ansys/simai/core/data/workspaces.py +++ b/src/ansys/simai/core/data/workspaces.py @@ -28,7 +28,7 @@ class ModelManifest: - """Information about a model associated to a workspace.""" + """Provides information about a model associated with a workspace.""" def __init__(self, raw_manifest: Dict[str, Any]): self._raw = raw_manifest @@ -38,44 +38,44 @@ def __repr__(self) -> str: @property def name(self) -> str: - """The name of the model.""" + """Name of the model.""" return self._raw["model_name"] @property def version(self) -> str: - """The version of the software running the model.""" + """Version of the software running the model.""" return self._raw["coreml_version"] @property def description(self) -> str: - """A short description of the model.""" + """Short description of the model.""" return self._raw["description"] @property def geometry(self) -> Dict[str, Any]: - """Information on the geometry format expected by this model.""" + """Information on the geometry format expected by the model.""" return self._raw["geometry"] @property def boundary_conditions(self) -> Dict[str, Any]: - """Information on the boundary conditions expected by the model, i.e. the prediction's input.""" + """Information on the boundary conditions expected by the model. That is, the prediction's input.""" return self._raw["boundary_conditions"] @property def physical_quantities(self) -> Dict[str, Any]: - """Information on the physical quantities generated by the model, i.e.
the prediction's output.""" + """Information on the physical quantities generated by the model. That is, the prediction's output.""" return self._raw["physical_quantities"] @property def post_processings(self) -> List[Dict[str, Any]]: - """Information on the post-processings available for that model + """Information on the postprocessings available for the model and the accepted parameters when relevant. """ return self._raw["available-post-processings"] class Workspace(DataModel): - """Local representation of a workspace object.""" + """Provides the local representation of a workspace object.""" def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) @@ -86,13 +86,13 @@ def __repr__(self) -> str: @property def name(self) -> str: - """The name of the workspace.""" + """Name of the workspace.""" return self.fields["name"] @property def model(self) -> ModelManifest: - """Returns a :class:`~ansys.simai.core.data.workspaces.ModelManifest` containing - information about the model associated to the workspace. + """:class:`~ansys.simai.core.data.workspaces.ModelManifest` instance containing + information about the model associated with the workspace. """ if self._model_manifest is None: self._model_manifest = ModelManifest( @@ -111,21 +111,21 @@ def set_as_current_workspace(self) -> None: def download_model_evaluation_report( self, file: Optional[File] = None ) -> Union[None, BinaryIO]: - """Download the model evaluation report PDF for this workspace. + """Download the PDF of the model evaluation report for the workspace. Args: - file: A binary file-object or the path of the file to put the content into. + file: Binary file-object or the path of the file to put the content into. Returns: - None if a file is specified, a binary file-object otherwise + ``None`` if a file is specified, a binary file-object otherwise.
""" return self._client._api.download_workspace_model_evaluation_report(self.id, file) class WorkspaceDirectory(Directory[Workspace]): - """Collection of methods related to workspaces. + """Provides a collection of methods related to workspaces. - Accessed through ``client._workspaces``. + This class is accessed through ``client._workspaces``. Example: .. code-block:: python @@ -143,39 +143,41 @@ def list(self) -> List[Workspace]: return [self._model_from(workspace) for workspace in self._client._api.workspaces()] def get(self, id: Optional[str] = None, name: Optional[str] = None) -> Workspace: - """Get a specific workspace object from the server. + """Get a specific workspace object from the server by either ID or name. + + You can specify either the ID or the name, not both. Args: - id: The id of the workspace to get, incompatible with `name` - name: The name of the workspace to get, incompatible with `id` + id: ID of the workspace. + name: Name of the workspace. Returns: - The workspace + Workspace. Raises: - NotFoundError: No geometry with the given id exists + NotFoundError: No workspace with the given ID exists. """ if name and id: - raise ValueError("Name and Id cannot be both specified.") + raise ValueError("'id' and 'name' cannot both be specified.") if name: return self._model_from(self._client._api.get_workspace_by_name(name)) if id: return self._model_from(self._client._api.get_workspace(id)) - raise ValueError("Either the name or the id must be specified.") + raise ValueError("Either 'id' or 'name' must be specified.") def create(self, name: str, model_id: str) -> Workspace: - """Creates a new workspace. + """Create a workspace. Args: - name: The name to give to the new workspace - model_id: id of the model that the workspace will use + name: Name to give the new workspace. + model_id: ID of the model for the workspace to use.
""" return self._model_from(self._client._api.create_workspace(name, model_id)) def delete(self, workspace: Identifiable[Workspace]) -> None: - """Deletes a workspace. + """Delete a workspace. Args: - workspace: the id or :class:`model ` of the workspace to delete + workspace: ID or :class:`model ` of the workspace. """ self._client._api.delete_workspace(get_id_from_identifiable(workspace)) diff --git a/src/ansys/simai/core/errors.py b/src/ansys/simai/core/errors.py index fb67d689..4937b7c6 100644 --- a/src/ansys/simai/core/errors.py +++ b/src/ansys/simai/core/errors.py @@ -26,14 +26,14 @@ class SimAIError(Exception): - """Base exception for all errors of the SimAI SDK. + """Provides the base exception for all errors of the SimAI client. - To catch any expected error that it might throw, use this exception. + To catch any expected error that the client might throw, use this exception. """ class ApiClientError(SimAIError, requests.exceptions.HTTPError): - """HTTP Error from the SimAi API.""" + """HTTP error from the SimAI API.""" def __init__(self, message: str, response=None): super(ApiClientError, self).__init__(message, response=response) @@ -45,7 +45,7 @@ def status_code(self): # noqa: D102 class NotFoundError(ApiClientError): - """The required resource was found on the server.""" + """Required resource was not found on the server.""" class ConnectionError(SimAIError, requests.exceptions.ConnectionError): @@ -53,19 +53,19 @@ class ConnectionError(SimAIError, requests.exceptions.ConnectionError): class ConfigurationError(SimAIError): - """The SDK could not be configured properly.""" + """Client could not be configured properly.""" class ConfigurationNotFoundError(ConfigurationError): - """The configuration file does not exist.""" + """Configuration file does not exist.""" class InvalidConfigurationError(ConfigurationError, ValueError): - """The given configuration is not valid.""" + """Given configuration is not valid.""" class ProcessingError(SimAIError): - 
"""The data could not be processed.""" + """Data could not be processed.""" class InvalidArguments(SimAIError, ValueError): @@ -73,11 +73,11 @@ class InvalidClientStateError(SimAIError): - """The client's state is invalid.""" + """Client's state is invalid.""" class InvalidServerStateError(SimAIError): - """The server's state is invalid.""" + """Server's state is invalid.""" class MultipleErrors(SimAIError): @@ -94,9 +94,10 @@ def _map_despite_errors( function: Callable[[T], Any], iterable: Iterable[T], ): - """Like map(), applies the function for each item in iterable and return the result. - On exception, it will continue with next items, - and at the end raise either the exception or a MultipleError. + """Like the built-in ``map()`` function, this method applies the function for + each item in the iterable and returns the result. On an exception, + it continues with the next items. At the end, it raises either the + exception or the ``MultipleErrors`` exception. """ results: List[T] = [] errors: List[SimAIError] = [] @@ -117,9 +118,10 @@ def _foreach_despite_errors( procedure: Callable[[T], None], iterable: Iterable[T], ): - """Applies the procedure for each item in iterable. - On exception, it will continue with next items, - and at the end raise either the exception or a MultipleError. + """This method applies the procedure for each item in the + iterable. On an exception, it continues with the next items. + At the end, it raises either the exception or the ``MultipleErrors`` + exception. """ errors = [] for item in iterable: diff --git a/src/ansys/simai/core/utils/auth.py b/src/ansys/simai/core/utils/auth.py index a0019979..04713f32 100644 --- a/src/ansys/simai/core/utils/auth.py +++ b/src/ansys/simai/core/utils/auth.py @@ -42,9 +42,10 @@ class _AuthTokens(BaseModel): - """Class that represents the OIDC tokens we receive from the auth server. - It can fetch and refresh these tokens.
- It will cache the tokens to disk automatically. + """Represents the OIDC tokens received from the auth server. + + The class can fetch and refresh these tokens. + It caches the tokens to disk automatically. """ _EXPIRATION_BUFFER = timedelta(seconds=5) @@ -104,7 +105,7 @@ def from_request_device_auth( session.post(device_auth_url, data={"client_id": "sdk", "scope": "openid"}) ) print( # noqa: T201 - f"Please go to {auth_codes['verification_uri']} and enter the code {auth_codes['user_code']}" + f"Go to {auth_codes['verification_uri']} and enter the code {auth_codes['user_code']}" ) webbrowser.open(auth_codes["verification_uri_complete"]) while True: @@ -173,7 +174,7 @@ def __init__(self, config: ClientConfig, session: requests.Session) -> None: self._session = session self._enabled = not getattr(config, "_disable_authentication", False) if not self._enabled: - logger.debug("Disabling authentication logic") + logger.debug("Disabling authentication logic.") return self._url_prefix = config.url # HACK: start with a slash to override the /v2/ on the api url @@ -195,13 +196,13 @@ def __init__(self, config: ClientConfig, session: requests.Session) -> None: self._schedule_auth_refresh() def __call__(self, request: requests.Request) -> requests.Request: - """Called to prepare the requests. + """Prepare the request for authentication. Args: - request: the request to authenticate + request: Request to authenticate. Returns: - the request with the authentication + Request with the authentication applied.
""" request_host = request.url.split("://", 1)[-1] # ignore protocol part if self._enabled and request_host.startswith(self._url_prefix.host): @@ -222,7 +223,7 @@ def _refresh_auth(self): self._schedule_auth_refresh() def _schedule_auth_refresh(self): - """Schedule auth refresh to avoids refresh token expiring if the client is idle for a long time.""" + """Schedule authentication refresh to avoid the refresh token expiring if the client is idle for a long time.""" if self._refresh_timer: self._refresh_timer.cancel() self._refresh_timer = threading.Timer( diff --git a/src/ansys/simai/core/utils/config_file.py b/src/ansys/simai/core/utils/config_file.py index 0bc4d487..a94ffa28 100644 --- a/src/ansys/simai/core/utils/config_file.py +++ b/src/ansys/simai/core/utils/config_file.py @@ -37,10 +37,10 @@ def _scan_defaults_config_paths() -> Optional[Path]: - """Look for a configuration files in the default locations. + """Look for configuration files in the default locations. Returns: - The path of the first configuration file from the list that exists + Path of the first configuration file found in the default locations. """ system = platform.system() if system == "Windows": @@ -61,7 +61,7 @@ def _scan_defaults_config_paths() -> Optional[Path]: for path in conf_paths: path = Path(path).expanduser() # noqa: PLW2901 if path.is_file(): - logger.debug(f"Found a configuration file at {path}") + logger.debug(f"Found a configuration file at {path}.") return path else: raise ConfigurationNotFoundError("Could not determine OS.") @@ -75,13 +75,15 @@ def get_config( ignore_missing=False, **kwargs, ) -> Dict[Any, Any]: - """Get configuration file, validate it, and return it as a flat dictionary. Args: - path: Where to find the config file. If None, looks in default locations. - profile: The profile to load.
If not specified, looks for `[default]` - ignore_missing: don't raise exception if no path to a config file was found - **kwargs: Overrides to apply to the configuration + path: Path of the configuration file. The default is ``None``, in which + case the method looks in default locations. + profile: Profile to load. If no profile is specified, the method looks for ``[default]``. + ignore_missing: Whether to ignore a missing configuration + file instead of raising an exception. The default is ``False``. + **kwargs: Overrides to apply to the configuration. """ config_path = path or _scan_defaults_config_paths() if config_path is None: @@ -102,7 +104,7 @@ def get_config( cleandoc( # Takes care of the indentation f""" Did not find the [{profile}] profile section in the configuration file. - Please make sure you have a [default] section or that you specify the name of the profile in your from_config call. + Make sure that you have a '[default]' section or specify the name of the profile in your 'from_config' call. """ ) ) diff --git a/src/ansys/simai/core/utils/configuration.py b/src/ansys/simai/core/utils/configuration.py index 0ad15dbe..ee087b0b 100644 --- a/src/ansys/simai/core/utils/configuration.py +++ b/src/ansys/simai/core/utils/configuration.py @@ -38,11 +38,11 @@ def prompt_for_input_factory(*args, **kwargs): class Credentials(BaseModel, extra="forbid"): username: str = "" # dummy default, the root validator will call prompt_for_input - "Username: required if :code:`Credentials` is defined, automatically prompted" + "Username: Required if :code:`Credentials` is defined, automatically prompted." password: str = "" # dummy default, like above - "Password: required if :code:`Credentials` is defined, automatically prompted" + "Password: Required if :code:`Credentials` is defined, automatically prompted."
totp: Optional[str] = None - "One-time password: required if :code:`totp_enabled=True`, automatically prompted" + "One-time password: Required if :code:`totp_enabled=True`, automatically prompted." @root_validator(pre=True) def prompt(cls, values): @@ -58,31 +58,31 @@ def prompt(cls, values): class ClientConfig(BaseModel, extra="allow"): url: HttpUrl = Field( default="https://api.simai.ansys.com/v2/", - description="The URL to the SimAi API.", + description="URL to the SimAI API.", ) - "The URL to the SimAI API." + "URL to the SimAI API." organization: str = Field( default_factory=prompt_for_input_factory("organization"), - description="The name of the organization(/company) the user belongs to.", + description="Name of the organization(/company) that the user belongs to.", ) - "The name of the organization(/company) the user belongs to." + "Name of the organization(/company) that the user belongs to." credentials: Optional[Credentials] = Field( default=None, - description="Authenticate via username/password instead of device authorization code.", + description="Authenticate via username/password instead of the device authorization code.", ) - "Authenticate via username/password instead of device authorization code." + "Authenticate via username/password instead of the device authorization code." workspace: Optional[str] = Field( - default=None, description="The name of the workspace to use by default." + default=None, description="Name of the workspace to use by default." ) - "The name of the workspace to use by default." + "Name of the workspace to use by default." project: Optional[str] = Field( - default=None, description="The name of the project to use by default." + default=None, description="Name of the project to use by default." ) - "The name of the project to use by default." + "Name of the project to use by default." https_proxy: Optional[AnyHttpUrl] = Field( - default=None, description="The URL of the https proxy to use."
+ default=None, description="URL of the HTTPS proxy to use." ) - "The URL of the https proxy to use." + "URL of the HTTPS proxy to use." skip_version_check: bool = Field(default=False, description="Skip checking for updates.") "Skip checking for updates." no_sse_connection: bool = Field( diff --git a/src/ansys/simai/core/utils/files.py b/src/ansys/simai/core/utils/files.py index d4b81cd2..708cd5cd 100644 --- a/src/ansys/simai/core/utils/files.py +++ b/src/ansys/simai/core/utils/files.py @@ -34,15 +34,15 @@ def _expand_user_path(file_path: "Path") -> pathlib.Path: - """Converts str inputs to Path and expands user. + """Convert string inputs to ``Path`` and expand the user. - This method allows to support paths starting with ~ on linux + This method supports paths starting with ``~`` on Linux. """ return pathlib.Path(str(file_path)).expanduser() def file_path_to_obj_file(file_path: "Path", mode: str) -> IO[Any]: - """Takes a file path and returns a file-object opened in the given mode.""" + """Take a file path and return a file-object opened in the given mode.""" file_path = _expand_user_path(file_path) file_path.parent.mkdir(parents=True, exist_ok=True) logger.debug(f"Opening file {file_path}") diff --git a/src/ansys/simai/core/utils/grouping.py b/src/ansys/simai/core/utils/grouping.py index 0e6d693b..41c1fa88 100644 --- a/src/ansys/simai/core/utils/grouping.py +++ b/src/ansys/simai/core/utils/grouping.py @@ -26,11 +26,11 @@ class _ToleranceGroup: - """_ToleranceGroup is a group of approximately equal values, - as returned by itertools.groupby when using _ToleranceGrouper. + """Provides a group of approximately equal values, + as returned by ``itertools.groupby`` when using ``_ToleranceGrouper``. - Note that with the way itertools.groupby works, - accessing center should be done only after accessing (iterating) the bucket's elements.
+ Note that with the way ``itertools.groupby`` works, + accessing the center should be done only after accessing (iterating) the bucket's elements. """ def __init__( @@ -71,21 +71,20 @@ def collect_value(self, new_value): class _ToleranceGrouper: - """_ToleranceGrouper is a grouping class, destined to be used by - itertools.groupby, to create groups of - approximately equal value (according to tolerance). + """Provides a grouping class destined to be used by ``itertools.groupby`` to create + groups of approximately equal value (according to tolerance). - _ToleranceGrouper is meant to be passed as `key` argument for itertools.groupby. - It create buckets defined by a central value, and will contain all - values around it, +-tolerance. Each bucket is an instance of _ToleranceGroup. + The ``_ToleranceGrouper`` class is meant to be passed as the ``key`` argument for ``itertools.groupby``. + It creates buckets defined by a central value and contains all values around it, plus or minus + the tolerance. Each bucket is an instance of the ``_ToleranceGroup`` class. Args: - key_func: optional function computing a key value for each passed element. - If not passed, the element itself will be used. - tolerance: optional tolerance, by default 10**-6 - forced_central_value: optional value that we want to force to be at the - center of its group. Meaning the group will span maximally - from forced_central_value - tolerance to forced_central_value + tolerance + key_func: Optional function computing a key value for each passed element. + If no function is passed, the element itself is used. + tolerance: Optional tolerance. The default is ``10**-6``. + forced_central_value: Optional value that is forced to be at the + center of its group. This means that the group spans maximally + from forced_central_value - tolerance to forced_central_value + tolerance.
""" def __init__( @@ -114,6 +113,6 @@ def __call__(self, item: Any) -> float: ) self.current_bucket.collect_value(new_value) - # Return bucket, which will be used by groupby + # Return bucket, which is used by groupby # to group all values together return self.current_bucket diff --git a/src/ansys/simai/core/utils/misc.py b/src/ansys/simai/core/utils/misc.py index 94f2c94b..938e513a 100644 --- a/src/ansys/simai/core/utils/misc.py +++ b/src/ansys/simai/core/utils/misc.py @@ -32,5 +32,5 @@ def build_boundary_conditions(boundary_conditions: Optional[Dict[str, Any]] = No bc = boundary_conditions if boundary_conditions else {} bc.update(**kwargs) if not bc: - raise ValueError("No boundary condition was specified") + raise ValueError("No boundary condition was specified.") return bc diff --git a/src/ansys/simai/core/utils/numerical.py b/src/ansys/simai/core/utils/numerical.py index 6aeffc34..3ee8d849 100644 --- a/src/ansys/simai/core/utils/numerical.py +++ b/src/ansys/simai/core/utils/numerical.py @@ -32,10 +32,12 @@ def is_number(value: Any): def is_smaller_with_tolerance(a: float, b: float, tolerance: Optional[float] = None): - """Strict smaller than (<) comparison, - with a tolerance (default .000001); - meaning if the diff between the two numbers is inferior - than the tolerance, a is considered equal, thus not smaller. + """Run the `less than` (<) comparison with a tolerance. + + The default for the tolerance is ``.000001``. + + If the difference between the two numbers is less + than the tolerance, ``a`` is considered equal, thus not smaller.
""" if tolerance is None: tolerance = DEFAULT_COMPARISON_EPSILON @@ -43,10 +45,12 @@ def is_smaller_with_tolerance(a: float, b: float, tolerance: Optional[float] = N def is_bigger_with_tolerance(a: float, b: float, tolerance: Optional[float] = None): - """Strict bigger than (>) comparison, - with a tolerance (default .000001); - meaning if the diff between the two numbers is inferior - than the tolerance, b is considered equal, thus not bigger. + """Run the `greater than` (>) comparison with a tolerance. + + The default for the tolerance is ``.000001``. + + If the difference between the two numbers is less + than the tolerance, ``b`` is considered equal, thus not greater. """ if tolerance is None: tolerance = DEFAULT_COMPARISON_EPSILON @@ -54,27 +58,33 @@ def is_bigger_with_tolerance(a: float, b: float, tolerance: Optional[float] = No def is_smaller_or_equal_with_tolerance(a: float, b: float, tolerance: Optional[float] = None): - """Smaller or equal to (<=) comparison, - with a tolerance (default .000001); - meaning if the diff between the two numbers is inferior - than the tolerance, b is considered equal. + """Run the `less than or equal to` (<=) comparison with a tolerance. + + The default for the tolerance is ``.000001``. + + If the difference between the two numbers is smaller + than the tolerance, ``b`` is considered equal. """ return not is_bigger_with_tolerance(a, b, tolerance) def is_bigger_or_equal_with_tolerance(a: float, b: float, tolerance: Optional[float] = None): - """Bigger or equal to (>=) comparison, - with a tolerance (default .000001); - meaning if the diff between the two numbers is inferior - than the tolerance, b is considered equal. + """Run the `greater than or equal to` (>=) comparison with a tolerance. + + The default for the tolerance is ``.000001``. + + If the difference between the two numbers is smaller + than the tolerance, ``b`` is considered equal.
""" return not is_smaller_with_tolerance(a, b, tolerance) def is_equal_with_tolerance(a: float, b: float, tolerance: Optional[float] = None): - """Compare the equality of two numbers, - with a tolerance (default .000001); - meaning if the diff between the two numbers is inferior + """Compare the equality of two numbers with a tolerance. + + The default tolerance is ``.000001``. + + If the difference between the two numbers is smaller than the tolerance, they are considered equal. """ if tolerance is None: diff --git a/src/ansys/simai/core/utils/requests.py b/src/ansys/simai/core/utils/requests.py index 0e898855..bb6de60a 100644 --- a/src/ansys/simai/core/utils/requests.py +++ b/src/ansys/simai/core/utils/requests.py @@ -32,14 +32,14 @@ def handle_http_errors(response: requests.Response) -> None: - """Raises an error if the response status_code is an error. + """Raise an error if the response status code is an error. Args: - response: The response to check for errors + response: Response to check for errors. Raises: - NotFoundError: If the response is a 404 - ApiClientError: If the response is a 4xx or 5xx other than 404 + NotFoundError: If the response is a 404 error. + ApiClientError: If the response is a 4xx or 5xx error other than the 404 error. """ logger.debug("Checking for HTTP errors.") try: @@ -77,14 +77,14 @@ def handle_http_errors(response: requests.Response) -> None: def handle_response(response: requests.Response, return_json: bool = True) -> APIResponse: - """Handles http errors and returns the relevant data from the response. + """Handle HTTP errors and return the relevant data from the response. Args: - response: The response to handle - return_json: Whether to return the json content or the whole response. + response: Response to handle. + return_json: Whether to return the JSON content or the whole response. Returns: - The json dict of the response if :py:args:`return_json` is True.
The raw + JSON dict of the response if ``return_json`` is ``True`` or the raw :py:class:`requests.Response` otherwise. """ handle_http_errors(response) @@ -94,7 +94,7 @@ def handle_response(response: requests.Response, return_json: bool = True) -> AP try: return response.json() except (ValueError, JSONDecodeError): - logger.debug("Failed to read json response.") + logger.debug("Failed to read JSON response.") raise ApiClientError( "Expected a JSON response but did not receive one.", response ) from None diff --git a/src/ansys/simai/core/utils/sse_client.py b/src/ansys/simai/core/utils/sse_client.py index a55af5f5..a8f76f92 100644 --- a/src/ansys/simai/core/utils/sse_client.py +++ b/src/ansys/simai/core/utils/sse_client.py @@ -49,9 +49,9 @@ def __init__( def _connect(self): self._disconnect_client() - logger.info(f"Will connect to SSE with last event id {self._last_event_id}") + logger.info(f"Will connect to SSE with last event ID {self._last_event_id}.") event_source = self._event_source_factory(self._last_event_id) - logger.info("Create SSEClient with event source") + logger.info("Create SSEClient with event source.") self._sseclient = sseclient.SSEClient(event_source) def _disconnect_client(self): diff --git a/src/ansys/simai/core/utils/typing.py b/src/ansys/simai/core/utils/typing.py index 295ace24..9f48e0fd 100644 --- a/src/ansys/simai/core/utils/typing.py +++ b/src/ansys/simai/core/utils/typing.py @@ -32,7 +32,7 @@ def steal_kwargs_type( original_fn: "Callable[P, Any]", ) -> "Callable[[Callable], Callable[P, T]]": - """Returns casted original function, with the kwargs type stolen from original_fn.""" + """Return the cast original function, with the kwargs type taken from ``original_fn``.""" def return_func(func: "Callable[..., T]") -> "Callable[P, T]": return cast("Callable[P, T]", func) diff --git a/src/ansys/simai/core/utils/validation.py b/src/ansys/simai/core/utils/validation.py index 7941d8f4..3f3c2af0 100644 ---
a/src/ansys/simai/core/utils/validation.py +++ b/src/ansys/simai/core/utils/validation.py @@ -32,13 +32,14 @@ def _list_elements_pass_predicate(items_list: List[T], predicate: Callable[[Any] def _enforce_as_list_passing_predicate( parameter: Union[T, List[T]], predicate: Callable[[Any], bool], error_message: str ) -> List[T]: - """Makes sure the passed parameter either is a single element passing predicate, - or is a list of elements all passing predicate. - In both case return a list. - If other cases, raises a TypeError with error_message. + """Make sure that the passed parameter is either a single element passing a predicate + or a list of elements that all pass the predicate. - Useful for validating a type of parameter, i.e. either accept a Geometry - or a list of Geometries. + In both cases, a list is returned. Otherwise, a ``TypeError`` is raised with the + error message. + + This method is useful for validating a type of parameter. For example, it can accept + either a geometry or a list of geometries. """ if predicate(parameter): return [parameter]
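The tolerance-comparison docstrings patched in ``numerical.py`` above all hinge on one rule: two values whose difference is below the tolerance are treated as equal, which disqualifies the strict comparisons. A standalone sketch of that rule follows; it mirrors the helper names from the patch for readability but is an illustration of the documented semantics, not the library's actual implementation.

```python
# Illustrative sketch of the tolerance-comparison rule described in the
# numerical.py docstrings. Not the library code itself.
DEFAULT_COMPARISON_EPSILON = 1e-6  # the ".000001" default named in the docstrings


def is_equal_with_tolerance(a: float, b: float, tolerance: float = DEFAULT_COMPARISON_EPSILON) -> bool:
    # Numbers differing by less than the tolerance are considered equal.
    return abs(a - b) < tolerance


def is_smaller_with_tolerance(a: float, b: float, tolerance: float = DEFAULT_COMPARISON_EPSILON) -> bool:
    # Strict "less than": a must be below b AND not tolerance-equal to it.
    return a < b and not is_equal_with_tolerance(a, b, tolerance)


def is_bigger_with_tolerance(a: float, b: float, tolerance: float = DEFAULT_COMPARISON_EPSILON) -> bool:
    # Strict "greater than": a must be above b AND not tolerance-equal to it.
    return a > b and not is_equal_with_tolerance(a, b, tolerance)
```

Under this rule, ``is_smaller_with_tolerance(1.0, 1.0 + 1e-7)`` is false even though ``1.0 < 1.0 + 1e-7``, because the two values fall within the default tolerance and count as equal.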