diff --git a/web/content/books/_index.md b/web/content/books/_index.md
index 2112ad1b618ce646c65b1bcf0c33d256414c03a4..f559fcfa2e56df82082d63e8d3d954201881fe52 100644
--- a/web/content/books/_index.md
+++ b/web/content/books/_index.md
@@ -2,4 +2,4 @@
 Title = "Books & Tutorials"
 +++
-We publish a series of books and tutorials on OpenGeoSys. The books contain comprehensive benchmark descriptions and can be seen as a reference to what is possible with OGS. The tutorials focus on specific topics such as [Groundwater Flow Modeling]({{< relref "computational-hydrology-i-groundwater-flow-modeling">}}) or [Models of Thermochemical Heat Storage]({{< relref "models-of-thermochemical-heat-storage" >}}) with step-by-step instructions giving the user a good introduction into modeling and simulation with OGS. Most of the tutorials can be downloaded as PDF! See the detail page on each tutorial.
+We publish a series of books and tutorials on OpenGeoSys. The books contain comprehensive benchmark descriptions and can be seen as a reference to what is possible with OGS. The tutorials focus on specific topics such as [Groundwater Flow Modeling]({{< ref "computational-hydrology-i-groundwater-flow-modeling" >}}) or [Models of Thermochemical Heat Storage]({{< ref "models-of-thermochemical-heat-storage" >}}) with step-by-step instructions that give the user a good introduction to modeling and simulation with OGS. Most of the tutorials can be downloaded as a PDF; see the detail page of each tutorial.
diff --git a/web/content/docs/devguide/advanced/third-party-libraries.pandoc b/web/content/docs/devguide/advanced/third-party-libraries.pandoc
index edca6731a43f132320393aca14859d1e63864062..41689896dd91fce7b73015e4579bcf5db826a405 100644
--- a/web/content/docs/devguide/advanced/third-party-libraries.pandoc
+++ b/web/content/docs/devguide/advanced/third-party-libraries.pandoc
@@ -12,7 +12,7 @@ weight = 1049
 ::: {.note}
 ### <i class="fas fa-exclamation-triangle"></i> Attention!
-We strongly recommend to simply use [Conan]({{< relref "conan.pandoc" >}}) for handling required third-party libraries.
+We strongly recommend simply using [Conan]({{< ref "conan.pandoc" >}}) for handling required third-party libraries.
 :::
 ## Introduction
@@ -29,4 +29,4 @@ If you want to build the Data Explorer you need these too:
 - [Shapelib](http://shapelib.maptools.org) >= 1.3.0
 - [libgeotiff](https://trac.osgeo.org/geotiff/) >= 1.4.2
-Please refer to the library documentation on how to build. If you struggle consider using [Conan]({{< relref "conan.pandoc" >}})! Once built CMake tries its best to find the libraries.
+Please refer to the library documentation on how to build. If you struggle, consider using [Conan]({{< ref "conan.pandoc" >}})! Once built, CMake tries its best to find the libraries.
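For orientation, the two routes above (Conan versus manually built libraries) differ only in how CMake is pointed at the dependencies. The following is a minimal, illustrative sketch: `-DOGS_USE_CONAN=ON` is the option documented on the build-configuration page further down, while the `../ogs` source path and the `/opt/ogs-deps` install prefix are placeholders, not OGS conventions.

```bash
# Route 1: let Conan fetch prebuilt binaries or build the third-party libraries (recommended).
cmake ../ogs -DOGS_USE_CONAN=ON

# Route 2: the libraries were built manually; point CMake at their install prefix.
# /opt/ogs-deps is a placeholder path.
cmake ../ogs -DCMAKE_PREFIX_PATH=/opt/ogs-deps
```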
diff --git a/web/content/docs/devguide/advanced/working-on-envinf1.pandoc b/web/content/docs/devguide/advanced/working-on-envinf1.pandoc
index ad308eaffbe219c6fd6908395a218995ffe83100..275d6e3760e1e9a115a1a67cac65b5e3292c0288 100644
--- a/web/content/docs/devguide/advanced/working-on-envinf1.pandoc
+++ b/web/content/docs/devguide/advanced/working-on-envinf1.pandoc
@@ -21,7 +21,7 @@ Load required modules by sourcing the environment script:
 ```
 $ source scripts/env/envinf1/cli.sh
 ```
-Then do the [build configuration]({{< relref "build-configuration.pandoc" >}}) and [build]({{< relref "/docs/devguide/getting-started/build.pandoc" >}}) the project.
+Then do the [build configuration]({{< ref "build-configuration.pandoc" >}}) and [build]({{< ref "/docs/devguide/getting-started/build.pandoc" >}}) the project.
 ## Build the Data Explorer
@@ -31,7 +31,7 @@ Load required modules by sourcing the environment script:
 ```
 $ source scripts/env/envinf1/gui.sh
 ```
-Then do the [build configuration]({{< relref "build-configuration.pandoc" >}}) and [build]({{< relref "/docs/devguide/getting-started/build.pandoc" >}}) the project.
+Then do the [build configuration]({{< ref "build-configuration.pandoc" >}}) and [build]({{< ref "/docs/devguide/getting-started/build.pandoc" >}}) the project.
 ## Install and use Conan
diff --git a/web/content/docs/devguide/development-workflows/continuous-integration.pandoc b/web/content/docs/devguide/development-workflows/continuous-integration.pandoc
index 4884b8780e1978405339a25b2c82f2ce13f095e5..9972932eb3fd5053bd731d616eea83191225343e 100644
--- a/web/content/docs/devguide/development-workflows/continuous-integration.pandoc
+++ b/web/content/docs/devguide/development-workflows/continuous-integration.pandoc
@@ -16,10 +16,10 @@ weight = 1013
 So for every proposed change to the source code the following is done automatically:
 - Compilation of the changed code merged with the official source code is tested on a variety of platforms (Windows, Linux, Mac OS, different compilers)
-- A comprehensive [test suite]({{< relref "unit-testing.pandoc" >}}) checks validity of the proposed changes
+- A comprehensive [test suite]({{< ref "unit-testing.pandoc" >}}) checks the validity of the proposed changes
 - Additional checks regarding code formatting and documentation help in maintaining a good software quality and structure
-After the system is done with all these tasks the developer can view build reports highlighting occurred errors and problems. We are using [Jenkins]({{< relref "jenkins.pandoc" >}}) as our CI system.
+After the system is done with all these tasks, the developer can view build reports highlighting any errors and problems that occurred. We are using [Jenkins]({{< ref "jenkins.pandoc" >}}) as our CI system.
 ## CI on OGS
@@ -31,5 +31,5 @@ Click on the *Details* link to find out the reason for a failed check. If you ad
 # Enable CI on your fork
-You can have the CI system testing all your branches in your fork of OGS even before creating a pull request. See the page on [Jenkins]({{< relref "jenkins.pandoc#automatic-testing-for-own-repository" >}}) for further information.
+You can have the CI system test all your branches in your fork of OGS even before creating a pull request. See the page on [Jenkins]({{< ref "jenkins.pandoc#automatic-testing-for-own-repository" >}}) for further information.
diff --git a/web/content/docs/devguide/getting-started/build-configuration.pandoc b/web/content/docs/devguide/getting-started/build-configuration.pandoc
index f3cffd04428b005dea06b9b3f37277ee4a185367..646a5c576711113d97da19f0933c277786d65c89 100644
--- a/web/content/docs/devguide/getting-started/build-configuration.pandoc
+++ b/web/content/docs/devguide/getting-started/build-configuration.pandoc
@@ -25,7 +25,7 @@ So just go ahead and create a build-directory along your source code directory.
 It is preferred to use Conan package manager to install required third-party libraries. To use Conan add the CMake option `OGS_USE_CONAN=ON` to the CMake run (see below). This will run Conan automatically downloading either prebuilt binaries of required libraries or building them locally if a binary for your setup (operating system, compiler, ..) is not available. [Check this]({{< ref "conan-package-manager.pandoc" >}}) for advanced Conan usage.
-*Note:* Instead of using Conan you can optionally [install the required libraries manually]({{< relref "third-party-libraries.pandoc" >}}).
+*Note:* Instead of using Conan you can optionally [install the required libraries manually]({{< ref "third-party-libraries.pandoc" >}}).
 ::: {.win}
 Add `-DOGS_USE_CONAN=ON` to the CMake-run (see below).
diff --git a/web/content/docs/devguide/getting-started/prerequisites.pandoc b/web/content/docs/devguide/getting-started/prerequisites.pandoc
index 577b0d72b5d601dfba1873aeda0f79c4a81ad5fb..92aa454752030fc7f764b8c7c5d2140714df9297 100644
--- a/web/content/docs/devguide/getting-started/prerequisites.pandoc
+++ b/web/content/docs/devguide/getting-started/prerequisites.pandoc
@@ -17,7 +17,7 @@ The minimum prerequisites to build OGS are:
 - Git (version control tool, at least version 1.7.x)
 - CMake (build configuration tool, at least version 3.1)
 - A compiler with [C++11](http://en.wikipedia.org/wiki/C%2B%2B11)-support
-- [Conan package manager](https://www.conan.io/) **OR** install [required libraries]({{< relref "third-party-libraries.pandoc" >}}) manually (for advanced users only!)
+- [Conan package manager](https://www.conan.io/) **OR** install [required libraries]({{< ref "third-party-libraries.pandoc" >}}) manually (for advanced users only!)
 ## Step: Install a compiler
@@ -87,7 +87,7 @@ $ git --version
 git version 1.7.4.1
 ```
-Otherwise please install Git with your favourite package manager:
+Otherwise please install Git with your favorite package manager:
 ```bash
 $ sudo yum install git // RPM-based systems
@@ -206,7 +206,7 @@ Go to [Conans download page](https://www.conan.io/downloads) and see platform-sp
 ::: {.win}
 Just use the provided Windows installer and make sure to add conan to your path environment as asked by the installer.
-Check on a newly opened command line if Conan was installed succesfully:
+Check on a newly opened command line if Conan was installed successfully:
 ```
 $ conan --version
@@ -217,7 +217,7 @@ Conan version 1.0.4
 :::
 ::: {.linux}
 Use either the provided deb-package or install via Pythons `pip`.
-Check on a newly opened command line if Conan was installed succesfully:
+Check on a newly opened command line if Conan was installed successfully:
 ```
 $ conan --version
@@ -232,7 +232,7 @@ Use Homebrew:
 ```
 brew install conan
 ```
-Check on a newly opened command line if Conan was installed succesfully:
+Check on a newly opened command line if Conan was installed successfully:
 ```
 $ conan --version
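Since most installation problems only surface at the first CMake run, a quick sanity check of the whole toolchain is worthwhile. A minimal sketch, assuming the `pip` route mentioned for Linux above; the printed version numbers will differ on your machine.

```bash
# Install Conan via pip (may require sudo or a virtualenv).
pip install conan

# Verify the minimum toolchain before configuring OGS.
git --version    # needs >= 1.7.x
cmake --version  # needs >= 3.1
conan --version
```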
diff --git a/web/content/docs/devguide/testing/test-data.pandoc b/web/content/docs/devguide/testing/test-data.pandoc
index 75f3e3418247d6919183abfd00a5ff4c4b681424..e383db56136d909918932add99ae64ddf588cedc 100644
--- a/web/content/docs/devguide/testing/test-data.pandoc
+++ b/web/content/docs/devguide/testing/test-data.pandoc
@@ -37,6 +37,6 @@ In the OGS-cli outputting to `[build-dir]/Tests/Data` is already handled (via th
 In code `BaseLib::BuildInfo::data_path` (from `BuildInfo.h`) references the data source directory and `BaseLib::BuildInfo::data_binary_path` references the data output directory.
-For adding new data files make sure you have [setup git lfs](../../getting-started/prerequisites) already. Then simply commit the new files as usual. For pushing you need to have [setup an account]({{< relref "gitlab.pandoc" >}}) on our own GitLab server as the lfs files are stored there (due to bandwidth limitations on GitHub). When asked for GitLab credentials on pushing use your GitLab account name (should be the same as the GitHub account name) and your created GitLab personal access token ([see the GitLab Setup page]({{< relref "gitlab.pandoc" >}})).
+For adding new data files, make sure you have [set up git lfs](../../getting-started/prerequisites) already. Then simply commit the new files as usual. For pushing, you need to have [set up an account]({{< ref "gitlab.pandoc" >}}) on our own GitLab server, as the lfs files are stored there (due to bandwidth limitations on GitHub). When asked for GitLab credentials on pushing, use your GitLab account name (it should be the same as your GitHub account name) and your GitLab personal access token ([see the GitLab Setup page]({{< ref "gitlab.pandoc" >}})).
 Check this [in-depth tutorial](https://www.atlassian.com/git/tutorials/git-lfs) to learn more about git lfs.
diff --git a/web/content/docs/devguide/troubleshooting/cmake.pandoc b/web/content/docs/devguide/troubleshooting/cmake.pandoc
index e0d1a58c25962b7fdbbadf57b2de2becae7e2aa6..2c9e35b89ba62017ed3307f41d3ab2e9406e9ef0 100644
--- a/web/content/docs/devguide/troubleshooting/cmake.pandoc
+++ b/web/content/docs/devguide/troubleshooting/cmake.pandoc
@@ -15,4 +15,4 @@ If something goes wrong when running CMake please try again with an **empty** or
 Please read the CMake output carefully. Often it will tell you what went wrong.
-Check also [Conans troubleshooting page]({{< relref "conan.pandoc" >}}) if you use Conan for dependencies.
+Also check [Conan's troubleshooting page]({{< ref "conan.pandoc" >}}) if you use Conan for dependencies.
diff --git a/web/content/docs/quickstart/basics/envinf1.pandoc b/web/content/docs/quickstart/basics/envinf1.pandoc
index 8cded9034ef4a5f5844831af78612f492884998a..bfe0008ea88c1df209cc576b91f7160c8919eae4 100644
--- a/web/content/docs/quickstart/basics/envinf1.pandoc
+++ b/web/content/docs/quickstart/basics/envinf1.pandoc
@@ -28,4 +28,4 @@ module load ogs/6.0.9 # Loads stable version 6.0.9 in standard config, not
 You can select only one version at a time. Run `module purge` to unload all previously loaded modules.
-See [Quickstart]({{< relref "introduction.pandoc" >}}) for running instructions.
+See [Quickstart]({{< ref "introduction.pandoc" >}}) for running instructions.
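On envinf1 the module system does the version switching. A short sketch; the version shown is the one from the example above, and the available module names depend on the cluster's current module tree.

```bash
module avail ogs       # list the OGS versions provided on envinf1
module load ogs/6.0.9  # load exactly one version at a time
module list            # confirm which modules are loaded
module purge           # unload everything before switching versions
```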
diff --git a/web/content/docs/tools/meshing/extract-surface/index.pandoc b/web/content/docs/tools/meshing/extract-surface/index.pandoc
index 8a861fcf844e0dfb059f4595ba79bbcfd7fccf00..f61d6607c636dc8c959a2169f2d3d9bef5f7d43e 100644
--- a/web/content/docs/tools/meshing/extract-surface/index.pandoc
+++ b/web/content/docs/tools/meshing/extract-surface/index.pandoc
@@ -23,7 +23,7 @@ ExtractSurface -i [<file name of input mesh>] [-o <file name of output mesh>]
 [--node-property-name <string>]
 ```
-The normal of the surface that should be extracted is given by the arguments `-x`, `-y` and `-z`. The default normal is (0,0,-1). The command line option `-a` can be used to specify the allowed deviation of the normal of the surface element from the given normal. The data arrays added to the surface mesh by using the options `--face-property-name` (default value 'OriginalFaceIDs'), `--element-property-name` (default value 'OriginalSubsurfaceElementIDs'), and `--node-property-name` (default value 'OriginalSubsurfaceNodeIDs') are used in other tools (for instance in [ComputeNodeAreasFromSurfaceMesh]({{< relref "compute-node-areas-from-surface-mesh" >}})) and is required for flux calculations during a simulation run of OpenGeoSys.
+The normal of the surface that should be extracted is given by the arguments `-x`, `-y` and `-z`. The default normal is (0,0,-1). The command line option `-a` can be used to specify the allowed deviation of the normal of the surface element from the given normal. The data arrays added to the surface mesh by using the options `--face-property-name` (default value 'OriginalFaceIDs'), `--element-property-name` (default value 'OriginalSubsurfaceElementIDs'), and `--node-property-name` (default value 'OriginalSubsurfaceNodeIDs') are used in other tools (for instance in [ComputeNodeAreasFromSurfaceMesh]({{< ref "compute-node-areas-from-surface-mesh" >}})) and are required for flux calculations during a simulation run of OpenGeoSys.
 ## Example
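As an illustration of the normal-related options described above, an invocation could look like the following sketch. The file names and the tolerance value passed to `-a` are made up for the example; the flags themselves are the ones documented in the usage section.

```bash
# Extract all surface elements whose normal deviates from (0,0,-1)
# by no more than the allowed tolerance given via -a (value is illustrative).
ExtractSurface -i subsurface_mesh.vtu -o surface.vtu -x 0 -y 0 -z -1 -a 25
```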
diff --git a/web/content/docs/tools/meshing/remove-mesh-elements/index.pandoc b/web/content/docs/tools/meshing/remove-mesh-elements/index.pandoc
index 9264c8420206a847e3cfd6900b9bce92f941e2c5..4cf6463bfefe7543245544490a552d96ef84c122 100644
--- a/web/content/docs/tools/meshing/remove-mesh-elements/index.pandoc
+++ b/web/content/docs/tools/meshing/remove-mesh-elements/index.pandoc
@@ -16,9 +16,9 @@ The tool `removeMeshElements` removes those elements from a given input mesh tha
 3. Remove elements that have zero volume.
 4. Remove elements by axis aligned bounding box criterion.
-One possible application is to cut out a smaller mesh out of a bigger one by marking the inner/outer region using the tool [SetPropertiesInPolygonalRegion]({{< relref "set-properties-in-polygonal-region" >}}).
+One possible application is to cut a smaller mesh out of a bigger one by marking the inner/outer region using the tool [SetPropertiesInPolygonalRegion]({{< ref "set-properties-in-polygonal-region" >}}).
-Another application is to cut out patches of a (top) surface (tool [ExtractSurface]({{< relref "extract-surface" >}})) for assigning boundary conditions.
+Another application is to cut out patches of a (top) surface (tool [ExtractSurface]({{< ref "extract-surface" >}})) for assigning boundary conditions.
 ## Usage
@@ -36,7 +36,7 @@ Each particular line with optional arguments refere to one of the different remo
-The left figure above is the result of the repeated application of [SetPropertiesInPolygonalRegion]({{< relref "set-properties-in-polygonal-region" >}}). It contains material ids 0 (red), 1 (yellow), 2 (turquoise) and 3 (blue). On the right figure the result of the following command line input is depicted:
+The left figure above is the result of the repeated application of [SetPropertiesInPolygonalRegion]({{< ref "set-properties-in-polygonal-region" >}}). It contains material ids 0 (red), 1 (yellow), 2 (turquoise) and 3 (blue). The right figure depicts the result of the following command line input:
 ```
 removeMeshElements -i TestCube-ResetPropertiesInPolygonalRegion.vtu -o TestCube-removeMeshElements.vtu -n MaterialIDs --int-property-value 1 --int-property-value 2 --int-property-value 3
 ```
diff --git a/web/content/docs/tools/model-preparation/compute-node-areas-from-surface-mesh/index.pandoc b/web/content/docs/tools/model-preparation/compute-node-areas-from-surface-mesh/index.pandoc
index d81e44df288840993a2a2cf30a77828583f1fcea..4f9e6d163be99fd88ebd826f1d92618440785550 100644
--- a/web/content/docs/tools/model-preparation/compute-node-areas-from-surface-mesh/index.pandoc
+++ b/web/content/docs/tools/model-preparation/compute-node-areas-from-surface-mesh/index.pandoc
@@ -10,7 +10,7 @@ author = "Thomas Fischer"
 ## General
-In the process of incorporating boundary conditions of second type (or Neumann boundary conditions) into the simulation model, the area associated to each surface node is needed for the local assembly. This tool reads a surface mesh (see also [ExtractSurface]({{< relref "extract-surface" >}})), computes the associated area for each node and writes the information as txt and csv data.
+In the process of incorporating boundary conditions of second type (or Neumann boundary conditions) into the simulation model, the area associated with each surface node is needed for the local assembly. This tool reads a surface mesh (see also [ExtractSurface]({{< ref "extract-surface" >}})), computes the associated area for each node and writes the information as txt and csv data.
 ## Usage
@@ -19,15 +19,15 @@ ComputeNodeAreasFromSurfaceMesh -i <file name of input mesh>
 [-p <output path and base name as one string>] [--id-prop-name <property name>]
 ```
-If the option `-p` is not given the output path is extracted from the input path. The default value for the `--id-prop-name` argument is "OriginalSubsurfaceNodeIDs". This name is also used by [ExtractSurface]({{< relref "extract-surface" >}}) for storing the subsurface node ids.
+If the option `-p` is not given, the output path is extracted from the input path. The default value for the `--id-prop-name` argument is "OriginalSubsurfaceNodeIDs". This name is also used by [ExtractSurface]({{< ref "extract-surface" >}}) for storing the subsurface node ids.
 ## Example
 The following steps were performed to obtain the example data:
- 1. The hexahedral example domain was created by [generateStructuredMesh]({{< relref "structured-mesh-generation">}}) `generateStructuredMesh -o hex_6x7x3.vtu -e hex --lx 6 --ly 7 --lz 3`.
- 2. The tool [ExtractSurface]({{< relref "extract-surface" >}}) was applied:
+ 1. The hexahedral example domain was created by [generateStructuredMesh]({{< ref "structured-mesh-generation">}}) `generateStructuredMesh -o hex_6x7x3.vtu -e hex --lx 6 --ly 7 --lz 3`.
+ 2. The tool [ExtractSurface]({{< ref "extract-surface" >}}) was applied:
 `ExtractSurface -i hex_6x7x3.vtu -o hex_6x7x3_surface.vtu`
 The generated surface mesh contains a property "OriginalSubsurfaceNodeIDs" assigned to the mesh nodes that contains the original subsurface mesh node ids.
 3. Finally `ComputeNodeAreasFromSurfaceMesh -i hex_6x7x3_surface.vtu` generates two text files (`hex_6x7x3_surface.txt` and `hex_6x7x3_surface.csv`). The txt file is usable as boundary condition input file for OGS-5 simulation. The first column of the text file contains the original mesh node id (see image above), the second column the associated area. For example to the corner node 168 an area of 0.25 is associated. The edge node 169 has an area value of 0.5 and the interior node 176 has an area value of 1.
diff --git a/web/content/docs/tools/model-preparation/create-boundary-conditions-along-a-polyline/index.pandoc b/web/content/docs/tools/model-preparation/create-boundary-conditions-along-a-polyline/index.pandoc
index 1e874f89646330ce9112cc29f35a21189f3b6c5c..f993eeee1582d8171c3a0a3d4f61eea69fb137fe 100644
--- a/web/content/docs/tools/model-preparation/create-boundary-conditions-along-a-polyline/index.pandoc
+++ b/web/content/docs/tools/model-preparation/create-boundary-conditions-along-a-polyline/index.pandoc
@@ -16,7 +16,7 @@ The user has to provide the input mesh `mesh` and the geometry `geometry` that m
 The tool will output a OGS-5 boundary condition file (.bc) and a geometry file (.gli) containing the points the boundary conditions refer to. The original geometry will not be altered. Additional, it is possible to write the geometry in the gml format by setting the switch `gml` to 1.
-The polylines should be in the vicinity of the mesh nodes, where the user wants to set the boundary conditions. The tool will generate boundary conditions at every node which lies within the search radius `search_radius`. Idealy, the geometry should be mapped as close as possible to the area the user wants to set boundary conditions (for this, also see tool [MapGeometryToSurface]({{< relref "map-geometric-object-to-the-surface-of-a-mesh" >}})).
+The polylines should be in the vicinity of the mesh nodes where the user wants to set the boundary conditions. The tool will generate boundary conditions at every node that lies within the search radius `search_radius`. Ideally, the geometry should be mapped as closely as possible to the area where the user wants to set boundary conditions (for this, also see the tool [MapGeometryToSurface]({{< ref "map-geometric-object-to-the-surface-of-a-mesh" >}})).
 ## Usage
diff --git a/web/content/docs/tools/model-preparation/set-properties-in-polygonal-region/index.pandoc b/web/content/docs/tools/model-preparation/set-properties-in-polygonal-region/index.pandoc
index 35202a786d983086fb9400140a02335d4d448ae4..9a906510510cc7e9f7e3bb7b550d01c9120de7c3 100644
--- a/web/content/docs/tools/model-preparation/set-properties-in-polygonal-region/index.pandoc
+++ b/web/content/docs/tools/model-preparation/set-properties-in-polygonal-region/index.pandoc
@@ -16,7 +16,7 @@ The new value must be of one of the data types, either [integer](https://en.wiki
 The polygon must be located within a plane. A node is located in the cylindrical volume iff (ie. if and only if) the node's orthogonal projection to the plane of the polygon lies in the polygon, ie. the plane can be defined in any arbitrary direction in 3d space.
-In combination with a threshold filter the tool can also be used to cut out some region of the mesh. The tool can be also used in combination with the tool [removeMeshElements]({{< relref "remove-mesh-elements" >}}).
+In combination with a threshold filter the tool can also be used to cut out some region of the mesh. The tool can also be used in combination with the tool [removeMeshElements]({{< ref "remove-mesh-elements" >}}).
 The tool writes a new mesh `modified_mesh`.
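The combination with removeMeshElements mentioned above typically boils down to one further call. The file names and the MaterialID value in this sketch are illustrative; the options are the ones documented on the removeMeshElements page above.

```bash
# Remove all elements that SetPropertiesInPolygonalRegion marked with MaterialID 1;
# marked_mesh.vtu and submesh.vtu are placeholder file names.
removeMeshElements -i marked_mesh.vtu -o submesh.vtu -n MaterialIDs --int-property-value 1
```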
diff --git a/web/content/docs/tools/workflows/create-boundary-condition-in-polygonal-region/index.pandoc b/web/content/docs/tools/workflows/create-boundary-condition-in-polygonal-region/index.pandoc
index 2e79f40bef73ae6bc514cd5ccc8b8a4c31c989eb..77998c88f627a92f849ee22aef83ad255fd5c647 100644
--- a/web/content/docs/tools/workflows/create-boundary-condition-in-polygonal-region/index.pandoc
+++ b/web/content/docs/tools/workflows/create-boundary-condition-in-polygonal-region/index.pandoc
@@ -12,22 +12,22 @@ author = "Thomas Fischer"
 In order to create boundary conditions in a polygonal region the following workflow can be used:
-1. Extract surface using [ExtractSurface]({{< relref "extract-surface" >}}) (a: Original subsurface mesh; b: Extracted surface)
+1. Extract the surface using [ExtractSurface]({{< ref "extract-surface" >}}) (a: Original subsurface mesh; b: Extracted surface)
-2. Mark the mesh elements within the polygonal region deploying the tool [ResetPropertiesInPolygonalRegions]({{< relref "set-properties-in-polygonal-region" >}}) (a: Surface and polygon; b: Marked elements in polygonal region are colored yellow; c: Marked regions visualized by different colors)
+2. Mark the mesh elements within the polygonal region using the tool [ResetPropertiesInPolygonalRegions]({{< ref "set-properties-in-polygonal-region" >}}) (a: Surface and polygon; b: Marked elements in the polygonal region are colored yellow; c: Marked regions visualized by different colors)
-3. Remove marked/unmarked mesh elements deploying tool [removeMeshElements]({{< relref "remove-mesh-elements" >}}) (Resulting patches visualized by different colors and z-translations)
+3. Remove the marked/unmarked mesh elements using the tool [removeMeshElements]({{< ref "remove-mesh-elements" >}}) (Resulting patches visualized by different colors and z-translations)
-4. Compute the associated area for the nodes of the surface mesh deploying tool [ComputeNodeAreasFromSurfaceMesh]({{< relref "compute-node-areas-from-surface-mesh" >}})
+4. Compute the associated area for the nodes of the surface mesh using the tool [ComputeNodeAreasFromSurfaceMesh]({{< ref "compute-node-areas-from-surface-mesh" >}})
 The surface mesh patches (created until step 3) can be used as input for OGS-6 simulations. For OGS-5 simulations only the additional step 4 has to be performed.
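Chained together, the four steps could look roughly like the sketch below. Only options shown on the tool pages above are used; all file names are placeholders, and step 2 is left as a comment because its command line options are not covered in this changeset.

```bash
# 1. Extract the surface of the subsurface mesh (default normal (0,0,-1)).
ExtractSurface -i subsurface.vtu -o surface.vtu

# 2. Mark the elements inside the polygon with ResetPropertiesInPolygonalRegions,
#    e.g. assigning MaterialID 1 inside the polygonal region (options not shown here),
#    producing surface_marked.vtu (placeholder name).

# 3. Keep the marked patch by removing the unmarked elements (here assumed MaterialID 0).
removeMeshElements -i surface_marked.vtu -o patch.vtu -n MaterialIDs --int-property-value 0

# 4. For OGS-5: compute the area associated with each node of the surface patch.
ComputeNodeAreasFromSurfaceMesh -i patch.vtu
```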