OpenM++ Setup Development Environment
Your development and runtime environment must meet the following requirements:
- OS: 64-bit or 32-bit version of:
- Linux (tested): Debian stable (12), MX Linux 23, Ubuntu 24.04, 22.04, RedHat 9+
- Windows (tested): 11, 10, may work on 7
- MacOS (tested on latest): may work starting from Catalina 10.15 and Big Sur 11.1+, including Apple Arm64 CPUs (a.k.a. M1)
Note: It does work on most recent Linux distributions and on any Windows 7+ or 2008R2+, 32 and 64 bit. We just do not test it regularly on every possible Windows / Linux version.
- C++20 support:
- g++ 11.4+
- Visual Studio 2022, including Community Edition; Visual Studio 2019 also works, but is not tested regularly
- Xcode 11.2+
- (optional) if you want to build omc (the openM++ compiler) from sources:
- bison 3.3+ and flex 2.6+
- (optional) it is recommended to have MPI installed on your local machine or in your HPC cluster:
- Linux (tested): OpenMPI 1.6+
- Windows (tested): Microsoft MPI v8+, expected to work starting from HPC Pack 2012 R2 MS-MPI Redistributable Package
- expected to work: MPICH (MS-MPI is in fact MPICH redistributed by Microsoft)
Optional development tools:
- R 3.5+
- Go 1.21+; on Windows MinGW is required for the g++ compiler
- node.js LTS version
Linux: To check the g++ version, type: g++ --version; expected output:
g++ (Debian 12.2.0-14) 12.2.0
g++ (Ubuntu 13.3.0-6ubuntu2~24.04) 13.3.0
g++ (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
g++ (GCC) 11.5.0 20240719 (Red Hat 11.5.0-2)
Note: The output above does not include all possible Linux versions and may be outdated; openM++ supports the latest Linux distributions.
MacOS: To check the C++ compiler version, type: clang --version or g++ --version; expected output:
Apple clang version 11.0.0 (clang-1100.0.33.12)
MacOS: install the command line developer tools, if not already installed by Xcode: xcode-select --install
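To confirm the command line tools are installed you can print the active developer directory (a quick check; the exact path may differ on your system):
xcode-select -p
/Library/Developer/CommandLineTools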
Windows: Make sure you have Visual Studio 2022 installed with the latest update (VS 2019 is not supported but may work).
If you are using a different C++ compiler vendor, e.g. Intel C++, then compile and run the following test:
#include <iostream>
#include <source_location>
#include <string_view>

using namespace std;

void log(const string_view msg, const source_location loc = source_location::current())
{
    clog << "file: "
        << loc.file_name() << '('
        << loc.line() << ':'
        << loc.column() << ") "
        << loc.function_name() << ": "
        << msg << '\n';
}

template<typename T>
void fun(T x)
{
    log(x); // line 20
}

int main(int, char*[])
{
    log("Hello world!"); // line 25
    fun("Hello C++20!");
}
Save the code above as h20.cpp, then compile and run it:
g++ -Wall -std=c++20 -pthread -o h20 h20.cpp
./h20
Expected output may vary depending on your compiler version.
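For illustration only, with a recent g++ the output looks roughly like the lines below; the file name, line and column numbers, and function signatures all depend on your compiler and on where the calls sit in the file:
file: h20.cpp(25:5) int main(int, char**): Hello world!
file: h20.cpp(20:5) void fun(T) [with T = const char*]: Hello C++20!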
Optional: If you want to recompile omc (the OpenM++ compiler) then you need bison 3.3+ and flex 2.6+ installed.
To check the bison and flex versions, type the following commands:
bison --version
flex --version
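Expected output is similar to the following; exact versions depend on your distribution:
bison (GNU Bison) 3.8.2
flex 2.6.4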
Windows:
- download the Windows version of bison and flex
- if your OpenM++ checkout folder is C:\SomeDir\ then unzip win_flex_bison-2.5.24.zip into C:\SomeDir\bin\
To check the bison and flex versions, type the following commands with the current directory set to C:\SomeDir\bin\:
win_bison --version
win_flex --version
Expected output:
bison (GNU Bison) 3.7.4
flex 2.6.4
MacOS Bison:
The Bison version included in MacOS is bison (GNU Bison) 2.3, released in 2006, which is too old for openM++.
You can install bison 3.8 from HomeBrew or from MacPorts: https://www.macports.org/
MacOS Bison from HomeBrew:
- install HomeBrew from GUI terminal:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"
- install bison 3.8 using HomeBrew:
brew install bison@3.8
- export the bison path; you may also want to add it into your .zprofile. If MacOS on Intel CPU:
export PATH="/usr/local/opt/bison/bin:$PATH"
export LDFLAGS="-L/usr/local/opt/bison/lib ${LDFLAGS}"
if MacOS on Apple Arm64 CPU (a.k.a. M1):
export PATH="/opt/homebrew/opt/bison/bin:${PATH}"
export LDFLAGS="-L/opt/homebrew/opt/bison/lib ${LDFLAGS}"
- verify bison version
bison --version
....
bison (GNU Bison) 3.8.2
OpenM++ uses MPI to run models on multiple computers in your network, in the cloud, or in an HPC cluster environment.
Linux: To check your MPI version:
[user@host ~]$ mpirun --version
mpirun (Open MPI) 1.10.7
On RedHat you may need to load the MPI module in your environment:
module load mpi/openmpi-x86_64
mpirun --version
Windows: To check your MPI version:
C:\> mpiexec /?
Microsoft MPI Startup Program [Version 10.0.12498.5]
........
Windows: download and install Microsoft MPI SDK and MPI Redistributable.
You can test your MPI environment with the following code:
#include <mpi.h>
#include <iostream>
using namespace std;

int main(int argc, char **argv)
{
    int mpiCommSize;
    int mpiRank;
    int procNameLen;
    char procName[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);                         // initialize MPI environment
    MPI_Comm_size(MPI_COMM_WORLD, &mpiCommSize);    // total number of processes
    MPI_Comm_rank(MPI_COMM_WORLD, &mpiRank);        // rank of this process
    MPI_Get_processor_name(procName, &procNameLen); // host name of this process

    cout << "Process: " << mpiRank << " of " << mpiCommSize << " name: " << procName << endl;

    MPI_Finalize();
    return 0;
}
Save this code as mhp.cpp, then compile and run it:
mpiCC -o mhp mhp.cpp
mpirun -n 4 mhp
Expected output is similar to:
Process: 0 of 4 name: omm.beyond2020.com
Process: 2 of 4 name: omm.beyond2020.com
Process: 1 of 4 name: omm.beyond2020.com
Process: 3 of 4 name: omm.beyond2020.com
Windows: To build MPI tests in Visual Studio:
- create a C++ command-line project
- adjust the following in the project properties:
- VC Directories -> Include Directories -> C:\Program Files\Microsoft MPI\Inc
- VC Directories -> Library Directories -> C:\Program Files\Microsoft MPI\Lib\i386
- Linker -> Input -> Additional Dependencies -> msmpi.lib
- build it and run it under the Visual Studio debugger
Please use the amd64 version of the MS MPI libraries if you want to build a 64-bit version.
To run the MPI test on Windows, type the following at your command prompt:
mpiexec -n 4 mhp.exe
Expected output is similar to:
Process: 3 of 4 name: anatolyw7-om.beyond2020.local
Process: 2 of 4 name: anatolyw7-om.beyond2020.local
Process: 0 of 4 name: anatolyw7-om.beyond2020.local
Process: 1 of 4 name: anatolyw7-om.beyond2020.local
Download and install R version 3.5+ (v4+ not tested):
- Windows: https://cran.r-project.org/bin/windows/base/
- on Linux use your package manager, e.g.:
sudo yum install R
- MacOS on Intel CPU: https://cran.r-project.org/bin/macosx/R-3.6.3.nn.pkg
It is recommended to use RStudio or RStudio Server for development.
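To verify the installation you can check the R version from a terminal (the reported version will differ on your system):
R --version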
Windows:
- download Go from https://golang.org/ and install into any directory, e.g.:
C:\Program Files\go
- download MinGW from your preferred distribution, e.g. https://nuwen.net/mingw.html, and unpack into any directory:
C:\MinGW\
- create your Go working directory, e.g.:
C:\go_workspace\
- set your environment variables:
set GOPATH=C:\go_workspace
set PATH=%GOPATH%\bin;%PATH%
cd %GOPATH%
C:\MinGW\set_distro_paths.bat
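After running set_distro_paths.bat you can confirm that Go and the MinGW g++ compiler are both on your PATH (a quick check; version numbers will differ):
go version
g++ --version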
It is recommended to use Visual Studio Code for development.
- MacOS on Intel CPU: download and install fresh Go version, for example: https://golang.org/dl/go1.16.3.darwin-amd64.pkg
- MacOS on Arm64 CPU: download and install fresh Go version, for example: https://golang.org/dl/go1.16.3.darwin-arm64.pkg
- MacOS: Go can also be installed from the go1.16.3.darwin-amd64.tar.gz or go1.16.3.darwin-arm64.tar.gz archive, similar to Linux
- MacOS: include the PATH to Go in your .zprofile, for example:
export GOROOT=$HOME/go
export PATH=$GOROOT/bin:${PATH}
Note: the version number 1.16.3 above is only an example; please use the most recent stable version.
Linux:
- download Go, for example version 1.16.3 from: https://golang.org/dl/go1.16.3.linux-amd64.tar.gz
- unpack into any directory, e.g.:
~/go
- set your environment variables (in .profile or .bash_profile or .bashrc, etc.):
export GOROOT=$HOME/go
export PATH=$GOROOT/bin:${PATH}
If you want to copy model database content from SQLite to other vendors then you may also need to install the unixODBC development package:
su -c "yum install unixODBC unixODBC-devel"
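To confirm unixODBC is installed you can print its configuration; odbcinst is part of unixODBC and the reported paths differ by distribution:
odbcinst -j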
Currently supported database vendors are: SQLite (default), Microsoft SQL Server, MySql, PostgreSQL, IBM DB2, Oracle. You can use the dbcopy utility to copy model data between any of the vendors above, for example from MySQL to MSSQL or from PostgreSQL to SQLite.
You need node.js in order to build and develop the openM++ UI. Please download and install the stable version from Node.js.
Windows
- Use any of:
- MSI installer: https://nodejs.org/dist/v14.16.1/node-v14.16.1-x64.msi
- Zip archive: https://nodejs.org/dist/v14.16.1/node-v14.16.1-win-x64.zip
- if you are using the archive then unpack it into the C:\node directory; to start development open a command prompt and type:
C:\node\nodevars.bat
cd C:\my-openm-plus-plus-dir\ompp-ui
npm install
Linux
- Use your favorite package manager
- Or directly download the archive from Node.js and unpack it into $HOME/node:
curl https://nodejs.org/dist/v14.16.1/node-v14.16.1-linux-x64.tar.xz -o node.tar.xz
mkdir $HOME/node
tar -xJf node.tar.xz -C $HOME/node --strip-components=1
- add PATH to Node into your .bash_profile (or .profile or .bashrc, etc):
export PATH=$HOME/node/bin/:${PATH}
- checkout and build UI:
cd my-openm-plus-plus-dir
git clone https://github.com/openmpp/UI.git ompp-ui
cd ompp-ui
npm install
npm run build
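To confirm that Node and npm are on your PATH, check their versions (the versions on your system will differ):
node --version
npm --version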
MacOS on Intel CPU
- Use any of the Node.js downloads for macOS (installer or archive)
- if you are using the archive then unpack it into $HOME/node, then checkout and build the UI:
mkdir $HOME/node
tar -xzf node-v14.16.1-darwin-x64.tar.gz -C $HOME/node --strip-components=1
- add PATH to Node into your .zprofile:
export PATH=$HOME/node/bin/:${PATH}
- checkout and build UI as described in Linux section above
MacOS on Arm64 CPU
- install HomeBrew from GUI terminal:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"
- install Node.js LTS version using HomeBrew:
brew install node@14
- add PATH to Node into your .zprofile:
export PATH=/opt/homebrew/opt/node@14/bin:${PATH}
- checkout and build UI as described in Linux section above
Note: In the examples above node-v14.16.1 is an example of the current LTS (long term support) version. Please check the Node.js site to download the latest LTS version.