
Requirements for new release of ExternalMedia and action plan #4

Closed
casella opened this issue Nov 20, 2017 · 33 comments
@casella
Collaborator

casella commented Nov 20, 2017

High-level requirements for the new release of ExternalMedia

  • Run under Windows and Linux (possibly also macOS)
  • Run with 32-bit and 64-bit simulation runtimes
  • Run at least with Dymola and OpenModelica (possibly also other Modelica tools)
  • Easy installation: ideally, the library should run in all the above-mentioned Modelica tools as-is, with as few installation steps as possible
  • Easy management of dependencies on other software, currently CoolProp and FluidProp
@casella
Collaborator Author

casella commented Nov 20, 2017

Recent history:

  • Developed until 3 Apr 2015 on the Modelica Association's SVN repository and mirrored in the Modelica Association's GitHub repository. At that time it interfaced with FluidProp (any version, installed separately) and CoolProp (v. 4.2, embedded), and worked in Dymola/Windows, OpenModelica/Windows and OpenModelica/Linux, all 32-bit only. The library on the repository included compiled binaries and required a minimalistic installation procedure, described in the Installation.txt file.
  • In July 2015, @jowr started development of a new version interfacing with CoolProp v5 on a branch of the SVN server, then continued the development with @JonWel on GitHub under coolprop/ExternalMedia until Dec 2016. The files on the repository contain all the source code and build scripts, but not the binaries.
  • The Modelica Association moved all library development to GitHub, so the SVN server is now read-only and all new development of this library should happen on modelica/ExternalMedia.

@casella
Collaborator Author

casella commented Nov 20, 2017

Even though the library can be compiled as a dynamically linked library, error management is not satisfactory in that setting, because the external function cannot call ModelicaError and thus can only abort the simulation, preventing the solver from retrying, e.g. with a shorter time step.

One month ago I thought that dynamically linked libraries could help reduce dependencies, by avoiding the need for different binaries for each version of Visual Studio under Windows and by interfacing to CoolProp via a pre-compiled DLL. This would have required some changes to the C/C++ codebase to ensure proper error handling.

However, after some discussion with @adrpo, I have changed my mind, because there is a concrete risk that we run into another wasp's nest of dependencies (a.k.a DLL hell) on specific machine installations and hard-to-debug issues on a multitude of different platforms, for which end-user support is basically impossible. I am now convinced that the current approach of compiling self-contained static libraries for each software and platform is safer: if the compiled library works on the developer's machine, it can be deployed with a reasonable confidence that it will also work on other people's machines.

In the meantime, the Modelica Specification was improved regarding the handling of compiler-version-specific static libraries (see section 12.9.4 of the Modelica Specification 3.4), so it is now possible to ship a single release of the library which contains all the statically linked libraries for each possible combination of Modelica tool, C/C++ compiler and operating system, thus achieving the goal of limited installation effort.
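As an illustration (the directory names below are examples following the conventions of section 12.9.4, not a committed layout), such a release could ship its static libraries organized like:

```
Resources/Library/
  win32/           generic 32-bit Windows static libs (e.g. MinGW/gcc)
  win32/vs2013/    MSVC-2013-specific static libs (Dymola/Windows)
  win32/vs2015/    MSVC-2015-specific static libs (Dymola/Windows)
  win64/           64-bit Windows, once supported
  linux32/         32-bit Linux (gcc)
  linux64/         64-bit Linux (gcc)
```

Each Modelica tool then picks the matching subdirectory at link time, so one release archive can serve all supported tool/compiler/OS combinations.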

@adrpo also suggested that I use CMake to streamline code generation for multiple platforms, an approach that has already been undertaken in coolprop/ExternalMedia.

@casella
Collaborator Author

casella commented Nov 20, 2017

All these things considered, I think that a good tentative plan could be as follows:

  • open a branch of modelica/ExternalMedia for development (I can do that right away)
  • merge all commits made to coolprop/ExternalMedia there. As far as I understand, there is a common base 1b77869, so this should be easily done. I would appreciate it if somebody more proficient with Git than I am provided a PR for this.
  • compile static libraries for the 32-bit Windows/Linux, Dymola/OpenModelica combinations that are currently supported in the code base, test them, include them in the Resources/Library directories in the appropriate places (including vsXXXX-specific directories for the Dymola/Windows platform) and tag a new release 3.3.0 that uses CoolProp v5 and can be used right away without the need to build the libraries from sources, which is always a bit tricky. This part I can take care of; help in the form of PRs is of course welcome.

Once this is done, we can work on the next release 3.4.0 with the following features:

  • use of CoolProp v6
  • support of 64-bit simulation runtimes

@JonWel, @jowr, @slamer59, @bilderbuchi, @thorade, @ibell, @squoilin, please add your comments and availability, so we can proceed

@casella
Collaborator Author

casella commented Nov 20, 2017

One more thing: when CoolProp was moved to GitHub but the ExternalMedia trunk was still on SVN, we removed the svn:external directory from the ExternalMedia repository and started relying on a script to check out the sources on the fly from GitHub. Although it would now be possible to re-introduce a CoolProp submodule in the modelica/ExternalMedia repo, I think it is better to keep the current configuration to avoid too much clutter.

We just need to define a policy on how to deal with new commits to the CoolProp/CoolProp codebase. I guess it is OK if the scripts check out the latest master branch, but we should use the latest released version when we compile the binaries that are placed in the Resources/Library directory, and of course check that all test cases work fine.

@bilderbuchi
Collaborator

bilderbuchi commented Nov 20, 2017

I can provide the PR with the CoolProp changes you request above pretty soon; please open a "develop" branch I can target. (Edit: this is ready and just waits for a dev branch to target.)

A couple of really quick remarks about your comments above:
If you vendor in CoolProp, a git submodule would IMO be the preferable way to do that, as the state of the included repo is much clearer. You can always be at the latest HEAD (or as current as you want) in the main dev branch, but switch to a stable point/release before releasing.
Also, in my experience, including binaries in your repo is A Very Bad Idea, as it will make your repo grow substantially with every change. I participated in another open-source project where, after years, we ended up with a multi-GB repo (most of it in the history), which is basically impossible to fix cleanly once you have a decent-sized community. Really, if in any way possible, have a CI server create and test your build artifacts on the fly. You can package and host release packages with binaries and all here on GitHub.

@beutlich

Even though the library can be compiled as a dynamically linked library, error management is not satisfactory in that setting, because the external function cannot call ModelicaError and thus can only abort the simulation, preventing the solver from retrying, e.g. with a shorter time step.

This is exactly the problem I raised in https://trac.modelica.org/Modelica/ticket/2191.

@JonWel
Contributor

JonWel commented Nov 20, 2017

@casella In terms of CoolProp 5 and 6: if I remember correctly, the work in coolprop/ExternalMedia was already targeting some early versions of CoolProp 6.
So it may be simpler to directly target a release with CoolProp 6.1.0 (released 25 Oct 2016), as it will be much closer to the development version used at the time of the latest commits in coolprop/ExternalMedia.

@sjoelund

If you vendor in CoolProp, a git submodule would IMO be the preferable way to do that, as the state of the included repo is much clearer. You can always be at the latest HEAD (or as current as you want) in the main dev branch, but switch to a stable point/release before releasing.

Yes, you never want the case where you check out a known-good copy of a repository and it doesn't work. Note that git submodule sometimes fails because people force-push and clean up parent repositories (though usually not for release tags); git subtree prevents that from happening, but increases the size of your repository.
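For reference, pinning a submodule to a release tag is only a little git plumbing. A self-contained sketch (all paths and the v6.1.0 tag here are stand-ins, using throwaway local repos instead of the real CoolProp; protocol.file.allow=always is only needed because the "upstream" is a local path on git >= 2.38):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the CoolProp repository, with one commit and a release tag.
git init -q upstream
git -C upstream -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "initial"
git -C upstream tag v6.1.0

# Parent repo vendors it as a submodule, pinned to that tag.
git init -q parent
cd parent
git -c protocol.file.allow=always submodule --quiet add "$tmp/upstream" externals/CoolProp
git -C externals/CoolProp checkout -q v6.1.0
git -c user.email=dev@example.com -c user.name=dev \
    commit -q -m "Pin CoolProp submodule to v6.1.0"

# The parent repo now records the exact commit of the tag.
git submodule status
```

Anyone cloning the parent repo with --recurse-submodules then gets exactly that pinned state, regardless of what happens on the upstream master afterwards.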

@bilderbuchi
Collaborator

Can you describe that failure in a little more detail (especially the "cleanup" part)? Most of the errors I've seen related to git submodule are people forgetting to update or --init them after changes have been pulled.

@ibell

ibell commented Nov 21, 2017 via email

@bilderbuchi
Collaborator

I know that git can be magical/troublesome at times, but I haven't encountered that particular situation before.
What was the error you encountered? "Reference is not a tree", perhaps?
Going into the submodule directory and doing a git pull origin, a git fetch origin master, or the equivalent depending on your exact situation should have fixed that - were you aware of that solution at the time? After that, you go back to the coolprop repo and commit the updated submodule reference.
For reference see SO questions here and here.

@thorade
Contributor

thorade commented Nov 21, 2017

@ibell and @bilderbuchi That issue with submodules and commits not being available anymore is to my understanding the result of someone force-pushing to the master branch of the submodule. Force-pushing to the master branch can create some trouble and should be avoided, e.g. by using protected branches.

@bilderbuchi
Collaborator

Yes, indeed, but as it is typically/often a dependency of your project that you include as a subproject, you don't necessarily have any influence over the development process of said project (like apparently happened with Eigen above).

A squash-and-rebase PR workflow would also lead to rewritten history in PR branches, so if one relied on a commit from a PR (probably a non-optimal decision), that could also create these problems.

@thorade
Contributor

thorade commented Nov 21, 2017

As also stated by @bilderbuchi in #4 (comment), I would not recommend placing the binaries directly within the git repository.
One way to distribute the binaries (or the library with binaries included) could be GitHub releases:
https://help.github.com/articles/about-releases/
One would first tag a commit, e.g. as v3.4.0, and then the release page allows you to attach any binary files, or a zip file with everything required by the user.
CI services like TravisCI can even deploy automatically to GitHub releases, e.g. triggered by an event like tag creation. That again works well with CMake (even though I have to admit I have only ever used CMake, never written a CMake script myself).
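As a rough sketch of what that could look like (the artifact name is a placeholder, not an agreed setup), a Travis CI deploy stanza that attaches a build artifact to a GitHub release whenever a tag is pushed:

```yaml
# .travis.yml fragment (illustrative only)
deploy:
  provider: releases            # deploy to GitHub Releases
  api_key: $GITHUB_OAUTH_TOKEN  # token stored in the Travis settings, not in the repo
  file: ExternalMedia.zip       # placeholder artifact name
  skip_cleanup: true            # keep build artifacts around for the upload
  on:
    tags: true                  # only deploy on tag builds
```

This keeps binaries entirely out of the git history: they exist only as release attachments.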

@beutlich

beutlich commented Nov 21, 2017

One would first tag a commit, e.g. as v3.4.0, and then the release page allows you to attach any binary files, or a zip file with everything required by the user.

This is even possible before the tag is created, by attaching the binaries to a draft (i.e. hidden) release.

@slamer59

I have worked extensively with CMake and have written/fixed scripts for cross-platform binary generation, together with wrapper generation (Python/Java/...). I think it is mandatory to work with it.

That is what I take from @casella's requirements: possibly also other Modelica tools and macOS. If we don't provide the right platform, someone can build their own thanks to CMake.

I have also built dynamic libraries, and I understand why it can be difficult. I agree we should start with static libraries first. If at some point in Modelica development it becomes easier to build dynamic libraries, changing CMake to do that is straightforward.

I also agree that tracking binary files with git is not a good idea, since it keeps every version of each binary forever. If it is a real need, it is better to track the SHA of the file rather than the file itself (but that is cumbersome); git-annex and git-media are built for this purpose.

@slamer59

By the way, I have always wondered whether FMU/FMI was not written for this kind of thing?
Does anyone have experience with the FMU/FMI standard?

@casella
Collaborator Author

casella commented Nov 21, 2017

OK, thanks for the quick feedback!

@bilderbuchi: I just created the v.3.3.0 branch from master, please proceed with the PR

@bilderbuchi: regarding not putting binaries in git, you are of course right. There are not that many binaries in there now, but there will be more as the number of supported platforms increases, and of course with Git you carry all of them along with each clone of the repo. We will add binary files to the released versions only, using GitHub's release features. Unless there is somebody around who volunteers to compile and test all the binaries for all platforms, it would be nice if we had some kind of shared file system where different people could upload their contributions and other people could test them, before we go for the final release. Any suggestion for that?

A CI system that does it automatically would be nice, but it needs to have all the compilers (MSVC pro is not freeware) and all the operating systems installed, which is not trivial to set up. Given the small number of releases I envision, I guess we can do this manually, possibly sharing the workload. If someone volunteers to set that up, that would be great.

@slamer59, if you can help to set up a full cross-compilation environment using CMake, that would be great.

@JonWel: if what we currently have is close to 6.1, then let's go for it directly

So, the updated plan is

  • @bilderbuchi makes a pull request for the v.3.3.0-dev branch with what we have now
  • we update it to CoolProp 6.1.0
  • we see how many platforms we can compile for in a reasonable amount of time, and then release

For me, it would be mandatory to have something running on 32-bit runtimes by the end of the year. If we can also get 64-bit in the same time frame, that's excellent; otherwise we do it next year.

@casella
Collaborator Author

casella commented Nov 21, 2017

For the binaries, we could possibly use Git LFS to manage the pre-release process, then just make one .zip file of the library with all the binaries in the Resources directory when it's ready.

@casella
Collaborator Author

casella commented Nov 21, 2017

As a side remark, I think it is definitely a good idea to discuss using new architectures such as FMI, but that's way beyond the scope of this issue, which is just the next release of the library after a long hibernation period.

What we have now is a 95% ready solution that needs to be polished up, streamlined and released properly, in a way that allows people who are not wizards at Modelica/C/C++/GIT but just want to use the library for their projects to do so easily. IMHO the goal now is to minimize the amount of work and decisions required to do so and get there fast.

Once that is done, any fancy new development ideas would be welcome, but please let's discuss them in a separate issue. Feel free to open it (them?) right away ☺️

@bilderbuchi
Collaborator

We will add binary files to the released versions only, using GitHub's release features.

This is fine, then they don't add up in the repo/history, which is the most important thing! I only caution strongly against adding binaries to the main repo/history; this decision will bite you in the future and is extremely hard to revert later, as you would have to rewrite public history, breaking all forks of the project (see here, if you must, for some extended multi-year discussion on this).

it would be nice if we had some kind of shared file system where different people could upload their contributions and other people could test, before we go for the final release. Any suggestion for that?

Yes, you could make a "dummy"/separate repo to collect the binaries in preparation for a release. That way the main repo stays clean and binary-free, and you will be able to coordinate the collection/compilation of binaries.

Maybe manage that using GitHub LFS (so the history is limited in growth), but then you'll have a dependency on LFS. You could even link/submodule that repo into the main repo for dev work.

@casella
Collaborator Author

casella commented Nov 22, 2017

Yes, you could make a "dummy"/separate repo to collect the binaries in preparation for a release. That way the main repo stays clean and binary-free, and you will be able to coordinate the collection/compilation of binaries.

Sounds like a good idea. If I get it right, you fork the project when preparing the release, keep it up to date with the commits to the source code in the meantime, add the binaries, but then delete it after the release. Right?

I agree that adding a dependency on LFS may be overkill. After all, these files are not that large if you don't carry them along forever.

@bilderbuchi
Collaborator

bilderbuchi commented Nov 22, 2017

Sounds like a good idea. If I get it right, you fork the project when preparing the release, keep it up to date with the commits to the source code in the feature, add binaries, but then delete it after the release. Right?

Not quite. I meant a GitHub repo (or, if you prefer, an FTP server or another online data collection point) which just holds the compiled binaries. Then the devs do the work you talked about above:

...needs to have all the compilers (MSVC pro is not freeware) and all the operating systems installed, which is not trivial to set up. Given the small number of releases I envision, I guess we can do this manually, possibly sharing the workload. If someone volunteers to set that up, that would be great.

i.e., before a release they start compiling the lib on their various setups and upload the results to that repo, and you or some other release packager package it all together with the necessary sources into a release, which gets zipped and uploaded to the GitHub release section, i.e. https://github.com/modelica/ExternalMedia/releases

This means that the binaries are not in the source repo (so the repo stays lean), all tagged/published releases can easily be downloaded/distributed from that page (so end users don't have to know git), and you are still able to manually compile a release (so are not beholden to a CI system).

The only potential downside I see is how to do day-to-day development work. I am reasonably confident that you can solve this in this way:

  • The directory where the binaries reside is flagged as ignored by git (using the .gitignore file)
  • When making changes, developers work as usual, and compile the lib on their system with their toolchain/compiler/...
  • The compiled binaries are there for the devs, but don't end up in the source repo

Then

  • Before release, a release commit is specified, and all the devs compile and upload all the combinations we need to an (e.g.) ExternalMedia_binaries repo, to be collected and packaged into a release. (nota bene: you could also just use an FTP server, but a GH repo probably has the lowest barrier-to-entry, and you can nuke it at will later)
  • Users download releases as zip files from https://github.com/modelica/ExternalMedia/releases
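The ignore rule from the first list above could be sketched like this (the patterns assume binaries live under the library's Resources/Library tree; adjust to the actual layout):

```gitignore
# Keep locally compiled binaries out of the source repo;
# release binaries are attached to GitHub releases instead.
Resources/Library/**/*.a
Resources/Library/**/*.lib
Resources/Library/**/*.dll
Resources/Library/**/*.so
```

Devs then keep their freshly built libraries in place for local testing without ever being able to commit them by accident.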

@casella
Collaborator Author

casella commented Nov 23, 2017

I would just try to limit as much as possible the need for manual operations that could go wrong. A GitHub repo containing only the binaries requires you to copy them back and forth between the repo with the Modelica code and C source code and the repo for the binaries.

My idea of the forked repo was to have Git do this automatically, so you can always check out an up-to-date, fully functional version of the library including the binaries, both those that you compiled and those that others compiled, and test it on the fly on your computer with your installed Modelica tools, without worrying about anything. Maybe there are other smart ways to do so.

@bilderbuchi
Collaborator

Well, you said (IIRC) that you want to do manual compilation by many people to avoid the complexity of automated compilation with CI, so I suggested manual steps...

Is it realistic that you get an always-working dev version "without worrying", when people making changes just compile on their machine for the dev branch, as you seem to suggest? Doesn't this automatically fall behind, with a Linux/gcc binary available for the state 2 months ago, then some changes by a Windows dev, with a binary for that now 3 weeks old, and the last changes by a Linux-but-clang dev with the current state from yesterday? Assuming a Linux/gcc dev checked that state out, would that work?
My guess is no, which is why I suggested that precompiled binaries at arbitrary states don't make too much sense to include, and that it's better to provide binaries at release points for the "users", while the devs compile on the fly, on their machines, for their state, but leave the result in their local directory, ignored by git.

@casella
Collaborator Author

casella commented Nov 23, 2017

I agree: if we do truly multi-platform support, compiling at arbitrary points for different platforms and committing to a repo doesn't make much sense.

@beutlich

@casella What is the current state of the ExternalMedia update plan?

@greenwoodms06

greenwoodms06 commented Jan 24, 2019

@casella Will ExternalMedia no longer be updated? There appear to have been no updates since Nov 2017. This is a valuable resource, and it would be a shame for it to become obsolete.

@ibell

ibell commented Jan 24, 2019

I think the only one who might currently have the time and inclination to update ExternalMedia is @casella. I touch CoolProp from time to time, but that's about it.

@greenwoodms06

Agreed... I do hope there are plans/motivation to perform this work.

@slamer59

slamer59 commented Jan 29, 2019 via email

@beutlich

Is the v3.3.0 development still active? There were no commits at all in 2018.

And actually I wonder whether I should base pull requests on the master or the v3.3.0-dev branch.

@jowr
Collaborator

jowr commented May 5, 2021

I am going to close this now - all future discussions should get their own issues. In particular, the DLL discussion should be taken up when preparing for v4.0.0, since it seems like OpenModelica has some advanced ways of initializing values from external shared libraries.

@jowr jowr closed this as completed May 5, 2021