
Remove binpacker from ALISA #2463

jjhugues opened this issue Sep 28, 2020 · 1 comment · Fixed by #2551


When invoking the binpacker from ALISA and the system is not schedulable, the following code is triggered:

./analyses/ org.osate.ui.dialogs.Dialog.showError("Application Binding Results",

This causes a window to pop up with an error message.

This corresponds to the following ALISA verification method:

method ResourceAllocationScheduling : "Allocate threads to processors and assure schedulability." [
	plugin BinPack
	category Quality.Timing
	description "Bind all threads to processors while maintaining schedulability and meeting binding constraints. This analysis uses a BinPacking technique by Dio DeNiz."
]
@lwrage lwrage added the alisa label Sep 28, 2020
@lwrage lwrage added this to the 2.9.1 milestone Oct 2, 2020
@joeseibel joeseibel changed the title Windows pops up when running binpacker from ALISA Remove binpacker from ALISA Dec 18, 2020

The binpacker does not produce any results that ALISA can use: the binpacker reports all of its results through a dialog, while ALISA looks for problem markers, so ALISA has no way of knowing whether the binpacking succeeded or failed. Since we are planning to rewrite the binpacker, it doesn't make sense to fix bugs in the current implementation. Therefore, we are going to remove binpacking support from ALISA. The binpacker will still be available as a standard analysis from the Analyses menu.
