Minimal kernel bases #28143

Closed

vneiger opened this issue Jul 9, 2019 · 17 comments

@vneiger
Contributor

vneiger commented Jul 9, 2019

New functionalities:

  • computation of shifted minimal kernel bases (e.g. via approximant bases at large order and/or via Zhou-Labahn-Storjohann 2012),
  • verification that a matrix is a shifted minimal kernel basis.

This should be done in a general context:

  • accepting non-uniform shifts,
  • allowing one to work row-wise or column-wise,
  • offering the possibility of obtaining the canonical basis (that is, the one in shifted Popov form).
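For concreteness, a rough sketch of how this could look from a Sage session; the method name minimal_kernel_basis and the keywords shifts, row_wise and normal_form are assumptions here, chosen to mirror the existing minimal_approximant_basis interface, and are not final:

```
# Illustrative sketch only: method name and keyword arguments are
# assumptions mirroring minimal_approximant_basis, not a fixed interface.
pR.<x> = GF(7)[]                      # run inside a Sage session
F = matrix(pR, [[x^2 + 3, 2*x],
                [x + 1,   5],
                [4*x^3,   x^2 + x]])  # 3 x 2, so it has a nontrivial left kernel

# shifted minimal left kernel basis K, i.e. K*F == 0 with K shift-reduced
s = [0, 2, 1]                         # non-uniform shift, one entry per row of F
K = F.minimal_kernel_basis(shifts=s, row_wise=True)
assert (K * F).is_zero()

# canonical choice: the unique basis in s-Popov form
P = F.minimal_kernel_basis(shifts=s, row_wise=True, normal_form=True)
```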

CC: @romainlebreton @johanrosenkilde @ke456 @vneiger

Component: algebra

Keywords: polynomial matrix, kernel basis, approximant basis

Author: Vincent Neiger

Branch/Commit: 46a8dea

Reviewer: Romain Lebreton, Pascal Giorgi, Johan Rosenkilde, Seung Gyu Hyun

Issue created by migration from https://trac.sagemath.org/ticket/28143

@vneiger vneiger added this to the sage-8.9 milestone Jul 9, 2019

@johanrosenkilde
Contributor

comment:4

Great that you're picking this up! Some initial comments:

  • There is already left_kernel and right_kernel which move into the fraction field and return a module. Perhaps the function here should be called left_kernel_basis, without the explicit "minimal". This is under the assumption that whenever a user asks for (a basis of) the (left/right) kernel of an F[x] matrix, he should/would invariably prefer a reduced basis of the kernel, represented over F[x] again. Note similarly that when you ask for a left_kernel().basis_matrix() of a Q-matrix, you get one in rref form.

  • Row-wise/Column-wise: Handled by optional arguments like in weak_popov_form() and friends, I presume?

  • We don't yet have M.approximant_basis() or some such. If following the first suggested approach for implementing this ticket's functionality, then we should add that as a stand-alone method.

  • Note that we don't yet have a M.popov_form() method. If we already had that, then the ticket's last suggested functionality really becomes a question of optimisation rather than added functionality.

  • "verification that a matrix is a shifted minimal kernel basis." I'm not sure I understand what this means. Is it a method on a matrix M to ask whether another given matrix K is a left/right kernel basis (minimality is just row/col reduced, right?). Doesn't this boil down to M * K == 0 and K.is_column_reduced() and K.is_saturated()? We don't have is_saturated() right now, of course.

@embray
Contributor

embray commented Dec 30, 2019

comment:5

Ticket retargeted after milestone closed

@embray embray modified the milestones: sage-8.9, sage-9.1 Dec 30, 2019
@vneiger
Contributor Author

vneiger commented Apr 1, 2020

Branch: u/vneiger/minimal_kernel_bases

@sagetrac-git
Mannequin

sagetrac-git mannequin commented Apr 3, 2020

Branch pushed to git repo; I updated commit sha1. New commits:

32bfc3b  fix small bug + add some examples
0c9bca0  minor doc fixes

@sagetrac-git
Mannequin

sagetrac-git mannequin commented Apr 3, 2020

Commit: 0c9bca0

@vneiger
Contributor Author

vneiger commented Apr 3, 2020

comment:8

Hi,

Thanks Johan for your input.

The ticket's functionalities are now implemented. Could you please check whether you see any issues?

Since minimal_approximant_basis now exists (answering your third point), I chose to follow the same naming, i.e. minimal_kernel_basis (answering your first point). Still, these names can be discussed, and we could drop the minimal_ prefix from both.

Your second and fourth points: yes, this is right. Also, since the implemented minimal_approximant_basis algorithm already allows returning the Popov form (with small overhead compared to the weak Popov form), and since the chosen kernel basis algorithm is essentially a call to that algorithm at sufficiently high order, it made sense to do it this way rather than computing a weak Popov kernel basis and then calling another normalization procedure (which indeed is not yet written; this is clearly a missing basic feature).
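For illustration, a rough sketch of this approach (kernel basis obtained from an approximant basis at high order) in a Sage session; the keyword names are those of the existing minimal_approximant_basis, and the order used below is a heuristic guess, not the bound derived in the branch:

```
# Sketch of the approximant-basis approach.  Any row of the approximant
# basis whose residual is exactly zero is a genuine kernel element; only
# the completeness of the resulting basis depends on the order, and the
# bound below is an illustrative guess, not the one used in the branch.
pR.<x> = GF(97)[]
F = matrix(pR, [[x^2 + 1, x],
                [3*x,     1],
                [x^3,     x^2 + 5*x]])
s = [0, 0, 0]                               # shift, one entry per row of F

d = max(f.degree() for f in F.list())       # maximum entry degree of F
sigma = (F.ncols() + 1) * d + 1             # heuristic order
P = F.minimal_approximant_basis(sigma, shifts=s, row_wise=True)

# keep the rows of P that are actual left kernel elements of F
K = matrix(pR, [P.row(i) for i in range(P.nrows())
                if (P.row(i) * F).is_zero()])
```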

For your last point: yes, this is right. I implemented this with plain multiplication for M*K == 0 (no Freivalds-style probabilistic check), and for the saturation I compute an HNF basis of the column space: this is the identity matrix iff the (full rank) matrix is row-saturated. Here as well, a Popov form algorithm would also work (instead of HNF) and would probably be somewhat faster.

@vneiger vneiger self-assigned this Apr 3, 2020
@mkoeppe
Member

mkoeppe commented Apr 14, 2020

comment:11

Batch modifying tickets that will likely not be ready for 9.1, based on a review of the ticket title, branch/review status, and last modification date.

@mkoeppe mkoeppe modified the milestones: sage-9.1, sage-9.2 Apr 14, 2020
@vneiger
Contributor Author

vneiger commented May 4, 2020

Changed reviewer from Romain Lebreton, Pascal Giorgi to Romain Lebreton, Pascal Giorgi, Johan Rosenkilde, Seung Gyu Hyun

@vneiger
Contributor Author

vneiger commented May 5, 2020

comment:13

Replying to @johanrosenkilde:

Great that you're picking this up! Some initial comments:

  • There is already left_kernel and right_kernel which move into the fraction field and return a module. Perhaps the function here should be called left_kernel_basis, without the explicit "minimal". This is under the assumption that whenever a user asks for (a basis of) the (left/right) kernel of an F[x] matrix, he should/would invariably prefer a reduced basis of the kernel, represented over F[x] again. Note similarly that when you ask for a left_kernel().basis_matrix() of a Q-matrix, you get one in rref form.

I guess we could make use of the new function to implement the core of left_kernel (we would just need to build a module from the computed basis matrix, if I follow correctly). One advantage is that we have freedom in the choice of shift here: we could take the one yielding the best performance, i.e. the row degrees of the input matrix.

  • Row-wise/Column-wise: Handled by optional arguments like in weak_popov_form() and friends, I presume?

Yes, this is how it is done.

  • We don't yet have M.approximant_basis() or some such. If following the first suggested approach for implementing this ticket's functionality, then we should add that as a stand-alone method.

We do have minimal_approximant_basis; indeed, the first approach builds directly upon it. The second approach was not incorporated in the end, since it did not provide sufficient gains (we would need faster polynomial matrix multiplication for it to make sense).

  • Note that we don't yet have a M.popov_form() method. If we already had that, then the ticket's last suggested functionality really becomes a question of optimisation rather than added functionality.

Indeed. By the way, a Popov form algorithm should be written; it is not much work, since there is already a weak Popov form algorithm.

  • "verification that a matrix is a shifted minimal kernel basis." I'm not sure I understand what this means. Is it a method on a matrix M to ask whether another given matrix K is a left/right kernel basis (minimality is just row/col reduced, right?). Doesn't this boil down to M * K == 0 and K.is_column_reduced() and K.is_saturated()? We don't have is_saturated() right now, of course.

Yes, indeed. It boils down to this, but I think it is convenient to have a method for the check. It was implemented using the characterization of saturation via unimodular column bases; this is checked by verifying that the column Hermite form is the identity matrix (ideally, once a Popov form algorithm exists, we should use a column Popov form for likely better performance in the usual cases).
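For concreteness, a small sketch of this saturation test for a row-wise kernel basis K, assuming hermite_form on polynomial matrices behaves as for other PID matrices (the helper name is illustrative):

```
# Sketch of the saturation test: a full row rank K over GF(q)[x] is
# row-saturated iff its column Hermite form is the identity, i.e. the
# row Hermite form of K^T is [I; 0].
pR.<x> = GF(7)[]

K_sat   = matrix(pR, [[1, x, x^2], [0, 1, x]])       # saturated rows
K_unsat = matrix(pR, [[x, x^2, x^3], [0, x, x^2]])   # all entries share a factor x

def is_row_saturated(K):
    H = K.transpose().hermite_form()   # zero rows, if any, end up at the bottom
    k = K.nrows()
    return H.submatrix(0, 0, k, k).is_one() and H.submatrix(k, 0).is_zero()

# is_row_saturated(K_sat) should be True, is_row_saturated(K_unsat) False
```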

@mkoeppe mkoeppe modified the milestones: sage-9.2, sage-9.3 Sep 5, 2020
@vneiger
Contributor Author

vneiger commented Feb 18, 2021

comment:15

Adding GitHub accounts in CC.

@vneiger
Contributor Author

vneiger commented Feb 18, 2021

@ke456
Mannequin

ke456 mannequin commented Feb 19, 2021

Changed commit from 0c9bca0 to 46a8dea

@ke456
Mannequin

ke456 mannequin commented Feb 19, 2021

New commits:

46a8dea  Merge branch 'develop' into t/28143/minimal_kernel_bases

@ke456 ke456 mannequin added s: positive review and removed s: needs review labels Feb 19, 2021
@ke456
Mannequin

ke456 mannequin commented Feb 19, 2021

comment:18

The current implementation computes minimal kernel bases via approximant bases at large order, not the algorithm of Zhou-Labahn-Storjohann. It accepts non-uniform shifts, works row-wise or column-wise, and can optionally return the output in shifted Popov form.
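As a usage illustration of what is described above (column-wise computation, non-uniform shift, shifted Popov form), assuming the keyword names mirror minimal_approximant_basis and that the verification method added by the ticket is called on the candidate basis; the exact names in the branch may differ:

```
# Usage sketch; keyword and method names are assumptions mirroring
# minimal_approximant_basis, the branch's exact interface may differ.
pR.<x> = GF(7)[]
F = matrix(pR, [[x^2 + 3, 2*x + 1, x],
                [x,       4,       1]])   # 2 x 3, nontrivial right kernel

s = [0, 3, 1]                             # non-uniform shift, one entry per column of F
# column-wise: right kernel basis K with F*K == 0, returned in s-Popov form
K = F.minimal_kernel_basis(shifts=s, row_wise=False, normal_form=True)
assert (F * K).is_zero()

# verification counterpart from this ticket, called on the candidate basis
K.is_minimal_kernel_basis(F, shifts=s, row_wise=False, normal_form=True)
```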

@vbraun
Member

vbraun commented Mar 1, 2021

Changed branch from u/gh-vneiger/minimal_kernel_bases to 46a8dea
