
[Merged by Bors] - feat(analysis/calculus/mean_value): remove assumption in strict_mono_on.strict_convex_on_of_deriv #15133

Closed · 5 commits

Conversation

sgouezel (Collaborator) commented Jul 5, 2022

Currently, the lemma `strict_mono_on.strict_convex_on_of_deriv` states that if a real function `f` is continuous on a convex set `D`, differentiable on its interior, and `deriv f` is strictly monotone on its interior, then `f` is strictly convex on `D`. We remove the differentiability assumption: since `deriv f` is strictly monotone, there is at most one point of non-differentiability (as `deriv f x = 0` at any point `x` where `f` is not differentiable), and the result remains true in this case, although the proof is a little more involved.

Of course, in essentially all applications the function will be differentiable, but the lemma becomes easier to use since the user no longer needs to prove this differentiability.
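For readers who want to see what changes concretely, here is a rough Lean 3 sketch of the statement before and after this PR. The `_old`/`_new` names, hypothesis names, binder order, and the `sorry` proofs are placeholders for illustration, not the actual mathlib declarations.

```lean
import analysis.calculus.mean_value

-- Before this PR (sketch): differentiability on the interior is a separate hypothesis.
theorem strict_mono_on.strict_convex_on_of_deriv_old {D : set ℝ} (hD : convex ℝ D)
  {f : ℝ → ℝ} (hf : continuous_on f D)
  (hf' : differentiable_on ℝ f (interior D))
  (hf'' : strict_mono_on (deriv f) (interior D)) :
  strict_convex_on ℝ D f :=
sorry

-- After this PR (sketch): the differentiability hypothesis is gone.  In mathlib,
-- `deriv f x = 0` whenever `f` is not differentiable at `x`, and a strictly
-- monotone function takes the value `0` at most once, so there is at most one
-- point of non-differentiability for the proof to handle.
theorem strict_mono_on.strict_convex_on_of_deriv_new {D : set ℝ} (hD : convex ℝ D)
  {f : ℝ → ℝ} (hf : continuous_on f D)
  (hf' : strict_mono_on (deriv f) (interior D)) :
  strict_convex_on ℝ D f :=
sorry
```

Callers that previously supplied a `differentiable_on` proof can simply drop that argument; the other hypotheses are unchanged.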


sgouezel added the awaiting-review label (The author would like community review of the PR) on Jul 5, 2022
urkud (Member) commented Jul 17, 2022

Thanks! 🎉
bors merge

github-actions bot added the ready-to-merge label (All that is left is for bors to build and merge this PR; remember you need to say `bors r+`.) and removed the awaiting-review label on Jul 17, 2022
bors bot pushed a commit that referenced this pull request Jul 17, 2022
…on.strict_convex_on_of_deriv (#15133)

bors bot commented Jul 17, 2022

Pull request successfully merged into master.

Build succeeded.

bors bot changed the title to "[Merged by Bors] - feat(analysis/calculus/mean_value): remove assumption in strict_mono_on.strict_convex_on_of_deriv" on Jul 17, 2022
bors bot closed this on Jul 17, 2022
bors bot deleted the strict_convex_relax branch on July 17, 2022 06:40
joelriou pushed a commit that referenced this pull request Jul 23, 2022
…on.strict_convex_on_of_deriv (#15133)
