[Merged by Bors] - feat(linear_algebra/tensor_power): the tensor powers form a graded algebra #10255
Closed
84 commits
All 84 commits are by eric-wieser:

3482a3a  wip
68f90bf  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
e912ffa  Working definition of mul
f5e2f45  mul_equiv mostly proved
01801aa  add reindexing
e856f0d  wip
27dfd0f  Merge branch 'master' of github.com:leanprover-community/mathlib into…
d05bb1d  Get it working
7576243  Tidy
5a25d1d  Merge branch 'master' of github.com:leanprover-community/mathlib into…
21c472a  Move to a new file
5953273  wip
0db761e  wip
970afbd  Merge branch 'master' of github.com:leanprover-community/mathlib into…
bf21b53  more wip
86632a7  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
52c2c74  remove junk
48b9a49  Fix for latest mathlib
ea3e579  Add all the sorries
6b7cd9f  feat(linear_algebra/pi_tensor_prod): generalize actions and provide m…
c41362f  squeeze simps
5426586  shortcut instances
f7b7fcf  Getting close!
5f8b45e  Sorrys pushed further away
b1ca876  remove a duplicate lemma
d7a0f1e  Fix universe variables
a5aa816  rename concat to append, add repeat
1204301  Remove incomplete stuff
64d18f7  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
1fe183c  State the equivalence. Untested due to orange bars.
5a850b4  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
4a09e21  fix slightly
eb0f0a5  wip
4532666  Tidy up
b3c44ab  Fix typos in graded_monoid
0290f12  remove three sorries
2e247cf  Most of the iso sorrys gone
1fa857d  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
7f1a1f3  Clean up a sorry, golf
8ddce01  Sorry-free!
088aa65  Split the file
70daf96  wip
09e0f5a  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
bb8d310  Fill the hard sorry
884bdcb  fix
b976668  move lemmas
452b42a  feat(algebra/graded_monoid): dependent products
e0ae1d6  docstring, better name
4b4b1e9  tweaks
4d72e5c  Merge remote-tracking branch 'origin/master' into eric-wieser/list.dprod
a805153  Merge branch 'master' into eric-wieser/list.dprod
e1794bb  Add lemmas about subtypes
77058d5  reorder
bee5300  Merge branch 'eric-wieser/list.dprod' into eric-wieser/tensor_algebra…
e1beed8  fix bad merge
bc8a7f3  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
df8e6f9  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
f7c011a  fix import
489fe70  fix
86e9d63  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
926be50  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
ef40618  get building again
edd4bf8  add a cast lemma
03a8df1  squeeze the sorry
9051b0d  wip
ca4a585  fix
e99c879  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
a33c5c3  fix formatting
18fca3f  fix build errors
aaec256  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
35736f5  remove redundant defs
6852bb1  fixes after merging
fbcd743  remove unused file
11d1e40  remove unused line
00b0663  Merge remote-tracking branch 'origin/master' into eric-wieser/tensor_…
aa4a064  remove duplicate lemma
56ee3ee  move lemma to a different file
9d92edb  golf
5baa898  remove some dead code, golf a proof
76452dd  adjust a lemma statement
d3fd71a  rename algebra_map to algebra_map₀
f5b7f79  tidy parens
2ff50fc  golf and comment
9362a5a  move to appropriate file
@@ -0,0 +1,163 @@ (new file)

/-
Copyright (c) 2021 Eric Wieser. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Eric Wieser
-/
import linear_algebra.tensor_algebra.basic
import linear_algebra.tensor_power

/-!
# Tensor algebras as direct sums of tensor powers

In this file we show that `tensor_algebra R M` is isomorphic to a direct sum of tensor powers, as
`tensor_algebra.equiv_direct_sum`.
-/

open_locale direct_sum tensor_product

variables {R M : Type*} [comm_semiring R] [add_comm_monoid M] [module R M]

namespace tensor_power

/-- The canonical embedding from a tensor power to the tensor algebra -/
def to_tensor_algebra {n} : ⨂[R]^n M →ₗ[R] tensor_algebra R M :=
pi_tensor_product.lift (tensor_algebra.tprod R M n)

@[simp]
lemma to_tensor_algebra_tprod {n} (x : fin n → M) :
  tensor_power.to_tensor_algebra (pi_tensor_product.tprod R x) = tensor_algebra.tprod R M n x :=
pi_tensor_product.lift.tprod _

@[simp]
lemma to_tensor_algebra_ghas_one :
  (@graded_monoid.ghas_one.one _ (λ n, ⨂[R]^n M) _ _).to_tensor_algebra = 1 :=
tensor_power.to_tensor_algebra_tprod _

@[simp]
lemma to_tensor_algebra_ghas_mul {i j} (a : ⨂[R]^i M) (b : ⨂[R]^j M) :
  (@graded_monoid.ghas_mul.mul _ (λ n, ⨂[R]^n M) _ _ _ _ a b).to_tensor_algebra
    = a.to_tensor_algebra * b.to_tensor_algebra :=
begin
  -- change `a` and `b` to `tprod R a` and `tprod R b`
  rw [tensor_power.ghas_mul_eq_coe_linear_map, ←linear_map.compr₂_apply,
    ←@linear_map.mul_apply' R, ←linear_map.compl₂_apply, ←linear_map.comp_apply],
  refine linear_map.congr_fun (linear_map.congr_fun _ a) b,
  clear a b,
  ext a b,
  simp only [linear_map.compr₂_apply, linear_map.mul_apply',
    linear_map.compl₂_apply, linear_map.comp_apply, linear_map.comp_multilinear_map_apply,
    pi_tensor_product.lift.tprod, tensor_power.tprod_mul_tprod,
    tensor_power.to_tensor_algebra_tprod, tensor_algebra.tprod_apply,
    ←ghas_mul_eq_coe_linear_map],
  refine eq.trans _ list.prod_append,
  congr',
  rw [←list.map_of_fn _ (tensor_algebra.ι R), ←list.map_of_fn _ (tensor_algebra.ι R),
    ←list.map_of_fn _ (tensor_algebra.ι R), ←list.map_append, list.of_fn_fin_append],
end

@[simp]
lemma to_tensor_algebra_galgebra_to_fun (r : R) :
  (@direct_sum.galgebra.to_fun _ R (λ n, ⨂[R]^n M) _ _ _ _ _ _ _ r).to_tensor_algebra
    = algebra_map _ _ r :=
by rw [tensor_power.galgebra_to_fun_def, tensor_power.algebra_map₀_eq_smul_one,
  linear_map.map_smul, tensor_power.to_tensor_algebra_ghas_one,
  algebra.algebra_map_eq_smul_one]

end tensor_power

namespace tensor_algebra

/-- The canonical map from a direct sum of tensor powers to the tensor algebra. -/
def of_direct_sum : (⨁ n, ⨂[R]^n M) →ₐ[R] tensor_algebra R M :=
direct_sum.to_algebra _ _ (λ n, tensor_power.to_tensor_algebra)
  tensor_power.to_tensor_algebra_ghas_one
  (λ i j, tensor_power.to_tensor_algebra_ghas_mul)
  (tensor_power.to_tensor_algebra_galgebra_to_fun)

@[simp] lemma of_direct_sum_of_tprod {n} (x : fin n → M) :
  of_direct_sum (direct_sum.of _ n (pi_tensor_product.tprod R x)) = tprod R M n x :=
(direct_sum.to_add_monoid_of _ _ _).trans (tensor_power.to_tensor_algebra_tprod _)

/-- The canonical map from the tensor algebra to a direct sum of tensor powers. -/
def to_direct_sum : tensor_algebra R M →ₐ[R] ⨁ n, ⨂[R]^n M :=
tensor_algebra.lift R $
  direct_sum.lof R ℕ (λ n, ⨂[R]^n M) _ ∘ₗ
    (linear_equiv.symm $
      pi_tensor_product.subsingleton_equiv (0 : fin 1) : M ≃ₗ[R] _).to_linear_map

@[simp] lemma to_direct_sum_ι (x : M) :
  to_direct_sum (ι R x) =
    direct_sum.of (λ n, ⨂[R]^n M) _ (pi_tensor_product.tprod R (λ _ : fin 1, x)) :=
tensor_algebra.lift_ι_apply _ _

lemma of_direct_sum_comp_to_direct_sum :
  of_direct_sum.comp to_direct_sum = alg_hom.id R (tensor_algebra R M) :=
begin
  ext,
  simp [direct_sum.lof_eq_of, tprod_apply],
end

@[simp] lemma of_direct_sum_to_direct_sum (x : tensor_algebra R M) :
  of_direct_sum x.to_direct_sum = x :=
alg_hom.congr_fun of_direct_sum_comp_to_direct_sum x

@[simp] lemma mk_reindex_cast {n m : ℕ} (h : n = m) (x : ⨂[R]^n M) :
  graded_monoid.mk m (pi_tensor_product.reindex R M (equiv.cast $ congr_arg fin h) x) =
    graded_monoid.mk n x :=
eq.symm (pi_tensor_product.graded_monoid_eq_of_reindex_cast h rfl)

@[simp] lemma mk_reindex_fin_cast {n m : ℕ} (h : n = m) (x : ⨂[R]^n M) :
  graded_monoid.mk m (pi_tensor_product.reindex R M (fin.cast h).to_equiv x) =
    graded_monoid.mk n x :=
by rw [fin.cast_to_equiv, mk_reindex_cast h]

/-- The product of tensor products made of a single vector is the same as a single product of
all the vectors. -/
lemma _root_.tensor_power.list_prod_graded_monoid_mk_single (n : ℕ) (x : fin n → M) :
  ((list.fin_range n).map
    (λ a, (graded_monoid.mk _ (pi_tensor_product.tprod R (λ i : fin 1, x a))
      : graded_monoid (λ n, ⨂[R]^n M)))).prod =
    graded_monoid.mk n (pi_tensor_product.tprod R x) :=
begin
  refine fin.cons_induction _ _ x; clear x,
  { rw [list.fin_range_zero, list.map_nil, list.prod_nil],
    refl, },
  { intros n x₀ x ih,
    rw [list.fin_range_succ_eq_map, list.map_cons, list.prod_cons, list.map_map,
      function.comp],
    simp_rw [fin.cons_zero, fin.cons_succ],
    rw [ih, graded_monoid.mk_mul_mk, tensor_power.tprod_mul_tprod],
    refine tensor_power.graded_monoid_eq_of_cast (add_comm _ _) _,
    dsimp only [graded_monoid.mk],
    rw [tensor_power.cast_tprod],
    simp_rw [fin.append_left_eq_cons, function.comp],
    congr' 1 with i,
    congr' 1,
    rw [fin.cast_trans, fin.cast_refl, order_iso.refl_apply] },
end

lemma to_direct_sum_tensor_power_tprod {n} (x : fin n → M) :
  to_direct_sum (tprod R M n x) = direct_sum.of _ n (pi_tensor_product.tprod R x) :=
begin
  rw [tprod_apply, alg_hom.map_list_prod, list.map_of_fn, function.comp],
  simp_rw to_direct_sum_ι,
  dsimp only,
  rw direct_sum.list_prod_of_fn_of_eq_dprod,
  apply direct_sum.of_eq_of_graded_monoid_eq,
  rw graded_monoid.mk_list_dprod,
  rw tensor_power.list_prod_graded_monoid_mk_single,
end

lemma to_direct_sum_comp_of_direct_sum :
  to_direct_sum.comp of_direct_sum = alg_hom.id R (⨁ n, ⨂[R]^n M) :=
begin
  ext,
  simp [direct_sum.lof_eq_of, -tprod_apply, to_direct_sum_tensor_power_tprod],
end

@[simp] lemma to_direct_sum_of_direct_sum (x : ⨁ n, ⨂[R]^n M) :
  (of_direct_sum x).to_direct_sum = x :=
alg_hom.congr_fun to_direct_sum_comp_of_direct_sum x

/-- The tensor algebra is isomorphic to a direct sum of tensor powers. -/
@[simps]
def equiv_direct_sum : tensor_algebra R M ≃ₐ[R] ⨁ n, ⨂[R]^n M :=
alg_equiv.of_alg_hom to_direct_sum of_direct_sum
  to_direct_sum_comp_of_direct_sum
  of_direct_sum_comp_to_direct_sum

end tensor_algebra
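As an illustrative sketch of how the equivalence behaves on generators (not part of the PR; it assumes the same `variables` as the file above, and that the coercion of `equiv_direct_sum` unfolds definitionally to `to_direct_sum` so that `to_direct_sum_ι` closes the goal):

```lean
-- Sketch only: under `equiv_direct_sum`, the generator `ι R m` lands in the
-- degree-1 summand as the single-factor tensor `tprod R (λ _, m)`.
-- The proof term relies on `equiv_direct_sum` coercing to `to_direct_sum`.
example (m : M) :
  tensor_algebra.equiv_direct_sum (tensor_algebra.ι R m) =
    direct_sum.of (λ n, ⨂[R]^n M) 1 (pi_tensor_product.tprod R (λ _ : fin 1, m)) :=
tensor_algebra.to_direct_sum_ι m
```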
Review comments:

- Mentioning the name of the instance in the lemma is surprising to me. Why so?
- This is about `ghas_one.one : A 0`, not `has_one.one : A`; we don't have a convention in place for lemmas about the former.
- I would have suggested `gone` 💨
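To make the `ghas_one` point in the review concrete, a minimal sketch (mathlib 3 names as used in this PR; the explicit-argument pattern mirrors the file above and is an assumption about how the projection elaborates):

```lean
-- Sketch of the distinction discussed above: for a graded family `A : ℕ → Type*`,
-- `graded_monoid.ghas_one A` places the unit in the degree-0 piece (`one : A 0`),
-- whereas a plain `has_one` instance would ask for a `1` in one single type.
-- This is why the lemma names here mention the instance explicitly.
example (A : ℕ → Type*) [graded_monoid.ghas_one A] : A 0 :=
@graded_monoid.ghas_one.one _ A _ _
```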