Search gives Error: Anomaly "Uncaught exception Not_found." #17963
Comments
@coqbot minimize coq.dev git clone https://github.com/JasonGross/neural-net-coq-interp.git --branch=zzz-bug-anomaly-search
cd neural-net-coq-interp
make theories/bug_search_anom_not_found_03.vo -j2
Hey @JasonGross, the coq bug minimizer is running your script, I'll come back to you with the results once it's done.
@JasonGross, Minimized File /github/workspace/neural-net-coq-interp/theories/bug_search_anom_not_found_03.v (full log on GitHub Actions)
Partially Minimized Coq File (could not inline NeuralNetInterp.TransformerLens.HookedTransformer)
(* -*- mode: coq; coq-prog-args: ("-emacs" "-q" "-w" "+implicit-core-hint-db,+implicits-in-term,+non-reversible-notation,+deprecated-intros-until-0,+deprecated-focus,+unused-intro-pattern,+variable-collision,+unexpected-implicit-declaration,+omega-is-deprecated,+deprecated-instantiate-syntax,+non-recursive,+undeclared-scope,+deprecated-hint-rewrite-without-locality,+deprecated-hint-without-locality,+deprecated-instance-without-locality,+deprecated-typeclasses-transparency-without-locality,-ltac2-missing-notation-var,unsupported-attributes" "-w" "-deprecated-native-compiler-option" "-native-compiler" "no" "-R" "/github/workspace/neural-net-coq-interp/theories" "NeuralNetInterp" "-Q" "/github/workspace/cwd" "Top" "-Q" "/home/coq/.opam/4.13.1+flambda/lib/coq/user-contrib/Bignums" "Bignums" "-Q" "/home/coq/.opam/4.13.1+flambda/lib/coq/user-contrib/Ltac2" "Ltac2" "-top" "NeuralNetInterp.bug_search_anom_not_found_03") -*- *)
(* File reduced by coq-bug-minimizer from original input, then from 220 lines to 39 lines, then from 52 lines to 522 lines, then from 524 lines to 229 lines, then from 699 lines to 216 lines, then from 228 lines to 118 lines, then from 131 lines to 665 lines, then from 670 lines to 135 lines, then from 148 lines to 1326 lines, then from 1331 lines to 173 lines, then from 186 lines to 788 lines, then from 794 lines to 190 lines, then from 204 lines to 663 lines, then from 668 lines to 216 lines, then from 230 lines to 364 lines, then from 370 lines to 216 lines, then from 230 lines to 992 lines, then from 995 lines to 246 lines, then from 260 lines to 453 lines, then from 455 lines to 265 lines, then from 279 lines to 349 lines, then from 354 lines to 275 lines, then from 289 lines to 326 lines, then from 332 lines to 277 lines, then from 283 lines to 278 lines *)
(* coqc version 8.19+alpha compiled with OCaml 4.13.1
coqtop version buildkitsandbox:/home/coq/.opam/4.13.1+flambda/.opam-switch/build/coq-core.dev/_build/default,master (61ee398ed32f9334dd664ea8ed2697178e6e3844)
Modules that could not be inlined: NeuralNetInterp.TransformerLens.HookedTransformer
Expected coqc runtime on this file: 1.021 sec *)
Require NeuralNetInterp.TransformerLens.HookedTransformer.
Module Export Hetero.
Definition relation A B := A -> B -> Prop.
Module Export NeuralNetInterp_DOT_Util_DOT_Program_DOT_Basics_DOT_Dependent_WRAPPED.
Module Export Dependent.
Import Coq.Program.Basics.
Declare Scope type_function_scope.
Delimit Scope type_function_scope with type_function.
Definition type_function := Type -> Type.
Notation "F ∘ G" := (fun A : Type => F%type_function (G%type_function A)) : type_function_scope.
End Dependent.
End NeuralNetInterp_DOT_Util_DOT_Program_DOT_Basics_DOT_Dependent_WRAPPED.
Module Export NeuralNetInterp_DOT_Util_DOT_Relations_DOT_Relation_Definitions_DOT_Dependent_WRAPPED.
Module Export Dependent.
Declare Scope dependent_signature_scope.
Definition relation (F : type_function)
:= forall A B, Hetero.relation A B -> Hetero.relation (F A) (F B).
Module Export RelationsNotations.
Delimit Scope dependent_signature_scope with dependent_signature.
Bind Scope dependent_signature_scope with relation.
Notation "R ∘ R'" := (fun (A B : Type) (RAB : Hetero.relation A B) => R%dependent_signature _ _ (R'%dependent_signature A B RAB)) : dependent_signature_scope.
End RelationsNotations.
End Dependent.
End NeuralNetInterp_DOT_Util_DOT_Relations_DOT_Relation_Definitions_DOT_Dependent_WRAPPED.
Module Export Dependent.
Section Proper.
Context {F : Type -> Type}.
Class Proper (R : Dependent.relation F) (m : forall T, F T) : Prop :=
proper_prf : forall {A B} (R0 : Hetero.relation A B), R A B R0 (m A) (m B).
Definition respectful_hetero {A : Type -> Type} {B : forall x, A x -> Type}
(R : Dependent.relation A)
(R' : forall x y, Hetero.relation x y -> forall x' y', Hetero.relation (B x x') (B y y'))
: Dependent.relation (fun T => forall a : A T, B T a).
exact (fun a0 b0 R0 f g => forall x y, R _ _ R0 x y -> R' _ _ R0 _ _ (f x) (g y)).
Defined.
Definition respectful {A B : Type -> Type} (R : Dependent.relation A) (R' : Dependent.relation B) : Dependent.relation (fun T => A T -> B T)
:= Eval cbv [respectful_hetero] in
@respectful_hetero A (fun T _ => B T) R (fun x y R0 _ _ => R' x y R0).
End Proper.
Module Export ProperNotations.
Notation " R ==> R' " := (@respectful _ _ (R%dependent_signature) (R'%dependent_signature))
(right associativity, at level 55) : dependent_signature_scope.
End ProperNotations.
Notation idR := (fun (A B : Type) (R : Hetero.relation A B) => R).
Notation const R := (match _ as F return forall A B, Hetero.relation A B -> Hetero.relation F F with
| F => fun (A B : Type) (_ : Hetero.relation A B) => R
end) (only parsing).
Module Export Option.
Section Relations.
Definition option_eq {A B} eq (x : option A) (y : option B) :=
match x with
| None => y = None
| Some ax => match y with
| None => False
| Some ay => eq ax ay
end
end.
End Relations.
Definition invert_Some {A} (x : option A)
: (if x then A else unit).
Admitted.
#[export] Instance invert_Some_Proper_dep
: Dependent.Proper
(@Dependent.respectful_hetero
(fun T => option T)
(fun T (x : option T) => if x then _ else _)
(@option_eq)
(fun A B R x y => match x, y with
| Some _, Some _ => R
| None, None => fun _ _ => True
| Some _, None | None, Some _ => fun _ _ => False
end))
(@invert_Some).
Admitted.
Module Export NeuralNetInterp_DOT_Torch_DOT_Tensor_DOT_Instances_WRAPPED.
Module Export Instances.
Import NeuralNetInterp.Torch.Tensor.
Module Export Tensor.
Definition eqfR_rank {r} : Dependent.relation (@tensor_of_rank r).
exact (fun A B R x y => forall i, R (x i) (y i)).
Defined.
Definition eqfR {r s} : Dependent.relation (@tensor r s).
exact (eqfR_rank).
Defined.
#[global] Arguments eqfR {r s} [A B] R x y.
Notation eqf := (eqfR eq).
End Tensor.
End Instances.
End NeuralNetInterp_DOT_Torch_DOT_Tensor_DOT_Instances_WRAPPED.
Module Export Instances.
Import NeuralNetInterp.TransformerLens.HookedTransformer.
Module Export HookedTransformer.
Definition block_params_type_genR n_heads d_model d_head nt : Dependent.relation (block_params_type_gen n_heads d_model d_head nt).
Admitted.
Module Export HookedTransformer.
#[export] Instance masked_attn_scores_Proper_dep {d_vocab n_heads d_model d_head n_ctx r batch pos normalization_type}
: Dependent.Proper
(Dependent.idR
==> (Dependent.const eq ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR)
==> Dependent.const (fun _ _ => True)
==> Dependent.idR
==> Tensor.eqfR
==> Tensor.eqfR
==> List.Forall2 ∘ @block_params_type_genR n_heads d_model d_head normalization_type
==> Dependent.const eq
==> Dependent.const Tensor.eqf
==> @option_eq ∘ Tensor.eqfR)
(@HookedTransformer.HookedTransformer.masked_attn_scores d_vocab n_heads d_model d_head n_ctx r batch pos normalization_type).
Admitted.
End HookedTransformer.
End HookedTransformer.
End Instances.
Module Export Parameters.
Import Coq.Floats.Floats.
Import Coq.NArith.NArith.
Import NeuralNetInterp.TransformerLens.HookedTransformer.Config.Common.
Local Open Scope float_scope.
Module cfg <: CommonConfig.
Definition d_model := 32%N.
Definition n_ctx := 2%N.
Definition d_head := 32%N.
Definition n_heads := 1%N.
Definition d_vocab := 64%N.
Definition eps := 1e-05.
Definition normalization_type := Some LN.
Definition d_vocab_out := 64%N.
End cfg.
End Parameters.
Import Coq.Floats.Floats.
Import Coq.QArith.QArith.
Import Coq.Lists.List.
Import NeuralNetInterp.Util.Default.
Import NeuralNetInterp.Util.Pointed.
Import NeuralNetInterp.Util.Arith.Classes.
Import NeuralNetInterp.Util.Arith.Instances.
Import NeuralNetInterp.Torch.Tensor.
Import NeuralNetInterp.TransformerLens.HookedTransformer.
Import Instances.Truncating.
Module Model (cfg : Config).
Module Export HookedTransformer.
Section __.
Context {r} {batch : Shape r} {pos}
(s := (batch ::' pos)%shape)
(resid_shape := (s ::' cfg.d_model)%shape)
{A} {coer_float : has_coer float A} {coerZ : has_coer Z A}
{addA : has_add A} {subA : has_sub A} {mulA : has_mul A} {divA : has_div A}
{sqrtA : has_sqrt A} {expA : has_exp A}
{use_checkpoint : with_default "use_checkpoint" bool true}.
Let coerA' (x : float) : A.
Admitted.
#[local] Coercion coerA' : float >-> A.
Let coer_ln_tensor : cfg.ln_tensor float -> cfg.ln_tensor A.
Admitted.
Definition coer_blocks_params
:= List.map
(fun '((W_Q, W_K, W_V, W_O,
b_Q, b_K, b_V, b_O,
ln1_w, ln1_b) : cfg.block_params_type float)
=> ((W_Q:tensor _ A), (W_K:tensor _ A), (W_V:tensor _ A), (W_O:tensor _ A),
(b_Q:tensor _ A), (b_K:tensor _ A), (b_V:tensor _ A), (b_O:tensor _ A),
coer_ln_tensor ln1_w, coer_ln_tensor ln1_b)).
Local Definition masked_attn_scores (n : nat) (tokens : tensor s IndexType)
: option (tensor (batch ::' cfg.n_heads ::' pos ::' pos) A).
exact (HookedTransformer.HookedTransformer.masked_attn_scores
(A:=A) (n_ctx:=cfg.n_ctx) (normalization_type:=cfg.normalization_type)cfg.eps
cfg.W_E cfg.W_pos
(coer_blocks_params cfg.blocks_params)
n tokens).
Defined.
End __.
End HookedTransformer.
End Model.
Import ListNotations.
Module cfg <: Config.
Include Parameters.cfg.
Parameter W_E : tensor [d_vocab; d_model] float.
Parameter W_pos : tensor [n_ctx; d_model] float.
Parameter W_U : tensor [d_model; d_vocab_out] float.
Parameter b_U : tensor [d_vocab_out] float.
Notation ln_tensor A := (ln_tensor_gen d_model normalization_type A).
Parameter ln_final_w : ln_tensor float.
Parameter ln_final_b : ln_tensor float.
Notation block_params_type A := (block_params_type_gen n_heads d_model d_head normalization_type A).
Parameter block_params : block_params_type float.
Definition blocks_params := [block_params].
End cfg.
Include Model cfg.
Section with_batch.
Context {r} {batch : Shape r} {pos}
(s := (batch ::' pos)%shape)
(resid_shape := (s ::' cfg.d_model)%shape)
{return_per_token : with_default "return_per_token" bool false}
{A} {coer_float : has_coer float A} {coerZ : has_coer Z A}
(defaultA : pointed A := @coer _ _ coerZ point)
{addA : has_add A} {subA : has_sub A} {mulA : has_mul A} {divA : has_div A}
{ltbA : has_ltb A}
{oppA : has_opp A} {sqrtA : has_sqrt A} {expA : has_exp A} {lnA : has_ln A}
{use_checkpoint : with_default "use_checkpoint" bool true}.
Definition masked_attn_scores (tokens : tensor s IndexType)
: tensor (batch ::' cfg.n_heads ::' pos ::' pos) A.
exact (Option.invert_Some
(HookedTransformer.masked_attn_scores (A:=A) 0 tokens)).
Defined.
End with_batch.
#[export] Instance masked_attn_scores_Proper_dep {r batch pos}
: Dependent.Proper
((Dependent.const eq ==> Dependent.idR)
==> (Dependent.const eq ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR)
==> Dependent.const (fun _ _ => True)
==> Dependent.const Tensor.eqf
==> Tensor.eqfR)
(@masked_attn_scores r batch pos).
Proof.
cbv [masked_attn_scores].
pose proof (@HookedTransformer.HookedTransformer.masked_attn_scores_Proper_dep) as H.
repeat intro.
repeat (let v := open_constr:(_) in specialize (H v)).
move H at bottom.
revert H.
lazymatch goal with
| [ |- ?R _ _ ?R'' ?x ?y -> ?R' (invert_Some ?x' ?i) (invert_Some ?y' ?i) ]
=> unify x x'; unify y y'; unify R'' R'; set (x'':=x); set (y'':=y);
intro H;
refine (@invert_Some_Proper_dep _ _ (Tensor.eqfR R') x y H i)
end.
Unshelve.
Search HookedTransformer.block_params_type_genR.
Intermediate Coq File (useful for debugging if minimization did not go as far as you wanted) (truncated to 8.0KiB; full 31KiB file on GitHub Actions Artifacts under
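The minimized file above is built around the `Dependent.relation` pattern: lifting a heterogeneous relation on element types to a relation on applications of a type function. A minimal self-contained sketch of that pattern (illustrative names only, not part of the repro file):

```coq
(* Illustrative sketch of the Dependent.relation pattern from the
   minimized file above; hrel, drel and optionR are hypothetical names. *)
Definition hrel (A B : Type) := A -> B -> Prop.
(* A dependent relation transports a relation on A/B to one on F A/F B. *)
Definition drel (F : Type -> Type) :=
  forall A B, hrel A B -> hrel (F A) (F B).
(* Instance for option, in the spirit of option_eq in the file above. *)
Definition optionR : drel option :=
  fun A B R x y =>
    match x, y with
    | Some a, Some b => R a b
    | None, None => True
    | _, _ => False
    end.
```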
@coqbot minimize coq.dev git clone https://github.com/JasonGross/neural-net-coq-interp.git --branch=zzz-bug-anomaly-search
cd neural-net-coq-interp
make theories/bug_search_anom_not_found_03.vo -j2
Hey @JasonGross, the coq bug minimizer is running your script, I'll come back to you with the results once it's done.
@JasonGross, Minimized File /github/workspace/neural-net-coq-interp/theories/bug_search_anom_not_found_03.v (full log on GitHub Actions)
Partially Minimized Coq File (could not inline Ltac2.Array)
(* -*- mode: coq; coq-prog-args: ("-emacs" "-q" "-w" "+implicit-core-hint-db,+implicits-in-term,+non-reversible-notation,+deprecated-intros-until-0,+deprecated-focus,+unused-intro-pattern,+variable-collision,+unexpected-implicit-declaration,+omega-is-deprecated,+deprecated-instantiate-syntax,+non-recursive,+undeclared-scope,+deprecated-hint-rewrite-without-locality,+deprecated-hint-without-locality,+deprecated-instance-without-locality,+deprecated-typeclasses-transparency-without-locality,-ltac2-missing-notation-var,unsupported-attributes" "-w" "-deprecated-native-compiler-option" "-native-compiler" "no" "-R" "/github/workspace/neural-net-coq-interp/theories" "NeuralNetInterp" "-Q" "/github/workspace/cwd" "Top" "-Q" "/home/coq/.opam/4.13.1+flambda/lib/coq/user-contrib/Bignums" "Bignums" "-Q" "/home/coq/.opam/4.13.1+flambda/lib/coq/user-contrib/Ltac2" "Ltac2" "-top" "NeuralNetInterp.bug_search_anom_not_found_03") -*- *)
(* File reduced by coq-bug-minimizer from original input, then from 220 lines to 39 lines, then from 52 lines to 522 lines, then from 524 lines to 229 lines, then from 699 lines to 216 lines, then from 228 lines to 118 lines, then from 131 lines to 665 lines, then from 670 lines to 135 lines, then from 148 lines to 1326 lines, then from 1331 lines to 173 lines, then from 186 lines to 747 lines, then from 752 lines to 498 lines, then from 511 lines to 883 lines, then from 887 lines to 744 lines, then from 757 lines to 844 lines, then from 849 lines to 749 lines, then from 762 lines to 829 lines, then from 834 lines to 761 lines, then from 774 lines to 850 lines, then from 855 lines to 774 lines, then from 787 lines to 837 lines, then from 842 lines to 790 lines, then from 803 lines to 1422 lines, then from 1427 lines to 811 lines, then from 824 lines to 989 lines, then from 995 lines to 57 lines, then from 63 lines to 58 lines *)
(* coqc version 8.19+alpha compiled with OCaml 4.13.1
coqtop version buildkitsandbox:/home/coq/.opam/4.13.1+flambda/.opam-switch/build/coq-core.dev/_build/default,master (61ee398ed32f9334dd664ea8ed2697178e6e3844)
Modules that could not be inlined: Ltac2.Array
Expected coqc runtime on this file: 0.121 sec *)
Require Ltac2.Control.
Module Export Pattern.
Import Ltac2.Init.
Ltac2 Type t := pattern.
Ltac2 Type context.
Ltac2 Type match_kind := [
| MatchPattern
| MatchContext
].
Ltac2 @ external empty_context : unit -> context :=
"coq-core.plugins.ltac2" "pattern_empty_context".
Ltac2 @ external matches_vect : t -> constr -> constr array :=
"coq-core.plugins.ltac2" "pattern_matches_vect".
Ltac2 @ external matches_subterm_vect : t -> constr -> context * constr array :=
"coq-core.plugins.ltac2" "pattern_matches_subterm_vect".
Ltac2 lazy_match0 t pats :=
let rec interp m := match m with
| [] => Control.zero Match_failure
| p :: m =>
let next _ := interp m in
let (knd, pat, f) := p in
let p := match knd with
| MatchPattern =>
(fun _ =>
let context := empty_context () in
let bind := matches_vect pat t in
fun _ => f context bind)
| MatchContext =>
(fun _ =>
let (context, bind) := matches_subterm_vect pat t in
fun _ => f context bind)
end in
Control.plus p next
end in
Control.once (fun () => interp pats) ().
Ltac2 Notation "lazy_match!" t(tactic(6)) "with" m(constr_matching) "end" :=
Pattern.lazy_match0 t m.
Ltac2 shape_to_list (c : constr) : constr list
:= let rec aux c acc
:= lazy_match! c with
| Shape.nil => acc
| Shape.snoc ?cs ?c
=> aux cs (c :: acc)
end in
aux c [].
Intermediate Coq File (useful for debugging if minimization did not go as far as you wanted) (truncated to 8.0KiB; full 45KiB file on GitHub Actions Artifacts under
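The `shape_to_list` tactic quoted above recurses over a `Shape.snoc` spine with `lazy_match!`. The same recursion pattern over ordinary Coq lists, as a standalone sketch (hypothetical names, not part of the repro file):

```coq
(* Sketch: lazy_match!-driven recursion over a constr, mirroring
   shape_to_list above but walking Coq's cons/nil constructors. *)
Require Import Ltac2.Ltac2.
Ltac2 rec constrs_of_list (c : constr) : constr list :=
  lazy_match! c with
  | nil => []
  | cons ?x ?xs => x :: constrs_of_list xs
  end.
```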
Unrelated Not_found: Ltac2 uses hardcoded names instead of some Register-like mechanism, so inlining the Ltac2 files won't work well.
@coqbot minimize coq.dev git clone https://github.com/JasonGross/neural-net-coq-interp.git --branch=zzz-bug-anomaly-search
cd neural-net-coq-interp
make theories/bug_search_anom_not_found_03.vo -j2
Hey @JasonGross, the coq bug minimizer is running your script, I'll come back to you with the results once it's done.
@JasonGross, Minimized File /github/workspace/neural-net-coq-interp/theories/bug_search_anom_not_found_03.v (full log on GitHub Actions)
Minimized Coq File (consider adding this file to the test-suite)
(* -*- mode: coq; coq-prog-args: ("-emacs" "-q" "-w" "+implicit-core-hint-db,+implicits-in-term,+non-reversible-notation,+deprecated-intros-until-0,+deprecated-focus,+unused-intro-pattern,+variable-collision,+unexpected-implicit-declaration,+omega-is-deprecated,+deprecated-instantiate-syntax,+non-recursive,+undeclared-scope,+deprecated-hint-rewrite-without-locality,+deprecated-hint-without-locality,+deprecated-instance-without-locality,+deprecated-typeclasses-transparency-without-locality,-ltac2-missing-notation-var,unsupported-attributes" "-w" "-deprecated-native-compiler-option" "-native-compiler" "no" "-R" "/github/workspace/neural-net-coq-interp/theories" "NeuralNetInterp" "-Q" "/github/workspace/cwd" "Top" "-Q" "/home/coq/.opam/4.13.1+flambda/lib/coq/user-contrib/Bignums" "Bignums" "-Q" "/home/coq/.opam/4.13.1+flambda/lib/coq/user-contrib/Ltac2" "Ltac2" "-top" "NeuralNetInterp.bug_search_anom_not_found_03") -*- *)
(* File reduced by coq-bug-minimizer from original input, then from 220 lines to 39 lines, then from 52 lines to 522 lines, then from 524 lines to 229 lines, then from 699 lines to 216 lines, then from 228 lines to 118 lines, then from 131 lines to 665 lines, then from 670 lines to 135 lines, then from 148 lines to 1326 lines, then from 1331 lines to 173 lines, then from 186 lines to 747 lines, then from 752 lines to 498 lines, then from 511 lines to 883 lines, then from 887 lines to 744 lines, then from 757 lines to 844 lines, then from 849 lines to 749 lines, then from 762 lines to 829 lines, then from 834 lines to 761 lines, then from 774 lines to 850 lines, then from 855 lines to 774 lines, then from 787 lines to 837 lines, then from 842 lines to 790 lines, then from 803 lines to 1422 lines, then from 1427 lines to 811 lines, then from 824 lines to 1082 lines, then from 766 lines to 303 lines, then from 316 lines to 428 lines, then from 433 lines to 315 lines, then from 328 lines to 758 lines, then from 763 lines to 322 lines, then from 335 lines to 937 lines, then from 942 lines to 340 lines, then from 353 lines to 1497 lines, then from 1501 lines to 448 lines, then from 461 lines to 759 lines, then from 764 lines to 452 lines, then from 465 lines to 992 lines, then from 997 lines to 452 lines, then from 465 lines to 786 lines, then from 790 lines to 459 lines, then from 472 lines to 648 lines, then from 653 lines to 482 lines, then from 495 lines to 529 lines, then from 534 lines to 493 lines, then from 506 lines to 539 lines, then from 544 lines to 497 lines, then from 510 lines to 544 lines, then from 549 lines to 502 lines, then from 515 lines to 975 lines, then from 979 lines to 529 lines, then from 542 lines to 646 lines, then from 651 lines to 531 lines, then from 544 lines to 679 lines, then from 684 lines to 530 lines, then from 543 lines to 1306 lines, then from 1308 lines to 560 lines, then from 573 lines to 767 lines, then from 768 lines to 579 lines, then from 592 lines to 663 lines, then from 667 lines to 589 lines, then from 602 lines to 640 lines, then from 645 lines to 591 lines, then from 596 lines to 592 lines *)
(* coqc version 8.19+alpha compiled with OCaml 4.13.1
coqtop version buildkitsandbox:/home/coq/.opam/4.13.1+flambda/.opam-switch/build/coq-core.dev/_build/default,master (61ee398ed32f9334dd664ea8ed2697178e6e3844)
Expected coqc runtime on this file: 1.111 sec *)
Require Coq.Array.PArray.
Require Coq.Floats.Floats.
Require Coq.Numbers.Cyclic.Int63.Sint63.
Axiom proof_admitted : False.
Tactic Notation "admit" := abstract case proof_admitted.
Module Export Hetero.
Definition relation A B := A -> B -> Prop.
Module Export NeuralNetInterp_DOT_Util_DOT_Program_DOT_Basics_DOT_Dependent_WRAPPED.
Module Export Dependent.
Import Coq.Program.Basics.
Declare Scope type_function_scope.
Delimit Scope type_function_scope with type_function.
Definition type_function := Type -> Type.
Notation "F ∘ G" := (fun A : Type => F%type_function (G%type_function A)) : type_function_scope.
End Dependent.
End NeuralNetInterp_DOT_Util_DOT_Program_DOT_Basics_DOT_Dependent_WRAPPED.
Module Export NeuralNetInterp_DOT_Util_DOT_Relations_DOT_Relation_Definitions_DOT_Dependent_WRAPPED.
Module Export Dependent.
Declare Scope dependent_signature_scope.
Definition relation (F : type_function)
:= forall A B, Hetero.relation A B -> Hetero.relation (F A) (F B).
Module Export RelationsNotations.
Delimit Scope dependent_signature_scope with dependent_signature.
Bind Scope dependent_signature_scope with relation.
Notation "R ∘ R'" := (fun (A B : Type) (RAB : Hetero.relation A B) => R%dependent_signature _ _ (R'%dependent_signature A B RAB)) : dependent_signature_scope.
End RelationsNotations.
End Dependent.
End NeuralNetInterp_DOT_Util_DOT_Relations_DOT_Relation_Definitions_DOT_Dependent_WRAPPED.
Module Export Dependent.
Section Proper.
Context {F : Type -> Type}.
Class Proper (R : Dependent.relation F) (m : forall T, F T) : Prop :=
proper_prf : forall {A B} (R0 : Hetero.relation A B), R A B R0 (m A) (m B).
Definition respectful_hetero {A : Type -> Type} {B : forall x, A x -> Type}
(R : Dependent.relation A)
(R' : forall x y, Hetero.relation x y -> forall x' y', Hetero.relation (B x x') (B y y'))
: Dependent.relation (fun T => forall a : A T, B T a).
exact (fun a0 b0 R0 f g => forall x y, R _ _ R0 x y -> R' _ _ R0 _ _ (f x) (g y)).
Defined.
Definition respectful {A B : Type -> Type} (R : Dependent.relation A) (R' : Dependent.relation B) : Dependent.relation (fun T => A T -> B T)
:= Eval cbv [respectful_hetero] in
@respectful_hetero A (fun T _ => B T) R (fun x y R0 _ _ => R' x y R0).
End Proper.
Module Export ProperNotations.
Notation " R ==> R' " := (@respectful _ _ (R%dependent_signature) (R'%dependent_signature))
(right associativity, at level 55) : dependent_signature_scope.
End ProperNotations.
Notation idR := (fun (A B : Type) (R : Hetero.relation A B) => R).
Notation const R := (match _ as F return forall A B, Hetero.relation A B -> Hetero.relation F F with
| F => fun (A B : Type) (_ : Hetero.relation A B) => R
end) (only parsing).
Reserved Infix "::'" (at level 59, left associativity).
Reserved Infix "++'" (at level 59, left associativity).
Reserved Infix "+'" (at level 48, left associativity).
Module Export Option.
Section Relations.
Definition option_eq {A B} eq (x : option A) (y : option B) :=
match x with
| None => y = None
| Some ax => match y with
| None => False
| Some ay => eq ax ay
end
end.
End Relations.
Definition invert_Some {A} (x : option A)
: (if x then A else unit).
Admitted.
#[export] Instance invert_Some_Proper_dep
: Dependent.Proper
(@Dependent.respectful_hetero
(fun T => option T)
(fun T (x : option T) => if x then _ else _)
(@option_eq)
(fun A B R x y => match x, y with
| Some _, Some _ => R
| None, None => fun _ _ => True
| Some _, None | None, Some _ => fun _ _ => False
end))
(@invert_Some).
Admitted.
Fixpoint radd (n m : nat) {struct m} : nat.
exact (match m with
| 0 => n
| S p => S (radd n p)
end).
Defined.
Infix "+'" := radd : nat_scope.
Import Coq.ZArith.ZArith.
Import Coq.Array.PArray.
Class pointed T := point : T.
#[export] Instance default_Z : pointed Z.
Admitted.
#[export] Instance default_array {A} {a : pointed A} : pointed (array A).
Admitted.
Import Coq.Strings.String.
Definition with_default (name : string) {A} (x : A) := A.
#[global] Arguments with_default _ {_} _, _ _ _.
Existing Class with_default.
Ltac fill_default _ :=
lazymatch goal with
| [ |- @with_default ?name ?A ?x ]
=> match goal with
| [ H : @with_default ?name' ?A' _ |- _ ] => constr_eq A A'; constr_eq name name'; fail 1
| _ => exact x
end
end.
#[global] Hint Extern 0 (with_default _ _) => fill_default () : typeclass_instances.
Cumulative Polymorphic Inductive TyList := tynil | tycons (_ : Type) (_ : TyList).
Class has_ltb A := ltb : A -> A -> bool.
Class has_add_with A B C := add : A -> B -> C.
Notation has_add A := (has_add_with A A A).
Class has_sub_with A B C := sub : A -> B -> C.
Notation has_sub A := (has_sub_with A A A).
Class has_mul_with A B C := mul : A -> B -> C.
Notation has_mul A := (has_mul_with A A A).
Class has_opp A := opp : A -> A.
Class has_zero A := zero : A.
Class has_sqrt A := sqrt : A -> A.
Class has_div_by A B C := div : A -> B -> C.
Notation has_div A := (has_div_by A A A).
Class has_exp_to A B := exp : A -> B.
Notation has_exp A := (has_exp_to A A).
Class has_ln_to A B := ln : A -> B.
Notation has_ln A := (has_ln_to A A).
Class has_coer A B := coer : A -> B.
Definition has_coer_to (avoid : TyList) := has_coer.
#[export] Hint Extern 0 (has_coer_to _ _ ?A)
=> lazymatch goal with H : has_coer _ A |- _ => exact H end
: typeclass_instances.
#[export] Instance lift_coer_has_zero {A B} {coerAB : has_coer_to (tycons B tynil) A B} {zeroA : has_zero A} : has_zero B.
Admitted.
#[export] Coercion Z.of_N : N >-> Z.
#[export] Instance Z_has_zero : has_zero Z.
Admitted.
#[export] Coercion Uint63.of_Z : Z >-> Uint63.int.
Module Export PArray.
Import Coq.Numbers.Cyclic.Int63.Sint63.
Definition init_default {A} {default : pointed A} (len : int) (f : int -> A) : array A.
Admitted.
Definition Rank := nat.
Module Type IndexType.
Parameter t : Type.
End IndexType.
Module Type ExtendedIndexType.
Include IndexType.
End ExtendedIndexType.
Module Export IndexGen.
Module Make (IndexType : IndexType).
Notation IndexType := IndexType.t.
Fixpoint t (r : Rank) : Type
:= match r with
| O => unit
| S r => t r * IndexType.t
end.
Notation Index := t.
Definition nil : t 0.
Admitted.
Definition snoc {r} (s : t r) x : t (S r) := (s, x).
Module Import IndexNotations0.
Declare Scope index_scope.
Delimit Scope index_scope with index.
Notation "xs ::' x" := (snoc xs x) : index_scope.
End IndexNotations0.
Definition hd {r : Rank} : Index (S r) -> Index r.
Admitted.
Definition tl {r : Rank} : Index (S r) -> IndexType.
exact (@snd _ _).
Defined.
Fixpoint app {r1 r2 : Rank} {struct r2} : Index r1 -> Index r2 -> Index (r1 +' r2).
exact (match r2 with
| 0%nat => fun sz _tt => sz
| S r2 => fun sz1 sz2 => @app r1 r2 sz1 (hd sz2) ::' tl sz2
end%index).
Defined.
End Make.
Module ExtendedMake (IndexType : ExtendedIndexType).
Include Make IndexType.
End ExtendedMake.
Module Export Shape.
Module ShapeType <: ExtendedIndexType.
Definition t : Type.
exact (int).
Defined.
End ShapeType.
Include IndexGen.ExtendedMake ShapeType.
Declare Scope shape_scope.
Delimit Scope shape_scope with shape.
Bind Scope shape_scope with t.
Notation "xs ::' x" := (snoc xs x) : shape_scope.
Notation "[ ]" := nil : shape_scope.
Notation "[ x ]" := (snoc nil x) : shape_scope.
Notation "[ x ; y ; .. ; z ]" := (snoc .. (snoc (snoc nil x) y) .. z) : shape_scope.
Notation "s1 ++ s2" := (app s1 s2) : shape_scope.
Notation "s1 ++' s2" := (app s1 s2) : shape_scope.
Notation Shape := Shape.Index.
Module Export RawIndex.
Module RawIndexType <: ExtendedIndexType.
Definition t : Type.
exact (int).
Defined.
End RawIndexType.
Include IndexGen.ExtendedMake RawIndexType.
Notation RawIndex := RawIndex.Index.
Monomorphic Definition tensor_of_rank (r : Rank) (A : Type) : Type.
exact (RawIndex r -> A).
Defined.
Monomorphic Definition tensor {r : Rank} (s : Shape r) (A : Type) : Type.
exact (tensor_of_rank r A).
Defined.
Fixpoint concrete_tensor_of_rank (r : Rank) (A : Type) : Type
:= match r with
| O => A
| S r => concrete_tensor_of_rank r (array A)
end.
Definition concrete_tensor {r : Rank} (s : Shape r) (A : Type) : Type.
exact (concrete_tensor_of_rank r A).
Defined.
Fixpoint concretize {r : Rank} {s : Shape r} {A : Type} {default : pointed A} {struct r} : forall (t : tensor s A), concrete_tensor s A
:= match r return forall (s : Shape r) (t : tensor s A), concrete_tensor s A with
| 0%nat => fun _tt f => f tt
| S r
=> fun '(s, len) f
=> concretize (r:=r) (A:=array A) (s:=s) (fun idxs => PArray.init_default len (fun idx => f (idxs, idx)))
end s.
Definition reabstract {r s A} (t_ : unit -> @tensor r s A) (t : @concrete_tensor r s A) : @tensor r s A.
Admitted.
Definition checkpoint {r s A default} t : @tensor r s A
:= let t_ _ := t in
let t := @concretize r s A default t in
reabstract t_ t.
Definition coer_tensor {r s A B} {coerAB : has_coer A B} : @tensor r s A -> @tensor r s B.
Admitted.
#[export] Coercion coer_tensor : tensor >-> tensor.
Module Export NeuralNetInterp_DOT_Torch_DOT_Tensor_DOT_Instances_WRAPPED.
Module Export Instances.
Module Export Tensor.
Definition eqfR_rank {r} : Dependent.relation (@tensor_of_rank r).
exact (fun A B R x y => forall i, R (x i) (y i)).
Defined.
Definition eqfR {r s} : Dependent.relation (@tensor r s).
exact (eqfR_rank).
Defined.
#[global] Arguments eqfR {r s} [A B] R x y.
Notation eqf := (eqfR eq).
End Tensor.
End Instances.
End NeuralNetInterp_DOT_Torch_DOT_Tensor_DOT_Instances_WRAPPED.
Variant NormalizationType := LN .
Module Type CommonConfig.
Parameter d_model : N.
Parameter n_ctx : N.
Parameter d_head : N.
Parameter d_vocab : N.
Parameter n_heads : N.
Parameter normalization_type : with_default "normalization_type" (option NormalizationType) (Some LN).
End CommonConfig.
Import Coq.Floats.Floats.
Definition ln_tensor_gen d_model (nt : with_default "normalization_type" (Some LN)) A
:= (match nt with
| Some LN => tensor [d_model] A
| Datatypes.None => with_default "()" True I
end).
Definition block_params_type_gen n_heads d_model d_head (nt : with_default "normalization_type" (Some LN)) A
:= (
tensor [n_heads; d_model; d_head] A * tensor [n_heads; d_model; d_head] A * tensor [n_heads; d_model; d_head] A * tensor [n_heads; d_model; d_head] A
* tensor [n_heads; d_head] A * tensor [n_heads; d_head] A * tensor [n_heads; d_head] A
* tensor [d_model] A
* ln_tensor_gen d_model nt A
* ln_tensor_gen d_model nt A)%type.
Module Type ExtraConfig (Import cfg : CommonConfig).
Parameter W_E : tensor [d_vocab; d_model] float.
Parameter W_pos : tensor [n_ctx; d_model] float.
Notation ln_tensor A := (ln_tensor_gen d_model normalization_type A).
Notation block_params_type A := (block_params_type_gen n_heads d_model d_head normalization_type A).
Parameter blocks_params : list (block_params_type float).
End ExtraConfig.
Module Type Config := CommonConfig <+ ExtraConfig.
Module Export NeuralNetInterp_DOT_TransformerLens_DOT_HookedTransformer_WRAPPED.
Module Export HookedTransformer.
Module Export TransformerBlock.
Section __.
Context {r} {batch : Shape r}
{pos n_heads d_model d_head} {n_ctx:N}
{use_split_qkv_input : with_default "use_split_qkv_input" bool false}
{normalization_type : with_default "normalization_type" (option NormalizationType) (Some LN)}
(maybe_n_heads := fun b : bool => (if b return Shape (if b then _ else _) then [n_heads] else [])%shape)
(ln_s := (batch ::' pos ++ maybe_n_heads use_split_qkv_input)%shape)
{A}
{zeroA : has_zero A} {coerZ : has_coer Z A}
{addA : has_add A} {subA : has_sub A} {mulA : has_mul A} {divA : has_div A}
{sqrtA : has_sqrt A} {expA : has_exp A}
(defaultA : pointed A := @coer _ _ coerZ point)
{use_checkpoint : with_default "use_checkpoint" bool true}
(W_Q W_K W_V W_O : tensor [n_heads; d_model; d_head] A)
(b_Q b_K b_V : tensor [n_heads; d_head] A)
(b_O : tensor [d_model] A)
(eps : A)
(ln1_w ln1_b ln2_w ln2_b : ln_tensor_gen d_model normalization_type A)
(resid_pre : tensor ((batch ::' pos) ++' [d_model]) A).
Definition attn_only_out : tensor (batch ++ [pos; d_model]) A.
Proof using A W_K W_O W_Q W_V addA b_K b_O b_Q b_V batch coerZ d_head d_model defaultA divA
eps expA ln1_b ln1_w ln_s maybe_n_heads mulA n_ctx n_heads normalization_type pos r
resid_pre sqrtA subA use_checkpoint use_split_qkv_input zeroA
.
Admitted.
Definition attn_masked_attn_scores : tensor (batch ::' n_heads ::' pos ::' pos) A.
Proof using A W_K W_Q addA b_K b_Q batch coerZ d_head d_model defaultA divA eps ln1_b ln1_w
ln_s maybe_n_heads mulA n_ctx n_heads normalization_type pos r resid_pre sqrtA subA
use_checkpoint use_split_qkv_input zeroA
.
Admitted.
End __.
End TransformerBlock.
Module Export HookedTransformer.
Section __.
Context {d_vocab d_vocab_out n_heads d_model d_head} {n_ctx:N}
{r} {batch : Shape r} {pos}
(s := (batch ::' pos)%shape)
(resid_shape := (s ::' d_model)%shape)
{normalization_type : with_default "normalization_type" (option NormalizationType) (Some LN)}
{A}
{zeroA : has_zero A} {coerZ : has_coer Z A}
{addA : has_add A} {subA : has_sub A} {mulA : has_mul A} {divA : has_div A}
{sqrtA : has_sqrt A} {expA : has_exp A}
(defaultA : pointed A := @coer _ _ coerZ point)
{use_checkpoint : with_default "use_checkpoint" bool true}
(eps : A)
(W_E : tensor [d_vocab; d_model] A)
(W_pos : tensor [n_ctx; d_model] A)
(blocks_params
: list (block_params_type_gen n_heads d_model d_head normalization_type A)
)
(ln_final_w ln_final_b : ln_tensor_gen d_model normalization_type A)
(W_U : tensor [d_model; d_vocab_out] A) (b_U : tensor [d_vocab_out] A)
.
#[local] Existing Instance defaultA.
Definition blocks : list (tensor resid_shape A -> tensor resid_shape A).
exact (List.map
(fun '(W_Q, W_K, W_V, W_O,
b_Q, b_K, b_V,
b_O,
ln1_w, ln1_b)
=> TransformerBlock.attn_only_out
(n_ctx:=n_ctx)
W_Q W_K W_V W_O
b_Q b_K b_V
b_O
eps
ln1_w ln1_b)
blocks_params).
Defined.
Polymorphic Definition blocks_cps {T} {n : with_default "blocks n" nat (List.length blocks)} (residual : tensor resid_shape A) (K : tensor resid_shape A -> T) : T.
admit.
Defined.
Definition resid_postembed (tokens : tensor s IndexType) : tensor resid_shape A.
Proof using A W_E W_pos addA batch coerZ d_model d_vocab defaultA n_ctx pos r resid_shape s
use_checkpoint.
Admitted.
Definition blocks_attn_masked_attn_scores
: list (tensor resid_shape A -> tensor (batch ::' n_heads ::' pos ::' pos) A).
exact (List.map
(fun '(W_Q, W_K, W_V, W_O,
b_Q, b_K, b_V,
b_O,
ln1_w, ln1_b)
=> TransformerBlock.attn_masked_attn_scores
(n_ctx:=n_ctx)
W_Q W_K
b_Q b_K
eps
ln1_w ln1_b)
blocks_params).
Defined.
Definition masked_attn_scores (n : nat) (tokens : tensor s IndexType)
: option (tensor (batch ::' n_heads ::' pos ::' pos) A).
exact (match List.nth_error blocks_attn_masked_attn_scores n with
| Some block_n_attn_masked_attn_scores
=> Some (let residual := resid_postembed tokens in
blocks_cps
(n:=Nat.pred n)
residual
(fun residual
=> checkpoint (block_n_attn_masked_attn_scores residual)))
| None => None
end).
Defined.
End __.
End HookedTransformer.
End HookedTransformer.
End NeuralNetInterp_DOT_TransformerLens_DOT_HookedTransformer_WRAPPED.
Module Export Instances.
Module Export HookedTransformer.
Definition block_params_type_genR n_heads d_model d_head nt : Dependent.relation (block_params_type_gen n_heads d_model d_head nt).
Admitted.
Module Export HookedTransformer.
#[export] Instance masked_attn_scores_Proper_dep {d_vocab n_heads d_model d_head n_ctx r batch pos normalization_type}
: Dependent.Proper
(Dependent.idR
==> (Dependent.const eq ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR)
==> Dependent.const (fun _ _ => True)
==> Dependent.idR
==> Tensor.eqfR
==> Tensor.eqfR
==> List.Forall2 ∘ @block_params_type_genR n_heads d_model d_head normalization_type
==> Dependent.const eq
==> Dependent.const Tensor.eqf
==> @option_eq ∘ Tensor.eqfR)
(@HookedTransformer.HookedTransformer.masked_attn_scores d_vocab n_heads d_model d_head n_ctx r batch pos normalization_type).
Admitted.
End HookedTransformer.
End HookedTransformer.
End Instances.
Module Export Parameters.
Local Open Scope float_scope.
Module cfg <: CommonConfig.
Definition d_model := 32%N.
Definition n_ctx := 2%N.
Definition d_head := 32%N.
Definition n_heads := 1%N.
Definition d_vocab := 64%N.
Definition eps := 1e-05.
Definition normalization_type := Some LN.
End cfg.
End Parameters.
Import Coq.Lists.List.
Module Model (cfg : Config).
Module Export HookedTransformer.
Section __.
Context {r} {batch : Shape r} {pos}
(s := (batch ::' pos)%shape)
(resid_shape := (s ::' cfg.d_model)%shape)
{A} {coer_float : has_coer float A} {coerZ : has_coer Z A}
{addA : has_add A} {subA : has_sub A} {mulA : has_mul A} {divA : has_div A}
{sqrtA : has_sqrt A} {expA : has_exp A}
{use_checkpoint : with_default "use_checkpoint" bool true}.
Let coerA' (x : float) : A.
Admitted.
#[local] Coercion coerA' : float >-> A.
Let coer_ln_tensor : cfg.ln_tensor float -> cfg.ln_tensor A.
Admitted.
Definition coer_blocks_params
:= List.map
(fun '((W_Q, W_K, W_V, W_O,
b_Q, b_K, b_V, b_O,
ln1_w, ln1_b) : cfg.block_params_type float)
=> ((W_Q:tensor _ A), (W_K:tensor _ A), (W_V:tensor _ A), (W_O:tensor _ A),
(b_Q:tensor _ A), (b_K:tensor _ A), (b_V:tensor _ A), (b_O:tensor _ A),
coer_ln_tensor ln1_w, coer_ln_tensor ln1_b)).
Definition masked_attn_scores (n : nat) (tokens : tensor s IndexType)
: option (tensor (batch ::' cfg.n_heads ::' pos ::' pos) A).
exact (HookedTransformer.HookedTransformer.masked_attn_scores
                 (A:=A) (n_ctx:=cfg.n_ctx) (normalization_type:=cfg.normalization_type) cfg.eps
cfg.W_E cfg.W_pos
(coer_blocks_params cfg.blocks_params)
n tokens).
Defined.
End __.
End HookedTransformer.
End Model.
Import ListNotations.
Module cfg <: Config.
Include Parameters.cfg.
Parameter W_E : tensor [d_vocab; d_model] float.
Parameter W_pos : tensor [n_ctx; d_model] float.
Notation block_params_type A := (block_params_type_gen n_heads d_model d_head normalization_type A).
Parameter block_params : block_params_type float.
Definition blocks_params := [block_params].
End cfg.
Include Model cfg.
Section with_batch.
Context {r} {batch : Shape r} {pos}
(s := (batch ::' pos)%shape)
(resid_shape := (s ::' cfg.d_model)%shape)
{return_per_token : with_default "return_per_token" bool false}
{A} {coer_float : has_coer float A} {coerZ : has_coer Z A}
(defaultA : pointed A := @coer _ _ coerZ point)
{addA : has_add A} {subA : has_sub A} {mulA : has_mul A} {divA : has_div A}
{ltbA : has_ltb A}
{oppA : has_opp A} {sqrtA : has_sqrt A} {expA : has_exp A} {lnA : has_ln A}
{use_checkpoint : with_default "use_checkpoint" bool true}.
Definition masked_attn_scores (tokens : tensor s IndexType)
: tensor (batch ::' cfg.n_heads ::' pos ::' pos) A.
exact (Option.invert_Some
(HookedTransformer.masked_attn_scores (A:=A) 0 tokens)).
Defined.
End with_batch.
#[export] Instance masked_attn_scores_Proper_dep {r batch pos}
: Dependent.Proper
((Dependent.const eq ==> Dependent.idR)
==> (Dependent.const eq ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR)
==> (Dependent.idR ==> Dependent.idR)
==> Dependent.const (fun _ _ => True)
==> Dependent.const Tensor.eqf
==> Tensor.eqfR)
(@masked_attn_scores r batch pos).
Proof.
cbv [masked_attn_scores].
pose proof (@HookedTransformer.HookedTransformer.masked_attn_scores_Proper_dep) as H.
repeat intro.
repeat (let v := open_constr:(_) in specialize (H v)).
move H at bottom.
revert H.
lazymatch goal with
| [ |- ?R _ _ ?R'' ?x ?y -> ?R' (invert_Some ?x' ?i) (invert_Some ?y' ?i) ]
=> unify x x'; unify y y'; unify R'' R'; set (x'':=x); set (y'':=y);
intro H;
refine (@invert_Some_Proper_dep _ _ (Tensor.eqfR R') x y H i)
end.
Unshelve.
  Search HookedTransformer.block_params_type_genR.
Intermediate Coq File (useful for debugging if minimization did not go as far as you wanted)
Build Log (contains the Coq error message) (truncated to last 8.0KiB; full 61KiB file on GitHub Actions Artifacts under
We pass sigma everywhere env is already passed. This is probably overkill, since most of the time, env and sigma do not change, but since it is already done for env, we imitate.
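The comment above describes the shape of the fix: thread the evar map (`sigma`) through every function that already receives `env`, so evar lookups never fall back to a partial lookup that can raise `Not_found`. A minimal OCaml sketch of that pattern, with simplified stand-in types (none of these names are Coq's actual API):

```ocaml
(* Illustrative sketch only: simplified stand-ins, not Coq's real types. *)
module Evd = struct
  (* Stand-in for the evar map carried by sigma. *)
  type evar_map = (int * string) list
  let find (sigma : evar_map) (ev : int) : string = List.assoc ev sigma
end

type env = unit  (* stand-in for the typing environment *)

(* Passing sigma alongside env lets the printer consult the evar map
   directly and handle an undefined evar gracefully, instead of letting
   a raw Not_found escape as an anomaly. *)
let pr_evar (_env : env) (sigma : Evd.evar_map) (ev : int) : string =
  match Evd.find sigma ev with
  | ty -> Printf.sprintf "?X%d : %s" ev ty
  | exception Not_found -> Printf.sprintf "?X%d" ev
```

As the comment notes, `env` and `sigma` rarely change between calls, so this is mostly mechanical plumbing; the payoff is that callers can no longer reach an evar lookup without the map in hand.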
In the minimized example, the problem is caused by an existential variable in the hypothesis.
An alternative shorter example, not failing though, is:
which returns
instead of
Or also:
which shows
Fixed in #17987.
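The shorter snippets referred to above were elided from this page. Purely as a hedged illustration of the kind of proof state the comment describes (not the original examples), a context containing an unresolved existential variable at the time `Search` runs looks like:

```coq
(* Hypothetical sketch, not the reproduction from this issue: *)
Goal True.
Proof.
  evar (n : nat).   (* introduces an undefined evar ?n into the context *)
  Search Nat.add.   (* Search now runs with an evar in scope *)
```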
Description of the problem
Coq Version
8.19+alpha 61ee398