
Unify hash Base encoding/decoding functions #87

Merged
8 commits merged on Dec 17, 2020

Conversation

Anton-Latukha (Contributor)

Minimal start on refactoring the module interface step-by-step towards Cryptonite migration.

@sorki (Member) commented Dec 14, 2020

Looks reasonable - maybe we can drop encodeBaseXY altogether and use encodeIn, decode instead (encodeBase | decodeBase sounds even better).

There are only a few places that use *codeBase16 *codeBase32 outside of the module.

Can you please rebase on master? It needs a qualified B32 import added in the tests.
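A rough sketch of what the unified interface proposed above could look like, with stand-in types and stub bodies; encodeIn / decodeBase follow the names floated in this thread, and the real Digest and encoding routines live in hnix-store-core and cryptonite, so treat this as an illustration rather than the merged code:

{-# LANGUAGE OverloadedStrings #-}

import Data.ByteString (ByteString)
import Data.Text (Text)

-- Stand-in for the real Digest type.
newtype Digest a = Digest ByteString

-- Instead of one pair of functions per base, e.g.
--   encodeBase16, encodeBase32 :: Digest a -> Text
--   decodeBase16, decodeBase32 :: Text -> Either String (Digest a)
-- the base becomes an ordinary value argument:

data BaseEncoding = Base16 | Base32

encodeIn :: BaseEncoding -> Digest a -> Text
encodeIn Base16 _ = "<hex digits>"   -- stub: the real code hex-encodes the digest bytes
encodeIn Base32 _ = "<nix base32>"   -- stub: the real code uses Nix's base32 alphabet

decodeBase :: BaseEncoding -> Text -> Either String (Digest a)
decodeBase _ _ = Left "stub decoder" -- stub: the real code parses back into a Digest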

sorki added a commit to sorki/hnix-store that referenced this pull request Dec 14, 2020
@sorki (Member) commented Dec 14, 2020

After the rebase you can cherry-pick sorki@b26a815.

Anton-Latukha pushed a commit to Anton-Latukha/hnix-store that referenced this pull request Dec 15, 2020
Anton-Latukha pushed a commit to Anton-Latukha/hnix-store that referenced this pull request Dec 16, 2020
@Anton-Latukha Anton-Latukha force-pushed the 2020-12-14-uni-encode-decode branch 2 times, most recently from 17b8cac to c2c3a4b Compare December 16, 2020 10:23
shouldBe (encodeBase32 @StorePathHashAlgo (hash exampleStr))
shouldBe ((encodeIn Base32) @StorePathHashAlgo (hash exampleStr))
@Anton-Latukha (Contributor, Author) commented Dec 16, 2020


Interesting encounter:

-- Old code:
encodeBase32 :: Digest a -> T.Text
-- New code:
encodeIn Base32 :: Digest a -> T.Text

Receives:

tests/Hash.hs:44:17: error:
    • Cannot apply expression of type ‘Digest a0
                                       -> Data.Text.Internal.Text’
      to a visible type argument ‘StorePathHashAlgo’
    • In the first argument of ‘shouldBe’, namely
        ‘((encodeIn Base32) @StorePathHashAlgo (hash exampleStr))’
      In a stmt of a 'do' block:
        shouldBe
          ((encodeIn Base32) @StorePathHashAlgo (hash exampleStr))
          "xv2iccirbrvklck36f1g7vldn5v58vck"
      In the second argument of ‘($)’, namely
        ‘do let exampleStr
                  = "source:sha256:2bfef67de873c54551d884fdab3055d84d573e654efa79db3"
                      <> "c0d7b98883f9ee3:/nix/store:myfile"
            shouldBe
              ((encodeIn Base32) @StorePathHashAlgo (hash exampleStr))
              "xv2iccirbrvklck36f1g7vldn5v58vck"’
   |
44 |       shouldBe ((encodeIn Base32) @StorePathHashAlgo (hash exampleStr))
   |                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

I need to look up more about what this is; on the face of it, it seems really strange.

@sorki (Member)

Type application @StorePathHashAlgo has to be the first "argument".

Your type is encodeIn :: BaseEncoding -> Digest a -> T.Text, which due to ScopedTypeVariables (which implies ExplicitForAll) gets rewritten to encodeIn :: forall a . BaseEncoding -> Digest a -> T.Text, and with TypeApplications you need to apply the algorithm to that type variable so it's not ambiguous.
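A minimal, self-contained sketch of that point, with stand-in definitions (the real Digest, hash and encodeIn live in hnix-store-core / cryptonite, so everything below is a placeholder):

{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeApplications #-}

import Data.Text (Text)
import qualified Data.Text as T

data BaseEncoding = Base16 | Base32
newtype Digest a = Digest Text
data StorePathHashAlgo

-- The implicit `forall a.` sits in front of *all* value arguments:
encodeIn :: forall a. BaseEncoding -> Digest a -> Text
encodeIn _ (Digest t) = t

hashStub :: Text -> Digest a
hashStub = Digest

-- OK: the visible type application fills the leading `forall a.`:
good :: Text
good = encodeIn @StorePathHashAlgo Base32 (hashStub (T.pack "example"))

-- Rejected: after `encodeIn Base32` the forall is already instantiated,
-- so there is no visible type argument left for @StorePathHashAlgo:
-- bad = (encodeIn Base32) @StorePathHashAlgo (hashStub (T.pack "example"))

main :: IO ()
main = print good

Binding the sectioned form in a where clause with its own explicit signature also works, which is the workaround mentioned further down.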

@Anton-Latukha (Contributor, Author)

It is probably obvious to you, but for me it is completely obscure and needs reading.

Given the promise of functions being first-class, I expected the equivalent sectioned application to behave equivalently. Thanks for the pointer in the right direction to understand this.
I definitely need to look deeper into quantifiers and types.

Currently the easiest/prettiest solution is to move the sectioned application into a where binding 🤣

@Anton-Latukha (Contributor, Author)

Ok.

I got the interplay between the language extensions and quantifiers.

I forgot what ScopedTypeVariables does, though the name is instructive once one remembers.

The caveat seems to be that visible type application works on the invisible type arguments that GHC exposes for a function with a declared or annotated signature; otherwise GHC does not give access to the invisible type argument that comes before the value arguments.
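One way to read that caveat, as a minimal illustrative sketch (toy functions, not from the PR): visible type application only targets type variables that come from a written signature ("specified" variables), not ones GHC merely infers.

{-# LANGUAGE TypeApplications #-}

-- With a signature, the implicit `forall a.` is specified and can be applied:
idSpecified :: a -> a
idSpecified x = x

ok :: Int
ok = idSpecified @Int 1

-- Without a signature the type variable is only inferred, and
-- `idInferred @Int 1` is rejected with an error much like the one quoted
-- above ("Cannot apply expression ... to a visible type argument"):
idInferred x = x

main :: IO ()
main = print ok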

@sorki (Member)

Cool, I've learned about this while working on this codebase and toying with polysemy and my toy polytype lib on top of it. It takes some time to get used to type level programming in Haskell :-)

@sorki (Member)

Type level programming in Haskell:

Me: So how many extensions do you want?
GHC: Yes.

@Anton-Latukha (Contributor, Author)

TypeApplications with DataKinds is a strange beast.

I want my kinds to be * and * -> *, and even better sometimes polymorphic, so that * accepts everything, not suddenly HashAlgorithm. What even is a HashAlgorithm kind, never mind the harder cases? When types suddenly are not types and kinds suddenly are not kinds, how should I think about them?

DataKinds is an extension from the dependently-typed lambda calculus space; btw, "Giving Haskell a Promotion" argues against dragging in all the functionality of dependent types.

I am OK and cool with extensions native to the lambda calculus field.
And as I see it, dependent-type extensions should be used one, or a few, at a time, in an enclosed environment. If someone somewhere wants to go full dependent types, they should do that in a self-enclosed environment so the interface is classic Haskell (typed parametric lambda calculus). Because DT extensions can easily slip across the project, and suddenly the code is harder and becomes rigid and not refactorable, and refactoring becomes not evolution but a ripping-out of that system.
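For reference, a minimal sketch of the DataKinds promotion being discussed, with illustrative names (the real hnix-store / cryptonite types are richer than this):

{-# LANGUAGE DataKinds #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE OverloadedStrings #-}

import Data.ByteString (ByteString)

-- An ordinary data declaration ...
data HashAlgorithm = MD5 | SHA1 | SHA256

-- ... gets promoted: HashAlgorithm also becomes a kind, and MD5 / SHA1 / SHA256
-- become types of that kind, so a digest carries its algorithm in the type:
newtype Digest (a :: HashAlgorithm) = Digest ByteString

sha256Digest :: Digest 'SHA256
sha256Digest = Digest "stub digest bytes"

main :: IO ()
main = pure ()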

@Anton-Latukha (Contributor, Author) commented Dec 18, 2020

Also, many of the extensions are shown and discussed as copies of one trivial spiel delivered in a lab coat.
And nobody has explained what the extension can actually be effective for in real-life application: which fields of application are well known to gain a lot from it, and what the classic cases are where TypeApplications with DataKinds really make code more performant, and to what degree. I understand that it allows moving some computation to compile time, but a newcomer needs to know how much, compared to the cost of the introduced code rigidity and complexity. And it is impolite to ask the implementation author about it when the newcomer could read it in the newspaper (the documentation).

@sorki (Member) commented Dec 16, 2020

I would prefer the encodeBase | decodeBase naming; decode is rather ambiguous (e.g. it is used by Cereal and tons of other libs).

@Anton-Latukha (Contributor, Author)

I agree about encode/decode being too generic here.

The root cause for me is that encode/decode are general names, but here they suddenly do only the Base encoding.

And for functions whose purpose is ambiguous from the name (and from the type), I like to indicate the direction. While decodeBase is grammatically fine, encodeBase reads as "encode the base" rather than "encode into the base"; that is why I use encodeIntoBase, or the more pidgin-minimalistic encodeInBase. It creates a minor inconvenience for the writer, but it gives ease and correct understanding when reading, so I tend to add that to remove the ambiguity.

Anton-Latukha pushed a commit to Anton-Latukha/hnix-store that referenced this pull request Dec 16, 2020
@Anton-Latukha Anton-Latukha changed the title Unify hash encode/decode Unify hash encode/decode into Base encoding Dec 16, 2020
Anton-Latukha pushed a commit to Anton-Latukha/hnix-store that referenced this pull request Dec 16, 2020
@Anton-Latukha Anton-Latukha force-pushed the 2020-12-14-uni-encode-decode branch 3 times, most recently from ffdbcd1 to 3551815 Compare December 16, 2020 22:40
Anton-Latukha pushed a commit to Anton-Latukha/hnix-store that referenced this pull request Dec 16, 2020
Anton-Latukha pushed a commit to Anton-Latukha/hnix-store that referenced this pull request Dec 16, 2020
Anton-Latukha pushed a commit to Anton-Latukha/hnix-store that referenced this pull request Dec 16, 2020
Anton-Latukha pushed a commit to Anton-Latukha/hnix-store that referenced this pull request Dec 16, 2020
@Anton-Latukha (Contributor, Author)

This seems good enough.

@Anton-Latukha Anton-Latukha changed the title Unify hash encode/decode into Base encoding Unify hash Base encoding/decoding functions Dec 17, 2020
@sorki (Member) left a comment

Looking good, just two nitpicks.

hnix-store-remote/src/System/Nix/Store/Remote.hs (outdated, resolved)
hnix-store-remote/src/System/Nix/Store/Remote/Parsers.hs (outdated, resolved)
Anton-Latukha and others added 2 commits December 17, 2020 13:05
Co-authored-by: Richard Marko <srk@48.io>
Co-authored-by: Richard Marko <srk@48.io>
@sorki (Member) left a comment

Thank you!

@sorki sorki merged commit ba166a5 into haskell-nix:master Dec 17, 2020
@Anton-Latukha (Contributor, Author)

Thank you also.

@Anton-Latukha Anton-Latukha deleted the 2020-12-14-uni-encode-decode branch December 17, 2020 11:21