
Protocols should double as first-class functions #45

Open
mlanza opened this issue May 10, 2023 · 4 comments
@mlanza

mlanza commented May 10, 2023

In #43 I asked how the syntax for invoking protocols would look.

The answer provided was:

stooges[Functor.map](stooge => stooge.toUpperCase())

This is unpleasant to look at. I find it awkward (and metaprogrammery) to reach inside an object with a symbol to pull out a method and invoke it. I agree it has to happen (this is similar to what I do in my library, only there it's under the hood), but I don't want to write/read that aesthetically displeasing syntax. I want it to remain an implementation detail.

I'm not aware of any languages with first-class protocols necessitating this kind of chicanery. Most invoke them as ordinary functions. Having implemented and used protocols (in JavaScript!) for nearly a decade, this is how I invoke them:

Functor.map(stooges, stooge => stooge.toUpperCase()) //improved readability!

When something gets promoted into the language proper, one of the usual gains is integrated syntax. Partial application and pipelines exemplify this. I use both regularly today with subpar syntax.

Records and tuples are possible today with no changes in the language:

const della = Record({
  name: 'Della',
  children: Tuple([
    Record({
      name: 'Huey',
    }),
    Record({
      name: 'Dewey',
    }),
    Record({
      name: 'Louie',
    }),
  ]),
});

But they are changing the language so that we can have:

const della = #{
  name: 'Della',
  children: #[
    #{
      name: 'Huey',
    },
    #{
      name: 'Dewey',
    },
    #{
      name: 'Louie',
    },
  ],
};

This further illustrates how syntax, good or bad, either spurs or hinders adoption.

The respondent to my original question states:

It's similar to how you can use, for example, Symbol.iterator by doing myIterable[Symbol.iterator]().

Reaching into an object with a symbol and then invoking it might be bearable if it were only done occasionally, but protocols are not a once-in-a-while kind of thing. I use protocols interchangeably with functions (i.e. all the time!). I don't make a distinction between them. Both are just functions. I even export them as such:

export const map = Functor.map; 

so that I can

map(stooges, stooge => stooge.toUpperCase())

The beauty is you can start off by implementing a function and later, when the use case appears, promote it to a protocol. You don't have to prematurely decide you need a protocol for some situation. This follows the usual advice about not too quickly deciding that.
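This promotion path can be sketched in plain JavaScript today. The names below (`upper`, `upperSym`) are hypothetical and not from the proposal; the point is only that call sites never learn which flavor of function they are calling:

```javascript
// Hypothetical sketch: `upper` starts life as an ordinary function...
let upper = (x) => x.toUpperCase();

// ...and is later promoted to symbol-based protocol dispatch,
// with no change required at any call site.
const upperSym = Symbol('upper');
String.prototype[upperSym] = function () { return this.toUpperCase(); };
upper = (x) => x[upperSym]();

upper('hi'); // "HI" before and after the promotion
```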

The dev importing your protocol need not even know it's anything but an ordinary function! He shouldn't have to care whether it's an ordinary function, a composition, a higher-order function, a multimethod, or a protocol. If it's a function it should interface and act like a first-class function. And it should be possible to pass it around as a value.

But when a protocol is invoked via symbol it no longer can be. It loses its first-class function status. This is a serious design mistake.

function groupBy(xs, f){
  const groups = new Map();
  for (const x of xs) {
    const key = f(x); // `f(x)` is called here; if `x[f]()` were required of protocols, they could not be passed as `f`
    groups.set(key, (groups.get(key) ?? []).concat([x]));
  }
  return groups;
}

If there were no distinction between flavors of functions (protocol vs. ordinary), then f could be either, and that's ideal. I mean, you wouldn't want to exclude a whole class of functions from being passed around as values.

Furthermore, you'd want to be able to use the underlying flavors of functions interchangeably, e.g. to keep the choice a private implementation detail.

Protocols are more akin to functions than methods, as their first-class nature shows. You'd absolutely want to be able to pass them around and use them in all the spots where you might pass in a vanilla function. TypeScript's types and interfaces are fine as methods as they're suited to OOP, whereas protocols are more suited to FP as I've previously elaborated.

Interfaces make the object the subject whereas protocols make the function the subject.

This is a key distinction.

//protocol
Functor.map(stooges, stooge => stooge.toUpperCase()) // kingdom of verbs

//method/interface
stooges.map(stooge => stooge.toUpperCase()) // kingdom of nouns

JavaScript (via TypeScript and/or duck typing) already has interface polymorphism. What it lacks is function polymorphism. Protocols can afford this, enabling a different (FP-improved) style of programming, by allowing protocols to be treated as functions rather than methods. Clojure popularized protocols, and this is how it uses them and where much of their power comes from.
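A minimal sketch of that distinction, using a hand-rolled symbol rather than the proposal's machinery: the protocol function is the entry point, and it dispatches on whatever implementation its first argument carries.

```javascript
// Illustrative only: `mapSym` and `map` are hypothetical names.
const mapSym = Symbol('Functor.map');

// the function is the subject; the object merely supplies an implementation
const map = (coll, f) => coll[mapSym](f);

Array.prototype[mapSym] = function (f) { return this.map(f); };
Set.prototype[mapSym] = function (f) { return new Set([...this].map(f)); };

map(['a', 'b'], (s) => s.toUpperCase()); // ['A', 'B']
map(new Set([1, 2]), (n) => n * 2);      // Set {2, 4}
```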

@michaelficarra
Member

michaelficarra commented May 10, 2023

To be fair, in the same way that you do

export const map = Functor.map;

you can do

export const map = (target, ...args) => target[Functor.map](...args);

Nevertheless, I share your view that we should make the calling convention as nice as possible. Indeed, Functor.map(target, ...args) is nicer than doing the indirection yourself at each call site.

We could have protocols provide such functions, but that namespace is currently used for referring to the symbols that underpin the protocol (the plumbing, essentially). They'll need to go somewhere, which will make manually wiring up that plumbing slightly more cumbersome. But that use case is far less common/important than invoking a protocol, so if you have an idea for how one might access the protocol's symbols, I'd be open to repurposing the namespace on the protocol for these helpers that do the indirection for you (and possibly solving #43 at the same time).

Maybe protocols need a meta-protocol for accessing them? Protocol.symbolFor(Functor, 'map'). Functor[Protocol.symbolFor(Protocol, 'symbolFor')]('map') lol.
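For illustration, one possible shape of that Protocol.symbolFor idea, sketched with a private side table. This is entirely hypothetical and not part of the proposal; `Protocol.define` is an invented helper standing in for whatever declaration syntax the proposal lands on.

```javascript
// Hypothetical: protocols keep their symbols in a private side table,
// while the public namespace holds callable helpers that do the indirection.
const symbolTables = new WeakMap();

const Protocol = {
  define(name, methods) {
    const proto = {};
    const symbols = {};
    for (const method of methods) {
      const sym = Symbol(`${name}.${method}`);
      symbols[method] = sym;
      // the public property is a plain function callers can pass around
      proto[method] = (target, ...args) => target[sym](...args);
    }
    symbolTables.set(proto, symbols);
    return proto;
  },
  // the "meta-protocol": recover the symbol for manual wiring
  symbolFor(proto, method) {
    return symbolTables.get(proto)[method];
  },
};

const Functor = Protocol.define('Functor', ['map']);
Array.prototype[Protocol.symbolFor(Functor, 'map')] = function (f) { return this.map(f); };
Functor.map([1, 2, 3], (n) => n + 1); // [2, 3, 4]
```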

@bakkot

bakkot commented May 10, 2023

Re: calling convention, we could have Functor.map be a function object which just invokes the symbol-named method on its first argument, and also give the function object a Symbol.toPrimitive method which returns the symbol for the protocol, which would let you use either style:

// this stuff gets done for you when you define a `Functor` protocol with a `map` method:
let functorMapSymbol = Symbol('Functor.map');
let Functor = {};
Functor.map = function(rec, ...args){ return rec[functorMapSymbol](...args); };
Functor.map[Symbol.toPrimitive] = () => functorMapSymbol;

// this is how you implement the protocol:
let o = {
  [Functor.map](arg) {
    console.log(arg);
  },
};

// usage:
o[Functor.map]('hi');

// alternative usage:
Functor.map(o, 'hi');

I kind of hate this idea but it's also kind of amazing.

@mlanza
Author

mlanza commented May 10, 2023

I really appreciate your willingness to consider the angles. Thank you for being open.

I'm not just offering up the first ideas which occurred to me. I don't think it's a reach, based on my decade of experience with protocols in JS, to say I have more experience and unique qualifications in this arena than the vast majority of JS devs.

Now, I'm not saying I'm the best programmer. That's why I was excited to see this proposal. I'm eager to have better programmers implementing it. I want this proposal to be as strong as I know it can be. Also, I know some of my suggestions are counterintuitive in JS, where kingdom-of-nouns thinking is normative.

I largely modeled my implementation after ClojureScript's which, I think, is a pretty excellent implementation (including satisfies, reify among other details). It allows some protocols to remain private implementation details.

For example, my reduce protocol has this signature (with the subject holding the first slot, as all protocols require), but it's not the primary export for reduce.

IReducible.reduce(coll, f, init)

Rather, I export another function with a different arrangement of args which itself defers to the protocol, just as ClojureScript does. It routinely makes sense to do this. And this is beautiful because you could decide on a different shape and/or optimization for the exported function. That's another reason protocols and functions ought to be interchangeable. What you're passing around is a function interface, irrespective of its current (or future) implementation details!

reduce(f, init, coll)
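The pattern described above can be sketched as follows. This is a hand-rolled approximation of my library's arrangement, not its actual code; `reduceSym` is an illustrative name.

```javascript
const reduceSym = Symbol('IReducible.reduce');

// the protocol keeps the subject-first shape all protocols require
const IReducible = {
  reduce: (coll, f, init) => coll[reduceSym](f, init),
};

// arrays implement the protocol
Array.prototype[reduceSym] = function (f, init) { return this.reduce(f, init); };

// the primary export rearranges the args and defers to the protocol,
// leaving room to change its shape or optimize it later
// (in a module this would be `export const reduce = ...`)
const reduce = (f, init, coll) => IReducible.reduce(coll, f, init);

reduce((a, b) => a + b, 0, [1, 2, 3]); // 6
```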

To your question, it's been a while since I looked at all my low-level protocol implementation details, but I recommend you look at my library. A lot has changed in EcmaScript since I implemented protocols, so I've undoubtedly missed some opportunities to refactor.

Like your proposal, I use symbols but not in the same way. (I think I used them as class privates before class privates existed.) Run its tests in your browser from a static server and set some breakpoints. You'll quickly see how I implement the details (including nil-punning).

I'm not the most adept programmer (esp. with performance concerns) but what I have works and appears in a lot of shipped-to-production code. I even handle prototype chains which is, I thought, one of your outstanding questions.

It really is a very big deal to be able to pass protocols around as ordinary functions. The community really ought not miss this.

@ljharb
Member

ljharb commented May 10, 2023

Alternatively Functor.map could be a function that Symbol.toPrimitives to the symbol :-p
