how to update APIs for generics #48287

rsc announced in Discussions
Sep 9, 2021 · 38 comments · 183 replies

@ianlancetaylor, @griesemer, and I are wondering about different possible plans for updating APIs that would have used generics but currently use interface{}, and we wanted to cast a wider net for ideas.

There exist types and functions that clearly would use generics if we wrote them today. For example:

  • type sync.Pool would naturally be sync.Pool[T]
  • type sync.Map would naturally be sync.Map[K, V]
  • type atomic.Value would naturally be atomic.Value[T]
  • type list.List would naturally be list.List[T]
  • func math.Abs would naturally be math.Abs[T]
  • func math.Min would naturally be math.Min[T]
  • func math.Max would naturally be math.Max[T]

There are certainly more of these, both in Go repos and everyone else's code.
The question is what should be the established convention for updating them.

One suggestion is to adopt an "Of" suffix, as in PoolOf[T], MinOf[T], and so on.
For types, which require the parameter list, that kind of works.
For functions, which will often infer it, the result is strange:
there is no obvious difference between math.MinOf(1, 2) and math.Min(1, 2).
Any new, from-scratch, post-generics API would presumably not include the Of - Tree[K,V], not TreeOf[K,V] -
so the "Of" in these names would be a persistent awkward reminder of our pre-generics past.

Another possibility is to adopt some kind of notation for default type parameters.
For example, suppose you could write

type List[E any (= interface{})] struct { ... }

func Min[T comparable (= float64)](x, y T) T { ... }

The (= ...) sets the default for a type parameter.
The rule for types could be that the bracketed type parameter list may be omitted when all parameters have defaults,
so saying List would be equivalent to List[interface{}], which, if we are careful, would be identical to the current code,
making the introduction of a generic List as list.List not a backwards-incompatible change.

The rule for functions could be that when parameters have defaults,
those defaults are applied just before the application of default constant types in the type inference algorithm.
That way, Min(1, 2) stays a float64, while Min(i, j) for i, j of type int, infers int instead.
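
For concreteness, a few call sites under this sketch (the behavior shown is the intended rule, not current Go):

// Assuming the hypothetical defaults above: List[E any (= interface{})]
// and Min[T comparable (= float64)].
var l list.List     // equivalent to list.List[interface{}], same as today
x := math.Min(1, 2) // only untyped constants: the default applies, x is float64
var i, j int
y := math.Min(i, j) // typed arguments: inference gives Min[int], y is int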

The downside of this is a little bit more complexity in the language spec,
while the upside is potentially smoother migration for users.
Like the "Of" suffix, an API with type defaults would be a persistent awkward reminder of our pre-generics past,
but it would remind mainly the author of the code rather than all the users.
And users would not need to remember which APIs need an "Of" suffix added.

Are there other options we haven't considered?
Are there arguments for or against these options that haven't been mentioned?

Thanks very much.

Replies

var i int
Min(i, 5)   // is Min[int]
Min(3, 5) // is Min[float64]

Would definitely be confusing. Especially if you converted from the former to the latter while debugging something.

11 replies
@mibk

And what about using !go1.18 and go1.18 build tags and actually changing the definition, so modules using go 1.18 would need to be adjusted to the new syntax?

@bcmills

@mibk, !go1.18 and go1.18 build tags would change the definition of the package, not its contents as viewed by the importer.

@rsc

rsc Sep 16, 2021
Maintainer Author

Min is probably just a bad example.
I agree with Keith that Min(i, j) being int but Min(1, 2) not being int is probably too surprising.
(And Min(1, 2) must stay a float64 for x := Min(1,2).)
So let's take Min off the table.

@nirui

But what about

func min() int {
    return math.Min(3, 5)
}

or

func getPool() int {
    return pool.Get()
}

etc.? Will they still be inferred as math.Min[float64] and &sync.Pool[interface{}]{}, or will they be inferred as math.Min[int] and &sync.Pool[int]{} to match the return type?

IMO a reasonable default is fine. math used to accept and return float64 only; people just silently sat there casting int32 to float64 before feeding it to math.Abs (maybe crying internally at the same time). Now you can have math.Abs take and return float64, int32, int, uint, PackOf6, etc. So at the very least the func Min[T comparable (= float64)](x, y T) T { ... } approach (or whatever syntax it ends up being) is not a loss.

@ianlancetaylor

I want to note that this example:

var i int
Min(i, 5)   // is Min[int]
Min(3, 5) // is Min[float64]

is an issue in general, even without default type arguments. Suppose we have a hypothetical

alg.Min[T constraints.Ordered](a, b T) T

with no default type argument. Now we get

var f float64
Min(f, 5) // is Min[float64]
Min(3, 5) // is Min[int]

So I don't think introducing default type arguments makes this any worse.

Personally, I don't like the asymmetry this would create between type parameters and regular parameters. Default values for arguments are something that has often been asked about, but we never did it. To me, the case for default type arguments isn't really stronger than for default regular arguments, though.

I also don't like introducing a new language feature just for a one-time migration. If the motivator for this feature is just the migration of pre-generics APIs to post-generics APIs, it will become de-facto obsolete in a year or so. I don't like that idea.

6 replies
@Merovius

The difference with type aliases is that they are not only useful for a one-time migration. They will always be useful to gracefully migrate APIs between packages. That use case is never going to go away, we will always run into situations where we have to move types between packages, even for new APIs.

Default type arguments, however, are motivated by a one-time migration of pre-generics APIs to post-generics APIs. At some point, there won't be any pre-generics APIs left (i.e. every new API created after go 1.18 is a post-generics API) and they become obsolete.

Of course, if we think default type arguments are useful beyond this one-time migration, that's another story. But I'd argue that default function arguments would be just as useful (if not more) and we never did those either.

@rogpeppe

Default type arguments, however, are motivated by a one-time migration of pre-generics APIs to post-generics APIs. At some point, there won't be any pre-generics APIs left (i.e. every new API created after go 1.18 is a post-generics API) and they become obsolete.

I'm not sure that's true. With a small tweak to the rules, default type arguments could be useful any time a type parameter is added to an existing API, which I suspect will be more than we currently think, particularly in larger programs.

The rule that I'd suggest tweaking would be this one:

The rule for types could be that the bracketed type parameter list may be omitted when all parameters have defaults

in favour of:

The rule for types is that a type argument may be omitted if that parameter and all following parameters have defaults and all the following type arguments are omitted as well.

That change could also be made at a later date without problem, I think.

I don't really buy the parallel with default function arguments - functions are first-class values in Go but generic types and functions are not, so the impact of adding default arguments to functions is considerably deeper and has more runtime implications (think of the issues that Python has with default arguments, for example) than adding them for type arguments.

@Merovius

I'm not sure that's true. With a small tweak to the rules, default type arguments could be useful any time a type parameter is added to an existing API, which I suspect will be more than we currently think, particularly in larger programs.

I think wanting to add parameters to functions is even more common. And yet we never made that easier with default arguments. There are many changes to APIs which are fairly commonly useful. We don't support most of them.

If I want to, say, add a context.Context argument to a function, I have to come up with a new name for the migration. I don't see why adding type parameters should get any preferential treatment here.

@rogpeppe

I think wanting to add parameters to functions is even more common. And yet we never made that easier with default arguments.

It might be more common, but adding function default parameters to the language is hard. I don't think anyone knows how to do it, and it would have big implications for existing code (I'm not sure it would be possible to add them without breaking compatibility). By contrast, adding default type arguments is easy.

@AndrewHarrisSPU

I hope I'm not saying this too abruptly - maybe I'm missing something - but would the rule rewrite effectively mean this:

The rule for types is that a type argument may be omitted if that parameter and all following parameters have defaults and all the following type arguments are omitted as well. Also, that the default won't be changed at a later point in time.

One more data-point: This strategy wouldn't help with sort.Slice, for example. At least as far as I can tell. Even though it looks like we might lean towards making that slices.Sort, I feel like this speaks against this as a general strategy. Many generic APIs can't just replace an interface{} with a type-parameter.

4 replies
@rsc

rsc Sep 9, 2021
Maintainer Author

sort.Slice was always an odd-ball, because of the comparison function taking indexes and not values. It was never going to transition smoothly to generics.

@Merovius

Yes. That's pretty much my argument :) I don't see a general strategy for smooth transition to generics working.

Personally, I'd rather figure out if we can do a v2 of stdlib packages or something like that and lean into cleaning up APIs for generics.

@rsc

rsc Sep 9, 2021
Maintainer Author

Most types are nothing like sort.Slice.

@carlmjohnson

containers/heap has the same problem as sort.

Using new major versions for stdlib packages, e.g. sort/v2, seems like a clean solution to me. I've seen a few people mention this, generally with a disclaimer like "there seems to be opposition to the idea," but I haven't seen the counterarguments. As a bonus, the precedent could also facilitate redesigns independent of generics, like #26263 and #22697.

37 replies
@smyrman

I don't see any inherent difficulty with net/http/v2 and encoding/json/v2 and archive/zip/v2. I don't think it would make sense to have net/v2/http. Maybe I'm missing something.

@ianlancetaylor, usually sub-packages import the parent package, so I would think there are potentially acceptable reasons you might want the versioning to be after the first element. E.g. if introducing a breaking change to net.Listener in net/v2 (arbitrary example), then surely you would need a v2 package of net/http as well, and then net/v2/http would be a sensible way of communicating that dependency.

Given that package versioning in the standard library is done infrequently and with great care, the added duplication and potential cleanup that would result from introducing versioning after the first path element might be an acceptable trade-off.

@ianlancetaylor

I don't think the interesting argument is whether the subpackage imports the parent package. I think the interesting argument is whether we would normally want to have a v2 version of all the subpackages at the same time. For example, it seems to me that writing a v2 of encoding/json or encoding/xml in no way implies that we need or want a v2 of encoding/binary or encoding/base64 or encoding/hex, or, for that matter, of encoding itself.

For the case of archive/zip, note that archive/zip does not import archive. In fact, there is no archive package.

@smyrman

I think the interesting argument is whether we would normally want to have a v2 version of all the sub-packages at the same time.

I completely agree with that. I am not too worried either way, as I think the answer to this question would become clearer with prototypes. The more v2 prototypes (related to generics or not) the better. If the prototype doesn't have to be in code, but could be in the form of a Gist or a blog article, then anyone can write one.

I won't be writing any prototypes in code, but I thought I would do a few articles to visualize how a v2 of different packages might look for generics in particular. I have started with an article for sync/v2, showing a potential sync/v2/atomic as well. I will share it when it's in a more ready state. I also started looking at math/v2, which has a higher number of sub-packages. If anyone feels inspired to write a piece on this and beats me to it, I don't mind.

@lpar

None of this is meant to imply that versioning standard library packages is a good idea, or a bad idea. It does solve the problem at hand, but it's a little weird to have to remember to write math/v2.

Well, that’s a problem with Go’s package system in general, not specific to generics. I can’t be the only person who has gotten halfway through using a third-party library for something and only then discovered that I want the v2 version. (Or in some cases the v4 version, I’m looking at you UUID.)

A solution, of course, would be for the default to be the most recent if the version is not explicitly specified. Perhaps doing generics using regular package naming conventions could apply some pressure to solve that problem for packages in general?

There is no math2.h or java.lang.Math2.

On the other hand, Java’s library is now a decent date and time library plus an unholy mess of obsolete non-thread-safe functions that programmers have to remember to avoid. And there’s java.io2, they just decided to call it nio for New IO instead.

@smyrman

I thought I would do a few articles to visualize how a v2 of different packages might look for generics in particular. I have started with an article for sync/v2, showing a potential sync/v2/atomic as well. I will share it when it's in a more ready state.

My progress on this has been very slow as I have been busy with other stuff, but here is an early draft that might already be useful to read:

There is a very nice overview of suggested changes in #48287 (comment), but I have tried to go a little more in depth. E.g., could we extend the assembly syntax to allow writing a generic version of the various functions in atomic? If so, we could reduce the number of functions from 29 to 5.

I haven't gotten around to writing anything about the math package yet, not to mention math/big. I probably won't have time to do so anytime soon, but I would actually accept PRs, and will be sure to mention anyone contributing to the article if/when I decide to release it for a wider audience.

Another approach is to observe that the language supports type aliases to support transitions. So let's use type aliases.

type Pool[T any] ...
type Pool = Pool[interface{}]

The rules here would be:

  1. The type alias must be at the same scope as the type (which currently means only at package scope).
  2. The type alias identifier would be the same as the alias target (otherwise it is an ordinary alias).
  3. The type alias identifier must not have any type parameters.
  4. The type alias target must specify arguments for all type parameters.
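
Under these rules, existing call sites would keep compiling while new code instantiates explicitly (a sketch, assuming the paired Pool declarations above):

var p sync.Pool         // resolves to the alias, i.e. sync.Pool[interface{}]
var q sync.Pool[[]byte] // new code can name its element type explicitly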

This alias rule does not work for functions. We can either say there is no transition for functions, or we can introduce another rule.

A function may be defined both with and without type parameters. A reference to the function (not a call) with no type arguments is permitted, and gets the version without type parameters. A call of the function with no type arguments gets the version with type parameters and does type inference as usual. If type inference fails, the call is instead made to the version without type parameters (and may fail if the argument types are not assignable).

That gives us

func Min[T constraints.Ordered](a, b T) T { ... }
func Min(a, b float64) float64 { return Min[float64](a, b) }
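
Under that rule, call sites would resolve roughly like this (a sketch of the intended behavior, not current Go):

var i int
_ = Min(i, 2)     // inference succeeds: calls Min[int]
_ = Min(1.5, 2.5) // inference succeeds: calls Min[float64]
f := Min          // a reference, not a call: f is the non-generic float64 version
_ = f(3, 5)       // float64, as before generics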
16 replies
@srfrog

Why not merge both ideas?

// New generic func with naming convention.
func MinOf[T comparable (= float64)](x, y T) T { ... }

// Support for existing code, and default behavior.
func Min = Min[float64]

So if I call math.Min(4.2, 1.0) or math.MinOf(4.2, 1.0) I get the same behavior with none of the intellectual cost. It works as expected and it won't force me to learn a whole new way unless I need it. I'm lazy.

@mvdan

On the minus side, it's a bit odd to permit two different objects with the same name in a scope; in order to not break all kinds of invariants both in the implementation and in the go/types API, such a pair of type declarations would probably need to be considered a single object, but that is also strange: What is the object named "Pool"? Where is it declared?

Don't we have to worry about this problem anyway, due to #46477?

@gbarr

An additional benefit for having 2 declarations as opposed to default types in the declaration is that they can have separate documentation. The new generic documentation can be kept clear without having to mention the compatibility defaults and the second declaration also allows the developer to mark it as deprecated in the documentation.

@akavel

Seeing that this proposal (as many others too) is adding a language feature, I'd like to ask a question: would this language feature be defensible alone if the need for the transition from non-generics to generics didn't occur? I.e., if generics were there from the start, would this feature be deemed valuable enough to warrant its inclusion in the language? (I don't know the answer, I'd just like to make sure the Core Team themselves are consciously aware of the answer and take it into account when making a decision.)

@ianlancetaylor

@akavel No, I don't think this language feature would be defensible if generics had been there from the start. It's a good question to ask. Still we have to do something (though we don't necessarily have to add a language feature, we can use different packages or different type/function names).

It's clearly essential to always have a crazy idea that everybody can reject, so here is one. default is already a keyword. So let's permit, only at package scope,

default Pool Pool[interface{}]
default Min Min[float64]
3 replies
@griesemer

This is essentially the same as the alias idea above (at least for types), except that a keyword is used.

@changkun

We don't necessarily need the default keyword; this might be more consistent with the subject:

type Pool[T any] ...
type Pool = Pool[interface{}] // type alias

func Min[T constraints.Ordered](a, b T) T { ... }
func Min = Min[float64]       // func alias

(° ο°)...

@jpap

I like the idea of keeping the declaration as one: with a separate alias/default declaration, one of the two parts might end up far from the other, reducing source code readability.

What about using the default keyword with the original proposal:

type List[E any default interface{}] struct { ... }

func Min[T comparable default float64](x, y T) T { ... }

func Print2[T1 any default string, T2 any default interface{}](s1 []T1, v2 []T2) { ... }

@bcmills
bcmills Sep 9, 2021
Maintainer

One suggestion is to adopt an "Of" suffix, as in PoolOf[T], MinOf[T], and so on.
For types, which require the parameter list, that kind of works.
For functions, which will often infer it, the result is strange:
there is no obvious difference between math.MinOf(1, 2) and math.Min(1, 2).
Any new, from-scratch, post-generics API would presumably not include the Of - Tree[K,V], not TreeOf[K,V] -
so the "Of" in these names would be a persistent awkward reminder of our pre-generics past.

I'm going to go out on a limb and suggest that we use the Of suffix for generic container types, regardless of whether they are new. Then the Of in the names is no longer a persistent awkward reminder, but a consistent pattern.

Constructor functions for generic containers get the Of suffix because the type has it, and because they can't generally be inferred locally anyway: list.NewListOf[T]().
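
For illustration, the convention might look something like this (hypothetical names, not a proposed API):

// A container type and its constructor both carry the Of suffix; the
// constructor's type argument is spelled out because it cannot be inferred
// from the (empty) argument list.
type ListOf[T any] struct {
	items []T // internal representation elided
}

func NewListOf[T any]() *ListOf[T] {
	return &ListOf[T]{}
}

// usage: l := list.NewListOf[int]()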

math gets a mulligan for using the good names for the 64-bit-only variants: v2 it. import "math/v2" and the only call sites that change are the ones that can't infer arguments (e.g. NaN()) or pass only untyped constants. Then maybe we can revisit the awkward inclusions for that package at the same time. (Do people really use math.Yn on a regular basis‽)

I think atomic.Value needs more thought anyway (#47657 (comment)). So maybe v2 that package too, or use names that aren't just ValueOf..?

If we do the above, what are the remaining problematic names?

2 replies
@rsc

rsc Sep 9, 2021
Maintainer Author

Didn't that ship sail with map?

@bcmills

bcmills Sep 10, 2021
Maintainer

Maybe, but as a built-in it's already funny-looking. (And its second type parameter spills outside the brackets! 😅)

The idea clearly exists already, as it was mentioned in this thread, but I think it's worth its own suggestion since it aligns pretty nicely with what we're asking package maintainers to do in general:

Don't introduce generics in most of these libraries. /v2 them:

  • "sync" -> "sync/v2" gets generic Value, Pool, Map
  • "math" -> "math/v2" gets generic Abs, Min, etc
  • "list", "heap", "ring", etc get "list/v2", etc for their generic versions.

The normal packages should be implementable in terms of the generic version, so if it doesn't compromise performance we could do the go fix trick to rewrite them all to /v2.

7 replies
@twmb

If choosing a v2 approach, I think it'd be worth it for gopls / goimports / etc. to adopt a convention of importing v2 by default if go.mod uses go 1.18+ (which it should, if a library / binary is using generics).

@lu4p

Alternatively /v2/math, /v2/sync, etc, to fit the usual module-version style. That means we'd have a single v2 directory with all the new API in. As you say, all the current packages should be implementable in terms of the new API.

I think this is not the right approach because stdlib packages might want to version independently from one another in the future (e.g. math/v3 while sync is still at v2).

@Merovius

@lu4p That's not something precluded by putting the version in front. Effectively, the v3/math might exist, while v2/sync doesn't, until we decide to revamp it again. The std as a module is already special (in that you don't need to put the module path into the import), there's no reason to force it to upgrade all packages at once.

@nickkeets

What if we take this one step further? Go 1.18 becomes Go 2 and importing v2/math etc happens by default, with import math. But, we also keep the old versions too as v1/math etc. We do break compatibility this way, but in a way that can be ported automatically with go fix.

@ncruces

@nickkeets, if the Go community really stands by SIV, there's no reason to exclude the standard library from the associated ugliness.

I support this suggestion, particularly as I see it as having the potential to be generally useful into the future given a minor tweak to the rules.

I'd question the need for the parentheses. I understand that they're there to suggest the "maybe" aspect of the default but I don't think they pull their weight, and there's considerable precedent from other languages (e.g. Rust's default type parameters, Python's default function parameters) that they could be omitted without much confusion.

That is, I think it would be fine if the examples were written thus:

type List[E any = interface{}] struct { ... }

func Min[T comparable = float64](x, y T) T { ... }
3 replies
@AlaxLee

Dropping the parentheses is a good idea. "=" is usually used for "equal" or "assignment"; could we use another word instead of "="?
For example:
type List[E any default interface{}] struct { ... }

@AndrewHarrisSPU

I agree the sense in which E would be assigned interface{} as a default value by = feels a bit out of place here. default seems a little more obvious. Really, I think the type assertion syntax is the strongest analogy: E.(interface{}) in the production of type List[E.(interface{}) any] struct { ... }. (Maybe it's clearer to say it's a type judgement, as 'assertion' can imply something that can fail?)

The spec states the following for the type assertion from an interface x.(T):

... even though the dynamic type of x is known only at run time, the type of x.(T) is known to be T in a correct program.

Here we would be analogously stating for a type judgement E.(interface{}):

... even though the concrete type of E is only known at instantiation time, the type of E.(interface{}) is known to be interface{} in a correct program.

This judgement holds even if no other instantiation of E can be reached ... so, there is a subtle difference between E any and E.(any) any.

For Min[T.(float64) comparable](x, y T) T { ... }, when something like Min(1, 2) returns float64, maybe it's sensible to say it's a judgement?

@AlaxLee

IMHO, there may be something confusing here.
When using x.(T), it often means:

if x's type is T, right; if not T, wrong.

When using E.(interface{}) in type List[E.(interface{}) any] struct { ... }, it means:

if E's type is interface{}, right; if not interface{}, right too.

Is this something that could be handled by the go directive in go.mod? Something like go 1.18 and up changes them to generic versions, with a go fix directive to add an explicit sync.Map[interface{}, interface{}] and the like? This would obviously only work for the standard library, but external modules can use a major version update to introduce generics instead.

It seems like overkill for something like this, and if a clean alternative can be found then that's probably better, but I hadn't seen it discussed anywhere and introducing incompatibilities was one of the primary motivations for adding the directive, if I remember correctly.
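
A rough before/after of the kind of rewrite such a go fix pass could apply (assuming sync.Map gained type parameters with this meaning):

// Before (compiles against today's sync):
//
//	var cache sync.Map
//
// After go fix, once sync.Map hypothetically becomes sync.Map[K, V]; the
// explicit arguments preserve today's untyped behavior:
var cache sync.Map[interface{}, interface{}]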

2 replies
@bcmills

That option came up a bit in #48287 (comment). (I think the tricky part is in defining the syntax for the declaration site.)

@suared

Why would that be required? Wouldn't it be a one-time fix?

A proposal/idea related to various more or less vague "go fix" approaches already mentioned around here and there, but specific enough and subtly different enough to warrant separate mention, given this is clearly a brainstorming discussion:

  • Treat go version directive in go.mod as transition point, such that:
    • for code having version < go1.18, new packages are available via import "go2/math" etc., old packages via regular import "math"
    • for code having version >= go1.18, new packages are now available via import "math" etc., old packages via import "go1compat/math"
    • a go fix upgrade (or similar) rewrites the imports & bumps the go version directive in go.mod

(obviously, the "go2/" and "go1compat/" prefixes being subject to bikeshedding, and go1.18 subject to change to newer)

edit: Notably, AFAIU this is also basically how Rust editions are working, so Rust community leaders could be consulted about any nonobvious pros or cons of this approach. FWIW, from a distance it seems to be working for them well enough that they repeated this "phase transition" a few times already. Also, if adopted, this could possibly also help with other bigger "Go2" changes if needed at some point.

edit 2: This also seems to me fairly similar to the /v2, /v3, etc. versioning in 3rd-party packages, as introduced with the modules system. The difference being, Go stdlib under this proposal would come with a "deflation" mechanism via go fix, basically purely for the sake of ergonomics/ease of writing. Though the disadvantage of this "deflation system" would be that when reading Go code, one could then not be immediately sure which version of the stdlib the code is using - only being able to discern this after looking into go.mod.

1 reply
@ianlancetaylor

I think this could work but I think it would be difficult for people to understand. It's always problematic when the exact same code means different things based on some other file. It makes it much harder to understand the code when reading it. In this case the problem is somewhat mitigated because it is a one time transition cost. But still.

We could teach go.mod replace to work with stdlib packages.

As an example:

module xyz

go 1.18

replace sort => typed/sort

require (
  ...
)

Benefits:

  • Backward compatibility: all parameterized types are under typed/ (or something similar)
  • Explicit: no magical compiler behavior choosing one version over another based on some heuristic.
  • New code can trivially use the go.mod replace (and go mod init can add the replace directive automatically for Go versions that support generics).
  • Existing code can be migrated by explicitly naming the import path import sort "typed/sort".
  • Go 2 will rename all typed/* to * and provide a code upgrade tool.
1 reply
@Merovius

This seems unhelpful to me. Library code is going to want to use the new versions of packages and can't rely on replace. So they will have to type out the entire path. At that point, it seems simpler and less confusing to just have everybody type out the full import path.

Is the plan to keep the generics behaving the same as the original?

When implemented as a generic bits.RotateLeft(), the inliner assigns a much higher cost (almost 2x) to the function. Whilst that doesn't prevent it from inlining, it prevents the caller, QVarint(), from inlining (non-generic cost is < 50, generic cost is > 120), even though the assembly generated from the generic code is practically identical to the assembly generated from the non-generic code.

type Unsigned interface {
	~uint8 | ~uint16 | ~uint32 | ~uint64 | ~uint
}

func BitWidth[T Unsigned]() uint {
	return uint(8 + (T(8<<8) >> 8) + (T(16<<16) >> 16) + (T(32<<32) >> 32))
}

func RotateLeft[T Unsigned](x T, k int) T {
	n := BitWidth[T]()
	s := uint(k) & (n - 1)
	return x<<s | x>>(n-s)
}

func QVarint[T Unsigned](x T) (uint64, int) {
    y := uint64(RotateLeft(x, 2))
    n := uint(8) << (y & 0b11)
    if w := BitWidth[T](); w == 64 || n <= w {
        s := w + 2 - n
        return y >> (s % 64), int(n / 8)
    }
    // truncated 
    return 0, -1
}

func main() {
	x := uint64(1 << 56)
	y, n := QVarint(x)
	_, _ = y, n
}


1 reply
@rsc

rsc Sep 16, 2021
Maintainer Author

It is definitely the case that if we write generic APIs that we expect to be inlined, we will need to pay attention to the cost assumed by the inliner in higher-level frames. Thanks for raising this issue.

A possible awful idea for rejection: implementation-specific suffixes akin to _unix, _windows, etc, but for generics/not: _generic, _typed. This would allow us to compile a typed version in contexts where generics are not available, and a generic version where they are. Also possible/less collide-y would be version build suffixes instead (_1.17, _1.18).

This builds on the observation:

For functions, which will often infer [the type], the result is strange:
there is no obvious difference between math.MinOf(1, 2) and math.Min(1, 2).

1 reply
@clarkmcc

If these suffixes behave anything like existing suffixes, then you could only flip generics on or off for the entire project at compile time. In my opinion, a solution that allows incremental migration of existing projects rather than an all or nothing approach is pretty valuable.

I wonder how much compilation time it would take if function signatures were also distinguished by their parameter lists, preferring a generic implementation over the interface one.
Just a thought.

0 replies

Another thought:

import (
    "generic/sort"
)

This gets the new generic-based API, old sort lives where it does now for historical reasons, and if for some reason you need both, you can specify a prefix.

I like this slightly better than v2/sort or sort/v2 because it's more clear about what specifically you're recommending, and it's explicitly a distinct package so it's allowed to change the API more than it might otherwise.

On the other hand, this works poorly for cases, like sync, where some things have no reason to change, and it seems silly to have to duplicate them all into the new package. Type aliases can soak some of that pain for us, but function aliases would be a much bigger change...

I'm also unsure what the right type restrictions would be for math.Abs. Any floating point type? Should it work on Complex? How about integers? How about unsigned integers?
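
Just to make the question concrete, one possible (not authoritative) constraint shape, limited to signed integers and floats; complex would need a float result type, and NaN, signed zero, and minimum-integer overflow are ignored for brevity:

type SignedOrFloat interface {
	~int | ~int8 | ~int16 | ~int32 | ~int64 | ~float32 | ~float64
}

func Abs[T SignedOrFloat](x T) T {
	if x < 0 {
		return -x
	}
	return x
}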

1 reply
@Merovius

Type aliases can soak some of that pain for us, but function aliases would be a much bigger change...

The general recommendation for functions is to forward them as a call: func Foo(x T) { somepkg.Foo(x) }.
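
Spelled out for this thread's running Min example, the forwarding pattern would look roughly like this (the "generic/" path follows the naming suggested above; everything here is a sketch):

package math // the existing package keeps its old API as thin wrappers

import gmath "generic/math" // hypothetical generic counterpart

// Min keeps today's float64 signature and forwards to the generic version.
func Min(x, y float64) float64 {
	return gmath.Min(x, y)
}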

How about we just break things, since there is no promise to keep the stdlib compatible? It's simple, and for users the changes are small. And of course, some people will develop tools to migrate them automatically.

1 reply
@Merovius

since there is no promise to keep the stdlib compatible.

There is:

Go 1 defines two things: first, the specification of the language; and second, the specification of a set of core APIs, the "standard packages" of the Go library. The Go 1 release includes their implementation in the form of two compiler suites (gc and gccgo), and the core libraries themselves.

It is intended that programs written to the Go 1 specification will continue to compile and run correctly, unchanged, over the lifetime of that specification. At some indefinite point, a Go 2 specification may arise, but until that time, Go programs that work today should continue to work even as future "point" releases of Go 1 arise (Go 1.1, Go 1.2, etc.).

Compatibility is at the source level. Binary compatibility for compiled packages is not guaranteed between releases. After a point release, Go source will need to be recompiled to link against the new release.

The APIs may grow, acquiring new packages and features, but not in a way that breaks existing Go 1 code.

(emphasis mine)

I went through the list of packages in the standard library to see what I think should be done with them. Going through the list, it seems pretty clear to me that you're either going to add some /v2 packages or be stuck with really ugly docs forever. Changing how aliasing works or adding default types can only help in limited situations. Regexp and math/big in particular really need v2 to be viable. However, I think a lot of them would be fine with just adding a generic Of type and some back references. It varies quite a bit depending on the specifics of what needs to be added or removed and why.

container/heap

  • Add new PriorityQueue type; leave existing top level functions but indicate most users probably just want PriorityQueue.

container/list

  • Add ListOf[T any] with deque implementation and deprecate List.

container/ring

  • Add RingOf, with Ring being RingOf[interface{}]

regexp

  • Add a v2 package to consolidate String/Bytes variants

sort

  • See discussion of slices.Sort

sync

  • Add MapOf, PoolOf

sync/atomic

  • Add ValueOf

testing

  • Add t.MustEqual[T comparable](a, b T) bool

math

  • Create a v2 package; there are too many functions to try to change just some of them and not others, and the semantics are particular to each one, i.e. you can take a Sin with float32 but not int, etc.

math/big

  • Create v2 package to consolidate Int/Float methods

strconv

  • Add ParseNumber[T constraints.Number](s string, base int) (n T, err error)

Packages which could benefit from a v2 cleanup unrelated to generics:

  • image
  • log
  • math/rand
  • database/sql
  • net
  • net/http
  • net/http/httputil
13 replies
@Merovius

but the interface type switches are here now and work fine.

It works aggressively mediocre at best. For example, if we were to use that, we are limited to pre-declared types.
But that's probably fine and I think we can relax the constraints later, if ever applicable.
I probably wouldn't object doing this, but I wouldn't do it in my own code for the limited benefit it provides.

@carlmjohnson

I was thinking that ParseNumber would only work with unnamed numerical types, although I see that it would be better if generic type switches existed so you could support named types without importing reflect. Maybe that should wait until the type switch thing is decided one way or another.

@deanveloper

That there is no testing.MustEqual means people have been abusing reflect.DeepEqual instead

This doesn't have much to do with generics though, the "generic solution" would essentially behave exactly the same as reflect.DeepEqual, all it can do is guarantee that a and b are of the same type. So there really doesn't seem to be much benefit to t.MustEqual.

Also, it's not really "abusing" to use it for what it's meant to do... Either way, the most common cases for reflect.DeepEqual would be fixed by things like slices.Equal and maps.Equal (and their Func variants) anyway, correct? Doesn't really seem necessary to also add t.MustEqual.

@aarzilli
  • regex has 16 methods to create the Find variants. Having a single Regexp.Find(in StringLike, out ResultType, limit int) method would clean things up considerably. It's just a confusing package as is now.
  • That there is no testing.MustEqual means people have been abusing reflect.DeepEqual instead. There's already an open issue about adding it somewhere. The main hang-up is how to print diffs.
  • ParseNumber would just be convenient when you need a float32 or whatever.

All those things are true, but you have to consider that there is a compile-time penalty for every exported type-parametric symbol, and the standard library should teach users proper restraint in using type parameters rather than spraying them everywhere they make the slightest bit of sense.

@jhenstridge

Add t.MustEqual[T comparable](a, b T) bool

Is this one even possible under the proposed language change? I thought it was only adding generic types and functions: not generic methods. You might be able to do a function something like this though:

func MustEqual[V comparable](t *T, a, b V) bool

That's quite different from the rest of the API provided by the package.


I think this is a good time to declare Go 2 and let Go 1 rest in peace. It served its purpose well, but it is starting to show its age, and hiding all the warts is getting hard. We could and should have done this a while ago, e.g. when modules were introduced, which obsoleted lots of the pre-module packages, books, and tools. I don’t see much benefit in keeping the same major version number if the style and idioms change so significantly. If planned correctly, the drawbacks of some older code not compiling due to some needed minor changes are far less than the consequences of having to learn, maintain, and debug complex code. Tools can deal with minor code breakage. Humans pay the price for complexity.

6 replies
@DrGo

@ianlancetaylor Thanks for the link to the nice analysis of previous versioning debacles.

I like to class Go with C rather than C++ or Java (and JS), in the sense that the latter are more of a family of languages with different dialects and accents rather than a single, simple-to-read-and-write language like Go and C.

My concern is that a strategy of trimming nothing while continually adding major new features (e.g. modules, type parameters) and minor features (e.g. two or three versions of many important library routines, such as http funcs that take a context vs. those that do not) is what transformed C into C++ and Java. Soon we will have very different ways of working with slices and possibly other data structures. This proposal adds more subtly and not-so-subtly different versions of many other routines. Readers and maintainers of code will have to master all these varieties.

Go's major appeal has been its simplicity and robustness (a direct consequence of simplicity). In light of the relentless pressure to keep adding features, I think we should aim to continue this tradition by taking opportunities offered by the introduction of a major dialectal change, like generics, to trim as many redundancies and inconsistencies as we can.

Just a thought and thanks for all you do to make us, the small potatoes, feel welcome here.

@nemith

See https://go.googlesource.com/proposal/+/refs/heads/master/design/28221-go2-transitions.md .

Off-topic: It would be interesting to see where Rust's editions fit into the thoughts around the Go 2 transition.

@deanveloper

Rust editions look to use a similar pattern to Go modules: the module specifies a language version, and the compiler uses that version to compile the module.

@fsouza

@deanveloper Go doesn't use that version to introduce breaking changes though, Rust does. I think maybe that's what @nemith was inquiring about?

@deanveloper

The idea of the version being listed in go.mod is to (possibly) use it to introduce breaking changes to the language if needed. I don’t think it’s been used quite yet, but it is there for now.

This reminds me of the context.Context package integration. We had net.Dial and we created net.DialContext. The ideal would be to just have net.Dial, but we're too late. Maybe it's not so bad to do a similar thing and have a suffix like Of. New context-aware functions don't add Context to their name anymore, and people seemed to understand why we did this.

1 reply
@cep21

Maybe the function call is sortGeneric?

I'd rather stick with some *Of functions (much like the *Context functions) than add some new quirky features to the language itself. There would be a limited set of those functions; any new ones would be generic by design.

Addition of context.Context to the stdlib is a most valid reference here, imo.

7 replies
@sfllaw

Sorry, I miswrote and my analogy was imprecise. This is not exactly “colouring” so I shouldn’t have used that word.

context.Context has a problem where you really want to call other context-aware functions, instead of working around a blocking call.

Similarly, as a library author, having a FooOf convention for generic functions is going to lead to a similar bifurcation. Imagine I have written the following package:

package maths

import "math"

func Clamp(x float64, lo float64, hi float64) float64 {
	return math.Min(math.Max(x, lo), hi)
}

Now imagine that the Go standard library decides to implement MinOf and MaxOf. I will soon get a feature request to implement ClampOf because my dependents really want to clamp other number-types:

package maths

import "math"

func ClampOf[T constraints.Ordered](x T, lo T, hi T) T {
	return math.MinOf(math.MaxOf(x, lo), hi)
}

func Clamp(x float64, lo float64, hi float64) float64 {
	return ClampOf(x, lo, hi)
}

Now we have two parallel ecosystems, because there is no source compatibility between Clamp and ClampOf, just like there is no source compatibility between Dial and DialContext. Because the context transition has been unpleasant (and still incomplete), people are suggesting solutions that allow for backwards compatibility without making their downstreams deal with pre- and post-generics APIs.

@DeedleFake

I mostly agree on the context.Context point, though my biggest problem post-context has generally been a lack of context support in various functions, or strange, partial support like requiring the use of a net.Dialer in order to get access to a context variant. The existence of both Func and FuncContext doesn't actually bother me that much.

@Merovius

@sfllaw My impression is that occasions for that to happen are relatively rare and almost always shallow. For example, I don't think there are many libraries which wrap around math to offer additional functions and ~no libraries around those. There currently isn't a huge ecosystem of wrapping more functionality around third-party libraries in a way that would benefit from a simple addition of type-parameters with specific defaults either. So I'm not sure how big of an actual problem this will be, in practice.

@carlmjohnson

Now we have two parallel ecosystems, because there is no source compatibility between Clamp and ClampOf, just like there is no source compatibility between Dial and DialContext.

ISTM, the alternatives are:

  • Manually promote Clamp to generic with something like the default type system proposed in other comments for legacy callers
  • Somehow automatically promote Clamp to generic (this seems like a bad idea, but maybe someone will think of a workable variation on it)
  • maths/v2 with generic Clamp (this has the two parallel ecosystems problem, but is less ugly at the call sites)
  • Don't add generics to Go (for better or worse, generics are coming, so this ain't gonna happen)

It's not clear in the case of maths that the parallel ecosystem problem is worth solving. With context, the problem is annoying because context tries to be the one unified mechanism for cancellation across the whole standard library, but there are a lot of pre-context loose ends scattered about, so you may or may not be able to rely on it, and there are a bunch of deprecated mechanisms to skim past in the docs. With math(s), I guess there will be some ints and float32s being cast to and from float64 that don't have to be, but it doesn't affect the big picture of things and piecemeal cleanup doesn't seem like a big deal.

ISTM, the only reason not to promote math.Min, math.Max, and maths.Clamp to generic is that someone, somewhere might have written f := math.Min; if cond { f = math.Max } or something. I think just adding a type hint mechanism for that specific scenario ought to be enough to allow it to move forward without MinOf, etc.
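
For reference, if Min and Max were promoted to generic functions, the quoted pattern would need explicit instantiation, which is one way to read the "type hint" suggestion (a sketch under that assumption):

someCondition := true  // stands in for the original "cond"
f := math.Min[float64] // explicit instantiation replaces the old bare math.Min
if someCondition {
	f = math.Max[float64]
}
_ = f(1.5, 2.5)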

@sfllaw

I would be interested in hearing from https://pkg.go.dev/gonum.org/v1/gonum about how they would like to see their library used in a post-generics world.

I am sure there are other major libraries that would also benefit.

I personally like the default type parameters approach because:

  • For users, it can be introduced largely hassle-free. Old code will enjoy new features without too many modifications
  • In the future, some clever mind might find a way to capitalize on the feature and turn it into something far more general and useful
  • Not an important point, but while some might see it as an "awkward reminder of our pre-generics past", I call it "the heritage of our glory legend"

However, there are down sides (I can think of):

  • It's a "Magical Default", not zero
  • It makes type inference confusing (as in the math.Min case)

My second-favorite solution is the v2 package one. The pros:

  • It utilizes the version convention that the Go team promotes
  • The old std is unchanged so compatibility is not an issue, while users can still use generics to implement their own structs/functions (should they dislike the ones in the std)
  • Let's face it, many packages in the std will face versioning eventually. Might as well just let it happen sooner
  • It's pollution-free, no need to introduce new syntax for the default type parameter

And the cons:

  • It's basically a "Let user upgrade themselves" approach, user must change their code in order to use generics
  • Third-party package maintainers now might has to target two std dependencies rather than just one earlier

My opinion is, Go should have a clear "Sink and Lift" strategy that "sinks" packages into a "Compatibility group" if the package cannot fully utilize the new language features (similar to the v1compat idea), and then replaces them with packages that can (the "lift" part).

The Sink and Lift operation should be performed alongside a language upgrade. For example, during Go1 -> Go2 upgrade, math from Go1 should be moved to v1compat/math, and then math/v2 becomes the new math.

When users decide to upgrade their code to Go 2, they can use go fix to help them redirect/replace their dependencies to either the new math or the old v1compat/math.


Here is the idea I don't like: the prefix/suffix idea ("Of", typed/*, and generics/*), because:

  • It could introduce new elements to std, and they will always be there at least until the next major version (which is true for everything in the std, of course)
  • The next major version will probably undo it (all "Of"/typed/* will eventually be removed or replaced with more general ones)
  • User still needs to change their code to utilize generics, same as the v2 idea, but without versioning and other benefits
  • The new element is feature-specific. net.DialContext might have been good at the time Context was introduced, but that does not prove that net.DialOf is also good. Think about it: what if you want to use Context and generics at the same time? net.DialContextOf? What if in the future sum types become real? net.DialContextOfSum? ... Yeah, I don't think this is really sustainable

So, to sum up again: my small little brain says it loves the default type parameters idea (maybe just not in that exact syntax). And if we can't have that, then the v2 idea is the way to go.

Just my two brain cells.

2 replies
@bbrodriges

In the future, some clever mind might find

This is not the Go approach. Generics are a good example of that: the current proposal describes a bare minimum of features, with the intention to extend it in the future based on usage by the community.

Making a language change (without introducing a new keyword, but a change nonetheless) just for the sake of a single problem's solution, and hoping that some day someone will find another useful application for the feature, is bad, imo.

net.DialContextOf

This must not happen, because functions added after the context.Context introduction do not contain the Context suffix. The same must go for generics and the *Of suffix: net.DialOf must be implemented with a context as the first argument by default, and all new generics-suitable functions must be implemented without the *Of suffix by default.

math/v2

As much as I like the Go approach to package versioning, it seems to me that moving just a part of the stdlib under a versioned directory will introduce more confusion than it solves.

I can imagine user code with imports like math/v2, image/v3 in a single file. This approach may fragment the stdlib into many (probably incompatible) parts.

@nirui

This must not happen because all functions after context.Context introduction are not contain Context suffix. Same must go with generics and *Of suffix.

Is this an artificially enforced rule? If so, then why

imports like math/v2, image/v3 in a single file.

cannot be artificially avoided by, for example, a properly designed versioning strategy?

This approach may fragment the stdlib into many (probably incompatible) parts.

"May" is a strange word. This approach may indeed fraction std to many incompatible parts, the world may also be gone tomorrow in a flash of bang, and I may become the sol dominator of Mars one day.

The truth is, only some "may"s are really statistical may, others are decided by somebody. The std in Go1.16 is compatible with the one in Go1.17 because somebody decided to guarantee that.

That being said, down to the core it really depends on the strategy employed. If the Go team wants to protect compatibility, they will design their strategy accordingly. Std versioning does not create fragmentation naturally.

There are many good ideas. However, all are workarounds in one way or another.

Hence, why not be super clear about the switch to a generics-based standard library, but in a transparent way? Here is just a rough sketch of the idea:

  1. If I specify 1.18 in my go.mod file, or maybe on a per-file basis, the compiler uses the generics version of the standard lib.
  2. If I specify < 1.18 or nothing, the compiler still uses the non-generics version.
  3. This applies for all dependencies too.

As long as the API semantics don't change, this should work. Maybe some name mangling is needed to separate both code bases on the binary level.

Instead of doing anything for backward compatibility, we are doing something for forward migration.

10 replies
@matt0xFF

@carlmjohnson:

The thing about v2+ [...] you don't have to wonder, "Should I use CancelRequest or cancel the context.Context?" it's very clear v1 is old; v2 is newer; use v2 for new stuff. It's a simple story to be told and understand

It looks like as of last week #40850 is enabled at https://pkg.go.dev; it lets you mark things as deprecated and have them automatically hidden in the documentation. If it's just about keeping the (user-visible) API clear, these sorts of tooling changes might go a long way without requiring a separate v2 version.

@jhenstridge

Sure, but if you want code that uses the new stdlib to communicate with code using the old stdlib (i.e. calling functions or using types defined by a dependent module), then it's quite possible you'll have situations where Go 1.18 code needs to import packages from the old stdlib.

So even if there is some automatic import path remapping that is dependent on declared Go version of a module, the real package paths would likely need to be public and usable without remapping. And if they are public, is it really worth bothering with the remapping?

@Robert-M-Muench

I am not certain if I understand your point.

When you have an import path std-lib/xyz which gets remapped to std-lib/without-generics/xyz or std-lib/with-generics/xyz depending on the go version expected, there is only one public path that works for all.

If my module requires generics, the go.mod has a go version 1.18+ and can't be used by people that want to use an earlier go compiler.

@jhenstridge

Let's say I've got a module written with Go < 1.18, which includes code like:

import "list"
func Foo() *list.List

Now imagine that I am writing a program using Go 1.18+, where list.List has been replaced with a generic type. If my program wants to call the Foo function from the above module, how would I refer to the function's return type? If there is any possibility for the two worlds of Go code to interoperate, at least one side will need to be able to import both versions of the standard library.

@HALtheWise

If my program wants to call the Foo function from the above module, how would I refer to the function's return type?

I see basically two options here (and you are correct that we would want to implement one of them):

  1. The "old" and "new" list packages are explicitly importable with some path, either "list/v1" or "pregeneric/list" or something, and the go version just controls which of those "list" is an alias for. That way, just like with third-party packages, you can simply import both on one side of the boundary.
  2. We require that every symbol in the old package has an alias or identical symbol in the new package. In many cases (like list.List) that new symbol can simply be a specialization of a now-generic type or function (list.List -> list.List[interface{}]), but it doesn't need to be if the new API doesn't map trivially to the old one. New code would see old functions return the equivalent new symbol. This has the added benefit that all users of the old library can be safely upgraded to the new one automatically with go fix or similar. This would probably be implemented by making an alias-only version of the old list package API implemented in terms of the new list package, just like if an open source package were trying to do the same thing. Unlike third-party code, we can hold the standard library to a higher standard of forward compatibility, so that the new symbols are actually indistinguishable from the old ones.

I personally prefer 2 because of the tooling advantages and because it makes it way easier for the go ecosystem to migrate quickly to the new APIs.
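
A rough sketch of the alias-only shim described in option 2 (all paths and names are placeholders, not a real layout):

// The old import path becomes a thin forwarding package.
package list

import newlist "container/list/v2" // hypothetical new generic package

// The old, untyped API is expressed as instantiations of the new one.
type List = newlist.List[interface{}]
type Element = newlist.Element[interface{}]

// New preserves the old constructor's signature.
func New() *List { return newlist.New[interface{}]() }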

For generic math, type/function aliases would be better than default parameters.

Edit: As long as you can add parameters in an alias to a different name (type mgl32.Vector3 = mgl.Vector3[float32]), I think all of these cases would work with a combination of that and default parameters; if default parameters are included, they don't require aliases to the same name.

For example, the mathgl packages include 3D math types used by OpenGL bindings, games, etc. There are 32- and 64-bit versions of each type: mgl32.Vec3/mgl64.Vec3, mgl32.Mat2x3/mgl64.Mat2x3, etc. Type aliases would allow each package to point to (and be compatible with) common generic versions of each type, but a default parameter wouldn't work since each package needs a different one (the same goes for the functions, although wrapper functions would also be fine if they can be optimized away).
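
Assuming an alias may refer to an instantiated generic type (as in the edit above), the sketch for mathgl would be roughly:

// In a hypothetical shared package "mgl":
//
//	type Vec3[T ~float32 | ~float64] [3]T
//	type Mat2x3[T ~float32 | ~float64] [6]T
//
// The existing mgl32 package (and likewise mgl64) keeps its names as aliases
// to instantiations, so existing callers see no change:
package mgl32

import "example.com/mathgl/mgl" // placeholder import path

type Vec3 = mgl.Vec3[float32]
type Mat2x3 = mgl.Mat2x3[float32]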

If Go ever permits array size type parameters (#44253), the type alias mechanism could also be used to aid migration in the same way (Vector3 could become an alias for Vector[3]).

Another example of math code (mathf.go) in the wild uses the f suffix (sinf, cosf) for 32-bit versions of math functions; type aliases would work there while default parameters wouldn't.

For those suggesting that library authors break API compatibility, with all packages using those libraries then rewriting to use the new API - I think we need to be realistic that this will not happen rapidly or completely. Many tasks currently work fine without generics (for example, 64-bit math code), so many applications and libraries would need to change a large number of files for zero or marginal benefit (changes that may include forking dependencies that are no longer actively maintained).

Even new code might continue to use "math" instead of "math/v2": if you are only using 64-bit math, the former is shorter, and what's the difference?

I think the likely outcome is that users would need to be familiar with multiple ways to do the same thing (across many libraries) for the foreseeable future. Creating such a situation in the name of simplicity seems like a false economy.

12 replies
@wolverian

(Off-topic)

Because v1 would be implemented in terms of v2, anything not reflected as an API change (performance improvements and the like) you'd benefit from indefinitely, even if you use v1.

You're probably talking about the standard library here specifically, but I'm wondering how many projects in the Go ecosystem generally do this. My own admittedly vague understanding has been that people usually abandon old versions when they make breaking changes, rather than re-implementing them in terms of the newer version. Some statistics about this would be super interesting!

@Merovius

You're probably talking about the standard library here specifically

FWIW, I'm not. The explicit intention of this discussion is to create general guidance for how to do this. And the mechanisms we've developed over the last years are generally applicable. And I believe they are what should be our general guidance.

Whether that's actually followed is a fair question. But as far as I'm concerned, that's a question for the maintainers of the relevant modules. As so often in open source, it's up to them to decide to what degree they want to support their users, and it's up to you to decide whether you want to trust a particular maintainer.

@matt0xFF

@Merovius

But if it's worth introducing a new language feature, it definitely is worth releasing a new package. Language features are far more costly than releasing new packages.

In order to understand your position better, can you elaborate on this (with respect to this particular feature)? I have some idea of the costs of breaking compatibility, and in addition, I think users having to remember to add v2 to certain commonly-used packages like math, but not others, will forever add a (small) amount of cognitive overhead to learning and using Go.

However, I don't have as clear an idea of the costs of adding this particular language feature. For example, if there is some particular way users are likely to abuse it, or if there is reason to believe it will be an ongoing source of compiler bugs.

@lu4p

I have some idea of the costs of breaking compatibility, and in addition, I think users having to remember to add v2 to certain commonly-used packages like math, but not others, will forever add a (small) amount of cognitive overhead to learning and using Go.

It's perfectly valid to use the old version of the package, because v1 will be implemented using the v2 package. I expect most users to use gopls (via their editor), which should automatically add imports with the latest version.

@wolverian

The explicit intention of this discussion is to create general guidance for how to do this. And the mechanisms we've developed over the last years are generally applicable. And I believe they are what should be our general guidance.

Thank you, guidance on this topic is a good idea.

How about the following rule (inspired by Java) for types: if a generic type is used without type arguments, the type arguments default to the constraint types. For example:

type List[T interface{}] struct {...}

var p List[bytes.Buffer] // type-safe usage
var q List               // == List[interface{}], same as today

Pros:

  • Works well for container types
  • Most generic things in Go today seem to be containers, since writing generic functions is tricky
  • No additional syntax
  • Default rule is simple

Cons:

  • Doesn't work for interfaces that can be used as constraints but not as ordinary types (e.g. comparable); see the sketch after this list
  • Doesn't work for making concrete functions generic
  • Prevents future type inference on generic composite literals (with this rule, if we have S[T], then S{...} is already defined to mean something)
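
A small sketch of the comparable case, using a made-up Set type for illustration (the invalid part is shown as a comment):

// Valid today: comparable may be used as a constraint.
type Set[K comparable] map[K]struct{}

// Under the default-to-constraint rule, a bare Set would have to mean
// Set[comparable], but comparable is constraint-only and cannot be used
// as an ordinary type or as a type argument, so there is no legal default:
//
//	var s Set // would need to be Set[comparable] - not a valid type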
2 replies
@Merovius

Doesn't work for interfaces that can be used as constraints but not regular values (e.g. comparable)

That excludes 4 of the 7 examples in the top-post.

@AndrewHarrisSPU

Your cons list, and a good amount of discussion here and elsewhere, highlight that a constraint does not express everything a programmer might wish about the shape of the eventually unified type. Constraints are sufficient for a lot of useful, common cases, but I think it'd be worth examining the kinds of affordances that may be possible for less common cases.

Metapoint: I don't think this issue worked very well with the GitHub discussions format. Discussions seem to work better when there is one big proposal with many small details to work out (e.g. there is going to be a slices package, what should go in it?), and then the discussions each burrow in on a detail (e.g., let's figure out a name for slices.Compact). At this point, this issue is still too up in the air (change generic type aliases? add v2 packages? something else?) to drill down on specifics, so each subdiscussion tends to repeat one or two of the main themes with minor variations in the specifics, which makes it very hard to follow.

1 reply
@deanveloper

This kind of discussion doesn't really fit GitHub issues either; that would also be extremely messy.

I think threaded discussion with recursing sub-threads (a la Reddit) would be best for this kind of discussion.

I may be “under-thinking” the problem, but it feels like too much work on the language to come up with what amounts to a workaround. The use cases I can think of seem to be handled by existing constructs. This is my use-case summary:
• User code upgrades to 1.18 from a previous version – their code is updated using go fix so that the change is seamless (in the example of a changed return type, the source code would be rewritten at that point to convert to the pre-1.18 type so that the rest of the program functions as originally intended).
• User wants to import a new package whose go.mod requires >= 1.18 – the existing modules approach would produce an error telling them they need to upgrade to 1.18 to use the library. When they do, bullet 1 would take effect.
What am I missing?

2 replies
@carlmjohnson

A lot of people need to support multiple versions of Go simultaneously. See the Go2 transition doc for specific criticisms of "the Python 3 problem" we want to avoid:

because there is no backward compatibility, it is impossible to mix Python 2 and Python 3 code in the same program. This means that for a typical program that uses a range of libraries, each of those libraries must be converted to Python 3 before the program can be converted. Since programs are in various states of conversion, libraries must support Python 2 and 3 simultaneously.

Because simultaneous support for 2 & 3 wasn't possible at the start, the Python 3 transition dragged on for a decade…

@suared

Thanks for the reference - it was a great read! Re: the Python 3 issue, a potential solution for this thread already seems to be in the referenced article. It may just be a matter of agreeing on how to characterize some of the changes so that they follow the article's advice. In my mind, the third use case that I now see is:

  • User chooses to update their own code but relies on packages with earlier version dependencies - the article's suggestion is to use the declared version so the compiler can compile each package correctly. This is possible because of modules, if it is reasonable to expect all "legacy" packages to at least support modules by now. A go version below 1.18 in the go.mod of pulled-in packages would let the compiler use the "old" behavior (an illustrative go.mod is sketched below).
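
For illustration only, the idea is that a legacy dependency's go.mod already records a pre-1.18 language version, which the toolchain could use to select the "old" behavior (the module path is made up):

// go.mod of a legacy, pre-generics dependency
module example.com/legacy

go 1.17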

Are there any other missing use cases? As a language user, my bias would be to keep things simple so the language has clear expectations. Having that hidden, or forced into action by requiring something like a go fix run (potentially even on pulled-in packages), would be preferable in my view. What I think makes Go unique versus Java or others in its ability to do this is that, other than plugin, there is no real "binary" interface that I am aware of, as with Java's jars. The source is available to be recompiled (and must be, even for plugin), so I would favor whatever is simplest for users, so they don't need to navigate seemingly equivalent documentation for multiple versions of the core library. Just my .02. Very interesting problem, thanks for the insights!

How about an 'X' suffix?

0 replies

At the risk of adding one more proposal to this already lengthy discussion:

  1. Create "math/v2" with the new APIs under obvious names (func Min[T numeric](...)) and (where necessary) the old APIs under ungainly names but marked as deprecated.
  2. Create "math/v1" that consists of aliases and thin wrappers around "math/v2" types and functions, and exposes a consistent API with the current "math" package.
  3. In Go tooling, resolve import "math" to either "math/v1" or "math/v2" depending on the go language version in go.mod. Explicitly specifying is always allowed.
  4. go fix (or a similar command) should bump the go version in your go.mod and modify the import paths and/or usages in your source code so there is no behavior change. At its simplest, that means replacing all "math" imports with "math/v1", but a smarter migration tool could instead replace references with their new names (Min -> MinFloat64), use generics-aware specializations (Min -> Min[float64]), or even use type information to notice when inference rules would do the right thing anyway (Min -> Min).
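
To make steps 1 and 2 concrete, here is a minimal sketch assuming a hypothetical generic "math/v2"; the Number constraint, the import paths, and the omission of the real math.Min's NaN and signed-zero handling are all simplifications for illustration:

// A hypothetical excerpt of "math/v2".
package math // imported as "math/v2"

type Number interface {
	~int | ~int64 | ~float32 | ~float64
}

// Min returns the smaller of x and y (NaN handling omitted for brevity).
func Min[T Number](x, y T) T {
	if x < y {
		return x
	}
	return y
}

// The corresponding "math/v1" compatibility shim built on v2.
package math // imported as "math/v1"

import mathv2 "math/v2" // illustrative import path

// A thin forwarder preserves the old float64 signature; a one-line
// wrapper like this is typically inlined by the compiler.
func Min(x, y float64) float64 { return mathv2.Min(x, y) }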

This approach is intended to keep the standard library's migration as similar as possible to what public packages need to do; step 3 is the only new piece required for this to work.

The warts of this approach are also warts any open-source library author will face, and the (optional) additional changes below will help with community migrations as well. Each of these would need to go through its own proposal process.

  • Language change: Allow function aliases similar to type aliases. Let "math/v1" declare func Min = math.Min[float64] instead of needing to make a wrapper function. This also avoids performance concerns with the wrapper, and improves consistency between func, type, and const, simplifying the language.
  • Make godoc propagate documentation from aliased-to types and funcs, either automatically or with a special directive in the comment. This prevents needing to maintain both the old and the new doc comments in many cases.
  • Make go fix automatically upgrade any dependency to a higher version whenever all referenced symbols in the old version are simply aliases to symbols in a higher version.
  • Release a script that helps package authors make v1 "alias packages"
8 replies
@hherman1

Feels a bit magical for “math” to be special. Do we really want devs to have to learn this or think about it?

@ncruces

While I agree that function aliases have been unnecessary so far (function identity is not particularly relevant, and one-line wrappers work), @ianlancetaylor's proposal is different.

This proposal is not about redirecting users to new packages. For this, the current type aliases plus function wrappers would be sufficient.

This proposal, AFAICS, is about overloading names within the same package, which, as has been noted, is novel in Go. And if we are to overload names, I think aliasing is definitely clearer than wrapping.

@HALtheWise

I was using math as an example; the intent is that all standard library packages would use this process, with new major versions becoming the default in a new Go release.

@ChrisHines

As far as I know, forwarding functions shouldn't create a performance penalty. Since Go 1.12, functions that do nothing but call another function are inlined.

@Merovius

@ncruces This thread is not about @ianlancetaylor's suggestion. It's about @HALtheWise's suggestion. Which would be a case of redirecting to new packages.
