kbolino 8 days ago

I think both C and Go (the former is relevant due to Thompson's involvement in both and the massive influence it had on Go) are very "practical" languages, designed with strict goals in mind, and they delivered on those goals very well. They also couldn't have existed without battle-tested prior experience, including B for C and Limbo for Go.

I also think it's only from the perspective of a select few, plus some purists, that the authors of Go can be considered anything other than experts. That they made some mistakes, including some borne of hubris, doesn't really diminish their expertise to me.

thomashabets2 8 days ago

But my point is that articles like this show that if you don't keep up with the state of the art, you run the risk of making these predictable mistakes.

If Go had been designed in the 1980s, it would have been genius. But now we know better. Expertise is more than knowing the state of the art as of 30 years prior.

kbolino 7 days ago

I don't think the state of the art ca. 2007 was what it looks like from today's vantage point.

For one thing, Go took a number of forward-thinking stances (or ones which were, at least, somewhat unusual for its time and target audience), like UTF-8 strings (granted, Thompson and Pike created the encoding in the first place), fat pointers for strings and slices, green threads with CSP (though channels proved to be less useful than envisioned) and no function coloring, first-class functions, a batteries-included standard library, etc.
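To make the concurrency point concrete, here's a minimal sketch of the CSP style with goroutines and channels (the fetch function and its names are made up purely for illustration, not taken from any real codebase):

    package main

    import "fmt"

    // fetch is an ordinary, synchronous function; calling it from a
    // goroutine needs no special signature, i.e. no function coloring.
    func fetch(id int) string {
        return fmt.Sprintf("result %d", id)
    }

    func main() {
        results := make(chan string)

        // Green threads: goroutines are cheap and scheduled by the runtime.
        for i := 0; i < 3; i++ {
            go func(id int) {
                results <- fetch(id) // communicate over a channel (CSP style)
            }(i)
        }

        for i := 0; i < 3; i++ {
            fmt.Println(<-results)
        }
    }

The same synchronous function can be called directly or inside a goroutine, which is the whole "no coloring" argument in miniature.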

The only things I can think of which Go did that seemed blatantly wrong in that era would be the lack of generics and proper enums. Null safety, which IMO proved to be the killer development of that era, had not yet clearly taken shape in industry. Tony Hoare hadn't even given his famous "billion-dollar mistake" talk yet, though I'm sure some idea of the problem already existed (and he did give it in 2009, a couple of years into Go's development but also before its first public release). I know others find the type system lacking, but I don't think diving hard into types is obviously the best way to go for every language.
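For what "no proper enums" means in practice, a minimal sketch of the usual iota workaround (type and constant names invented for illustration):

    package main

    import "fmt"

    // Color is the usual stand-in for an enum: a named integer type
    // plus iota constants.
    type Color int

    const (
        Red Color = iota
        Green
        Blue
    )

    func name(c Color) string {
        switch c {
        case Red:
            return "red"
        case Green:
            return "green"
        case Blue:
            return "blue"
        }
        return "unknown" // the compiler never forces exhaustiveness
    }

    func main() {
        fmt.Println(name(Red))
        fmt.Println(name(Color(42))) // any int converts; nothing rejects it
    }

The compiler doesn't require the switch to be exhaustive and any integer converts into the type, which is exactly the gap a real enum would close.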

If one were to seriously investigate whether expertise was valued by Pike et al., I think it would start by looking at Erlang/OTP. In my opinion, that ecosystem offers the strongest competition against Go on Go's own strengths, and it predates Go by many years. Were the Go designers aware of it at all? Did they evaluate its approach, and if so, did they decide against it for considered reasons? Arguing against C++ was easy coming from their desired goals, but what about a stronger opponent?

9rx 7 days ago

> The only things I can think of which Go did that seemed blatantly wrong in that era would be the lack of generics and proper enums.

TypeScript added "improper" enums several years later, which is especially interesting as it is the one feature in TypeScript that doesn't map directly to JavaScript. I'm not sure even that one was settled science back then.

Generics were well established at that time, but that omission didn't escape the Go authors' notice. From day one, generics were explicitly called out as a feature Go should have, but one they hadn't figured out how to integrate. https://youtu.be/rKnDgT73v8s?t=3257 Despite Taylor's admirably insistent efforts to find a solution (with 8+ proposals to his name, dating back to before Go was released to the public!), it ultimately required convincing an outside expert to lend a hand.
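For reference, this is roughly the shape generics took when they finally shipped with type parameters in Go 1.18 (a minimal sketch; note the cmp.Ordered constraint used here only landed in the standard library later, in Go 1.21):

    package main

    import (
        "cmp"
        "fmt"
    )

    // Max is generic over any ordered type, constrained by cmp.Ordered.
    func Max[T cmp.Ordered](a, b T) T {
        if a > b {
            return a
        }
        return b
    }

    func main() {
        fmt.Println(Max(3, 7))      // T inferred as int
        fmt.Println(Max("go", "c")) // T inferred as string
    }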

> I know others find the type system lacking

It may be lacking by today's standards, where types are all the rage, but we should also remember that at the time types simply weren't cool. People had grown tired of "doing XML sit-ups" and had fully embraced dynamic languages in reaction. Go was built in that time, for that time, explicitly intending to be a language that felt like a dynamic language but with the performance advantages of a static one. https://youtu.be/rKnDgT73v8s?t=471