We're not voting yet -- we haven't even explained the issues yet :)
The issue of non-denotable types is where all the complexity (and opportunity to get it wrong) in this feature lives. Dan will soon post some examples that hopefully will illustrate why both "just don't infer them, make the user say what they mean" and "just infer them, they're types" -- as "simple" and consistent as both of these seem -- are both extreme (and/or naive) positions.
(FWIW, initially I was in the "just don't infer" camp too; the attraction of that is that every program with `var` corresponds to an equivalent program without `var`. But the number of cases where inference produces a capture or intersection type is surprisingly high, and it will absolutely be perceived as "that stupid Java compiler, can't they just tell that..." Additionally, users will perceive the "penalty" of inference failure as messing up how their code prettily lines up -- and will likely distort their code to avoid this aesthetic fail.)
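To make this concrete, here's a small sketch (my own illustration, not from Dan's upcoming examples) of two ordinary-looking initializers where inference lands on a non-denotable type -- an intersection type and an anonymous class type -- so there is no explicit type the user could "say what they mean" with:

```java
import java.util.List;

public class NonDenotableDemo {
    public static void main(String[] args) {
        // The element type inferred for this list is an intersection of
        // Serializable and Comparable -- a type no user can write down.
        // With an explicit type you'd have to settle for List<Object>,
        // which is *not* what inference produces.
        var mixed = List.of(1, "two");

        // The inferred type of 'point' is the anonymous subclass of Object,
        // so its fields are accessible -- again, a type with no name.
        var point = new Object() { int x = 3; int y = 4; };

        System.out.println(mixed.size());      // 2
        System.out.println(point.x + point.y); // 7
    }
}
```

Under "just don't infer them", both declarations would be errors even though each reads as perfectly reasonable code; under "just infer them", the second quietly leaks an anonymous type into the variable's static type.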
On 3/30/2017 2:24 PM, [email protected] wrote:
Having a var that uses a non-denotable type seems wrong to me; showing/hiding the type of a var should be a valid refactoring in any case, IMO. So I vote for (2).
