How to extend with type applications #5

Do you have any idea how to extend the system with type applications? I don't know how to extend the subtyping relation, because in general I don't know whether to compare the type applications covariantly or contravariantly. Is the only solution to fall back to unification (unifying foralls using alpha-equality)?

Comments
Do you mean type applications on the type level, as in applying a type constructor to a type argument (e.g. `Maybe Int`)?
It would be interesting to cook up some scheme to get more programs to typecheck, e.g. to treat arguments to instances of a class like `Functor` covariantly.
Yeah, that's what I thought. I wonder if you could look at how type variables appear in the ADT and tag the type variables with their variance (invariant, covariant, contravariant or bivariant). Then you would know how the subtyping rule works for that type. Higher-kinded type variables make this harder or impossible though, since you don't know the variance until the HKT variable is instantiated...
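A minimal sketch of that variance-tagging idea (hypothetical types and names; for simplicity it assumes applied constructors are covariant, except when the head is the tagged variable itself, where it is conservative because of the HKT problem above):

```haskell
data Variance = Bivariant | Covariant | Contravariant | Invariant
  deriving (Eq, Show)

-- Join in the variance lattice: Bivariant below Co-/Contravariant below Invariant.
combine :: Variance -> Variance -> Variance
combine Bivariant v = v
combine v Bivariant = v
combine v w | v == w = v
combine _ _ = Invariant

-- Flip polarity, e.g. to the left of a function arrow.
flipV :: Variance -> Variance
flipV Covariant = Contravariant
flipV Contravariant = Covariant
flipV v = v

data Ty = TVar String | TCon String | TFun Ty Ty | TApp Ty Ty

-- Variance of the parameter `a` within one field type of the ADT.
varianceIn :: String -> Ty -> Variance
varianceIn a (TVar b)
  | a == b    = Covariant
  | otherwise = Bivariant
varianceIn _ (TCon _) = Bivariant
varianceIn a (TFun dom cod) =
  combine (flipV (varianceIn a dom)) (varianceIn a cod)
varianceIn a (TApp (TVar b) _)
  | a == b = Invariant  -- `a` in head position: variance unknown until instantiation
varianceIn a (TApp f arg) =
  combine (varianceIn a f) (varianceIn a arg)
```

For instance, on a field of type `a -> a` this yields `combine Contravariant Covariant = Invariant`, so that parameter would have to be compared invariantly.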
Yeah, that might work. :) If you haven't already, it might be useful to look into how GHC handles this.
For higher-kinded types, e.g. a parameter `f :: * -> *` applied as `f a`, the variance of `a` indeed isn't known until `f` is instantiated.
Thanks, I will look into that.
I mean unification. Tangent: I quite like the approach of elaborating to a core language, e.g. System F_omega (like GHC does), during type-checking, because that'll guide the implementation towards something where the types work out. With elaboration, subtyping returns a coercion function that converts a term of the subtype into a term of the supertype. With elaboration, unification doesn't return a coercion function, but has the side effect of instantiating unification variables to make the two types equal. For me this makes it clearer when subtyping applies and when you have to fall back to unification, for instance.
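To make the contrast concrete, here is a sketch of the two signatures an elaborator might use (`TcM`, `CoreTerm` and `Ty` are assumed placeholders, not anything from the repo):

```haskell
-- Hypothetical core-language terms and types.
data CoreTerm
data Ty

-- An assumed type-checking monad with mutable unification variables.
type TcM a = IO a

-- Subtyping elaborates to evidence: a core-level coercion function
-- from the subtype to the supertype.
subtype :: Ty -> Ty -> TcM (CoreTerm -> CoreTerm)
subtype = undefined

-- Unification produces no term-level evidence; its only effect is
-- solving unification variables so that the two types become equal.
unify :: Ty -> Ty -> TcM ()
unify = undefined
```

The result types force the implementer to decide, at each call site, whether a coercion is available or plain equality is required.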
Thanks! I hadn't thought of it like that; that does make it clearer for me as well :)
Just dumping some thoughts here, since this is the only real discussion of this question that I can find. What about sidestepping the variance issue entirely by only allowing type constructors that are covariant in all arguments? This would mean that the function arrow would not be considered a type constructor, like it is in Haskell.
With this restriction in place, is there a natural and easy extension of the system described in the paper? It would be nice to have even this important subset of type constructors without having to jump through all the extra hoops that the follow-up paper, "Sound and Complete Bidirectional Typechecking for Higher-Rank Polymorphism with Existentials and Indexed Types", jumps through. What would be needed? I think that Figure 2 of the original paper would need kinds. Maybe polymorphic kinds can be left out. It would be necessary to add kinds and type applications.
I'm not sure if that should be added to types or to monotypes. A new subtyping rule (in Figure 3) is needed:
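Presumably something along these lines, where every type constructor application is covariant (a sketch in the paper's algorithmic style, with Δ the output context):

$$\frac{\Gamma \vdash A \le B \dashv \Delta}{\Gamma \vdash T\,A \le T\,B \dashv \Delta}$$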
Maybe Figure 4 (the declarative typing system) doesn't change at all? I'm stuck now. I need to read more, and I might come back to this thought later.
I think Agda only allows covariant type constructors. In Haskell, though, non-covariant datatypes are used (for example `Fix`), so it might be very restrictive. I'm pretty sure just adding that new subtyping rule you showed is enough.

Personally, I don't like the use of subsumption from "Sound and Complete..." anymore. Haskell is also moving away from it (with Quick Look polymorphism) and making function arrows invariant (instead of using "deep skolemization"). This makes type inference simpler, and I think it also makes inference of impredicative polymorphism easier. I prefer to have some support for impredicative polymorphism instead of the subtyping from "Sound and Complete...". In the Quick Look paper they also analyzed the impact of moving to invariant arrows, and it turns out that only about 3% of the 2332 packages examined had any issues, most of them solved by some eta-expansion; see the sketch below.

Anyway, sorry for going off-topic. I just think that subsumption may not be worth the trouble. Sticking to just unification probably works just fine.
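The kind of breakage and its eta-expansion fix look roughly like this (a sketch of GHC 9.0's "simplified subsumption" change; the names are made up):

```haskell
{-# LANGUAGE RankNTypes #-}

f :: forall a b. a -> b -> b
f _ y = y

g :: (forall p. p -> forall q. q -> q) -> Int
g k = k (0 :: Int) (1 :: Int)

-- With deep skolemization (GHC < 9.0) this was accepted:
--   bad = g f
-- With invariant arrows / simplified subsumption it is rejected,
-- because (forall a b. a -> b -> b) no longer subsumes
-- (forall p. p -> forall q. q -> q). Eta-expansion recovers it:
good :: Int
good = g (\x -> f x)
```

The lambda moves the check inside the arrow, so the inner `forall q` is skolemized where it occurs instead of being floated out.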
I don't think that just the subtyping rule is enough, because I didn't add anything that actually performs kind checking. Somewhere, there has to be something that checks, when the application `T A` is formed, that the kind of `T` actually accepts an argument of `A`'s kind. Concerning the variance of function arrows, even Dunfield and Krishnaswami kind of gave up on the clever treatment of arrows in the next paper.
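Concretely, the missing check is ordinary kinding of type applications. A minimal self-contained sketch (hypothetical syntax and names):

```haskell
-- Hypothetical kinds and types.
data Kind = Star | KArr Kind Kind
  deriving Eq

data Ty
  = TVar String
  | TCon String
  | TFun Ty Ty
  | TApp Ty Ty
  | TForall String Kind Ty

type KindEnv = [(String, Kind)]  -- kinds of variables and constructors

inferKind :: KindEnv -> Ty -> Either String Kind
inferKind env (TVar v) = maybe (Left ("unbound: " ++ v)) Right (lookup v env)
inferKind env (TCon c) = maybe (Left ("unknown: " ++ c)) Right (lookup c env)
inferKind env (TFun a b) = do
  ka <- inferKind env a
  kb <- inferKind env b
  if ka == Star && kb == Star
    then Right Star
    else Left "the function arrow expects arguments of kind *"
inferKind env (TForall v k t) = inferKind ((v, k) : env) t
inferKind env (TApp f a) = do
  kf <- inferKind env f
  ka <- inferKind env a
  case kf of
    KArr kArg kRes | kArg == ka -> Right kRes
    _ -> Left "kind mismatch in type application"
```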
Are there other papers you would recommend about bidirectional checking for type-scheme-flavored System F using unification? I've done type checking of vanilla System F before, but not type inference (where it seems like everyone uses type schemes instead of type assignment).
Oh sorry, yes, you are very right. You need to perform kind checking. There already is a well-formedness check for types in the system, so that check could also do kind checking. So, for example, in the annotation typing rule:
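The rule meant here is presumably the paper's Anno rule, with its well-formedness premise Γ ⊢ A upgraded to a kinding premise, something like:

$$\frac{\Gamma \vdash A : \star \qquad \Gamma \vdash e \Leftarrow A}{\Gamma \vdash (e : A) \Rightarrow A}$$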
You can assume any types in the context are well-kinded, and checking against a type is also fine. Only in synthesis rules where the user gave a type do you have to check for well-formedness and well-kindedness.

I think if you use "Sound and Complete..." and swap subsumption for unification, you'll have a nice algorithm. Unification is a bit simpler: you just never instantiate foralls, only skolemize. An added bonus is that the function arrow can just be another type constructor (although for efficiency reasons you might not want to do that). I'm not really aware of papers on this, sorry.

One way to simplify the algorithm of "Sound and Complete..." (implementation-wise) is to have meta-variables carry around a list of the type variables that are in scope. Then you are only allowed to unify a meta-variable with a type (skolem) variable if that variable appears in the list of the meta-variable. Now you don't need to use the context anymore to correctly scope the meta and type variables, though I'm not sure if this causes other issues. This is similar to how it is done for higher-order unification in dependently typed languages. Then the context can become immutable, and you just store the meta-variables in a mutable map.
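For example, the store might look like this (a sketch; all names hypothetical, and the occurs check is omitted for brevity):

```haskell
import Data.IORef
import qualified Data.Map as Map

newtype TyVar = TyVar String deriving (Eq, Ord, Show)
type MetaId = Int

data Ty
  = VarTy TyVar          -- skolem / rigid type variable
  | MetaTy MetaId        -- unification (meta) variable
  | FunTy Ty Ty
  | ForallTy TyVar Ty

-- Each meta-variable remembers which skolems were in scope when it
-- was created; the typing context itself can stay immutable.
data Meta = Meta
  { metaScope    :: [TyVar]
  , metaSolution :: Maybe Ty
  }

type MetaStore = IORef (Map.Map MetaId Meta)

-- Solving a meta is legal only if every skolem free in the solution
-- was in scope at the meta's creation site.
solveMeta :: MetaStore -> MetaId -> Ty -> IO Bool
solveMeta store m ty = do
  metas <- readIORef store
  case Map.lookup m metas of
    Nothing -> pure False
    Just meta
      | all (`elem` metaScope meta) (freeSkolems ty) -> do
          writeIORef store
            (Map.insert m meta { metaSolution = Just ty } metas)
          pure True
      | otherwise -> pure False
  where
    freeSkolems (VarTy v)      = [v]
    freeSkolems (MetaTy _)     = []   -- unsolved or handled elsewhere
    freeSkolems (FunTy a b)    = freeSkolems a ++ freeSkolems b
    freeSkolems (ForallTy v t) = filter (/= v) (freeSkolems t)
```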
That's just an idea I had to make the implementation of type inference for System F simpler. I kind of moved on to dependently typed languages now. Anyway, I'm rambling a bit now; feel free to ask me whatever, although I'm not an expert on the matter.
This is very much a non sequitur, but I started thinking about this again yesterday, and started wondering about how the covariant-type-constructors-only world implied by a rule like the application rule above would interact with mutable references.

Normally, in systems with subtyping, mutable references have to treat their parameter as invariant. For example, with some kind of refinement typing, if you had a reference at a refined type, say `Ref {v : Int | v >= 0}`, you cannot use it where a plain `Ref Int` is expected: through the weaker view a negative number could be written into the cell, and a later read at the refined type would wrongly trust the refinement.

A covariant rule for references therefore seems like it would imply unsoundness, since the same reference could be used as both an integral and a character reference (begetting ill-typed values at runtime). Anyway, the question here is: is it sound to treat the type argument of `Ref` covariantly when the subtype relation is "more polymorphic than", as in this system?
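A sketch of the danger in Haskell terms (a hypothetical scenario: the unsound steps are left as comments, since GHC rightly rejects them; the part that compiles needs ImpredicativeTypes, i.e. GHC >= 9.2):

```haskell
{-# LANGUAGE RankNTypes, ImpredicativeTypes #-}
import Data.IORef

-- If IORef were covariant in its argument, with A <: B meaning
-- "A is at least as polymorphic as B", both of these would hold:
--   IORef (forall a. a -> a) <: IORef (Int  -> Int)
--   IORef (forall a. a -> a) <: IORef (Char -> Char)
demo :: IO ()
demo = do
  r <- newIORef (id :: forall a. a -> a)
  -- Viewed as IORef (Int -> Int), we could store an Int-only function:
  --   writeIORef r (+ 1)
  -- Viewed as IORef (Char -> Char), we could read it back and apply it:
  --   f <- readIORef r
  --   print (f 'x')   -- (+ 1) applied to a Char: unsound
  pure ()
```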
I'm writing this down mostly for my own sake, so that I don't forget it. From an earlier comment:

> One way to simplify the algorithm of "Sound and Complete..." (implementation-wise) is to have meta-variables carry around a list of the type variables that are in scope.
This is clever, although it doesn't totally work, and only after trying something similar do I now fully appreciate it. The situations where it works are things like solving a meta-variable directly with a type built from variables that are already in scope.
The paper doesn't really list any restrictions on the relationship between the input context Γ and the output context Δ. But what is the relationship? The big thing is that unsolved existentials may have been solved. But what about its general shape? Should that be the same or not? The only shape changes could come from articulation (solving an existential with an arrow between two fresh existentials). One way to remedy this is to make functions not first class: if you cannot instantiate a meta-variable with a function type, then there is no articulation, and the shape of the context is preserved.
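For reference, articulation in "Complete and Easy..." (the InstLArr/InstRArr rules) is the step that replaces an unsolved existential in the context by two fresh existentials plus a solution, which is what changes the context's shape:

$$\Gamma[\hat{\alpha}] \;\longrightarrow\; \Gamma[\hat{\alpha}_2, \hat{\alpha}_1, \hat{\alpha} = \hat{\alpha}_1 \to \hat{\alpha}_2]$$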