The Δ-framework

Abstract

We introduce the Δ-framework, LF_Δ, a dependent type theory based on the Edinburgh Logical Framework LF, extended with the strong proof-functional connectives: strong intersection, minimal relevant implication, and strong union. Strong proof-functional connectives take into account the shape of logical proofs, thus reflecting polymorphic features of proofs in formulæ. This is in contrast to classical or intuitionistic connectives, where the meaning of a compound formula depends only on the truth value or the provability of its subformulæ. Our framework encompasses a wide range of type disciplines. Moreover, since relevant implication makes it possible to express subtyping, LF_Δ also subsumes Pfenning’s refinement types. We discuss the design decisions which have led us to the formulation of LF_Δ, study its metatheory, and provide various examples of applications. Our strong proof-functional type theory can be plugged into existing common proof assistants.

University of Udine · Université Côte d’Azur, INRIA

1 Introduction

This paper provides a unifying framework for two hitherto unreconciled understandings of types: types-as-predicates à la Curry and types-as-propositions (sets) à la Church. The key to our unification consists in introducing strong proof-functional connectives [40, 3, 4] in a dependent type theory such as the Edinburgh Logical Framework (LF) [22]. Both Logical Frameworks and Proof-Functional Logics consider proofs as first-class citizens, albeit differently. Strong proof-functional connectives take the shape of logical proofs seriously into account, thus allowing polymorphic features of proofs to be made explicit in formulæ. They provide a finer semantics than classical/intuitionistic connectives, where the meaning of a compound formula depends only on the truth value or the provability of its subformulæ. However, existing approaches to strong proof-functional connectives are all quite idiosyncratic in mentioning proofs. Logical Frameworks, on the other hand, provide a uniform approach to proof terms in object logics, but they have not fully capitalized on subtyping.

This situation calls for a natural combination of the two understandings of types, which should benefit both worlds. On the side of Logical Frameworks, the expressive power of the metalanguage would be enhanced thus allowing for shallower encodings of logics, a more principled use of subtypes [37], and new possibilities for formal reasoning in existing interactive theorem provers. On the side of type disciplines for programming languages, a principled framework for proofs would be provided, thus supporting a uniform approach to “proof reuse” practices based on type theory [38, 12, 20, 9, 6].

Therefore, in this paper, we extend LF with the connectives of strong intersection (∩), strong union (∪), and minimal relevant implication (→_r) of Proof-Functional Logics [40, 3, 4]. We call this extension the Δ-framework (LF_Δ), since it builds on the Δ-calculus introduced in [31, 17]. Moreover, we illustrate by way of examples that LF_Δ subsumes many expressive type disciplines in the literature [37, 3, 4, 38, 12].

It is not immediate to extend the judgments-as-types Curry-Howard paradigm to logics supporting strong proof-functional connectives, since these connectives need to compare the shapes of derivations and do not just take into account the provability of propositions, i.e. the inhabitation of the corresponding type. In order to capture successfully strong logical connectives such as ∩ or ∪, we need to be able to express rules of the following shape:

\[
\frac{\Gamma \vdash \mathcal{D}_1 : A \qquad \Gamma \vdash \mathcal{D}_2 : B \qquad \mathcal{D}_1 \equiv \mathcal{D}_2}{\Gamma \vdash A \cap B}\;(\cap I)
\qquad
\frac{\Gamma \vdash \mathcal{D}_1 : A \supset C \qquad \Gamma \vdash \mathcal{D}_2 : B \supset C \qquad \Gamma \vdash A \cup B \qquad \mathcal{D}_1 \equiv \mathcal{D}_2}{\Gamma \vdash C}\;(\cup E)
\]

where ≡ is a suitable equivalence between logical proofs. Notice that the above rules immediately suggest intriguing applications in polymorphic constructions, i.e. the same evidence can be used as a proof for different statements. Pottinger [40] was the first to study the strong connective ∩. He contrasted it to the intuitionistic connective ∧ as follows: “The intuitive meaning of ∩ can be explained by saying that to assert A ∩ B is to assert that one has a reason for asserting A which is also a reason for asserting B … (while) … to assert A ∧ B is to assert that one has a pair of reasons, the first of which is a reason for asserting A and the second of which is a reason for asserting B.” A logical theorem involving intuitionistic conjunction which does not hold for strong conjunction is (A ⊃ A) ∧ (A ⊃ A ⊃ A), since otherwise there should exist a closed λ-term having simultaneously both one and two abstractions. Lopez-Escobar [32] and Mints [35] investigated extensively logics featuring both strong and intuitionistic connectives, especially in the context of realizability interpretations.

Dually, it is in the ∪-elimination rule that proof equality needs to be checked. Following Pottinger, we could say that to assert (A ∪ B) ⊃ C is to assert that one has a reason for (A ∪ B) ⊃ C which is also a reason to assert A ⊃ C and B ⊃ C. The two connectives ∨ and ∪ differ, since the intuitionistic theorem (A ∨ (B ⊃ A)) ⊃ (B ⊃ A) is not derivable for ∪: otherwise there would exist a term which behaves both as I and as K.
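To see the mismatch concretely (our reconstruction of the standard argument), a proof by cases of (A ∨ (B ⊃ A)) ⊃ (B ⊃ A) uses different proofs in its two branches:

\[
x : A \;\vdash\; \lambda y.\,x \;:\; B \supset A
\qquad\qquad
x : B \supset A \;\vdash\; x \;:\; B \supset A
\]

so the branch proofs are the combinators K = λx.λy.x and I = λx.x. The strong rule (∪E) additionally requires the two branch proofs to have the same shape, i.e. λx.λy.x ≡ λx.x, which fails; hence the formula is provable with ∨ but not with ∪.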

Following [4], strong (or minimal relevant) implication, →_r, can be viewed as a special case of implication whose related function space is the simplest possible one, namely the one containing only the identity function. The operators →_r and → differ, since A →_r (B →_r A) is not derivable. Relevant implication allows for a natural introduction of subtyping, in that A →_r B morally means A ≤ B. Relevant implication amounts to a notion of “proof-reuse”. Combining the remarks in [4, 3], minimal relevant implication, strong intersection and strong union correspond, respectively, to the implication, conjunction and disjunction operators of Meyer and Routley’s Minimal Relevant Logic [34].
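For instance (a standard instance of this reading; the notation for projections and relevant abstraction is ours, introduced in Section 2): the inclusion σ ∩ τ ≤ σ is witnessed by a term of relevant type whose essence is the identity,

\[
\lambda^r x{:}(\sigma \cap \tau).\; \mathrm{pr}_1\, x \;:\; (\sigma \cap \tau) \to_r \sigma
\qquad\text{with essence}\qquad
\lambda x.\,x,
\]

since projections are erased when computing the essence of a proof.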

Figure 1: The type assignment system of [3] and its subtype theory

Strong connectives arise naturally in investigating the propositions-as-types analogy for intersection and union type assignment systems. Intersection types were introduced by Coppo, Dezani et al. in the late 70’s [13, 15, 16, 5] to support a form of ad hoc polymorphism for untyped λ-calculi à la Curry. Intersection types were used originally as an (undecidable) type assignment system for pure λ-calculi, i.e. for finitary descriptions of denotational semantics [14]. This line of research was later explored by Abramsky [1] in a full-fledged Stone duality. Union types were introduced semantically, by MacQueen, Plotkin, and Sethi [33, 3]. In [3], strong intersection, union and subtyping were thoroughly studied in the context of type-assignment systems, see Figure 1. A classical example of the expressiveness of union types is due to Pierce [38]:

\[
\begin{array}{l}
\mathsf{Test} \;\stackrel{\mathrm{def}}{=}\; \textsf{if } b \textsf{ then } 1 \textsf{ else } {-}1 \;:\; \mathit{Pos} \cup \mathit{Neg}\\[2pt]
\mathsf{Is\_0} \;:\; (\mathit{Neg} \to \mathit{F}) \cap (\mathit{Zero} \to \mathit{T}) \cap (\mathit{Pos} \to \mathit{F})
\end{array}
\]

without union types, the best information we can get for Is_0 Test is a boolean type, whereas union types allow us to derive Is_0 Test : F. Designing a λ-calculus à la Church with intersection and union types is problematic. The usual approach of simply adding types to binders does not work, as shown in Figure 2.

Figure 2: Polymorphic identity

The same difficulties arise with union types. Intersection and union type disciplines started to be investigated in explicitly typed programming language settings à la Church much later, by Reynolds and Pierce [41, 38], Wells et al. [48, 49], Liquori et al. [29, 18], Frisch et al. [21] and Dunfield [19]. From a logical point of view, there are many proposals for a suitable logic to fit intersection: among them we cite [37, 47, 42, 36, 11, 10, 39], previous papers by the authors [17, 30], and a type checker implementation [45]. In [17], two of the present authors proposed the Δ-calculus as a typed λ-calculus à la Church corresponding to the type assignment system à la Curry with intersection and union, but without the universal type ω. The relation between Church-style and Curry-style λ-calculi was expressed using an essence function, which intuitively erases all the type information in terms (the full definition is shown in Figure 4).

LF_Δ, introduced in this paper, extends [31] with union types, dependent types and minimal relevant implication. The novelty of LF_Δ, in the context of Logical Frameworks, lies in the full-fledged use of strong proof-functional connectives, which to our knowledge has never been explored before. Clearly, all Δ-terms have a computational counterpart.

Pfenning’s work on Refinement Types [37] pioneered an extension of the Edinburgh Logical Framework with subtyping and intersection types. His approach capitalises on a tame and essentially ad hoc notion of subtyping, and the logical strength of that system does not go beyond LF (i.e. simple types). The logical power of LF_Δ makes it possible to type all strongly normalizing terms. Furthermore, subtyping in LF_Δ arises naturally as a derived notion from the more fundamental concept of minimal relevant implication, as illustrated in Section 2.

Miquel [36] discusses an extension of the Calculus of Constructions with implicit typing, which subsumes a kind of proof-functional intersection. His approach has motivations opposite to ours. While LF_Δ provides a Church-style version of Curry-style type assignment systems, Miquel’s Implicit Calculus of Constructions encompasses some features of Curry-style systems in an otherwise Church-style Calculus of Constructions. In LF_Δ we can also discuss ad hoc polymorphism, while in the Implicit Calculus only structural polymorphism can be encoded: indeed, the identity cannot be assigned an intersection of types that are not instances of a common type scheme [28]. Kopylov [27] adds a dependent intersection type constructor to NuPRL, allowing the resulting system to support dependent records (a very useful data structure for encoding mathematics). The implicit product-type of Miquel, together with the dependent intersection type of Kopylov and a suitable equality-type, is used by Stump [46] to enrich an impredicative second-order system, in order to derive induction.

In order to achieve our goals, we could simply have carried out an encoding of LF_Δ in LF. But, due to the side-conditions characterizing proof-functional connectives, this could have been achieved only through a deep encoding. As an example, in Figure 8 we give an encoding of a subsystem of [3], where subtyping is simulated using relevant arrows. This encoding illustrates the expressive power of LF in treating proofs as first-class citizens, and it was also a source of inspiration for LF_Δ.

All the examples discussed in this paper have been checked by an experimental proof development environment for LF_Δ [45] (see Bull and Bull-Subtyping in [44]).

Synopsis. In Section 2, we introduce LF_Δ and outline its metatheory, together with a discussion of the main design decisions. In Section 3, we provide the motivating examples. In Section 4, we outline the details of the implementation and future work.

Figure 3: The syntax of the Δ-framework

2 The Δ-framework: LF with proof-functional operators

The syntax of LF_Δ pseudo-terms is given in Figure 3. For the sake of simplicity, we suppose that α-convertible terms are equal. Signatures and contexts are defined as finite sequences of declarations, as in LF. Observe that we could formulate LF_Δ in the style of [23], using only canonical forms and without reductions, but we prefer to use the standard LF format to support better intuition. There are three proof-functional objects, namely strong conjunction (typed with ∩) with two corresponding projections, strong disjunction (typed with ∪) with two corresponding injections, and strong (or relevant) λ-abstraction (typed with →_r). Indeed, a relevant implication is not a dependent one, because the essence of the inhabitants of type σ →_r τ is essentially the identity function, as enforced by the typing rules. Note that injections need to be decorated with the injected type in order to ensure the unicity of typing.
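As an aid to reading what follows, here is a sketch of the shape of the object syntax one expects in Figure 3; the concrete notation for pairs, projections, injections and co-pairs is ours, and the paper’s may differ:

\[
\Delta \;::=\; c \;\mid\; x \;\mid\; \lambda x{:}\sigma.\,\Delta \;\mid\; \Delta_1\,\Delta_2 \;\mid\; \langle \Delta_1, \Delta_2 \rangle \;\mid\; \mathrm{pr}_1\,\Delta \;\mid\; \mathrm{pr}_2\,\Delta \;\mid\; \mathrm{in}_1^{\sigma}\,\Delta \;\mid\; \mathrm{in}_2^{\sigma}\,\Delta \;\mid\; [\Delta_1, \Delta_2] \;\mid\; \lambda^r x{:}\sigma.\,\Delta \;\mid\; \Delta_1 \cdot_r \Delta_2
\]

Here ⟨Δ₁, Δ₂⟩ is a strong pair, [Δ₁, Δ₂] a strong co-pair, and λ^r and ·_r denote relevant abstraction and application.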

Figure 4: The essence function

We need to generalize the notion of essence, introduced in [17, 30], to syntactically connect pure λ-terms (denoted by M) and type-annotated LF_Δ terms (denoted by Δ). The essence function compositionally erases all type annotations, see Figure 4.

One could argue that the choice of the first component in the definition of the essence of strong pairs/co-pairs is arbitrary, and that the second component could have been chosen instead: however, the typing rules will ensure that, if a strong pair (resp. co-pair) is typable, then the essences of its two components coincide. Thus, strong pairs/co-pairs are constrained. The rule for the essence of a relevant application is justified by the fact that the →_r operator amounts to just a type decoration.
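Writing ⌊Δ⌋ for the essence of Δ (our notation), the clauses one expects in Figure 4, under the conventions above, are along these lines:

\[
\begin{array}{llll}
\lfloor x \rfloor = x &
\lfloor \lambda x{:}\sigma.\Delta \rfloor = \lambda x.\lfloor \Delta \rfloor &
\lfloor \Delta_1\,\Delta_2 \rfloor = \lfloor \Delta_1 \rfloor\,\lfloor \Delta_2 \rfloor &
\lfloor \langle \Delta_1, \Delta_2 \rangle \rfloor = \lfloor \Delta_1 \rfloor \\[2pt]
\lfloor \mathrm{pr}_i\,\Delta \rfloor = \lfloor \Delta \rfloor &
\lfloor \mathrm{in}_i^{\sigma}\,\Delta \rfloor = \lfloor \Delta \rfloor &
\lfloor [\Delta_1, \Delta_2] \rfloor = \lfloor \Delta_1 \rfloor &
\lfloor \Delta_1 \cdot_r \Delta_2 \rfloor = \lfloor \Delta_2 \rfloor
\end{array}
\]

In particular, projections, injections and relevant applications are transparent to the essence, which is what makes the same untyped term a proof of several types.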

Figure 5: The reduction semantics

The six basic reductions for LF_Δ objects appear on the left in Figure 5. Congruence rules are as usual, except for the two cases dealing with pairs and co-pairs, which appear on the right of Figure 5: there, redexes need to be reduced “in parallel” in order to preserve the identity of essences in the components. We denote by →_Δ the compatible closure of the reduction induced by the first six rules on the left in Figure 5, with the addition of the last two congruence rules in the same figure, and by =_Δ its symmetric, reflexive, and transitive closure. In order to make this definition truly functional, as well as to be able to prove a simple subject reduction result, we need to constrain pairs and co-pairs, i.e. objects of the form ⟨Δ₁, Δ₂⟩ and [Δ₁, Δ₂], to have congruent components up to erasure of type annotations. This is achieved by imposing ⌊Δ₁⌋ ≡ ⌊Δ₂⌋ in both constructs. We will therefore assume that such pairs and co-pairs are simply not well-defined terms if the components have a different “infrastructure”. The effects of this choice are reflected in the congruence rules of the reduction relation, which ensure that reductions can only be carried out in parallel along the two components.
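A sketch of the two parallel congruence rules (our reconstruction of the right-hand side of Figure 5, in the notation above):

\[
\frac{\Delta_1 \to_\Delta \Delta_1' \qquad \Delta_2 \to_\Delta \Delta_2'}{\langle \Delta_1, \Delta_2 \rangle \to_\Delta \langle \Delta_1', \Delta_2' \rangle}
\qquad\qquad
\frac{\Delta_1 \to_\Delta \Delta_1' \qquad \Delta_2 \to_\Delta \Delta_2'}{[\Delta_1, \Delta_2] \to_\Delta [\Delta_1', \Delta_2']}
\]

Reducing only one component would, in general, break the invariant ⌊Δ₁⌋ ≡ ⌊Δ₂⌋ and produce an ill-formed pair.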

The restriction on reductions in pairs/co-pairs and the new constructs do not cause any problems in showing that →_Δ is locally confluent:

Theorem (Local confluence). The reduction relation →_Δ on well-formed LF_Δ-terms is locally confluent.

The extended type theory LF_Δ is a formal system for deriving judgements of the following forms:

\[
\vdash \Sigma \;\;\text{(valid signature)} \qquad
\vdash_\Sigma \Gamma \;\;\text{(valid context)} \qquad
\Gamma \vdash_\Sigma K \;\;\text{(valid kind)} \qquad
\Gamma \vdash_\Sigma \sigma : K \;\;\text{(valid family)} \qquad
\Gamma \vdash_\Sigma \Delta : \sigma \;\;\text{(valid object)}
\]

Figure 6: The type rules for valid objects

The set of rules for object formation is defined in Figure 6, while the sets of rules for signatures, contexts, kinds and families are defined as usual in the Appendix: all typing rules are syntax-directed. Note that proof-functionality is enforced by the essence side-conditions in the rules for strong pairs (∩), co-pairs (∪), and relevant abstraction (→_r). In the conversion rule we rely on the external notion of equality =_Δ. An option could have been to add an internal notion of equality directly in the type system, and to prove that the external and the internal definitions of equality are equivalent, as was proved for semi-full Pure Type Systems [43]. Yet another possibility could be to compare type essences, for a suitable extension of essence to types and kinds. Unfortunately, this would lead to undecidability of type checking in connection with relevant implication: introducing constants whose types are relevant implications makes typable Δ-terms whose essences are arbitrary untyped λ-terms, as the following observation shows.

Since the intended meaning of relevant implication is “essentially” the identity, introducing variables or constants whose type is a relevant implication amounts to assuming axioms corresponding to type inclusions, such as those that equate σ and σ → σ. As a consequence, β-equality of essences becomes undecidable. Thus, we rule out such options in relating relevant implications in LF_Δ to subtypes in the type assignment system of [3].
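To see the problem concretely, consider a hypothetical signature with two constants c₁ : σ →_r (σ → σ) and c₂ : (σ → σ) →_r σ (our instance; the paper’s original example may differ). Since relevant applications are transparent to the essence, the Δ-term

\[
\lambda x{:}\sigma.\,(c_1 \cdot_r x)\,x
\qquad\text{has essence}\qquad
\lambda x.\,x\,x,
\]

and, composing with c₂, every self-application becomes the essence of a typable term. Deciding β-equality of essences would thus amount to deciding β-equality of arbitrary untyped λ-terms.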

2.1 Relating LF_Δ to the type assignment system of [3]

We compare and contrast certain design decisions of LF_Δ with those of the type assignment system of [3]. The proof of strong normalization for LF_Δ will rely, in fact, on a forgetful mapping from LF_Δ to that system. As pointed out in [3], the elimination rule for union types breaks subject reduction for one-step β-reduction, but this can be recovered using a suitable parallel β-reduction. The well-known counter-example for one-step reduction, due to Pierce, is

\[
x\,(I\,y)\,(I\,y) \;\to_\beta\; \left\{\begin{array}{l} x\,y\,(I\,y) \\[2pt] x\,(I\,y)\,y \end{array}\right\} \;\to_\beta\; x\,y\,y
\]

where I is the identity. In the typing context x : (σ₁ → σ₁ → τ) ∩ (σ₂ → σ₂ → τ), y : σ₁ ∪ σ₂, the first and the last terms can be typed with τ, while the terms in the fork cannot. The reason is that the subject in the conclusion of the union-elimination rule uses a context which can have more than one hole, as in the present case. In LF_Δ, the formulation of the elimination rule takes a different route which does not trigger the counterexample. Indeed, we have explicit injection and co-pair constructs, which allow us to reduce the term only if we know that the argument, stripped of the introduction construct, has one of the types of the disjunction. Pierce’s critical term can be expressed and typed in LF_Δ (the full derivation is in the Appendix); in its LF_Δ rendering there is only one redex, whose reduction leads directly to the final term: no other intermediate (untypable) λ-terms are possible.

The following result will be useful in the following section.

Theorem 2.1. The system of [3] without the universal type ω gives types only to strongly normalizing terms.

A proof is embedded in Theorem 4.8 of [3]. It can also be obtained using the general computability method presented in [25], Section 4, by interpreting intersection and union types precisely as intersections and unions in the lattice of computability sets.

2.2 LF_Δ metatheory

LF_Δ can play the role of a Logical Framework only if it is decidable. Due to the lack of space, we list here only the main results; the complete list appears in the Appendix. The first important step states that if a Δ-term is typable, then its type is unique up to =_Δ.

Theorem (Unicity of types and kinds).

  1. If Γ ⊢_Σ Δ : σ and Γ ⊢_Σ Δ : τ, then σ =_Δ τ.

  2. If Γ ⊢_Σ σ : K and Γ ⊢_Σ σ : K′, then K =_Δ K′.

Strong normalization is proved as in LF. First, we encode LF_Δ-terms into terms of the type assignment system of [3], in such a way that redexes in the source language correspond to redexes in the target language, and we use Theorem 2.1. Then, we introduce two forgetful mappings, defined in Figure 11 of the Appendix, to erase dependencies in types and to drop proof-functional constructors in Δ-terms, and we conclude. Special care is needed in dealing with redexes occurring in type-dependencies, because these need to be flattened at the level of terms.

Theorem (Strong normalization).

  1. LF_Δ is strongly normalizing, i.e.:

    1. If Γ ⊢_Σ K, then K is strongly normalizing.

    2. If Γ ⊢_Σ σ : K, then σ is strongly normalizing.

    3. If Γ ⊢_Σ Δ : σ, then Δ is strongly normalizing.

  2. Every strongly normalizing pure λ-term can be annotated so as to be the essence of a Δ-term.

Local confluence and strong normalization entail confluence, so we have (writing →*_Δ for the reflexive and transitive closure of →_Δ):

Theorem (Confluence). LF_Δ is confluent, i.e.:

  1. If K →*_Δ K₁ and K →*_Δ K₂, then there exists K₃ such that K₁ →*_Δ K₃ and K₂ →*_Δ K₃.

  2. If σ →*_Δ σ₁ and σ →*_Δ σ₂, then there exists σ₃ such that σ₁ →*_Δ σ₃ and σ₂ →*_Δ σ₃.

  3. If Δ →*_Δ Δ₁ and Δ →*_Δ Δ₂, then there exists Δ₃ such that Δ₁ →*_Δ Δ₃ and Δ₂ →*_Δ Δ₃.

Then, we have subject reduction, whose proof relies on technical lemmas about inversion and subderivation properties (see Appendix).

Theorem (Subject reduction of LF_Δ).

  1. If Γ ⊢_Σ K and K →_Δ K′, then Γ ⊢_Σ K′.

  2. If Γ ⊢_Σ σ : K and σ →_Δ σ′, then Γ ⊢_Σ σ′ : K.

  3. If Γ ⊢_Σ Δ : σ and Δ →_Δ Δ′, then Γ ⊢_Σ Δ′ : σ.

Finally, we define a possible algorithm for checking judgements in LF_Δ by computing a type or a kind for a term, and then testing for definitional equality, i.e. =_Δ, against the given type or kind. This is achieved by reducing both to their unique normal forms and checking that they are identical up to α-conversion. Therefore we finally have:

Theorem (Decidability). All the type judgments of LF_Δ are recursively decidable.

Minimal Relevant Implications and Type Inclusion. Type inclusion and the rules of subtyping are related to the notion of minimal relevant implication, see [4, 17]. The insight is quite subtle, but ultimately very simple, and this is what makes it appealing. The apparently intricate rules of subtyping and type inclusion, which occur in many systems and might even appear ad hoc at times, can all be explained away in our principled approach, by proving that the corresponding relevant implication type is inhabited by a term whose essence is essentially the identity.

In the following theorem we show how relevant implication subsumes the type-inclusion rules of the subtype theory of [3], without rules (5) and (13) (dealing with ω) and rule (10) (distributing intersections over arrows) in Figure 1; we call the result the restricted subtype theory. Note that the reason to drop subtype rule (10) is that we cannot inhabit the corresponding relevant implication type: its only candidate inhabitant has as essence an η-expansion of the identity, not the identity itself.

Theorem (Type Inclusion). The judgement ⊢_Σ Δ : σ →_r τ, where σ and τ contain neither dependencies nor relevant families, holds iff σ ≤ τ holds in the restricted subtype theory of [3] enriched with new axioms c ≤ c for each constant c.

As for the system of Refinement Types introduced by Pfenning in [37], we have the following corollary:

Corollary (Pfenning’s Refinement Types). A subtyping judgment derivable in [37] can be encoded in LF_Δ by adding a constant of the corresponding relevant type σ →_r τ to the signature, where the latter is obtained from the original one by replacing each refinement or subtyping clause by a constant of relevant-implication type. Moreover, while Pfenning needs to add the rules of subtyping explicitly in [37], we inherit them naturally in LF_Δ from the rules for minimal relevant implication.
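For instance (a hypothetical refinement signature of ours, not one from [37]): a clause stating that pos refines nat is represented by a constant

\[
c_{\mathit{pos} \leq \mathit{nat}} \;:\; \mathit{pos} \to_r \mathit{nat},
\]

and each use of subsumption is rendered by a relevant application of this constant, which leaves the essence of the proof term unchanged: subtyping costs nothing at the level of untyped proofs.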

3 Examples

Figure 7: The LF encoding of Hereditary Harrop Formulæ

As we have argued in the previous sections, the point of this paper is a uniform and principled approach to the encoding of a plethora of type disciplines and systems which ultimately stem from, or can capitalize on, strong proof-functional connectives and subtyping. The framework LF_Δ, presented in this paper, is the first to accommodate all the examples and counterexamples that have appeared in the literature. The complete developments of both the implementation of the Δ-framework and the example encodings can be found in [44].

We start the section by showing the expressive power of LF_Δ in encoding classical features of typing disciplines with strong intersection and union.

Auto application. The judgement ⊢ λx.x x : (σ ∩ (σ → τ)) → τ of [3] is rendered in LF_Δ by the LF_Δ-judgement ⊢_Σ λx:(σ ∩ (σ → τ)). (pr₂ x) (pr₁ x) : Πx:(σ ∩ (σ → τ)). τ, whose subject has essence λx.x x.

Polymorphic identity. The judgement ⊢ λx.x : (σ → σ) ∩ (τ → τ) of [3] is rendered in LF_Δ by the judgement ⊢_Σ ⟨λx:σ.x, λx:τ.x⟩ : (Πx:σ.σ) ∩ (Πx:τ.τ).

Commutativity of union. The judgement ⊢ λx.x : (σ ∪ τ) → (τ ∪ σ) of [3] is rendered in LF_Δ by a judgement typing a Δ-term, built from the two injections and a strong co-pair, whose essence is λx.x.

Pierce’s expression from Section 1. The expressive power of union types highlighted by Pierce is rendered in LF_Δ by a direct judgement assigning the expected result type to the application Is_0 Test.

The above example illustrates the advantages of taking LF_Δ as a framework. In plain LF we could render it only through a deep encoding, ending up with the verbose code in pierce_program.v [44].

Hereditary Harrop Formulæ. The encoding of Hereditary Harrop’s Formulæ is one of the motivating examples given by Pfenning for introducing refinement types in [37]. In LF_Δ it can be expressed as in Figure 7 and type checked in [45] using our concrete syntax (file pfenning_harrop.bull [44]), without any reference to intersection types, by a subtle use of union types; we also add rules for solving and backchaining. Hereditary Harrop formulæ can be recursively defined using two mutually recursive syntactical objects called programs (P) and goals (G):

\[
\begin{array}{lcl}
G & ::= & A \;\mid\; G_1 \wedge G_2 \;\mid\; P \supset G \;\mid\; \forall x.\,G\\
P & ::= & A \;\mid\; P_1 \wedge P_2 \;\mid\; G \supset A \;\mid\; \forall x.\,P
\end{array}
\]
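For comparison only (not the union-type encoding of Figure 7): the two mutually recursive syntactic categories can be rendered as ordinary mutual inductive types, here in the Coq syntax already used for Figure 8; all names are ours, and we omit the quantifier clauses for brevity.

(* Raw syntax of goals and programs as two mutually
   recursive inductive types; atoms are left abstract. *)
Axiom atom : Set.

Inductive goal : Set :=
| g_atom : atom -> goal
| g_and  : goal -> goal -> goal
| g_imp  : prog -> goal -> goal
with prog : Set :=
| p_atom : atom -> prog
| p_and  : prog -> prog -> prog
| p_imp  : goal -> atom -> prog.

Note how atoms and conjunctions are duplicated across the two categories; the LF_Δ encoding of Figure 7 avoids some of this duplication by the subtle use of union types discussed above.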

Using the Type Inclusion theorem of Section 2.2, we can provide an alternative encoding of atoms, goals and programs which is more faithful to the one by Pfenning: namely, we can introduce in the signature constants of relevant-implication type representing the inclusion axioms of Pfenning’s encoding. Our approach based on union types, while retaining the same expressivity, permits us to shortcut certain inclusions and also to rule out certain exotic goals and exotic programs. Indeed, for the purpose of establishing the adequacy of the encoding, it is sufficient to avoid variables involving union types in the derivation contexts.

Natural Deductions in Normal Form. The second motivating example for intersection types given in [37] is natural deductions in normal form. We recall that a natural deduction is in normal form if there are no applications of elimination rules of a logical connective immediately following their corresponding introduction, in the main branch of a subderivation.

The encoding we give in LF_Δ is a slightly improved version of the one in [37]: like Pfenning, we restrict ourselves to the purely implicational fragment. As in the previous example, we use union types to define normal forms either as pure elimination-deductions from hypotheses or as normal-form deductions. As above, we could also have used intersection types. This example is interesting in itself, being the prototype of the encoding of type systems using canonical and atomic syntactic categories [23] and also of Fitch Set Theory [26].


(* Define our types *)
Axiom o : Set.
(* Axiom omegatype : o. *)
Axioms (arrow inter union : o -> o -> o).

(* Transform our types into LF types *)
Axiom OK : o -> Set.

(* Define the essence equality as an equivalence relation *)
Axiom Eq : forall (s t : o), OK s -> OK t -> Prop.
Axiom Eqrefl : forall (s : o) (M : OK s), Eq s s M M.
Axiom Eqsymm : forall (s t : o) (M : OK s) (N : OK t), Eq s t M N -> Eq t s N M.
Axiom Eqtrans : forall (s t u : o) (M : OK s) (N : OK t) (O : OK u),
  Eq s t M N -> Eq t u N O -> Eq s u M O.

(* constructors for arrow (->I and ->E) *)
Axiom Abst : forall (s t : o), ((OK s) -> (OK t)) -> OK (arrow s t).
Axiom App : forall (s t : o), OK (arrow s t) -> OK s -> OK t.

(* constructors for intersection *)
Axiom Proj_l : forall (s t : o), OK (inter s t) -> OK s.
Axiom Proj_r : forall (s t : o), OK (inter s t) -> OK t.
Axiom Pair : forall (s t : o) (M : OK s) (N : OK t), Eq s t M N -> OK (inter s t).

(* constructors for union *)
Axiom Inj_l : forall (s t : o), OK s -> OK (union s t).
Axiom Inj_r : forall (s t : o), OK t -> OK (union s t).
Axiom Copair : forall (s t u : o) (X : OK (arrow s u)) (Y : OK (arrow t u)),
  OK (union s t) -> Eq (arrow s u) (arrow t u) X Y -> OK u.

(* define equality wrt arrow constructors *)
Axiom Eqabst : forall (s t s' t' : o) (M : OK s -> OK t) (N : OK s' -> OK t'),
  (forall (x : OK s) (y : OK s'), Eq s s' x y -> Eq t t' (M x) (N y)) ->
  Eq (arrow s t) (arrow s' t') (Abst s t M) (Abst s' t' N).
Axiom Eqapp : forall (s t s' t' : o) (M : OK (arrow s t)) (N : OK s)
  (M' : OK (arrow s' t')) (N' : OK s'),
  Eq (arrow s t) (arrow s' t') M M' -> Eq s s' N N' ->
  Eq t t' (App s t M N) (App s' t' M' N').

(* define equality wrt intersection constructors *)
Axiom Eqpair : forall (s t : o) (M : OK s) (N : OK t) (pf : Eq s t M N),
  Eq (inter s t) s (Pair s t M N pf) M.
Axiom Eqproj_l : forall (s t : o) (M : OK (inter s t)),
  Eq (inter s t) s M (Proj_l s t M).
Axiom Eqproj_r : forall (s t : o) (M : OK (inter s t)),
  Eq (inter s t) t M (Proj_r s t M).

(* define equality wrt union *)
Axiom Eqinj_l : forall (s t : o) (M : OK s), Eq (union s t) s (Inj_l s t M) M.
Axiom Eqinj_r : forall (s t : o) (M : OK t), Eq (union s t) t (Inj_r s t M) M.
Axiom Eqcopair : forall (s t u : o) (M : OK (arrow s u)) (N : OK (arrow t u))
  (O : OK (union s t)) (pf : Eq (arrow s u) (arrow t u) M N) (x : OK s),
  Eq s (union s t) x O -> Eq u u (App s u M x) (Copair s t u M N O pf).

Figure 8: The LF encoding of the Δ-calculus (Coq syntax)

Adequacy, Canonical Forms, Exotic Terms. In the presence of union types, we have to pay special attention to the exact formulation of Adequacy Theorems, as in the Harrop formulæ example above. Otherwise, exotic terms arise, such as co-pairs whose components are distinct contexts (i.e. terms with holes), which cannot be naturally simplified even if their essences coincide. More work needs to be done to streamline how to exclude, or even capitalize on, exotic terms.

Metacircular Encodings. A network of adequate encodings and inclusions can be defined between LF, LF_Δ, and the Δ-calculus, where each encoding of one system in another is either shallow (sh) or deep (dp), and LF_Δ is an extension of LF. Due to lack of space, but with the intention of providing a better formal understanding of the semantics of strong intersection and union types in a logical framework, we provide in Figure 8 a deep LF encoding of a presentation of the Δ-calculus à la Church [17]. A shallow encoding of the Δ-calculus in LF_Δ (file intersection_union.bull [44]) can be mechanically type checked in [45]. A shallow encoding of LF in LF_Δ (file lf.bull), making essential use of intersection types, can also be type checked.

LF encoding of the Δ-calculus. Figure 8 presents a pure LF encoding of a presentation of the Δ-calculus à la Church, in Coq syntax, using HOAS. We use HOAS in order to take advantage of the higher-order features of the framework: other abstract syntax representation techniques would not be much different, just more verbose. The Eq predicate plays the same role as the essence function in LF_Δ, namely, it encodes the judgement that two proofs (i.e. two terms of type (OK _)) have the same structure. This is crucial in the Pair axiom (i.e. the introduction rule of the intersection type constructor), where we can inhabit the type (inter s t) only when the proofs of its component types s and t share the same structure (i.e. we have a witness of type (Eq s t M N), where M has type (OK s) and N has type (OK t)). A similar role is played by the Eq premise in the Copair axiom (i.e. the elimination rule of the union type constructor). We have an Eq axiom for each proof rule. Examples of this encoding can be found in intersection_union.v [44].
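To illustrate how the encoding is used, here are two small terms built only from the axioms of Figure 8; the definitions and their names are ours, sketching the kind of examples found in intersection_union.v: the polymorphic identity inhabiting an intersection, and commutativity of union.

Section Examples.
  Variables s t : o.

  (* Polymorphic identity: a strong pair of two identities. *)
  Definition id_s : OK (arrow s s) := Abst s s (fun x => x).
  Definition id_t : OK (arrow t t) := Abst t t (fun x => x).

  (* The two abstractions have the same structure... *)
  Definition id_eq : Eq (arrow s s) (arrow t t) id_s id_t :=
    Eqabst s s t t (fun x => x) (fun x => x) (fun x y H => H).

  (* ... so the strong pair is well formed. *)
  Definition poly_id : OK (inter (arrow s s) (arrow t t)) :=
    Pair (arrow s s) (arrow t t) id_s id_t id_eq.

  (* Commutativity of union: both branches inject into (union t s). *)
  Definition br_l : OK (arrow s (union t s)) :=
    Abst s (union t s) (fun y => Inj_r t s y).
  Definition br_r : OK (arrow t (union t s)) :=
    Abst t (union t s) (fun y => Inj_l t s y).

  (* The Eq axioms prove the two branches structurally equal:
     injections are transparent to the essence. *)
  Definition br_eq : Eq (arrow s (union t s)) (arrow t (union t s)) br_l br_r :=
    Eqabst s (union t s) t (union t s)
      (fun y => Inj_r t s y) (fun y => Inj_l t s y)
      (fun x y H =>
        Eqtrans (union t s) t (union t s)
          (Inj_r t s x) y (Inj_l t s y)
          (Eqtrans (union t s) s t (Inj_r t s x) x y (Eqinj_r t s x) H)
          (Eqsymm (union t s) t (Inj_l t s y) y (Eqinj_l t s y))).

  Definition union_comm : OK (arrow (union s t) (union t s)) :=
    Abst (union s t) (union t s)
      (fun z => Copair s t (union t s) br_l br_r z br_eq).
End Examples.

Notice how every use of Pair and Copair is guarded by an explicit structural-equality witness, exactly mirroring the essence side-conditions of LF_Δ.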

4 Implementation and Future Work

In a previous paper [45], we implemented in OCaml suitable algorithms for type reconstruction, as well as type checking. In [30] we implemented the subtyping algorithm, which extends the well-known Hindley algorithm for intersection types [24] with union types. The subtyping algorithm has been mechanically proved correct in Coq, extending Bessai’s mechanized proof of a subtyping algorithm for intersection types [8].

A Read-Eval-Print-Loop allows us to define axioms and definitions, and provides some basic terminal-style features like error pretty-printing, subexpression highlighting, and file loading. Moreover, it can type-check a proof or normalize it, using a strong reduction evaluator. We use the syntax of Pure Type Systems [7] to improve the compactness and the modularity of the kernel. Binders are implemented using de Bruijn indices. We implemented the conversion rule in the simplest way possible: when we need to compare types, we syntactically compare their normal forms. Abstract and concrete syntax are mostly aligned: the concrete syntax is similar to that of Coq (see Bull and Bull-Subtyping [44]).
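The normalize-then-compare discipline can be sketched as follows; this is a minimal illustration on untyped de Bruijn terms, written in Coq for uniformity with Figure 8, and emphatically not the actual Bull kernel (which is written in OCaml and handles the full LF_Δ syntax). Fuel makes the recursion structural; with de Bruijn indices, syntactic comparison of normal forms is exactly comparison up to α-conversion.

(* Minimal sketch: beta-normalization on untyped de Bruijn terms. *)
Inductive tm : Type :=
| Var : nat -> tm
| Lam : tm -> tm
| App : tm -> tm -> tm.

(* shift d c t: add d to every index >= c (c counts binders crossed) *)
Fixpoint shift (d c : nat) (t : tm) : tm :=
  match t with
  | Var n => if Nat.leb c n then Var (n + d) else Var n
  | Lam b => Lam (shift d (S c) b)
  | App f a => App (shift d c f) (shift d c a)
  end.

(* subst j s t: replace index j by s, lowering the indices above j *)
Fixpoint subst (j : nat) (s : tm) (t : tm) : tm :=
  match t with
  | Var n =>
      if Nat.eqb n j then s
      else if Nat.ltb j n then Var (pred n) else Var n
  | Lam b => Lam (subst (S j) (shift 1 0 s) b)
  | App f a => App (subst j s f) (subst j s a)
  end.

(* Normal-order normalization, going under binders; fuel bounds depth. *)
Fixpoint norm (fuel : nat) (t : tm) : tm :=
  match fuel with
  | O => t
  | S f =>
      match t with
      | Var n => Var n
      | Lam b => Lam (norm f b)
      | App g a =>
          match norm f g with
          | Lam b => norm f (subst 0 a b)
          | g' => App g' (norm f a)
          end
      end
  end.

(* Syntactic equality of terms: alpha-conversion is free with de Bruijn. *)
Fixpoint eq_tm (a b : tm) : bool :=
  match a, b with
  | Var i, Var j => Nat.eqb i j
  | Lam a', Lam b' => eq_tm a' b'
  | App a1 a2, App b1 b2 => andb (eq_tm a1 b1) (eq_tm a2 b2)
  | _, _ => false
  end.

(* Conversion check: normalize both sides, then compare syntactically. *)
Definition conv (fuel : nat) (a b : tm) : bool :=
  eq_tm (norm fuel a) (norm fuel b).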

We are currently designing a higher-order unification algorithm for Δ-terms and a bidirectional refinement algorithm, similar to the one found in [2]. The refinement algorithm can be split into two parts: essence refinement and typing refinement. In the same way, there will be a unification algorithm for essence terms and a unification algorithm for Δ-terms. The bidirectional refinement algorithm aims at partial type inference, so as to give as much information as possible to a hypothetical solver, or to the unifier. For instance, when looking for a Δ-term of a given intersection type whose essence is known, we can already infer the annotated shape of each component of the corresponding strong pair.

Appendix A

Figure 9 presents the rules for Valid Signatures and Contexts, and Figure 10 the rules for Valid Kinds and Families.

Figure 9: Valid Signatures and Contexts

Figure 10: Valid Kinds and Families

LF_Δ can play the role of a logical framework only if it is decidable. The road map which we follow to establish decidability is the standard one, see e.g. [22]. In particular, we prove, in order: uniqueness of types and kinds, structural properties, normalization for raw well-formed terms, and hence confluence. Then we prove the inversion property, the subderivation property, subject reduction, and finally decidability.

Lemma (Structural properties). Let J denote either an assertion of the form σ : K or one of the form Δ : σ. Then:

  1. Weakening: If Γ ⊢_Σ J and ⊢_Σ Γ, x:τ, then Γ, x:τ ⊢_Σ J.

  2. Strengthening: If Γ₁, x:τ, Γ₂ ⊢_Σ J, then Γ₁, Γ₂ ⊢_Σ J, provided that x does not occur free in Γ₂ or in J.

  3. Transitivity: If Γ ⊢_Σ Δ : τ and Γ, x:τ, Γ′ ⊢_Σ J, then Γ, Γ′[Δ/x] ⊢_Σ J[Δ/x].

  4. Permutation: If Γ₁, x:σ, y:τ, Γ₂ ⊢_Σ J, then Γ₁, y:τ, x:σ, Γ₂ ⊢_Σ J, provided that x does not occur free in τ, and that τ is valid in Γ₁.

Theorem (Unicity of Types and Kinds).

  1. If Γ ⊢_Σ Δ : σ and Γ ⊢_Σ Δ : τ, then σ =_Δ τ.

  2. If Γ ⊢_Σ σ : K and Γ ⊢_Σ σ : K′, then K =_Δ K′.

In order to prove strong normalization, we follow the pattern used for pure LF. Namely, we map LF_Δ-terms into terms of the type assignment system of [3] in such a way that redexes in the source language are mapped into redexes in the target language, and then take advantage of Theorem 2.1. Special care is needed in dealing with redexes occurring in type-dependencies, because these need to be flattened at the level of terms.

Definition. The forgetful mappings, which erase dependencies in types and drop proof-functional constructors in Δ-terms, are defined in Figure 11.