diff --git a/Reference Manual/kernel.tex b/Reference Manual/kernel.tex index daee18ea..11da88b3 100644 --- a/Reference Manual/kernel.tex +++ b/Reference Manual/kernel.tex @@ -142,7 +142,7 @@ \subsection{Syntax} In this document, as well as in the code documentation, we often write terms and formulas in a more conventional way, generally hiding the arity of labels and representing the label with its identifier only, preceded by an apostrophe (\lstinline|`|) if we need to precise that a symbol is schematic. When the arity is relevant, we write it with a superscript, for example: % \begin{gather*} - f^3(x,,z) \equiv \operatorname{Fun}(f, 3)(\List(\Var(x), \Var(y), \Var(z)))~, + f^3(x,y,z) \equiv \operatorname{Fun}(f, 3)(\List(\Var(x), \Var(y), \Var(z)))~, \end{gather*} % and @@ -287,7 +287,7 @@ \subsection{The Equivalence Checker} For soundness, the relation decided by the algorithm should be contained in the $\iff$ ``if and only if'' relation of first order logic. However, it is well known that this relationship is in general undecidable, and even the $\iff$ relation for propositional logic is coNP-complete. For practicality, we need a relation that is efficiently computable. The decision procedure implemented in LISA takes time quadratic in the size of the formula, which means that it is not significantly slower than syntactic equality. -It is based on an algorithm that decides the word problem for Ortholattices \cite{guilloudFormulaNormalizationsVerification2023a}. +It is based on an algorithm that decides the word problem for Ortholattices \cite{guilloudFormulaNormalizationsVerification2023}. Ortholattices are a generalization of Boolean algebra where instead of the law of distributivity, the weaker absorption law (L9, \autoref{tab:ortholatticeLaws}) holds. In particular, every identity in the theory of ortholattices is also a theorem of propositional logic. \begin{table}[bth] \centering @@ -314,7 +314,7 @@ \subsection{The Equivalence Checker} We write $s\sim_\OL t$ if both $s\leq_\OL t$ and $s\geq_\OL t$ hold. \autoref{thm:OL} is the main result we rely on. -\begin{theorem}[\cite{guilloudFormulaNormalizationsVerification2023a}] +\begin{theorem}[\cite{guilloudFormulaNormalizationsVerification2023}] \label{thm:OL} There exists an algorithm running in worst case quadratic time producing, for any terms $s$ over the signature $(\land, \lor, \neg)$, a normal form $\text{NF}_{\OL}(s)$ such that for any $t$, $s \sim_\OL t$ if and only if $\text{NF}_{\OL}(s) = \text{NF}_{\OL}(t)$. The algorithm is also capable of deciding if $s \leq_{OL} t$ holds in quadratic time. @@ -345,7 +345,7 @@ \subsection{The Equivalence Checker} In particular, the implementation in LISA also takes into account symmetry and reflexivity of equality as well as alpha-equivalence, by which we mean renaming of bound variables. It also expresses $\rightarrow$ and $\leftrightarrow$ in terms of $\lor$ and $\land$. -A more detailed discussion of extension of ortholattices to first-order logic, proof of correctness and implementation details can be found in \cite{guilloudFormulaNormalizationsVerification2023a} and \cite{guilloudLISAModernProof2023a}. +A more detailed discussion of extension of ortholattices to first-order logic, proof of correctness and implementation details can be found in \cite{guilloudFormulaNormalizationsVerification2023} and \cite{guilloudLISAModernProof2023}. 
@@ -354,7 +354,7 @@ \section{Proofs in Sequent Calculus} \label{sec:proofs_lk} \subsection{Sequent Calculus} \label{subsec:lk} -The deductive system used by LISA is an extended version of Gentzen's Sequent Calculus \cite{Gentzen1935,Gentzen1935II}. +The deductive system used by LISA is an extended version of the classical Sequent Calculus. % \begin{definition} A \textbf{sequent} is a pair $(\Gamma, \Sigma)$ of (possibly empty) sets of formulas, noted: @@ -1086,5 +1086,5 @@ \subsection{How to write helpers} % Now, every time a \lstinline|TermLabel| is writen in a place where a \lstinline|Term| is expected, it will be converted implicitly. -To learn more about Scala 3 and its capabilities, see its documentation at \footnotesize{\url{https://docs.scala-lang.org/scala3/book/introduction.html}}. +To learn more about Scala 3 and its capabilities, see its documentation at {\footnotesize\url{https://docs.scala-lang.org/scala3/book/introduction.html}}. diff --git a/Reference Manual/lisa.pdf b/Reference Manual/lisa.pdf index 36937675..0bb3bc56 100644 Binary files a/Reference Manual/lisa.pdf and b/Reference Manual/lisa.pdf differ diff --git a/Reference Manual/lisa.tex b/Reference Manual/lisa.tex index b23ac099..7f0764c5 100644 --- a/Reference Manual/lisa.tex +++ b/Reference Manual/lisa.tex @@ -8,7 +8,7 @@ \author{Simon Guilloud\\Laboratory for Automated Reasoning and Analysis, EPFL} \date{} -\begin{document} +\begin{document} \newfontfamily\titlefont{Arial} @@ -23,11 +23,13 @@ \vspace{1.5cm} - \textbf{\large WIP Edition} + \textbf{\large Version 0.2.1} + + \textbf{\large \today} \vspace{1.5cm} - {\Large Laboratory for Automated Reasoning and Analysis, EPFL, Switzerland} + {\Large Laboratory for Automated Reasoning and Analysis\\ EPFL, Switzerland} \vspace{1.5cm} @@ -50,6 +52,8 @@ \chapter*{Introduction} \tableofcontents +\input{quickguide.tex} + \input{kernel.tex} \input{prooflib.tex} diff --git a/Reference Manual/macro.tex b/Reference Manual/macro.tex index 05f2f578..bc207729 100644 --- a/Reference Manual/macro.tex +++ b/Reference Manual/macro.tex @@ -1,6 +1,6 @@ % macro.tex -\usepackage[utf8]{inputenc} +%\usepackage[utf8]{inputenc} \usepackage{url} \usepackage{graphicx} \usepackage[english]{babel} @@ -12,7 +12,7 @@ \usepackage[dvipsnames]{xcolor} \usepackage{csquotes} \usepackage[strings]{underscore} -\usepackage{hyperref} +\usepackage{hyperref} \usepackage{bussproofs} \usepackage{makecell} \usepackage{subcaption} @@ -21,8 +21,8 @@ \usepackage{xspace} \usepackage[ruled,vlined]{algorithm2e} \usepackage{array} -\usepackage{arial} \usepackage[T1]{fontenc} +\usepackage{newunicodechar} %To map unicode in listings to a different font than Fira Code \renewcommand\sfdefault{ua1} \sloppy % better (?) margin handling @@ -44,13 +44,58 @@ \newcommand*{\definitionautorefname}{Definition} +% URLs -%Code -%\usepackage[nomap]{FiraMono} +\hypersetup{ + colorlinks=true, + linkcolor=blue, + filecolor=magenta, + urlcolor=blue, + pdftitle={LISA Reference Manual}, + pdfpagemode=FullScreen, +} +\urlstyle{same} + + + +% Code fonts + +%\usepackage{lstfiracode} \usepackage{fontspec} +\usepackage{ucharclasses} \setmonofont[ Contextuals={Alternate} ]{Fira Code} +\newfontfamily{\fallbackfont}{DejaVuSans} +\DeclareTextFontCommand{\textfallback}{\fallbackfont} +\newunicodechar{∀}{\textfallback{∀}} +\newunicodechar{∃}{\textfallback{∃}} +\newunicodechar{∅}{\textfallback{∅}} +\newunicodechar{⊆}{\textfallback{⊆}} +\newunicodechar{∈}{\textfallback{∈}} + + +% Correct lstlisting parsing of unicode character. 
Add unicode points at the end. +\makeatletter +\lst@InputCatcodes +\def\lst@DefEC{% + \lst@CCECUse \lst@ProcessLetter + ^^80^^81^^82^^83^^84^^85^^86^^87^^88^^89^^8a^^8b^^8c^^8d^^8e^^8f% + ^^90^^91^^92^^93^^94^^95^^96^^97^^98^^99^^9a^^9b^^9c^^9d^^9e^^9f% + ^^a0^^a1^^a2^^a3^^a4^^a5^^a6^^a7^^a8^^a9^^aa^^ab^^ac^^ad^^ae^^af% + ^^b0^^b1^^b2^^b3^^b4^^b5^^b6^^b7^^b8^^b9^^ba^^bb^^bc^^bd^^be^^bf% + ^^c0^^c1^^c2^^c3^^c4^^c5^^c6^^c7^^c8^^c9^^ca^^cb^^cc^^cd^^ce^^cf% + ^^d0^^d1^^d2^^d3^^d4^^d5^^d6^^d7^^d8^^d9^^da^^db^^dc^^dd^^de^^df% + ^^e0^^e1^^e2^^e3^^e4^^e5^^e6^^e7^^e8^^e9^^ea^^eb^^ec^^ed^^ee^^ef% + ^^f0^^f1^^f2^^f3^^f4^^f5^^f6^^f7^^f8^^f9^^fa^^fb^^fc^^fd^^fe^^ff% + ^^^^2200^^^^2203^^^^2205^^^^2286^^^^2208% <--- for ∀ + ^^00} +\lst@RestoreCatcodes +\makeatother + + +% Syntactic coloring + \makeatletter \def\verbatim@nolig@list{} @@ -61,10 +106,13 @@ \definecolor{comments}{RGB}{80,0,110} \lstdefinelanguage{scala}{ + extendedchars = true, + inputencoding = utf8, alsoletter={@,=,>}, keywordstyle = {\color{blue}}, keywordstyle = [2]{\color{blue}}, commentstyle = \color{comments}, + basicstyle=\footnotesize\linespread{1.15}\ttfamily, morekeywords = [2]{abstract, case, class, def, do, Input, Output, then, else, extends, false, free, if, implicit, match, object, true, val, var, while, sealed, or, @@ -81,12 +129,15 @@ } \lstdefinelanguage{lisa}{ + extendedchars = true, + inputencoding = utf8, alsoletter={@,=,>}, keywordstyle = {\color{blue}}, keywordstyle = [2]{\color{blue}}, keywordstyle = [3]{\color{green}}, keywordstyle = [4]{\color{teal}}, commentstyle = \color{comments}, + basicstyle=\footnotesize\linespread{1.15}\ttfamily, morekeywords = [2]{abstract, case, class, def, do, Input, Output, then, else, extends, false, free, if, implicit, match, object, true, val, var, while, sealed, or, @@ -104,15 +155,34 @@ } +\definecolor{rosishlightgray}{rgb}{0.96, 0.94, 0.92} +\definecolor{bluishlightgray}{rgb}{0.94, 0.96, 0.98} + +\lstdefinelanguage{console}{ + extendedchars = true, + inputencoding = utf8, + backgroundcolor = \color{bluishlightgray}, + basicstyle=\footnotesize\linespread{1.15}\ttfamily, + mathescape, + escapeinside={!*}{*!}, + columns=fullflexible, + morekeywords={if,then,else,return,match,with,end,let,in, data, type, :=, def}, + moredelim=**[is][\color{green}]{@*}{*@}, + moredelim=**[is][\color{red}]{@**}{**@}, +} -\definecolor{verylightgray}{rgb}{0.96, 0.94, 0.92} \lstset{ - backgroundcolor = \color{verylightgray}, + extendedchars = true, + inputencoding = utf8, + extendedchars = \true, + backgroundcolor = \color{rosishlightgray}, basicstyle=\footnotesize\linespread{1.15}\ttfamily, mathescape, escapeinside={!*}{*!}, columns=fullflexible, - morekeywords={if,then,else,return,match,with,end,let,in, data, type, :=, def} + morekeywords={if,then,else,return,match,with,end,let,in, data, type, :=, def}, + moredelim=**[is][\color{green}]{@*}{*@}, + moredelim=**[is][\color{red}]{@**}{**@}, } \DeclareMathOperator{\pick}{pick} diff --git a/Reference Manual/prooflib.tex b/Reference Manual/prooflib.tex index e3bfc0f7..6ac40d7e 100644 --- a/Reference Manual/prooflib.tex +++ b/Reference Manual/prooflib.tex @@ -1,99 +1,51 @@ \chapter{Developping Mathematics with Prooflib} - -\section{Writing theory files} -LISA provides a canonical way of writing and organizing Kernel proofs by mean of a set of utilities and a DSL made possible by some of Scala 3's features such as string interpolation, extension and implicits. 
-The way to write a new theory file to mathematical development is:
-\begin{lstlisting}[language=Scala, frame=single]
+\label{chapt:prooflib}
+LISA's kernel offers all the necessary tools to develop proofs, but reading and writing proofs written directly in its language is cumbersome.
+To develop and maintain a library of mathematical results, LISA offers a dedicated interface and DSL to write proofs: Prooflib.
+LISA provides a canonical way of writing and organizing Kernel proofs by means of a set of utilities and a DSL made possible by some of Scala 3's features.
+\autoref{fig:theoryFileExample} is a reminder from \autoref{chapt:quickguide} of the canonical way to write a theory file in LISA.
+
+\begin{figure}
+\begin{lstlisting}[language=lisa, frame=single]
object MyTheoryName extends lisa.Main {
-
-}
-\end{lstlisting}
-and that's it! To write a theorem, the recommended syntax is:
-
-\begin{lstlisting}[language=Scala, frame=single]
-object MyTheoryName extends lisa.Main {
-
-  THEOREM("theoremName") of "desired conclusion" PROOF {
-
-    ???: Proof
-
-  } using (listOfJustifications)
-  show
-}
-\end{lstlisting}
-\lstinline|show|{} is optional and prints the last proven theorem. We can similarily make the definition:
-
-% avoid page breaking
-\noindent
-\begin{minipage}{\textwidth}
-  \begin{lstlisting}[language=Scala, frame=single]
-  object MyTheoryName extends lisa.Main {
-
-    val myFunction =
-      DEFINE("symbol", x, y) as definition(x,y)
-    show
+  val x = variable
+  val f = function[1]
+  val P = predicate[1]
+
+  val fixedPointDoubleApplication = Theorem(
+    ∀(x, P(x) ==> P(f(x))) |- P(x) ==> P(f(f(x)))
+  ) {
+    assume(∀(x, P(x) ==> P(f(x))))
+    val step1 = have(P(x) ==> P(f(x))) by InstantiateForall
+    val step2 = have(P(f(x)) ==> P(f(f(x)))) by InstantiateForall
+    have(thesis) by Tautology.from(step1, step2)
+  }
+
+
+  val emptySetIsASubset = Theorem(
+    ∅ ⊆ x
+  ) {
+    have((y ∈ ∅) ==> (y ∈ x)) by Tautology.from(
+      emptySetAxiom of (x := y))
+    val rhs = thenHave (∀(y, (y ∈ ∅) ==> (y ∈ x))) by RightForall
+    have(thesis) by Tautology.from(
+      subsetAxiom of (x := ∅, y := x), rhs)
  }
-  \end{lstlisting}
-\end{minipage}
-%
-This works for definitions of function and predicate symbols with a direct definition. for indirect definitions (via $\exists !$), use the following:
-%
-\begin{lstlisting}[language=Scala, frame=single]
-object MyTheoryName extends lisa.Main {
-  val testdef =
-    DEFINE("symbol", x, y) asThe z suchThat {
-      ???:Formula
-    } PROOF {
-      ???:Proof
-    } using (listOfJustifications)
-  show
}
\end{lstlisting}
+\caption{An example of a theory file in LISA}
+\label{fig:theoryFileExample}
+\end{figure}

+In this chapter, we will describe how each of these constructs is made possible and how they translate to statements in the Kernel.

-=======
-It is important to note that when multiple such files are developed, they all use the same underlying \lstinline|RunningTheory|{}. This makes it possible to use results proved previously by means of a simple \lstinline|import|{} statement as one would import a regular object. Similarly, one should also import as usual automation and tactics developed alongside. It is expected in the medium term that \lstinline|lisa.Main|{} will come with basic automation.
-
-To check the result of a developed file, and verify that the proofs contain no error, it is possible to run such a library object.
-% specify which object
-All imported theory objects will be run through as well, but only the result of the selected one will be printed.
+\section*{WIP}

-It is possible to refer to a theorem or axiom that has been previously proven or added using its name. The syntax is \lstinline|thm``theoremName''|{} or \lstinline|ax``axiomName''|{}. This makes it possible to write, for example, \lstinline|thm``theoremName''.show|{} and \lstinline|... using (ax``comprehensionSchema'')| Figure \ref{fig:kernellibrary} shows a typical example of set theory development.
+%It is important to note that when multiple such files are developed, they all use the same underlying \lstinline|RunningTheory|{}. This makes it possible to use results proved previously by means of a simple \lstinline|import|{} statement as one would import a regular object. Similarly, one should also import as usual automation and tactics developed alongside. It is expected in the medium term that \lstinline|lisa.Main|{} will come with basic automation.
+%All imported theory objects will be run through as well, but only the result of the selected one will be printed.

-\begin{figure}[hp]
-  \begin{lstlisting}[language=Scala, frame=single]
-object MyTheoryName extends lisa.Main {
-  THEOREM("russelParadox") of
-    (*@ $\forall$ @*)x. (x(*@$\in$@*)?y)(*@$\leftrightarrow$@*) (*@$\neg$@*)(x(*@$\in$@*)x)(*@$\vdash$@*) PROOF {
-    val y = VariableLabel("y")
-    val x = VariableLabel("x")
-    val s0 = RewriteTrue(in(y, y) <=> !in(y, y) |-())
-    val s1 = LeftForall(
-      forall(x, in(x, y) <=> !in(x, x)) |- (),
-      0, in(x, y) <=> !in(x, x), x, y
-    )
-    Proof(s0, s1)
-  } using ()
-  thm"russelParadox".show
-
-
-  THEOREM("unorderedPair_symmetry") of
-    "(*@$\vdash$@*)(*@$\forall$@*)y, x. {x, y} = {y, x}" PROOF {
-      ...
-    } using (ax"extensionalityAxiom", ax"pairAxiom")
-  show
-
-
-  val oPair =
-    DEFINE("", x, y) as pair(pair(x, y), pair(x, x))
-
-}
-\end{lstlisting}
-  \caption{Example of library development in LISA Kernel}
-  \label{fig:kernellibrary}
-\end{figure}
diff --git a/Reference Manual/quickguide.tex b/Reference Manual/quickguide.tex
new file mode 100644
index 00000000..0b39e7d5
--- /dev/null
+++ b/Reference Manual/quickguide.tex
@@ -0,0 +1,286 @@
+\chapter{Quick Guide for writing proofs in LISA}
+\label{chapt:quickguide}
+LISA is a proof assistant, i.e., a tool to help humans write completely formal proofs of mathematical statements.
+
+The centerpiece of LISA (called the kernel) contains a definition of first order logic (FOL), which is a logical framework to make formal mathematical statements and proofs. This kernel is what provides correctness guarantees to the user. It only accepts a small set of formal deduction rules such as ``if $a$ is true and $b$ is true then $a\land b$ is true''.
+This is in contrast with human-written proofs, which can contain a wide variety of complex or implicit arguments. Hence, if a proof is accepted as being correct by the kernel, we are guaranteed that it indeed is\footnote{Of course, it is always possible that the kernel itself has a bug in its implementation, but because it is a very small and simple program, we can build strong confidence that it is correct.}.
+LISA's kernel is described in detail in \autoref{chapt:kernel}.
+
+However, building advanced mathematics such as topology or probability theory only from those primitive constructions would be excessively tedious. Instead, we use them as primitive building blocks which can be combined and automated. Beyond the correctness guarantees of the kernel, LISA's purpose is to provide tools to make writing formal proofs easier.
This includes automation, via decision procedures which automatically prove theorems and deductions, and layers of abstraction (helpers, domain-specific languages) which make the presentation of formal statements and proofs closer to the traditional, human way of writing proofs.
+This is not unlike programming languages: assembly is in theory sufficient to write any program on a computer, but high level programming languages offer many convenient features which make writing complex programs easier and which are ultimately translated into assembly.
+\autoref{chapt:prooflib} explains in detail how all these layers of abstraction and automation work. The rest of the present chapter gives a quick guide on how to use LISA. If you are not familiar with first order logic, we suggest you first read an introduction such as \url{lara.epfl.ch/w/sav08/predicate_logic_informally}.
+
+\section{Installation}
+LISA requires the Scala programming language to run. You can download and install Scala following \url{www.scala-lang.org/download/}. Once this is done, clone the LISA repository:
+\begin{lstlisting}[language=console]
+> git clone https://github.com/epfl-lara/lisa
+\end{lstlisting}
+To test your installation, do
+\begin{lstlisting}[language=console]
+> cd lisa
+> sbt
+\end{lstlisting}
+SBT is a tool to run Scala projects and manage versions and dependencies. When it has finished loading, do
+\begin{lstlisting}[language=console]
+> project lisa-examples
+> run
+\end{lstlisting}
+Wait for the LISA codebase to be compiled and then press the number corresponding to ``Example''. You should obtain the following result:
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+\begin{lstlisting}[language=console]
+ @*Theorem fixedPointDoubleApplication :=
+    ∀'x. 'P('x) ==> 'P('f('x)) ⊢ 'P('x) ==> 'P('f('f('x)))
+
+  Theorem emptySetIsASubset := ⊢ subsetOf(emptySet, 'x)
+
+  Theorem setWithElementNonEmpty :=
+    elem('y, 'x) ⊢ ¬('x = emptySet)
+
+  Theorem powerSetNonEmpty := ⊢ ¬(powerSet('x) = emptySet)
+ *@
+\end{lstlisting}
+\end{minipage}
+
+
+
+\section{Development Environment}
+To write LISA proofs, you can use any text editor or IDE. We recommend using \emph{Visual Studio Code} with the \emph{Metals} plugin.
+
+\subsection*{A Note on Special Characters}
+Mathematics is full of special characters. LISA usually admits both an ASCII name and a Unicode name for such symbols. By enabling ligatures, common ASCII character sequences such as \lstinline|=|\lstinline|=|\lstinline|>| are rendered as \lstinline|==>|.
+The present document uses the font \href{https://github.com/tonsky/FiraCode}{Fira Code}. Once it is installed on your system, you can activate it and ligatures in VS Code as follows:
+\begin{enumerate}
+  \item Press Ctrl-Shift-P
+  \item Search for ``Open User Settings (JSON)''
+  \item In the \lstinline|settings.json| file, add:
+  \begin{lstlisting}
+"editor.fontFamily": "'Fira Code', Consolas, monospace",
+"editor.fontLigatures": true,
+  \end{lstlisting}
+\end{enumerate}
+Other symbols such as \lstinline|∀| are Unicode symbols, which can be entered via their Unicode code point, depending on your OS\footnote{Alt+numpad on Windows, Ctrl-Shift-U followed by the code on Linux}, or by using an extension for VS Code such as \emph{Fast Unicode Math Characters}, \emph{Insert Unicode} or \emph{Unicode Latex}.
+A cheat sheet of the most common symbols and how to input them is in \autoref{tab:Unicode}.
+\begin{table}
+  \centering
+  \begin{tabular}{c|c|c}
+    Rendering & Input & Name \\ \hline
+    \lstinline| === |  & ===  & equality \\ \hline
+    \lstinline| \/ |   & \textbackslash /  & or \\ \hline
+    \lstinline| /\ |   & /\textbackslash   & and \\ \hline
+    \lstinline| ==> |  & ==>  & implies \\ \hline
+    \lstinline+ |- +   & |-   & vdash \\ \hline
+    \lstinline| ∀ |  & U+2200  & forall \\ \hline
+    \lstinline| ∃ |  & U+2203  & exists \\ \hline
+    \lstinline| ∈ |  & U+2208  & in \\ \hline
+    \lstinline| ⊆ |  & U+2286  & subseteq \\ \hline
+    \lstinline| ∅ |  & U+2205  & emptyset \\
+  \end{tabular}
+  \caption{Most frequently used Unicode symbols and ligatures.}
+  \label{tab:Unicode}
+\end{table}
+Note that by default, Unicode characters will not be printed correctly on a Windows console. You will need to activate the corresponding charset and pick a font with Unicode support in your console's options, such as Consolas.
+
+\section{Writing theory files}
+LISA provides a canonical way of writing and organizing kernel proofs by means of a set of utilities and a DSL made possible by some of Scala 3's features.
+To prove some theorems by yourself, start by creating a file named \lstinline|MyTheoryName.scala| right next to the \lstinline|Example.scala| file\footnote{The relative path is lisa/lisa-examples/src/main/scala}.
+Then simply write:
+
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+\begin{lstlisting}[language=lisa, frame=single]
+object MyTheoryName extends lisa.Main {
+
+}
+\end{lstlisting}
+\end{minipage}
+and that's it! This will give you access to all the necessary LISA features. Let us see how one can use them to prove a theorem:
+$$
+  \forall x. P(x) \implies P(f(x)) \vdash P(x) \implies P(f(f(x)))
+$$
+To state the theorem, we first need to tell LISA that $x$ is a variable, $f$ is a function symbol and $P$ a predicate symbol.
+
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+\begin{lstlisting}[language=lisa, frame=single]
+object MyTheoryName extends lisa.Main {
+  val x = variable
+  val f = function[1]
+  val P = predicate[1]
+
+}
+\end{lstlisting}
+\end{minipage}
+
+where \lstinline|[1]| indicates that the symbol is of arity 1 (it takes a single argument). The symbols \lstinline|x, f, P| are Scala identifiers that can be freely used in theorem statements and proofs, but they are also formal symbols of FOL in LISA's kernel.
+We can now state our theorem:
+
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+\begin{lstlisting}[language=lisa, frame=single]
+object MyTheoryName extends lisa.Main {
+  val x = variable
+  val f = function[1]
+  val P = predicate[1]
+
+  val fixedPointDoubleApplication = Theorem(
+    ∀(x, P(x) ==> P(f(x))) |- P(x) ==> P(f(f(x)))
+  ) {
+    ??? // Proof
+  }
+}
+\end{lstlisting}
+\end{minipage}
+The theorem will automatically be named \lstinline|fixedPointDoubleApplication|, like the name of the identifier it is assigned to, and will be available to reuse in future proofs. The proof itself is built using a sequence of proof steps, which will update the status of the ongoing proof.
+
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+\begin{lstlisting}[language=lisa, frame=single]
+object MyTheoryName extends lisa.Main {
+  val x = variable
+  val f = function[1]
+  val P = predicate[1]
+
+  val fixedPointDoubleApplication = Theorem(
+    ∀(x, P(x) ==> P(f(x))) |- P(x) ==> P(f(f(x)))
+  ) {
+    assume(∀(x, P(x) ==> P(f(x))))
+    val step1 = have(P(x) ==> P(f(x))) by InstantiateForall
+    val step2 = have(P(f(x)) ==> P(f(f(x)))) by InstantiateForall
+    have(thesis) by Tautology.from(step1, step2)
+  }
+}
+\end{lstlisting}
+\end{minipage}
+First, we use the \lstinline|assume| construct in line 6.
+This tells LISA that the assumed formula is understood as being implicitly on the left hand side of every statement in the rest of the proof.
+
+Then, we need to instantiate the quantified formula twice using a specialized tactic. In lines 7 and 8, we use \lstinline|have| to state that a formula or sequent is true (given the assumption inside \lstinline|assume|), and that the proof of this is produced by the tactic \lstinline|InstantiateForall|.
+We'll see more about the interface of a tactic later. To be able to reuse intermediate steps at any point later, we also assign the intermediate steps to variables.
+
+Finally, the last line says that the conclusion of the theorem itself, \lstinline|thesis|, can be proven using the tactic \lstinline|Tautology| and the two intermediate steps we reached. \lstinline|Tautology| is a tactic that is able to do reasoning with propositional connectives. It implements a complete decision procedure for propositional logic that is described in \autoref{tact:Tautology}.
+
+LISA is based on set theory, so you can also use set-theoretic primitives, as in the following theorem.
+
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+  \begin{lstlisting}[language=lisa, frame=single]
+val emptySetIsASubset = Theorem(
+  ∅ ⊆ x
+) {
+  have((y ∈ ∅) ==> (y ∈ x)) by Tautology.from(
+    emptySetAxiom of (x := y))
+  val rhs = thenHave (∀(y, (y ∈ ∅) ==> (y ∈ x))) by RightForall
+  have(thesis) by Tautology.from(
+    subsetAxiom of (x := ∅, y := x), rhs)
+}
+  \end{lstlisting}
+\end{minipage}
+Note that this theorem uses a second variable, $y$, which must be declared with \lstinline|val y = variable| alongside the other symbols.
+We see a number of new constructs in this example. \lstinline|RightForall| is another tactic (in fact it corresponds to a core deduction rule of the kernel) that introduces a quantifier around a formula, if the bound variable is not free somewhere else in the sequent.
+We also see in line 6 another construct: \lstinline|thenHave|. It is similar to \lstinline|have|, but it will automatically pass the previous statement to the tactic. Formally,
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+  \begin{lstlisting}[language=lisa, frame=single]
+  have(X) by Tactic1
+  thenHave (Y) by Tactic2
+  \end{lstlisting}
+\end{minipage}
+is equivalent to
+
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+  \begin{lstlisting}[language=lisa, frame=single]
+  val s1 = have(X) by Tactic1
+  have (Y) by Tactic2(s1)
+  \end{lstlisting}
+\end{minipage}
+\lstinline|thenHave| allows us to avoid giving a name to every step when we are doing linear reasoning. Finally, in lines 5 and 8, we see that tactics can refer not only to steps of the current proof, but also to previously proven theorems and axioms, such as \lstinline|emptySetAxiom|. The \lstinline|of| keyword indicates that the axiom (or step) is instantiated in a particular way. For example:
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+  \begin{lstlisting}[language=lisa, frame=single]
+  emptySetAxiom             // == !(x ∈ ∅)
+  emptySetAxiom of (x := y) // == !(y ∈ ∅)
+  \end{lstlisting}
+\end{minipage}
+
+LISA also allows introducing definitions. There are essentially two kinds of definitions: \emph{aliases} and definitions via \emph{unique existence}.
+An alias defines a constant, function or predicate symbol as being equal (or equivalent) to a given formula or term. For example,
+
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+  \begin{lstlisting}[language=lisa, frame=single]
+  val succ = DEF(x) --> union(unorderedPair(x, singleton(x)))
+  \end{lstlisting}
+\end{minipage}
+defines the function symbol \lstinline|succ| as the function taking a single argument $x$ and mapping it to the element $\bigcup \lbrace x, \lbrace x \rbrace \rbrace$\footnote{This corresponds to the traditional encoding of the successor function for natural numbers in set theory.}.
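+Once defined, such a symbol can be applied like any other function symbol in later statements. The following small sketch is only illustrative and not part of the library: the theorem name is hypothetical and its proof is omitted.
+
+\noindent\begin{minipage}{\linewidth}\vspace{1em}
+  \begin{lstlisting}[language=lisa, frame=single]
+  // hypothetical use of the alias defined above
+  val elemOfSucc = Theorem(
+    x ∈ succ(x)
+  ) {
+    ??? // proof omitted
+  }
+  \end{lstlisting}
+\end{minipage}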
+The second way of defining an object is more complicated and involves proving the existence and uniqueness of an object. This is detailed in \autoref{chapt:kernel}.
+
+You can now try to run the theory file you just wrote and check whether you made a mistake. To do so, again type \lstinline|> run| in the sbt console and select the number corresponding to your file.
+If all the output is green, perfect! If there is an error, it can be either a syntax error reported at compilation or an error in the proof. In both cases, the error message can sometimes be cryptic, but it should at least consistently indicate which line of your file is incorrect.
+
+Alternatively, if you are using IntelliJ or VS Code and Metals, you can run your theory file directly in your IDE by clicking either on the green arrow (IntelliJ) or on ``run'' (VS Code) next to your main object.
+
+
+\section{Common Tactics}
+\subsubsection*{Restate}
+\lstinline|Restate| is a tactic that reasons modulo ortholattices, a subtheory of Boolean algebra (see \cite{guilloudFormulaNormalizationsVerification2023} and \autoref{subsec:equivalencechecker}). It is very efficient and can prove many simple propositional transformations, but not everything that is true in classical logic. In particular, it cannot prove that $(a\land b) \lor (a \land c) \iff a \land (b \lor c)$ is true. It can however prove very limited facts involving equality and quantifiers. Usage:
+
+\begin{lstlisting}[language=lisa]
+  have(statement) by Restate
+\end{lstlisting}
+tries to justify \lstinline|statement| by showing it is equivalent to \lstinline|True|.
+
+\begin{lstlisting}[language=lisa]
+  have(statement) by Restate(premise)
+\end{lstlisting}
+tries to justify \lstinline|statement| by showing it is equivalent to the previously proven \lstinline|premise|.
+
+\subsubsection*{Tautology}
+\lstinline|Tautology| is a propositional solver based upon \lstinline|Restate|, but complete. It is able to prove every inference that holds in classical propositional logic. However, in the worst case its complexity can be exponential in the size of the formula. Usage:
+
+\begin{lstlisting}[language=lisa]
+  have(statement) by Tautology
+\end{lstlisting}
+Constructs a proof of \lstinline|statement|, if the statement is true and a proof of it using only classical propositional reasoning exists.
+
+\begin{lstlisting}[language=lisa]
+  have(statement) by Tautology.from(premise1, premise2,...)
+\end{lstlisting}
+Constructs a proof of \lstinline|statement| from the previously proven \lstinline|premise1|, \lstinline|premise2|,... using propositional reasoning.
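+For instance, assuming predicate symbols \lstinline|Q| and \lstinline|R| of arity 1 have been declared in the same way as \lstinline|P| (this small example is only illustrative), \lstinline|Tautology| can prove the distributivity fact that \lstinline|Restate| cannot:
+\begin{lstlisting}[language=lisa]
+  have((P(x) /\ Q(x)) \/ (P(x) /\ R(x)) <=> (P(x) /\ (Q(x) \/ R(x)))) by Tautology
+\end{lstlisting}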
+
+
+\subsubsection*{RightForall, InstantiateForall}
+\lstinline|RightForall| will generalize a statement by quantifying it over free variables. For example,
+\begin{lstlisting}[language=lisa]
+  have(P(x)) by ???
+  thenHave(∀(x, P(x))) by RightForall
+\end{lstlisting}
+Note that if the statement inside \lstinline|have| has more than one formula, $x$ cannot appear (it cannot be \emph{free}) in any formula other than $P(x)$. It cannot appear in any assumption either.
+
+\lstinline|InstantiateForall| does the opposite: given a universally quantified statement, it will specialize it. For example:
+\begin{lstlisting}[language=lisa]
+  have(∀(x, P(x))) by ???
+  thenHave(P(t)) by InstantiateForall
+\end{lstlisting}
+for any arbitrary term \lstinline|t|.
+
+\subsubsection*{Substitution}
+\lstinline|Substitution| allows reasoning by substituting equal terms and equivalent formulas. Usage:
+\begin{lstlisting}[language=lisa]
+  have(statement) by Substitution.ApplyRules(subst*)(premise)
+\end{lstlisting}
+
+\lstinline|subst*| is an arbitrary number of substitutions. Each of those can be a previously proven fact (or theorem or axiom), or a formula. They must all be of the form \lstinline|s === t| or \lstinline|A <=> B|, otherwise the tactic will fail. The \lstinline|premise| is a previously proven fact. The tactic will try to show that \lstinline|statement| can be obtained from \lstinline|premise| by applying the substitutions from \lstinline|subst|. In its simplest form,
+\begin{lstlisting}[language=lisa]
+  val subst = have(s === t) by ???
+  have(P(s)) by ???
+  thenHave(P(t)) by Substitution.ApplyRules(subst)
+\end{lstlisting}
+
+Moreover, \lstinline|Substitution| is also able to automatically unify and instantiate substitution rules. For example,
+
+\begin{lstlisting}[language=lisa]
+  val subst = have(g(x, y) === g(y, x)) by ???
+  have(P(g(3, 8))) by ???
+  thenHave(P(g(8, 3))) by Substitution.ApplyRules(subst)
+\end{lstlisting}
+
+If a \lstinline|subst| is a formula rather than a proven fact, then it should be an assumption in the resulting statement. Similarly, if one of the substitutions has an assumption, it should be in the resulting statement. For example,
+
+\begin{lstlisting}[language=lisa]
+  val subst = have(A |- Q(s) <=> P(s)) by ???
+  have(Q(s) /\ s===f(t)) by ???
+  thenHave(A, f(t) === t |- P(s) /\ s===t) by
+    Substitution.ApplyRules(subst, f(t) === t)
+\end{lstlisting}
\ No newline at end of file
diff --git a/Reference Manual/sguilloud.bib b/Reference Manual/sguilloud.bib
index f8964557..c151a1dc 100644
--- a/Reference Manual/sguilloud.bib
+++ b/Reference Manual/sguilloud.bib
@@ -18,7 +18,7 @@ @incollection{aczelTypeTheoreticInterpretation1978
   urldate = {2021-06-14},
   isbn = {978-0-444-85178-9},
   langid = {english},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\JSZV8ILW\\Aczel - 1978 - The Type Theoretic Interpretation of Constructive .pdf}
+  file = {/home/sguillou/Zotero/storage/JSZV8ILW/Aczel - 1978 - The Type Theoretic Interpretation of Constructive .pdf}
 }

 @book{ahoDesignAnalysisComputer1974,
@@ -28,7 +28,6 @@ @book{ahoDesignAnalysisComputer1974
   edition = {1st},
   publisher = {{Addison-Wesley Longman Publishing Co., Inc.}},
   address = {{USA}},
-  abstract = {From the Publisher: With this text, you gain an understanding of the fundamental concepts of algorithms, the very heart of computer science. It introduces the basic data structures and programming techniques often used in efficient algorithms. Covers use of lists, push-down stacks, queues, trees, and graphs.
Later chapters go into sorting, searching and graphing algorithms, the string-matching algorithms, and the Schonhage-Strassen integer-multiplication algorithm. Provides numerous graded exercises at the end of each chapter. 0201000296B04062001}, isbn = {978-0-201-00029-0} } @@ -41,13 +40,12 @@ @book{AIGER2011 address = {{Linz}}, doi = {10.35011/fmvtr.2011-2}, urldate = {2022-10-13}, - abstract = {This is a short note on the differences between AIGER format version 20071012 and the new versions starting with version 1.9.}, copyright = {cc-by\_4}, langid = {english}, lccn = {UL:TN:FM}, keywords = {AIGER,format description}, annotation = {Accession Number: AC16184715}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\M6YJUPDA\\2011 - AIGER 1.9 and beyond.pdf} + file = {/home/sguillou/Zotero/storage/M6YJUPDA/2011 - AIGER 1.9 and beyond.pdf} } @book{AIGERAndInverterGraph2007, @@ -59,21 +57,19 @@ @book{AIGERAndInverterGraph2007 address = {{Linz}}, doi = {10.35011/fmvtr.2007-1}, urldate = {2022-10-13}, - abstract = {This report describes the AIG file format as used by the AIGER library. The purpose of this report is not only to motivate and document the format, but also to allow independent implementations of writers and readers by giving precise and unambiguous definitions.}, copyright = {cc-by\_4}, langid = {english}, lccn = {UL:TN:FM}, keywords = {AIGER,format description}, annotation = {Accession Number: AC16184712}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\VQHG9TRD\\2007 - The AIGER And-Inverter Graph (AIG) Format Version .pdf} + file = {/home/sguillou/Zotero/storage/VQHG9TRD/2007 - The AIGER And-Inverter Graph (AIG) Format Version .pdf} } @article{amaruEPFLCombinationalBenchmark2015, title = {The {{EPFL Combinational Benchmark Suite}}}, editor = {Amar{\`u}, Luca and Gaillardon, Pierre-Emmanuel and De Micheli, Giovanni}, year = {2015}, - journal = {Proceedings of the 24th International Workshop on Logic \& Synthesis (IWLS)}, - abstract = {In this paper, we present the EPFL combinational benchmark suite. We aim at completing existing benchmark suites by focusing only on \emph{natively} combinational benchmarks. The EPFL combinational benchmark suite consists of 23 combinational circuits designed to challenge modern logic optimization tools. It is further divided into three parts. The first part includes 10 arithmetic benchmarks, e.g., square-root, hypotenuse, divisor, multiplier etc.. The second part consists of 10 random/control benchmarks, e.g., round-robin arbiter, lookahead XY router, alu control unit, memory controller etc.. The third part contains 3 very large circuits, featuring more than ten million gates each. All benchmarks have a moderate number of inputs/outputs ranging from few tens to about one thousand. The EPFL benchmark suite is available to the public and distributed in all Verilog, VHDL, BLIF and AIGER formats. In addition to providing the benchmarks, we keep track of the best optimization results, mapped into LUT-6, for size and depth metrics. Better logic implementations can be submitted online. 
After combinational equivalence checking tests, the best LUT-6 realizations will be included in the benchmark suite together with the author's name and affiliation} + journal = {Proceedings of the 24th International Workshop on Logic \& Synthesis (IWLS)} } @book{andrewsIntroductionMathematicalLogic2002, @@ -90,7 +86,7 @@ @book{andrewsIntroductionMathematicalLogic2002 urldate = {2020-10-15}, isbn = {978-90-481-6079-2 978-94-015-9934-4}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\XH6FBES6\\Andrews - 2002 - An Introduction to Mathematical Logic and Type The.pdf} + file = {/home/sguillou/Zotero/storage/XH6FBES6/Andrews - 2002 - An Introduction to Mathematical Logic and Type The.pdf} } @incollection{aptTheoryDeclarativeKnowledge1988, @@ -103,14 +99,14 @@ @incollection{aptTheoryDeclarativeKnowledge1988 publisher = {{Morgan Kaufmann}}, address = {{San Francisco, CA, United States}}, doi = {10.1016/b978-0-934613-40-8.50006-3}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\9Y5J4C2G\\Apt et al. - 1988 - Towards a Theory of Declarative Knowledge.pdf} + file = {/home/sguillou/Zotero/storage/9Y5J4C2G/Apt et al. - 1988 - Towards a Theory of Declarative Knowledge.pdf} } @misc{ArchiveFormalProofs, title = {Archive of {{Formal Proofs}}}, urldate = {2022-05-10}, howpublished = {https://www.isa-afp.org/}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\6I27PLQW\\www.isa-afp.org.html} + file = {/home/sguillou/Zotero/storage/6I27PLQW/www.isa-afp.org.html} } @inproceedings{arthanHOLConstantDefinition2014, @@ -124,11 +120,10 @@ @inproceedings{arthanHOLConstantDefinition2014 publisher = {{Springer International Publishing}}, address = {{Cham}}, doi = {10.1007/978-3-319-08970-6_34}, - abstract = {This note gives a proposal for a simpler and more powerful replacement for the mechanisms currently provided in the various HOL implementations for defining new constants.}, isbn = {978-3-319-08970-6}, langid = {english}, keywords = {Free Variable,Logical Connective,Proof Tree,Type Instance,Type Variable}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\Z4Y9PQ8D\\Arthan - 2014 - HOL Constant Definition Done Right.pdf} + file = {/home/sguillou/Zotero/storage/Z4Y9PQ8D/Arthan - 2014 - HOL Constant Definition Done Right.pdf} } @article{arvindN2AlgorithmSatisfiability1987, @@ -145,7 +140,7 @@ @article{arvindN2AlgorithmSatisfiability1987 urldate = {2023-06-07}, langid = {english}, keywords = {Satisfiability problem}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\E6THQ9JQ\\Arvind and Biswas - 1987 - An O(n2) algorithm for the satisfiability problem .pdf;C\:\\Users\\Simon\\Zotero\\storage\\8GIGLMK5\\0020019087902006.html} + file = {/home/sguillou/Zotero/storage/E6THQ9JQ/Arvind and Biswas - 1987 - An O(n2) algorithm for the satisfiability problem .pdf;/home/sguillou/Zotero/storage/8GIGLMK5/0020019087902006.html} } @misc{austinChallengesImplementingLCFStyle2015, @@ -153,17 +148,16 @@ @misc{austinChallengesImplementingLCFStyle2015 author = {Austin, Evan and Alexander, P.}, year = {2015}, urldate = {2022-06-20}, - abstract = {The work in this paper details the challenges the team has encountered in the development of one such implementation, a monadic approach to the LCF style tailored to the Haskell programming language, HaskHOL. The predominant, root design among current proof assistants, the LCF style, is traditionally realized through impure, functional languages. 
Thus, languages that eschew side-effects in the name of purity collectively represent a largely untapped platform for exploring alternate implementations of LCF-style provers. The work in this paper details the challenges we have encountered in the development of one such implementation, a monadic approach to the LCF style tailored to the Haskell programming language. The resultant proof system, HaskHOL, is introduced and our current work with it is briefly discussed.}, howpublished = {https://www.semanticscholar.org/paper/Challenges-Implementing-an-LCF-Style-Proof-System-Austin-Alexander/e2f0b6e96878bc83ecce2c610a6acffcc42a1dff}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\7LE293NF\\e2f0b6e96878bc83ecce2c610a6acffcc42a1dff.html} + file = {/home/sguillou/Zotero/storage/7LE293NF/e2f0b6e96878bc83ecce2c610a6acffcc42a1dff.html} } @article{avigadTheoremProvingLean, title = {Theorem {{Proving}} in {{Lean}}}, author = {Avigad, Jeremy}, pages = {173}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\6J4XVC4A\\Avigad - Theorem Proving in Lean.pdf} + file = {/home/sguillou/Zotero/storage/6J4XVC4A/Avigad - Theorem Proving in Lean.pdf} } @article{aygunProvingTheoremsUsing2021, @@ -175,10 +169,9 @@ @article{aygunProvingTheoremsUsing2021 eprint = {2112.10664}, primaryclass = {cs}, urldate = {2022-05-05}, - abstract = {Traditional automated theorem provers for first-order logic depend on speed-optimized search and many handcrafted heuristics that are designed to work best over a wide range of domains. Machine learning approaches in literature either depend on these traditional provers to bootstrap themselves or fall short on reaching comparable performance. In this paper, we propose a general incremental learning algorithm for training domain specific provers for first-order logic without equality, based only on a basic given-clause algorithm, but using a learned clause-scoring function. Clauses are represented as graphs and presented to transformer networks with spectral features. To address the sparsity and the initial lack of training data as well as the lack of a natural curriculum, we adapt hindsight experience replay to theorem proving, so as to be able to learn even when no proof can be found. We show that provers trained this way can match and sometimes surpass state-of-the-art traditional provers on the TPTP dataset in terms of both quantity and quality of the proofs.}, archiveprefix = {arxiv}, keywords = {Computer Science - Artificial Intelligence,Computer Science - Logic in Computer Science,I.2.3}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\3GSZDR9Q\\Aygün et al. - 2021 - Proving Theorems using Incremental Learning and Hi.pdf;C\:\\Users\\Simon\\Zotero\\storage\\78BMFYQD\\2112.html} + file = {/home/sguillou/Zotero/storage/3GSZDR9Q/Aygün et al. - 2021 - Proving Theorems using Incremental Learning and Hi.pdf;/home/sguillou/Zotero/storage/78BMFYQD/2112.html} } @book{baaderTermRewritingAll1998, @@ -189,9 +182,8 @@ @book{baaderTermRewritingAll1998 address = {{Cambridge}}, doi = {10.1017/CBO9781139172752}, urldate = {2021-09-13}, - abstract = {This textbook offers a unified and self-contained introduction to the field of term rewriting. It covers all the basic material (abstract reduction systems, termination, confluence, completion, and combination problems), but also some important and closely connected subjects: universal algebra, unification theory, Gr\"obner bases and Buchberger's algorithm. 
The main algorithms are presented both informally and as programs in the functional language Standard ML (an appendix contains a quick and easy introduction to ML). Certain crucial algorithms like unification and congruence closure are covered in more depth and Pascal programs are developed. The book contains many examples and over 170 exercises. This text is also an ideal reference book for professional researchers: results that have been spread over many conference and journal articles are collected together in a unified notation, proofs of almost all theorems are provided, and each chapter closes with a guide to the literature.}, isbn = {978-0-521-77920-3}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\BP642VYS\\Baader and Nipkow - 1998 - Term Rewriting and All That.pdf;C\:\\Users\\Simon\\Zotero\\storage\\SGRIHIBJ\\71768055278D0DEF4FFC74722DE0D707.html} + file = {/home/sguillou/Zotero/storage/BP642VYS/Baader and Nipkow - 1998 - Term Rewriting and All That.pdf;/home/sguillou/Zotero/storage/SGRIHIBJ/71768055278D0DEF4FFC74722DE0D707.html} } @article{bakerCONSShouldNot1992, @@ -205,8 +197,7 @@ @article{bakerCONSShouldNot1992 pages = {24--34}, issn = {0362-1340}, doi = {10.1145/130854.130858}, - urldate = {2021-05-31}, - abstract = {Lazy allocation is a model for allocating objects on the execution stack of a high-level language which does not create dangling references. Our model provides safe transportation into the heap for objects that may survive the deallocation of the surrounding stack frame. Space for objects that do not survive the deallocation of the surrounding stack frame is reclaimed without additional effort when the stack is popped. Lazy allocation thus performs a first-level garbage collection, and if the language supports garbage collection of the heap, then our model can reduce the amortized cost of allocation in such a heap by filtering out the short-lived objects that can be more efficiently managed in LIFO order. A run-time mechanism called result expectation further filters out unneeded results from functions called only for their effects. In a shared-memory multi-processor environment, this filtering reduces contention for the allocation and management of global memory.Our model performs simple local operations, and is therefore suitable for an interpreter or a hardware implementation. Its overheads for functional data are associated only with assignments, making lazy allocation attractive for mostly functional programming styles. Many existing stack allocation optimizations can be seen as instances of this generic model, in which some portion of these local operations have been optimized away through static analysis techniques.Important applications of our model include the efficient allocation of temporary data structures that are passed as arguments to anonymous procedures which may or may not use these data structures in a stack-like fashion. The most important of these objects are functional arguments (funargs), which require some run-time allocation to preserve the local environment. Since a funarg is sometimes returned as a first-class value, its lifetime can survive the stack frame in which it was created. Arguments which are evaluated in a lazy fashion (Scheme delays or "suspensions") are similarly handled. Variable-length argument "lists" themselves can be allocated in this fashion, allowing these objects to become "first-class". 
Finally, lazy allocation correctly handles the allocation of a Scheme control stack, allowing Scheme continuations to become first-class values.} + urldate = {2021-05-31} } @incollection{barbosaCvc5VersatileIndustrialStrength2022, @@ -222,10 +213,9 @@ @incollection{barbosaCvc5VersatileIndustrialStrength2022 address = {{Cham}}, doi = {10.1007/978-3-030-99524-9_24}, urldate = {2022-05-09}, - abstract = {Abstract cvc5 is the latest SMT solver in the cooperating validity checker series and builds on the successful code base of CVC4. This paper serves as a comprehensive system description of cvc5 's architectural design and highlights the major features and components introduced since CVC4 ~1.8. We evaluate cvc5 's performance on all benchmarks in SMT-LIB and provide a comparison against CVC4 and Z3.}, isbn = {978-3-030-99523-2 978-3-030-99524-9}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\3FB7GWIW\\Barbosa et al. - 2022 - cvc5 A Versatile and Industrial-Strength SMT Solv.pdf} + file = {/home/sguillou/Zotero/storage/3FB7GWIW/Barbosa et al. - 2022 - cvc5 A Versatile and Industrial-Strength SMT Solv.pdf} } @article{barendregtLambdaCalculiTypes2000, @@ -233,9 +223,8 @@ @article{barendregtLambdaCalculiTypes2000 author = {Barendregt, Henk and Abramsky, S. and Gabbay, D. and Maibaum, T. and Barendregt, Henk (Hendrik)}, year = {2000}, month = oct, - abstract = {Contents 1 Introduction : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 4 2 Type-free lambda calculus : : : : : : : : : : : : : : : : : : : : : : 7 2.1 The system : : : : : : : : : : : : : : : : : : : : : : : : : : : 7 2.2 Lambda definability : : : : : : : : : : : : : : : : : : : : : : 14 2.3 Reduction : : : : : : : : : : : : : : : : : : : : : : : : : : : : 20 3 Curry versus Church typing : : : : : : : : : : : : : : : : : : : : : 34 3.1 The system !-Curry : : : : : : : : : : : : : : : : : : : : : 34 3.2 The system !-Church : : : : : : : : : : : : : : : : : : : : 42 4 Typing `a la Curry : : : : : : : : : : : : : : : : : : : : : : : : : : 46 4.1 The systems : : : : : : : : : : : : : : : : : : : : : : : : : : : 47 4.2 Subject reduction and conversion}, keywords = {Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\SXVI9ZTK\\Barendregt et al. - 2000 - Lambda Calculi with Types.pdf} + file = {/home/sguillou/Zotero/storage/SXVI9ZTK/Barendregt et al. - 2000 - Lambda Calculi with Types.pdf} } @inproceedings{barrettCVC42011, @@ -249,11 +238,10 @@ @inproceedings{barrettCVC42011 publisher = {{Springer}}, address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-22110-1_14}, - abstract = {CVC4 is the latest version of the Cooperating Validity Checker. A joint project of NYU and U Iowa, CVC4 aims to support the useful feature set of CVC3 and SMT-LIBv2 while optimizing the design of the core system architecture and decision procedures to take advantage of recent engineering and algorithmic advances. CVC4 represents a completely new code base; it is a from-scratch rewrite of CVC3, and many subsystems have been completely redesigned. Additional decision procedures for CVC4 are currently under development, but for what it currently achieves, it is a lighter-weight and higher-performing tool than CVC3. We describe the system architecture, subsystems of note, and discuss some applications and continuing work.}, isbn = {978-3-642-22110-1}, langid = {english}, keywords = {Decision Procedure,Node Data,Node Manager,Node Object,Proof Rule}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\LD79NNTU\\Barrett et al. 
These results are not only of theoretical interest; we demonstrate that problems relevant in practice such as computing the optimal tour value in the Traveling Salesman Problem and eigenvector computations can be handled in disjunctive Datalog, but not Datalog with negation (unless the Polynomial Hierarchy collapses). In addition, we study modularity properties of disjunctive Datalog and investigate syntactic restrictions of the formalisms.}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\RALY733G\\Eiter et al. - 1997 - Disjunctive datalog.pdf} + file = {/home/sguillou/Zotero/storage/RALY733G/Eiter et al. - 1997 - Disjunctive datalog.pdf} } @article{eiterRecognizingRenamableGeneralized1995, @@ -1358,18 +1392,16 @@ @article{eiterRecognizingRenamableGeneralized1995 issn = {0166-218X}, doi = {10.1016/0166-218X(93)E0152-O}, urldate = {2023-06-10}, - abstract = {Yamasaki and Doshita (1983) have defined an extension of the class of propositional Horn formulas; later, Gallo and Scutell\`a (1988) generalized this class to a hierarchy {$\Gamma$}o {$\subset$}- {$\Gamma$}1 {$\subset$}- \ldots{} {$\subset$}- {$\Gamma$}k {$\subset$}- \ldots, where {$\Gamma$}o is the set of Horn formulas and {$\Gamma$}1 is the class of Yamasaki and Doshita. For any fixed k, the propositional formulas in {$\Gamma$}k can be recognized in polynomial time, and the satisfiability problem for {$\Gamma$}k formulas can be solved in polynomial time. A possible way of extending these tractable subclasses of the satisfiability problem is to consider renamings: a renaming of a formula is obtained by replacing for some variables all their positive occurrences by negative occurrences and vice versa. The class of renamings of Horn formulas can be recognized in linear time. Chandru et al. (1990) have posed the problem of deciding whether the renamings of {$\Gamma$}1 formulas can be recognized efficiently. We show that this is probably not the case by proving the NP-completeness of recognizing the renamings of {$\Gamma$}k formulas for any k {$\geqslant$} 1.}, langid = {english}, keywords = {Generalized Horn clauses,NP-completeness,Renamable Horn clauses,Satisfiability problem}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\MLIYLYTS\\Eiter et al. - 1995 - Recognizing renamable generalized propositional Ho.pdf;C\:\\Users\\Simon\\Zotero\\storage\\BDFU6TBZ\\0166218X93E0152O.html} + file = {/home/sguillou/Zotero/storage/MLIYLYTS/Eiter et al. - 1995 - Recognizing renamable generalized propositional Ho.pdf;/home/sguillou/Zotero/storage/BDFU6TBZ/0166218X93E0152O.html} } @article{EnforcingLanguageSemanticsUsing, title = {Enforcing {{Language SemanticsUsing Proof-Carrying Data}}}, pages = {39}, - abstract = {Sound reasoning about the behavior of programs relies on program execution adhering to the language semantics. However, in a distributed computation, when a value is sent from one party to another, the receiver faces the question of whether the value is well-traced: could it have been produced by a computation that respects the language semantics? 
If not, then accepting the non-well-traced value may invalidate the receiver's reasoning, leading to bugs or vulnerabilities.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\2WUHSE39\\Enforcing Language SemanticsUsing Proof-Carrying D.pdf} + file = {/home/sguillou/Zotero/storage/2WUHSE39/Enforcing Language SemanticsUsing Proof-Carrying D.pdf} } @misc{EPFLCombinationalBenchmark2022, @@ -1377,7 +1409,6 @@ @misc{EPFLCombinationalBenchmark2022 year = {2022}, month = oct, urldate = {2022-10-13}, - abstract = {EPFL logic synthesis benchmarks}, copyright = {MIT}, howpublished = {lsils} } @@ -1394,9 +1425,8 @@ @inproceedings{even-mendozaLatticebasedSMTProgram2019 address = {{New York, NY, USA}}, doi = {10.1145/3359986.3361214}, urldate = {2021-09-29}, - abstract = {We present a lattice-based satisfiability modulo theory for verification of programs with library functions, for which the mathematical libraries supporting these functions contain a high number of equations and inequalities. Common strategies for dealing with library functions include treating them as uninterpreted functions or using the theories under which the functions are fully defined. The full definition could in most cases lead to instances that are too large to solve efficiently. Our lightweight theory uses lattices for efficient representation of library functions by a subset of guarded literals. These lattices are constructed from equations and inequalities of properties of the library functions. These subsets are found during the lattice traversal. We generalise the method to a number of lattices for functions whose values depend on each other in the program, and we describe a simultaneous traversal algorithm of several lattices, so that a combination of guarded literals from all lattices does not lead to contradictory values of their variables. We evaluate our approach on benchmarks taken from the robotics community, and our experimental results demonstrate that we are able to solve a number of instances that were previously unsolvable by existing SMT solvers.}, isbn = {978-1-4503-6997-8}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\AU6W8CT9\\Even-Mendoza et al. - 2019 - Lattice-based SMT for program verification.pdf} + file = {/home/sguillou/Zotero/storage/AU6W8CT9/Even-Mendoza et al. - 2019 - Lattice-based SMT for program verification.pdf} } @article{falaschiSynchronizationLogicAxiomatics1984, @@ -1412,9 +1442,8 @@ @article{falaschiSynchronizationLogicAxiomatics1984 issn = {0019-9958}, doi = {10.1016/S0019-9958(84)80021-2}, urldate = {2023-06-08}, - abstract = {An extension of Horn clause logic is defined based on the introduction of a synchronization operator. Generalized Horn clauses (GHC) are introduced through an informal description of their operational semantics, which allows discussion of some typical synchronization problems. GHC are first considered formally as a programming language by defining the syntax, the operational semantics, the model-theoretic semantics, and the fixed-point semantics. The above mentioned semantics are given in the Van Emden-Kowalski style (1976, J. Assoc. Comput. Mach. 23, 733\textendash 742) and are proved equivalent. GHC are then characterized as axiomatic theories. A set of axiom schemata concerned with the newly introduced synchronization operator is defined and it is proved that the operational semantics inference rule is both sound and complete. 
Finally, the relation between GHC and Horn clauses is analyzed, and it is proved that Horn clause logic is strictly included in the generalized Horn clause logic.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\6H7FXD2X\\Falaschi et al. - 1984 - A synchronization logic Axiomatics and formal sem.pdf} + file = {/home/sguillou/Zotero/storage/6H7FXD2X/Falaschi et al. - 1984 - A synchronization logic Axiomatics and formal sem.pdf} } @article{farmerSevenVirtuesSimple2008, @@ -1429,10 +1458,9 @@ @article{farmerSevenVirtuesSimple2008 issn = {1570-8683}, doi = {10.1016/j.jal.2007.11.001}, urldate = {2020-10-20}, - abstract = {Simple type theory, also known as higher-order logic, is a natural extension of first-order logic which is simple, elegant, highly expressive, and practical. This paper surveys the virtues of simple type theory and attempts to show that simple type theory is an attractive alternative to first-order logic for practical-minded scientists, engineers, and mathematicians. It recommends that simple type theory be incorporated into introductory logic courses offered by mathematics departments and into the undergraduate curricula for computer science and software engineering students.}, langid = {english}, keywords = {Complete ordered field,Higher-order logic,Nonstandard models,Peano arithmetic,Practical logics,Print,Read,Type theory}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\M8EFWEUK\\Farmer - 2008 - The seven virtues of simple type theory.pdf;C\:\\Users\\Simon\\Zotero\\storage\\QEETNGJV\\S157086830700081X.html} + file = {/home/sguillou/Zotero/storage/M8EFWEUK/Farmer - 2008 - The seven virtues of simple type theory.pdf;/home/sguillou/Zotero/storage/QEETNGJV/S157086830700081X.html} } @article{FixpointSemanticsLogic2002, @@ -1446,8 +1474,7 @@ @article{FixpointSemanticsLogic2002 issn = {0304-3975}, doi = {10.1016/S0304-3975(00)00330-3}, urldate = {2023-06-26}, - abstract = {The variety of semantical approaches that have been invented for logic programs is quite broad, drawing on classical and many-valued logic, lattice theory, game theory, and topology. One source of this richness is the inherent non-monotonicity of its negation, something that does not have close parallels with the machinery of other programming paradigms. Nonetheless, much of the work on logic programming semantics seems to exist side by side with similar work done for imperative and functional programming, with relatively minimal contact between communities. In this paper we summarize one variety of approaches to the semantics of logic programs: that based on fixpoint theory. We do not attempt to cover much beyond this single area, which is already remarkably fruitful. We hope readers will see parallels with, and the divergences from the better known fixpoint treatments developed for other programming methodologies.}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\JPCL65E8\\2002 - Fixpoint semantics for logic programming a survey.pdf} + file = {/home/sguillou/Zotero/storage/JPCL65E8/2002 - Fixpoint semantics for logic programming a survey.pdf} } @article{freeseFinitelyPresentedLattices1979, @@ -1464,8 +1491,7 @@ @article{freeseFinitelyPresentedLattices1979 issn = {0002-9939}, doi = {10.2307/2042634}, urldate = {2022-08-29}, - abstract = {It is shown that the generalized word problem for lattices is solvable. Moreover, one can recursively decide if two finitely presented lattices are isomorphic. 
It is also shown that the automorphism group of a finitely presented lattice is finite.}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\EWWR7ULW\\Freese and Nation - 1979 - Finitely Presented Lattices.pdf} + file = {/home/sguillou/Zotero/storage/EWWR7ULW/Freese and Nation - 1979 - Finitely Presented Lattices.pdf} } @article{freeseFINITELYPRESENTEDLATTICES1989, @@ -1473,9 +1499,8 @@ @article{freeseFINITELYPRESENTEDLATTICES1989 author = {Freese, Ralph}, year = {1989}, month = apr, - abstract = {A canonical form for elements of a lattice freely generated by a partial lattice is given. This form agrees with Whitman's canonical form for free lattices when the partial lattice is an antichain. The connection between this canonical form and the arithmetic of the lattice is given. For example, it is shown that every element of a finitely presented lattice has only finitely many minimal join representations and that every join representation can be refined to one of these. An algorithm is given which decides if a given element of a finitely presented lattice has a cover and finds them if it does. An example is given of a nontrivial, finitely presented lattice with no cover at all.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\S2WULM66\\Freese - FINITELY PRESENTED LATTICES CANONICALFORMS AND TH.pdf} + file = {/home/sguillou/Zotero/storage/S2WULM66/Freese - FINITELY PRESENTED LATTICES CANONICALFORMS AND TH.pdf} } @book{freeseFreeLattices1995, @@ -1491,7 +1516,7 @@ @book{freeseFreeLattices1995 urldate = {2022-04-29}, isbn = {978-0-8218-0389-9 978-1-4704-1273-9}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\VACVUUBX\\Freese et al. - 1995 - Free Lattices.pdf} + file = {/home/sguillou/Zotero/storage/VACVUUBX/Freese et al. - 1995 - Free Lattices.pdf} } @article{furerQuotientsBoundedNatural2022, @@ -1506,10 +1531,9 @@ @article{furerQuotientsBoundedNatural2022 issn = {1860-5974}, doi = {10.46298/lmcs-18(1:23)2022}, urldate = {2022-03-20}, - abstract = {The functorial structure of type constructors is the foundation for many definition and proof principles in higher-order logic (HOL). For example, inductive and coinductive datatypes can be built modularly from bounded natural functors (BNFs), a class of well-behaved type constructors. Composition, fixpoints, and, under certain conditions, subtypes are known to preserve the BNF structure. In this article, we tackle the preservation question for quotients, the last important principle for introducing new types in HOL. We identify sufficient conditions under which a quotient inherits the BNF structure from its underlying type. Surprisingly, lifting the structure in the obvious manner fails for some quotients, a problem that also affects the quotients of polynomial functors used in the Lean proof assistant. We provide a strictly more general lifting scheme that supports such problematic quotients. We extend the Isabelle/HOL proof assistant with a command that automates the registration of a quotient type as a BNF, reducing the proof burden on the user from the full set of BNF axioms to our inheritance conditions. We demonstrate the command's usefulness through several case studies.}, archiveprefix = {arxiv}, keywords = {Computer Science - Logic in Computer Science,Computer Science - Programming Languages}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\LFSZIGZF\\Fürer et al. 
- 2022 - Quotients of Bounded Natural Functors.pdf;C\:\\Users\\Simon\\Zotero\\storage\\NKZGJAE9\\2104.html} + file = {/home/sguillou/Zotero/storage/LFSZIGZF/Fürer et al. - 2022 - Quotients of Bounded Natural Functors.pdf;/home/sguillou/Zotero/storage/NKZGJAE9/2104.html} } @book{galmicheAutomatedReasoningAnalytic2013, @@ -1523,12 +1547,11 @@ @book{galmicheAutomatedReasoningAnalytic2013 publisher = {{Springer Berlin Heidelberg : Imprint: Springer}}, address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-40537-2}, - abstract = {This book constitutes the refereed proceedings of the 22th International Conference on Automated Reasoning with Analytic Tableaux and Related Methods, TABLEAUX 2013, held in Nancy, France, in September 2013. The 20 revised research papers presented together with 4 system descriptions were carefully reviewed and selected from 38 submissions. The papers cover many topics as proof-theory in classical and non-classical logics, analytic tableaux for various logics, related techniques and concepts, e.g., model checking and BDDs, related methods (model elimination, sequent calculi, resolution, and connection method), new calculi and methods for theorem proving and verification in classical and non-classical logics, systems, tools, implementations and applications as well as automated deduction and formal methods applied to logic, mathematics, software development, protocol verification, and security}, isbn = {978-3-642-40537-2}, langid = {english}, lccn = {006.3}, keywords = {Artificial intelligence,Artificial Intelligence,Computer logic,Computer programming,Computer science,Discrete Mathematics in Computer Science,Logics and Meanings of Programs,Mathematical logic,Mathematical Logic and Formal Languages,Mathematics,Programming Techniques,Software engineering,Software Engineering}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\RPJAERJF\\Galmiche et Larchey-Wendling - 2013 - Automated Reasoning with Analytic Tableaux and Rel.pdf} + file = {/home/sguillou/Zotero/storage/RPJAERJF/Galmiche et Larchey-Wendling - 2013 - Automated Reasoning with Analytic Tableaux and Rel.pdf} } @incollection{ganzingerDPLLFastDecision2004, @@ -1544,10 +1567,9 @@ @incollection{ganzingerDPLLFastDecision2004 address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-540-27813-9_14}, urldate = {2021-10-01}, - abstract = {The logic of equality with uninterpreted functions (EUF) and its extensions have been widely applied to processor verification, by means of a large variety of progressively more sophisticated (lazy or eager ) translations into propositional SAT. Here we propose a new approach, namely a general DPLL(X) engine, whose parameter X can be instantiated with a specialized solver SolverT for a given theory T , thus producing a system DPLL(T ). We describe this DPLL(T ) scheme, the interface between DPLL(X) and SolverT , the architecture of DPLL(X), and our solver for EUF, which includes incremental and backtrackable congruence closure algorithms for dealing with the built-in equality and the integer successor and predecessor symbols. Experiments with a first implementation indicate that our technique already outperforms the previous methods on most benchmarks, and scales up very well.}, isbn = {978-3-540-22342-9 978-3-540-27813-9}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\WFMXAJ32\\Ganzinger et al. - 2004 - DPLL(T) Fast Decision Procedures.pdf} + file = {/home/sguillou/Zotero/storage/WFMXAJ32/Ganzinger et al. 
- 2004 - DPLL(T) Fast Decision Procedures.pdf} } @inproceedings{ganzingerRelatingSemanticProoftheoretic2001, @@ -1559,9 +1581,8 @@ @inproceedings{ganzingerRelatingSemanticProoftheoretic2001 pages = {81--90}, issn = {1043-6871}, doi = {10.1109/LICS.2001.932485}, - abstract = {Compares three approaches to polynomial-time decidability for uniform word problems for quasi-varieties. Two of the approaches, by T. Evans (1951) and S. Burris (1995), respectively, are semantic, referring to certain embeddability and axiomatizability properties. The third approach is more proof-theoretic in nature, inspired by D. McAllester's (1993) concept of local inference. We define two closely related notions of locality for equational Horn theories and show that both of the criteria of Evans and Burris lie in between these two concepts. In particular, the variant we call "stable locality" is shown to subsume both Evans' and Burris's methods.}, keywords = {Algebra,Counting circuits,Dynamic programming,Encoding,Equations,Geometry,Lattices,Polynomials,Print}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\GSS58EJB\\Ganzinger - 2001 - Relating semantic and proof-theoretic concepts for.pdf;C\:\\Users\\Simon\\Zotero\\storage\\QPFMI745\\932485.html} + file = {/home/sguillou/Zotero/storage/GSS58EJB/Ganzinger - 2001 - Relating semantic and proof-theoretic concepts for.pdf;/home/sguillou/Zotero/storage/QPFMI745/932485.html} } @article{gelfondClassicalNegationLogic1991, @@ -1576,10 +1597,9 @@ @article{gelfondClassicalNegationLogic1991 issn = {1882-7055}, doi = {10.1007/BF03037169}, urldate = {2023-06-26}, - abstract = {An important limitation of traditional logic programming as a knowledge representation tool, in comparison with classical logic, is that logic programming does not allow us to deal directly with incomplete information. In order to overcome this limitation, we extend the class of general logic programs by including classical negation, in addition to negation-as-failure. The semantics of such extended programs is based on the method of stable models. The concept of a disjunctive database can be extended in a similar way. We show that some facts of commonsense knowledge can be represented by logic programs and disjunctive databases more easily when classical negation is available. Computationally, classical negation can be eliminated from extended programs by a simple preprocessor. Extended programs are identical to a special case of default theories in the sense of Reiter.}, langid = {english}, keywords = {Disjunctive Databases,Incomplete Information,Logic Programming,Negation as Failure,Nonmonotonic Reasoning,Stable Models}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\FX9KQ65C\\Gelfond and Lifschitz - 1991 - Classical negation in logic programs and disjuncti.pdf} + file = {/home/sguillou/Zotero/storage/FX9KQ65C/Gelfond and Lifschitz - 1991 - Classical negation in logic programs and disjuncti.pdf} } @inproceedings{gelfondStableModelSemantics1988, @@ -1588,8 +1608,7 @@ @inproceedings{gelfondStableModelSemantics1988 author = {Gelfond, M. and Lifschitz, V.}, year = {1988}, urldate = {2023-06-26}, - abstract = {This paper studies the stable model semantics of logic programs with (abstract) constraint atoms and their properties. We introduce a succinct abstract representation of these constraint atoms in which a constraint atom is represented compactly. We show two applications. 
First, under this representation of constraint atoms, we generalize the Gelfond\textendash Lifschitz transformation and apply it to define stable models (also called answer sets) for logic programs with arbitrary constraint atoms. The resulting semantics turns out to coincide with the one defined by Son et al. (2007), which is based on a fixpoint approach. One advantage of our approach is that it can be applied, in a natural way, to define stable models for disjunctive logic programs with constraint atoms, which may appear in the disjunctive head as well as in the body of a rule. As a result, our approach to the stable model semantics for logic programs with constraint atoms generalizes a number of previous approaches. Second, we show that our abstract representation of constraint atoms provides a means to characterize dependencies of atoms in a program with constraint atoms, so that some standard characterizations and properties relying on these dependencies in the past for logic programs with ordinary atoms can be extended to logic programs with constraint atoms.}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\67AXEBQW\\Gelfond and Lifschitz - 1988 - The Stable Model Semantics for Logic Programming.pdf} + file = {/home/sguillou/Zotero/storage/67AXEBQW/Gelfond and Lifschitz - 1988 - The Stable Model Semantics for Logic Programming.pdf} } @article{gentzenUntersuchungenUberLogische1935, @@ -1602,7 +1621,7 @@ @article{gentzenUntersuchungenUberLogische1935 issn = {0025-5874; 1432-1823}, urldate = {2023-02-16}, langid = {und}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\ABN8GQFX\\168546.html} + file = {/home/sguillou/Zotero/storage/ABN8GQFX/168546.html} } @incollection{geuversConsistencyExtensionsHigher2007, @@ -1618,11 +1637,10 @@ @incollection{geuversConsistencyExtensionsHigher2007 issn = {0302-9743, 1611-3349}, doi = {10.1007/978-3-540-74464-1_10}, urldate = {2020-11-02}, - abstract = {It is well-known, due to the work of Girard and Coquand, that adding polymorphic domains to higher order logic, HOL, or its type theoretic variant {$\lambda$}HOL, renders the logic inconsistent. This is known as Girard's paradox, see [7]. But there is also a another presentation of higher order logic, in its type theoretic variant called {$\lambda$}PRED{$\omega$}, to which polymorphic domains can be added safely, Both {$\lambda$}HOL and {$\lambda$}PRED{$\omega$} are well-known type systems and in this paper we study why {$\lambda$}HOL with polymorphic domains is inconsistent and why nd {$\lambda$}PRED{$\omega$} with polymorphic domains remains consistent. 
We do this by describing a simple model for the latter and we show why this can not be a model of the first.}, isbn = {978-3-540-74463-4 978-3-540-74464-1}, langid = {english}, keywords = {Print,Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\4M9AGALP\\Geuvers - 2007 - (In)consistency of Extensions of Higher Order Logi.pdf} + file = {/home/sguillou/Zotero/storage/4M9AGALP/Geuvers - 2007 - (In)consistency of Extensions of Higher Order Logi.pdf} } @book{geuversLogicsTypeSystems1993, @@ -1633,7 +1651,7 @@ @book{geuversLogicsTypeSystems1993 isbn = {978-90-90-06352-2}, langid = {english}, keywords = {Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\NQFWVT3X\\Geuvers - 1993 - Logics and type systems.pdf} + file = {/home/sguillou/Zotero/storage/NQFWVT3X/Geuvers - 1993 - Logics and type systems.pdf} } @article{geuversProofAssistantsHistory2009, @@ -1649,9 +1667,8 @@ @article{geuversProofAssistantsHistory2009 issn = {0256-2499, 0973-7677}, doi = {10.1007/s12046-009-0001-5}, urldate = {2022-06-20}, - abstract = {In this paper I will discuss the fundamental ideas behind proof assistants: What are they and what is a proof anyway? I give a short history of the main ideas, emphasizing the way they ensure the correctness of the mathematics formalized. I will also briefly discuss the places where proof assistants are used and how we envision their extended use in the future. While being an introduction into the world of proof assistants and the main issues behind them, this paper is also a position paper that pushes the further use of proof assistants. We believe that these systems will become the future of mathematics, where definitions, statements, computations and proofs are all available in a computerized form. An important application is and will be in computer supported modelling and verification of systems. But there is still a long road ahead and I will indicate what we believe is needed for the further proliferation of proof assistants.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\H34YPKND\\Geuvers - 2009 - Proof assistants History, ideas and future.pdf} + file = {/home/sguillou/Zotero/storage/H34YPKND/Geuvers - 2009 - Proof assistants History, ideas and future.pdf} } @article{geuversPureTypeSystems2010, @@ -1667,7 +1684,7 @@ @article{geuversPureTypeSystems2010 urldate = {2020-10-20}, langid = {english}, keywords = {Print,Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\WPGY7WTC\\Geuvers et al. - 2010 - Pure Type Systems without Explicit Contexts.pdf} + file = {/home/sguillou/Zotero/storage/WPGY7WTC/Geuvers et al. - 2010 - Pure Type Systems without Explicit Contexts.pdf} } @article{ghilardiHigherOrderQuantifierElimination2020, @@ -1679,24 +1696,19 @@ @article{ghilardiHigherOrderQuantifierElimination2020 issn = {0168-7433, 1573-0670}, doi = {10.1007/s10817-020-09578-5}, urldate = {2020-09-17}, - abstract = {We develop quantifier elimination procedures for fragments of higher order logic arising from the formalization of distributed systems (especially of fault-tolerant ones). Such procedures can be used in symbolic manipulations like the computation of pre/post images and of projections. We show in particular that our procedures are quite effective in producing counter abstractions that can be model-checked using standard SMT technology. In fact, very often in the current literature verification tasks for distributed systems are accomplished via counter abstractions. Such abstractions can sometimes be justified via simulations and bisimulations. 
In this work, we supply logical foundations to this practice, by our technique for second order quantifier elimination. We implemented our procedure for a simplified (but still expressive) subfragment and we showed that our method is able to successfully handle verification benchmarks from various sources with interesting performances.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\BD5HDFW2\\Ghilardi and Pagani - 2020 - Higher-Order Quantifier Elimination, Counter Simul.pdf} + file = {/home/sguillou/Zotero/storage/BD5HDFW2/Ghilardi and Pagani - 2020 - Higher-Order Quantifier Elimination, Counter Simul.pdf} } @book{girardProofsTypes1989, title = {Proofs and Types}, - author = {Girard, Jean-Yves}, + author = {Girard, Jean-Yves and Taylor, Paul and Lafont, Yves}, year = {1989}, - series = {Cambridge Tracts in Theoretical Computer Science}, - number = {7}, + month = mar, publisher = {{Cambridge University Press}}, - address = {{Cambridge [England] ; New York}}, + address = {{USA}}, isbn = {978-0-521-37181-0}, - langid = {english}, - lccn = {QA9.5 .G57 1989}, - keywords = {Lambda calculus}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\PU5UTCQT\\Girard - 1989 - Proofs and types.pdf} + file = {/home/sguillou/Zotero/storage/PU5UTCQT/Girard - 1989 - Proofs and types.pdf} } @article{givanPolynomialtimeComputationLocal2002, @@ -1711,9 +1723,8 @@ @article{givanPolynomialtimeComputationLocal2002 issn = {1529-3785}, doi = {10.1145/566385.566387}, urldate = {2023-03-30}, - abstract = {We consider the concept of a local set of inference rules. A local rule set can be automatically transformed into a rule set for which bottom-up evaluation terminates in polynomial time. The local-rule-set transformation gives polynomial-time evaluation strategies for a large variety of rule sets that cannot be given terminating evaluation strategies by any other known automatic technique. This article discusses three new results. First, it is shown that every polynomial-time predicate can be defined by an (unstratified) local rule set. Second, a new machine-recognizable subclass of the local rule sets is identified. Finally, we show that locality, as a property of rule sets, is undecidable in general.}, keywords = {Automated reasoning,decision procedures,descriptive complexity theory}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\HZLETMVF\\Givan and Mcallester - 2002 - Polynomial-time computation via local inference re.pdf} + file = {/home/sguillou/Zotero/storage/HZLETMVF/Givan and Mcallester - 2002 - Polynomial-time computation via local inference re.pdf} } @incollection{goelBDDBasedProcedures1998, @@ -1728,10 +1739,9 @@ @incollection{goelBDDBasedProcedures1998 address = {{Berlin, Heidelberg}}, doi = {10.1007/BFb0028749}, urldate = {2023-04-04}, - abstract = {The logic of equality with uninterpreted functions has been proposed for verifying abstract hardware designs. The ability to perform fast satis ability checking over this logic is imperative for this veri cation paradigm to be successful. We present symbolic methods for satis ability checking for this logic. The rst procedure is based on restricting analysis to nite instantiations of the design. The second procedure directly reasons about equality by introducing Boolean-valued indicator variables for equality. Theoretical and experimental evidence shows the superiority of the second approach.}, isbn = {978-3-540-64608-2 978-3-540-69339-0}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\YJX5FT2Q\\Goel et al. 
- 1998 - BDD based procedures for a theory of equality with.pdf} + file = {/home/sguillou/Zotero/storage/YJX5FT2Q/Goel et al. - 1998 - BDD based procedures for a theory of equality with.pdf} } @article{goldblattSemanticAnalysisOrthologic1974, @@ -1747,7 +1757,7 @@ @article{goldblattSemanticAnalysisOrthologic1974 doi = {10.1007/BF00652069}, urldate = {2023-03-30}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\6HWCBMQS\\Goldblatt - 1974 - Semantic analysis of orthologic.pdf} + file = {/home/sguillou/Zotero/storage/6HWCBMQS/Goldblatt - 1974 - Semantic analysis of orthologic.pdf} } @article{gonthierFormalProofFour2008, @@ -1758,7 +1768,7 @@ @article{gonthierFormalProofFour2008 number = {11}, pages = {12}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\NR3MR9DW\\Gonthier - 2008 - Formal Proof—The Four- Color Theorem.pdf} + file = {/home/sguillou/Zotero/storage/NR3MR9DW/Gonthier - 2008 - Formal Proof—The Four- Color Theorem.pdf} } @article{gonthierFormalProofFour2008a, @@ -1769,7 +1779,7 @@ @article{gonthierFormalProofFour2008a number = {11}, pages = {12}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\VUB8CT7T\\Gonthier - 2008 - Formal Proof—The Four- Color Theorem.pdf} + file = {/home/sguillou/Zotero/storage/VUB8CT7T/Gonthier - 2008 - Formal Proof—The Four- Color Theorem.pdf} } @book{gordonEdinburghLCF1979, @@ -1785,7 +1795,7 @@ @book{gordonEdinburghLCF1979 urldate = {2023-02-16}, isbn = {978-3-540-09724-2 978-3-540-38526-4}, keywords = {computation,logic}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\GSDE8UFJ\\Gordon et al. - 1979 - Edinburgh LCF.pdf} + file = {/home/sguillou/Zotero/storage/GSDE8UFJ/Gordon et al. - 1979 - Edinburgh LCF.pdf} } @incollection{gordonSetTheoryHigher1996, @@ -1800,10 +1810,9 @@ @incollection{gordonSetTheoryHigher1996 address = {{Berlin, Heidelberg}}, doi = {10.1007/BFb0105405}, urldate = {2020-12-04}, - abstract = {The majority of general purpose mechanised proof assistants support versions of typed higher order logic, even though set theory is the standard foundation for mathematics. For many applications higher order logic works well and provides, for speci cation, the bene ts of type-checking that are well-known in programming. However, there are areas where types get in the way or seem unmotivated. Furthermore, most people with a scienti c or engineering background already know set theory, but not higher order logic. This paper discusses some approaches to getting the best of both worlds: the expressiveness and standardness of set theory with the e cient treatment of functions provided by typed higher order logic.}, isbn = {978-3-540-61587-3 978-3-540-70641-0}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\CU4NYIV8\\Gordon - 1996 - Set theory, higher order logic or both.pdf} + file = {/home/sguillou/Zotero/storage/CU4NYIV8/Gordon - 1996 - Set theory, higher order logic or both.pdf} } @article{gradelDominoesComplexitySubclasses1989, @@ -1818,9 +1827,8 @@ @article{gradelDominoesComplexitySubclasses1989 issn = {0168-0072}, doi = {10.1016/0168-0072(89)90023-7}, urldate = {2021-07-09}, - abstract = {The complexity of subclasses of logical theories (mainly Presburger and Skolem arithmetic) is studied. The subclasses are defined by the structure of the quantifier prefix. For this purpose finite versions of dominoes (tiling problems) are used. 
Dominoes were introduced in the sixties as a tool to prove the undecidability of the {$\forall\exists\forall$}-case of the predicate calculus and have found in the meantime many other applications. Here it is shown that problems in complexity classes NTIME(T(n)) are reducible to domino problems where the space to be tiled is a square of size T(n). Because of their simple combinatorial structure these dominoes provide a convinient method for providing lower complexity bounds for simple formula classes in logical theories. Using this method it is shown that the class of {$\exists\forall{_\ast}$}-formulas in Presburger arithmetic has exponential complexity. This seems to be the simplest class with this property because the set of {$\exists{_\ast}$}-sentences in Presburger arithmetic is NP-complete and the classes which is shown to be fixed prefixes (i.e. where also the number of variables is limited) are all contained in appropriate levels of the polynomial time-hierarchy. Skolem arithmetic is the theory of positive natural numbers with multiplication and 's thus (isomorphic to) the weak direct power of Presburger arithmetic. For the theory in general as well as for most subclasses the complexity is one exponential step higher than in the case of Presburger arithmetic. An exception is the class of {$\exists{_\ast}$}-formulas which is shown to be NP-complete. On the other hand there is a formula class with fixed dimension which already has an exponential lower complexity bound. The last section mentions some results on other logical theories and indicates some possible lines of future research.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\C2PCLUYF\\Grädel - 1989 - Dominoes and the complexity of subclasses of logic.pdf;C\:\\Users\\Simon\\Zotero\\storage\\MK6ST48Q\\0168007289900237.html} + file = {/home/sguillou/Zotero/storage/C2PCLUYF/Grädel - 1989 - Dominoes and the complexity of subclasses of logic.pdf;/home/sguillou/Zotero/storage/MK6ST48Q/0168007289900237.html} } @article{guillenMATHEMATICSMAY2018, @@ -1828,7 +1836,7 @@ @article{guillenMATHEMATICSMAY2018 author = {Guillen, Alejandro Manuel and Freese, Ralph and Nation, J B and Ross, David and {Kjos-Hanssen}, Bj{\o}rn and Crosby, Martha}, pages = {41}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\KIRWAMEK\\Guillen et al. - MATHEMATICS MAY 2018.pdf} + file = {/home/sguillou/Zotero/storage/KIRWAMEK/Guillen et al. - MATHEMATICS MAY 2018.pdf} } @inproceedings{guilloudEquivalenceCheckingOrthocomplemented2022, @@ -1842,13 +1850,13 @@ @inproceedings{guilloudEquivalenceCheckingOrthocomplemented2022 publisher = {{Springer International Publishing}}, address = {{Cham}}, doi = {10.1007/978-3-030-99527-0_11}, - abstract = {Motivated by proof checking, we consider the problem of efficiently establishing equivalence of propositional formulas by relaxing the completeness requirements while still providing certain guarantees. We present a quasilinear time algorithm to decide the word problem on a natural algebraic structures we call orthocomplemented bisemilattices, a subtheory of Boolean algebra. The starting point for our procedure is a variation of Aho, Hopcroft, Ullman algorithm for isomorphism of trees, which we generalize to directed acyclic graphs. We combine this algorithm with a term rewriting system we introduce to decide equivalence of terms. We prove that our rewriting system is terminating and confluent, implying the existence of a normal form. 
We then show that our algorithm computes this normal form in log linear (and thus sub-quadratic) time. We provide pseudocode and a minimal working implementation in Scala.}, + copyright = {All rights reserved}, isbn = {978-3-030-99527-0}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\7GLKMGFQ\\Guilloud and Kunčak - 2022 - Equivalence Checking for Orthocomplemented Bisemil.pdf} + file = {/home/sguillou/Zotero/storage/7GLKMGFQ/Guilloud and Kunčak - 2022 - Equivalence Checking for Orthocomplemented Bisemil.pdf} } -@inproceedings{guilloudFormulaNormalizationsVerification2023a, +@inproceedings{guilloudFormulaNormalizationsVerification2023, title = {Formula {{Normalizations}} in {{Verification}}}, booktitle = {35th {{International Conference}} on {{Computer Aided Verification}}}, author = {Guilloud, Simon and Bucev, Mario and Milovancevic, Dragana and Kuncak, Viktor}, @@ -1857,17 +1865,18 @@ @inproceedings{guilloudFormulaNormalizationsVerification2023a pages = {-}, publisher = {{Springer}}, address = {{Paris}}, - abstract = {We propose a new approach for normalization and simplification of logical formulas. Our approach is based on algorithms for lattice-like structures. Specifically, we present two efficient algorithms for computing a normal form and deciding the word problem for two subtheories of Boolean algebra, giving a sound procedure for propositional logical equivalence that is incomplete in general but complete with respect to a subset of Boolean algebra axioms. We first show a new algorithm to produce a normal form for expressions in the theory of ortholattices (OL) in time O(n\^2). We also consider an algorithm, recently presented but never evaluated in practice, producing a normal form for a slightly weaker theory, orthocomplemented bisemilattices (OCBSL), in time O(n log(n)\^2). For both algorithms, we present an implementation and show efficiency in two domains. First, we evaluate the algorithms on large propositional expressions, specifically combinatorial circuits from a benchmark suite, as well as on large random formulas. Second, we implement and evaluate the algorithms in the Stainless verifier, a tool for verifying the correctness of Scala programs. We used these algorithms as a basis for a new formula simplifier, which is applied before valid verification conditions are saved into a persistent cache. The results show that normalization substantially increases cache hit ratio in large benchmarks} + copyright = {All rights reserved} } @article{guilloudLISAFoundationalTheorem, title = {{{LISA}}: {{Towards}} a {{Foundational Theorem Prover}}}, author = {Guilloud, Simon and Cassayre, Florian and Kun{\v c}ak, Viktor}, + copyright = {All rights reserved}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\E8NT4Q6C\\Guilloud et al. - LISA Towards a Foundational Theorem Prover.pdf} + file = {/home/sguillou/Zotero/storage/E8NT4Q6C/Guilloud et al. 
- LISA Towards a Foundational Theorem Prover.pdf} } -@inproceedings{guilloudLISAModernProof2023a, +@inproceedings{guilloudLISAModernProof2023, title = {{{LISA}} \textendash{} {{A Modern Proof System}}}, booktitle = {14th {{Conference}} on {{Interactive Theorem Proving}}}, author = {Guilloud, Simon and Gambhir, Sankalp and Kuncak, Viktor}, @@ -1876,7 +1885,7 @@ @inproceedings{guilloudLISAModernProof2023a pages = {0}, publisher = {{Daghstuhl}}, address = {{Bialystok}}, - abstract = {We present LISA, a proof system and proof assistant for constructing proofs in schematic first-order logic and axiomatic set theory. The logical kernel of the system is a proof checker for first-order logic with equality and schematic predicate and function symbols. It implements polynomial-time proof checking and uses the axioms of ortholattices (which implies the irrelevance of the order of conjuncts and disjuncts and additional propositional laws). The kernel supports the notion of theorems (whose proofs are not expanded), as well as definitions of predicate symbols and objects whose unique existence is proven. A domain-specific language enables construction of proofs and development of proof tactics with user-friendly tools and presentation, while remaining within the general-purpose language, Scala. We describe the LISA proof system and illustrate the flavour and the level of abstraction of proofs written in LISA. This includes a proof-generating tactic for propositional tautologies, leveraging the ortholattice properties to reduce the size of proofs. We also present early formalization of set theory in LISA, including Cantor's theorem}, + copyright = {All rights reserved}, keywords = {First Order Logic,Interactive Theorem Proving,Proof System} } @@ -1886,19 +1895,34 @@ @misc{guilloudLISAReferenceManual2023 year = {2023}, month = feb, urldate = {2023-02-16}, - abstract = {Proof assistant based on first-order logic and set theory}, copyright = {Apache-2.0}, howpublished = {EPFL-LARA} } +@misc{guilloudOrthologicAxioms2023, + title = {Orthologic with {{Axioms}}}, + author = {Guilloud, Simon and Kuncak, Viktor}, + year = {2023}, + month = jul, + number = {arXiv:2307.07569}, + eprint = {2307.07569}, + primaryclass = {cs, math}, + publisher = {{arXiv}}, + doi = {10.48550/arXiv.2307.07569}, + urldate = {2023-08-24}, + archiveprefix = {arxiv}, + copyright = {All rights reserved}, + keywords = {Computer Science - Logic in Computer Science,Mathematics - Logic}, + file = {/home/sguillou/Zotero/storage/SMNMGSB4/Guilloud and Kuncak - 2023 - Orthologic with Axioms.pdf;/home/sguillou/Zotero/storage/49N6BSQG/2307.html} +} + @article{gveroCompleteCompletionUsing, title = {Complete {{Completion}} Using {{Types}} and {{Weights}}}, author = {Gvero, Tihomir and Kuncak, Viktor and Kuraj, Ivan and Piskac, Ruzica}, pages = {12}, - abstract = {Developing modern software typically involves composing functionality from existing libraries. This task is difficult because libraries may expose many methods to the developer. To help developers in such scenarios, we present a technique that synthesizes and suggests valid expressions of a given type at a given program point. As the basis of our technique we use type inhabitation for lambda calculus terms in long normal form. We introduce a succinct representation for type judgements that merges types into equivalence classes to reduce the search space, then reconstructs any desired number of solutions on demand. 
Furthermore, we introduce a method to rank solutions based on weights derived from a corpus of code. We implemented the algorithm and deployed it as a plugin for the Eclipse IDE for Scala. We show that the techniques we incorporated greatly increase the effectiveness of the approach. Our evaluation benchmarks are code examples from programming practice; we make them available for future comparisons.}, langid = {english}, keywords = {Print,Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\KH6S4JBF\\Gvero et al. - Complete Completion using Types and Weights.pdf} + file = {/home/sguillou/Zotero/storage/KH6S4JBF/Gvero et al. - Complete Completion using Types and Weights.pdf} } @inproceedings{haasePresburgerArithmeticStars2019, @@ -1909,9 +1933,8 @@ @inproceedings{haasePresburgerArithmeticStars2019 month = jun, pages = {1--14}, doi = {10.1109/LICS.2019.8785850}, - abstract = {We study the computational complexity of existential Presburger arithmetic with (possibly nested occurrences of) a Kleene-star operator. In addition to being a natural extension of Presburger arithmetic, our investigation is motivated by two other decision problems. The first problem is the rational subset membership problem in graph groups. A graph group is an infinite group specified by a finite undirected graph. While a characterisation of graph groups with a decidable rational subset membership problem was given by Lohrey and Steinberg [J. Algebra, 320(2) (2008)], it has been an open problem (i) whether the decidable fragment has elementary complexity and (ii) what is the complexity for each fixed graph group. The second problem is the reachability problem for integer vector addition systems with states and nested zero tests. We prove that the satisfiability problem for existential Pres-burger arithmetic with stars is NEXP-complete and that all three problems are polynomially inter-reducible. Moreover, we consider for each problem a variant with a fixed parameter: We fix the star-height in the logic, a graph parameter for the membership problem, and the number of distinct zero-tests in the integer vector addition systems. We establish NP-completeness of all problems with fixed parameters. In particular, this enables us to obtain a complete description of the complexity landscape of the rational subset membership problem for fixed graph groups: If the graph is a clique, the problem is N L-complete. If the graph is a disjoint union of cliques, it is P-complete. If it is a transitive forest (and not a union of cliques), the problem is NP-complete. 
Otherwise, the problem is undecidable.}, keywords = {Automata,automata theory,computational complexity,Computational complexity,Computational modeling,Computer science,decidability,decidable rational subset membership problem,decision problems,finite undirected graph,fixed graph group,Grammar,graph parameter,group theory,integer vector addition systems,Kleene-star operator,NP-completeness,polynomials,Pres-burger arithmetic,rational subsets,reachability analysis,reachability problem,satisfiability problem,set theory,Tools,zero tests,zero-tests}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\2ET9TMU2\\Haase and Zetzsche - 2019 - Presburger arithmetic with stars, rational subsets.pdf} + file = {/home/sguillou/Zotero/storage/2ET9TMU2/Haase and Zetzsche - 2019 - Presburger arithmetic with stars, rational subsets.pdf} } @phdthesis{haddadIntegratingAutomatedTheorem2021, @@ -1920,10 +1943,9 @@ @phdthesis{haddadIntegratingAutomatedTheorem2021 year = {2021}, month = sep, urldate = {2023-03-16}, - abstract = {Lambdapi is a proof assistant that allows users to construct a proof of a given theorem in a universal language based on the lambda-pi-calculus. The goal of this thesis is to add more automation to Lambdapi to gain more time and effort for the users. This thesis presents three contributions associated with the integration of automated provers in proof assistants. The first contribution consists of the implementation of a tactic that calls automated provers from Lambdapi by using an external platform called Why3. Usually, automated provers do not generate a complete certificate of a given formula, thus, the second contribution presented in this thesis is the reconstruction in Lambdapi of proofs generated by first-order automated provers implemented in a tool called Ekstrakto. Finally, automated provers often perform some transformations on the formula that they are trying to solve. Among these transformations, we can find Skolemization steps. The last contribution is devoted to the certification of Skolemization steps performed by the automated provers in order to have a complete reconstruction. This has been implemented in a tool called Skonverto.}, langid = {english}, school = {Universit\'e Paris-Saclay}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\CLPL2CBQ\\Haddad - 2021 - Integrating Automated Theorem Provers in Proof Ass.pdf} + file = {/home/sguillou/Zotero/storage/CLPL2CBQ/Haddad - 2021 - Integrating Automated Theorem Provers in Proof Ass.pdf} } @article{halesFORMALPROOFKEPLER2017, @@ -1936,16 +1958,15 @@ @article{halesFORMALPROOFKEPLER2017 issn = {2050-5086}, doi = {10.1017/fmp.2017.1}, urldate = {2022-06-20}, - abstract = {This article describes a formal proof of the Kepler conjecture on dense sphere packings in a combination of the HOL Light and Isabelle proof assistants. This paper constitutes the official published account of the now completed Flyspeck project.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\F5AMQT5Q\\Hales et al. - 2017 - A FORMAL PROOF OF THE KEPLER CONJECTURE.pdf} + file = {/home/sguillou/Zotero/storage/F5AMQT5Q/Hales et al. 
- 2017 - A FORMAL PROOF OF THE KEPLER CONJECTURE.pdf} } @misc{HandbookProofTheory, title = {Handbook of {{Proof Theory}}, {{Volume}} 137 - 1st {{Edition}}}, urldate = {2023-03-25}, howpublished = {https://www.elsevier.com/books/handbook-of-proof-theory/buss/978-0-444-89840-1}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\9ED3YNBG\\978-0-444-89840-1.html} + file = {/home/sguillou/Zotero/storage/9ED3YNBG/978-0-444-89840-1.html} } @article{hardegreeMaterialImplicationOrthomodular1981, @@ -1961,7 +1982,7 @@ @article{hardegreeMaterialImplicationOrthomodular1981 doi = {10.1305/ndjfl/1093883401}, urldate = {2021-09-21}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\M66K7KR5\\Hardegree - 1981 - Material implication in orthomodular (and Boolean).pdf} + file = {/home/sguillou/Zotero/storage/M66K7KR5/Hardegree - 1981 - Material implication in orthomodular (and Boolean).pdf} } @article{harrisonHOLLightManual, @@ -1970,7 +1991,7 @@ @article{harrisonHOLLightManual pages = {116}, langid = {english}, keywords = {Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\LQG76PNZ\\Harrison - The HOL Light manual (1.1).pdf} + file = {/home/sguillou/Zotero/storage/LQG76PNZ/Harrison - The HOL Light manual (1.1).pdf} } @incollection{harrisonHOLLightOverview2009, @@ -1986,10 +2007,9 @@ @incollection{harrisonHOLLightOverview2009 address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-03359-9_4}, urldate = {2020-10-28}, - abstract = {HOL Light is an interactive proof assistant for classical higherorder logic, intended as a clean and simplified version of Mike Gordon's original HOL system. Theorem provers in this family use a version of ML as both the implementation and interaction language; in HOL Light's case this is Objective CAML (OCaml). Thanks to its adherence to the so-called `LCF approach', the system can be extended with new inference rules without compromising soundness. While retaining this reliability and programmability from earlier HOL systems, HOL Light is distinguished by its clean and simple design and extremely small logical kernel. Despite this, it provides powerful proof tools and has been applied to some non-trivial tasks in the formalization of mathematics and industrial formal verification.}, isbn = {978-3-642-03358-2 978-3-642-03359-9}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\57CTTIRK\\Harrison - 2009 - HOL Light An Overview.pdf} + file = {/home/sguillou/Zotero/storage/57CTTIRK/Harrison - 2009 - HOL Light An Overview.pdf} } @article{harrisonHOLLightPrimitive, @@ -1998,17 +2018,16 @@ @article{harrisonHOLLightPrimitive pages = {17}, langid = {english}, keywords = {Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\3NI7F2II\\Harrison and Richter - THE HOL LIGHT PRIMITIVE INFERENCE RULES IMPLY THE .pdf} + file = {/home/sguillou/Zotero/storage/3NI7F2II/Harrison and Richter - THE HOL LIGHT PRIMITIVE INFERENCE RULES IMPLY THE .pdf} } @article{harrisonHOLLightTutorial, title = {{{HOL Light Tutorial}} (for Version 2.20)}, author = {Harrison, John and Jf, Intel}, pages = {230}, - abstract = {The HOL Light theorem prover can be difficult to get started with. While the manual is fairly detailed and comprehensive, the large amount of background information that has to be absorbed before the user can do anything interesting is intimidating. Here we give an alternative `quick start' guide, aimed at teaching basic use of the system quickly by means of a graded set of examples. 
Some readers may find it easier to absorb; those who do not are referred after all to the standard manual.}, langid = {english}, keywords = {Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\L76S6WQ2\\Harrison and Jf - HOL Light Tutorial (for version 2.20).pdf} + file = {/home/sguillou/Zotero/storage/L76S6WQ2/Harrison and Jf - HOL Light Tutorial (for version 2.20).pdf} } @inproceedings{harrisonLetMakeSet, @@ -2019,7 +2038,7 @@ @inproceedings{harrisonLetMakeSet address = {{Aussois}}, langid = {english}, keywords = {Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\JQW46D6M\\Services - Let's make set theory great again!.pdf} + file = {/home/sguillou/Zotero/storage/JQW46D6M/Services - Let's make set theory great again!.pdf} } @inproceedings{harrisonSelfverificationHOLLight2006, @@ -2033,27 +2052,36 @@ @inproceedings{harrisonSelfverificationHOLLight2006 publisher = {{Springer}}, address = {{Berlin, Heidelberg}}, doi = {10.1007/11814771_17}, - abstract = {The HOL Light prover is based on a logical kernel consisting of about 400 lines of mostly functional OCaml, whose complete formal verification seems to be quite feasible. We would like to formally verify (i) that the abstract HOL logic is indeed correct, and (ii) that the OCaml code does correctly implement this logic. We have performed a full verification of an imperfect but quite detailed model of the basic HOL Light core, without definitional mechanisms, and this verification is entirely conducted with respect to a set-theoretic semantics within HOL Light itself. We will duly explain why the obvious logical and pragmatic difficulties do not vitiate this approach, even though it looks impossible or useless at first sight. Extension to include definitional mechanisms seems straightforward enough, and the results so far allay most of our practical worries.}, isbn = {978-3-540-37188-5}, langid = {english}, keywords = {Boolean Type,Concrete Syntax,Mathematical Axiom,Polymorphic Type,Proof Checker}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\RYMTFJTF\\Harrison - 2006 - Towards Self-verification of HOL Light.pdf} + file = {/home/sguillou/Zotero/storage/RYMTFJTF/Harrison - 2006 - Towards Self-verification of HOL Light.pdf} +} + +@inproceedings{henzingerAbstractionsProofs2004, + title = {Abstractions from Proofs}, + booktitle = {Proceedings of the 31st {{ACM SIGPLAN-SIGACT Symposium}} on {{Principles}} of {{Programming Languages}}, {{POPL}} 2004, {{Venice}}, {{Italy}}, {{January}} 14-16, 2004}, + author = {Henzinger, Thomas A. and Jhala, Ranjit and Majumdar, Rupak and McMillan, Kenneth L.}, + editor = {Jones, Neil D. 
and Leroy, Xavier}, + year = {2004}, + pages = {232--244}, + publisher = {{ACM}}, + doi = {10.1145/964001.964021} } @article{herrmannVarietiesModularOrtholattices, title = {On Varieties of Modular Ortholattices Which Are Generated by Their Finite-Dimensional Members}, author = {Herrmann, Christian and Roddy, Micheale S}, pages = {9}, - abstract = {We prove that the following three conditions on a modular ortholattice, L, with respect to a given variety of modular ortholattices, V, are equivalent: L is in the variety of modular ortholattices generated by the finite-dimensional members of V; L can be embedded in an atomistic member of V; L has an orthogeometric representation in an anisotropic orthogeometry (Q, {$\perp$}), where [0, u] {$\in$} V, for all u {$\in$} Lfin(Q).}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\LR4JDFRY\\Herrmann and Roddy - On varieties of modular ortholattices which are ge.pdf} + file = {/home/sguillou/Zotero/storage/LR4JDFRY/Herrmann and Roddy - On varieties of modular ortholattices which are ge.pdf} } @misc{HierarchiesPolynomiallySolvable, title = {Hierarchies of Polynomially Solvable Satisfiability Problems | {{SpringerLink}}}, urldate = {2023-06-10}, howpublished = {https://link.springer.com/article/10.1007/BF02127974}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\C5UQ9T3U\\BF02127974.html} + file = {/home/sguillou/Zotero/storage/C5UQ9T3U/BF02127974.html} } @article{hietalaVerifiedOptimizerQuantum2021, @@ -2067,9 +2095,37 @@ @article{hietalaVerifiedOptimizerQuantum2021 pages = {37:1--37:29}, doi = {10.1145/3434318}, urldate = {2023-06-23}, - abstract = {We present VOQC, the first fully verified optimizer for quantum circuits, written using the Coq proof assistant. Quantum circuits are expressed as programs in a simple, low-level language called SQIR, a simple quantum intermediate representation, which is deeply embedded in Coq. Optimizations and other transformations are expressed as Coq functions, which are proved correct with respect to a semantics of SQIR programs. SQIR uses a semantics of matrices of complex numbers, which is the standard for quantum computation, but treats matrices symbolically in order to reason about programs that use an arbitrary number of quantum bits. SQIR's careful design and our provided automation make it possible to write and verify a broad range of optimizations in VOQC, including full-circuit transformations from cutting-edge optimizers.}, keywords = {Certified Compilation,Circuit Optimization,Formal Verification,Programming Languages,Quantum Computing}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\FY8BLQTF\\Hietala et al. - 2021 - A verified optimizer for Quantum circuits.pdf} + file = {/home/sguillou/Zotero/storage/FY8BLQTF/Hietala et al. - 2021 - A verified optimizer for Quantum circuits.pdf} +} + +@inproceedings{hoderInterpolationSymbolElimination2010, + title = {Interpolation and {{Symbol Elimination}} in {{Vampire}}}, + booktitle = {Automated {{Reasoning}}, 5th {{International Joint Conference}}, {{IJCAR}} 2010, {{Edinburgh}}, {{UK}}, {{July}} 16-19, 2010. 
{{Proceedings}}}, + author = {Hoder, Krystof and Kov{\'a}cs, Laura and Voronkov, Andrei}, + editor = {Giesl, J{\"u}rgen and H{\"a}hnle, Reiner}, + year = {2010}, + series = {Lecture {{Notes}} in {{Computer Science}}}, + volume = {6173}, + pages = {188--195}, + publisher = {{Springer}}, + doi = {10.1007/978-3-642-14203-1_16}, + urldate = {2023-09-05} +} + +@article{hojjatELDARICAHornSolver2018, + title = {The {{ELDARICA Horn Solver}}}, + author = {Hojjat, Hossein and Rummer, Philipp}, + year = {2018}, + month = oct, + journal = {2018 Formal Methods in Computer Aided Design (FMCAD)}, + pages = {1--7}, + publisher = {{IEEE}}, + address = {{Austin, TX}}, + doi = {10.23919/FMCAD.2018.8603013}, + urldate = {2023-09-05}, + isbn = {9780983567882}, + file = {/home/sguillou/Zotero/storage/FCDP8MUE/Hojjat and Rummer - 2018 - The ELDARICA Horn Solver.pdf} } @article{hollidayFundamentalNonClassicalLogic2023, @@ -2084,9 +2140,8 @@ @article{hollidayFundamentalNonClassicalLogic2023 issn = {2813-0405}, doi = {10.3390/logics1010004}, urldate = {2023-06-04}, - abstract = {We give a proof-theoretic as well as a semantic characterization of a logic in the signature with conjunction, disjunction, negation, and the universal and existential quantifiers that we suggest has a certain fundamental status. We present a Fitch-style natural deduction system for the logic that contains only the introduction and elimination rules for the logical constants. From this starting point, if one adds the rule that Fitch called Reiteration, one obtains a proof system for intuitionistic logic in the given signature; if instead of adding Reiteration, one adds the rule of Reductio ad Absurdum, one obtains a proof system for orthologic; by adding both Reiteration and Reductio, one obtains a proof system for classical logic. Arguably neither Reiteration nor Reductio is as intimately related to the meaning of the connectives as the introduction and elimination rules are, so the base logic we identify serves as a more fundamental starting point and common ground between proponents of intuitionistic logic, orthologic, and classical logic. The algebraic semantics for the logic we motivate proof-theoretically is based on bounded lattices equipped with what has been called a weak pseudocomplementation. We show that such lattice expansions are representable using a set together with a reflexive binary relation satisfying a simple first-order condition, which yields an elegant relational semantics for the logic. This builds on our previous study of representations of lattices with negations, which we extend and specialize for several types of negation in addition to weak pseudocomplementation. Finally, we discuss ways of extending these representations to lattices with a conditional or implication operation.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\CARUH8WL\\Holliday - 2023 - A Fundamental Non-Classical Logic.pdf} + file = {/home/sguillou/Zotero/storage/CARUH8WL/Holliday - 2023 - A Fundamental Non-Classical Logic.pdf} } @misc{hollidayOrthologicEpistemicModals2022, @@ -2096,10 +2151,9 @@ @misc{hollidayOrthologicEpistemicModals2022 publisher = {{arXiv}}, doi = {10.48550/ARXIV.2203.02872}, urldate = {2023-06-04}, - abstract = {Epistemic modals have peculiar logical features that are challenging to account for in a broadly classical framework. 
For instance, while a sentence of the form \$p\textbackslash wedge\textbackslash Diamond\textbackslash neg p\$ ('\$p\$, but it might be that not \$p\$') appears to be a contradiction, \$\textbackslash Diamond\textbackslash neg p\$ does not entail \$\textbackslash neg p\$, which would follow in classical logic. Likewise, the classical laws of distributivity and disjunctive syllogism fail for epistemic modals. Existing attempts to account for these facts generally either under- or over-correct. Some predict that \$p\textbackslash wedge\textbackslash Diamond\textbackslash neg p\$, a so-called epistemic contradiction, is a contradiction only in an etiolated sense, under a notion of entailment that does not always allow us to replace \$p\textbackslash wedge\textbackslash Diamond\textbackslash neg p\$ with a contradiction; these theories underpredict the infelicity of embedded epistemic contradictions. Other theories savage classical logic, eliminating not just rules that intuitively fail but also rules like non-contradiction, excluded middle, De Morgan's laws, and disjunction introduction, which intuitively remain valid for epistemic modals. In this paper, we aim for a middle ground, developing a semantics and logic for epistemic modals that makes epistemic contradictions genuine contradictions and that invalidates distributivity and disjunctive syllogism but that otherwise preserves classical laws that intuitively remain valid. We start with an algebraic semantics, based on ortholattices instead of Boolean algebras, and then propose a more concrete possibility semantics, based on partial possibilities related by compatibility. Both semantics yield the same consequence relation, which we axiomatize. We then show how to lift an arbitrary possible worlds model for a non-modal language to a possibility model for a language with epistemic modals.}, copyright = {arXiv.org perpetual, non-exclusive license}, keywords = {{03B65, 03B45, 03B60, 03G10},F.4.m,FOS: Computer and information sciences,Logic in Computer Science (cs.LO)}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\RXLKNHLV\\Holliday et Mandelkern - 2022 - The Orthologic of Epistemic Modals.pdf} + file = {/home/sguillou/Zotero/storage/RXLKNHLV/Holliday et Mandelkern - 2022 - The Orthologic of Epistemic Modals.pdf} } @misc{HOLOmegaLogicTheorem, @@ -2122,7 +2176,7 @@ @incollection{homeierHOLOmegaLogic2009 urldate = {2020-10-02}, isbn = {978-3-642-03358-2 978-3-642-03359-9}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\NCLEV22Q\\Homeier - 2009 - The HOL-Omega Logic.pdf} + file = {/home/sguillou/Zotero/storage/NCLEV22Q/Homeier - 2009 - The HOL-Omega Logic.pdf} } @article{hopcroftDesignAnalysisComputer, @@ -2130,16 +2184,15 @@ @article{hopcroftDesignAnalysisComputer author = {Hopcroft, John and UIIman, Jeffrey and Aho, Alfred}, pages = {479}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\K2G4BBLM\\Hopcroft et UIIman - Alfred V. Aho Bell Laboratories.pdf} + file = {/home/sguillou/Zotero/storage/K2G4BBLM/Hopcroft et UIIman - Alfred V. Aho Bell Laboratories.pdf} } @article{huCongruenceClosureACI, title = {Congruence Closure with {{ACI}} Function Symbols}, author = {Hu, Tanji and Givan, Robert}, pages = {15}, - abstract = {Congruence closure is the following well known reasoning problem: given a premise set of equations between ground terms over uninterpreted function symbols, does a given query equation follow using the axioms of equality? 
Several methods have been provided for polynomial-time answers to this question. Here we consider this same setting, but where some of the function symbols are known to be associative, commutative, and idempotent (ACI). Given these additional axioms, does the query equation follow from the premise equations? We provide a sound and complete cubic-time procedure correctly answering such questions. The problem requires exponential space when adding only AC function symbols [18], but requiring idempotence restores tractability . Our procedure is defined by providing a sound and complete ``local'' rule set for the problem [11]. A ``local formula'' is a formula mentioning only terms appearing in the premises or query. A local rule set is one for which any derivable local formula has a derivation using only local intermediate formulas. Closures under local rule sets can immediately be constructed in polynomial time by refusing to infer non-local formulas. Finally, we present results on the integration of ACI function symbols and equality inference rules into more general local rule sets.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\KJ9TC2K9\\Hu et Givan - Congruence closure with ACI function symbols.pdf} + file = {/home/sguillou/Zotero/storage/KJ9TC2K9/Hu et Givan - Congruence closure with ACI function symbols.pdf} } @article{huntComputationalComplexityAlgebra1987, @@ -2155,7 +2208,6 @@ @article{huntComputationalComplexityAlgebra1987 issn = {0097-5397}, doi = {10.1137/0216011}, urldate = {2021-09-27}, - abstract = {We study the computational complexity of equivalence and minimization problems for expressions on many different lattices including each finite lattice and each distributive lattice. A general efficient expressibility condition C on a lattice is presented such that 1. The equivalence problem is co\$NP\$ hard for constant-free expressions on any lattice with at least two elements that satisfies condition C. Each finite or distributive lattice is shown to satisfy condition C. Moreover, if a lattice \$\textbackslash mathcal\{L\}\$ satisfies condition C and \$ \textbackslash equiv \$ is a congruence relation on \$\textbackslash mathcal\{L\}\$, then \$\{\textbackslash mathcal\{L\} / \textbackslash equiv \}\$ also satisfies condition C. Several additional results are also presented. These results include the following: 2. In contrast to 1, the equivalence and operator minimization problems are solvable deterministically in polynomial time for disjunctive normal form and conjunctive normal form expressions on any lattice and for constant-free expressions on any free lattice with at least three generators: 3. Let \$\textbackslash mathcal\{L\}\$ be a lattice. 
Then, the operator minimization problem and various approximate operator minimization problems for expressions on \$\textbackslash mathcal\{L\}\$ are as hard as the problem of determining, for expressions F and G on \$\textbackslash mathcal\{L\}\$, if \$F \textbackslash leqq G\$.}, keywords = {03G10,06B,06B25,06D,68Q25,68Q40,computational complexity,deterministic polynomial time,distributive lattice,finite lattice,formula equivalence,formula minimization,free lattice,lattice,NP-hardness} } @@ -2168,8 +2220,7 @@ @article{hyckoImplicationsEquivalencesOrthomodular2005 volume = {38}, pages = {III--792}, doi = {10.1515/dema-2005-0402}, - abstract = {The present article describes a method for checking the validity of implications or equivalences in the free orthomodular lattice on two generators and in F(a,b,c 1 ,{$\cdots$},c n ), which is the free orthomodular lattice generated by the elements a,b,c 1 ,{$\cdots$},c n , where the elements c i , i=1,{$\cdots$},n, are central in it. The method presented is based on comparing the elements that are assigned to each expression on both sides of an implication or an equivalence. It gives a necessary condition for the implication or equivalence of arbitrary positive statements (a combination of identities and logical connectives AND and OR) to hold. When the conclusion part is an identity or a conjunction of identities, these conditions become also sufficient.}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\847HSZ9C\\Hyčko - 2005 - Implications and equivalences in orthomodular latt.pdf} + file = {/home/sguillou/Zotero/storage/847HSZ9C/Hyčko - 2005 - Implications and equivalences in orthomodular latt.pdf} } @article{ibarakiFunctionalDependenciesQHorn2001, @@ -2184,10 +2235,9 @@ @article{ibarakiFunctionalDependenciesQHorn2001 issn = {0004-3702}, doi = {10.1016/S0004-3702(01)00118-7}, urldate = {2023-06-10}, - abstract = {This paper studies functional dependencies in q-Horn theories, and discusses their use in knowledge condensation. We introduce compact model-based representations of q-Horn theories, analyze the structure of functional dependencies in q-Horn theories, and show that every minimal functional dependency in a q-Horn theory {$\Sigma$} can be expressed either by a single term or by a single clause. We also prove that the set of all minimal functional dependencies in {$\Sigma$} is quasi-acyclic. We then develop polynomial time algorithms for recognizing whether a given functional dependency holds in a q-Horn theory, which is represented either by a q-Horn CNF or as the q-Horn envelope of a set of models. Finally, we show that every q-Horn theory has a unique condensation, and can be condensed in polynomial time.}, langid = {english}, keywords = {Computational complexity,Condensation,Conjunctive normal form,Functional dependency,Knowledge representation,q-Horn theory}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\UTBYYLJB\\Ibaraki et al. - 2001 - On functional dependencies in q-Horn theories.pdf;C\:\\Users\\Simon\\Zotero\\storage\\SAMY75BD\\S0004370201001187.html} + file = {/home/sguillou/Zotero/storage/UTBYYLJB/Ibaraki et al. 
- 2001 - On functional dependencies in q-Horn theories.pdf;/home/sguillou/Zotero/storage/SAMY75BD/S0004370201001187.html} } @inproceedings{ihlemannLocalReasoningVerification2008, @@ -2201,7 +2251,6 @@ @inproceedings{ihlemannLocalReasoningVerification2008 publisher = {{Springer-Verlag}}, address = {{Berlin, Heidelberg}}, urldate = {2023-03-30}, - abstract = {We present a general framework which allows to identify complex theories important in verification for which efficient reasoning methods exist. The framework we present is based on a general notion of locality. We show that locality considerations allow us to obtain parameterized decidability and complexity results for many (combinations of) theories important in verification in general and in the verification of parametric systems in particular. We give numerous examples; in particular we show that several theories of data structures studied in the verification literature are local extensions of a base theory. The general framework we use allows us to identify situations in which some of the syntactical restrictions imposed in previous papers can be relaxed.}, isbn = {978-3-540-78799-0} } @@ -2216,7 +2265,6 @@ @inproceedings{ihlemannLocalReasoningVerification2008a publisher = {{Springer-Verlag}}, address = {{Berlin, Heidelberg}}, urldate = {2023-03-30}, - abstract = {We present a general framework which allows to identify complex theories important in verification for which efficient reasoning methods exist. The framework we present is based on a general notion of locality. We show that locality considerations allow us to obtain parameterized decidability and complexity results for many (combinations of) theories important in verification in general and in the verification of parametric systems in particular. We give numerous examples; in particular we show that several theories of data structures studied in the verification literature are local extensions of a base theory. The general framework we use allows us to identify situations in which some of the syntactical restrictions imposed in previous papers can be relaxed.}, isbn = {978-3-540-78799-0} } @@ -2232,7 +2280,7 @@ @book{jechSetTheory1978 langid = {english}, lccn = {QA3 QA248 .P8 vol. 79}, keywords = {Set theory}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\RAVKULB5\\Jech - 1978 - Set theory.pdf} + file = {/home/sguillou/Zotero/storage/RAVKULB5/Jech - 1978 - Set theory.pdf} } @article{kahnTopologicalSortingLarge1962, @@ -2247,8 +2295,7 @@ @article{kahnTopologicalSortingLarge1962 issn = {0001-0782}, doi = {10.1145/368996.369025}, urldate = {2021-10-11}, - abstract = {Topological Sorting is a procedure required for many problems involving analysis of networks. An example of one such problem is PERT. The present paper presents a very general method for obtaining topological order. It permits treatment of larger networks than can be handled on present procedures and achieves this with greater efficiency. Although the procedure can be adapted to any machine, it is discussed in terms of the 7090. A PERT network of 30,000 activities can be ordered in less than one hour of machine time. The method was developed as a byproduct of procedure needed by Westinghouse, Baltimore. It has not been programmed and at present there are no plans to implement it. 
In regard to the techniques described, Westinghouse's present and anticipated needs are completely served by the Lockheed program, which is in current use.}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\YEWYTUD4\\Kahn - 1962 - Topological sorting of large networks.pdf} + file = {/home/sguillou/Zotero/storage/YEWYTUD4/Kahn - 1962 - Topological sorting of large networks.pdf} } @book{kalmbachOrthomodularLattices1983, @@ -2260,7 +2307,7 @@ @book{kalmbachOrthomodularLattices1983 address = {{London ; New York}}, isbn = {978-0-12-394580-8}, langid = {Anglais}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\U3M3ISQV\\(London Mathematical Society Monographs) G. Kalmbach - Orthomodular Lattices-Academic Press (1983).djvu} + file = {/home/sguillou/Zotero/storage/U3M3ISQV/(London Mathematical Society Monographs) G. Kalmbach - Orthomodular Lattices-Academic Press (1983).djvu} } @incollection{kellerImportingHOLLight2010, @@ -2275,10 +2322,9 @@ @incollection{kellerImportingHOLLight2010 address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-14052-5_22}, urldate = {2020-11-30}, - abstract = {We present a new scheme to translate mathematical developments from HOL Light to Coq, where they can be re-used and rechecked. By relying on a carefully chosen embedding of Higher-Order Logic into Type Theory, we try to avoid some pitfalls of inter-operation between proof systems. In particular, our translation keeps the mathematical statements intelligible. This translation has been implemented and allows the importation of the HOL Light basic library into Coq.}, isbn = {978-3-642-14051-8 978-3-642-14052-5}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\GLB8ZWCV\\Keller and Werner - 2010 - Importing HOL Light into Coq.pdf} + file = {/home/sguillou/Zotero/storage/GLB8ZWCV/Keller and Werner - 2010 - Importing HOL Light into Coq.pdf} } @article{kirstCategoricityResultsLarge2019, @@ -2293,9 +2339,8 @@ @article{kirstCategoricityResultsLarge2019 issn = {1573-0670}, doi = {10.1007/s10817-018-9480-6}, urldate = {2020-10-11}, - abstract = {We formalise second-order ZF set theory in the dependent type theory of Coq. Assuming excluded middle, we prove Zermelo's embedding theorem for models, categoricity in all cardinalities, and the categoricity of extended axiomatisations fixing the number of Grothendieck universes. These results are based on an inductive definition of the cumulative hierarchy eliminating the need for ordinals and set-theoretic transfinite recursion. Following Aczel's sets-as-trees interpretation, we give a concise construction of an intensional model of second-order ZF with a weakened replacement axiom. Whereas this construction depends on no additional logical axioms, we obtain intensional and extensional models with full replacement assuming a description operator for trees and a weak form of proof irrelevance. 
In fact, these assumptions yield large models with n Grothendieck universes for every number n.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\A5GM4HUL\\Kirst and Smolka - 2019 - Categoricity Results and Large Model Constructions.pdf} + file = {/home/sguillou/Zotero/storage/A5GM4HUL/Kirst and Smolka - 2019 - Categoricity Results and Large Model Constructions.pdf} } @incollection{kojevnikovFindingEfficientCircuits2009, @@ -2310,10 +2355,26 @@ @incollection{kojevnikovFindingEfficientCircuits2009 address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-02777-2_5}, urldate = {2022-09-23}, - abstract = {In this paper we report preliminary results of experiments with finding efficient circuits (over binary bases) using SAT-solvers. We present upper bounds for functions with constant number of inputs as well as general upper bounds that were found automatically. We focus mainly on MOD-functions. Besides theoretical interest, these functions are also interesting from a practical point of view as they are the core of the residue number system. In particular, we present a circuit of size 3n + c over the full binary basis computing MODn3 .}, isbn = {978-3-642-02776-5 978-3-642-02777-2}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\WEJXR7DV\\Kojevnikov et al. - 2009 - Finding Efficient Circuits Using SAT-Solvers.pdf} + file = {/home/sguillou/Zotero/storage/WEJXR7DV/Kojevnikov et al. - 2009 - Finding Efficient Circuits Using SAT-Solvers.pdf} +} + +@inproceedings{kovacsFindingLoopInvariants2009, + title = {Finding {{Loop Invariants}} for {{Programs}} over {{Arrays Using}} a {{Theorem Prover}}}, + booktitle = {Fundamental {{Approaches}} to {{Software Engineering}}}, + author = {Kov{\'a}cs, Laura and Voronkov, Andrei}, + editor = {Chechik, Marsha and Wirsing, Martin}, + year = {2009}, + series = {Lecture {{Notes}} in {{Computer Science}}}, + pages = {470--485}, + publisher = {{Springer}}, + address = {{Berlin, Heidelberg}}, + doi = {10.1007/978-3-642-00593-0_33}, + isbn = {978-3-642-00593-0}, + langid = {english}, + keywords = {Array Variable,Invariant Generation,Loop Counter,Predicate Symbol,Scalar Variable}, + file = {/home/sguillou/Zotero/storage/WAVKP7L3/Kovács and Voronkov - 2009 - Finding Loop Invariants for Programs over Arrays U.pdf} } @inproceedings{kovacsFirstOrderTheoremProving2013, @@ -2327,11 +2388,10 @@ @inproceedings{kovacsFirstOrderTheoremProving2013 publisher = {{Springer}}, address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-39799-8_1}, - abstract = {In this paper we give a short introduction in first-order theorem proving and the use of the theorem prover Vampire. We discuss the superposition calculus and explain the key concepts of saturation and redundancy elimination, present saturation algorithms and preprocessing, and demonstrate how these concepts are implemented in Vampire. 
Further, we also cover more recent topics and features of Vampire designed for advanced applications, including satisfiability checking, theory reasoning, interpolation, consequence elimination, and program analysis.}, isbn = {978-3-642-39799-8}, langid = {english}, keywords = {Empty Clause,Inference Process,Predicate Symbol,Proof Search,Theorem Prove}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\Q7BL5WK6\\Kovács and Voronkov - 2013 - First-Order Theorem Proving and Vampire.pdf} + file = {/home/sguillou/Zotero/storage/Q7BL5WK6/Kovács and Voronkov - 2013 - First-Order Theorem Proving and Vampire.pdf} } @article{krivineProgramFullAxiom2021, @@ -2343,11 +2403,25 @@ @article{krivineProgramFullAxiom2021 eprint = {2006.05433}, primaryclass = {cs, math}, urldate = {2021-03-16}, - abstract = {The theory of classical realizability is a framework for the Curry-Howard correspondence which enables to associate a program with each proof in Zermelo-Fraenkel set theory. But, almost all the applications of mathematics in physics, probability, statistics, etc. use Analysis i.e. the axiom of dependent choice (DC) or even the (full) axiom of choice (AC). It is therefore important to find explicit programs for these axioms. Various solutions are already known for DC, for instance the lambda-term known as "bar recursion". We present here the first one, as far as we know, for AC.}, archiveprefix = {arxiv}, langid = {english}, keywords = {03E40,Computer Science - Logic in Computer Science,F.4.1,Mathematics - Logic}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\4RFM78FA\\Krivine - 2021 - A program for the full axiom of choice.pdf} + file = {/home/sguillou/Zotero/storage/4RFM78FA/Krivine - 2021 - A program for the full axiom of choice.pdf} +} + +@inproceedings{kroeningInterpolationBasedSoftwareVerification2011, + title = {Interpolation-{{Based Software Verification}} with {{Wolverine}}}, + booktitle = {Computer {{Aided Verification}} - 23rd {{International Conference}}, {{CAV}} 2011, {{Snowbird}}, {{UT}}, {{USA}}, {{July}} 14-20, 2011. {{Proceedings}}}, + author = {Kroening, Daniel and Weissenbacher, Georg}, + editor = {Gopalakrishnan, Ganesh and Qadeer, Shaz}, + year = {2011}, + series = {Lecture {{Notes}} in {{Computer Science}}}, + volume = {6806}, + pages = {573--578}, + publisher = {{Springer}}, + doi = {10.1007/978-3-642-22110-1_45}, + urldate = {2023-09-05}, + file = {/home/sguillou/Zotero/storage/4XG57EA5/Kroening and Weissenbacher - 2011 - Interpolation-Based Software Verification with Wol.pdf} } @article{kunenNegationLogicProgramming1987, @@ -2359,12 +2433,11 @@ @article{kunenNegationLogicProgramming1987 volume = {4}, number = {4}, pages = {289--308}, - issn = {0743-1066}, + issn = {07431066}, doi = {10.1016/0743-1066(87)90007-0}, urldate = {2023-06-26}, - abstract = {We define a semantics for negation as failure in logic programming. Our semantics may be viewed as a cross between the approaches of Clark [5] and Fitting [7]. As does [7], our semantics corresponds well with real PROLOG in the standard examples used in the literature to illustrate problems with [5]. Also, PROLOG and the common variants of it are sound but not complete for our semantics. Unlike [7], our semantics is constructive, in that the set of supported queries is recursively enumerable. 
Thus, a complete interpreter exists in theory, although we point out that there are serious difficulties in building one that works well in practice.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\SVH5TQ7I\\Kunen - 1987 - Negation in logic programming.pdf;C\:\\Users\\Simon\\Zotero\\storage\\7E8GFNRR\\0743106687900070.html} + file = {/home/sguillou/Zotero/storage/2MN8PU5U/Kunen - 1987 - Negation in logic programming.pdf} } @article{kunenNegationLogicProgramming1987a, @@ -2376,12 +2449,11 @@ @article{kunenNegationLogicProgramming1987a volume = {4}, number = {4}, pages = {289--308}, - issn = {07431066}, + issn = {0743-1066}, doi = {10.1016/0743-1066(87)90007-0}, urldate = {2023-06-26}, - abstract = {Semantic Scholar extracted view of "Negation in Logic Programming" by K. Kunen}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\2MN8PU5U\\Kunen - 1987 - Negation in logic programming.pdf} + file = {/home/sguillou/Zotero/storage/SVH5TQ7I/Kunen - 1987 - Negation in logic programming.pdf;/home/sguillou/Zotero/storage/7E8GFNRR/0743106687900070.html} } @book{kunenSetTheoryIntroduction1983, @@ -2392,7 +2464,6 @@ @book{kunenSetTheoryIntroduction1983 edition = {Reprint edition}, publisher = {{North Holland}}, address = {{Amsterdam Heidelberg}}, - abstract = {Studies in Logic and the Foundations of Mathematics, Volume 102: Set Theory: An Introduction to Independence Proofs offers an introduction to relative consistency proofs in axiomatic set theory, including combinatorics, sets, trees, and forcing. The book first tackles the foundations of set theory and infinitary combinatorics. Discussions focus on the Suslin problem, Martin's axiom, almost disjoint and quasi-disjoint sets, trees, extensionality and comprehension, relations, functions, and well-ordering, ordinals, cardinals, and real numbers. The manuscript then ponders on well-founded sets and easy consistency proofs, including relativization, absoluteness, reflection theorems, properties of well-founded sets, and induction and recursion on well-founded relations. The publication examines constructible sets, forcing, and iterated forcing. Topics include Easton forcing, general iterated forcing, Cohen model, forcing with partial functions of larger cardinality, forcing with finite partial functions, and general extensions. The manuscript is a dependable source of information for mathematicians and researchers interested in set theory.}, isbn = {978-0-444-86839-8}, langid = {english} } @@ -2409,9 +2480,8 @@ @article{lamportShouldYourSpecification1999 issn = {0164-0925, 1558-4593}, doi = {10.1145/319301.319317}, urldate = {2021-05-11}, - abstract = {Most specification languages have a type system. Type systems are hard to get right, and getting them wrong can lead to inconsistencies. Set theory can serve as the basis for a specification language without types. This possibility, which has been widely overlooked, offers many advantages. Untyped set theory is simple and is more flexible than any simple typed formalism. Polymorphism, overloading, and subtyping can make a type system more powerful, but at the cost of increased somplexity, and such refinements can never attain the flexibility of having no types at all. Typed formalisms have advantages, too, stemming from the power of mechanical type checking. While types serve little purpose in hand proofs, they do help with mechanized proofs. In the absence of verificaiton, type checking can catch errors in specifications. 
It may be possible to have the best of both worlds by adding typing annotations to an untyped specification language. We consider only specification languages, not programming languages.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\4W45UP2I\\Lamport et Paulson - 1999 - Should your specification language be typed.pdf} + file = {/home/sguillou/Zotero/storage/4W45UP2I/Lamport et Paulson - 1999 - Should your specification language be typed.pdf} } @inproceedings{laurentFocusingOrthologic2016, @@ -2426,7 +2496,24 @@ @inproceedings{laurentFocusingOrthologic2016 publisher = {{Schloss Dagstuhl - Leibniz-Zentrum f\"ur Informatik}}, address = {{Porto, Portugal}}, doi = {10.4230/LIPIcs.FSCD.2016.25}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\WAHXYNXH\\Laurent - 2016 - Focusing in Orthologic.pdf} + file = {/home/sguillou/Zotero/storage/WAHXYNXH/Laurent - 2016 - Focusing in Orthologic.pdf} +} + +@article{lavaletteInterpolationFragmentsIntuitionistic1989, + title = {Interpolation in {{Fragments}} of {{Intuitionistic Propositional Logic}}}, + author = {de Lavalette, Gerard R. Renardel}, + year = {1989}, + journal = {The Journal of Symbolic Logic}, + volume = {54}, + number = {4}, + eprint = {2274823}, + eprinttype = {jstor}, + pages = {1419--1430}, + publisher = {{Association for Symbolic Logic}}, + issn = {0022-4812}, + doi = {10.2307/2274823}, + urldate = {2023-09-04}, + file = {/home/sguillou/Zotero/storage/54WY72VI/Lavalette - 1989 - Interpolation in Fragments of Intuitionistic Propo.pdf} } @article{leeProofirrelevantModelCC2011, @@ -2442,10 +2529,9 @@ @article{leeProofirrelevantModelCC2011 issn = {18605974}, doi = {10.2168/LMCS-7(4:5)2011}, urldate = {2020-10-22}, - abstract = {We present a set-theoretic, proof-irrelevant model for Calculus of Constructions (CC) with predicative induction and judgmental equality in Zermelo-Fraenkel set theory with an axiom for countably many inaccessible cardinals. We use Aczel's trace encoding which is universally defined for any function type, regardless of being impredicative. Direct and concrete interpretations of simultaneous induction and mutually recursive functions are also provided by extending Dybjer's interpretations on the basis of Aczel's rule sets. Our model can be regarded as a higher-order generalization of the truth-table methods. We provide a relatively simple consistency proof of type theory, which can be used as the basis for a theorem prover.}, langid = {english}, keywords = {Print}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\MMQSFUN7\\Lee and Werner - 2011 - Proof-irrelevant model of CC with predicative indu.pdf} + file = {/home/sguillou/Zotero/storage/MMQSFUN7/Lee and Werner - 2011 - Proof-irrelevant model of CC with predicative indu.pdf} } @inproceedings{lewisHazardDetectionQuinary1972, @@ -2460,53 +2546,65 @@ @inproceedings{lewisHazardDetectionQuinary1972 address = {{New York, NY, USA}}, doi = {10.1145/800153.804941}, urldate = {2021-09-10}, - abstract = {Effective logic simulation programs must consider device propagation delays to be bounded values. This requires that the logic devices be simulated by models which use a multi-valued logical algebra. A quinary algebra is developed and employed in special algorithms which not only accurately predict the behavior of a logic circuit for all values of delay, but also detect the possibility of latent hazards and race conditions. 
A sample problem is simulated, and conclusions drawn.}, isbn = {978-1-4503-7458-3}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\HJSIU9YH\\Lewis - 1972 - Hazard detection by a quinary simulation of logic .pdf} + file = {/home/sguillou/Zotero/storage/HJSIU9YH/Lewis - 1972 - Hazard detection by a quinary simulation of logic .pdf} } @misc{Ltac2Coq16, title = {Ltac2 \textemdash{} {{Coq}} 8.16.1 Documentation}, urldate = {2023-02-18}, howpublished = {https://coq.inria.fr/refman/proof-engine/ltac2.html}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\JAFCQNID\\ltac2.html} + file = {/home/sguillou/Zotero/storage/JAFCQNID/ltac2.html} } -@inproceedings{madsenDatalogFlixDeclarative2016, +@article{macneillePartiallyOrderedSets1937, + title = {Partially Ordered Sets}, + author = {MacNeille, H. M.}, + year = {1937}, + journal = {Transactions of the American Mathematical Society}, + volume = {42}, + number = {3}, + pages = {416--460}, + issn = {0002-9947, 1088-6850}, + doi = {10.1090/S0002-9947-1937-1501929-X}, + urldate = {2023-08-24}, + langid = {english}, + file = {/home/sguillou/Zotero/storage/69GGAK76/MacNeille - 1937 - Partially ordered sets.pdf} +} + +@article{madsenDatalogFlixDeclarative2016, title = {From {{Datalog}} to Flix: A Declarative Language for Fixed Points on Lattices}, shorttitle = {From {{Datalog}} to Flix}, - booktitle = {Proceedings of the 37th {{ACM SIGPLAN Conference}} on {{Programming Language Design}} and {{Implementation}}}, author = {Madsen, Magnus and Yee, Ming-Ho and Lhot{\'a}k, Ond{\v r}ej}, year = {2016}, month = jun, - series = {{{PLDI}} '16}, + journal = {Proceedings of the 37th ACM SIGPLAN Conference on Programming Language Design and Implementation}, pages = {194--208}, - publisher = {{Association for Computing Machinery}}, - address = {{New York, NY, USA}}, + publisher = {{ACM}}, + address = {{Santa Barbara CA USA}}, doi = {10.1145/2908080.2908096}, - urldate = {2023-06-26}, - abstract = {We present Flix, a declarative programming language for specifying and solving least fixed point problems, particularly static program analyses. Flix is inspired by Datalog and extends it with lattices and monotone functions. Using Flix, implementors of static analyses can express a broader range of analyses than is currently possible in pure Datalog, while retaining its familiar rule-based syntax. We define a model-theoretic semantics of Flix as a natural extension of the Datalog semantics. This semantics captures the declarative meaning of Flix programs without imposing any specific evaluation strategy. An efficient strategy is semi-naive evaluation which we adapt for Flix. We have implemented a compiler and runtime for Flix, and used it to express several well-known static analyses, including the IFDS and IDE algorithms. The declarative nature of Flix clearly exposes the similarity between these two algorithms.}, - isbn = {978-1-4503-4261-2}, - keywords = {Datalog,logic programming,static analysis}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\PPFQTYMM\\Madsen et al. - 2016 - From Datalog to flix a declarative language for f.pdf} + urldate = {2023-06-23}, + isbn = {9781450342612}, + langid = {english}, + file = {/home/sguillou/Zotero/storage/4TQ6II37/Madsen et al. 
- 2016 - From Datalog to flix a declarative language for f.pdf} } -@article{madsenDatalogFlixDeclarative2016a, +@inproceedings{madsenDatalogFlixDeclarative2016a, title = {From {{Datalog}} to Flix: A Declarative Language for Fixed Points on Lattices}, shorttitle = {From {{Datalog}} to Flix}, + booktitle = {Proceedings of the 37th {{ACM SIGPLAN Conference}} on {{Programming Language Design}} and {{Implementation}}}, author = {Madsen, Magnus and Yee, Ming-Ho and Lhot{\'a}k, Ond{\v r}ej}, year = {2016}, month = jun, - journal = {Proceedings of the 37th ACM SIGPLAN Conference on Programming Language Design and Implementation}, + series = {{{PLDI}} '16}, pages = {194--208}, - publisher = {{ACM}}, - address = {{Santa Barbara CA USA}}, + publisher = {{Association for Computing Machinery}}, + address = {{New York, NY, USA}}, doi = {10.1145/2908080.2908096}, - urldate = {2023-06-23}, - abstract = {We present Flix, a declarative programming language for specifying and solving least fixed point problems, particularly static program analyses. Flix is inspired by Datalog and extends it with lattices and monotone functions. Using Flix, implementors of static analyses can express a broader range of analyses than is currently possible in pure Datalog, while retaining its familiar rule-based syntax. We define a model-theoretic semantics of Flix as a natural extension of the Datalog semantics. This semantics captures the declarative meaning of Flix programs without imposing any specific evaluation strategy. An efficient strategy is semi-naive evaluation which we adapt for Flix. We have implemented a compiler and runtime for Flix, and used it to express several well-known static analyses, including the IFDS and IDE algorithms. The declarative nature of Flix clearly exposes the similarity between these two algorithms.}, - isbn = {9781450342612}, - langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\4TQ6II37\\Madsen et al. - 2016 - From Datalog to flix a declarative language for f.pdf} + urldate = {2023-06-26}, + isbn = {978-1-4503-4261-2}, + keywords = {Datalog,logic programming,static analysis}, + file = {/home/sguillou/Zotero/storage/PPFQTYMM/Madsen et al. - 2016 - From Datalog to flix a declarative language for f.pdf} } @article{mathiasStrengthMacLane2001, @@ -2521,10 +2619,9 @@ @article{mathiasStrengthMacLane2001 issn = {0168-0072}, doi = {10.1016/S0168-0072(00)00031-2}, urldate = {2023-05-04}, - abstract = {Saunders Mac Lane has drawn attention many times, particularly in his book Mathematics: Form and Function, to the system ZBQC of set theory of which the axioms are Extensionality, Null Set, Pairing, Union, Infinity, Power Set, Restricted Separation, Foundation, and Choice, to which system, afforced by the principle, TCo, of Transitive Containment, we shall refer as MAC. His system is naturally related to systems derived from topos-theoretic notions concerning the category of sets, and is, as Mac Lane emphasises, one that is adequate for much of mathematics. 
In this paper we show that the consistency strength of Mac Lane's system is not increased by adding the axioms of Kripke\textendash Platek set theory and even the Axiom of Constructibility to Mac Lane's axioms; our method requires a close study of Axiom H, which was proposed by Mitchell; we digress to apply these methods to subsystems of Zermelo set theory Z, and obtain an apparently new proof that Z is not finitely axiomatisable; we study Friedman's strengthening KPP+AC of KP+MAC, and the Forster\textendash Kaye subsystem KF of MAC, and use forcing over ill-founded models and forcing to establish independence results concerning MAC and KPP; we show, again using ill-founded models, that KPP+V=L proves the consistency of KPP; turning to systems that are type-theoretic in spirit or in fact, we show by arguments of Coret and Boffa that KF proves a weak form of Stratified Collection, and that MAC+KP is a conservative extension of MAC for stratified sentences, from which we deduce that MAC proves a strong stratified version of KP; we analyse the known equiconsistency of MAC with the simple theory of types and give Lake's proof that an instance of Mathematical Induction is unprovable in Mac Lane's system; we study a simple set theoretic assertion\textemdash namely that there exists an infinite set of infinite sets, no two of which have the same cardinal\textemdash and use it to establish the failure of the full schema of Stratified Collection in Z; and we determine the point of failure of various other schemata in MAC. The paper closes with some philosophical remarks.}, langid = {english}, keywords = {Axiom H spectacles,Conservative extension,Constructibility,Failure of collection,Failure of induction,Forcing over non-standard models,Forster\textendash Kaye set theory,Kripke\textendash Platek set theory,Mac Lane set theory,Mostowski's principle,Power-admissible set,Simple theory of types,Stratifiable formula}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\JTUTR33I\\Mathias - 2001 - The strength of Mac Lane set theory.pdf;C\:\\Users\\Simon\\Zotero\\storage\\DI88ZDS6\\S0168007200000312.html} + file = {/home/sguillou/Zotero/storage/JTUTR33I/Mathias - 2001 - The strength of Mac Lane set theory.pdf;/home/sguillou/Zotero/storage/DI88ZDS6/S0168007200000312.html} } @article{maziarzHashingModuloAlphaEquivalence2021, @@ -2532,10 +2629,9 @@ @article{maziarzHashingModuloAlphaEquivalence2021 author = {Maziarz, Krzysztof and Ellis, Tom and Lawrence, Alan and Fitzgibbon, Andrew and Jones, Simon Peyton}, year = {2021}, pages = {17}, - abstract = {Syntax Tree (AST), which represents computational expressions using a tree structure. Subtrees of such an AST \textemdash referred to as subexpressions \textemdash{} are useful, because they often correspond to semantically meaningful parts of the program, such as functions. Many applications need to quickly identify all equivalent subexpressions in an AST. Examples include common subexpression elimination (CSE), as mentioned above; structure sharing to save memory, by representing all occurrences of the same subexpression by a pointer to a single shared tree; or pre-processing for machine learning, where subexpression equivalence can be used as an additional feature, for example by turning an AST into a graph with equality links.}, langid = {english}, keywords = {Print,Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\UVHXC8H2\\Maziarz et al. - 2021 - Hashing Modulo Alpha-Equivalence.pdf} + file = {/home/sguillou/Zotero/storage/UVHXC8H2/Maziarz et al. 
- 2021 - Hashing Modulo Alpha-Equivalence.pdf} } @article{mcallesterAutomaticRecognitionTractability1993, @@ -2550,17 +2646,59 @@ @article{mcallesterAutomaticRecognitionTractability1993 issn = {0004-5411}, doi = {10.1145/151261.151265}, urldate = {2022-01-17}, - abstract = {A procedure is given for recognizing sets of inference rules that generate polynomial time decidable inference relations. The procedure can automatically recognize the tractability of the inference rules underlying congruence closure. The recognition of tractability for that particular rule set constitutes mechanical verification of a theorem originally proved independently by Kozen and Shostak. The procedure is algorithmic, rather than heuristic, and the class of automatically recognizable tractable rule sets can be precisely characterized. A series of examples of rule sets whose tractability is nontrivial, yet machine recognizable, is also given. The technical framework developed here is viewed as a first step toward a general theory of tractable inference relations.}, keywords = {automated reasoning,inference rules,machine inference,mechanical verification,polynomial-time algorithm,proof systems,proof theory,theorem proving}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\VYUT3688\\McAllester - 1993 - Automatic recognition of tractability in inference.pdf} + file = {/home/sguillou/Zotero/storage/VYUT3688/McAllester - 1993 - Automatic recognition of tractability in inference.pdf} +} + +@inproceedings{mcmillanInterpolantsSymbolicModel2007, + title = {Interpolants and {{Symbolic Model Checking}}}, + booktitle = {Verification, {{Model Checking}}, and {{Abstract Interpretation}}, 8th {{International Conference}}, {{VMCAI}} 2007, {{Nice}}, {{France}}, {{January}} 14-16, 2007, {{Proceedings}}}, + author = {McMillan, Kenneth L.}, + editor = {Cook, Byron and Podelski, Andreas}, + year = {2007}, + series = {Lecture {{Notes}} in {{Computer Science}}}, + volume = {4349}, + pages = {89--90}, + publisher = {{Springer}}, + doi = {10.1007/978-3-540-69738-1_6}, + urldate = {2023-08-28} } -@inproceedings{mcmillanSolvingConstrainedHorn2013, - title = {Solving {{Constrained Horn Clauses}} Using {{Interpolation MSR-TR-2013-6}}}, - author = {McMillan, K. and Micrsoft and Rybalchenko, A.}, +@inproceedings{mcmillanInterpolationSATBasedModel2003, + title = {Interpolation and {{SAT-Based Model Checking}}}, + booktitle = {Computer {{Aided Verification}}, 15th {{International Conference}}, {{CAV}} 2003, {{Boulder}}, {{CO}}, {{USA}}, {{July}} 8-12, 2003, {{Proceedings}}}, + author = {McMillan, Kenneth L.}, + editor = {Jr, Warren A. Hunt and Somenzi, Fabio}, + year = {2003}, + series = {Lecture {{Notes}} in {{Computer Science}}}, + volume = {2725}, + pages = {1--13}, + publisher = {{Springer}}, + doi = {10.1007/978-3-540-45069-6_1}, + urldate = {2023-08-27}, + file = {/home/sguillou/Zotero/storage/X3LWE3QP/McMillan - 2003 - Interpolation and SAT-Based Model Checking.pdf} +} + +@inproceedings{mcmillanQuantifiedInvariantGeneration2008, + title = {Quantified {{Invariant Generation Using}} an {{Interpolating Saturation Prover}}}, + booktitle = {Tools and {{Algorithms}} for the {{Construction}} and {{Analysis}} of {{Systems}}, 14th {{International Conference}}, {{TACAS}} 2008, {{Held}} as {{Part}} of the {{Joint European Conferences}} on {{Theory}} and {{Practice}} of {{Software}}, {{ETAPS}} 2008, {{Budapest}}, {{Hungary}}, {{March}} 29-{{April}} 6, 2008. {{Proceedings}}}, + author = {McMillan, Kenneth L.}, + editor = {Ramakrishnan, C. R. 
and Rehof, Jakob}, + year = {2008}, + series = {Lecture {{Notes}} in {{Computer Science}}}, + volume = {4963}, + pages = {413--427}, + publisher = {{Springer}}, + doi = {10.1007/978-3-540-78800-3_31}, + urldate = {2023-09-05} +} + +@techreport{mcmillanSolvingConstrainedHorn2013, + title = {Solving {{Constrained Horn Clauses}} Using {{Interpolation}}}, + author = {McMillan, K. and Rybalchenko, A.}, year = {2013}, - urldate = {2023-06-10}, - abstract = {We present an interpolation-based method for symbolically solving systems of constrained Horn clauses. The method can be used to solve for unknown predicates in the verification conditions of programs. Thus, it has a variety of applications, including including model checking of recursive and threaded programs. The method is implemented in tool called Duality, which we evaluate using device driver verification benchmarks. Copyright 2012 Microsoft Research. All rights reserved.} + institution = {{Microsoft Research}}, + urldate = {2023-06-10} } @article{meinanderSolutionUniformWord2010, @@ -2576,9 +2714,8 @@ @article{meinanderSolutionUniformWord2010 issn = {1469-8072, 0960-1295}, doi = {10.1017/S0960129510000125}, urldate = {2021-09-23}, - abstract = {The uniform word problem for finitely presented ortholattices is shown to be solvable through a method of terminating proof search. The axioms of ortholattices are all Harrop formulas, and thus can be expressed in natural deduction style as single succedent rules. A system of natural deduction style rules for orthologic is given as an extension of the system for lattices presented by Negri and von Plato. By considering formal derivations of atomic formulas from a finite number of given atomic formulas, it is shown that proof search is bounded, and thus that the question of derivability of any atomic formula from any finite set of given atomic formulas is decidable.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\AZ2H533D\\Meinander - 2010 - A solution of the uniform word problem for orthola.pdf;C\:\\Users\\Simon\\Zotero\\storage\\Y6FC9VVE\\551A8CBABF42CCFD42886E9646061777.html} + file = {/home/sguillou/Zotero/storage/AZ2H533D/Meinander - 2010 - A solution of the uniform word problem for orthola.pdf;/home/sguillou/Zotero/storage/Y6FC9VVE/551A8CBABF42CCFD42886E9646061777.html} } @article{mendelsonIntroductionMathematicalLogic, @@ -2586,7 +2723,7 @@ @article{mendelsonIntroductionMathematicalLogic author = {Mendelson, Elliott}, pages = {499}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\NYN4AYZU\\Mendelson - Introduction to Mathematical Logic, Sixth Edition.pdf} + file = {/home/sguillou/Zotero/storage/NYN4AYZU/Mendelson - Introduction to Mathematical Logic, Sixth Edition.pdf} } @book{mendelsonIntroductionMathematicalLogic1987, @@ -2613,11 +2750,10 @@ @inproceedings{merzAutomaticVerificationTLA2012 publisher = {{Springer}}, address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-28717-6_23}, - abstract = {TLA + is a formal specification language that is based on ZF set theory and the Temporal Logic of Actions TLA. The TLA + proof system tlaps assists users in deductively verifying safety properties of TLA + specifications. tlaps is built around a proof manager, which interprets the TLA + proof language, generates corresponding proof obligations, and passes them to backend verifiers. In this paper we present a new backend for use with SMT solvers that supports elementary set theory, functions, arithmetic, tuples, and records. 
Type information required by the solvers is provided by a typing discipline for TLA + proof obligations, which helps us disambiguate the translation of expressions of (untyped) TLA + , while ensuring its soundness. Preliminary results show that the backend can help to significantly increase the degree of automation of certain interactive proofs.}, isbn = {978-3-642-28717-6}, langid = {english}, keywords = {Function Symbol,Input Language,Proof Obligation,Read,Type Assignment,Type Inference}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\SPJJA9FN\\Merz et Vanzetto - 2012 - Automatic Verification of TLA + Proof Obligations.pdf} + file = {/home/sguillou/Zotero/storage/SPJJA9FN/Merz et Vanzetto - 2012 - Automatic Verification of TLA + Proof Obligations.pdf} } @article{micaliComputationallySoundProofs2000, @@ -2633,9 +2769,8 @@ @article{micaliComputationallySoundProofs2000 issn = {0097-5397}, doi = {10.1137/S0097539795284959}, urldate = {2022-05-27}, - abstract = {This paper puts forward a new notion of a proof based on computational complexity and explores its implications for computation at large. Computationally sound proofs provide, in a novel and meaningful framework, answers to old and new questions in complexity theory. In particular, given a random oracle or a new complexity assumption, they enable us to prove that verifying is easier than deciding for all theorems; provide a quite effective way to prove membership in computationally hard languages (such as \$\{\textbackslash cal C\}o\$-\$\textbackslash cal N \textbackslash cal P\$-complete ones); and show that every computation possesses a short certificate vouching its correctness. Finally, if a special type of computationally sound proof exists, we show that Blum's notion of program checking can be meaningfully broadened so as to prove that \$\textbackslash cal N \textbackslash cal P\$-complete languages are checkable.}, keywords = {interactive proofs,Merkle trees,probabilistically checkable proofs,random oracles}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\P8MD7SN2\\Micali - 2000 - Computationally Sound Proofs.pdf} + file = {/home/sguillou/Zotero/storage/P8MD7SN2/Micali - 2000 - Computationally Sound Proofs.pdf} } @article{minouxLTURSimplifiedLineartime1988, @@ -2651,10 +2786,9 @@ @article{minouxLTURSimplifiedLineartime1988 issn = {0020-0190}, doi = {10.1016/0020-0190(88)90124-X}, urldate = {2023-06-12}, - abstract = {Testing for the satisfiability of a Horn expression in propositional calculus is a fundamental problem in the area of logic programming for many reasons. Among these is the fact that the basic solution techniques for propositional Horn formulae have been shown to be central to the design of efficient interpreters for some predicate logic-based languages such as Hornlog (Gallier and Raatz, 1987). The present paper proposes a simplified way of deriving a linear-time algorithm avoiding many of the intricacies of previously known descriptions. 
In addition, a full, ready-to-use computer code is provided at the end of the paper together with a detailed analysis of the necessary data structures.}, langid = {english}, keywords = {Horn formula,Propositional calculus,unit resolution}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\ZCUR36KM\\Minoux - 1988 - LTUR a simplified linear-time unit resolution alg.pdf;C\:\\Users\\Simon\\Zotero\\storage\\QA3XRSN5\\002001908890124X.html} + file = {/home/sguillou/Zotero/storage/ZCUR36KM/Minoux - 1988 - LTUR a simplified linear-time unit resolution alg.pdf;/home/sguillou/Zotero/storage/QA3XRSN5/002001908890124X.html} } @article{miyazakiPropertiesOrthologics2005, @@ -2670,8 +2804,7 @@ @article{miyazakiPropertiesOrthologics2005 publisher = {{Springer}}, issn = {0039-3215}, urldate = {2023-03-30}, - abstract = {In this paper, we present three main results on orthologics. Firstly, we give a sufficient condition for an orthologic to have variable separation property and show that the orthomodular logic has this property. Secondly, we show that the class of modular orthologics has an infinite descending chain. Finally we show that there exists a continuum of orthologics.}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\ZT5YTQ5B\\Miyazaki - 2005 - Some Properties of Orthologics.pdf} + file = {/home/sguillou/Zotero/storage/ZT5YTQ5B/Miyazaki - 2005 - Some Properties of Orthologics.pdf} } @inproceedings{mouraZ3EfficientSMT2008, @@ -2687,7 +2820,7 @@ @inproceedings{mouraZ3EfficientSMT2008 publisher = {{Springer}}, doi = {10.1007/978-3-540-78800-3_24}, urldate = {2021-09-29}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\CERFSTMI\\Moura et Bjørner - 2008 - Z3 An Efficient SMT Solver.pdf} + file = {/home/sguillou/Zotero/storage/CERFSTMI/Moura et Bjørner - 2008 - Z3 An Efficient SMT Solver.pdf} } @inproceedings{naumowiczBriefOverviewMizar2009, @@ -2699,9 +2832,8 @@ @inproceedings{naumowiczBriefOverviewMizar2009 volume = {5674}, pages = {67--72}, doi = {10.1007/978-3-642-03359-9_5}, - abstract = {Mizar is the name of a formal language derived from informal mathematics and computer software that enables proof-checking of texts written in that language. The system has been actively developed since 1970s, growing into a popular proof assistant accompanied with a huge repository of formalized mathematical knowledge. In this short overview, we give an outline of the key features of the Mizar language, the ideas and theory behind the system, its main applications, and current development.}, isbn = {978-3-642-03358-2}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\DA8Z7WM5\\Naumowicz et Kornilowicz - 2009 - A Brief Overview of Mizar.pdf} + file = {/home/sguillou/Zotero/storage/DA8Z7WM5/Naumowicz et Kornilowicz - 2009 - A Brief Overview of Mizar.pdf} } @article{negriCutEliminationPresence1998, @@ -2716,9 +2848,8 @@ @article{negriCutEliminationPresence1998 issn = {1079-8986, 1943-5894}, doi = {10.2307/420956}, urldate = {2023-04-12}, - abstract = {A way is found to add axioms to sequent calculi that maintains the eliminab of cut, through the representation of axioms as rules of inference of a suitable form this method, the structural analysis of proofs is extended from pure logic to free-va theories, covering all classical theories, and a wide class of constructive theories. 
All r are proved for systems in which also the rules of weakening and contraction can be elimin Applications include a system of predicate logic with equality in which also cuts on the equ axioms are eliminated.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\UFP2F7EG\\Negri et von Plato - 1998 - Cut Elimination in the Presence of Axioms.pdf} + file = {/home/sguillou/Zotero/storage/UFP2F7EG/Negri et von Plato - 1998 - Cut Elimination in the Presence of Axioms.pdf} } @article{nelsonFastDecisionProcedures1980, @@ -2732,8 +2863,20 @@ @article{nelsonFastDecisionProcedures1980 pages = {356--364}, issn = {0004-5411}, doi = {10.1145/322186.322198}, - urldate = {2021-07-08}, - abstract = {The notion of the congruence closure of a relation on a graph is defined and several algorithms for computing it are surveyed. A simple proof is given that the congruence closure algorithm provides a decision procedure for the quantifier-free theory of equality. A decision procedure is then given for the quantifier-free theory of LISP list structure based on the congruence closure algorithm. Both decision procedures determine the satisfiability of a conjunction of literals of length n in average time O(n log n) using the fastest known congruence closure algorithm. It is also shown that if the axiomatization of the theory of list structure is changed slightly, the problem of determining the satisfiability of a conjunction of literals becomes NP-complete. The decision procedures have been implemented in the authors' simplifier for the Stanford Pascal Verifier.} + urldate = {2021-07-08} +} + +@inproceedings{nielsonPrinciplesProgramAnalysis1999, + title = {Principles of {{Program Analysis}}}, + author = {Nielson, Flemming and Nielson, Hanne Riis and Hankin, Chris}, + year = {1999}, + publisher = {{Springer Berlin Heidelberg}}, + address = {{Berlin, Heidelberg}}, + doi = {10.1007/978-3-662-03811-6}, + urldate = {2023-09-07}, + isbn = {978-3-642-08474-4 978-3-662-03811-6}, + langid = {english}, + file = {/home/sguillou/Zotero/storage/WBULJFVT/Nielson et al. - 1999 - Principles of Program Analysis.pdf} } @article{nieuwenhuisFastCongruenceClosure2007, @@ -2749,10 +2892,9 @@ @article{nieuwenhuisFastCongruenceClosure2007 issn = {0890-5401}, doi = {10.1016/j.ic.2006.08.009}, urldate = {2021-07-06}, - abstract = {Congruence closure algorithms for deduction in ground equational theories are ubiquitous in many (semi-)decision procedures used for verification and automated deduction. In many of these applications one needs an incremental algorithm that is moreover capable of recovering, among the thousands of input equations, the small subset that explains the equivalence of a given pair of terms. In this paper we present an algorithm satisfying all these requirements. First, building on ideas from abstract congruence closure algorithms, we present a very simple and clean incremental congruence closure algorithm and show that it runs in the best known time O(n log n). After that, we introduce a proof-producing union-find data structure that is then used for extending our congruence closure algorithm, without increasing the overall O(n log n) time, in order to produce a k-step explanation for a given equation in almost optimal time (quasi-linear in k). 
Finally, we show that the previous algorithms can be smoothly extended, while still obtaining the same asymptotic time bounds, in order to support the interpreted functions symbols successor and predecessor, which have been shown to be very useful in applications such as microprocessor verification.}, langid = {english}, keywords = {Congruence closure,Decision procedures,Equational reasoning,Verification}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\TPZIP83Y\\Nieuwenhuis et Oliveras - 2007 - Fast congruence closure and extensions.pdf;C\:\\Users\\Simon\\Zotero\\storage\\3U432IXW\\S0890540106001581.html} + file = {/home/sguillou/Zotero/storage/TPZIP83Y/Nieuwenhuis et Oliveras - 2007 - Fast congruence closure and extensions.pdf;/home/sguillou/Zotero/storage/3U432IXW/S0890540106001581.html} } @article{noelExperimentingIsabelleZF1993, @@ -2766,16 +2908,14 @@ @article{noelExperimentingIsabelleZF1993 issn = {0168-7433, 1573-0670}, doi = {10.1007/BF00881863}, urldate = {2021-06-22}, - abstract = {The theorem prover Isabelle has been used to axiomatise ZF set theory with natural deduction and to prove a number of theorems concerningfunctions. In particular, the well-foundedrecursion theorem has been derived, allowing the definition of functions over recursive types (such as the length and the append functions for lists). The theory of functions has been developed sufficiently within ZF to include PP2, the theory of continuous functions forming the basis of LCF. Most of the theorems have been derived using backward proofs, with a small amount of automation.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\8LVVXKPA\\Noel - 1993 - Experimenting with Isabelle in ZF set theory.pdf} + file = {/home/sguillou/Zotero/storage/8LVVXKPA/Noel - 1993 - Experimenting with Isabelle in ZF set theory.pdf} } @book{oderskyFoundationsImplicitFunction2017, title = {Foundations of {{Implicit Function Types}}}, editor = {Odersky, Martin and Biboudis, Aggelos and Liu, Fengyun and Blanvillain, Olivier}, - year = {2017}, - abstract = {Implicit parameters are used pervasively in Scala and are also present in a number of other programming and theorem proving languages. This paper describes a generalization of implicit parameters as they are currently found in Scala to implicit function types. We motivate the construct by a series of examples and provide formal foundations that closely follow the semantics implemented by the Scala compiler} + year = {2017} } @article{onerShefferStrokeOperation2017, @@ -2791,17 +2931,16 @@ @article{onerShefferStrokeOperation2017 issn = {2391-5455}, doi = {10.1515/math-2017-0075}, urldate = {2022-05-19}, - abstract = {In this study, a term operation Sheffer stroke is presented in a given basic algebra {$\mathscr{A}$} and the properties of the Sheffer stroke reduct of {$\mathscr{A}$} are examined. In addition, we qualify such Sheffer stroke basic algebras. 
Finally, we construct a bridge between Sheffer stroke basic algebras and Boolean algebras.}, langid = {english}, keywords = {Basic algebras,Lattice with antitone involutions,Sheffer stroke reduct}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\K7BHLYT4\\Oner and Senturk - 2017 - The Sheffer stroke operation reducts of basic alge.pdf} + file = {/home/sguillou/Zotero/storage/K7BHLYT4/Oner and Senturk - 2017 - The Sheffer stroke operation reducts of basic alge.pdf} } @misc{OrthologicQuantumLogic, title = {Orthologic and Quantum Logic: Models and Computational Elements: {{Journal}} of the {{ACM}}: {{Vol}} 47, {{No}} 4}, urldate = {2023-06-23}, howpublished = {https://dl.acm.org/doi/10.1145/347476.347481}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\V5FW2N4I\\347476.html} + file = {/home/sguillou/Zotero/storage/V5FW2N4I/347476.html} } @article{paulin-mohringIntroductionCalculusInductive, @@ -2810,7 +2949,7 @@ @article{paulin-mohringIntroductionCalculusInductive pages = {14}, langid = {english}, keywords = {Print,Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\8HYTAD2T\\Paulin-Mohring - Introduction to the Calculus of Inductive Construc.pdf} + file = {/home/sguillou/Zotero/storage/8HYTAD2T/Paulin-Mohring - Introduction to the Calculus of Inductive Construc.pdf} } @article{paulsonFoundationGenericTheorem1989, @@ -2825,35 +2964,31 @@ @article{paulsonFoundationGenericTheorem1989 issn = {0168-7433, 1573-0670}, doi = {10.1007/BF00248324}, urldate = {2021-06-22}, - abstract = {Isabelle [28, 30] is an interactive theorem prover that supports a variety of logics. It represents rules as propositions (not as functions) and builds proofs by combining rules. These operations constitute a meta-logic (or `logical framework') in which the object-logics are formalized. Isabelle is now based on higher-order logic \textemdash{} a precise and well-understood foundation.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\7CATU6YQ\\Paulson - 1989 - The foundation of a generic theorem prover.pdf} + file = {/home/sguillou/Zotero/storage/7CATU6YQ/Paulson - 1989 - The foundation of a generic theorem prover.pdf} } @article{paulsonGenericTableauProver, title = {A {{Generic Tableau Prover}} and Its {{Integration}} with {{Isabelle}}}, author = {Paulson, Lawrence C}, pages = {16}, - abstract = {A generic tableau prover has been implemented and integrated with Isabelle (Paulson, 1994). Compared with classical first-order logic provers, it has numerous extensions that allow it to reason with any supplied set of tableau rules. It has a higherorder syntax in order to support user-defined binding operators, such as those of set theory. The unification algorithm is first-order instead of higher-order, but it includes modifications to handle bound variables.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\QHLYBIFS\\Paulson - A Generic Tableau Prover and its Integration with .pdf} + file = {/home/sguillou/Zotero/storage/2YW536JF/Paulson - A Generic Tableau Prover and its Integration with .pdf} } @article{paulsonGenericTableauProvera, title = {A {{Generic Tableau Prover}} and Its {{Integration}} with {{Isabelle}}}, author = {Paulson, Lawrence C}, pages = {16}, - abstract = {A generic tableau prover has been implemented and integrated with Isabelle (Paulson, 1994). Compared with classical first-order logic provers, it has numerous extensions that allow it to reason with any supplied set of tableau rules. 
It has a higherorder syntax in order to support user-defined binding operators, such as those of set theory. The unification algorithm is first-order instead of higher-order, but it includes modifications to handle bound variables.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\2YW536JF\\Paulson - A Generic Tableau Prover and its Integration with .pdf} + file = {/home/sguillou/Zotero/storage/QHLYBIFS/Paulson - A Generic Tableau Prover and its Integration with .pdf} } @article{paulsonGenericTableauProverb, title = {A {{Generic Tableau Prover}} and Its {{Integration}} with {{Isabelle}}}, author = {Paulson, Lawrence C}, - abstract = {A generic tableau prover has been implemented and integrated with Isabelle (Paulson, 1994). Compared with classical first-order logic provers, it has numerous extensions that allow it to reason with any supplied set of tableau rules. It has a higherorder syntax in order to support user-defined binding operators, such as those of set theory. The unification algorithm is first-order instead of higher-order, but it includes modifications to handle bound variables.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\BE3WYU83\\Paulson - A Generic Tableau Prover and its Integration with .pdf} + file = {/home/sguillou/Zotero/storage/BE3WYU83/Paulson - A Generic Tableau Prover and its Integration with .pdf} } @article{paulsonIsabelleNext7001993, @@ -2878,19 +3013,17 @@ @article{paulsonMechanizingSetTheory1996 issn = {0168-7433, 1573-0670}, doi = {10.1007/BF00283132}, urldate = {2021-05-11}, - abstract = {Fairly deep results of Zermelo-Fr\textasciitilde enkel (ZF) set theory have been mechanized using the proof assistant Isabelle. The results concern cardinal arithmetic and the Axiom of Choice (AC). A key result about cardinal multiplication is n | n = n, where n is any infinite cardinal. Proving this result required developing theories of orders, order-isomorphisms, order types, ordinal arithmetic, cardinals, etc.; this covers most of Kunen, Set Theory, Chapter I. Furthermore, we have proved the equivalence of 7 formulations of the Well-ordering Theorem and 20 formulations of AC; this covers the first two chapters of Rubin and Rubin, Equivalentsof the Axiom of Choice,and involves highly technical material. The definitions used in the proofs are largely faithful in style to the original mathematics.}, langid = {english}, keywords = {Print,Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\SPE72WIA\\Paulson et Grabczewski - 1996 - Mechanizing set theory Cardinal arithmetic and th.pdf} + file = {/home/sguillou/Zotero/storage/SPE72WIA/Paulson et Grabczewski - 1996 - Mechanizing set theory Cardinal arithmetic and th.pdf} } @article{pfenningInductivelyNedTypes, title = {Inductively {{De}} Ned {{Types}} in the {{Calculus}} of {{Constructions}}}, author = {Pfenning, Frank and {Paulin-Mohring}, Christine}, pages = {20}, - abstract = {We de ne the notion of an inductively de ned type in the Calculus of Constructions and show how inductively de ned types can be represented by closed types. We show that all primitive recursive functionals over these inductively de ned types are also representable. This generalizes work by B ohm \& Berarducci on synthesis of functions on term algebras in the second-order polymorphic -calculus (F2). We give several applications of this generalization, including a representation of F2-programs in F3, along with a de nition of functions reify, reflect, and eval for F2 in F3. 
We also show how to de ne induction over inductively de ned types and sketch some results that show that the extension of the Calculus of Construction by induction principles does not alter the set of functions in its computational fragment, F!. This is because a proof by induction can be realized by primitive recursion, which is already de nable in F!.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\ZREBQ9A8\\Pfenning et Paulin-Mohring - Inductively Dened Types in the Calculus of Constr.pdf} + file = {/home/sguillou/Zotero/storage/ZREBQ9A8/Pfenning et Paulin-Mohring - Inductively Dened Types in the Calculus of Constr.pdf} } @article{piskacDecidingEffectivelyPropositional2010, @@ -2905,7 +3038,6 @@ @article{piskacDecidingEffectivelyPropositional2010 issn = {1573-0670}, doi = {10.1007/s10817-009-9161-6}, urldate = {2023-06-27}, - abstract = {We introduce a DPLL calculus that is a decision procedure for the Bernays-Sch\"onfinkel class, also known as EPR. Our calculus allows combining techniques for efficient propositional search with data-structures, such as Binary Decision Diagrams, that can efficiently and succinctly encode finite sets of substitutions and operations on these. In the calculus, clauses comprise of a sequence of literals together with a finite set of substitutions; truth assignments are also represented using substitution sets. The calculus works directly at the level of sets, and admits performing simultaneous constraint propagation and decisions, resulting in potentially exponential speedups over existing approaches.}, langid = {english}, keywords = {BDDs,DPLL,Effectively propositional logic,SAT} } @@ -2921,11 +3053,10 @@ @inproceedings{piskacLinearArithmeticStars2008 publisher = {{Springer}}, address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-540-70545-1_25}, - abstract = {We consider an extension of integer linear arithmetic with a ``star'' operator takes closure under vector addition of the solution set of a linear arithmetic subformula. We show that the satisfiability problem for this extended language remains in NP (and therefore NP-complete). Our proof uses semilinear set characterization of solutions of integer linear arithmetic formulas, as well as a generalization of a recent result on sparse solutions of integer linear programming problems. As a consequence of our result, we present worst-case optimal decision procedures for two NP-hard problems that were previously not known to be in NP. The first is the satisfiability problem for a logic of sets, multisets (bags), and cardinality constraints, which has applications in verification, interactive theorem proving, and description logics. 
The second is the reachability problem for a class of transition systems whose transitions increment the state vector by solutions of integer linear arithmetic formulas.}, isbn = {978-3-540-70545-1}, langid = {english}, keywords = {Atomic Formula,Decision Procedure,Integer Vector,Regular Expression,Star Operator}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\M78U2AQX\\Piskac and Kuncak - 2008 - Linear Arithmetic with Stars.pdf} + file = {/home/sguillou/Zotero/storage/M78U2AQX/Piskac and Kuncak - 2008 - Linear Arithmetic with Stars.pdf} } @article{pretolaniHierarchiesPolynomiallySolvable1996, @@ -2940,19 +3071,17 @@ @article{pretolaniHierarchiesPolynomiallySolvable1996 issn = {1012-2443, 1573-7470}, doi = {10.1007/BF02127974}, urldate = {2023-06-10}, - abstract = {In this paper, we introduce general techniques for extending classes of polynomially solvable SAT instances. We generalize the approach of Gallo and Scutell\`a, who defined the hierarchy \{{$\Gamma$}i\}, where {$\Gamma$}l corresponds to the Generalized Horn class. We propose a family of polynomial hierarchies, where a polynomial hierarchy \{{$\Pi$}i\} is a sequence of polynomially solvable classes that cover the whole set of CNF formulas, and such that {$\Pi$}i {$\cap$} {$\Pi$}i+1 fori{$\geq$}0. Following a different approach, based on a new decomposition technique, we define the class of Split-Horn formulas, which is an extension of {$\Gamma$}l. We discuss and compare the basic properties of the proposed classes; polynomial time algorithms for recognition and solution are provided.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\QE8RGP9I\\Pretolani - 1996 - Hierarchies of polynomially solvable satisfiabilit.pdf} + file = {/home/sguillou/Zotero/storage/QE8RGP9I/Pretolani - 1996 - Hierarchies of polynomially solvable satisfiabilit.pdf} } @misc{Publications, title = {Publications}, journal = {CVC4}, urldate = {2021-09-29}, - abstract = {An efficient open-source automatic theorem prover for satisfiability modulo theories (SMT) problems.}, howpublished = {https://cvc4.github.io/publications.html}, langid = {american}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\YD5X6Q6C\\publications.html} + file = {/home/sguillou/Zotero/storage/YD5X6Q6C/publications.html} } @incollection{pudlakLengthsProofs1998, @@ -2967,7 +3096,7 @@ @incollection{pudlakLengthsProofs1998 urldate = {2021-02-22}, isbn = {978-0-444-89840-1}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\F835D42D\\ProofLengthPudlak-1.pdf} + file = {/home/sguillou/Zotero/storage/F835D42D/ProofLengthPudlak-1.pdf} } @article{ramseyProblemFormalLogic1930, @@ -2983,7 +3112,7 @@ @article{ramseyProblemFormalLogic1930 urldate = {2023-06-27}, copyright = {\textcopyright{} 1930 London Mathematical Society}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\HD35THNG\\s2-30.1.html} + file = {/home/sguillou/Zotero/storage/HD35THNG/s2-30.1.html} } @article{rawlingOrthologicQuantumLogic2000, @@ -2999,16 +3128,15 @@ @article{rawlingOrthologicQuantumLogic2000 issn = {0004-5411}, doi = {10.1145/347476.347481}, urldate = {2023-06-23}, - abstract = {Motivated by a growing need to understand the computational potential of quantum devices we suggest an approach to the relevant issues via quantum logic and its model theory. 
By isolating such notions as quantum parallelism and interference within a model-theoretic setting, quite divorced from their customary physical trappings, we seek to lay bare their logical underpinnings and possible computational ramifications. In the first part of the paper, a brief account of the relevant model theory is given, and some new results are derived. In the second part, we model the simplest classical gate, namely the N-gate, propose a quantization scheme (which translates between classical and quantum models, and from which emerges a logical interpretation of the notion of quantum parallelism), and apply it to the classical N-gate model. A class of physical instantiations of the resulting quantum N-gate model is also briefly discussed.}, keywords = {Hilbert spaces,quantum computing,quantum logic,quantum physics}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\FTQ5ZZEW\\Rawling and Selesnick - 2000 - Orthologic and quantum logic models and computati.pdf} + file = {/home/sguillou/Zotero/storage/FTQ5ZZEW/Rawling and Selesnick - 2000 - Orthologic and quantum logic models and computati.pdf} } @article{reynoldsPolymorphismNotSettheoretic, title = {Polymorphism Is Not Set-Theoretic}, author = {Reynolds, John}, pages = {16}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\VBRVBSUR\\Reynolds - Polymorphism is not set-theoretic.pdf} + file = {/home/sguillou/Zotero/storage/VBRVBSUR/Reynolds - Polymorphism is not set-theoretic.pdf} } @book{robinsonHandbookAutomatedReasoning2001, @@ -3018,7 +3146,8 @@ @book{robinsonHandbookAutomatedReasoning2001 publisher = {{Elsevier and MIT Press}}, address = {{MIT}}, urldate = {2023-04-01}, - isbn = {978-0-444-50813-3} + isbn = {978-0-444-50813-3}, + file = {/home/sguillou/Zotero/storage/C5JQWH5P/Robinson A., Voronkov A. (eds.) - Handbook of automated reasoning, vol.2-Elsevier (2001).pdf;/home/sguillou/Zotero/storage/CDZ8RB6Y/Robinson A., Voronkov A. (eds.) - Handbook of automated reasoning Vol. 1 [...]-Amsterdam [u.a] Elsevier [u.a.] (2001).pdf} } @book{rubinEquivalentsAxiomChoice1963, @@ -3040,10 +3169,9 @@ @incollection{rubioPrecedencebasedTotalACcompatible1993 address = {{Berlin, Heidelberg}}, doi = {10.1007/3-540-56868-9_28}, urldate = {2021-05-06}, - abstract = {Like Narendran and Ruainowitch (NR91), we define a simplification ordering which is AC-compatible and total on non-AC-equivalent ground terms, without any restrictions on the signature like the number of AC-symbola or free symbols. An important difference w.r.t. their work is that our ordering is not based on polynomial interpretations, but on a total \{arbitrary) precedence on the function symbols, like in LPO or RPO (this solves an open question posed e.g. by Bachmair (Bac91]).\vphantom\}}, isbn = {978-3-540-56868-1}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\6C7FC5DZ\\Rubio et Nieuwenhuis - 1993 - A precedence-based total AC-compatible ordering.pdf} + file = {/home/sguillou/Zotero/storage/6C7FC5DZ/Rubio et Nieuwenhuis - 1993 - A precedence-based total AC-compatible ordering.pdf} } @article{schinzTailCallElimination2001, @@ -3055,8 +3183,23 @@ @article{schinzTailCallElimination2001 volume = {59}, pages = {158--171}, doi = {10.1016/S1571-0661(05)80459-1}, - abstract = {A problem that often has to be solved by compilers for functional languages targeting the Java Virtual Machine is the elimination of tail calls. 
This paper explains how we solved it in our Funnel compiler and presents some experimental results about the impact our technique has on both performance and size of the compiled programs.}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\7URK6P2G\\Schinz et Odersky - 2001 - Tail Call Elimination on the Java Virtual Machine.pdf} + file = {/home/sguillou/Zotero/storage/7URK6P2G/Schinz et Odersky - 2001 - Tail Call Elimination on the Java Virtual Machine.pdf} +} + +@article{schuetteInterpolationssatzIntuitionistischenPraedikatenlogik1962, + title = {{Der Interpolationssatz der intuitionistischen Pr\"adikatenlogik}}, + author = {Sch{\"u}tte, Kurt}, + year = {1962}, + month = jun, + journal = {Mathematische Annalen}, + volume = {148}, + number = {3}, + pages = {192--200}, + issn = {1432-1807}, + doi = {10.1007/BF01470747}, + urldate = {2023-09-05}, + langid = {ngerman}, + file = {/home/sguillou/Zotero/storage/665TJ4DD/Schütte - 1962 - Der Interpolationssatz der intuitionistischen Präd.pdf} } @article{schultemontingCutEliminationWord1981, @@ -3071,9 +3214,8 @@ @article{schultemontingCutEliminationWord1981 issn = {0002-5240, 1420-8911}, doi = {10.1007/BF02483891}, urldate = {2023-06-12}, - abstract = {Semantic Scholar extracted view of "Cut elimination and word problems for varieties of lattices" by J. Schulte M\"onting}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\9NKZNRB4\\Schulte Mönting - 1981 - Cut elimination and word problems for varieties of.pdf} + file = {/home/sguillou/Zotero/storage/9NKZNRB4/Schulte Mönting - 1981 - Cut elimination and word problems for varieties of.pdf} } @inproceedings{schulzSystemDescription2013, @@ -3088,11 +3230,10 @@ @inproceedings{schulzSystemDescription2013 publisher = {{Springer}}, address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-45221-5_49}, - abstract = {E is a theorem prover for full first-order logic with equality. It reduces first-order problems to clause normal form and employs a saturation algorithm based on the equational superposition calculus. E is built on shared terms with cached rewriting, and employs several innovations for efficient clause indexing. Major strengths of the system are automatic problem analysis and highly flexible search heuristics. The prover can provide verifiable proof objects and answer substitutions with very little overhead. E performs well, solving more than 69\% of TPTP-5.4.0 FOF and CNF problems in automatic mode.}, isbn = {978-3-642-45221-5}, langid = {english}, keywords = {Answer Substitution,Empty Clause,Inference Rule,Proof Object,Proof Tree}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\CMKX2ZKT\\Schulz - 2013 - System Description E 1.8.pdf} + file = {/home/sguillou/Zotero/storage/CMKX2ZKT/Schulz - 2013 - System Description E 1.8.pdf} } @inproceedings{schurrReliableReconstructionFinegrained2021, @@ -3106,11 +3247,10 @@ @inproceedings{schurrReliableReconstructionFinegrained2021 publisher = {{Springer International Publishing}}, address = {{Cham}}, doi = {10.1007/978-3-030-79876-5_26}, - abstract = {We present a fast and reliable reconstruction of proofs generated by the SMT solver veriT in Isabelle. The fine-grained proof format makes the reconstruction simple and efficient. For typical proof steps, such as arithmetic reasoning and skolemization, our reconstruction can avoid expensive search. By skipping proof steps that are irrelevant for Isabelle, the performance of proof checking is improved. 
Our method increases the success rate of Sledgehammer by halving the failure rate and reduces the checking time by 13\%. We provide a detailed evaluation of the reconstruction time for each rule. The runtime is influenced by both simple rules that appear very often and common complex rules.}, isbn = {978-3-030-79876-5}, langid = {english}, keywords = {automatic theorem provers,proof assistants,proof verification}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\YI3QTC29\\Schurr et al. - 2021 - Reliable Reconstruction of Fine-grained Proofs in .pdf} + file = {/home/sguillou/Zotero/storage/YI3QTC29/Schurr et al. - 2021 - Reliable Reconstruction of Fine-grained Proofs in .pdf} } @book{SetTheory2003, @@ -3124,7 +3264,7 @@ @book{SetTheory2003 isbn = {978-3-540-44085-7}, langid = {english}, keywords = {algebra,arithmetic,axiom of choice,Boolean algebra,Cardinal number,cardinals,field,forcing,Mengenlehre,set theory,ultrapower,YellowSale2006}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\V78S4MTI\\2003 - Set Theory.pdf} + file = {/home/sguillou/Zotero/storage/V78S4MTI/2003 - Set Theory.pdf} } @book{SetTheoryIntroduction, @@ -3132,15 +3272,14 @@ @book{SetTheoryIntroduction shorttitle = {Set {{Theory}}}, urldate = {2022-05-10}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\PNP73XA2\\9780817642563.html} + file = {/home/sguillou/Zotero/storage/PNP73XA2/9780817642563.html} } @misc{sheeranTutorialStalmarckProof1998, title = {A Tutorial on {{St\aa lmarck}}'s Proof Procedure for Propositional Logic}, author = {Sheeran, Mary and Stalmarck, Gunnar}, year = {1998}, - abstract = {. We explain Stalmarck's proof procedure for classical propositional logic. The method is implemented in a commercial tool that has been used successfully in real industrial verification projects. Here, we present the proof system underlying the method, and motivate the various design decisions that have resulted in a system that copes well with the large formulas encountered in industrial-scale verification. We also discuss possible applications in Computer Aided Design of electronic circuits. c flSpringer To appear in the proceedings of FMCAD'98, published in Springer LNCS. A tutorial on Stalmarck's proof procedure for propositional logic Mary Sheeran and Gunnar Stalmarck Prover Technology AB and Chalmers University of Technology, Sweden Abstract. We explain Stalmarck's proof procedure for classical propositional logic. The method is implemented in a commercial tool that has been used successfully in real industrial verification projects. Here, we present the proof system underly...}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\KSSHCHYH\\Sheeran et al. - 1998 - A tutorial on Stålmarck's proof procedure for prop.pdf;C\:\\Users\\Simon\\Zotero\\storage\\6B8P5VDQ\\summary.html} + file = {/home/sguillou/Zotero/storage/KSSHCHYH/Sheeran et al. - 1998 - A tutorial on Stålmarck's proof procedure for prop.pdf;/home/sguillou/Zotero/storage/6B8P5VDQ/summary.html} } @article{sherifDecisionProblemOrthomodular1997, @@ -3155,9 +3294,8 @@ @article{sherifDecisionProblemOrthomodular1997 issn = {0002-5240}, doi = {10.1007/PL00000328}, urldate = {2022-08-02}, - abstract = {This paper answers a question of H. P. Sankappanavar who asked whether the theory of orthomodular lattices is recursively (finitely) inseparable (question 9 in [10]). A very similar question was raised by Stanley Burris at the Oberwolfach meeting on Universal Algebra, July 15 \textendash{} 21, 1979, and was later included in G. 
Kalmbach's monograph [6] as the problem 42. Actually Burris asked which varieties of orthomodular lattices are finitely decidable. Although we are not able to give a full answer to Burris' question we have a contribution to the problem.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\PI3PRMLU\\Sherif - 1997 - Decision problem for orthomodular lattices.pdf} + file = {/home/sguillou/Zotero/storage/PI3PRMLU/Sherif - 1997 - Decision problem for orthomodular lattices.pdf} } @article{shulmanSetTheoryCategory2008, @@ -3169,11 +3307,10 @@ @article{shulmanSetTheoryCategory2008 eprint = {0810.1279}, primaryclass = {math}, urldate = {2021-06-23}, - abstract = {Questions of set-theoretic size play an essential role in category theory, especially the distinction between sets and proper classes (or small sets and large sets). There are many different ways to formalize this, and which choice is made can have noticeable effects on what categorical constructions are permissible. In this expository paper we summarize and compare a number of such ``set-theoretic foundations for category theory,'' and describe their implications for the everyday use of category theory. We assume the reader has some basic knowledge of category theory, but little or no prior experience with formal logic or set theory.}, archiveprefix = {arxiv}, langid = {english}, keywords = {Mathematics - Category Theory,Mathematics - Logic}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\9UB7CGAP\\Shulman - 2008 - Set theory for category theory.pdf} + file = {/home/sguillou/Zotero/storage/9UB7CGAP/Shulman - 2008 - Set theory for category theory.pdf} } @article{silesInvestigationTypingEquality, @@ -3182,7 +3319,7 @@ @article{silesInvestigationTypingEquality pages = {131}, langid = {english}, keywords = {Print,Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\A7F6SMQJ\\Siles - Investigation on the typing of equality in type sy.pdf} + file = {/home/sguillou/Zotero/storage/A7F6SMQJ/Siles - Investigation on the typing of equality in type sy.pdf} } @article{silesPureTypeSystem2011, @@ -3190,9 +3327,8 @@ @article{silesPureTypeSystem2011 author = {Siles, Vincent and Herbelin, Hugo}, year = {2011}, pages = {28}, - abstract = {Pure Type Systems are usually described in two different ways, one that uses an external notion of computation like beta-reduction, and one that relies on a typed judgment of equality, directly in the typing system.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\W33BJ92X\\Siles and Herbelin - 2011 - Pure Type System conversion is always typable.pdf} + file = {/home/sguillou/Zotero/storage/W33BJ92X/Siles and Herbelin - 2011 - Pure Type System conversion is always typable.pdf} } @inproceedings{slindBriefOverviewHOL42008, @@ -3202,9 +3338,8 @@ @inproceedings{slindBriefOverviewHOL42008 month = aug, pages = {28--32}, doi = {10.1007/978-3-540-71067-7_6}, - abstract = {The HOLF proof assistant supports specification and proof in classical higher order logic. It is the latest in a long line of similar systems. 
In this short overview, we give an outline of the HOLF system and how it may be applied in formal verification.}, isbn = {978-3-540-71065-3}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\CTN5KQJG\\Slind et Norrish - 2008 - A Brief Overview of HOL4.pdf} + file = {/home/sguillou/Zotero/storage/CTN5KQJG/Slind et Norrish - 2008 - A Brief Overview of HOL4.pdf} } @article{songExpansionPostponementPure1997, @@ -3219,9 +3354,8 @@ @article{songExpansionPostponementPure1997 issn = {1860-4749}, doi = {10.1007/BF02947207}, urldate = {2020-10-19}, - abstract = {The expansion postponement problem in Pure Type Systems is an open problem raised by R. Pollack in 1992. In this paper, the author presents a set of necessary and sufficient conditions for this problem and a set of sufficient conditions for it. The author also gives some properties for pure type systems without the expansion rule.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\6HDPF3YG\\Song - 1997 - The expansion postponement in Pure Type Systems.pdf} + file = {/home/sguillou/Zotero/storage/6HDPF3YG/Song - 1997 - The expansion postponement in Pure Type Systems.pdf} } @article{sorensenLecturesCurryHowardIsomorphism2010, @@ -3232,9 +3366,8 @@ @article{sorensenLecturesCurryHowardIsomorphism2010 journal = {Studies in Logic and the Foundations of Mathematics}, volume = {149}, doi = {10.1016/S0049-237X(06)80005-4}, - abstract = {The Curry-Howard isomorphism states an amazing correspondence between systems of formal logic as encountered in proof theory and computational calculi as found in type theory. For instance, minimal propositional logic corresponds to simply typed-calculus, first-order logic corresponds to dependent types, second-order logic corresponds to polymorphic types, etc. The isomorphism has many aspects, even at the syntactic level: formulas correspond to types, proofs correspond to terms, provability corresponds to inhabitation, proof normalization corresponds to term reduction, etc. But there is much more to the isomorphism than this. For instance, it is an old idea---due to Brouwer, Kolmogorov, and Heyting, and later formalized by Kleene's realizability interpretation---that a constructive proof of an implication is a procedure that transforms proofs of the antecedent into proofs of the succedent; the Curry-Howard isomorphism gives syntactic representations of such procedures. These notes give an introduction to parts of proof theory and related aspects of type theory relevant for the Curry-Howard isomorphism.}, keywords = {Print,Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\QCZLVT2Y\\curry-howard.pdf} + file = {/home/sguillou/Zotero/storage/QCZLVT2Y/curry-howard.pdf} } @article{spiveyNotationReferenceManual, @@ -3242,7 +3375,7 @@ @article{spiveyNotationReferenceManual author = {Spivey, J M}, pages = {168}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\FUUZRIV7\\Spivey - Programming Research Group University of Oxford.pdf} + file = {/home/sguillou/Zotero/storage/FUUZRIV7/Spivey - Programming Research Group University of Oxford.pdf} } @misc{steenHigherOrderProverLeoIII2018, @@ -3256,10 +3389,9 @@ @misc{steenHigherOrderProverLeoIII2018 publisher = {{arXiv}}, doi = {10.48550/arXiv.1802.02732}, urldate = {2023-05-04}, - abstract = {The automated theorem prover Leo-III for classical higher-order logic with Henkin semantics and choice is presented. 
Leo-III is based on extensional higher-order paramodulation and accepts every common TPTP dialect (FOF, TFF, THF), including their recent extensions to rank-1 polymorphism (TF1, TH1). In addition, the prover natively supports almost every normal higher-order modal logic. Leo-III cooperates with first-order reasoning tools using translations to many-sorted first-order logic and produces verifiable proof certificates. The prover is evaluated on heterogeneous benchmark sets.}, archiveprefix = {arxiv}, keywords = {{03B35, 03B15, 03B45, 68T15, 68T27, 68T30:, 03Bxx},Computer Science - Artificial Intelligence,Computer Science - Logic in Computer Science,F.4.1,I.2.3,I.2.4,Mathematics - Logic}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\MK2KSZKE\\Steen and Benzmüller - 2018 - The Higher-Order Prover Leo-III (Extended Version).pdf;C\:\\Users\\Simon\\Zotero\\storage\\DLRXKXPE\\1802.html} + file = {/home/sguillou/Zotero/storage/MK2KSZKE/Steen and Benzmüller - 2018 - The Higher-Order Prover Leo-III (Extended Version).pdf;/home/sguillou/Zotero/storage/DLRXKXPE/1802.html} } @misc{steenLeoIII2022, @@ -3269,10 +3401,9 @@ @misc{steenLeoIII2022 month = jul, doi = {10.5281/zenodo.7650205}, urldate = {2023-02-19}, - abstract = {Leo-III~is an automated theorem prover for (polymorphic) higher-order logic which supports all common TPTP dialects, including THF, TFF and FOF as well as their rank-1 polymorphic derivatives. It is based on a paramodulation calculus with ordering constraints and, in tradition of its predecessor LEO-II, heavily relies on cooperation with external (mostly first-order) theorem provers for increased performance. Nevertheless, Leo-III can also be used as a stand-alone prover without employing any external cooperation. Leo-III version 1.7. Minor updates: Fixes around ground arithmetic support Updated versions of scala library and other dependencies Updated embedding runtime for NCL reasoning This version was used in CASC-J11 (http://tptp.org/CASC/J11/).}, howpublished = {Zenodo}, keywords = {artificial intelligence,automated deduction,automated reasoning,automated theorem prover,computational logic,higher-order logic,logic,reasoning,scala,theorem proving,TPTP}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\65MYUY2S\\7650205.html} + file = {/home/sguillou/Zotero/storage/65MYUY2S/7650205.html} } @book{stefanovaSimpleModelConstruction1995, @@ -3282,10 +3413,9 @@ @book{stefanovaSimpleModelConstruction1995 month = jan, pages = {264}, doi = {10.1007/3-540-61780-9_74}, - abstract = {We present a model construction for the Calculus of Constructions (CC) where all dependencies are carried out in a set-theoretical setting. The Soundness Theorem is proved and as a consequence of it Strong Normalization for CC is obtained. 
Some other applications of our model constructions are: showing that CC + Classical logic is consistent (by constructing a model for it) and showing that the Axiom of Choice is not derivable in CC (by constructing a model in which the type that represents the Axiom of Choice is empty).}, isbn = {978-3-540-61780-8}, keywords = {Read}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\D2RI2YND\\Stefanova and Geuvers - 1995 - A Simple Model Construction for the Calculus of Co.pdf} + file = {/home/sguillou/Zotero/storage/D2RI2YND/Stefanova and Geuvers - 1995 - A Simple Model Construction for the Calculus of Co.pdf} } @inproceedings{steindorferOptimizingHasharrayMapped2015, @@ -3299,19 +3429,17 @@ @inproceedings{steindorferOptimizingHasharrayMapped2015 address = {{Pittsburgh PA USA}}, doi = {10.1145/2814270.2814312}, urldate = {2021-06-08}, - abstract = {The data structures under-pinning collection API (e.g. lists, sets, maps) in the standard libraries of programming languages are used intensively in many applications.}, isbn = {978-1-4503-3689-5}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\5WEHWNLQ\\Steindorfer et Vinju - 2015 - Optimizing hash-array mapped tries for fast and le.pdf} + file = {/home/sguillou/Zotero/storage/5WEHWNLQ/Steindorfer et Vinju - 2015 - Optimizing hash-array mapped tries for fast and le.pdf} } @article{sutcliffePracticeClausificationAutomatic, title = {The {{Practice}} of {{Clausification}} in {{Automatic Theorem Proving}}}, author = {Sutcliffe, Geoff and Melville, Stuart}, pages = {21}, - abstract = {In the process of resolution based Automatic Theorem Proving, problems expressed in First Order Form (FOF) are transformed by a clausifier to Clause Normal Form (CNF). This research examines and compares clausifiers. The boundaries between clausification, simplification, and solution search are delineated, and common clausification and simplification operations are documented. Four known clausifiers are evaluated, thus providing insight into their relative performance, and also providing baseline data for future evaluation of clausifiers.}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\MI4LSCBT\\Sutcliffe and Melville - The Practice of Clausification in Automatic Theore.pdf} + file = {/home/sguillou/Zotero/storage/MI4LSCBT/Sutcliffe and Melville - The Practice of Clausification in Automatic Theore.pdf} } @inproceedings{taitNormalDerivabilityClassical1968, @@ -3328,7 +3456,7 @@ @inproceedings{taitNormalDerivabilityClassical1968 isbn = {978-3-540-35900-5}, langid = {english}, keywords = {Axiom System,Normal Derivation,Peano Arithmetic,Predicate Variable,Propositional Logic}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\2IRXTZFC\\Tait - 1968 - Normal derivability in classical logic.pdf} + file = {/home/sguillou/Zotero/storage/2IRXTZFC/Tait - 1968 - Normal derivability in classical logic.pdf} } @incollection{thakurGeneralizationStalmarckMethod2012, @@ -3343,10 +3471,9 @@ @incollection{thakurGeneralizationStalmarckMethod2012 address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-33125-1_23}, urldate = {2022-09-28}, - abstract = {This paper gives an account of St\r{}almarck's method for validity checking of propositional-logic formulas, and explains each of the key components in terms of concepts from the field of abstract interpretation. We then use these insights to present a framework for propositional-logic validity-checking algorithms that is parametrized by an abstract domain and operations on that domain. 
St\r{}almarck's method is one instantiation of the framework; other instantiations lead to new decision procedures for propositional logic.}, isbn = {978-3-642-33124-4 978-3-642-33125-1}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\RBK2JQDX\\Thakur and Reps - 2012 - A Generalization of Stålmarck’s Method.pdf} + file = {/home/sguillou/Zotero/storage/RBK2JQDX/Thakur and Reps - 2012 - A Generalization of Stålmarck’s Method.pdf} } @inproceedings{tiurynSubtypeInequalities1992, @@ -3357,9 +3484,8 @@ @inproceedings{tiurynSubtypeInequalities1992 month = jun, pages = {308--315}, doi = {10.1109/LICS.1992.185543}, - abstract = {The satisfiability problem for subtype inequalities in simple types is studied. The naive algorithm that solves this problem runs in nondeterministic exponential time for every predefined poset of atomic subtypings the satisfiability problem for subtype inequalities is PSPACE-hard. On the other hand, it is proved that if the poset of atomic subtypings is a disjoint union of lattices, then the satisfiability problem for subtype inequalities is solvable in PTIME. This result covers the important special case of the unification problem that can be obtained when the atomic subtype relation is equality.{$<>$}}, keywords = {Calculus,Informatics,Lattices,Needed,Polynomials,Upper bound}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\CBF665SL\\Tiuryn - 1992 - Subtype inequalities.pdf;C\:\\Users\\Simon\\Zotero\\storage\\HRBCMGV2\\185543.html} + file = {/home/sguillou/Zotero/storage/CBF665SL/Tiuryn - 1992 - Subtype inequalities.pdf;/home/sguillou/Zotero/storage/HRBCMGV2/185543.html} } @inproceedings{traytelFoundationalCompositionalCo2012, @@ -3374,10 +3500,9 @@ @inproceedings{traytelFoundationalCompositionalCo2012 address = {{Dubrovnik, Croatia}}, doi = {10.1109/LICS.2012.75}, urldate = {2022-03-20}, - abstract = {Interactive theorem provers based on higher-order logic (HOL) traditionally follow the definitional approach, reducing high-level specifications to logical primitives. This also applies to the support for datatype definitions. However, the internal datatype construction used in HOL4, HOL Light, and Isabelle/HOL is fundamentally noncompositional, limiting its efficiency and flexibility, and it does not cater for codatatypes. We present a fully modular framework for constructing (co)datatypes in HOL, with support for mixed mutual and nested (co)recursion. Mixed (co)recursion enables type definitions involving both datatypes and codatatypes, such as the type of finitely branching trees of possibly infinite depth. Our framework draws heavily from category theory. The key notion is that of a bounded natural functor\textemdash an enriched type constructor satisfying specific properties preserved by interesting categorical operations. Our ideas are formalized in Isabelle and implemented as a new definitional package, answering a long-standing user request.}, isbn = {978-1-4673-2263-8 978-0-7695-4769-5}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\BZ3ASFF4\\Traytel et al. - 2012 - Foundational, Compositional (Co)datatypes for High.pdf} + file = {/home/sguillou/Zotero/storage/BZ3ASFF4/Traytel et al. 
- 2012 - Foundational, Compositional (Co)datatypes for High.pdf} } @incollection{tseitinComplexityDerivationPropositional1983, @@ -3392,7 +3517,6 @@ @incollection{tseitinComplexityDerivationPropositional1983 address = {{Berlin, Heidelberg}}, doi = {10.1007/978-3-642-81955-1_28}, urldate = {2023-06-12}, - abstract = {The question of the minimum complexity of derivation of a given formula in classical propositional calculus is considered in this article and it is proved that estimates of complexity may vary considerably among the various forms of propositional calculus. The forms of propositional calculus used in the present article are somewhat unusual, \textdagger{} but the results obtained for them can, in principle, be extended to the usual forms of propositional calculus.}, isbn = {978-3-642-81955-1}, langid = {english} } @@ -3424,8 +3548,22 @@ @unpublished{urbanIsabelleCookbook2013 year = {2013}, month = jul, urldate = {2023-02-18}, - abstract = {A Gentle Tutorial for Programming on the ML-Level of Isabelle}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\M5JZA4IM\\isabelle_programming.urban.pdf} + file = {/home/sguillou/Zotero/storage/M5JZA4IM/isabelle_programming.urban.pdf} +} + +@inproceedings{urzyczynInhabitationTypedLambdacalculi1997, + title = {Inhabitation in Typed Lambda-Calculi (a Syntactic Approach)}, + booktitle = {Typed {{Lambda Calculi}} and {{Applications}}}, + author = {Urzyczyn, Pawel}, + editor = {Goos, Gerhard and Hartmanis, Juris and Leeuwen, Jan and Groote, Philippe and Roger Hindley, J.}, + year = {1997}, + volume = {1210}, + pages = {373--389}, + publisher = {{Springer Berlin Heidelberg}}, + address = {{Berlin, Heidelberg}}, + doi = {10.1007/3-540-62688-3_47}, + urldate = {2023-08-23}, + isbn = {978-3-540-62688-6 978-3-540-68438-1} } @book{vaughtSetTheoryIntroduction1985, @@ -3437,7 +3575,7 @@ @book{vaughtSetTheoryIntroduction1985 googlebooks = {a4QZAQAAIAAJ}, isbn = {978-3-7643-3238-9}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\SWKPL9PC\\Robert L. Vaught - Set Theory_ An Introduction, Second Edition (2001).djvu} + file = {/home/sguillou/Zotero/storage/SWKPL9PC/Robert L. Vaught - Set Theory_ An Introduction, Second Edition (2001).djvu} } @incollection{vermaComplexityEquationalHorn2005, @@ -3452,10 +3590,9 @@ @incollection{vermaComplexityEquationalHorn2005 address = {{Berlin, Heidelberg}}, doi = {10.1007/11532231_25}, urldate = {2022-07-27}, - abstract = {Security protocols employing cryptographic primitives with algebraic properties are conveniently modeled using Horn clauses modulo equational theories. We consider clauses corresponding to the class H3 of Nielson, Nielson and Seidl. We show that modulo the theory ACU of an associative-commutative symbol with unit, as well as its variants like the theory XOR and the theory AG of Abelian groups, unsatisfiability is NP-complete. Also membership and intersection-non-emptiness problems for the closely related class of one-way as well as two-way tree automata modulo these equational theories are NP-complete. A key technical tool is a linear time construction of an existential Presburger formula corresponding to the Parikh image of a context-free language. Our algorithms require deterministic polynomial time using an oracle for existential Presburger formulas, suggesting efficient implementations are possible.}, isbn = {978-3-540-28005-7 978-3-540-31864-4}, langid = {english}, - file = {C\:\\Users\\Simon\\Zotero\\storage\\MKP6NGYE\\Verma et al. 
- 2005 - On the Complexity of Equational Horn Clauses.pdf}
+  file = {/home/sguillou/Zotero/storage/MKP6NGYE/Verma et al. - 2005 - On the Complexity of Equational Horn Clauses.pdf}
 }
 
 @inproceedings{volkerHOL2PSystemClassical2007,
@@ -3469,11 +3606,10 @@ @inproceedings{volkerHOL2PSystemClassical2007
   publisher = {{Springer}},
   address = {{Berlin, Heidelberg}},
   doi = {10.1007/978-3-540-74591-4_25},
-  abstract = {This paper introduces the logical system HOL2P that extends classical higher order logic (HOL) with type operator variables and universal types. HOL2P has explicit term operations for type abstraction and type application. The formation of type application terms t [T] is restricted to small types T that do not contain any universal types. This constraint ensures the existence of a set-theoretic model and thus consistency.The expressiveness of HOL2P allows category-theoretic concepts such as natural transformations and initial algebras to be applied at the level of polymorphic HOL functions. The parameterisation of terms with type operators adds genericity to theorems. Type variable quantification can also be expressed.A prototype of HOL2P has been implemented on top of HOL-Light. Type inference is semi-automatic, and some type annotations are necessary. Reasoning is supported by appropriate tactics. The implementation has been used to check some sample derivations.},
   isbn = {978-3-540-74591-4},
   langid = {english},
   keywords = {High Order Logic,Inference Rule,Natural Transformation,Type Inference,Type Variable},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\H55FS4BD\\Völker - 2007 - HOL2P - A System of Classical Higher Order Logic w.pdf}
+  file = {/home/sguillou/Zotero/storage/H55FS4BD/Völker - 2007 - HOL2P - A System of Classical Higher Order Logic w.pdf}
 }
 
 @inproceedings{vukmirovicMakingHigherOrderSuperposition2021,
@@ -3487,22 +3623,20 @@ @inproceedings{vukmirovicMakingHigherOrderSuperposition2021
   publisher = {{Springer International Publishing}},
   address = {{Cham}},
   doi = {10.1007/978-3-030-79876-5_24},
-  abstract = {Superposition is among the most successful calculi for first-order logic. Its extension to higher-order logic introduces new challenges such as infinitely branching inference rules, new possibilities such as reasoning about formulas, and the need to curb the explosion of specific higher-order rules. We describe techniques that address these issues and extensively evaluate their implementation in the Zipperposition theorem prover. Largely thanks to their use, Zipperposition won the higher-order division of the CASC-J10 competition.},
   isbn = {978-3-030-79876-5},
   langid = {english},
   keywords = {higher-order,superposition,theorem proving},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\4LIDRPPL\\Vukmirović et al. - 2021 - Making Higher-Order Superposition Work.pdf}
+  file = {/home/sguillou/Zotero/storage/4LIDRPPL/Vukmirović et al. - 2021 - Making Higher-Order Superposition Work.pdf}
 }
 
-@article{wadlerHowMakeAdHoc1997a,
+@article{wadlerHowMakeAdHoc1997,
   title = {How to {{Make Ad-Hoc Polymorphism Less Ad Hoc}}},
   author = {Wadler, Philip and Blott, Stephen},
   year = {1997},
   month = aug,
   journal = {[No source information available]},
   doi = {10.1145/75277.75283},
-  abstract = {This paper presents type classes, a new approach to ad-hoc polymorphism. Type classes permit overloading of arithmetic operators such as multiplication, and generalise the "eqtype variables" of Standard ML. Type classes extend the Hindley/Milner polymorphic type system, and provide a new approach to issues that arise in object-oriented programming, bounded type quantification, and abstract data types. This paper provides an informal introduction to type classes, and defines them formally by means of type inference rules. 1 Introduction Strachey chose the adjectives ad-hoc and parametric to distinguish two varieties of polymorphism [Str67]. Ad-hoc polymorphism occurs when a function is defined over several different types, acting in a different way for each type. A typical example is overloaded multiplication: the same symbol may be used to denote multiplication of integers (as in 3*3) and multiplication of floating point values (as in 3.14*3.14). Parametric polymorphism occurs wh...},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\LSB36SN2\\Wadler et Blott - 1997 - How to Make Ad-Hoc Polymorphism Less Ad Hoc.pdf}
+  file = {/home/sguillou/Zotero/storage/LSB36SN2/Wadler et Blott - 1997 - How to Make Ad-Hoc Polymorphism Less Ad Hoc.pdf}
 }
 
 @inproceedings{weidenbachSPASSVersion2009,
@@ -3516,11 +3650,10 @@ @inproceedings{weidenbachSPASSVersion2009
   publisher = {{Springer}},
   address = {{Berlin, Heidelberg}},
   doi = {10.1007/978-3-642-02959-2_10},
-  abstract = {SPASS is an automated theorem prover for full first-order logic with equality and a number of non-classical logics. This system description provides an overview of our recent developments in SPASS~3.5 including subterm contextual rewriting, improved split backtracking, a significantly faster FLOTTER implementation with additional control flags, completely symmetric implementation of forward and backward redundancy criteria, faster parsing with improved support for big files, faster and extended sort module, and support for include commands in input files. Finally, SPASS~3.5 can now parse files in TPTP syntax, comes with a new converter tptp2dfg and is distributed under a BSD style license.},
   isbn = {978-3-642-02959-2},
   langid = {english},
   keywords = {Automate Theorem Prover,Empty Clause,Reduction Rule,Small Clause,Soft Typing},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\XN8RCM5N\\Weidenbach et al. - 2009 - SPASS Version 3.5.pdf}
+  file = {/home/sguillou/Zotero/storage/XN8RCM5N/Weidenbach et al. - 2009 - SPASS Version 3.5.pdf}
 }
 
 @inproceedings{wenzelIsabelleFramework2008,
@@ -3534,11 +3667,10 @@ @inproceedings{wenzelIsabelleFramework2008
   publisher = {{Springer}},
   address = {{Berlin, Heidelberg}},
   doi = {10.1007/978-3-540-71067-7_7},
-  abstract = {Isabelle, which is available from http://isabelle.in.tum.de , is a generic framework for interactive theorem proving. The Isabelle/Pure meta-logic allows the formalization of the syntax and inference rules of a broad range of object-logics following the general idea of natural deduction [32,33]. The logical core is implemented according to the well-known ``LCF approach'' of secure inferences as abstract datatype constructors in ML [16]; explicit proof terms are also available [8]. Isabelle/Isar provides sophisticated extra-logical infrastructure supporting structured proofs and specifications, including concepts for modular theory development. Isabelle/HOL is a large application within the generic framework, with plenty of logic-specific add-on tools and a large theory library. Other notable object-logics are Isabelle/ZF (Zermelo-Fraenkel set-theory, see [34,36] and Isabelle/HOLCF [26] (Scott's domain theory within HOL). Users can build further formal-methods tools on top, e.g. see [53].},
   isbn = {978-3-540-71067-7},
   langid = {english},
   keywords = {Automate Reasoning,High Order Logic,Natural Deduction,Prime Number Theorem,Structure Proof},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\5MESV7FU\\Wenzel et al. - 2008 - The Isabelle Framework.pdf}
+  file = {/home/sguillou/Zotero/storage/5MESV7FU/Wenzel et al. - 2008 - The Isabelle Framework.pdf}
 }
 
 @incollection{wernerSetsTypesTypes1997,
@@ -3556,7 +3688,7 @@ @incollection{wernerSetsTypesTypes1997
   isbn = {978-3-540-63388-4 978-3-540-69530-1},
   langid = {english},
   keywords = {Print,Read},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\YR23DNHK\\Werner - 1997 - Sets in types, types in sets.pdf}
+  file = {/home/sguillou/Zotero/storage/YR23DNHK/Werner - 1997 - Sets in types, types in sets.pdf}
 }
 
 @article{whitmanFreeLattices1941a,
@@ -3573,10 +3705,10 @@ @article{whitmanFreeLattices1941a
   issn = {0003-486X},
   doi = {10.2307/1969001},
   urldate = {2022-05-05},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\ZM8YYR37\\Whitman - 1941 - Free Lattices.pdf}
+  file = {/home/sguillou/Zotero/storage/ZM8YYR37/Whitman - 1941 - Free Lattices.pdf}
 }
 
-@article{whitmanFreeLatticesII1942a,
+@article{whitmanFreeLatticesII1942,
   title = {Free {{Lattices II}}},
   author = {Whitman, Philip M.},
   year = {1942},
@@ -3590,7 +3722,7 @@ @article{whitmanFreeLatticesII1942
   issn = {0003-486X},
   doi = {10.2307/1968883},
   urldate = {2023-03-14},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\4I39WNEF\\Whitman - 1942 - Free Lattices II.pdf}
+  file = {/home/sguillou/Zotero/storage/4I39WNEF/Whitman - 1942 - Free Lattices II.pdf}
 }
 
 @inproceedings{wintersteigerConcurrentPortfolioApproach2009,
@@ -3605,7 +3737,6 @@ @inproceedings{wintersteigerConcurrentPortfolioApproach2009
   address = {{Berlin, Heidelberg}},
   doi = {10.1007/978-3-642-02658-4_60},
   urldate = {2021-09-29},
-  abstract = {With the availability of multi-core processors and large-scale computing clusters, the study of parallel algorithms has been revived throughout the industry. We present a portfolio approach to deciding the satisfiability of SMT formulas, based on the recent success of related algorithms for the SAT problem. Our parallel version of Z3 outperforms the sequential solver, with speedups of well over an order of magnitude on many benchmarks.},
   isbn = {978-3-642-02657-7}
 }
 
@@ -3621,9 +3752,8 @@ @article{wirthHilbertEpsilonOperator2008
   issn = {15708683},
   doi = {10.1016/j.jal.2007.07.009},
   urldate = {2021-07-08},
-  abstract = {Hilbert and Bernays avoided overspecification of Hilbert's {$\epsilon$}-operator. They axiomatized only what was relevant for their prooftheoretic investigations. Semantically, this left the {$\epsilon$}-operator underspecified. After briefly reviewing the literature on semantics of Hilbert's epsilon operator, we propose a new semantics with the following features: We avoid overspecification (such as rightuniqueness), but admit indefinite choice, committed choice, and classical logics. Moreover, our semantics for the {$\epsilon$} simplifies proof search and is natural in the sense that it mirrors some cases of referential interpretation of indefinite articles in natural language.},
   langid = {english},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\8GWJVHVF\\Robert L. Vaught - Set Theory_ An Introduction, Second Edition (2001).djvu;C\:\\Users\\Simon\\Zotero\\storage\\E7NKBDAV\\Wirth - 2008 - Hilbert's epsilon as an operator of indefinite com.pdf}
+  file = {/home/sguillou/Zotero/storage/8GWJVHVF/Robert L. Vaught - Set Theory_ An Introduction, Second Edition (2001).djvu;/home/sguillou/Zotero/storage/E7NKBDAV/Wirth - 2008 - Hilbert's epsilon as an operator of indefinite com.pdf}
 }
 
 @article{yamasakiSatisfiabiltyProblemClass1983,
@@ -3638,9 +3768,8 @@ @article{yamasakiSatisfiabiltyProblemClass1983
   issn = {0019-9958},
   doi = {10.1016/S0019-9958(83)80027-8},
   urldate = {2023-06-13},
-  abstract = {In this paper, the satisfiability problem for a class of proportional sentences is considered. Here a sentence is a set of clauses. A clause is a set of literals. First, it is proposed that a class S0 of propositional sentences which properly includes the class of propositional Horn sentences. A sentence \{C1,\ldots, Cn\} is in S0 if there are sets P1,\ldots,Pn of positive literals such that (1) P1 {$\supset$} P2 {$\supset$} \ldots{} {$\supset$} Pn, (2) PiCi for 1 {$\leqslant$} i {$\leqslant$} n, and (3) Ci - Pi is a Horn clause for 1 {$\leqslant$} i {$\leqslant$} n. Then it is proposed that a new inference rule, based on the resolution principle, by which (un)satisfiability for S0 in polynomial time can be decided.},
   langid = {english},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\48VKE9BF\\Yamasaki et Doshita - 1983 - The satisfiabilty problem for a class consisting o.pdf;C\:\\Users\\Simon\\Zotero\\storage\\QQR4HD4T\\S0019995883800278.html}
+  file = {/home/sguillou/Zotero/storage/48VKE9BF/Yamasaki et Doshita - 1983 - The satisfiabilty problem for a class consisting o.pdf;/home/sguillou/Zotero/storage/QQR4HD4T/S0019995883800278.html}
 }
 
 @incollection{yessenovCollectionsCardinalitiesRelations2010,
@@ -3655,13 +3784,27 @@ @incollection{yessenovCollectionsCardinalitiesRelations2010
   address = {{Berlin, Heidelberg}},
   doi = {10.1007/978-3-642-11319-2_27},
   urldate = {2020-09-17},
-  abstract = {Logics that involve collections (sets, multisets), and cardinality constraints are useful for reasoning about unbounded data structures and concurrent processes. To make such logics more useful in verification this paper extends them with the ability to compute direct and inverse relation and function images. We establish decidability and complexity bounds for the extended logics.},
   isbn = {978-3-642-11318-5 978-3-642-11319-2},
   langid = {english},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\QTX6AJYT\\Yessenov et al. - 2010 - Collections, Cardinalities, and Relations.pdf}
+  file = {/home/sguillou/Zotero/storage/QTX6AJYT/Yessenov et al. - 2010 - Collections, Cardinalities, and Relations.pdf}
 }
 
-@inproceedings{zhanFormalizationFundamentalGroup2017,
+@incollection{zhanFormalizationFundamentalGroup2017,
+  title = {Formalization of the fundamental group in untyped set theory using auto2},
+  author = {Zhan, Bohua},
+  year = {2017},
+  volume = {10499},
+  eprint = {1707.04757},
+  primaryclass = {cs},
+  pages = {514--530},
+  doi = {10.1007/978-3-319-66107-0_32},
+  urldate = {2023-05-16},
+  archiveprefix = {arxiv},
+  keywords = {Computer Science - Logic in Computer Science},
+  file = {/home/sguillou/Zotero/storage/54Z695WF/Zhan - 2017 - Formalization of the fundamental group in untyped .pdf;/home/sguillou/Zotero/storage/AR8YYM3Z/1707.html}
+}
+
+@inproceedings{zhanFormalizationFundamentalGroup2017a,
   title = {Formalization of the {{Fundamental Group}} in {{Untyped Set Theory Using Auto2}}},
   booktitle = {Interactive {{Theorem Proving}}},
   author = {Zhan, Bohua},
   year = {2017},
@@ -3672,26 +3815,9 @@ @inproceedings{zhanFormalizationFundamentalGroup2017
   publisher = {{Springer International Publishing}},
   address = {{Cham}},
   doi = {10.1007/978-3-319-66107-0_32},
-  abstract = {We present a new framework for formalizing mathematics in untyped set theory using auto2. Using this framework, we formalize in Isabelle/FOL the entire chain of development from the axioms of set theory to the definition of the fundamental group for an arbitrary topological space. The auto2 prover is used as the sole automation tool, and enables succinct proof scripts throughout the project.},
   isbn = {978-3-319-66107-0},
   langid = {english},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\YJSGLYUK\\Zhan - 2017 - Formalization of the Fundamental Group in Untyped .pdf}
-}
-
-@incollection{zhanFormalizationFundamentalGroup2017a,
-  title = {Formalization of the fundamental group in untyped set theory using auto2},
-  author = {Zhan, Bohua},
-  year = {2017},
-  volume = {10499},
-  eprint = {1707.04757},
-  primaryclass = {cs},
-  pages = {514--530},
-  doi = {10.1007/978-3-319-66107-0_32},
-  urldate = {2023-05-16},
-  abstract = {We present a new framework for formalizing mathematics in untyped set theory using auto2. Using this framework, we formalize in Isabelle/FOL the entire chain of development from the axioms of set theory to the definition of the fundamental group for an arbitrary topological space. The auto2 prover is used as the sole automation tool, and enables succinct proof scripts throughout the project.},
-  archiveprefix = {arxiv},
-  keywords = {Computer Science - Logic in Computer Science},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\54Z695WF\\Zhan - 2017 - Formalization of the fundamental group in untyped .pdf;C\:\\Users\\Simon\\Zotero\\storage\\AR8YYM3Z\\1707.html}
+  file = {/home/sguillou/Zotero/storage/YJSGLYUK/Zhan - 2017 - Formalization of the Fundamental Group in Untyped .pdf}
 }
 
 @article{zhangBetterTogetherUnifying2023,
@@ -3706,7 +3832,6 @@ @article{zhangBetterTogetherUnifying2023
   pages = {125:468--125:492},
   doi = {10.1145/3591239},
   urldate = {2023-06-29},
-  abstract = {We present egglog, a fixpoint reasoning system that unifies Datalog and equality saturation (EqSat). Like Datalog, egglog supports efficient incremental execution, cooperating analyses, and lattice-based reasoning. Like EqSat, egglog supports term rewriting, efficient congruence closure, and extraction of optimized terms. We identify two recent applications -- a unification-based pointer analysis in Datalog and an EqSat-based floating-point term rewriter -- that have been hampered by features missing from Datalog but found in EqSat or vice-versa. We evaluate our system by reimplementing those projects in egglog. The resulting systems in egglog are faster, simpler, and fix bugs found in the original systems.},
   keywords = {Datalog,Equality saturation,Program optimization,Rewrite systems},
-  file = {C\:\\Users\\Simon\\Zotero\\storage\\RUDPMWFN\\Zhang et al. - 2023 - Better Together Unifying Datalog and Equality Sat.pdf}
+  file = {/home/sguillou/Zotero/storage/RUDPMWFN/Zhang et al. - 2023 - Better Together Unifying Datalog and Equality Sat.pdf}
 }
diff --git a/Reference Manual/theorytopics.tex b/Reference Manual/theorytopics.tex
index e5e206b5..e067a156 100644
--- a/Reference Manual/theorytopics.tex
+++ b/Reference Manual/theorytopics.tex
@@ -4,7 +4,7 @@ \chapter{Selected Theoretical Topics}
 \section{Set Theory and Mathematical Logic}
 \subsection{First Order Logic with Schematic Variables}
 \label{sec:theoryfol}
-\subsection{Extension by Definition}
+\subsection{Extensions by Definition}
 \label{sec:definitions}
 An extension by definition is the formal way of introducing new symbols in a mathematical theory.
diff --git a/build.sbt b/build.sbt
index 544c64c7..4040147a 100644
--- a/build.sbt
+++ b/build.sbt
@@ -19,7 +19,7 @@ inThisBuild(
 
 val commonSettings = Seq(
-  version := "0.9",
+  version := "0.6",
   crossScalaVersions := Seq("2.12.13", "2.13.4", "3.0.1", "3.2.0"),
   organization := "ch.epfl.lara",
   scalacOptions ++= Seq("-Ximport-suggestion-timeout", "0")
diff --git a/lisa-examples/src/main/scala/Example.scala b/lisa-examples/src/main/scala/Example.scala
index a5ac9614..63c3d2b3 100644
--- a/lisa-examples/src/main/scala/Example.scala
+++ b/lisa-examples/src/main/scala/Example.scala
@@ -1,4 +1,3 @@
-import lisa.automation.kernel.OLPropositionalSolver.*
 import lisa.prooflib.Substitution.{ApplyRules as Substitute}
 
 object Example extends lisa.Main {
@@ -11,13 +10,13 @@ object Example extends lisa.Main {
   // Simple proof with LISA's DSL
   val fixedPointDoubleApplication = Theorem(∀(x, P(x) ==> P(f(x))) |- P(x) ==> P(f(f(x)))) {
     assume(∀(x, P(x) ==> P(f(x))))
-    assume(P(x))
     val step1 = have(P(x) ==> P(f(x))) by InstantiateForall
     val step2 = have(P(f(x)) ==> P(f(f(x)))) by InstantiateForall
     have(thesis) by Tautology.from(step1, step2)
-  }
+  }
+
 
-  // Example of set theoretic development
+  //Example of set theoretic development
 
   /**
    * Theorem --- The empty set is a subset of every set.
@@ -40,7 +39,7 @@ object Example extends lisa.Main {
   val setWithElementNonEmpty = Theorem(
     (y ∈ x) |- x =/= ∅
   ) {
-    have((x === ∅) |- !(y ∈ x)) by Substitute(x === ∅)(emptySetAxiom of (x := y))
+    have ((x === ∅) |- !(y ∈ x)) by Substitute(x === ∅)(emptySetAxiom of (x := y))
   }
 
   /**
diff --git a/project/build.properties b/project/build.properties
index f344c148..f1a9ac34 100644
--- a/project/build.properties
+++ b/project/build.properties
@@ -1 +1,2 @@
+
 sbt.version = 1.8.2
diff --git a/src/main/scala/lisa/Main.scala b/src/main/scala/lisa/Main.scala
index c13394d2..7385747a 100644
--- a/src/main/scala/lisa/Main.scala
+++ b/src/main/scala/lisa/Main.scala
@@ -9,6 +9,8 @@ import lisa.settheory.SetTheoryLibrary
 trait Main extends BasicMain {
   export SetTheoryLibrary.{powerAxiom as _, subsetAxiom as _, emptySetAxiom as _, given, _}
   export lisa.prooflib.Exports.*
+  export lisa.automation.kernel.OLPropositionalSolver.Tautology
+  export lisa.prooflib.Substitution.*
 
   /**
    * Power Set Axiom --- For a set `x`, there exists a power set of `x`, denoted