Publications on Rewriting
2020
- Decreasing Diagrams for Confluence and Commutation. Jörg Endrullis, Jan Willem Klop, and Roy Overbeek. Logical Methods in Computer Science, Volume 16, Issue 1 (2020).
Summary
Like termination, confluence is a central property of rewrite systems. Unlike for termination, however, there exists no known complexity hierarchy for confluence. In this paper we investigate whether the decreasing diagrams technique can be used to obtain such a hierarchy.
The decreasing diagrams technique is one of the strongest and most versatile methods for proving confluence of abstract rewrite systems. It is complete for countable systems, and it has many well-known confluence criteria as corollaries. So what makes decreasing diagrams so powerful? In contrast to other confluence techniques, decreasing diagrams employ a labelling of the steps with labels from a well-founded order in order to conclude confluence of the underlying unlabelled relation. Hence it is natural to ask how the size of the label set influences the strength of the technique. In particular, what class of abstract rewrite systems can be proven confluent using decreasing diagrams restricted to 1 label, 2 labels, 3 labels, and so on?
Surprisingly, we find that two labels suffice for proving confluence for every abstract rewrite system having the cofinality property, thus in particular for every confluent, countable system. Secondly, we show that this result stands in sharp contrast to the situation for commutation of rewrite relations, where the hierarchy does not collapse.
Thirdly, investigating the possibility of a confluence hierarchy, we determine the first-order (non-)definability of the notion of confluence and related properties, using techniques from finite model theory. We find that in particular Hanf's theorem is fruitful for elegant proofs of undefinability of properties of abstract rewrite systems.
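To make the objects of study concrete, here is a small sketch, not the decreasing diagrams technique itself, but a brute-force confluence check for a *finite* abstract rewrite system, represented as a successor map. All names are illustrative.

```python
# Brute-force confluence check for a small finite abstract rewrite system
# (ARS). An ARS is a set of objects with a binary step relation ->, given
# here as a dict mapping each object to its set of one-step successors.

def reachable(steps, a):
    """All objects reachable from a via ->* (reflexive-transitive closure)."""
    seen, todo = {a}, [a]
    while todo:
        x = todo.pop()
        for y in steps.get(x, ()):
            if y not in seen:
                seen.add(y)
                todo.append(y)
    return seen

def is_confluent(steps):
    """Confluent: whenever a ->* b and a ->* c, some d joins them."""
    objects = set(steps) | {y for ys in steps.values() for y in ys}
    for a in objects:
        reach_a = reachable(steps, a)
        for b in reach_a:
            for c in reach_a:
                if not (reachable(steps, b) & reachable(steps, c)):
                    return False
    return True

# A confluent diamond: b <- a -> c with b -> d <- c.
print(is_confluent({'a': {'b', 'c'}, 'b': {'d'}, 'c': {'d'}}))  # True
# A non-joinable peak: b <- a -> c with b, c distinct normal forms.
print(is_confluent({'a': {'b', 'c'}}))                          # False
```

This enumeration only works for finite systems; the point of decreasing diagrams is precisely to conclude confluence of infinite (e.g. countable) systems from a well-founded labelling of the steps.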
This paper is an extended version of Decreasing Diagrams with Two Labels Are Complete for Confluence of Countable Systems (FSCD 2018).
See research for an overview of my research on confluence.
Bibtex
@article{confluence:decreasing:diagrams:2020, title = {{Decreasing Diagrams for Confluence and Commutation}}, author = {Endrullis, J{\"{o}}rg and Klop, Jan Willem and Overbeek, Roy}, doi = {10.23638/LMCS-16(1:23)2020}, journal = {{Logical Methods in Computer Science}}, volume = {16}, number = {1}, year = {2020}, keywords = {rewriting, confluence}, type = {journal} }
Digital Object Identifier
10.23638/LMCS-16(1:23)2020
- Patch Graph Rewriting. Roy Overbeek and Jörg Endrullis. In: Proc. Conf. on Graph Transformation (ICGT 2020), pp. 128–145, Springer (2020).
Summary
The basic principle of graph rewriting is the stepwise replacement of subgraphs inside a host graph. A challenge in such replacement steps is the treatment of the patch graph, consisting of those edges of the host graph that touch the subgraph, but are not part of it.
We introduce patch graph rewriting, a visual graph rewriting language with precise formal semantics. The language has rich expressive power in two ways. First, rewrite rules can flexibly constrain the permitted shapes of patches touching matching subgraphs. Second, rules can freely transform patches. We highlight the framework’s distinguishing features by comparing it against existing approaches.
Bibtex
@inproceedings{graph:rewriting:patch:2020, author = {Overbeek, Roy and Endrullis, J{\"{o}}rg}, title = {Patch Graph Rewriting}, booktitle = {Proc.\ Conf.\ on Graph Transformation (ICGT~2020)}, series = {LNCS}, volume = {12150}, pages = {128--145}, publisher = {Springer}, year = {2020}, doi = {10.1007/978-3-030-51372-6\_8}, keywords = {rewriting}, type = {conference} }
Digital Object Identifier
10.1007/978-3-030-51372-6_8
2019
- Confluence of the Chinese Monoid. Jörg Endrullis and Jan Willem Klop. In: The Art of Modelling Computational Systems, pp. 206–220, Springer (2019).
Summary
The Chinese monoid, related to Knuth’s Plactic monoid, is of great interest in algebraic combinatorics. Both are ternary monoids, generated by relations between words of three symbols. For a totally ordered alphabet, the relations are cba = cab = bca whenever a ≤ b ≤ c. In this note we establish confluence by tiling for the Chinese monoid, with the consequence that every two words u, v have extensions to a common word: ∀ u,v. ∃ x,y. ux = vy.
Our proof is given using decreasing diagrams, a method for obtaining confluence that is central in abstract rewriting theory. Decreasing diagrams may also be applicable to various related monoid presentations.
We conclude with some open questions for the monoids considered.
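The defining relations can be checked mechanically on small words. The sketch below (our own illustration, with names chosen freely) computes the congruence class of a word by exhaustively rewriting length-3 factors in both directions, and confirms that for a ≤ b ≤ c the factors cba, cab and bca are identified.

```python
# Congruence classes of short words in the Chinese monoid, computed by
# closing under the relations cba = cab = bca (for a <= b <= c) applied
# to every length-3 factor, in both directions.

def variants(f):
    """Length-3 factors equivalent to f under the Chinese relations."""
    a, b, c = sorted(f)                        # a <= b <= c
    group = {c + b + a, c + a + b, b + c + a}  # cba, cab, bca
    return group if f in group else {f}

def congruence_class(word):
    seen, todo = {word}, [word]
    while todo:
        w = todo.pop()
        for i in range(len(w) - 2):
            for g in variants(w[i:i + 3]):
                v = w[:i] + g + w[i + 3:]
                if v not in seen:
                    seen.add(v)
                    todo.append(v)
    return seen

# With a=1, b=2, c=3: cba=321, cab=312, bca=231 are all identified.
print(congruence_class("321") == {"321", "312", "231"})  # True
```

This only explores words of a fixed length; the common-right-multiple property ∀u,v. ∃x,y. ux = vy is the paper's theorem and is not checked by this enumeration.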
See research for an overview of my research on confluence.
Bibtex
@inproceedings{confluence:chinese:monoid:2019, author = {Endrullis, J{\"{o}}rg and Klop, Jan Willem}, title = {{Confluence of the Chinese Monoid}}, booktitle = {The Art of Modelling Computational Systems}, series = {LNCS}, volume = {11760}, pages = {206--220}, publisher = {Springer}, year = {2019}, doi = {10.1007/978-3-030-31175-9\_12}, keywords = {rewriting,confluence}, type = {conference} }
Digital Object Identifier
10.1007/978-3-030-31175-9_12
- Braids via Term Rewriting. Jörg Endrullis and Jan Willem Klop. Theoretical Computer Science, 777, pp. 260–295 (2019).
Summary
We present a brief introduction to braids, in particular simple positive braids, with a double emphasis: first, we focus on term rewriting techniques, in particular, reduction diagrams and decreasing diagrams. The second focus is our employment of the colored braid notation next to the more familiar Artin notation. Whereas the latter is a relative, position dependent, notation, the former is an absolute notation that seems more suitable for term rewriting techniques such as symbol tracing. Artin's equations translate in this notation to simple word inversions. With these points of departure we treat several basic properties of positive braids, in particular related to the word problem, confluence property, projection equivalence, and the congruence property. In our introduction the beautiful diamond known as the permutohedron plays a decisive role.
Bibtex
@article{rewriting:braids:2019, author = {Endrullis, J{\"{o}}rg and Klop, Jan Willem}, title = {Braids via term rewriting}, journal = {Theoretical Computer Science}, volume = {777}, pages = {260--295}, year = {2019}, doi = {10.1016/j.tcs.2018.12.006}, keywords = {rewriting,confluence}, type = {journal} }
Digital Object Identifier
10.1016/j.tcs.2018.12.006
2018
- Decreasing Diagrams with Two Labels Are Complete for Confluence of Countable Systems. Jörg Endrullis, Jan Willem Klop, and Roy Overbeek. In: Proc. Conf. on Formal Structures for Computation and Deduction (FSCD 2018), pp. 14:1–14:15, Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik (2018).
Summary
The decreasing diagrams technique is one of the strongest and most versatile methods for proving confluence of abstract reduction systems. So what makes decreasing diagrams so powerful? In contrast to other confluence techniques, decreasing diagrams employ a labelling of the steps.
In this paper we investigate how the size of the label set influences the strength of the technique. Surprisingly, we find that two labels suffice for proving confluence of every confluent, countable system. In contrast, for proving commutation of rewrite relations, it turns out that the strength of the technique increases with more labels.
While decreasing diagrams is complete for proving confluence of countable systems, the technique is not complete for commutation. In our paper De Bruijn's Weak Diamond Property Revisited we give a counterexample to the completeness of decreasing diagrams for commutation.
See research for an overview of my research on confluence.
Bibtex
@inproceedings{confluence:decreasing:diagrams:2018, author = {Endrullis, J{\"{o}}rg and Klop, Jan Willem and Overbeek, Roy}, title = {{Decreasing Diagrams with Two Labels Are Complete for Confluence of Countable Systems}}, booktitle = {Proc.\ Conf.\ on Formal Structures for Computation and Deduction (FSCD~2018)}, volume = {108}, pages = {14:1--14:15}, publisher = {Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik}, series = {LIPIcs}, year = {2018}, doi = {10.4230/LIPIcs.FSCD.2018.14}, keywords = {rewriting, confluence}, type = {conference} }
Digital Object Identifier
10.4230/LIPIcs.FSCD.2018.14
- Coinductive Foundations of Infinitary Rewriting and Infinitary Equational Logic. Jörg Endrullis, Helle Hvid Hansen, Dimitri Hendriks, Andrew Polonsky, and Alexandra Silva. Logical Methods in Computer Science, 14 (1) (2018).
Summary
We present a coinductive framework for defining and reasoning about the infinitary analogues of equational logic and term rewriting in a uniform, coinductive way.
The framework lends itself to an elegant and concise definition of the infinitary rewrite relation \( \to^\infty \) in terms of the single step relation \( \to \): \[ {\to^\infty} \,=\, \mu R. \nu S. ( \to \cup \mathrel{\overline{R}} )^* \mathrel{;} \overline{S} \] Here \( \mu \) and \( \nu \) are the least and greatest fixed-point operators, respectively, and \[ \overline{R} \,=\, \{\, (\, f(s_1,\ldots,s_n),\, \,f(t_1,\ldots,t_n) \,) \mid f \in \Sigma,\, s_1\! \mathrel{R} t_1,\ldots,s_n\! \mathrel{R} t_n \,\} \cup \text{Id} \] The setup captures rewrite sequences of arbitrary ordinal length, but it has neither the need for ordinals nor for metric convergence. This makes the framework suitable for formalizations in theorem provers. To wit, we provide the first formalization of the compression lemma in Coq.
This paper is an extended version of A Coinductive Framework for Infinitary Rewriting and Equational Reasoning (RTA 2015). We build on ideas in Infinitary Rewriting Coinductively (TYPES 2012) giving a coinductive perspective on infinitary lambda calculus. We extend these ideas to rewrite sequences beyond length omega by mixing induction and coinduction (least and greatest fixed-points).
Bibtex
@article{infintary:rewriting:coinductive:2018, author = {Endrullis, J\"{o}rg and Hansen, Helle Hvid and Hendriks, Dimitri and Polonsky, Andrew and Silva, Alexandra}, title = {{Coinductive Foundations of Infinitary Rewriting and Infinitary Equational Logic}}, journal = {Logical Methods in Computer Science}, volume = {14}, number = {1}, year = {2018}, doi = {10.23638/LMCS-14(1:3)2018}, keywords = {rewriting, infinitary rewriting, coinduction}, type = {journal} }
Digital Object Identifier
10.23638/LMCS-14(1:3)2018
2017
- Clocked Lambda Calculus. Jörg Endrullis, Dimitri Hendriks, Jan Willem Klop, and Andrew Polonsky. Mathematical Structures in Computer Science, 27 (5), pp. 782–806 (2017).
Summary
We introduce the clocked lambda calculus, an extension of the classical lambda calculus with a unary symbol \( \tau \) that serves as a witness of \( \beta \)-steps. This calculus consists of the following two rules: \[ \begin{aligned} (\lambda x.M) N &\to \tau( M[x = N] ) \\ \tau(M)N &\to \tau(MN) \end{aligned} \] The clocked lambda-calculus is infinitary strongly normalizing, infinitary confluent, and the unique infinitary normal forms constitute enriched Böhm trees (more precisely, Lévy-Longo trees), which we call clocked Böhm trees. We show that clocked Böhm trees are suitable for discriminating a rich class of lambda terms having the same Böhm trees.
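The two rules can be animated on a toy term representation. The sketch below is our own illustration: terms are tuples, substitution is naive (no alpha-renaming, so it is only safe on examples with distinct bound-variable names), and reduction is weak (no reduction under lambda).

```python
# The two clocked rules on a tuple encoding of lambda terms:
#   ('var', x), ('lam', x, M), ('app', M, N), ('tau', M).

def subst(t, x, s):
    tag = t[0]
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, s))
    if tag == 'app':
        return ('app', subst(t[1], x, s), subst(t[2], x, s))
    return ('tau', subst(t[1], x, s))

def step(t):
    """One outermost clocked step (weak: no reduction under lambda)."""
    if t[0] == 'tau':
        r = step(t[1])
        return None if r is None else ('tau', r)
    if t[0] != 'app':
        return None
    f, a = t[1], t[2]
    if f[0] == 'lam':                    # (lambda x. M) N -> tau(M[x := N])
        return ('tau', subst(f[2], f[1], a))
    if f[0] == 'tau':                    # tau(M) N -> tau(M N)
        return ('tau', ('app', f[1], a))
    r = step(f)
    return None if r is None else ('app', r, a)

def normalize(t):
    while (r := step(t)) is not None:
        t = r
    return t

def taus(t):
    """Count tau symbols: each witnesses one beta step of the history."""
    return (t[0] == 'tau') + sum(taus(s) for s in t[1:] if isinstance(s, tuple))

I = ('lam', 'x', ('var', 'x'))
K = ('lam', 'y', ('var', 'y'))           # a second identity, distinct names
nf = normalize(('app', ('app', K, I), ('var', 'z')))
print(nf)         # ('tau', ('tau', ('var', 'z')))
print(taus(nf))   # 2: the clock records the two beta steps taken
```

The tau count is exactly the kind of "tempo" information that clocked Böhm trees retain and ordinary Böhm trees discard.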
See research for an overview of my research on the clocked lambda calculus and fixed-point combinators.
Bibtex
@article{clocked:lambda:calculus:2017, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri and Klop, Jan Willem and Polonsky, Andrew}, title = {{Clocked Lambda Calculus}}, journal = {Mathematical Structures in Computer Science}, volume = {27}, number = {5}, pages = {782--806}, year = {2017}, doi = {10.1017/S0960129515000389}, keywords = {rewriting, infinitary rewriting, lambda calculus}, type = {journal} }
Digital Object Identifier
10.1017/S0960129515000389
2015
- Proving Non-Termination by Finite Automata. Jörg Endrullis and Hans Zantema. In: Proc. Conf. on Rewriting Techniques and Applications (RTA 2015), pp. 160–176, Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik (2015).
Summary
We present a new technique to prove non-termination of term rewriting. The basic idea is to find a non-empty regular language of terms that is weakly closed under rewriting and does not contain normal forms. It is automated by representing the language by a tree automaton with a fixed number of states, and expressing the mentioned requirements in a SAT formula. Satisfiability of this formula implies non-termination. Our approach succeeds for many examples where all earlier techniques fail, for instance for the S-rule from combinatory logic.
See research for an overview of my research on termination.
Bibtex
@inproceedings{termination:automata:2015, author = {Endrullis, J\"{o}rg and Zantema, Hans}, title = {{Proving Non-termination by Finite Automata}}, booktitle = {Proc.\ Conf.\ on Rewriting Techniques and Applications (RTA~2015)}, volume = {36}, pages = {160--176}, publisher = {Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik}, series = {LIPIcs}, year = {2015}, doi = {10.4230/LIPIcs.RTA.2015.160}, keywords = {rewriting, termination, automata}, type = {conference} }
Digital Object Identifier
10.4230/LIPIcs.RTA.2015.160
- A Coinductive Framework for Infinitary Rewriting and Equational Reasoning. Jörg Endrullis, Helle Hvid Hansen, Dimitri Hendriks, Andrew Polonsky, and Alexandra Silva. In: Proc. Conf. on Rewriting Techniques and Applications (RTA 2015), pp. 143–159, Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik (2015).
Summary
We present a coinductive framework for defining and reasoning about the infinitary analogues of equational logic and term rewriting in a uniform, coinductive way.
The framework lends itself to an elegant and concise definition of the infinitary rewrite relation \( \to^\infty \) in terms of the single step relation \( \to \): \[ {\to^\infty} \,=\, \mu R. \nu S. ( \to \cup \mathrel{\overline{R}} )^* \mathrel{;} \overline{S} \] Here \( \mu \) and \( \nu \) are the least and greatest fixed-point operators, respectively, and \[ \overline{R} \,=\, \{\, (\, f(s_1,\ldots,s_n),\, \,f(t_1,\ldots,t_n) \,) \mid f \in \Sigma,\, s_1\! \mathrel{R} t_1,\ldots,s_n\! \mathrel{R} t_n \,\} \cup \text{Id} \] The setup captures rewrite sequences of arbitrary ordinal length, but it has neither the need for ordinals nor for metric convergence. This makes the framework suitable for formalizations in theorem provers.
We refer to Coinductive Foundations of Infinitary Rewriting and Infinitary Equational Logic (LMCS 2018) for an extended journal version of this paper.
We build on ideas in Infinitary Rewriting Coinductively (TYPES 2012) giving a coinductive perspective on infinitary lambda calculus. We extend these ideas to rewrite sequences beyond length omega by mixing induction and coinduction (least and greatest fixed-points).
Bibtex
@inproceedings{infintary:rewriting:coinductive:2015, author = {Endrullis, J\"{o}rg and Hansen, Helle Hvid and Hendriks, Dimitri and Polonsky, Andrew and Silva, Alexandra}, title = {{A Coinductive Framework for Infinitary Rewriting and Equational Reasoning}}, booktitle = {Proc.\ Conf.\ on Rewriting Techniques and Applications (RTA 2015)}, volume = {36}, pages = {143--159}, publisher = {Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik}, series = {LIPIcs}, year = {2015}, doi = {10.4230/LIPIcs.RTA.2015.143}, keywords = {rewriting, infinitary rewriting, coinduction}, type = {conference} }
Digital Object Identifier
10.4230/LIPIcs.RTA.2015.143
2014
- Discriminating Lambda-Terms Using Clocked Boehm Trees. Jörg Endrullis, Dimitri Hendriks, Jan Willem Klop, and Andrew Polonsky. Logical Methods in Computer Science, 10 (2) (2014).
Summary
As observed by Intrigila, there are hardly any techniques available in the lambda calculus to prove that two lambda terms are not \( \beta \)-convertible. Techniques employing the usual Böhm trees are inadequate when we deal with terms having the same Böhm tree. This is the case in particular for fixed-point combinators, as they all have the same Böhm tree.
Another interesting equation, whose consideration was suggested by Scott, is BY = BYS, an equation valid in the classical model \( P \omega \) of lambda calculus, and hence valid with respect to Böhm tree-equality, but nevertheless the terms are \( \beta \)-inconvertible.
To prove such \( \beta \)-inconvertibilities, we refine the concept of Böhm trees: we introduce clocked Böhm trees, with annotations that convey information about the tempo in which the Böhm trees are produced. Böhm trees are thus enriched with an intrinsic clock behaviour, leading to a refined discrimination method for lambda terms. An analogous approach pertains to Lévy-Longo trees and Berarducci trees.
We illustrate the applicability of our refined Böhm trees with the following examples:
- We show how to \( \beta \)-discriminate a large number of fixed-point combinators.
- We answer a question of Gordon Plotkin: is there a fixed-point combinator \( Y \) such that \[ Y ( \lambda z. f zz ) =_\beta Y ( \lambda x. Y ( \lambda y. f xy )) \,? \]
This paper is an extended version of Modular Construction of Fixed Point Combinators and Clocked Böhm Trees (LICS 2010).
See research for an overview of my research on the clocked lambda calculus and fixed-point combinators.
Bibtex
@article{lambda:clocks:2014, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri and Klop, Jan Willem and Polonsky, Andrew}, title = {{Discriminating Lambda-Terms Using Clocked Boehm Trees}}, journal = {Logical Methods in Computer Science}, volume = {10}, number = {2}, year = {2014}, doi = {10.2168/LMCS-10(2:4)2014}, keywords = {rewriting, infinitary rewriting, lambda calculus}, type = {journal} }
Digital Object Identifier
10.2168/LMCS-10(2:4)2014
- Infinitary Term Rewriting for Weakly Orthogonal Systems: Properties and Counterexamples. Jörg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Jan Willem Klop, and Vincent van Oostrom. Logical Methods in Computer Science, 10 (2:7), pp. 1–33 (2014).
Bibtex
@article{infinitary:weakly:orthogonal:2014, author = {Endrullis, J\"{o}rg and Grabmayer, Clemens and Hendriks, Dimitri and Klop, Jan Willem and van~Oostrom, Vincent}, title = {{Infinitary Term Rewriting for Weakly Orthogonal Systems: Properties and Counterexamples}}, journal = {Logical Methods in Computer Science}, volume = {10}, number = {2:7}, pages = {1--33}, year = {2014}, doi = {10.2168/LMCS-10(2:7)2014}, keywords = {rewriting, infinitary rewriting, lambda calculus}, type = {journal} }
Digital Object Identifier
10.2168/LMCS-10(2:7)2014
- On the Complexity of Stream Equality. Jörg Endrullis, Dimitri Hendriks, Rena Bakhshi, and Grigore Rosu. Journal of Functional Programming, 24 (2-3), pp. 166–217 (2014).
Bibtex
@article{complexity:stream:equality:2014, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri and Bakhshi, Rena and Rosu, Grigore}, title = {{On the Complexity of Stream Equality}}, journal = {Journal of Functional Programming}, volume = {24}, number = {2-3}, pages = {166--217}, year = {2014}, doi = {10.1017/S0956796813000324}, keywords = {rewriting, undecidability, streams}, type = {journal} }
Digital Object Identifier
10.1017/S0956796813000324
2013
- De Bruijn’s Weak Diamond Property Revisited. Jörg Endrullis and Jan Willem Klop. Indagationes Mathematicae, 24 (4), pp. 1050–1072 (2013).
Summary
In this paper we revisit an unpublished but influential technical report from 1978 by N.G. de Bruijn, written in the framework of the Automath project. This report describes a technique for proving confluence of abstract reduction systems, called the weak diamond property. It paved the way for the powerful technique developed by Van Oostrom to prove confluence of abstract reduction systems, called decreasing diagrams.
We first revisit in detail De Bruijn’s old proof, providing a few corrections and hints for understanding. We find that this original criterion and proof technique are still worthwhile. Next, we establish that De Bruijn’s confluence criterion can be used to derive the decreasing diagrams theorem (the reverse was already known). We also provide a short proof of decreasing diagrams in the spirit of De Bruijn. We finally address the issue of completeness of this method.
See research for an overview of my research on confluence.
Bibtex
@article{confluence:weak:diamond:2013, author = {Endrullis, J\"{o}rg and Klop, Jan Willem}, title = {{De Bruijn’s Weak Diamond Property Revisited}}, journal = {Indagationes Mathematicae}, volume = {24}, number = {4}, pages = {1050--1072}, year = {2013}, doi = {10.1016/j.indag.2013.08.005}, note = {In memory of N.G. (Dick) de Bruijn (1918–2012)}, keywords = {rewriting, confluence}, type = {journal} }
Digital Object Identifier
10.1016/j.indag.2013.08.005
- SAT Compilation for Termination Proofs via Semantic Labelling. Alexander Bau, Jörg Endrullis, and Johannes Waldmann. In: Proc. Workshop on Termination (WST 2013) (2013).
Bibtex
@inproceedings{termination:sat:compilation:2014, author = {Bau, Alexander and Endrullis, J\"{o}rg and Waldmann, Johannes}, title = {{SAT compilation for Termination Proofs via Semantic Labelling}}, booktitle = {Proc.\ Workshop on Termination (WST~2013)}, year = {2013}, keywords = {rewriting, termination}, type = {workshop} }
- Clocks for Functional Programs. Jörg Endrullis, Dimitri Hendriks, Jan Willem Klop, and Andrew Polonsky. In: The Beauty of Functional Code - Essays Dedicated to Rinus Plasmeijer on the Occasion of His 61st Birthday, pp. 97–126, Springer (2013).
Summary
The contribution of this paper is twofold.
First, we derive a complete characterization of all simply-typed fixed-point combinator (fpc) generators using Barendregt's Inhabitation Machines. An fpc generator is a lambda term \( G \) such that \( Y G \) is an fpc whenever \( Y \) is. The term \( \delta = \lambda xy. y(xy) \), also known as Smullyan's Owl, is a famous fpc generator. For instance, Turing's fpc \( Y_1 \) can be obtained from Curry's fpc \( Y_0 \) by postfixing \( \delta \): \[ Y_1 = Y_0 \delta \]
Second, we present a conjecture that vastly generalises Richard Statman's question on the existence of double fixed-point combinators. Statman asked whether there is a fixed-point combinator \( Y \) such that \( Y =_\beta Y \delta \). This question remains open, as the proof by Benedetto Intrigila contains a gap. We have the following conjecture about the relation between the \( \mu \)-operator and fixed-point combinators:
Conjecture. We conjecture that for any fixed-point combinator \( Y \) and simply-typed \( \lambda\mu \)-terms \( s,t \) it holds that \[ s =_{ \beta \mu } t \iff s_Y =_{ \beta } t_Y \] where \(s_Y, t_Y\) are the untyped lambda terms obtained from \(s,t\), respectively, by replacing all occurrences of \( \mu \)-operators with the fixed-point combinator \( Y \).
See research for an overview of my research on the clocked lambda calculus and fixed-point combinators.
Bibtex
@inproceedings{lambda:clocks:functional:programs:2013, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri and Klop, Jan Willem and Polonsky, Andrew}, title = {{Clocks for Functional Programs}}, booktitle = {The Beauty of Functional Code - Essays Dedicated to Rinus Plasmeijer on the Occasion of His 61st Birthday}, pages = {97--126}, year = {2013}, doi = {10.1007/978-3-642-40355-2\_8}, series = {LNCS}, volume = {8106}, publisher = {Springer}, keywords = {rewriting, infinitary rewriting, lambda calculus}, type = {journal} }
Digital Object Identifier
10.1007/978-3-642-40355-2_8
- Circular Coinduction in Coq Using Bisimulation-Up-To Techniques. Jörg Endrullis, Dimitri Hendriks, and Martin Bodin. In: Proc. Conf. on Interactive Theorem Proving (ITP 2013), pp. 354–369, Springer (2013).
Bibtex
@inproceedings{circular:coinduction:2013, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri and Bodin, Martin}, title = {{Circular Coinduction in Coq Using Bisimulation-Up-To Techniques}}, booktitle = {Proc.\ Conf.\ on Interactive Theorem Proving (ITP~2013)}, volume = {7998}, pages = {354--369}, publisher = {Springer}, series = {LNCS}, year = {2013}, doi = {10.1007/978-3-642-39634-2\_26}, keywords = {rewriting, coinduction, formal verification}, type = {conference} }
Digital Object Identifier
10.1007/978-3-642-39634-2_26
2012
- Automatic Sequences and Zip-Specifications. Clemens Grabmayer, Jörg Endrullis, Dimitri Hendriks, Jan Willem Klop, and Lawrence S. Moss. In: Proc. Symp. on Logic in Computer Science (LICS 2012), pp. 335–344, IEEE Computer Society (2012).
Bibtex
@inproceedings{streams:zip:2012, author = {Grabmayer, Clemens and Endrullis, J\"{o}rg and Hendriks, Dimitri and Klop, Jan Willem and Moss, Lawrence S.}, title = {{Automatic Sequences and Zip-Specifications}}, booktitle = {Proc.\ Symp.\ on Logic in Computer Science (LICS~2012)}, pages = {335--344}, publisher = {{IEEE} Computer Society}, year = {2012}, doi = {10.1109/LICS.2012.44}, keywords = {rewriting, streams, automata}, type = {conference} }
Digital Object Identifier
10.1109/LICS.2012.44
- Highlights in Infinitary Rewriting and Lambda Calculus. Jörg Endrullis, Dimitri Hendriks, and Jan Willem Klop. Theoretical Computer Science, 464, pp. 48–71 (2012).
Bibtex
@article{infinitary:highlights:2012, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri and Klop, Jan Willem}, title = {{Highlights in Infinitary Rewriting and Lambda Calculus}}, journal = {Theoretical Computer Science}, volume = {464}, pages = {48--71}, year = {2012}, doi = {10.1016/j.tcs.2012.08.018}, keywords = {rewriting, infinitary rewriting, lambda calculus}, type = {journal} }
Digital Object Identifier
10.1016/j.tcs.2012.08.018
- On the Complexity of Equivalence of Specifications of Infinite Objects. Jörg Endrullis, Dimitri Hendriks, and Rena Bakhshi. In: Proc. Int. Conf. on Functional Programming (ICFP 2012), pp. 153–164, ACM (2012).
Bibtex
@inproceedings{complexity:stream:equality:2012, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri and Bakhshi, Rena}, title = {{On the Complexity of Equivalence of Specifications of Infinite Objects}}, booktitle = {Proc.\ Int.\ Conf.\ on Functional Programming (ICFP~2012)}, pages = {153--164}, publisher = {{ACM}}, year = {2012}, doi = {10.1145/2364527.2364551}, keywords = {rewriting, undecidability, streams}, type = {conference} }
Digital Object Identifier
10.1145/2364527.2364551
2011
- On Equal μ-Terms. Jörg Endrullis, Clemens Grabmayer, Jan Willem Klop, and Vincent van Oostrom. Theoretical Computer Science, 412 (28), pp. 3175–3202 (2011).
Summary
We consider the rewrite system Rμ with μx.M → M[x := μx.M] as its single rewrite rule. This kernel system for denoting recursively defined objects occurs in several contexts; for example, it underlies the theory of recursive types. For general signatures this rewriting system is widely used to represent and manipulate infinite regular trees.
The main concern of this paper is the convertibility relation for μ-terms as given by the μ-rule, in particular its decidability. This relation is sometimes called weak μ-equality, in contrast with strong μ-equality, which is given by equality of the possibly infinite tree unwinding of μ-terms. While strong equality has received much attention, the opposite is the case for weak μ-equality.
We present three alternative proofs for decidability of weak μ-equality.
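The single rule is easy to animate on a toy encoding. The sketch below is our own illustration (tuple terms, bound variables assumed distinct and guarded, so substitution needs no renaming): it applies the μ-rule at the root, and compares finite tree unwindings, the basis of strong μ-equality.

```python
# The mu-rule  mu x. M -> M[x := mu x. M]  on a tuple encoding of terms:
# ('var', x), ('mu', x, M), or (f, arg1, ..., argn) for a function symbol f.

def subst(t, x, s):
    tag = t[0]
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'mu':
        return t if t[1] == x else ('mu', t[1], subst(t[2], x, s))
    return (tag,) + tuple(subst(u, x, s) for u in t[1:])

def unfold(t):
    """One application of the mu-rule at the root."""
    assert t[0] == 'mu'
    return subst(t[2], t[1], t)

def unwind(t, n):
    """Finite tree unwinding to depth n; '*' marks the truncation."""
    if n == 0:
        return '*'
    if t[0] == 'mu':
        return unwind(unfold(t), n)   # assumes guarded mu-terms
    if t[0] == 'var':
        return t
    return (t[0],) + tuple(unwind(u, n - 1) for u in t[1:])

T1 = ('mu', 'x', ('f', ('var', 'x')))          # mu x. f(x)
T2 = ('mu', 'y', ('f', ('f', ('var', 'y'))))   # mu y. f(f(y))
print(unfold(T1))                       # ('f', ('mu', 'x', ('f', ('var', 'x'))))
print(unwind(T1, 3) == unwind(T2, 3))   # True: equal finite unwindings
```

The equal unwindings of T1 and T2 illustrate strong μ-equality (equality of infinite tree unwindings); weak μ-equality, the convertibility relation generated by the rule itself, is the finer relation whose decidability the paper addresses.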
Bibtex
@article{equal:mu:terms:2011, author = {Endrullis, J\"{o}rg and Grabmayer, Clemens and Klop, Jan Willem and van~Oostrom, Vincent}, title = {{On Equal $\mu$-Terms}}, journal = {Theoretical Computer Science}, volume = {412}, number = {28}, pages = {3175--3202}, year = {2011}, doi = {10.1016/j.tcs.2011.04.011}, keywords = {rewriting,automata}, type = {journal} }
Digital Object Identifier
10.1016/j.tcs.2011.04.011
- Lazy Productivity via Termination. Jörg Endrullis and Dimitri Hendriks. Theoretical Computer Science, 412 (28), pp. 3203–3225 (2011).
Bibtex
@article{productivity:termination:2011, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri}, title = {{Lazy Productivity via Termination}}, journal = {Theoretical Computer Science}, volume = {412}, number = {28}, pages = {3203--3225}, year = {2011}, doi = {10.1016/j.tcs.2011.03.024}, keywords = {rewriting, infinitary rewriting, productivity, termination}, type = {journal} }
Digital Object Identifier
10.1016/j.tcs.2011.03.024
- Proving Equality of Streams Automatically. Hans Zantema and Jörg Endrullis. In: Proc. Conf. on Rewriting Techniques and Applications (RTA 2011), pp. 393–408, Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik (2011).
Bibtex
@inproceedings{streams:equality:2011, author = {Zantema, Hans and Endrullis, J\"{o}rg}, title = {{Proving Equality of Streams Automatically}}, booktitle = {Proc.\ Conf.\ on Rewriting Techniques and Applications (RTA~2011)}, volume = {10}, pages = {393--408}, publisher = {Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik}, series = {LIPIcs}, year = {2011}, doi = {10.4230/LIPIcs.RTA.2011.393}, keywords = {rewriting, infinitary rewriting, streams}, type = {conference} }
Digital Object Identifier
10.4230/LIPIcs.RTA.2011.393
- Infinitary Rewriting Coinductively. Jörg Endrullis and Andrew Polonsky. In: Proc. Conf. on Types for Proofs and Programs (TYPES 2012), pp. 16–27, Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik (2011).
Summary
We provide a coinductive definition of strongly convergent reductions between infinite lambda terms. This approach avoids the notions of ordinals and metric convergence which have appeared in the earlier definitions of the concept. As an illustration, we prove the existence part of the infinitary standardization theorem. The proof is fully formalized in Coq using coinductive types.
The paper concludes with a characterization of infinite lambda terms which reduce to themselves in a single beta step.
The papers A Coinductive Framework for Infinitary Rewriting and Equational Reasoning (RTA 2015) and Coinductive Foundations of Infinitary Rewriting and Infinitary Equational Logic (LMCS 2018) extend the idea in this paper to reductions of length beyond omega. This extension requires mixing induction and coinduction. Therefore the simpler setup presented in the current paper (Infinitary Rewriting Coinductively) is preferable whenever there is no need for reductions longer than omega. This concerns, for instance, rewrite systems having the compression property, such as lambda calculus and left-linear term rewriting systems (with finite left-hand sides).
Bibtex
@inproceedings{infintary:lambda:coinductive:2011, author = {Endrullis, J\"{o}rg and Polonsky, Andrew}, title = {{Infinitary Rewriting Coinductively}}, booktitle = {Proc.\ Conf.\ on Types for Proofs and Programs (TYPES~2012)}, volume = {19}, pages = {16--27}, publisher = {Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik}, series = {LIPIcs}, year = {2011}, doi = {10.4230/LIPIcs.TYPES.2011.16}, keywords = {rewriting, infinitary rewriting, coinduction}, type = {conference} }
Digital Object Identifier
10.4230/LIPIcs.TYPES.2011.16
- Levels of Undecidability in Rewriting. Jörg Endrullis, Herman Geuvers, Jakob Grue Simonsen, and Hans Zantema. Information and Computation, 209 (2), pp. 227–245 (2011).
Summary
Undecidability of various properties of first-order term rewriting systems is well-known. An undecidable property can be classified by the complexity of the formula defining it. This classification gives rise to a hierarchy of distinct levels of undecidability, starting from the arithmetical hierarchy classifying properties using first order arithmetical formulas, and continuing into the analytic hierarchy, where quantification over function variables is allowed.
In this paper we give an overview of how the main properties of first-order term rewriting systems are classified in these hierarchies. We consider properties related to normalization (strong normalization, weak normalization and dependency problems) and properties related to confluence (confluence, local confluence and the unique normal form property). For all of these we distinguish between the single term version and the uniform version. Where appropriate, we also distinguish between ground and open terms.
Most uniform properties are \( \Pi^0_2 \)-complete. The particular problem of local confluence turns out to be \( \Pi^0_2 \)-complete for ground terms, but only \( \Sigma^0_1 \)-complete (and thereby recursively enumerable) for open terms. The most surprising result concerns dependency pair problems without minimality flag: we prove this problem to be \( \Pi^1_1 \)-complete, hence not in the arithmetical hierarchy, but properly in the analytic hierarchy.
This paper is an extended version of Degrees of Undecidability in Term Rewriting (CSL 2009).
Bibtex
@article{rewriting:undecidability:levels:2011, author = {Endrullis, J\"{o}rg and Geuvers, Herman and Simonsen, Jakob Grue and Zantema, Hans}, title = {{Levels of Undecidability in Rewriting}}, journal = {Information and Computation}, volume = {209}, number = {2}, pages = {227--245}, year = {2011}, doi = {10.1016/j.ic.2010.09.003}, keywords = {rewriting, undecidability, termination, confluence}, type = {journal} }
Digital Object Identifier
10.1016/j.ic.2010.09.003
2010
- Productivity of Stream DefinitionsJörg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Ariya Isihara, and Jan Willem KlopTheoretical Computer Science , 411 (4-5) , pp. 765–782 (2010)paper
Summary
We give an algorithm for deciding productivity of a large and natural class of recursive stream definitions. A stream definition is called `productive' if it can be evaluated continually in such a way that a uniquely determined stream in constructor normal form is obtained as the limit. Whereas productivity is undecidable for stream definitions in general, we show that it can be decided for `pure' stream definitions. For every pure stream definition the process of its evaluation can be modelled by the dataflow of abstract stream elements, called `pebbles', in a finite `pebbleflow net(work)'. The production of the pebbleflow net associated with a pure stream definition, that is, the number of pebbles the net is able to produce at its output port, can then be calculated by reducing nets to trivial nets.
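As an illustration of productivity (a hypothetical toy example, not taken from the paper): the stream definition `nats = 0 : map (+1) nats` is productive, since every element is delivered after finitely many unfoldings of the definition. A minimal Python sketch using a recursive generator:

```python
from itertools import islice

def nats():
    # nats = 0 : map (+1) nats  -- a productive stream definition:
    # the n-th element is available after finitely many unfoldings
    yield 0
    for x in nats():
        yield x + 1

# A definition such as s = tail(s), by contrast, is not productive:
# demanding its first element would never terminate.
print(list(islice(nats(), 5)))  # [0, 1, 2, 3, 4]
```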
This paper is an extended version of Productivity of Stream Definitions (2007) (FCT 2007).
See research for an overview of my research on productivity.
Bibtex
@article{productivity:streams:2010, author = {Endrullis, J\"{o}rg and Grabmayer, Clemens and Hendriks, Dimitri and Isihara, Ariya and Klop, Jan Willem}, title = {{Productivity of Stream Definitions}}, journal = {Theoretical Computer Science}, volume = {411}, number = {4-5}, pages = {765--782}, year = {2010}, doi = {10.1016/j.tcs.2009.10.014}, keywords = {rewriting, infinitary rewriting, productivity}, type = {journal} }
Digital Object Identifier
10.1016/j.tcs.2009.10.014
- Local Termination: Theory and PracticeJörg Endrullis, Roel C. de Vrijer, and Johannes WaldmannLogical Methods in Computer Science , 6 (3) (2010)paper
Bibtex
@article{termination:local:2010, author = {Endrullis, J\"{o}rg and de~Vrijer, Roel C. and Waldmann, Johannes}, title = {{Local Termination: Theory and Practice}}, journal = {Logical Methods in Computer Science}, volume = {6}, number = {3}, year = {2010}, doi = {10.2168/LMCS-6(3:20)2010}, keywords = {rewriting, termination, automata}, type = {journal} }
Digital Object Identifier
10.2168/LMCS-6(3:20)2010
- Transforming Outermost into Context-Sensitive RewritingJörg Endrullis, and Dimitri HendriksLogical Methods in Computer Science , 6 (2) (2010)paper
Summary
We define two transformations from term rewriting systems (TRSs) to context-sensitive TRSs in such a way that termination of the target system implies outermost termination of the original system. In the transformation based on `context extension', each outermost rewrite step is modeled by exactly one step in the transformed system. This transformation turns out to be complete for the class of left-linear TRSs. The second transformation is called `dynamic labeling' and results in smaller context-sensitive TRSs. Here each modeled step is adjoined with a small number of auxiliary steps. As a result, state-of-the-art termination methods for context-sensitive rewriting become available for proving termination of outermost rewriting. Both transformations have been implemented in Jambox, making it the most successful tool in the category of outermost rewriting of the annual termination competition.
This is an extended version of the paper From Outermost to Context-Sensitive Rewriting (RTA 2009).
See research for an overview of my research on termination.
Bibtex
@article{termination:outermost:2010, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri}, title = {{Transforming Outermost into Context-Sensitive Rewriting}}, journal = {Logical Methods in Computer Science}, volume = {6}, number = {2}, year = {2010}, doi = {10.2168/LMCS-6(2:5)2010}, keywords = {rewriting, termination, automata}, type = {journal} }
Digital Object Identifier
10.2168/LMCS-6(2:5)2010
- Modular Construction of Fixed Point Combinators and Clocked Böhm TreesJörg Endrullis, Dimitri Hendriks, and Jan Willem KlopIn: Proc. Symp. on Logic in Computer Science (LICS 2010), pp. 111–119, IEEE Computer Society (2010)paper
Summary
One of the best-known methods for discriminating lambda terms with respect to \( \beta \)-convertibility is due to Corrado Böhm. Roughly speaking, the Böhm tree of a lambda term is its infinite normal form. If lambda terms have distinct Böhm trees, then they are not \( \beta \)-convertible. But what if their Böhm trees coincide? For example, all fixed-point combinators have the same Böhm tree, namely \[ \lambda x. x(x(x(\ldots))) \]
We refine Böhm trees with a clock that records how many head reduction steps have been used to rewrite each subterm to head normal form. We show that clocked Böhm trees are suitable for discriminating a rich class of lambda terms (in particular many fixed-point combinators) having the same Böhm trees.
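For illustration (a sketch, not the paper's construction): Curry's Y and Turing's Θ are distinct fixed-point combinators, yet both have the Böhm tree \( \lambda x. x(x(x(\ldots))) \). Since Python evaluates strictly, the sketch below uses their eta-expanded call-by-value variants; both compute the same fixed points:

```python
# Curry's Y = λf.(λx.f(x x))(λx.f(x x)), eta-expanded for call-by-value:
Y = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Turing's Θ = (λx.λf.f(x x f))(λx.λf.f(x x f)), likewise eta-expanded:
T = (lambda x: lambda f: f(lambda v: x(x)(f)(v)))(lambda x: lambda f: f(lambda v: x(x)(f)(v)))

# Both combinators produce a fixed point of the same functional:
fact = lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1)
print(Y(fact)(5), T(fact)(5))  # 120 120
```

The two combinators are not β-convertible, but that cannot be seen from their (ordinary) Böhm trees; the clocks record how many head reduction steps each combinator spends producing the next level of the tree.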
We refer to Discriminating Lambda-Terms Using Clocked Boehm Trees (LMCS 2014) for an extended journal version of this paper. In the extended version we improve the precision of the clocks to atomic clocks and we answer a question of Gordon Plotkin, showing that the method can be used beyond simple terms.
See research for an overview of my research on the clocked lambda calculus and fixed-point combinators.
Bibtex
@inproceedings{lambda:clocks:2010, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri and Klop, Jan Willem}, title = {{Modular Construction of Fixed Point Combinators and Clocked B\"{o}hm Trees}}, booktitle = {Proc.\ Symp.\ on Logic in Computer Science (LICS~2010)}, pages = {111--119}, publisher = {{IEEE} Computer Society}, year = {2010}, doi = {10.1109/LICS.2010.8}, keywords = {rewriting, infinitary rewriting, lambda calculus}, type = {conference} }
Digital Object Identifier
10.1109/LICS.2010.8
- Unique Normal Forms in Infinitary Weakly Orthogonal RewritingJörg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Jan Willem Klop, and Vincent van OostromIn: Proc. Conf. on Rewriting Techniques and Applications (RTA 2010), pp. 85–102, Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik (2010)paper
Bibtex
@inproceedings{infinitary:weakly:orthogonal:unique:normal:forms:2010, author = {Endrullis, J\"{o}rg and Grabmayer, Clemens and Hendriks, Dimitri and Klop, Jan Willem and van~Oostrom, Vincent}, title = {{Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting}}, booktitle = {Proc.\ Conf.\ on Rewriting Techniques and Applications (RTA~2010)}, volume = {6}, pages = {85--102}, publisher = {Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik}, series = {LIPIcs}, year = {2010}, doi = {10.4230/LIPIcs.RTA.2010.85}, keywords = {rewriting, infinitary rewriting, lambda calculus}, type = {conference} }
Digital Object Identifier
10.4230/LIPIcs.RTA.2010.85
2009
- From Outermost to Context-Sensitive RewritingJörg Endrullis, and Dimitri HendriksIn: Proc. Conf. on Rewriting Techniques and Applications (RTA 2009), pp. 305–319, Springer (2009)paper
Summary
We define a transformation from term rewriting systems (TRSs) to context-sensitive TRSs in such a way that termination of the target system implies outermost termination of the original system. For the class of left-linear TRSs the transformation is complete. Thereby state-of-the-art termination methods and automated termination provers for context-sensitive rewriting become available for proving termination of outermost rewriting. The translation has been implemented in Jambox, making it the most successful tool in the category of outermost rewriting at the last edition of the annual termination competition.
An extended version of this paper is available from Transforming Outermost into Context-Sensitive Rewriting (Logical Methods in Computer Science 2010).
See research for an overview of my research on termination.
Bibtex
@inproceedings{termination:outermost:2009, author = {Endrullis, J\"{o}rg and Hendriks, Dimitri}, title = {{From Outermost to Context-Sensitive Rewriting}}, booktitle = {Proc.\ Conf.\ on Rewriting Techniques and Applications (RTA~2009)}, volume = {5595}, pages = {305--319}, publisher = {Springer}, series = {LNCS}, year = {2009}, doi = {10.1007/978-3-642-02348-4\_22}, keywords = {rewriting, termination, automata}, type = {conference} }
Digital Object Identifier
10.1007/978-3-642-02348-4_22
- Local TerminationJörg Endrullis, Roel C. de Vrijer, and Johannes WaldmannIn: Proc. Conf. on Rewriting Techniques and Applications (RTA 2009), pp. 270–284, Springer (2009)paper
Bibtex
@inproceedings{termination:local:2009, author = {Endrullis, J\"{o}rg and de~Vrijer, Roel C. and Waldmann, Johannes}, title = {{Local Termination}}, booktitle = {Proc.\ Conf.\ on Rewriting Techniques and Applications (RTA~2009)}, volume = {5595}, pages = {270--284}, publisher = {Springer}, series = {LNCS}, year = {2009}, doi = {10.1007/978-3-642-02348-4\_19}, keywords = {rewriting, termination, automata}, type = {conference} }
Digital Object Identifier
10.1007/978-3-642-02348-4_19
- Complexity of Fractran and ProductivityJörg Endrullis, Clemens Grabmayer, and Dimitri HendriksIn: Proc. Conf. on Automated Deduction (CADE 2009), pp. 371–387, Springer (2009)paper
Bibtex
@inproceedings{complexity:productivity:2009, author = {Endrullis, J\"{o}rg and Grabmayer, Clemens and Hendriks, Dimitri}, title = {{Complexity of Fractran and Productivity}}, booktitle = {Proc.\ Conf.\ on Automated Deduction (CADE~2009)}, volume = {5663}, pages = {371--387}, publisher = {Springer}, series = {LNCS}, year = {2009}, doi = {10.1007/978-3-642-02959-2\_28}, keywords = {rewriting, undecidability, productivity}, type = {conference} }
Digital Object Identifier
10.1007/978-3-642-02959-2_28
- Degrees of Undecidability in Term RewritingJörg Endrullis, Herman Geuvers, and Hans ZantemaIn: Proc. Conf. on Computer Science Logic (CSL 2009), pp. 255–270, Springer (2009)paper
Summary
Undecidability of various properties of first-order term rewriting systems is well-known. An undecidable property can be classified by the complexity of the formula defining it. This classification gives rise to a hierarchy of distinct levels of undecidability, starting from the arithmetical hierarchy classifying properties using first-order arithmetical formulas, and continuing into the analytic hierarchy, where quantification over function variables is allowed.
In this paper we give an overview of how the main properties of first-order term rewriting systems are classified in these hierarchies. We consider properties related to normalization (strong normalization, weak normalization and dependency problems) and properties related to confluence (confluence, local confluence and the unique normal form property). For all of these we distinguish between the single term version and the uniform version. Where appropriate, we also distinguish between ground and open terms.
Most uniform properties are \( \Pi^0_2 \)-complete. The particular problem of local confluence turns out to be \( \Pi^0_2 \)-complete for ground terms, but only \( \Sigma^0_1 \)-complete (and thereby recursively enumerable) for open terms. The most surprising result concerns dependency pair problems without minimality flag: we prove this problem to be \( \Pi^1_1 \)-complete, hence not in the arithmetical hierarchy, but properly in the analytic hierarchy.
We refer to Levels of Undecidability in Rewriting (Information and Computation 2011) for an extended journal version of this paper.
Bibtex
@inproceedings{rewriting:undecidability:degrees:2009, author = {Endrullis, J\"{o}rg and Geuvers, Herman and Zantema, Hans}, title = {{Degrees of Undecidability in Term Rewriting}}, booktitle = {Proc.\ Conf.\ on Computer Science Logic (CSL~2009)}, volume = {5771}, pages = {255--270}, publisher = {Springer}, series = {LNCS}, year = {2009}, doi = {10.1007/978-3-642-04027-6\_20}, keywords = {rewriting, undecidability, termination, confluence}, type = {conference} }
Digital Object Identifier
10.1007/978-3-642-04027-6_20
2008
- Data-Oblivious Stream ProductivityJörg Endrullis, Clemens Grabmayer, and Dimitri HendriksIn: Proc. Conf. on Logic for Programming Artificial Intelligence and Reasoning (LPAR 2008), pp. 79–96, Springer (2008)paper
Summary
We are concerned with demonstrating productivity of specifications of infinite streams of data, based on orthogonal rewrite rules. In general, this property is undecidable, but for restricted formats computable sufficient conditions can be obtained. The usual analysis, also adopted here, disregards the identity of data, thus leading to approaches that we call data-oblivious. We present a method that is provably optimal among all such data-oblivious approaches. This means that in order to improve on our algorithm one has to proceed in a data-aware fashion.
See research for an overview of my research on productivity.
Bibtex
@inproceedings{productivity:data:oblivious:2008, author = {Endrullis, J\"{o}rg and Grabmayer, Clemens and Hendriks, Dimitri}, title = {{Data-Oblivious Stream Productivity}}, booktitle = {Proc.\ Conf.\ on Logic for Programming Artificial Intelligence and Reasoning (LPAR~2008)}, number = {5330}, pages = {79--96}, publisher = {Springer}, series = {LNCS}, year = {2008}, doi = {10.1007/978-3-540-89439-1\_6}, keywords = {rewriting, infinitary rewriting, productivity}, type = {conference} }
Digital Object Identifier
10.1007/978-3-540-89439-1_6
- Matrix Interpretations for Proving Termination of Term RewritingJörg Endrullis, Johannes Waldmann, and Hans ZantemaJournal of Automated Reasoning , 40 (2-3) , pp. 195–220 (2008)paper
Summary
We present a new method for automatically proving termination of term rewriting using matrix interpretations (weighted automata with natural numbers as weights). It is based on the well-known idea of interpretation of terms where every rewrite step causes a decrease, but instead of the usual natural numbers we use vectors of natural numbers, ordered by a particular non-total well-founded ordering. Function symbols are interpreted by linear mappings represented by matrices. This method allows us to prove termination and relative termination.
A modification of the latter, in which strict steps are only allowed at the top, turns out to be helpful in combination with the dependency pair transformation. By bounding the dimension and the matrix coefficients, the search problem becomes finite. Our implementation transforms it to a Boolean satisfiability problem (SAT), to be solved by a state-of-the-art SAT solver.
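As a toy illustration of the technique (the rule and the concrete matrices below are an example chosen here, not taken from the paper): for the string rewriting rule aa → aba, one can interpret each letter as an affine map \( x \mapsto Mx + c \) over vectors of naturals and check that the left-hand side weakly dominates the right-hand side, with a strict decrease in the first coordinate of the constant part:

```python
# Hedged sketch: verifying a (hand-picked) matrix interpretation for the
# string rewrite rule aa -> aba, using plain 2x2 matrices over the naturals.

def mat_mul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def mat_vec(M, v):
    return [sum(M[i][k] * v[k] for k in range(len(v))) for i in range(len(M))]

def interpret(word, letters):
    # compose the affine interpretations x |-> Mx + c of the letters,
    # applying the rightmost (innermost) letter first
    M, c = [[1, 0], [0, 1]], [0, 0]
    for letter in reversed(word):
        A, u = letters[letter]
        M = mat_mul(A, M)
        c = [a + b for a, b in zip(mat_vec(A, c), u)]
    return M, c

letters = {
    'a': ([[1, 1], [0, 0]], [0, 1]),
    'b': ([[1, 0], [0, 0]], [0, 0]),
}

Ml, cl = interpret("aa", letters)   # interpretation of the left-hand side
Mr, cr = interpret("aba", letters)  # interpretation of the right-hand side

# Compatibility: matrices compare weakly componentwise, and the constant
# part strictly decreases in the first coordinate.
assert all(Ml[i][j] >= Mr[i][j] for i in range(2) for j in range(2))
assert cl[0] > cr[0] and cl[1] >= cr[1]
print("the interpretation is compatible with aa -> aba")
```

For closure under contexts one additionally needs monotonicity of the letter interpretations in the first coordinate (top-left matrix entries at least 1), which holds here; a full implementation would search for such matrices automatically via the SAT encoding described above.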
This is an extended version of the paper Matrix Interpretations for Proving Termination of Term Rewriting (IJCAR 2006).
See research for an overview of my research on termination.
Bibtex
@article{termination:matrix:2008, author = {Endrullis, J\"{o}rg and Waldmann, Johannes and Zantema, Hans}, title = {{Matrix Interpretations for Proving Termination of Term Rewriting}}, journal = {Journal of Automated Reasoning}, volume = {40}, number = {2-3}, pages = {195--220}, year = {2008}, doi = {10.1007/s10817-007-9087-9}, keywords = {rewriting, termination}, type = {journal} }
Digital Object Identifier
10.1007/s10817-007-9087-9
- Reduction under SubstitutionJörg Endrullis, and Roel C. de VrijerIn: Proc. Conf. on Rewriting Techniques and Applications (RTA 2008), pp. 425–440, Springer (2008)paper
Bibtex
@inproceedings{lambda:reduction:under:substitution:2008, author = {Endrullis, J\"{o}rg and de~Vrijer, Roel C.}, title = {{Reduction under Substitution}}, booktitle = {Proc.\ Conf.\ on Rewriting Techniques and Applications (RTA~2008)}, volume = {5117}, pages = {425--440}, publisher = {Springer}, series = {LNCS}, year = {2008}, doi = {10.1007/978-3-540-70590-1\_29}, keywords = {rewriting, lambda calculus}, type = {conference} }
Digital Object Identifier
10.1007/978-3-540-70590-1_29
- Proving Infinitary NormalizationJörg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Jan Willem Klop, and Roel C. de VrijerIn: Proc. Conf. on Types for Proofs and Programs (TYPES 2008), pp. 64–82, Springer (2008)paper
Bibtex
@inproceedings{infinitary:normalization:2009, author = {Endrullis, J\"{o}rg and Grabmayer, Clemens and Hendriks, Dimitri and Klop, Jan Willem and de~Vrijer, Roel C.}, title = {{Proving Infinitary Normalization}}, booktitle = {Proc.\ Conf.\ on Types for Proofs and Programs (TYPES~2008)}, volume = {5497}, pages = {64--82}, publisher = {Springer}, series = {LNCS}, year = {2008}, doi = {10.1007/978-3-642-02444-3\_5}, keywords = {rewriting, infinitary rewriting, automata}, type = {conference} }
Digital Object Identifier
10.1007/978-3-642-02444-3_5
2007
- Productivity of Stream DefinitionsJörg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Ariya Isihara, and Jan Willem KlopIn: Proc. Symp. on Fundamentals of Computation Theory (FCT 2007), pp. 274–287, Springer (2007)paper
Summary
We give an algorithm for deciding productivity of a large and natural class of recursive stream definitions. A stream definition is called `productive' if it can be evaluated continually in such a way that a uniquely determined stream in constructor normal form is obtained as the limit. Whereas productivity is undecidable for stream definitions in general, we show that it can be decided for `pure' stream definitions. For every pure stream definition the process of its evaluation can be modelled by the dataflow of abstract stream elements, called `pebbles', in a finite `pebbleflow net(work)'. The production of the pebbleflow net associated with a pure stream definition, that is, the number of pebbles the net is able to produce at its output port, can then be calculated by reducing nets to trivial nets.
An extended journal version of this paper is available from Productivity of Stream Definitions (TCS 2010).
See research for an overview of my research on productivity.
Bibtex
@inproceedings{productivity:streams:2007, author = {Endrullis, J\"{o}rg and Grabmayer, Clemens and Hendriks, Dimitri and Isihara, Ariya and Klop, Jan Willem}, title = {{Productivity of Stream Definitions}}, booktitle = {Proc.\ Symp.\ on Fundamentals of Computation Theory (FCT~2007)}, number = {4639}, pages = {274--287}, publisher = {Springer}, series = {LNCS}, year = {2007}, doi = {10.1007/978-3-540-74240-1\_24}, keywords = {rewriting, infinitary rewriting, productivity}, type = {conference} }
Digital Object Identifier
10.1007/978-3-540-74240-1_24
- Root Stabilisation Using Dependency PairsJörg Endrullis, and Jeroen KetemaIn: Proc. Workshop on Termination (WST 2007), pp. 17–21 (2007)paper
Bibtex
@inproceedings{termination:root:2007, author = {Endrullis, J\"{o}rg and Ketema, Jeroen}, title = {{Root Stabilisation Using Dependency Pairs}}, booktitle = {Proc.\ Workshop on Termination (WST~2007)}, pages = {17--21}, year = {2007}, keywords = {rewriting, infinitary rewriting, termination}, type = {workshop} }
2006
- Matrix Interpretations for Proving Termination of Term RewritingJörg Endrullis, Johannes Waldmann, and Hans ZantemaIn: Proc. Int. Joint Conf. on Automated Reasoning (IJCAR 2006), pp. 574–588, Springer (2006)paper
Summary
We present a new method for automatically proving termination of term rewriting using matrix interpretations (weighted automata with natural numbers as weights). It is based on the well-known idea of interpretation of terms where every rewrite step causes a decrease, but instead of the usual natural numbers we use vectors of natural numbers, ordered by a particular non-total well-founded ordering. Function symbols are interpreted by linear mappings represented by matrices. This method allows us to prove termination and relative termination. A modification of the latter, in which strict steps are only allowed at the top, turns out to be helpful in combination with the dependency pair transformation.
By bounding the dimension and the matrix coefficients, the search problem becomes finite. Our implementation transforms it to a Boolean satisfiability problem (SAT), to be solved by a state-of-the-art SAT solver. Our implementation performs well on the Termination Problem Data Base: better than 5 out of 6 tools that participated in the 2005 termination competition in the category of term rewriting.
An extended journal version of this paper is available from Matrix Interpretations for Proving Termination of Term Rewriting (Journal of Automated Reasoning 2008).
See research for an overview of my research on termination.
Bibtex
@inproceedings{termination:matrix:2006, author = {Endrullis, J\"{o}rg and Waldmann, Johannes and Zantema, Hans}, title = {{Matrix Interpretations for Proving Termination of Term Rewriting}}, booktitle = {Proc.\ Int.\ Joint Conf.\ on Automated Reasoning (IJCAR~2006)}, volume = {4130}, pages = {574--588}, publisher = {Springer}, series = {LNCS}, year = {2006}, doi = {10.1007/11814771\_47}, keywords = {rewriting, termination, automata}, type = {conference} }
Digital Object Identifier
10.1007/11814771_47
- Decomposing Terminating Rewrite RelationsJörg Endrullis, Dieter Hofbauer, and Johannes WaldmannIn: Proc. Workshop on Termination (WST 2006), pp. 39–43 (2006)paper
Summary
In this paper we are concerned with automatically proving termination of string rewriting systems (or word rewriting systems). We introduce an efficient technique for constructing automata that serve as certificates of match-boundedness, and thereby termination. Previous constructions have been either inefficient or incomplete:
- Match-Bounded String Rewriting Systems by Alfons Geser, Dieter Hofbauer and Johannes Waldmann. The construction in this paper is complete but very inefficient.
- On tree automata that certify termination of left-linear term rewriting systems by Alfons Geser, Dieter Hofbauer, Johannes Waldmann and Hans Zantema. The construction in this paper is efficient but incomplete.

We present the first automata construction for match-bounded string rewriting that is both efficient and complete. Here completeness means that the algorithm always terminates whenever the system is match-bounded.
See research for an overview of my research on termination.
Bibtex
@inproceedings{termination:matchbounds:2006, author = {Endrullis, J\"{o}rg and Hofbauer, Dieter and Waldmann, Johannes}, title = {{Decomposing Terminating Rewrite Relations}}, booktitle = {Proc.\ Workshop on Termination (WST~2006)}, pages = {39--43}, year = {2006}, keywords = {rewriting, termination, automata}, type = {workshop} }