Katalin Bimbó (ed.).
J. Michael Dunn on Information Based Logics.
Springer, 2016, 436 pp.

ACM CR categories:
I.2.4 Knowledge Representation Formalisms and Methods
F.4.1 Mathematical Logic (Computational Logic)
I.2.3 Deduction and Theorem Proving

Apart from introductory material and an "autobio", the papers on Dunn's work fall into three areas.

I enjoyed reading Alasdair Urquhart's account of Dunn's original discovery, with Robert K. Meyer, of the admissibility of the disjunctive syllogism in relevance logic, and of the different proofs that emerged later. Discussing a proof due to Saul Kripke, Urquhart poses the open question of whether using this rule yields a "speedup" theorem for relevance logic, with much shorter proofs. The theme recurs in Larry Moss's article on syllogisms that express counting properties, this time concerning the admissibility of reductio ad absurdum. Johan van Benthem's article takes up the "grand picture" of "the interplay of storing a lot of information about the past versus compressing information" (as an example of the latter he mentions forgetting).
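The technical point at stake can be stated in a line. In the relevance logic R the disjunctive syllogism is admissible as a rule on theorems (the Meyer-Dunn result), yet the corresponding implication is not a theorem; in standard notation (my gloss, not a formula quoted from the book):

\[
\frac{\vdash_{\mathbf{R}} A \qquad \vdash_{\mathbf{R}} \neg A \lor B}{\vdash_{\mathbf{R}} B}\;(\gamma)
\qquad\text{although}\qquad
\nvdash_{\mathbf{R}} \bigl(A \land (\neg A \lor B)\bigr) \to B .
\]

The "speedup" question is then whether derivations that are allowed to use the rule can be much shorter than derivations that must do without it.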

Computing science arose from decision problems in logic and in mathematical proof, which led to the definition of models of computation; this was later refined into the study of the complexity of algorithms. More recently, computing science has turned to informational problems, such as collecting pieces of information that may jointly be inconsistent and arriving at the belief that appears most plausible given the data. Questions of this kind first arose in the model theory of relevance logics, and the proof theory of relevance logics appears to raise issues of the succinctness of representations and of the efficiency with which inferences can be recovered from them.
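As a concrete, if unofficial, illustration of that model-theoretic origin, consider Dunn's four-valued semantics for first-degree entailment, in which the available sources may tell us that a sentence is true, false, both, or neither. The sketch below is mine, not the book's, and its helper names are hypothetical; it encodes each value as a pair of "told true" and "told false" flags and shows that inconsistent information stays contained.

# Sketch (not from the book): Dunn's four-valued semantics for first-degree
# entailment. Each value records whether a sentence has been told true and/or
# told false by the available sources.

T = (True, False)         # told true only
F = (False, True)         # told false only
BOTH = (True, True)       # inconsistent information
NEITHER = (False, False)  # no information

NAMES = {T: "T", F: "F", BOTH: "Both", NEITHER: "Neither"}

def neg(a):
    # Negation swaps the "told true" and "told false" flags.
    told_true, told_false = a
    return (told_false, told_true)

def conj(a, b):
    # Told true when both conjuncts are told true;
    # told false when at least one conjunct is told false.
    return (a[0] and b[0], a[1] or b[1])

def disj(a, b):
    # Dually: told true when some disjunct is told true;
    # told false when both disjuncts are told false.
    return (a[0] or b[0], a[1] and b[1])

if __name__ == "__main__":
    a, b = BOTH, NEITHER
    contradiction = conj(a, neg(a))
    print(NAMES[contradiction])  # Both -- the contradiction is (also) told true
    print(NAMES[b])              # Neither -- yet B gains no support: no explosion

Because the connectives only propagate the "told" flags, inconsistent information about A does not force an unrelated B to be accepted, which is the behaviour one wants when aggregating unreliable data.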

Big data is a fashionable theme today. These articles will be valuable to graduate students and researchers who are more attracted to the underlying logical structure than to the extraction of significance from number crunching.