Theory Restored

paper
Authorship
  1. Allen Renear

    University of Illinois, Urbana-Champaign

Work text

Summary

Paul Caton has recently presented a sustained critique of the claim that certain activities within the text encoding community constitute the evolution of a body of theory about text. According to Caton, reflection on markup practice has not led to theory at all, let alone to a Lakatosian progressive research programme, but only to principled practice. Caton allows that text encoding should be theorized, but he believes that this has not yet happened and that those who believe otherwise are confused about both what sort of theory is possible and what sort is needed. The proposed paper will respond directly to these important criticisms. Specifically, it will argue that Caton assumes a false dichotomy between empirical science on the one hand and so-called "critical theory" on the other, and therefore fails to see that text encoding theory is not failed empirical science, but rather successful "formal science."

The Original Claims to Theory

Caton takes his texts from two articles: one by Allen Renear, "Out of Praxis: Three (Meta)Theories of Textuality," in Kathryn Sutherland (ed.), Electronic Text: Investigations in Method and Theory (Oxford, 1997, pp. 107-126), and one by Renear and Elli Mylonas, "The Text Encoding Initiative at 10: Not Just an Interchange Format Anymore -- But a New Research Community," the guest editors' introduction to the TEI anniversary issue of Computers and the Humanities (1999).

In "Out of Praxis" Renear writes (and Caton quotes):

[the text encoding community] ... has evolved a rich body of illuminating theory about the nature of text -- theory that is useful not only to anyone who would create, manage, or use electronic texts, but also to anyone who would, more generally, understand electronic textuality from a theoretical perspective ... the significance of this body of theory and analysis extends well beyond the specific concerns of text processing and text encoding and contributes directly to our general understanding of the deepest issues of textuality and textual communication in general. (Renear, "Out of Praxis.")

And Caton notes that in "The TEI at 10" Renear and Mylonas, after reiterating similar claims, compare this theoretical activity to a Lakatosian "research programme" and intimate that it is a "progressive" research programme, in Lakatos's sense, exhibiting productive evolution of new theories of increasing explanatory power.

Caton's Criticisms

Caton begins by analyzing Lakatos's concept of a progressive research programme in more detail and develops an argument that the notion of a research programme is inapplicable to text encoding theorizing and that, in any case, the verdict on "progressiveness" is negative. Caton notes Lakatos's claim that new theories evolving within a progressive research programme must predict or explain new facts, as well as retain the core assertions. Renear had suggested that the ongoing evolution of the OHCO thesis corresponded to this sort of progressiveness, responding to counterexamples (such as the various kinds of non-hierarchical structures) with a natural, satisfying explanatory evolution. But Caton points out that Lakatos claims that a truly progressive research programme must predict or establish "stunning novel facts," such as an unexpected astronomical event. Not only do there not seem to be any such predictions forthcoming from text encoding theory, but Caton sees a contradictory theme in the methodology of much text encoding theorizing, which emphasizes adjusting our theoretical claims to our actual intuitions ("common sense") about texts -- how could such adjustments ever lead to a "stunning new fact"?
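
To make the kind of counterexample at issue concrete, here is a minimal sketch (our illustration, not drawn from Renear or Caton; the span offsets and the helper function are invented): a quotation that begins in one verse line and ends in the next overlaps both lines, and so cannot be placed in the same single ordered hierarchy of content objects that contains the lines.

```python
# A minimal sketch (hypothetical offsets, not from Renear or Caton) of an
# "overlapping hierarchy" counterexample to the OHCO thesis: a text modelled
# as a single ordered hierarchy of content objects cannot also contain a
# feature that crosses the boundaries of that hierarchy's elements.

def can_share_a_tree(a, b):
    """True if spans a and b nest (one contains the other) or are disjoint."""
    (a_start, a_end), (b_start, b_end) = a, b
    nested = (a_start <= b_start and b_end <= a_end) or \
             (b_start <= a_start and a_end <= b_end)
    disjoint = a_end <= b_start or b_end <= a_start
    return nested or disjoint

# Character offsets into a two-line verse passage (invented for illustration):
line_1 = (0, 40)
line_2 = (40, 80)
quotation = (25, 60)   # begins mid-way through line 1, ends mid-way through line 2

print(can_share_a_tree(line_1, line_2))     # True: the two lines fit one hierarchy
print(can_share_a_tree(line_1, quotation))  # False: the quotation overlaps the
                                            # line boundary, so no single tree
                                            # holds both structures
```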

Caton concludes:

In itself, the fact that text encoding's home-grown theory fails to be strictly scientific according to the very criteria it invokes is not greatly significant. Who would think of markup as a science anyway? ... these are all symptoms of a confusion about theory and practice.

... In Renear's version of text encoding history, cumulative practice in representing text leads to experience and then to reflection upon that experience and ultimately to, if not knowledge, then at least theory. I argue, however, that practice has instead led simply to principle. (Caton ACH 2003)

And to this account he adds a further deflationary argument, concluding that what passes as text encoding theory is in fact "semi-formal description," not explanation:

It is arguable that there is anything unexplained about text, but if for a moment we assume complete ignorance and ask of theoreticians "why are texts the way they are?" we immediately beg the rejoinder "you tell us first how they are, then we can hypothesize about why they are that way". In other words, we need the observation first: we have to define what we wish to explain. Looking at it this way we see that OHCOs 1, 2, and 3, for example, are not hypotheses at all, but attempts at semi-formal description of a phenomenon ... Renear's problems with the word "theory" stem from his conflation of multiple signifieds. One signified -- the "scientific" one, if you like -- points to theory as speculative propositions, offering answers to questions that arise when we realize what we don't know: why does an apple fall to earth? Why do the planets move in the paths they do? The other points to theory as a set of rules or principles that underpin a particular human practice: a theory of poesie, for example.

Caton sees Renear and others as hankering after an empirical science of text that would satisfy their positivist inclinations and their aversion to theory of the literary or critical variety -- the very sort of theorizing that Caton believes text encoding needs.

Analysis

Caton's critique is brilliant and illuminating, and, ironically, a contribution to text encoding theory. But on almost every important point he is wrong.

In our paper we will argue that the fundamental error in Caton's approach is his failure to recognize that text encoding theory, as presented in Renear's examples, is for the most part a formal science, in the sense elaborated by the linguist Jerrold Katz (1981, 1997). In this respect text encoding theorizing is like theorizing in computer science, some parts of linguistics, philosophy, and mathematics, and unlike theorizing in astronomy, sociology, and chemistry. Although the nature of formal science differs from that of empirical science, there is no reason at all to deny the word, or the concept, "theory" to the theories of the formal sciences.

Consider an example from linguistics: speech act "theory". It is a fact that results in speech act theory provide the sense of new understanding and illumination characteristic of science in the most general sense, and that contributions to speech act theory often proceed by the familiar process of conjecture and refutation that seems constitutive of science. And although some of the content of speech act theory is empirical, and it is certainly intrinsically involved in much actual empirical theorizing in related, purely empirical areas of linguistics, much of speech act theory is not empirical at all: consider, for instance, the specific analyses carried out by Austin, Grice, Searle, or Bach and Harnish -- in every case the method is non-empirical. It is true that we would not normally say the theory "explains" promising (and it does not, because that particular idiom of explanation is empirical), but speech act theory does "explain how promising works."

Does the original theory (say, Austin's account), or any of its subsequent improvements, predict "stunning new facts"? The problem here is both that the role of "stunning new facts" in empirical science is exaggerated by Lakatos, and that the prospects for "stunning new facts" in formal science are minimized by Caton. On the one hand much empirical science is incremental improvement, and on the other formal science can provide a surprise: Russell's paradox, for instance, or Gödel's incompleteness results, or Cantor's transfinite cardinals. Caton takes Renear's appeal to a "common sense view" as a decisive indication that no "stunning new facts" will be forthcoming. What is true is that formal science is concerned with systematizing and adjusting our formal understanding of the world. Stunning new facts will not come in the form of a comet, but in the form of paradox, whether that means an actual antinomy or simply an unexpected formal result. Consider Gettier's famous counterexample to the justified-true-belief (JTB) theory of knowledge. In retrospect we might be tempted to say that this can't possibly be either stunning or new, as it is simply an unpacking of concepts every ordinary person has; and yet it was stunning, and it was new.
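
Russell's paradox, mentioned above, illustrates the same pattern. In its standard formulation (a textbook statement, not quoted from either paper), the apparently innocent definition of the set of all sets that are not members of themselves yields an outright antinomy:

\[
R = \{\, x \mid x \notin x \,\} \quad\Longrightarrow\quad R \in R \iff R \notin R .
\]

The definition looks like a mere unpacking of the everyday notion of a set, and yet its consequence was, in its time, both stunning and new.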

Other examples from linguistics, philosophy, mathematics, and computer science can be usefully developed, and we will do so in the final paper. In every case there are non-empirical theories that illuminate and, in a sense, explain phenomena, and that are also closely, and perhaps confusingly, connected with empirical explanations, playing supportive roles.

We note that in the end our result is surprisingly irenic. Caton wants to secure an open field for the application of literary theory and critical theory to text encoding phenomena -- no part of our rejoinder is inconsistent with that agenda. In addition, we believe that this adaptation of the notion of a "formal science" throws some new light on the old debate over the nature of humanities computing.

Bibliography

1. Paul Caton, "A Critique of 'Theory' in Text Encoding," ACH/ALLC, Athens, Georgia, 2003.
2. Jerrold J. Katz, Language and Other Abstract Objects, Rowman and Littlefield, 1981.
3. Jerrold J. Katz, Realistic Rationalism, MIT Press, 1997.
4. Allen Renear, "Out of Praxis: Three (Meta)Theories of Textuality," in Kathryn Sutherland (ed.), Electronic Text: Investigations in Method and Theory, Oxford, 1997, pp. 107-126.
5. Allen Renear and Elli Mylonas, "The Text Encoding Initiative at 10: Not Just an Interchange Format Anymore -- But a New Research Community," the guest editors' introduction to the TEI anniversary issue of Computers and the Humanities 33:1-3, 1999.


Conference Info

Complete

ACH/ALLC / ACH/ICCH / ALLC/EADH - 2004

Hosted at Göteborg University (Gothenburg)

Gothenburg, Sweden

June 11, 2004 - June 16, 2004

105 works by 152 authors indexed

Series: ACH/ICCH (24), ALLC/EADH (31), ACH/ALLC (16)

Organizers: ACH, ALLC

Tags
  • Keywords: None
  • Language: English
  • Topics: None