9 Information and communication in living matter
Bernd-Olaf Küppers
Ever since the elucidation of the molecular basis of living systems, we have known that all elementary processes of life are
governed by information. Thus, information turns out to be a key concept in understanding living matter (Küppers,
1990). More than that: the flow of information at all levels of the living system reveals the properties of communication. This
means that the information stored in the genome of the organism is expressed in innumerable feedback loops – a process through
which the genetic information is continually re-evaluated by permanent interactions with the physical environment to which
it is exposed. In this way, the living organism is built up, step by step, into a hierarchically organized network of unmatched
complexity.
The fact that all phenomena of life are based upon information and communication is indeed the principal characteristic of
living matter. Without the perpetual exchange of information at all levels of organization, no functional order in the living
organism could be sustained. The processes of life would implode into a jumble of chaos if they were not perpetually stabilized
by information and communication. In this chapter, I should like to consider some of the consequences that follow from this
for our philosophical understanding of reality.
9.1 About “information” and “communication”
In daily usage, the terms “information” and “communication” are not always clearly distinguished from each other. Yet, even
the etymology of the two words indicates that the reference of the concepts cannot entirely overlap. The term “information”
– following closely its Latin root
informare – denotes primarily the formative, and thus instructive function of a message. By contrast, the word “communication” – derived
from the Latin word
communicare – denotes the process by which the sender and the receiver of information try to reach a common understanding. The subject
of this understanding is a common evaluation of the information exchanged between the sender and the receiver. Alongside this,
the bare instruction for which the word “information” stands seems like a command that results in a mechanical – that is, unilateral – transfer of the information from sender to receiver, without any aim of achieving a common or mutual understanding of the “meaning” of the information, a meaning that is expressed by its operative function.
Thus, if we wish to approach the concept of communication in living matter in its widest sense, we need to examine the relationship
between information on the one hand and mutual or common understanding on the other. At the same time, we shall need to demonstrate
that concepts such as “information” and “communication” can meaningfully be applied to natural processes. The latter task
would seem to raise fewer difficulties for the concept of information than it does for that of communication.
Information, as suggested above, means primarily “instruction,” in the sense of a command or step in a computer program. This
in turn has the function of imposing a selective condition on the possible processes that can go on in a system. In precisely
this sense, living processes are “instructed” by the information that is contained in encoded form in genes. Expressed in
the terms of physics: the genome represents a specific physical boundary condition, a constraint that restricts the set of
physically possible processes to those that do actually take place within the organism and are directed towards preservation
of the system (Küppers,
1992). Thus, the idea of “instruction by information” has a precise physical meaning, and in this context information can be indeed
regarded as an objective property of living matter.
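How such a boundary condition operates can be pictured in a toy sketch (my illustration, not the author’s; all process names are hypothetical): of all the transitions that are physically possible in a system, the constraint admits only the subset that serves the preservation of the system.

```python
# Toy sketch: "instruction by information" as a selective boundary condition.
# The process names are invented for illustration only.

# All state transitions that are physically possible in the toy system.
physically_possible = {
    "synthesize_enzyme_A", "synthesize_enzyme_B",
    "degrade_enzyme_A", "random_polymerization",
}

# The "genetic information": a constraint selecting the subset of
# processes that actually take place and serve the system.
genetic_constraint = {"synthesize_enzyme_A", "degrade_enzyme_A"}

def realized_processes(possible, constraint):
    """The constraint acts as a filter on the physically possible."""
    return possible & constraint

print(realized_processes(physically_possible, genetic_constraint))
# {'synthesize_enzyme_A', 'degrade_enzyme_A'} (in some order)
```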
It is a harder task to demonstrate the universality of communication. One tends to assume intuitively that this concept is
applicable only to the exchange of information between human beings. This assumption arises from the fact that the idea of
a “common” understanding seems to make no sense outside the realm of human consciousness. However, this could be a false premise
based upon a narrow use of the concept of understanding. Reaching a common understanding usually means reaching agreement.
This in turn implies that one must understand one another in the sense that each party can comprehend what the other party
intends to communicate. However, attaining a common understanding does not necessarily presuppose any reflection upon the nature or the subject of the communication process, nor does it raise the question of whether the contents of the communication are true or false. Rather, it requires only an exchange of information; that is, a number of messages sent in both directions – without, however, either party necessarily being aware of the meaning of what is being communicated.
There is thus a subtle difference in scope between “a reflected understanding” and “reaching a coordinated reaction.” If we
are for a moment willing to put aside the highly sophisticated forms of human understanding, and to operate with a concept
of understanding that encompasses only the objectives of achieving a coordinated reaction, then it becomes easy to see how
this concept is applicable to all levels of living matter. We thus have to concede that molecules, cells, bacteria, plants,
and animals have the ability to communicate. In this case, “communication” means neither more nor less than the reciprocal
harmonization and coordination of processes by means of chemical, acoustic, and optical signals.
9.2 About “understanding”
The foregoing arguments have taken me along a path that some philosophers of science have branded “naïve” naturalism. Their
criticism is directed especially at the idea that information can exist as a natural object, independently of human beings:
that is to say, outside
the myriad ways in which humans communicate. This charge of naturalism is heard from quite diverse philosophical camps. However,
all such critics share the conviction that only human language can be a carrier of information, and that the use of linguistic
categories in relation to natural phenomena is nothing more than a naturalistic fallacy. For representatives of this philosophical
position, any talk of information and communication in the natural sciences – as practiced especially in modern biology –
is no more than a metaphor that reveals, ultimately, a sadly uncritical usage of terms such as “language” and “understanding.”
Let us examine this controversy more closely and ask once more the question of what we actually understand by the word “understanding.”
The tautological way in which I express this question indicates that one can easily get into a vicious circle when trying
to approach the notion of understanding. This is because it seems generally to be the case that one can only understand something
if one has understood some other things. This plausible statement is central to philosophical hermeneutics, the best-known
and most influential doctrine of human understanding (Gadamer,
1965).
The hermeneutic thesis, according to which any understanding is bound to some other understanding, obviously refers to the
total “network” of human understanding in which any kind of understanding is embedded. In other words: any form of communication
presupposes some prior understanding, which provides the necessary basis for a meaningful exchange of information. In fact,
there seems to be no information in an absolute sense – not even as a plain syntactic structure – as the mere identification
of a sequence of signs as being “information” presupposes prior knowledge of signs and sequences of signs. In short:
information exists only in a relative sense – that is, in relation to some other information.
Thus, even if we adopt an information-theoretical point of view, there seems to be no obstacle to the hermeneutic circle,
according to which a person can only understand something if he has already understood something else. Nevertheless, this
perspective
contradicts the intentions of philosophical hermeneutics, which puts a completely different construction upon the hermeneutic
circle. Within the framework of this philosophy, the pre-understanding of any kind of human understanding is thought to be
rooted in the totality of human existence. And this ontological interpretation is intended to lead not to a relative but to
an absolute and true understanding of the world.
Moreover, because we use language to comprehend the world, the hermeneutic school regards language as the gate that opens for us access to our existence. The philosopher Hans-Georg Gadamer (
1965, p. 450) has expressed this in the often-quoted sentence: “Being that can be understood is language.” Even though some prominent
philosophers of the hermeneutic school assign a special role to dialogue, their concept of understanding still fails to possess
the objectivity and relativity that characterize a critical account of human understanding. On the contrary: a world
view that rests its claims to validity and truth exclusively upon the rootedness of understanding in human existence has moved
to the forefront and become the absolute norm for any understanding at all.
So, in contrast to the relativistic world picture offered to us by modern science, philosophical hermeneutics seeks to propagate
a fundamentalism of understanding that is centered firmly on the philosophical tradition of absolute understanding. Moreover,
if human language is considered to be a prerequisite for all understanding, human language becomes the ultimate reference
in our relation to the world.
The thesis of this chapter, which intends to give language a naturalistic interpretation that allows us to speak of the “language”
of genes, seems to be diametrically opposed to this position. According to the naturalistic interpretation, which is shared
by other biologists, language is a natural principle for the organization of complex systems, which – in the words of Manfred
Eigen (
1979, p. 181) – “can be analyzed in an abstract sense, that is, without reference to human existence.” From the standpoint of
philosophical hermeneutics,
such use of the word “language” is completely unacceptable. From this perspective, biologists who think and speak in this
way about the existence of a “molecular language” look like drivers thundering down the motorway in the wrong direction –
ignoring all the signposts naturally provided by human language for comprehending the world.
9.3 The “language” of genes
Impressive evidence for the naturalistic view of language seems to be found in the language-like arrangement of genetic information.
Thus, as is well known, the genetic alphabet is grouped into higher-order informational units, which in the genetic script
take over the functions of words, sentences, and so forth. And, like human language, genetic information has a hierarchical
structure, which is unfolded in a complex feedback mechanism – a process that shows all the properties of a communication
process between the genome and its physical context.
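The grouping into higher-order units can be made concrete with a small sketch of my own, using a fragment of the standard genetic code: triplets of bases (codons) act as “words” that name amino acids, and a gene reads like a “sentence” composed of such words.

```python
# A fragment of the standard genetic code: codon "words" -> amino acids.
CODON_TABLE = {
    "ATG": "Met",  # methionine; also the usual start signal
    "TTT": "Phe", "GCT": "Ala", "GGA": "Gly",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",  # sentence-ending signals
}

def read_words(dna: str):
    """Group the four-letter alphabet into codon 'words' and translate them."""
    codons = [dna[i:i + 3] for i in range(0, len(dna) - 2, 3)]
    return [CODON_TABLE.get(c, "?") for c in codons]

print(read_words("ATGTTTGCTGGATAA"))
# ['Met', 'Phe', 'Ala', 'Gly', 'STOP']
```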
Of course, the parallels break down if we try to use the full riches of human language as a measure of the “language-like”
structure of the genome. But from an evolutionary standpoint, there are good grounds to assert that “language” is indeed a
natural phenomenon, which originates in the molecular language of the genome and has found, in the course of evolution, its
hitherto highest expression in human language (Küppers,
1995). For evolutionary biologists, there is no question as to whether languages below the level of human language exist; the
issue is rather about identifying the general conditions under which linguistic structures originate and evolve.
The significance of the natural phenomenon “language” for the explanation of living matter was recognized and first expressed
with admirable clarity at the end of the nineteenth century by Friedrich Miescher, the discoverer of nucleic acids. Asking
how a substance such as a nucleic acid can generate the vast diversity of genetic structures, he drew an analogy to the principles
of stereochemistry. Just as – Miescher argued – a narrow variety of small molecular units is able to build up large molecules of almost unlimited complexity, chemically very similar but with very different structures in three dimensions, so the nucleic acids are capable of instructing the vast diversity of genetic
structures. This line of thinking led Miescher to the conclusion that the nucleic acids must be able to “express all the riches
and all the diversity of inheritance, just as words and ideas in all languages can be expressed in the 24–30 letters of the
alphabet” (Miescher,
1897, p. 116). Obviously Miescher’s view of living matter was that of a “linguistic movement” rather than that of a “clockwork
machine.” However, the “linguistic movement” of living matter is not a dead system of rules, but a dynamic one.
So, is this all just metaphoric speech? An outside observer, watching the disputants from a distance, might wonder what the
controversy is all about, and might even suspect that it was a typical philosophers’ war over the meaning of words. Our observer
would be certain to draw attention to the fact that we repeatedly take words out of their original context and transpose them
into another, so that any discourse about the world of nature is bound to employ metaphors, at least to a certain extent.
Why, then, should we not simply regard terms such as “information,” “communication,” and “language” in biology as what they
really are: namely, adequate and highly resilient media for the description of the phenomena of life? Do the recent spectacular
successes at the interface between biotechnology and information technology not justify the use of these concepts in biology?
The construction of bio-computers, the development of genetic algorithms, the simulation of cognitive processes in neural
networks, the coupling of nerve cells to computer chips, the generation of genetic information in evolution machines – all
these would scarcely be conceivable without the information-theoretical foundations of living matter provided by biology.
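By way of illustration only – the text above merely names genetic algorithms as one such field – the following minimal sketch shows the basic loop of such an algorithm, in which “genetic information” is a bitstring shaped by mutation and selection; the target and all parameters are arbitrary toy choices.

```python
# Minimal genetic-algorithm sketch; target and parameters are toy choices.
import random

TARGET = [1] * 20                       # hypothetical optimum bitstring

def fitness(genome):
    """Count positions where the genome matches the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    """Flip each bit independently with the given probability."""
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)   # selection pressure
    survivors = population[:15]                  # fitter half survives
    population = survivors + [mutate(random.choice(survivors)) for _ in range(15)]

print(max(map(fitness, population)), "of", len(TARGET))
```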
However, the foregoing questions cannot be disposed of with simple arguments. This is above all because “information,”
“communication,” and “language” are charged with other notions such as “meaning,” “value,” “truth,” and the like. And this
is where we run into the real nub of the discussion. Phenomena associated with meaning, as expressed in the semantic dimension
of information, appear to evade completely all attempts to explain them on a naturalistic basis, and thus also to escape scientific
description.
The right to interpret phenomena of meaning has traditionally been claimed by the humanities: especially by its hermeneutic
disciplines. They have placed meaning, and thus also the understanding of meaning, at the center of their methodology; a clear
demarcation against the natural sciences may indeed have been one of the motives for this. Whatever the reasons, the humanities
have long gone their own way, have not considered it necessary to subject themselves to the scientific method of causal analysis
– and have thus retained their independence for a considerable length of time.
The question of how broadly the concept of information may be applied is thus by no means a dispute about the content and
the range of the applicability of a word. It would be truer to regard this question as the focal point at which philosophical
controversies about the unity of knowledge converge – debates that have determined the relationship of the humanities and
the natural sciences for more than a hundred years. The biological sciences, which stand at the junction between these two
currents of thought, are always the first to get caught in the crossfire. This is because an information-theoretical account
of living matter involving a law-like explanation necessarily introduces questions of meaning and, thus, the semantic aspect
of information (Küppers,
1996). Furthermore, the introduction of the semantic aspect of information in turn leads to the most fascinating plan-like and
purpose-like aspects of living matter, which have every appearance of overstretching the capacity of traditional scientific
explanation. Are, then, physical explanations – and with them the entire reductionistic research program in biology – doomed
to founder on the semantic aspect of information?
9.4 The semantic dimension of information
Our discussion up to now has suggested that semantic information is “valued” information. The value of information is, however,
not an absolute quantity; rather, it can only be judged by a receiver. Thus, the semantics of information depend fundamentally
upon the state of the receiver. This state is determined by the receiver’s prior knowledge, prejudices, expectations, and so forth.
In short: the receiver’s evaluation scale is the result of a particular, historically unique, pathway of experiences. Can
– we may persist in asking – the particular and individual aspects of reality ever become the object of inquiry in a science
based upon general laws and universal concepts? Even Aristotle addressed this important question. His answer was a clear “No.”
For him – the logician – there were no general discoveries to be made about things that were necessarily of an individual
nature, because the logic of these two attributes – general and particular – made them mutually exclusive. This view has persisted
through to our age, and has left a deep mark upon our present-day understanding of what science is and does.
Under these circumstances, the achievement of the philosopher Ernst Cassirer appears all the more admirable. Opposing the
Aristotelian tradition, Cassirer attempted to bridge the presumed gap between the general and the particular (Cassirer,
1910). Particular phenomena, he argued, do not become particular because they evade the general rules, but because they stand
in a particular – that is, singular – relationship to them. Cassirer’s reflections may have been triggered by an
aperçu of von Goethe (1981, p. 433): “The general and the particular coincide – the particular is the general as it appears under
various conditions.”
According to Cassirer, it is the unique constellation of general aspects of a phenomenon that makes up its uniqueness. This
is an interesting idea. It makes clear that an all-embracing theory of semantic information is impossible, whereas general
aspects of semantics can very well be discerned. Taken for themselves, these aspects may never completely embrace the phenomenon
in question.
But through their unique interconnectedness, they allow the particular characteristics of the phenomenon to show through clearly.
In other words: the unique properties of semantic information originate through the superposition of its general aspects. The aspects that constitute semantic information in this sense include, among others, its novelty and its pragmatic relevance as well as its complexity (Küppers,
1996).
At the beginning of the 1950s, the philosophers and logicians Yehoshua Bar-Hillel and Rudolf Carnap (
1953) tried to quantify the meaning of a linguistic expression in terms of its novelty value. This idea was a direct continuation
of the concept developed within the framework of Shannon’s information theory, where the information content of a message
is coupled to its probability of occurrence: the less expected a message, the higher its novelty and thus its information content. This approach reflects the fact that an important task of information is to eliminate or counteract uncertainty.
However, the examples adduced by Bar-Hillel and Carnap are restricted to an artificial language.
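The underlying Shannon relation can nevertheless be stated exactly. The short sketch below uses the standard surprisal formula I(m) = -log2 p(m); it is not a reconstruction of Bar-Hillel and Carnap’s own semantic measure.

```python
# Shannon surprisal: the less probable (less expected) a message,
# the higher its novelty and thus its information content in bits.
import math

def surprisal_bits(p: float) -> float:
    """Information content of a message received with probability p."""
    return -math.log2(p)

for p in (0.5, 0.1, 0.01):
    print(f"p = {p:>4}:  {surprisal_bits(p):.2f} bits")
# p =  0.5:  1.00 bits
# p =  0.1:  3.32 bits
# p = 0.01:  6.64 bits
```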
A more powerful approach to measuring the semantics of information is that based upon its pragmatic relevance. This approach
has been described in a paradigmatic way by Donald MacKay (1969) in his book Information, Mechanism and Meaning. The pragmatic aspect of information refers to the action(s) of the receiver to which the information leads, or in which
it results.
For some time now, my own efforts have been focused on a new approach, intended to investigate the complexity of semantic
information (Küppers,
1996). Unlike the approaches described above, this one does not seek to make the meaning of information directly amenable to measurement.
Rather, it aims to demarcate the most general conditions that make up the essence of semantic information. Investigations
of this kind are important because they afford a more general insight into the question of the origin of information, and
therefore have consequences for major fundamental problems of biology such as the origin and evolution of life (Küppers,
2000a).
9.5 How does information originate?
Let us consider the relationship between semantic information and complexity in more detail. Information, as we have said,
is always related to an entity that receives and evaluates the information. This in turn means that evaluation presupposes
some other information that underlies the process of registration and processing of the incoming information. But how much
information is needed in order to understand, in the foregoing sense, an item of incoming information? This question expresses
the quantitative version of the hermeneutic thesis, according to which a receiver can only understand some piece of information when it has already understood some other information.
At first sight, it would seem impossible to provide any kind of answer to this question since it involves the concept of understanding,
which, as we have seen, is already difficult to understand by itself, let alone to quantify. Surprisingly, however, an answer
can be given, at least if we restrict ourselves to the minimal conditions for understanding. First among these is the sheer registration, by the receiver, of the information that is to be understood. If the information concerned conveys meaning – and is thus information of maximal complexity – then the receiver must obviously record its entire symbol sequence before the
process of understanding can begin. Thus, even the act of recording involves information of the same degree of (algorithmic)
complexity as that of the symbol sequence that is to be understood.
This surprising result is related to the fact that information conveying meaning cannot be compressed without change in, or
even loss of, its meaning. It is true that the contents of a message can be shortened into a telegram style or a tabloid headline;
however, this always entails some loss of information. This is the case for any meaningful information: be it a great epic
poem or simply the day’s weather report. Viewed technically, this means that no algorithms – that is, computer programs –
exist that can extrapolate from arbitrarily chosen parts of the message and thus generate the rest of it. But if there are no meaning-generating algorithms, then no information can arise de novo. Therefore, to understand
a piece of information of a certain complexity, one always requires background information that is at least of the same complexity.
This is the sought-after answer to the question of how much information is needed to understand some other information. Ultimately,
it implies that there are no “informational perpetual-motion machines” that can generate meaningful information out of nothing
(Küppers,
1996).
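The incompressibility argument can be made tangible in a rough sketch of my own, in which a general-purpose compressor stands in – imperfectly – for algorithmic complexity (which is itself not computable): a regular sequence, generated by a tiny rule, shrinks to almost nothing, whereas ordinary meaningful prose resists compression far more.

```python
# Rough illustration: compressibility as a crude proxy for algorithmic
# complexity. zlib is a stand-in; Kolmogorov complexity is uncomputable.
import zlib

def compressed_size(text: str) -> int:
    return len(zlib.compress(text.encode("utf-8"), 9))

# A regular sequence: the short rule "repeat 'ab'" generates all of it.
regular = "ab" * 300

# Ordinary meaningful prose: no comparably short rule reproduces it.
prose = ("The courier reached the valley at dusk, counted the crates "
         "twice, and left a note saying the bridge upstream had washed "
         "out, so the morning delivery would come by the eastern road; "
         "nobody in the village had expected the river to rise so fast "
         "after one night of rain, and the miller alone had moved his "
         "stores to higher ground before the water reached the square.")

for label, text in (("regular", regular), ("prose", prose)):
    print(f"{label}: {len(text)} chars -> {compressed_size(text)} bytes")
```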
This result is the consequence of a rigorous relativization of the concept of information. It is a continuation of the development
that characterized the progress of physics in the last century: the path from the absolute to the relative. This began with
the abandoning of basic concepts that had been understood in an absolute sense – ideas such as “space,” “time,” and “object”
– and has since led to well-known and far-reaching consequences for the foundations of physics. Whether the thorough-going
relativization of the concept of information will one day lead to a comparable revolution in biological thinking cannot at
present be said. This is largely due to the fact that the results up to now have been derived with respect to the semantic
dimension of human language, and it is not yet clear to what extent they are applicable to the “language of genes.” For this
reason, questions such as whether evolution is a sort of perpetual-motion machine must for the present remain open.
At least this much is certain: we must take leave of the idea of being able, one day, to construct intelligent machines that
spontaneously generate meaningful information de novo and continually raise its complexity. If information always refers to
other information, can then information in a genuine sense ever be generated? Or are the processes by which it arises in nature
or in society nothing more than processes of transformation: that is, translation and re-evaluation of information, admittedly
in an information space of gigantic dimensions, so that the result always seems to be new and unique? Questions such as these
take us to the frontline of fundamental research, where
question after question arises, and where we have a wealth of opportunities for speculation but no real answers.
9.6 The world of abstract structures
Finally, I should like to return briefly to the question with which we began: Are the ideas of “information,” “communication,”
and “language” applicable to the world of material structures? We saw how difficult it is to decide this on a philosophical
basis. But it may also be the case that the question is wrongly put. There does indeed seem to be a surprising solution on the way: one prompted by current scientific developments. In the last few decades, at the border between the natural sciences and the humanities, a new scientific domain has been emerging that has been termed “structural sciences” (Küppers, 2000b). Alongside information theory, it encompasses important disciplines such as cybernetics, game theory, system theory, complexity theory, network theory, synergetics, and semiotics, to mention but a few. The object of the structural sciences is the way in which reality is structured – expressed, investigated, and described in abstract form. This is done irrespective of whether these structures occur in a natural or an artificial, a living or a non-living, system. Accordingly, “information,” “communication,” and “language” can be treated within the structural sciences as abstract structures, without the question of
their actual nature being raised. By considering reality only in terms of its abstract structures, without making any distinction
between objects of “nature” and “culture,” the structural sciences build a bridge between the natural sciences and the humanities
and thus have major significance for the unity of science (Küppers,
2000b).
In philosophy, the structural view of the world is not new. Within the framework of French structuralism, Gilles Deleuze took
the linguistic metaphor to its limit when he said that “There are no structures that are not linguistic ones … and objects
themselves only have structure in that they conduct a silent discourse, which is the language of signs” (Deleuze,
2002, p. 239). Seen from this
perspective, Gadamer’s dictum “Being that can be understood is language” (Gadamer,
1965, p. 450) takes on a radically new meaning: “Being” can only be understood when it already has a linguistic structure. Pursuing
this corollary, the philosopher Hans Blumenberg (
2000), in a broad review of modern cultural history, has shown that – and how – the linguistic metaphor has made possible the
“readability” (that is, the understanding) of the world. However, the relativity of all understanding has of necessity meant
that the material “read” was reinterpreted over and over again, and that the course of time has led to an ever more accurate
appreciation of which “readings” are wrong. In this way, we have approached, step by step, an increasingly discriminating
understanding of the reality surrounding us.
References
Bar-Hillel, Y., and Carnap, R. (1953). Semantic information. British Journal for the Philosophy of Science, 4: 147.
Blumenberg, H. (2000). Die Lesbarkeit der Welt. Frankfurt/Main: Suhrkamp.
Cassirer, E. (1910). Substanzbegriff und Funktionsbegriff. Berlin: Bruno Cassirer.
Deleuze, G. (2002). À quoi reconnaît-on le structuralisme? In L’Île déserte et autres textes, ed. D. Lapoujade. Paris: Minuit, 238–269.
Eigen, M. (1979). Sprache und Lernen auf molekularer Ebene. In Der Mensch und seine Sprache, eds A. Peisl and A. Mohler. Frankfurt/Main: Propyläen Verlag, 181–218.
Gadamer, H.-G. (1965). Wahrheit und Methode, 2nd ed. Tübingen: J. C. B. Mohr.
Küppers, B.-O. (1990). Information and the Origin of Life. Cambridge, MA: The MIT Press.
Küppers, B.-O. (1992). Understanding complexity. In Emergence or Reduction? Eds A. Beckermann, H. Flohr and J. Kim. Berlin: de Gruyter, 241–256 [reprinted in Chaos and Complexity, eds R. J. Russell, N. Murphy and A. R. Peacocke (1995). Vatican City State: Vatican Observatory Publications, 93–105].
Küppers, B.-O. (1995). The context-dependence of biological information. In Information. New Questions to a Multidisciplinary Concept, eds K. Kornwachs and K. Jacoby. Berlin: Akademie Verlag, 135–145.
Küppers, B.-O. (1996). Der semantische Aspekt von Information und seine evolutionsbiologische Bedeutung. Nova Acta Leopoldina, 294: 195–219.
Küppers, B.-O. (2000a). The world of biological complexity: Origin and evolution of life. In Many Worlds, ed. S. J. Dick. Pennsylvania: Templeton Foundation Press, 31–43.
Küppers, B.-O. (2000b). Die Strukturwissenschaften als Bindeglied zwischen Natur- und Geisteswissenschaften. In Die Einheit der Wirklichkeit, ed. B.-O. Küppers. Munich: Fink Verlag, 89–105.
MacKay, D. M. (1969). Information, Mechanism and Meaning. Cambridge, MA: The MIT Press.
Miescher, F. (1897). Die histochemischen und physiologischen Arbeiten. Bd. I. Leipzig: F. C. W. Vogel.
von Goethe, J. W. (1981). Werke, Hamburger edition vol. 12. München: C. H. Beck.