
what is information?

There is a spectrum of different ways of understanding what information is. These range from quantitative analyses–which treat information in terms of uninterpreted patterns of data–to qualitative analyses, which treat information semantically, as a matter of meaningful structures. What is the relation between these different approaches? Are there invariants underlying the different uses of the term ‘information’ in disciplines such as computer science, economics, genetics, neuroscience, physics, and statistics? Is there some sort of logic, which transcends these different uses? How, for example, are we able to transform quantitative information deriving from meteorological sensors into the sorts of qualitative information that is useful for human decision-making? According to Shannon and Weaver, information refers to the degree of uncertainty present in a message. Does a view along these lines provide a viable starting point for a unified analysis of information, or must we look elsewhere? This issue of The Monist will take questions such as these as a basis for fostering new research in the philosophy of information.
- call for papers at The Monist, advisory editor Luciano Floridi (due April 30, 2016)
In case we want to put The Monist in our sights again: this issue looks related to the idea of a "continuous information theory" that I've been blathering about from time to time. I haven't yet done enough background research to say whether that idea is new, or how it relates to other field theories.
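To pin down the quantitative end of the spectrum the call describes, here is a minimal sketch (in Python, with illustrative function names of my own choosing) of Shannon's measure of uncertainty for a discrete message source, alongside its continuous analogue, differential entropy, which is roughly the territory a "continuous information theory" would have to engage with:

```python
import math

def shannon_entropy(probs):
    # H(X) = -sum_x p(x) * log2 p(x), measured in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gaussian_differential_entropy(sigma):
    # Differential entropy of a Gaussian with standard deviation sigma:
    # h(X) = 0.5 * log2(2 * pi * e * sigma^2), also in bits.
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

# A fair coin carries one bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))          # 1.0
print(shannon_entropy([0.9, 0.1]))          # ~0.47

# The continuous analogue is scale-dependent and can even go negative,
# which is one reason the discrete and continuous pictures don't simply
# merge into a single notion of "amount of information".
print(gaussian_differential_entropy(1.0))   # ~2.05
print(gaussian_differential_entropy(0.1))   # ~-1.27
```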

About Floridi -- here are some of his works:

Floridi, Luciano. The Philosophy of Information. Oxford: Oxford University Press, 2011.
—. The Fourth Revolution: How the Infosphere is Reshaping Human Reality. Oxford: Oxford University Press, 2014.
—. Information: A Very Short Introduction. Oxford: Oxford University Press, 2010.

And another shorter one that I looked through: 

Floridi, Luciano. "Open Problems in the Philosophy of Information." Metaphilosophy 35, no. 4 (2004).
