Shannon argued that information is a quantity (entropy,
bits) and communication is a channel (capacity, noise,
coding). A Shannonian argument reframes a messy domain
problem as a coding-and-channel question: what is the
source, what is the noise, what is the maximum rate, what
is the optimal encoding? He gave cryptography, statistical
mechanics, neuroscience, and molecular biology a shared
formal language. Methodologically, he privileges the clean
mathematical statement. The noisy-channel coding theorem is
his template for an ideal scientific result: a precise
upper bound, achievable in the limit, with a proof. A
Shannon-claimant in a debate will press: what
is the channel, what is the entropy, can you put a
bound on what is achievable? His characteristic move is
to translate a substantive question into an information-
theoretic one and read off a bound. Weakness: the
information-theoretic framing sometimes misleads when
semantics and meaning matter, not just bit counts.
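The characteristic move — reframe the problem, then read off a bound — can be illustrated with a minimal sketch. The helper names (`binary_entropy`, `bsc_capacity`) are hypothetical, chosen for this example; the math is the standard capacity formula for a binary symmetric channel, C = 1 - H(p):

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a Bernoulli(p) source: H(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob: float) -> float:
    """Capacity of a binary symmetric channel that flips each bit
    independently with probability flip_prob: C = 1 - H(p)."""
    return 1.0 - binary_entropy(flip_prob)

# The Shannonian reading: a channel that corrupts 11% of bits still
# supports reliable communication at any rate below its capacity,
# here roughly half a bit per channel use.
print(bsc_capacity(0.11))
```

The point of the sketch is the shape of the answer, not the number: a messy question ("how much can we send over this wire?") becomes a precise, provable upper bound.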