In information theory, a communication system is modeled by a transmitter, a channel, and a receiver. The transmitter produces messages that are sent through the channel; the channel alters the message in some way, for example by adding noise; and the receiver attempts to infer which message was sent. In this context, entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message. 'Messages' can be modeled by any flow of information.
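As a concrete sketch of "expected value of the information per message": the Shannon entropy of a source, in bits, is H = -Σ p·log₂(p) over its symbol probabilities. The function below is a minimal illustration (the name `shannon_entropy` is our own, not standard library API):

```python
import math

def shannon_entropy(probs):
    """Expected information content, in bits, of a source
    whose symbols occur with the given probabilities."""
    # Each symbol with probability p carries -log2(p) bits of information;
    # entropy is the probability-weighted average of that quantity.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: each flip conveys 1 bit on average.
print(shannon_entropy([0.5, 0.5]))
# A heavily biased coin conveys less than 1 bit per flip.
print(shannon_entropy([0.9, 0.1]))
```

A uniform distribution over 2ⁿ symbols gives exactly n bits, the maximum possible; any bias lowers the entropy, reflecting that predictable messages carry less information.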