22:00 Entropy in classical information theory is better thought of as "missing information" rather than "information"; in fact, that is the term Shannon originally used. In those terms, it makes sense that entropy is maximal when all the components are independent. Events observed from such a system carry a higher "missing information" burden (the information needed to encode the event, which you did not possess before observing it), since no component carries any information about the others.
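A minimal sketch of this point (not from the video, just an illustration): two fair bits with the same marginals have maximal joint entropy when independent, and lower joint entropy when correlated, because each observed outcome then needs fewer bits to encode on average.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Two fair bits, independent: joint distribution uniform over 4 outcomes
indep = [0.25, 0.25, 0.25, 0.25]

# Same fair marginals, but fully correlated (X always equals Y)
corr = [0.5, 0.0, 0.0, 0.5]

print(entropy(indep))  # 2.0 bits -- maximal: neither bit tells you about the other
print(entropy(corr))   # 1.0 bit  -- correlation halves the "missing information"
```

The independent case costs 2 bits per observation; the correlated case only 1, since learning one bit determines the other.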