ENTROPY
- In classical physics, the entropy of a physical system is proportional to the quantity of energy no longer available to do physical work. Entropy is central to the second law of thermodynamics, which states that in an isolated system any activity increases the entropy.
- In quantum mechanics, von Neumann entropy extends the notion of entropy to quantum systems by means of the density matrix.
- In probability theory, the entropy of a random variable measures the uncertainty about the value that might be assumed by the variable.
- In information theory, the compression entropy of a message (e.g. a computer file) quantifies the information content carried by the message in terms of the best lossless compression rate.
- In the theory of dynamical systems, entropy quantifies the exponential complexity of a dynamical system or the average flow of information per unit of time.
- In sociology, entropy is the natural decay of structure (such as law, organization, and convention) in a social system.
- In the common sense, entropy means disorder or chaos.
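The probability-theory and information-theory notions in the list above can be illustrated with a small computation. The sketch below (not part of the original article; the function name and sample strings are illustrative) estimates the empirical Shannon entropy of a message in bits per symbol from its symbol frequencies:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Empirical Shannon entropy of a message, in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    # H = -sum over symbols of p * log2(p), with p the symbol frequency.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Four equally frequent symbols need exactly 2 bits per symbol.
print(shannon_entropy("abcd" * 100))  # → 2.0
# A constant message carries no information.
print(shannon_entropy("aaaa" * 100))  # → 0.0
```

This value is also the lower bound on the average number of bits per symbol achievable by any lossless compressor on messages drawn from that symbol distribution, which is the "best lossless compression rate" interpretation mentioned above.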
Figure 1: In a naive analogy, energy in a physical system may be compared to water in lakes, rivers and the sea. Only the water that is above sea level can be used to do work (e.g. drive a turbine). Entropy represents the water contained in the sea.
Article by:
For the full article, please visit the address below:
http://www.scholarpedia.org/article/Entropy#Shannon_entropy
Tomasz Downarowicz (2007) <a href="http://www.scholarpedia.org/article/Entropy">Entropy</a>. <a href="http://www.scholarpedia.org/">Scholarpedia</a>, 2(11):3901.