Saturday, July 14, 2012

Of Paradoxes and Paradigm Shifts: Entropy and Negentropy of Living Systems


As a former editor / publisher for the Truth Seekers (now Open Mind), most of the articles in the eNewsletter were written to elaborate on UFOs and other anomalous experiences in connection with this group, which holds the belief that "life on other worlds" does exist -- possibly as multiple universes, macrocosms whose inhabitants are microcosms.

     Specific to chaos theory is entropy, which is a measure of disorder or, more precisely, unpredictability. To illustrate with a down-to-earth analogy: a series of coin tosses with a fair coin has maximum entropy, since there is no way to predict what will come next. By contrast, a string of coin tosses with a two-headed coin has zero entropy, since the coin will always come up heads. Collections of data in the real world typically fall somewhere between these two extremes (between 0 and 1 bit per toss, in the coin analogy). It is also important to distinguish the entropy of a set of possible outcomes from the entropy of a particular outcome. A single toss of a fair coin has an entropy of one bit, but a particular result (e.g. "heads") has zero entropy, since it is entirely "predictable".
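To make the coin example concrete, here is a minimal sketch (my own illustration, not part of the source material; the function name and probabilities are simply chosen for this example) that computes Shannon entropy for a discrete distribution:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit (maximum unpredictability)
print(shannon_entropy([1.0]))       # two-headed coin: 0.0 bits (always heads)
print(shannon_entropy([0.7, 0.3]))  # a more "real world" case: about 0.88 bits, between 0 and 1
```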


      Similarly, in information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the "Shannon entropy," which quantifies the expected value of the information contained in a message, usually in units such as bits. Here, a 'message' means a specific realization of the random variable. The foundational reference is Claude E. Shannon's 1948 paper "A Mathematical Theory of Communication".
     Shannon's entropy represents an absolute limit on the best possible lossless data compression of any communication. Lossless data compression is the class of data compression algorithms that allows the exact original data to be reconstructed from the compressed data. The term lossless is in contrast to lossy data compression, which only allows an approximation of the original data to be reconstructed, in exchange for better compression rates.
      Lossless data compression is used in many applications. For example, it is used in the ZIP file format and in the Unix tool gzip. It is also often used as a component within lossy data compression technologies (e.g. lossless mid/side joint stereo preprocessing by the LAME MP3 encoder and other lossy audio encoders). Lossless compression is used in cases where it is important that the original and the decompressed data be identical, or where deviations from the original data could be deleterious. Typical examples are executable programs, text documents and source code. Some image file formats, like PNG or GIF, use only lossless compression, while others like TIFF and MNG may use either lossless or lossy methods.  
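As a quick illustration of "lossless" in practice, the following sketch (my addition, assuming a Python environment) uses the standard zlib module, which implements the same DEFLATE family of compression used by the ZIP format and gzip, and verifies that the exact original data comes back:

```python
import zlib

original = b"negentropy " * 1000          # repetitive data compresses very well
compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(len(original), len(compressed))     # 11000 bytes vs. a few dozen bytes
assert restored == original               # lossless: bit-for-bit identical
```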
     Lossless audio formats are most often used for archiving or production purposes, with smaller lossy audio files typically used on portable players and in other cases where storage space is limited and/or exact replication of the audio is unnecessary. Under certain constraints (treating the messages to be encoded as a sequence of independent and identically distributed random variables), Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.
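The source coding theorem mentioned above can also be sketched numerically. The function below is my own example (the fair eight-sided die is made up for illustration); it divides the source entropy in bits by log2 of the target alphabet size to get the minimum average number of target symbols needed per source symbol:

```python
import math

def min_symbols_per_message(probs, alphabet_size):
    """Shannon lower bound: entropy (bits) divided by log2 of the target alphabet size."""
    h_bits = sum(-p * math.log2(p) for p in probs if p > 0)
    return h_bits / math.log2(alphabet_size)

# A fair eight-sided die needs at least 3 binary symbols per outcome on average,
# or 1.5 symbols if the target alphabet has four symbols instead of two.
print(min_symbols_per_message([1/8] * 8, 2))  # 3.0
print(min_symbols_per_message([1/8] * 8, 4))  # 1.5
```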
      Returning to the coin-toss example in information-theoretic terms: a single toss of a fair coin has an entropy of one bit, two tosses have an entropy of two bits, and the entropy rate for the coin is one bit per toss. However, if the coin is not fair, then the uncertainty is lower (if asked to bet on the next outcome, we would bet preferentially on the most frequent result), and thus the Shannon entropy is lower. Mathematically, a single coin flip (fair or not) is an example of a "Bernoulli trial," and its entropy is given by the binary entropy function. A series of tosses of a two-headed coin will have zero entropy, since the outcomes are entirely predictable. The entropy rate of English text is between 1.0 and 1.5 bits per letter, or as low as 0.6 to 1.3 bits per letter, according to estimates by Shannon based on human experiments.
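Since the paragraph above leans on the binary entropy function and on bits-per-letter estimates, here is a small sketch (my illustration; the sample sentence and the crude single-letter frequency estimate are my own and are not Shannon's human-prediction experiment):

```python
import math
from collections import Counter

def binary_entropy(p):
    """Entropy in bits of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0                      # two-headed (or two-tailed) coin: no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))              # fair coin: 1.0 bit per toss
print(binary_entropy(0.9))              # biased coin: about 0.47 bits per toss

def unigram_bits_per_letter(text):
    """Empirical entropy of the single-character frequency distribution."""
    counts = Counter(text)
    n = len(text)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

sample = "what is life and how do living systems keep their entropy low"
print(unigram_bits_per_letter(sample))  # a few bits per character for this crude estimate
```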
      Conversely, negentropy (or syntropy) in a living system is the entropy that the system exports to keep its own entropy low; the concept lies at the intersection of entropy and life. The phrase "negative entropy" was introduced by Erwin Schrödinger in his 1944 popular-science book "What is Life?". Léon Brillouin later shortened the phrase to negentropy, to express it in a more "positive" way: a living system imports negentropy and stores it. Of particular note, Albert Szent-Györgyi proposed in 1974 replacing the term negentropy with syntropy, a term that may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller also tried to popularize syntropy, though negentropy remains the more common term for living systems.
      In the book, "What is Life?" Schrödinger explained in his own words:

" [...] if I had been catering for them [physicists] alone I should have let the discussion turn on "free energy" instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things. "

      Nonetheless, in 2009, Mahulikar & Herwig redefined negentropy for dynamically ordered sub-systems as the "specific entropy deficit" of the ordered sub-system relative to its surrounding chaos. Thus, negentropy has units of J/(kg·K) when defined based on specific entropy per unit mass, and 1/K when defined based on specific entropy per unit energy; a small numerical sketch follows the list below. This definition enabled:

i) scale-invariant thermodynamic representation of dynamic order existence,

ii) formulation of physical principles exclusively for dynamic order existence and evolution,

iii) mathematical interpretation of Schrödinger's negentropy debt.
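As a rough numerical reading of the Mahulikar & Herwig definition, the sketch below is my own, with made-up specific-entropy values purely for illustration; it treats negentropy as the specific entropy deficit of the ordered sub-system relative to its surroundings:

```python
def specific_entropy_deficit(s_surroundings, s_subsystem):
    """Negentropy in J/(kg*K): surrounding specific entropy minus the ordered sub-system's."""
    return s_surroundings - s_subsystem

# Hypothetical specific entropies [J/(kg*K)], chosen only to illustrate the definition.
s_chaos = 3900.0     # assumed specific entropy of the surrounding (disordered) environment
s_ordered = 3700.0   # assumed specific entropy of the dynamically ordered sub-system
print(specific_entropy_deficit(s_chaos, s_ordered))  # 200.0 J/(kg*K) of negentropy
```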

Therefore, as a "new scientist" and as an inventor / pyramidologist of the PyramiTronix Resonator, the significance of multiple cultures and multiple disciplines surviving in a multiple universe suggests that, as "diverse living systems," we as individuals, in relation to other beings, exist in multiple universes as microcosms, and that within each universe, as a macrocosm, we will encounter entropy, such that each living system thrives or dies according to its capacity. The paradoxes ultimately exist as different problems in different dimensions of time and space, and these paradoxes suggest the potential to evolve and grow; their solutions would be the paradigm shifts needed to avoid entropy. Thus the solutions necessary for change are inherent in multiple cultures and multiple disciplines, which need to base their survival on negentropy. This of course implies that there are multiple universes, which perpetuates the belief that life exists beyond our solar system -- a macrocosm within a macrocosm? Perhaps the answer to "What is Life?" that Schrödinger was compelled to write about needs to be revised and / or given a new chapter as a paradigm shift?

Question: What would that chapter be about? I would surmise blogs as "aggregate mutual connections" that discuss advancements in science and technology as paradigm shifts. Some blog aggregate sites could be "dynamic sub-systems" that carry negentropy, allowing individuals to promote advancements in science and technology as new scientists. The aggregate sites that support this blog are listed among the "top Internet sites."
