The philosophy of biology is a subfield of philosophy of science, which deals with epistemological, metaphysical, and ethical issues in the biological and biomedical sciences. Although philosophers of science and philosophers generally have long been interested in biology, philosophy of biology emerged as an independent field of philosophy in the 1960s and 1970s.

During that time, philosophers of science began paying increasing attention to developments in biology, from the rise of neo-Darwinism in the 1930s and 1940s to the discovery of the structure of DNA in 1953 and more recent advances in genetic engineering. Other ideas, such as the reduction of all life processes to biochemical reactions and the incorporation of psychology into a broader neuroscience, were also addressed.

Biologists with philosophic interests responded, emphasising the dual nature of the living organism: on the one hand the genetic program, the genotype; on the other its extended body or soma, the phenotype. In accommodating the more probabilistic and non-universal nature of biological generalisations, it helped that standard philosophy of science was already in the process of accommodating similar aspects of 20th-century physics.


Mental energy is the concept of a principle of activity powering the operation of the mind, soul or psyche. Energy here is used in its literal sense of an activity or operation. Mental energy has been defined as the driving force of the psyche, emotional as well as intellectual.

Just as physical energy acts upon physical objects, psychological energy acts upon psychological entities, or thoughts. Psychological energy and force are the basis of an attempt to formulate a scientific theory in which psychological phenomena would be subject to precise laws, akin to the way physical objects are subject to the laws of physics. This concept of psychological energy is completely separate and distinct from the mystical Eastern concept of spiritual energy.

In The Ego and the Id, Freud argued that the id was the source of the personality’s desires, and therefore of the psychic energy that powered the mind. Freud defined libido as the instinct energy or force, and later added the death drive, also contained in the id, as a second source of mental energy. In 1928, Carl Jung published a seminal essay entitled On Psychic Energy. Later, the theory of psychodynamics and the concept of psychic energy were developed further.

Studies have found that mental effort can be measured in terms of increased metabolism in the brain. Mental energy has been repeatedly compared to or connected with quantitative physical energy. The concept of psychodynamics was proposed with the idea that all living organisms are energy systems, governed by the same principles as physical energy.


The egg of Li Chun refers to a Chinese folk belief that it is much easier to balance an egg on a smooth surface during Li Chun, the official first day of spring in the traditional Chinese calendar, which usually falls on February 4 or 5, than at any other time of the year. Balancing fresh chicken eggs on their broad end was a traditional Li Chun ritual in China.

In 1945, Life magazine reported on an egg-balancing craze among the population of Chungking on that year’s Li Chun. That article and its follow-ups started a similar egg-balancing mania in the United States, but transposed to the astronomical vernal equinox in March. Japanese newspapers picked up the story in 1947. In 1978, New York artist Donna Henes started organizing egg-balancing ceremonies with the stated goal of bringing about world peace and international harmony.

As far as science knows, no physical influence of other celestial bodies on the egg can affect its balance as required by the folk belief. Gravitational and electromagnetic forces, in particular, are considerably weaker and steadier than the forces created by the person’s hand and breathing.

In 1947, Japanese physicist Ukichiro Nakaya verified experimentally that eggs in fact can be balanced with ease at any time of the year. He noticed that the shell of an egg usually has many small bumps and dimples, so that, by turning the egg in different directions, it can be made to touch a flat surface on three points at once. It is not hard to find an orientation such that the triangle spanned by the three contact points lies right under the egg’s center of mass, which is the condition for balancing any object. Of course, balancing an egg on a rough surface is easy too, for the same reason.
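Nakaya’s three-point condition can be checked numerically. The sketch below uses hypothetical contact coordinates, not measured data, and tests whether the vertical projection of the center of mass falls inside the triangle spanned by the three contact points, using the signs of cross products:

```python
# Balance test for an object resting on three contact points: it is stable
# when the vertical projection of its center of mass (COM) lies inside the
# triangle of contact points. Coordinates are hypothetical, in millimetres.

def cross_sign(o, a, b):
    # z-component of the cross product (a - o) x (b - o)
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def is_balanced(contacts, com_xy):
    """contacts: three (x, y) contact points; com_xy: projected COM."""
    a, b, c = contacts
    d1 = cross_sign(com_xy, a, b)
    d2 = cross_sign(com_xy, b, c)
    d3 = cross_sign(com_xy, c, a)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)  # all one sign: point inside triangle

contacts = [(0.0, 0.0), (4.0, 0.5), (2.0, 3.5)]  # three bumps on the shell
print(is_balanced(contacts, (2.0, 1.3)))  # COM over the triangle -> True
print(is_balanced(contacts, (6.0, 6.0)))  # COM outside -> False
```

Turning the egg amounts to searching for a contact triangle for which this test succeeds.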

Martin Gardner also observed that if you are convinced that an egg will balance more easily on a certain day you will try a little harder, be more patient, and use steadier hands. If you believe that eggs won’t balance on other days, this belief is transmitted subconsciously to your hands.


In physics and cosmology, the anthropic principle is the collective name for several ways of asserting that physical and chemical theories, especially astrophysics and cosmology, need to take into account that there is life on Earth, and that one form of that life, Homo sapiens, has attained sapience. The only kind of universe humans can occupy is one that is similar to the current one.

Originally proposed as a rule of reasoning, the term has since been extended to cover supposed “superlaws” that in various ways require the universe to support intelligent life, usually assumed to be carbon-based and occasionally asserted to be human beings. Anthropic reasoning assesses these constraints by analyzing the properties of hypothetical universes whose fundamental parameters or laws of physics differ from those of the real universe. Anthropic reasoning typically concludes that the stability of structures essential for life, from atomic nuclei to the whole universe, depends on delicate balances between different fundamental forces.

These balances are believed to occur only in a tiny fraction of possible universes, so that this universe appears fine-tuned for life. Anthropic reasoning attempts to explain and quantify this fine tuning. Within the scientific community the usual approach is to invoke selection effects and to hypothesize an ensemble of alternate universes, in which case that which can be observed is subject to an anthropic bias.

However, the term anthropic in “anthropic principle” has been argued to be a misnomer. While the principle singles out our kind of carbon-based life, none of the coincidences requires human life, or demands that carbon-based life develop intelligence.

The anthropic principle has given rise to some confusion and controversy, partly because the phrase has been applied to several distinct ideas. All versions of the principle have been accused of undermining the search for a deeper physical understanding of the universe. Those who invoke the anthropic principle often appeal to multiple universes or to an intelligent designer, both controversial ideas criticised as untestable and therefore outside the purview of accepted science.


Popcorn was first discovered thousands of years ago by the Native Americans, who believed that the popping noise was that of an angry god who escaped the kernel.

Each kernel of popcorn contains a certain amount of moisture and oil. Corn is able to pop because, unlike other grains, the outer hull of the kernel is both strong and impervious to moisture, and the inside consists almost entirely of dense starchy endosperm. This allows pressure to build inside the kernel until an explosive pop results.

As the kernel is heated past the boiling point of water, the moisture inside turns into superheated, pressurized steam, contained within the moisture-proof hull. Under these conditions, the starch inside the kernel gelatinizes, softening and becoming pliable. The pressure continues to increase until the breaking point of the hull is reached: a pressure of about 135 psi and a temperature of 356 °F. The hull ruptures rapidly, causing a sudden drop in pressure inside the kernel and a corresponding rapid expansion of the steam, which expands the starch and proteins of the endosperm into an airy foam. As the foam rapidly cools, the starch and protein polymers set into the familiar crispy puff.
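The figures quoted above are easy to restate in SI units with the standard conversion formulas (nothing popcorn-specific here):

```python
# Convert the popping conditions quoted above (about 135 psi, 356 °F)
# into SI units using the standard unit-conversion formulas.

def psi_to_kpa(psi):
    return psi * 6.894757  # 1 psi = 6.894757 kPa

def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

print(round(psi_to_kpa(135)))             # 931 kPa, roughly nine atmospheres
print(round(fahrenheit_to_celsius(356)))  # 180 °C
```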

During the Great Depression, popcorn was comparatively cheap at 5 to 10 cents a bag and became popular. Thus, while other businesses failed, the popcorn business thrived and became a major source of income for some struggling farmers. During World War II, sugar rationing diminished candy production, causing Americans to eat three times more popcorn than they had before.

At least six localities, all in the United States, claim to be the Popcorn Capital of the World: Valparaiso, Indiana; Van Buren, Indiana; Marion, Ohio; Ridgway, Illinois; Schaller, Iowa; and North Loup, Nebraska. According to the USDA, most of the maize used for popcorn production is specifically planted for this purpose. Most is grown in Nebraska and Indiana, with increasing area in Texas. As the result of an elementary school project, popcorn became the official state snack food of Illinois.

Popcorn, threaded onto a string, is used as a wall or Christmas tree decoration in some parts of North America, as well as on the Balkan peninsula. The world’s largest popcorn ball was unveiled in October 2006 in Lake Forest, Illinois. It weighed 3,415 pounds, measured 8 feet in diameter, and had a circumference of 24.6 feet.


Stochastic resonance is observed when noise added to a system improves the system’s performance in some fashion. More technically, stochastic resonance occurs if the signal-to-noise ratio of a nonlinear system or device increases for moderate values of noise intensity.

It was first proposed in 1981 to explain the periodic recurrence of ice ages. Since then, the same principle has been applied to a wide variety of systems. Currently, stochastic resonance is commonly invoked when noise and nonlinearity combine to increase order in a system’s response.

Stochastic resonance has been observed in a wide variety of experiments involving electronic circuits, chemical reactions, semiconductor devices, nonlinear optical systems, magnetic systems and superconducting quantum interference devices (SQUIDs). Of special interest are the neurophysiological experiments on stochastic resonance, three popular examples of which are the mechanoreceptor cells of crayfish, the sensory hair cells of crickets, and human visual perception.

Computationally, neurons exhibit stochastic resonance because of nonlinearities in their processing. Stochastic resonance has yet to be fully explained in biological systems, but neural synchrony in the brain, specifically in the gamma-wave frequency band, has been suggested as a possible neural mechanism for stochastic resonance by researchers who have investigated the perception of subconscious visual sensation.

Stochastic resonance based techniques have been used to create a novel class of medical devices, such as vibrating insoles, for enhancing sensory and motor function in the elderly, patients with diabetic neuropathy, and patients with stroke.

A related phenomenon is dithering, in which noise is applied to analog signals before analog-to-digital conversion. Stochastic resonance can be used to measure transmittance amplitudes below an instrument’s detection limit: if Gaussian noise is added to a subthreshold or immeasurable signal, it can be brought into a detectable region. After detection, the noise is removed. In this way, a fourfold improvement in the detection limit can be obtained.
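The subthreshold-detection idea can be sketched in a few lines: a signal below a hard threshold produces no readings on its own, but with added Gaussian noise the fraction of threshold crossings encodes the signal level, and averaging many readings removes the noise again. The threshold, signal level, and noise strength below are illustrative choices, not values from any particular instrument:

```python
# Noise-aided detection of a subthreshold signal: a toy threshold detector
# never fires for the bare signal, but with Gaussian noise the crossing
# rate encodes the signal level. All numbers here are illustrative.
import random

random.seed(1)

THRESHOLD = 1.0
signal = 0.7  # below threshold: undetectable on its own

def detect(x):
    return 1.0 if x >= THRESHOLD else 0.0

print(detect(signal))  # 0.0 -- no response without noise

n = 100_000
sigma = 0.5  # moderate noise strength
crossings = sum(detect(signal + random.gauss(0.0, sigma)) for _ in range(n))
rate = crossings / n
print(rate)  # close to 0.27, the chance that noise lifts 0.7 past 1.0
```

Too little noise and the detector stays silent; too much and the crossings no longer track the signal, which is why the benefit peaks at moderate noise intensity.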

Stochastic resonance is a generic phenomenon: adding noise to certain types of nonlinear systems possessing several simultaneously stable states may improve their ability to process information. As such, it is at the origin of intense interdisciplinary research at the crossroads of nonlinear dynamics, statistical physics, information and communication theory, data analysis, and the life and medical sciences. It opens tantalizing perspectives, from the development of new families of detectors to brain research. From a fundamental point of view it is still a largely open field of research: its microscopic foundations have hardly been addressed, its quantum counterpart needs further elucidation, and its relevance to complex transition phenomena remains to be explored.


Helmholtz resonance is the phenomenon of air resonance in a cavity. The name comes from a device created in the 1850s by Hermann von Helmholtz to determine the pitch of the various tones present in music and other complex sounds. An example of Helmholtz resonance is the sound created when one blows across the top of an empty bottle.

When air is forced into a cavity, the pressure inside increases. Once the external force that forces the air into the cavity disappears, the higher-pressure air inside will flow out. However, this surge of air flowing out will tend to over-compensate, due to the inertia of the air in the neck, and the cavity will be left at a pressure slightly lower than the outside, causing air to be drawn back in. This process repeats with the magnitude of the pressure changes decreasing each time.

This effect is similar to that of a bungee jumper bouncing on the end of a bungee rope, or a mass attached to a spring. Air trapped in the chamber acts as a spring. Changes in the dimensions of the chamber adjust the properties of the spring. A larger chamber would make for a weaker spring, and vice versa.

The air in the neck of the chamber is the mass. Since it is in motion, it possesses some momentum. A longer port would make for a larger mass, and vice versa. The diameter of the port is related to the mass of air and the volume of the chamber. A port that is too small in area for the chamber volume will choke the flow, while one that is too large in area for the chamber volume tends to reduce the momentum of the air in the port.
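The mass-spring picture above leads to the standard formula for the Helmholtz resonance frequency, f = (c/2π)·√(A/(V·L)), with c the speed of sound, A the neck cross-section, V the cavity volume, and L the effective neck length. A minimal sketch with illustrative bottle dimensions, not measurements:

```python
# Helmholtz resonance frequency from the mass-spring analogy:
#   f = (c / (2*pi)) * sqrt(A / (V * L))
# c: speed of sound, A: neck area, V: cavity volume, L: neck length.
import math

def helmholtz_frequency(area_m2, volume_m3, neck_length_m, c=343.0):
    return (c / (2 * math.pi)) * math.sqrt(area_m2 / (volume_m3 * neck_length_m))

# Illustrative values for a 1-litre bottle with a 2 cm wide, 5 cm long neck:
area = math.pi * 0.01 ** 2  # neck cross-section, m^2
f = helmholtz_frequency(area, 1e-3, 0.05)
print(round(f))  # about 137 Hz
```

The formula also shows the trade-offs described above: a larger cavity (weaker spring) or a longer neck (larger mass) both lower the resonant frequency.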

Helmholtz resonance finds application in internal combustion engines, subwoofers and acoustics. In stringed instruments, such as the guitar and violin, the resonance curve of the instrument has the Helmholtz resonance as one of its peaks, along with other peaks coming from resonances of the vibration of the wood. An ocarina is essentially a Helmholtz resonator where the area of the neck can be easily varied to produce different tones. The West African djembe has a relatively small neck area, giving it a deep bass tone. The djembe may have been used in West African drumming as long as 3,000 years ago, making it much older than our knowledge of the physics involved.

Helmholtz resonators are used in architectural acoustics to reduce undesirable sounds, such as standing waves, by building a resonator tuned to the problem frequency, thereby eliminating it. The technique is most often used for low-frequency waves.


According to ancient and medieval science, Ether is the material that fills the region of the universe above the terrestrial sphere. It was imagined in Greek mythology to be the pure essence where the gods lived and which they breathed, analogous to the air breathed by mortals.

Aristotle included Ether in the system of classical elements of Ionic philosophy as the quintessence, on the grounds that the four terrestrial elements were subject to change and moved naturally in straight lines, while no change had been observed in the celestial regions and the heavenly bodies moved in circles.

In Aristotle’s system Ether had no qualities (it was neither hot nor cold, neither wet nor dry) and was incapable of change; by its nature it moved in circles. Medieval scholastic philosophers granted Ether changes of density, in which the bodies of the planets were considered denser than the medium that filled the rest of the universe.

Early modern physics proposed the existence of a medium called the Ether (meaning upper air or pure, fresh air), a space-filling substance or field thought to be necessary as a transmission medium. The assorted Ether theories embody the various conceptions of this medium and substance. This early modern Ether has little in common with the Ether of the classical elements from which the name was borrowed.

Although hypotheses of the Ether vary somewhat in detail, they all share certain characteristics. Essentially, the Ether is considered a physical medium occupying every point in space, including within material bodies. A second essential feature is that its properties give rise to the electric, magnetic and gravitational potentials and determine the propagation velocity of their effects.

Therefore the speed of light and all other propagating effects are determined by the physical properties of the Ether at the relevant location, analogous to the way that gaseous, liquid and solid media affect the propagation of sound waves.

The Ether is considered the overall reference frame for the universe, and thus all velocities are absolute relative to its rest frame. Therefore, any physical consequences of those velocities are considered as having absolute, or real, effects.

Recent Ether theories addressing velocity effects, gravitation and planetary motion, the creation of protons, stars and planets, and so on exist, but are not generally accepted by the mainstream scientific community.

John Bell, interviewed by Paul Davies in The Ghost in the Atom, suggested that an Ether theory allows a reference frame in which signals go faster than light. Bell suggests the Ether was wrongly rejected on purely philosophical grounds: that what is unobservable does not exist.

Einstein found the non-Ether theory simpler and more elegant, but Bell suggests that this doesn’t rule the Ether out. Besides arguments based on his interpretation of quantum mechanics, Bell also suggests resurrecting the Ether as a useful pedagogical device; that is, many problems are solved more easily by imagining the existence of an Ether.


Hermann Minkowski was a Russian-born German mathematician of Jewish and Polish descent who created and developed the geometry of numbers and used geometrical methods to solve difficult problems in number theory, mathematical physics, and the theory of relativity.

At the Eidgenössische Polytechnikum he was one of Einstein’s teachers. Minkowski explored the arithmetic of quadratic forms, especially in n variables, and his research into that topic led him to consider certain geometric properties of spaces of multiple dimensions. In 1896, he presented his geometry of numbers, a geometrical method for solving problems in number theory.

By 1907 Minkowski realized that the special theory of relativity, introduced by Einstein in 1905, could best be understood in a four-dimensional space, since known as Minkowski spacetime, in which time and space are not separate entities but are intermingled in a four-dimensional spacetime, and in which the geometry of special relativity can be nicely represented.

In physics, spacetime is any mathematical model that combines space and time into a single construct called the spacetime continuum. Spacetime is usually interpreted with space being three-dimensional and time playing the role of the fourth dimension. In the classical perception of space, the universe has three dimensions of space and one dimension of time. By combining space and time into a single manifold, physicists have significantly simplified a large number of physical theories, and have described the workings of the universe at both the supergalactic and subatomic levels in a more uniform way.

The concept of spacetime combines space and time within a single coordinate system, typically with four dimensions: length, width, height, and time. Dimensions are components of a coordinate grid typically used to locate a point in space, or on the globe, such as by latitude, longitude and planet (Earth). However, with spacetime, the coordinate grid is used to locate events rather than just points in space, so time is added as another dimension to the grid.

Formerly, on the evidence of experiments at slow speeds, time was believed to be a constant that progressed at a fixed rate. Later high-speed experiments revealed that time slows down at higher speeds, an effect called time dilation. Many experiments have confirmed this slowing, such as atomic clocks on board a Space Shuttle running slower than synchronized Earth-bound clocks. Since time varies, it is treated as a variable within the spacetime coordinate grid, and is no longer assumed to be a constant independent of location in space.
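The size of the effect can be estimated from the special-relativistic time-dilation formula, t_moving = t_rest·√(1 − v²/c²). The sketch below uses a rough round number for a low-Earth-orbit speed as an illustrative input and ignores the opposing gravitational effect of general relativity:

```python
# Special-relativistic time dilation: a clock moving at speed v ticks
# slower by the factor sqrt(1 - v^2/c^2). The orbital speed below is an
# illustrative round number for low Earth orbit; gravitational effects
# from general relativity are ignored in this sketch.
import math

C = 299_792_458.0  # speed of light, m/s

def dilated_time(t_rest, v):
    return t_rest * math.sqrt(1 - (v / C) ** 2)

one_day = 86_400.0  # seconds
v_orbit = 7_700.0   # m/s
lag = one_day - dilated_time(one_day, v_orbit)
print(f"{lag * 1e6:.1f} microseconds per day")  # about 28.5
```

Tens of microseconds per day is small but easily resolved by atomic clocks, which is why such experiments are decisive.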

Treating spacetime events with four dimensions, one of which is time, is the conventional view. However, other invented coordinate grids treat time as three additional dimensions (length-time, width-time, and height-time) to accompany the three dimensions of space. When dimensions are understood as mere components of a grid system, rather than physical attributes of space, it is easier to understand the alternative dimensional views, such as latitude, longitude, plus Greenwich Mean Time (three dimensions), or city, state, postal code, country, and UTC time (five dimensions). The dimensions chosen depend on the coordinate grid used.

The term spacetime has taken on a generalized meaning with the advent of higher-dimensional theories. How many dimensions are needed to describe the universe is still an open question. Speculative theories such as string theory predict 10 or 26 dimensions, with some versions predicting 11 dimensions consisting of 10 spatial and 1 temporal, but the existence of more than four dimensions would only appear to make a difference at the subatomic level.

In theoretical physics, Minkowski space is often compared with Euclidean space. While a Euclidean space has only spacelike dimensions, a Minkowski space also has one timelike dimension. Therefore the symmetry group of a Euclidean space is the Euclidean group, while for a Minkowski space it is the Poincaré group.
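The structure can be made concrete with the invariant interval, s² = −(ct)² + x² + y² + z² (in the −+++ sign convention), which a Lorentz boost leaves unchanged just as a rotation leaves Euclidean distance unchanged. A minimal numerical check with arbitrary coordinates:

```python
# The Minkowski interval s^2 = -(ct)^2 + x^2 + y^2 + z^2 (signature -+++)
# is left unchanged by a Lorentz boost, just as Euclidean distance is left
# unchanged by a rotation. Coordinates below are arbitrary test values.
import math

def interval_sq(ct, x, y, z):
    return -ct ** 2 + x ** 2 + y ** 2 + z ** 2

def boost_x(ct, x, beta):
    # Lorentz boost along the x axis with velocity v = beta * c
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return gamma * (ct - beta * x), gamma * (x - beta * ct)

ct, x, y, z = 5.0, 3.0, 1.0, 2.0
ct_b, x_b = boost_x(ct, x, 0.6)
print(interval_sq(ct, x, y, z))      # -11.0
print(interval_sq(ct_b, x_b, y, z))  # -11.0 again (up to rounding)
```

This invariance under the Poincaré group is exactly what distinguishes the Minkowski geometry from the Euclidean one, where the analogous invariant is the ordinary sum of squares.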

The beginning of Minkowski’s address delivered at the 80th Assembly of German Natural Scientists and Physicians in 1908 is now famous:

The views of space and time which I wish to lay before you have sprung from the soil of experimental physics, and therein lies their strength. They are radical. Henceforth space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.

Strictly speaking, the use of the Minkowski space to describe physical systems over finite distances applies only in the Newtonian limit of systems without significant gravitation. In the case of significant gravitation, spacetime becomes curved and one must abandon special relativity in favor of the full theory of general relativity.

Even in such cases, Minkowski space is still a good description in an infinitesimally small region surrounding any point, barring gravitational singularities. More abstractly, we say that in the presence of gravity spacetime is described by a curved four dimensional manifold for which the tangent space to any point is a four dimensional Minkowski space. Thus, the structure of Minkowski space is still essential in the description of general relativity.


Holonomic brain theory, originated by psychologist Karl Pribram and initially developed in collaboration with physicist David Bohm, is a model for human cognition that is drastically different from conventionally accepted ideas. Pribram and Bohm suggest a model of cognitive function as being guided by a matrix of neurological wave interference patterns situated temporally between holographic gestalt perception and discrete quantum vectors derived from reward anticipation potentials.

Pribram was originally struck by the similarity between the idea of the hologram and brain function, along with Bohm’s idea of implicate order in physics, and contacted Bohm to collaborate. In particular, the fact that information about an image point is distributed throughout a hologram, such that each piece of the hologram contains some information about the entire image, seemed suggestive to Pribram of how the brain could encode memories. Pribram was encouraged in this line of speculation by reports that the spatial-frequency encoding displayed by cells of the visual cortex was best described as a Fourier transform of the input pattern. This holographic idea led to the coining of the term holonomic to describe the idea in contexts wider than just holograms.

In this model, each sense functions as a lens, refocusing wave patterns either by perceiving a specific pattern or context as swirls, or by discerning discrete grains or quantum units. David Bohm has said that if you take the lenses away, what you are left with is a hologram.

According to Pribram and Bohm, future orientation is the essence of cognitive function, which they have attempted to define through use of the Fourier theorem and quantum mechanical formulae. According to Pribram, the tuning of wave frequency in cells of the primary visual cortex plays a role in visual imaging, while such tuning in the auditory system has been well established for decades. Pribram and colleagues also assert that similar tuning occurs in the somatosensory cortex.

Pribram distinguishes between propagative nerve impulses on the one hand, and slow potentials, or hyperpolarizations, that are essentially static on the other. At this temporal interface, he indicates, the wave interferences form holographic patterns.

What the data suggest is that there exists in the cortex a multidimensional holographic process serving as an attractor, a point toward which muscular contractions operate to achieve a specified environmental result. The specification has to be based on prior experience of the species or the individual, and stored in holographic form. Activation of the stored process involves patterns of muscular contraction guided by the basal ganglia, cerebellum, brain stem and spinal cord, whose sequential operations need only satisfy the target encoded in the image of achievement, much as the sequential operations of heating and cooling must meet the setpoint of a thermostat.

According to this theory, waveforms within the matrix of a distributed system allow fluctuations taking place to create new patterns, and the resulting dynamic potential can then organize new foci of activity oriented to the precipitation of strategic planning and exercise of free will.

In a 1998 interview, Pribram addressed the understanding of cognitive potential, stating that if you get into your potential mode, then new things can happen. But free will is usually conceived of in terms of how many constraints are operating, and in statistics there is the notion of degrees of freedom. Pribram thinks our will is essentially constrained, more or less: we have so many degrees of freedom, and the more degrees of freedom we have, the more we feel free and have freedom of choice.