Neural plasticity is the key to changes in intelligence. It occurs during short-term memory, when a high-frequency stimulus improves a neuron’s sensitivity to signals received from an associated cell. This change in neural connectivity allows information to be processed more easily, as the neural connection associated with that information becomes stronger.

Neural plasticity has also been shown to play a large role in the development of the senses. For instance, blind patients have learned to “see” through tactile stimulation of their backs and tongues: the brain rewires the visual cortex to process and interpret tactile stimulation in place of eyesight.

There are numerous behavioral factors that affect intellectual development. The key is neural plasticity, which is driven by experience-dependent electrical activation of neurons. This activation causes axons to sprout new branches and develop new presynaptic terminals, and these new connections can support greater mental processing in different brain areas.

Those who believe intelligence is fixed show less improvement on cognitive testing than those who believe in neural plasticity, and a focus on performance goals and on proving intelligence leads to poor responses to negative feedback and failure. Those who focus on flexible, expansive learning goals rebound well from occasional failure or criticism. By focusing on challenging tasks that expand intelligence rather than on proving it, a person can exploit neural plasticity to build greater intellectual capacity.


A walk-in is a New Age concept of a person whose original soul has departed his or her body and has been replaced with a new soul, either temporarily or permanently.

Interest in the walk-in phenomenon was initially stimulated in the 1970s by the popular Seth Speaks series of occult books written by the channeler Jane Roberts, as reputedly authored by her various spirit-world benefactors. In 1979, Ruth Montgomery contributed to the fascination with Strangers Among Us, a collection of accounts of walk-ins. She included prominent historical figures among her subjects; for example, she described Thomas Jefferson as having hosted walk-in spirits who actually wrote the Declaration of Independence.

Subsequently, a belief system grew up around the walk-in concept. It included New Age attributes such as ascending into higher frequencies of evolution, a variety of psi powers, traditional predictions regarding earth changes first cited in the Bible, and predictions of dire fates for those whose vibrational levels remain unraised. The New Age walk-in belief system now includes a number of variant experiences, such as channeling, telepathic contact with extraterrestrial intelligences, or soul merging, in which the original soul is said to remain present, coexisting or integrating with the new one.

The experiences are not regarded favorably by some religious groups and mental health professionals. Some psychiatrists believe that all of these experiences, from traditional walk-ins to the New Age variety, up to and including cooperative healthy multiples, are attention-seeking playacting, or at best a metaphor of distress: an attempt to express something the client feels is wrong or somehow different from usual but has trouble describing.


CQ or curiosity quotient is a term put forth by author and journalist Thomas Friedman as part of a formula to measure learning and acquisition of knowledge. His claim is that CQ plus PQ (passion quotient) is greater than IQ (intelligence quotient).

There is no evidence that this inequality is true. Friedman may believe that curiosity and passion are greater than intelligence, but there is no evidence to suggest that the sum of a person’s curiosity and passion quotients will always exceed their IQ.
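Since no standardized CQ or PQ test exists, any numeric reading of the claim is hypothetical. Treating the slogan as literal arithmetic makes the passage’s objection concrete: whether the sum exceeds IQ depends entirely on the numbers chosen. All scores below are invented for illustration.

```python
def friedman_claim_holds(cq: float, pq: float, iq: float) -> bool:
    """Check Friedman's informal claim CQ + PQ > IQ for one set of scores.

    The scales here are arbitrary; no real CQ or PQ measurement exists.
    """
    return cq + pq > iq

# For some hypothetical score choices the inequality holds...
case_a = friedman_claim_holds(60, 70, 100)   # 130 > 100
# ...and for others it fails, so it cannot be a general arithmetic truth.
case_b = friedman_claim_holds(40, 30, 120)   # 70 > 120 is false
```

The point is not that the scores are realistic, but that a claim of the form "A + B > C" can only be universally true or false once the scales of A, B, and C are fixed, which Friedman never does.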

According to Friedman, curiosity and passion are key components for education in a world where information is readily available to everyone and where global markets reward those who have learned how to learn and are self-motivated to learn.

Friedman states, “Give me the kid with a passion to learn and a curiosity to discover and I will take him or her over the less passionate kid with a huge IQ every day of the week. IQ still matters, but CQ and PQ matter even more.”


The Argentine Black and White Tegu is a carnivorous terrestrial reptile species that inhabits the tropical rainforests of eastern and central South America. Adult males are much larger than females, reaching 3 feet in length at maturity and continuing to grow to lengths of 4 to 4.5 feet.

They make good pets, tend to become attached to their owners, and are generally quite docile as adults. A well-cared-for animal will live 15 to 20 years in captivity, and possibly even longer in the wild. However, as with most reptiles, individuals that are not handled regularly become less comfortable with a handler and show more signs of aggression.

Argentine Tegus go into brumation (a form of hibernation) in autumn when the temperature drops. A level of intelligence unusually high for reptiles has been observed in the species, along with a high level of physical activity during the wakeful period of the year. Individuals are believed sometimes to actively seek human attention, much as a cat or dog would.

Tegus are also recognized for their impressive ability to remember details. Tegus that have escaped or been illegally released have adapted to life in the wild in some of the more remote areas of South Florida.


Technological singularity refers to the hypothesis that technological progress will become extremely fast, and so make the future unpredictable and qualitatively different from today. Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not changed significantly for millennia. However, with the increasing power of computers and other technologies, it might be possible to build a machine that is fundamentally more intelligent than humans.

If such a machine were built, it could in turn build a still more intelligent machine. Being more intelligent than the humans who built it, it would presumably be better at machine design, and its successor would be better still. This process might continue exponentially, with ever more intelligent machines making ever larger increments to the intelligence of the next machine.
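The runaway loop described above can be sketched as a toy numerical model. This is purely illustrative: the starting level, the constant improvement factor, and the number of generations are all assumptions, and real returns to intelligence need not be constant.

```python
# Toy model of recursive self-improvement: each generation of machine
# builds a successor, and a smarter builder is assumed to achieve a
# proportionally larger improvement, giving exponential growth.
def intelligence_explosion(start=1.0, factor=1.5, generations=10):
    levels = [start]
    for _ in range(generations):
        levels.append(levels[-1] * factor)
    return levels

levels = intelligence_explosion()
# The absolute increments between successive machines also grow,
# which is what distinguishes an "explosion" from steady progress.
increments = [b - a for a, b in zip(levels, levels[1:])]
```

Under these assumptions each machine is smarter than the last and the increments themselves keep growing, which is the sense in which the process is exponential rather than merely cumulative.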

Superhuman intelligences could have goals inconsistent with human survival. When we create the first superintelligent entity, we might make a mistake and give it goals that lead it to annihilate humankind, assuming its enormous intellectual advantage gives it the power to do so. For example, we could tell it to solve a mathematical problem, and it might turn all the matter in the solar system into a giant calculating device, in the process killing the person who asked the question.

Many prominent technologists and academics dispute the plausibility of a technological singularity, arguing that belief in the idea rests on a naive understanding of what intelligence is. As an analogy, imagine a computer that could design new computers faster than its own designers could. It might accelerate the rate of improvement for a while, but in the end there are limits to how large and fast computers can become. We would end up in the same place; we would just get there a bit faster.


Deferred gratification is the ability to wait in order to obtain something that one wants. This attribute is known by many names, including impulse control, willpower, and self-control. It is suggested to be an important component of emotional intelligence. People who lack this trait are said to need instant gratification and may suffer from poor impulse control.

Conventional wisdom considers good impulse control to be a personality trait important for life success. It has been argued that people with poor impulse control suffer from weak ego boundaries. This term originates in Sigmund Freud’s theory of personality where the id is the pleasure principle, the superego is the morality principle, and the ego is the reality principle. Poor impulse control may also be related to biological factors in the brain. Researchers have found that children with fetal alcohol syndrome are less able to delay gratification.

The marshmallow experiment is a well-known test of the deferred gratification concept conducted by Walter Mischel at Stanford University. In the 1960s, a group of four-year-olds were given a marshmallow and promised a second one, but only if they could wait 20 minutes without eating the first. Some children could wait and others could not. The researchers then followed each child into adolescence and found that those who had been able to wait were better adjusted and more dependable, scoring an average of 210 points higher on the Scholastic Aptitude Test years later. Mischel later found that simple, easily taught tactics enabled children who had previously waited only very short periods to wait for quite long periods.


The theory of multiple intelligences was proposed by Howard Gardner in 1983 to more accurately define the concept of intelligence and to address the question of whether methods that claim to measure intelligence are truly scientific.

Gardner’s theory argues that intelligence, particularly as it is traditionally defined, does not sufficiently encompass the wide variety of abilities humans display. In his conception, a child who masters multiplication easily is not necessarily more intelligent overall than a child who struggles to do so. The second child may be stronger in another kind of intelligence and therefore 1) may best learn the given material through a different approach, 2) may excel in a field outside of mathematics, or 3) may even be looking at the multiplication process at a fundamentally deeper level, which can result in a seeming slowness that hides a mathematical intelligence that is potentially higher than that of a child who easily memorizes the multiplication table.

As one would expect from a theory that redefines intelligence, one of the major criticisms of the theory is that it is ad hoc. The criticism is that Gardner is not expanding the definition of the word intelligence; rather, he denies the existence of intelligence as traditionally understood, and instead uses the word intelligence wherever other people have traditionally used words like ability.

Gardner argues that by calling linguistic and logical-mathematical abilities intelligences, but not artistic, musical, athletic, and other abilities, the former are needlessly aggrandized. Many critics balk at this widening of the definition, saying that it ignores the traditional connotation of intelligence: the kind of thinking skills that make one successful in school.

Defenders of the multiple intelligence theory would argue that this is simply a recognition of the broad scope of inherent mental abilities, and that such an exhaustive scope by nature defies a simple, one-dimensional classification such as an assigned IQ value. They would claim that such one-dimensional values are typically of limited value in predicting the real-world application of unique mental abilities.


Illusory superiority is a cognitive bias that causes people to overestimate their positive qualities and abilities and to underestimate their negative qualities, relative to others. This is evident in a variety of areas including intelligence, performance on tasks or tests and the possession of desirable characteristics or personality traits. It is one of many positive illusions relating to the self, and is a phenomenon studied in social psychology.

It is often referred to as the above average effect. Other terms include superiority bias, leniency error, sense of relative superiority, and the Lake Wobegon effect (named after Garrison Keillor’s fictional town where “all the children are above average”).

Illusory superiority has been found in individuals’ comparisons of themselves with others in a wide variety of aspects of life, including performance in academic circumstances (such as class performance, exams, and overall intelligence), in working environments (for example in job performance), and in social settings (for example in estimating one’s popularity, or the extent to which one possesses desirable personality traits, such as honesty or confidence), as well as everyday abilities requiring particular skill.

For illusory superiority to be demonstrated by social comparison, two logical hurdles have to be overcome. The first concerns the ambiguity of the word average. Some psychological experiments require subjects to compare themselves to an average peer. If the average is interpreted as the mean, then it is logically possible for nearly all of the set to be above average if the distribution of abilities is highly skewed. Hence experiments usually compare subjects to the median of the peer group, since by definition it is impossible for a majority of the set to exceed the median.
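The mean-versus-median point can be checked with a small numeric sketch. The ability scores below are invented for illustration: a single very low outlier drags the mean down, so most of the group sits above it, while no more than half can ever sit above the median.

```python
from statistics import mean, median

# Hypothetical ability scores for a peer group of seven.
# One extreme low score skews the distribution.
scores = [10, 90, 91, 92, 93, 94, 95]

above_mean = sum(s > mean(scores) for s in scores)      # mean ~ 80.7
above_median = sum(s > median(scores) for s in scores)  # median = 92
# 6 of 7 scores exceed the mean, but only 3 of 7 exceed the median,
# so "most people are above the mean" is not by itself an illusion.
```

This is why comparing subjects to the median, rather than the mean, is needed before self-ratings of "above average" can be called logically inconsistent.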

A further problem in inferring inconsistency is that subjects might interpret the question in different ways, so it is logically possible that a majority of them are, for example, more generous than the rest of the group each on their own understanding of generosity.


Raymond Kurzweil is an inventor and futurist who has worked in fields as diverse as optical character recognition, text-to-speech synthesis, and electronic keyboard instruments. He is the author of several books on health, artificial intelligence, transhumanism, the technological singularity, and futurism.

Kurzweil first began speculating about the future when he was a child, but only later as an adult did he become seriously involved with trying to accurately forecast future events. Kurzweil came to realize that his success as an inventor depended largely on proper timing: His new inventions had to be released onto the market only once many other, supporting technologies had come into existence. A device issued too early and without proper refinement would lack some key element of functionality, and a device put out too late would find the market already flooded with a different product, or consumers demanding something better.

It thus became imperative for Kurzweil to have an understanding of the rates and directions of technological development. He has, throughout his adult life, kept close track of advances in the computer and machine industries, and has precisely modeled them. By extrapolating past trends into the future, Kurzweil formed a method of predicting the course of technological development.

After several years of closely tracking these trends, Kurzweil came to a realization that the innovation rate of computer technology was increasing not linearly but rather exponentially. As a computer scientist, Kurzweil also understood that there was no technical reason that this type of performance growth could not continue well into the 21st century.

Kurzweil projects that between now and 2050 technology will become so advanced that medical advances will allow people to radically extend their lifespans while preserving and even improving quality of life as they age. The aging process could at first be slowed, then halted, and then reversed as newer and better medical technologies become available. Kurzweil argues that much of this will be a fruit of advances in medical nanotechnology, which will allow microscopic machines to travel through one’s body and repair all types of damage at the cellular level.


The term intentionality was introduced by Jeremy Bentham as a principle of utility in his doctrine of consciousness for the purpose of distinguishing acts that are intentional and acts that are not. The term was later used by Edmund Husserl in his doctrine that consciousness is always intentional. It has been defined as “aboutness”, and according to the Oxford English Dictionary it is “the distinguishing property of mental phenomena of being necessarily directed upon an object, whether real or imaginary”.

The concept of intentionality was reintroduced into 19th-century philosophy by the philosopher and psychologist Franz Brentano, who described intentionality as a characteristic of sentience, of “mental phenomena”, by which it could be set apart from insentience, or natural “physical phenomena”. He used such phrases as “reference to a content,” the “direction towards an object,” and “the immanent objectivity.” Brentano coined the expression “intentional inexistence” (existence in) to indicate the ontological status of mental phenomena directed upon objects that do not exist. For him, the property of being intentional, of possessing intentional objectiveness, was key to his psychological thesis distinguishing mental phenomena from physical phenomena, as physical phenomena sustain no intentionality.

A major problem within intentionality discourse is that participants often fail to make explicit whether or not they use the term to imply concepts such as agency or desire, or whether it involves teleology. Dennett explicitly invokes teleological concepts in the ‘intentional stance’. However, most philosophers use intentionality to mean something with no teleological import. Thus, a thought of a chair can be about a chair without any implication of an intention or even a belief relating to the chair. For philosophers of language, intentionality is largely an issue of how symbols can have meaning.

In current artificial intelligence and philosophy of mind, intentionality is a controversial subject, sometimes claimed to be something that a machine will never achieve. John Searle argued for this position with the Chinese room thought experiment, according to which no syntactic operations occurring in a computer could provide it with semantic content. Searle acknowledged that his was a minority position in artificial intelligence and philosophy of mind.