We are only beginning to understand that the use of AI is fraught with dangers.
FIRST DANGER: COGNITIVE SURRENDER: WE STOP USING OUR BRAINS
A recent study posted as a Wharton School Research Paper provides evidence that people increasingly rely on artificial intelligence to make decisions, a phenomenon scientists call “cognitive surrender.” The findings suggest that individuals tend to adopt computer-generated answers without critical thought. This habit boosts human accuracy when the software is correct but significantly harms performance when the system makes mistakes.
Since the late twentieth century, psychologists have generally divided human cognition into two distinct categories. System 1 represents immediate, automatic responses driven by instinct and emotion. System 2 involves the deliberate, effortful reflection required to solve complex mathematical equations or weigh difficult choices.
However, the rapid rise of generative algorithms presents a new dynamic that does not fit neatly into this traditional model. People now frequently delegate their thinking to external software, outsourcing tasks ranging from drafting emails to making complex medical diagnoses.
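The accuracy trade-off the Wharton study describes can be sketched in a few lines of Python. The figures below are hypothetical, chosen only to illustrate the mechanism: a user who surrenders judgment inherits the software's accuracy wholesale, its hits and its errors alike, while an independent judge holds steady at his or her own level.

```python
import random

random.seed(0)

AI_ACCURACY = 0.80     # hypothetical figure, for illustration only
HUMAN_ACCURACY = 0.65  # hypothetical unaided human accuracy

def trial(defer_to_ai: bool) -> bool:
    """One decision: the AI proposes an answer; the user either adopts it
    uncritically ("cognitive surrender") or judges independently."""
    ai_correct = random.random() < AI_ACCURACY
    if defer_to_ai:
        return ai_correct          # inherits the AI's hits AND its errors
    return random.random() < HUMAN_ACCURACY

def accuracy(defer_to_ai: bool, n: int = 100_000) -> float:
    """Fraction of correct decisions over n trials."""
    return sum(trial(defer_to_ai) for _ in range(n)) / n

print(f"surrendering: {accuracy(True):.2f}")   # tracks the AI's accuracy
print(f"independent:  {accuracy(False):.2f}")  # tracks human accuracy
```

Conditional on the AI being wrong, the deferring user is always wrong, while the independent user is unaffected; this is the asymmetry the study points to.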
LET'S GO FURTHER: THE BRAIN OF SAPIENS AS WE HAVE KNOWN IT (STABLE AND UNIVERSAL) IS BEING REWIRED: IMPACT ON MILITARY OPERATIONS
I invite you to go beyond the artificial barrier between nature and culture. For several decades there has been growing recognition of the unprecedented impact of technologies on the natural world,[1] of the speed and irreversibility of these changes,[2] of their complex interconnections in a whole system,[3] and of the impossibility of addressing these issues in isolation.[4] Attention is now turning increasingly to the impacts of change on the human body, as its integrity is seen to be threatened by chemicals, by viruses new and old, and by changes in the biosphere.[5]
Paradoxically but understandably, we have difficulty looking at human behaviour in the detached way we observe nature.[6] Beliefs, judgments, moralistic undertones or genetically coded optimism make it especially difficult to address the issue of a human ecology on the move.[7] Whether we define humanity as just another animal product of evolution,[8] the possessor of a special human nature,[9] or a blank slate free from determinism,[10] we take it for granted that human thinking, emotional processing and socialization have been stable for several millennia, though it is readily accepted that height or weight has changed radically over time. This generalization embraces most evolutionists.[11] The mind of Homo sapiens, so anthropologists tell us, has not changed since our ancestors roamed the glaciers. Should her frozen embryo be revived, should she be brought up in the third millennium, dressed at Harrods and educated at Harvard, nobody would ever guess her humble origins.[12]
An entire industry of human behaviour has been built on the premises that behaviour is predictable (if we get enough data), that behaviour is replicable, and that behaviour changes slowly, if at all.[13] Developing military strategy and tactics, and the training to support them, has so far relied on these premises with some success.[14] Indeed, the stability of the “political animal” as defined by Aristotle some 2,500 years ago[15] has made Niccolò Machiavelli and Carl von Clausewitz relevant to modern scholars and useful to computerized strategies. Both Machiavelli and Clausewitz depended on historical regularities as a guide to their strategy manuals. A similar presumption that Abraham Maslow’s hierarchy of human needs is immutable seems to explain its immense popularity in military circles.[16]
Rigid schemas like those above make it difficult to understand how the human mind and body have evolved. Only a handful of thinkers, such as Stephen Jay Gould, Lewis Mumford, and V. Gordon Childe, have addressed the complexity of the evolving web of human ecology.[17] A fixation on stasis makes it impossible to understand the unprecedented mutation humanity is currently undergoing. We are at a loss to find the words to describe the new types of behaviour appearing in new generations.
HYPOTHESIS: HOMO SAPIENS NO MORE
In this chapter, I seek to challenge our faith in stability and normality. Technology is changing us. The body of humanity, the body of the state, the social body, the body of organizations, and the individual body and mind are less stable than we would like to believe. My challenge is based on the following conjectures, commonplace in some research circles but not common in public discourse. The first conjecture is that extensive use of new technologies and of new chemical compounds is radically changing the way humans process knowledge, emotions and relationships. The second conjecture is that these changes are not individual or cultural, which would make them temporary. They are universal and, given time, have the potential to be integrated into the genetic code through the evolution of propensity genes. The third conjecture is that these changes challenge the very basis of humanity as we have come to understand it through studying history and anthropology, because they will affect the way we relate to one another in society. For thousands of years our social relations have featured hierarchy, family, conflicts, wars and negotiations. Can we continue to take these patterns for granted?
My challenge to our belief in stability is based on the conjecture that the human mind is being rewired. New technologies and new chemical/pharmaceutical composites are radically changing our cognitive process. Because the definition of cognitive process would require a dissertation in itself, it is narrowed here to its simplest components: "the process of thinking” and "the act of remembering".[22] (These are complicated enough, encompassing all mind-body processes from perception to kinaesthesia.)
The process of thinking starts with selection on a grand scale. Thanks to new technology, we are beginning to be able to observe the process. The brain scan, for example, provides new empirical evidence about the way humans incorporate knowledge, habits and culture. We can actually see how neurones entangle themselves and how sensory chaos can be transformed into a learned and cultural structure, through eliminating and ordering sensory inputs.
A baby does not come into the world as a genetically pre-programmed automaton, nor is she totally determined by the environment. Cognitive development is a more exciting process. There is an inherited genetic structure, which is partly determining, but within this structure there is some flexibility. The potential is not a destiny, though it is an inclination. A baby is the product of heredity but is also receptive to the environment, and by communicating with it she learns. That is also the way we evolve as a species, each individual being part of a chain, adding to it through adaptation to new conditions, bit by bit. This means that every individual registers in the structure of her brain her own unique experience, bringing flexibility and suppleness to the determinism of the genetic structure. Neurons and their connections follow the order of a genetic program, while the environment screens the outcome. In this process repetitions actually become structures, literally imprinted in the circuits we inherit. We are the same and different.
There is a caveat! This elegant dance between nature and nurture has a time limit: it closes down in early infancy, for we age much more quickly than any of us wants to acknowledge. Neurologists explain how learning stabilizes pre-established synaptic combinations while eliminating others, by a process called epigenesis. By a very early age the major synaptic avenues are traced and fixed, and the only change possible is to add a few streets. But we also know that the richer and more varied the early experience, the richer and more varied the cerebral structure. This does not mean it is impossible to go beyond heredity, but that it is easier during a limited period of time and within certain parameters. Experts have named this early learning period a “window of opportunity” – a whiff of liberty, or the source of evolution within determinism.
More importantly, this learning process explains how shared experiences can eventually alter genetic material. Nothing is ever truly stable, but the genetic structure aims at stability. The issue of our times is a radical change in the way the “window of opportunity” is used: on one side, information has never been so abundant, complex, and interrelated; on the other, the intellectual tools used to process information have never been so inadequate, as educators have abandoned structure.
Super-abundance of information has fostered the term infoglut. In a recent book, the authors invite us to think about
all the text in those 60,000 new books that spew out of U.S. presses every year, or the more than 300,000 books published worldwide,...about the more than 18,000 magazines published in the United States alone--up almost 600 from the year before--with more than 225 billion pages of editorial content. There were more than 20 billion pages of magazine editorial content about food and nutrition alone! Consider the 1.6 trillion pieces of paper that circulate through U.S. offices each year. Try scanning the 400,000 scholarly journals published annually around the world. If you prefer lighter reading, peruse some of the 15 billion catalogues delivered to U.S. homes in 1999, or the 87.2 billion pieces of direct mail that reached U.S. mailboxes in 1998. …If you believe that print media are obsolete, consider the more than 2 billion Web pages in the world, a large chunk of which can't even be found with the best search engine. A U.S. government study estimates that the amount of Internet traffic doubles about every hundred days. And online information is not restricted to the Internet. A 2000 University of Illinois study revealed that there are 11,339 distinct electronic databases on the market (up from 301 in 1975). If you like to sit in front of larger screens, you have 80 percent more feature films to watch today than were released in 1990. [23]
Paradoxically, the technical tools designed to process and communicate the interrelated and sophisticated mass of data are part of the glut. Search engines, for example, simply give us more information to wade through.
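As a rough arithmetic check on the quoted claim that Internet traffic doubles about every hundred days, the short calculation below (an illustration using only the figure from the quotation) shows what that compounds to over a year:

```python
# Doubling every 100 days, compounded over a 365-day year:
doubling_period_days = 100
annual_growth_factor = 2 ** (365 / doubling_period_days)
print(f"traffic grows roughly {annual_growth_factor:.1f}x per year")
```

A doubling every hundred days is thus more than a twelvefold increase per year, which gives some sense of the scale of the glut.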
The abandonment of structure and repetition has compounded the problem of managing information. Repetition is essential for structuring a young brain. In this process, reactive behaviours become actual structures, literally imprinted in the individual circuits.[24] It might be seen as a restrictive exercise. Yes, it is (habits and prejudices are formed), and no, it is not (without this process, no information can “stick”). For the last thirty years, hoping to help young minds become better adapted to a complex world, we have abandoned rules (like grammar and spelling) and repetition (learning by rote). We have emphasised creativity and free association of ideas. We may have done more harm than good, not understanding how crucial it is in the early years to structure the brain around patterns and cognitive associations, thus making it much harder to put some order in the infoglut. In his book Data Smog, David Shenk describes the oppressive aspect of accessing information.[25] The stress caused by information overload suggests that the increase in non-hereditary Attention Deficit Disorder (ADD) is linked to the information revolution. Research by neuropsychiatrist Gabor Maté supports this hypothesis.[26]
For the true causes of ADD we have to look at the social and psychological conditions that shape the brains of children in late twentieth century and early twenty-first century Western societies. “In attention deficit disorder the chief physiological problem appears to be located in the frontal lobe of the brain, in the area of the cortex where attention is allocated and emotions and impulses are regulated.”[27] The circuits of attention and emotion control need an appropriate input of stability and repetition; without it they malfunction.
Another major change in brain structure is illustrated by the influence of images in the modern learning process. Interactive screens replace books, and we revert to the pre-alphabet of iconic images. Students are saturated with visual images, and those designing educational materials are advised to vie for their attention in a rich visual environment. From traditional textbooks to educational software, we find a wealth of pictorial representations.[28] But a picture is not worth a thousand words. The visually rich technical environment is actually changing the way we think. Images do not “speak” to us as books do. Techniques such as magnetic resonance imaging (MRI) have made it possible to demonstrate that images do not stimulate the same cerebral circuits. They are more emotive and passive, speaking directly to collective images, requiring less logic, no grammatical structure and no semantics. What we gain in universality, we lose in cultural diversity and sophistication. Some would argue that this makes us less effective, if not actually “dumb”. PowerPoint presentations epitomize this process.[29]
The way we store knowledge has also changed. The effects of our image-laden environment are especially dramatic when they are combined with our reliance on “external memory”. Queen’s University professor Merlin Donald explains how “a gradual externalization of human memory” has changed the way we think, remember, and communicate:
“External storage has dwarfed and overwhelmed individual memory. Methods of external memory retrieval are gaining ever-greater dominance, in relative terms, over the speed and accuracy of biological memory retrieval. The new technology of imagination – virtual reality, television, and cinema – threatens to outweigh individual imagination, and programmed corporate agendas already outstrip the individual thought process in many kinds of work.” [30]
Everywhere we see technology that is beyond our impoverished human capacity to synthesise, metabolise and control. This inhibits our ability to use knowledge intelligently, in ways that are not self-destructive. In a recent book, The Human Factor, Kim Vicente points out that “when it comes to safety-critical sectors… the consequences are worrisome.”[31]
THE FABRIC OF SOCIETY HAS BEEN IRREVERSIBLY AFFECTED BY THESE CHANGES
They challenge the very basis of humanity as we have come to understand it through our study of history and anthropology, because they affect the way we relate with one another in society. The hierarchies, family relationships, and emotional deals embedded in complex cultures that have adapted to circumstances are changing as a result of the processes described above. There are implications for the managers of violence.
It is increasingly difficult to draw lines between virtual killing and real slaughter, between truth and lies, right and wrong, justice and injustice, information and propaganda, plagiarism and creation. All forms of social interaction are affected.[32] Without these standards, which have held our world together since the decline of universal religion, the ethical structure of our society is less cohesive. Looking into the future, dystopian academics emphasize the rapid erosion of the social fabric in speeches, books, journals and, paradoxically, web sites.[33] Utopians argue that “electronic space” can lead to the betterment of society by creating “Smart Mobs”.[34]
For now we should admit that we just do not know what new epistemological tools, criteria and standards will be created to accommodate the new reality. We are in the throes of passage, shooting the rapids, as described by writers like Robert Kaplan and Thomas Homer-Dixon.[35] However we can expect that opportunistic “viruses” will invade any weakened organism. In organizations, these viruses might manifest as organised crime, fascism or terrorism. In individuals, we see new forms of disorder and abhorrent behaviour, some of which no longer fit into psychiatric terminology.
Changes will Affect Sociability
Changes in processing knowledge and building emotions are neither individual nor cultural. If they rested exclusively in either the individual or the collective, they would be temporary changes, lasting only as long as the individual lived, or the organisation functioned. They are not temporary. I believe these changes are universal and, given time, have the potential to be integrated in the genetic code through the evolution of propensity genes.
Recent research, especially in Europe, is reinforcing the credibility of Lamarck’s theory about the inheritance of acquired characteristics. It is now believed that changes in the genetic material occur faster than can be accounted for by a Darwinian approach. This can be seen in the contemporary epidemic of obesity, which is partly explained by the awakening of an atavistic fat propensity gene. This epidemic is so striking that it has attracted attention, but there is now a field of research on possible changes in the brain structure, which must be taken into consideration in studies of human metabolism, whether of food, knowledge or emotions.
PRACTICAL ISSUES: VIRTUAL BATTLEFIELDS AND REAL SOLDIERS
Changes in the building blocks of thought are bound to have a dramatic impact on military life in general and on special operations in particular. Successful strategy depends on a clear, accurate understanding of the environment and perpetuation of its traditional ethos: discipline of the body, order of the mind and the deep emotion of esprit de corps.
I will not expand here on the challenge of understanding increasingly complex environments and weaponry. It is addressed elsewhere by a wealth of literature and analysis. The changes attracting the most attention today are technological developments. A host of new terms has been added to discussions of present and future military operations: smart weapons, brilliant weapons, digitized battlefield, virtual theater of war, cybermaneuver, dominant battlespace awareness, psychotechnology, narcotechnology, nanotechnology, and so on. John Arquilla and David Ronfeldt envision, “Cyberwar – a comprehensive information-oriented approach to battle that may be to the information age what Blitzkrieg was to the industrial age.”[36] Robert J. Bunker tells us, “The goal of the Army in future war, beyond that of securing assigned politico-military objectives, will be that of total cyberspace dominance not just digital battlespace dominance.”[37] Armies seek to operate in the mind, as well as on the battlefield.
Consider the implications of this thinking. Everything has become hidden and blurry. Old concepts, such as “frontiers”, “enemy”, “civilians” or “strategies” do not hold much reality. Fourth Generation Warfare, dispersed, undefined and manipulated by “chaos strategists…pits nations against non-national organizations and networks of organizations—including oppressed ethnic groups, mafias, narco-traffickers and extremist quasi-religious cults...”[38]
If psychological operations represent the genesis of special operations, we should consider carefully the impact of their next generation:
“Psychological operations may become the dominant operational and strategic weapon in the form of media/information intervention. Logic bombs and computer viruses, including latent viruses, may be used to disrupt civilian as well as military operations. Fourth generation adversaries will be adept at manipulating the media to alter domestic and world opinion to the point where skillful use of psychological operations will sometimes preclude the commitment of combat forces. A major target will be the enemy population's support of its government and the war. Television news may become a more powerful operational weapon than armored divisions.”[39]
But in addition to thinking about the use of psychological operations by the next generation of military and political adversaries, we should consider the new vulnerabilities of military systems that have become cybernetically complex.[40] These complex organisations are:
“creating a whole new pattern of defeat, placing cybershock and paralysis on par with attrition and annihilation and manoeuvre and exhaustion. The cybershock-paralysis defeat pattern does not replace or compete with the other two, however. Instead, cybershock supplements and complements attrition and manoeuvre. Cybershock induces deep systemic paralysis throughout a complex military system, culminating the transformation in warfare first wrought by the Industrial Revolution more than 100 years ago.”[41]
My purpose is to draw some attention to new kinds of “internal disorders” faced by soldiers and leaders, which are not a simple mirror image of external disorders, but have a dynamic of their own and have the potential to jeopardize an already difficult process of adaptation to a rapidly and ever-changing environment.[42] This challenges the unspoken but persistent assumption that psychic and organizational structures are stable and will re-establish themselves after the individual human actors have recovered – that is, when stupor and disorganisation have passed.
When we challenge our assumptions of stability and normality, we step into uncharted territory for military strategy, where the scope is immense and the research scant.[43] Two concepts that provide a sample of changes to come, and subjects for further research, are chaos of the mind and disembodied esprit de corps.
CHAOS IN THE MIND
We pay lip service to the “human component” of the modern army. Lieutenant Colonel Tim Thomas points out that the American obsession with systems and information leads them to neglect the human factor in their theories of information warfare. Terminology, he argues, leads them towards hardware and software, when it is the humans in the data management business who are really vital. How will we prevent terrorists or adversaries from disrupting the people in our information systems? [44]
But disorder does not originate only with adversaries – the so-called “chaos strategists”.[45] One could argue that the enemy is already inside, or rather that the objectives of the enemy may be substantially assisted by the weaknesses of a modern mind in chaos. What do we do to ourselves as leaders and managers that makes us vulnerable to single-minded fanatics? We insist on multitasking to the point that we find it impossible to focus.[46] We specialize to the point that we lose the capacity to synthesise. We rely on external memory to the point of losing the capacity to memorize. We rely on technology in general to the point that we are unable to “read” a foreign environment and adapt to it.
The last point is all the more worrisome in the context of special operations, where the possibility of being stranded far from technological help is ever present. Will special operators be like science-fiction starship crews, searching for life with sensors but unable to see it in front of them?
Disembodied Esprit de Corps?
We understand how important the emotional dimension is for training and motivating soldiers. In special operations, we may tend to focus only on the fact that the enemy does not have uniforms, drill, saluting or, for the most part, ranks. We pay scant attention to the question of how troops can remain cohesive in a new military order, which relies less and less on traditional order. New technologies are affecting the entire social and emotional spectrum of human life, and military life is no exception. We could even advance the hypothesis that these changes will be more dramatic for the military than for civilian professions. This is a profession whose entire ethos is linked to strong social emotions and interactions, from hand-to-hand combat to loyalty to one's country. In a borderless world, with corporate warriors for hire, this ethos will be challenged beyond our capacity to imagine.[47]
Hand-to-hand combat illustrates the problem. New technologies, such as videogames and simulators for training, and precision munitions in operations have changed the ethos of violence.[48] They make it remote and virtual. It is safer for us, but more deadly for them. We are beginning to understand that immersion in virtual violence can make people more fearful, numb to violence, hungry for more, and even more aggressive in the real world.[49] The remoteness of warriors from actual casualties, cries of agony, decaying bodies and the smell of death helps to situate the individual in a virtual world, cutting him off from the root of ethics, because empathy is the basis for laws of war that require humane treatment of prisoners, civilians, and the wounded.
Esprit de corps presents another problem. It is an indispensable commodity in military organisations, which cannot be bought, sold, bartered or commanded.[50] It was born of the revolutionary military advances in close order drill, introduced to mercenaries in the service of the Netherlands by Maurice of Nassau in 1580. Close order drill for the handling of firearms dramatically increased military effectiveness, and had the added effect of creating social bonds and esprit de corps among a disparate group of soldiers-for-hire.[51] The model has been the basis for professional armies ever since, and has been emulated by industry.[52] But distance education, computerized workplaces, and the information society have made inroads on the synchronized group-in-motion pioneered by Maurice of Nassau. Military organizations have been forced to ask themselves whether “esprit de corps and the joint team perspective and mind-set so critical to military operations could be replicated in a ‘virtual’ PME (Professional Military Education) environment.”[53] Despite these misgivings, the virtual environment is a more prevalent experience than the parade ground. As early as 1970, one of the doyens of military sociology was pointing to changes that this technology would force:
“Technological trends in war-making have necessitated extensive common modification in the military profession....The changes in the military reflect organizational requirements which force the permanent military establishment to parallel other large-scale civilian organizations. As a result, the military takes on more and more the common characteristics of a government or business organization. Thus the differentiation between the military and the civilian is seriously weakened. In all these trends the model of the professional soldier is being changed by ‘civilianizing’ the military elite to a greater extent than the ‘militarizing’ of the civilian elite”.[54]
If this trend to civilianizing the military elite continues, esprit de corps will not survive.
Respect for hierarchy will be another casualty of the changes. When technology creates virtual staffs, the amount of information available to commanders in a digital battlespace is multiplied. Defence analyst David Lutz asks troubling questions about these staff officers. If civilians in the digital loop exercise judgement in combat situations, what responsibility do they have for the outcomes?[55] There has always been a gulf between those who fight at the front and those who direct from the rear. When military and civilian staff officers are ensconced in air-conditioned comfort in Florida, will special operators at the front accept instructions that lead to loss of life and limb?
Will soldiers bond emotionally to their distant and invisible staff? Loyalty may become as irrelevant in military life as it has become in the corporate world. Yet highly skilled special operators may be approached frequently by those trying to recruit them for private jobs or foreign armies. What will keep them on our team?
Conclusion
Do we face military irrelevance? Military thinkers must do more than talk about technological change and revolutions in military affairs. They must strive to understand both the philosophy underlying them, and the physiology and neuroscience that are driven by broad currents of social evolution. For example, how many budding strategists understand that the military technical revolution, from which the more holistic revolution in military affairs is derived, is itself grounded in Marxist philosophy? Soviet Marshal Nikolai V. Ogarkov pointed to the radical changes in operations that would be brought about by automation, search-and-destroy complexes, and long-range precision-guided weapons.[56] Americans have built on Thomas Kuhn’s ideas about scientific revolution, and the Tofflers’ futuristic visions,[57] but they have not begun to think about the fundamental instability of the human mind. Our vision of the future is built on images of a historical and anthropological past, which is changing. Our minds are being rewired by technological change. The minds of the future will not behave in the same way as did those of the past. These changes will become part of the genetic code that we carry forward with us. The wars of the future may be fought with strategies of chaos that take advantage of the chaos of the mind, and unless we understand these phenomena, our soldiers will be loose cannons in cyberspace.
Notes
[1] Colin Mason. The 2030 Spike: Countdown to Global Catastrophe
[2] Thomas Homer-Dixon. The Ingenuity Gap. New York, NY : Alfred A. Knopf, 2000.
From Rachel Carson’s Silent Spring (1962) to Our Stolen Future: Are We Threatening Our Fertility, Intelligence, and Survival? by Theo Colborn, Dianne Dumanoski and John Peterson Myers (1997).
[3] Fritjof Capra. The web of life: a new scientific understanding of living systems. New York, N.Y.: Anchor Books, 1997.
[4] John L. Casti. Complexification. New York: HarperCollins, 1994. Also see M. Mitchell Waldrop, Complexity. New York: Simon and Schuster, 1992.
[5] IPCC Third Assessment Report - Climate Change 2001
[6] Freud called it “cultural narcissism”. Sigmund Freud. Civilization and Its Discontents. Translated and edited by James Strachey. New York: W. W. Norton, 1961.
[7] Lionel Tiger. Optimism: The Biology of Hope. New York : Simon and Schuster, 1980, c1979.
[8] Merlin Donald. Precis of Origins of the Modern Mind: Three Stages in the Evolution of Culture and Cognition. Cambridge, Mass.: Harvard University Press, 1991. Robert Wright. The Moral Animal: Why We Are the Way We Are: The New Science of Evolutionary Psychology. New York: Random House, 1994.
[9] Steven Pinker. The Blank Slate: The Modern Denial of Human Nature. New York: Viking, 2002.
[10] An important ideology, the basis of our concept of modern democracy, developed by John Locke and, to some extent, Jean-Jacques Rousseau and Jean-Paul Sartre.
[11] For example, Robert Wright’s claim, in The Logic of Human Destiny, is that there is a human nature and that technological innovation is chaotic, “and only after a Darwinian survival-of-the-fittest process of testing and elimination do successful new technologies become established… The number of possible technologies is infinite, and only a few pass this test of affinity with human nature.” Kim Vicente, The Human Factor, p. 41.
[12] V. Gordon Childe. Man Makes Himself London: Watts & Co., 1941.
[13] Clinical Chaos: A Therapist's Guide to Non-linear Dynamics and Therapeutic Change edited by Michael Butz and Linda Chamberlain. Philadelphia: Brunner/Mazel, c1998.
[14] diGenova, J. E., "Terrorism, Intelligence, and the Law." In Proceedings of the Ninth Annual Symposium on the Role of Behavioral Science in Physical Security: Symmetry and Asymmetry of Global Adversary Behavior, held at Springfield, Virginia, 3-4 April 1984 (AD-A152 459), pp. 53-59. Washington, DC: Department of Defense, 4 April 1984.
[15] Aristotle, The Politics, Book 1.
[16] Abraham Maslow, The Farther Reaches of Human Nature. New York: Viking, 1971; Harmondsworth, Eng.: Penguin Books, 1973.
[17] Lewis Mumford, Technics and Civilization. New York: Harcourt, Brace & World, 1963. Man, V. Gordon Childe explains, very quickly became "inadequately adapted for survival in any particular environment. His bodily equipment for coping with any special set of conditions is inferior to that of most animals. He has not, and probably never had, a furry coat like the polar bear's for keeping in the body's heat under cold conditions. His body is not particularly well adapted for escape, self-defence, or hunting. He is not, for instance, exceptionally fleet of foot, and would be left behind in a race with a hare or an ostrich. He has no protective colouring like the tiger or the snow leopard, nor bodily armour like the tortoise or crab... He lacks the beak and talons of the hawk and its keenness of vision. For catching his prey and defending himself, his muscular strength, and nails are incomparably inferior to those of the tiger" (Gordon Childe, 1941, pp. 41-42). Man's body has become a techné that depends mainly on external technologies, that is, on human communities producing and managing them. Here we leave forever the world of biology and zoology, for this knowledge is transmitted "artificially" through cultures.
[18] Robert Wright, Nonzero: The Logic of Human Destiny. New York: Vintage Books, 2000.
[19] Neil Postman, Technopoly: The Surrender of Culture to Technology. New York: Vintage, 1992.
[20] Eric T. Dean, Jr., Shook Over Hell: Post-Traumatic Stress, Vietnam, and the Civil War. Cambridge: Harvard University Press, 1997.
Richard A. Gabriel, The Painful Field: The Psychiatric Dimension of Modern War. Westport: Greenwood Press, 1988.
[21] James J. Schneider, "A New Form of Warfare." His book The Structure of Strategic Revolution was released in November 1994.
[22] J. Michael Jaffe, Media Interactivity, Cognitive Flexibility, and Self-Efficacy. A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy (Communication), The University of Michigan, 1995. http://research.haifa.ac.il/~jmjaffe/Dissert
[23] Thomas H. Davenport and John C. Beck, The Attention Economy. "Does your company suffer from organizational ADD? When there is contention for attention, those who seek it turn to the most reliable attention getters." From Chapter 1, "A New Perspective on Business: Welcome to the Attention Economy" (Harvard Business School Press, April 2001, ISBN: 1-57851-441-X). http://www.acm.org/ubiquity/book/t_davenport_2.html
[24] E. Jensen, Learning Smarter: The New Science of Teaching. San Diego, Calif.: The Brain Store, 2000. R. Rupp, How We Remember and Why We Forget. New York: Three Rivers Press, 1998.
[25] His research involved collecting numerous books and articles from journals, magazines and newspapers; recording many hours of interviews and accumulating over twenty thousand pages of text by performing NEXIS searches and downloading information from the internet. David Shenk described his grievance with this process within the text of his book. Data smog : surviving the information glut . San Francisco, Calif. : Harper Edge, 1997.
[26] Doctor Gabor Mate. Scattered Minds: A New Look At The Origins And Healing of Attention Deficit Disorder. Vintage Canada, 2000.
[27] Ibid.
[28] R. K. Lowe, Successful Instructional Diagrams. London: Kogan Page, 1993.
[29] In a provocative essay, "PowerPoint Is Evil," published in the September 2003 issue of Wired magazine, Edward Tufte makes the point clear:
"Rather than learning to write a report using sentences, children are being taught how to formulate client pitches and infomercials. Elementary school PowerPoint exercises (as seen in teacher guides and in student work posted on the Internet) typically consist of 10 to 20 words and a piece of clip art on each slide in a presentation of three to six slides, a total of perhaps 80 words (15 seconds of silent reading) for a week of work. Students would be better off if the schools simply closed down on those days and everyone went to the Exploratorium or wrote an illustrated essay explaining something."
"In a business setting, a PowerPoint slide typically shows 40 words, which is about eight seconds' worth of silent reading material. With so little information per slide, many, many slides are needed. Audiences consequently endure a relentless sequentiality, one damn slide after another. When information is stacked in time, it is difficult to understand context and evaluate relationships. Visual reasoning usually works more effectively when relevant information is shown side by side. Often, the more intense the detail, the greater the clarity and understanding. This is especially so for statistical data, where the fundamental analytical act is to make comparisons."