PERSONAL MEMORANDUM 
TO: EIR Editorial/Home Page 
FROM: Lyndon H. LaRouche, Jr. 
RE: Understanding Non-Linearity 
September 15, 1997 

Popular Misconceptions About Science 
SCIENCE IS NOT "STATISTICS" 

There is an already significant, and rapidly growing amount of communication directed, partly, toward me, and, more often, as idle gossip about me, in that electronic purgatory which, I am told, is populated by (shades of H.G. Wells' "Dr. Moreau"!) strange, tormented, polymorphic entities: web-footed, half-man, half-mouse. A significant ration of this electronic chaff and chatter reflects the widely circulated opinion, that I am responsible for more or less successfully discrediting two popular, pseudo-scientific hoaxes: F. Sherwood Rowland's "Ozone Hole" hoax, and the kindred fraud, "Global Warming." 

Like squeals from flies in extremis, as fancy might hear the wriggling creatures trapped in another kind of web, there is a number of U.S. co-thinkers of Britain's current Labour Party Prime Minister, Tony Blair, who are frustrated, and enraged against what they consider President Clinton's stubborn refusal to impose those demands for collapsing the U.S. economy, demands which are pushed by the "Global Warming" fanatics. Prominent amid this Blairing protest, is the complaint, that persons associated with me were active in exposing the two referenced hoaxes. Since my name is more or less a household word in this and many other countries around the world, the enraged ones find it emotionally self-gratifying to develop a "conspiracy theory," identifying me as the evil genius causing their own, Maurice Strong's, and Tony Blair's frustration in these matters. 

Apart from such cranks, there is a significant number of individuals of manifest good will, who are willing to submit their adopted opinions to the tests of my own and other criticism. I excerpt a passage typical of one such recent communication: 

"Can you tell me information about your science. Because there are a lot of good scientists out there, like you, dealing with information. If the top ones out there say there might be an Ozone problem, and some of these men and women are not being bought by the Powerful, then, why discredit them? Please answer that." 

The author of that statement is factually mistaken, but the question is nonetheless fair by the standards appropriate for the "Generation X" presence within university classrooms and related settings. The received communications, pertaining to "environmental issues" of this type, pose three interrelated questions. 1.) Why do I reject those new views on the "environment," the which have become popularized during the recent thirty-odd years? 2.) What is the basis for my scientific method? 3.) What authority lies within that, my scientific method, that of economic science, which qualifies me to pass authoritative judgment on the competence of a top-ranking hoaxster such as F. Sherwood Rowland? 

Based on those considerations, rather than responding, repetitively, to each of these inquiries individually, it were suitable that I publish a single, common reply to all those received, and other messages which pose the same general line of questioning. Naturally, this present response will also be posted on the relevant EIR site. 

On the subject of the "Ozone Hole" hoax, evidence continues to support the case set forth by co-authors Ralf Schauerhammer and Rogelio Maduro, in their internationally celebrated The Holes in the Ozone Scare.(1) In the case of the "Global Warming" scare, the unscientific method is modelled on the fraudulent tactic used by Rowland, and others, to concoct the "Ozone Hole" hoax. Thus, in broad terms, the Schauerhammer-Maduro book demonstrates the case to be made against both of these hoaxes. In answer to the reader's misinformed doubt, that environmental scientists had been "bought": It is notable that Rowland became a "top scientist," including his appointment to lead the American clone (AAAS) of the British Association for the Advancement of Science (BAAS), as one instance of the celebrity he attained in recognition of his concoction of the "Ozone Hole" scam. 

The argument to be made against both of these, and related pseudo-scientific "ecologisms," is to be presented on two levels. The first level is typified by Dr. Dixy Lee Ray's endorsement of the Schauerhammer-Maduro text: 

"... Everyone interested in the so-called global environmental issues should read this powerful book, and then consider whether press releases and computer simulations that are unaccompanied by solid scientific evidence should drive our nation's science policy."(2) 

Dr. Ray spoke as a representative of those standards of scientific competence which were generally accepted by the scientists from the generations which lived, as adults, through either or both of the two World Wars of this century. Rowland typifies the post-modernist collapse in intellectual and moral standards of scientific practice, the which has taken over leading positions in shaping "politically-correct science opinion" during the past twenty-five years.(3) 

The clinical fact, that an incompetent, Rowland, has achieved as much celebrity as he has, guides us toward a second, deeper issue. The post-modernist quackademics of Rowland's following, received their university education under the direction of scientists from my own generation. This poses the question: "What misled relevant faculty members, from the World Wars I and II generations, into awarding today's new generation of leading, 'politically correct' science-quacks their university degrees?" What is the relevant virus of error infecting the classroom and related practice of earlier generations of actual scientists, the virus which is expressed by their "Baby Boomer" and "Generation X" students, as the "Ozone Hole" and "Global Warming" hoaxes? 

In my conclusion, I shall identify summarily the anti-science, political motives responsible for these activities of Rowland et al. That, I think, should wait until after I have situated the problem within the bounds of the science profession as such. I begin with a crucial example of the relevant problem, as encountered within my own specialty, economics. 
 
 

--Why Karl Marx and Adam Smith Were Incompetent-- 

Adam Smith and his follower Karl Marx committed the same fundamental blunder, in their respective misdefinitions of the axiomatic principles of political-economy. The difference between these two, is that Marx, unlike the modern Manicheans, such as Michael Novak and those of the Mont Pelerin Society, admitted the existence of that specific fallacy of composition in his construction.(4) There is nothing in the design of virtually any variety of economics doctrine taught in any university today, which makes any functional distinction between the presumption that the economy is run by apes, or by human beings. Specifically, all of this assortment excluded consideration of those developable cognitive functions of the individual human mind, within which discoveries of physical principle are generated. These are the same principles assimilated for economic practice, and also assimilated as increases in the per-capita, physical, productive powers of labor. 

The omission is monstrously large, a monstrous and pervasive incompetence inhering in virtually all "mainstream" varieties of textbooks and university classroom instruction today. This reveals the same, defective state of mind, in the field of economics, exhibited by such former proteges of Bertrand Russell as the "inventor of information theory," Norbert Wiener, and the inventor of "systems analysis," Russellite acolyte John von Neumann. 

The core of the relevant argument to be made, involves the empirical evidence which demonstrates, conclusively, that the human individual differs fundamentally from that class of higher apes with which some zoologists have often, mistakenly, identified the human species. Essentially, under the conditions which have existed on this planet during approximately two million years to date, the ecological population-potential of all species of higher apes, combined, has never exceeded several million living individuals. Whereas, man, who appears, superficially, to have the ecological attributes of a higher ape, had reached planetary population-levels in the hundreds of millions during European civilization's Hellenistic period, and is measured in billions today.(5) 

The combined archeological and historical evidence compels us to recognize that this qualitative distinction, which places mankind outside ecology, outside the domain of lower forms of life, is that mental faculty, relatively unique to mankind, whose fruits are typified by the increase of the human species' potential relative population-density, through the benefits of scientific and technological progress. For example, a throwing-spear, recently excavated from a stratum 600,000 years deep within Germany's Harz Mountains region, can be attributed to nothing other than a mind identical with the modern human genotype's.(6) 

This subject, the relationship between those distinctive, cognitive powers of the human individual's mind, and the increase of the potential relative population-density of the human species, is the foundation of all of my professional accomplishments over more than four decades to the present date. It is from the standpoint of my original and related discoveries in this area of investigation, that I have adopted and advanced that science of physical economy first established, under that name, by the principal mentor of my adolescent intellectual life, Gottfried Leibniz, during his related work of the 1671-1716 interval. Since late 1952, my work has been indebted to Bernhard Riemann's 1854 revolution in physical geometry for the representation of the implicitly measurable relationship between validated discoveries of physical principle, by individual minds, and the increase of the productive powers of labor (i.e., increase of potential relative population-density) by societies which commit themselves to scientific and technological progress. 

For purposes of illustration, the application of Riemann's metrical principles to my discoveries respecting the human mind, the so-called "LaRouche-Riemann Model,"(7) is typified by the work of such earlier followers of Leibniz as Lazare Carnot and the circles of Carl Gauss and Alexander von Humboldt, in developing the principles used by President Abraham Lincoln's United States to launch that modern machine-tool economy-driver model later copied by post-1876 Germany, and other nations.(8) 

--Cognition: The Active Principle in Economy-- 

The key to the relative uniqueness of my own discoveries, is my shifting the investigation of the way in which the individual human mind generates experimentally validatable discoveries of physical principle: rejecting the parochial view of "physical science," as customarily defined during the Twentieth Century, and, employing for physical science, instead, the standpoint of the role of metaphor in Classical art-forms of poetry, dramatic tragedy, musical polyphony, and plastic arts in such traditions as those of ancient Scopas, Praxiteles, or modern followers of Leonardo da Vinci such as Raphael Sanzio. To restate this point in a relevant way: the ontological paradox which demands a resolving discovery of new physical principle, in the domain of experimental physical science, is viewed by the cognitive processes of the developed individual mind, as the same type of challenge represented by a true metaphor in the domain of Classical forms of plastic and non-plastic art. 

The issue which prompted me to effect these discoveries, was a 1948 confrontation with Professor Norbert Wiener's "information theory" hoax. My response to Wiener's provocation (and, also, the same hoax presented by John von Neumann under the rubric of "systems analysis"), was premised upon my previously established, and deeply embedded commitment to the methodological standpoint of Gottfried Leibniz, the commitment which I had adopted during my mid-adolescence. Although I first adopted this method from Leibniz, rather than the Plato from whom Leibniz had himself adopted it, my method, then and now, is strictly Platonic. The term "Platonic" has the following, decisive significance in addressing the issues posed by the currently popular ecological hoaxes against science. 

The central issue posed by the notion of "human knowledge," is the fact, that all claims to such knowledge depend absolutely upon the contention that the laws of the universe are not embedded within the domain of sense-perceptions as such, but, rather, lie within man's ability to willfully change human behavior to such effect, that man's per-capita power over the universe is willfully, manifestly increased. The forms of mental activity, through which those willful increases in power over nature are achieved, are the subject-matter of knowledge, as knowledge must not be confused with mere sense-perception, or with mere "textbook learning." 

This may be restated as follows. The foundation of both science, and Classical forms of artistic composition, is the process by which individual human minds are capable of generating those experimentally validatable discoveries of both physical and cognitive principle, the which are generated as solutions to contradictions which can not be resolved by deductive methods. The type of contradiction involved is typified by the following general case. 

Given, the circumstance, that undeniable evidence shows the occurrence of phenomena whose existence is implicitly prohibited by presently established principles of scientific knowledge. Since, the disturbing evidence, and the previously established scientific knowledge, are both manifestations of the same faculty for determining empirical actuality, the contradiction between the extant belief and such contradictory evidence is ontological in implication. Hence, the contradiction is rightly described as an ontological paradox. 

The parallel case, in Classical forms of art, is typified by the issues of Shakespeare's Hamlet, the famous Act III soliloquy most emphatically. Hamlet knows, that clinging to his accustomed, swashbuckling code of conduct, dooms him, and also dooms the kingdom of Denmark. The existence of a contradictory, alternate behavior, is apparent to him. He would prefer, however, to cling to the inevitable doom of following his habituated inclinations, rather than risk the uncertainties of a future "from whose bourn no traveller returns." So, he and Denmark are doomed; so, the final scene of the play closes, over the warm corpse of Hamlet, with the character Horatio, speaking from within the play, to us, the surviving witnesses, in the play's audience; that Horatio, then, implores us, to relive that contradiction, that we, in the future, might escape the self-doom which Hamlet imposed upon both his own nation, as upon himself. In all Classical art-forms, the expression of such dualities of implication --ontological paradoxes-- is called "metaphor."(9) 

The presently existing possibility of a mathematical representation of this process of discovery of a validatable new physical principle, we owe to that family of discoveries by Bernhard Riemann which is centered around his 1854 habilitation dissertation, "On The Hypotheses Which Underlie Geometry,"(10) and to the preceding work of Gottfried Leibniz,(11) J. F. Herbart,(12) and, immediately, the work of Carl Gauss on the development of a general theory of curved surfaces, out of preceding and accompanying work on biquadratic residues.(13) 

A summary account of my own approach, which led into my rereading of Riemann's 1854 habilitation dissertation from this standpoint, will be helpful to the reader on several counts, respecting the material covered in this general reply. 

The starting-point for my attack on Wiener's "information theory" hoax, was, inevitably, the nature of the distinction between processes whose underlying ordering is overall entropic, as distinct from, for example, the species of living processes, which are anti-entropic in their typical, underlying distinctions in ordering, differing so from what we consider particular cases of non-living processes, including non-living organic processes. This was the same starting-point adopted for all issues of physical principle, by such notable followers of Cardinal Nicholas of Cusa's founding principles for modern experimental physics, as Luca Pacioli, Leonardo da Vinci, and Johannes Kepler. 

To determine, from the standpoint of crucial-experimental tests, whether particular types of human communication of ideas are entropic, or not, is a matter of showing whether, or not, the result of that communication, is a potential increase, or decrease of the entropy expressed in society's physical relationship to nature. This measurement must be made from the standpoint of the relevant actor, mankind, receiving this communication. Hence, we must measure the experimental result so: in terms of man's physical power over nature, per capita, and in terms of improvements in the demographic characteristics of the relevant class of households. Thus, for such measurements, we must exclude all consideration of money-prices, or related fictitious valuations; we must limit our attention to the physical interaction of mankind with nature: i.e., to Leibniz's and my own relevant domain in science, that of the science of physical economy. 

Respecting the increase or decrease of the entropy of social interaction with nature, we start with the general fact, that the increase of the human species' potential relative population-density, and correlated demographic considerations, depends upon discoveries of principle which, introduced, have the mathematical-physical implication of axiomatic changes in the notion of a geometry of man's functional interrelationship with the universe. 

The changes corresponding to successful axiomatic transformations of this type, are expressed as activities each corresponding to those principles. Hence, in the successful case, the gain in productive power of labor (of potential relative population-density) occurs at the "price" of increase of per-capita value for "energy of the system," when the latter is defined in respect to the process taken as a whole. Yet, in the successful case, the ratio of the process's "free energy" to its required "energy of the system," is either increased, or, at worst, not decreased. Thus, physical economy adopts the following relative definition of anti-entropy: the requirement, that the ratio of free energy to energy of the system not decrease, despite a required increase in the per-capita relative value of "energy of the system." 
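
To make the bookkeeping of that definition concrete, the following is a minimal sketch, in Python, of the test just stated. The period figures, the labels, and the helper name are hypothetical illustrations only, not measurements from any actual economy; in keeping with the argument above, money-prices appear nowhere in it.

    # Minimal sketch of the anti-entropy test stated above. All figures are
    # hypothetical, per-capita physical magnitudes; no money-prices are used.
    def is_anti_entropic(periods):
        """True if free_energy / energy_of_system never decreases across periods."""
        ratios = [p["free_energy"] / p["energy_of_system"] for p in periods]
        return all(later >= earlier for earlier, later in zip(ratios, ratios[1:]))

    # Two successive periods of one imaginary economy, per capita:
    periods = [
        {"energy_of_system": 100.0, "free_energy": 20.0},  # before the new principle
        {"energy_of_system": 130.0, "free_energy": 27.0},  # after: higher "energy of the system,"
    ]                                                       # but the free-energy ratio has not fallen

    print(is_anti_entropic(periods))  # True, for these illustrative numbers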

This notion of a contrast of entropy to anti-entropy, lies outside what the ordinary university graduate considers mathematics. It lies within a higher, "meta-mathematical domain," which Leibniz identified as the domain of Analysis Situs, and which, in mathematics, is otherwise associated, in its more limited aspects, with hypergeometric forms of modular functions.(14) In other words, the generative (e.g., "causal") distinction between entropy and anti-entropy, as distinct types of ordering, can be reflected in the results of the relevant ordering, but can never be defined in terms of a statistical function, or any other deductive mode of argument.(15) 

This brings us to the indispensable role of a Classical Euclidean geometry in science. No one could possibly achieve competence in scientific matters, without a grounding in a strict geometry of this type, a grounding preferably by about the time of onset of puberty, or slightly earlier. On this account, the introduction of the so-called "new math," during the course of the 1950s, has crippled the cognitive functions of two generations of relevant university graduates. We shall state the case at an appropriate place, here below; but, at this instant, we proceed as if the reader had had the benefit of a pre-1966 U.S. standard for a competent, pre-science secondary education. 

To make this distinction in notions of ordering clearer to the reader, consider the self-bounded characteristics of a deductive form of geometry, such as a classroom version of Euclidean geometry. Such a geometry allows as theorems, only propositions which are not inconsistent with any among a fixed set of combined definitions, axioms, and postulates. In the method of Plato, such a set of definitions, axioms, and postulates, is termed an hypothesis. The introduction of a newly discovered, and experimentally validated, physical principle, or of a new principle of cognition as such, creates a new physical geometry, one which is pervasively inconsistent with any acceptable theorem of a preexisting, deductive system of argument: thus, requiring a new hypothesis.(16) Thus, physical science is focussed upon the nature of the ordering of successively more powerful hypotheses. (The ordering principle of such a succession is Plato's notion of Higher Hypothesis.) These higher, meta-mathematical, forms of ordering, such as the distinctions between efficiently entropic and anti-entropic orderings, are apparently "meta-mathematical" precisely for the reason that they reflect the efficiency of those axiomatic principles (i.e., of higher hypothesis) which do not exist within the previously established systems of deductively ordered beliefs. These are the crucial issues of Riemann's 1854 habilitation dissertation, and also the underlying issues of the notions of modular, hypergeometric functions in the work of Gauss and Riemann. 

I have found it convenient, pedagogically, to illustrate this point by reference to the estimate for the size of the Earth constructed by the famous Third Century B.C. representative of Plato's Academy at Athens, Archimedes' contemporary and correspondent, Eratosthenes. In summary, the illustration is as follows. 

From no later than the time of Thales, Classical Greece's original development of the currents leading directly into modern science had used sound principles to estimate the distances of both the Sun and the Moon from the Earth's surface. For reasons of scale intrinsic to the kinds of observation available, there was an inevitably large margin of error and difference, in and among these various observations. However, despite those margins of error, it was clearly shown to the Classical Greek mind, that the Sun was a very large object, at a very great distance from the surface of the Earth.(17) Eratosthenes, a representative of Plato's Athens Academy who rose to a topmost position in Egypt, conducted such observations himself. On the basis of that knowledge respecting the relationship of Sun to Earth, he devised a conceptually simple astrophysical approach to measurements in the geodesy of the Earth's surface. 

If one defines, astrophysically, the meridian line which connects Egypt's Aswan (ancient Syene) to Alexandria, and if one places plumb-bob-oriented gnomons (pins) within hemispherical sundials at measured distances along that line, the size of the Earth can be estimated with decent approximation. (Eratosthenes' estimate came within approximately fifty miles of the Earth's polar diameter.) The comparison of the angles of the shadow cast by such a series of gnomons,(18) when the shadows are each pointed, during the same day, in a north-south direction, implicitly defines the curvature of the Earth's surface along that interval of the meridian-line. 
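
For the reader who wishes to see the arithmetic of that gnomon comparison, the following is a minimal Python sketch using the figures traditionally reported for Eratosthenes: a shadow-angle difference of about 7.2 degrees, and a Syene-Alexandria distance of about 5,000 stadia. The stadion-to-kilometer conversion used here is one commonly assumed modern value, not a settled one.

    # Sketch of the arithmetic: the shadow-angle difference along the meridian,
    # taken as a fraction of a full circle, scales the measured arc up to the
    # whole circumference. Figures are the traditionally reported ones.
    angle_difference_deg = 7.2    # Alexandria vs. Syene, measured on the same day, at noon
    arc_length_stadia = 5000.0    # reported Syene-Alexandria distance
    stadion_km = 0.1575           # one commonly assumed conversion; the true stadion is uncertain

    circumference_stadia = arc_length_stadia * (360.0 / angle_difference_deg)
    circumference_km = circumference_stadia * stadion_km

    print(circumference_stadia)     # 250000.0 stadia
    print(round(circumference_km))  # about 39375 km; the modern meridional figure is about 40008 km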

Thus, for measuring all but very small areas of the Earth's surface, we must enter the domain of astrophysics, the domain of geodesy. We must abandon the limits of a two-dimensional survey, to include a third dimension, corresponding to the line of the radius of curvature(19) at each point (very small, e.g., infinitesimal interval) of the Earth's surface. All valid discoveries of new physical principles are analogous to this Eratosthenes experiment.(20) The validated new principle, which corrects the error in our previous doctrines about the physical universe, has the character of a new dimensionality in a physical space-time geometry. The discovery of that "dimensionality," constitutes the solution for the ontological paradox addressed. This new "dimensionality," appears to deductive opinion as in the form of an added axiom of deductive mathematical physics. 

Thus, metaphor has the same form as those ontological paradoxes which require validatable discoveries of new physical principle. The difference is, that ontological paradox looks at one aspect of man's interaction with the physical phase-space of that universe of which he is a part; metaphor looks, similarly, at the principles of individual human cognition themselves. On this account, Shakespeare, as master tragedian, is sometimes described as a "great psychologist." The problem to be solved, is the fact that some stubborn mental blocks prevent ill-fated men, women, and even entire societies, from either discovering, or accepting a feasible alternative to a self-imposed, awful destruction. To avoid such doom, we must discover those principles of both the individual mind, and of relations among individual minds, which will enable us to prevent repetition of such errors. 

It is metaphor which defines Classical art. It is the efficient interaction between discoveries of physical principle, the domain of physical science, and the use of Classical art-forms to uncover the moral principles of cognition, which defines a science of human history, the science of physical economy, and the corresponding principles of statecraft. 

Thus, prior to my apprehending the relevance of Riemann's work for the foregoing line of investigation, it was clear that the accumulation of new dimensionalities of validated discovery of physical and Classical-artistic principle was, at once, the expression of an increase of society's per-capita "energy of the system," and, at the same time, the source of an increase of the ratio of total "energy output," per capita, to "energy of the system," per capita. When Georg Cantor's development of the concepts of transfinite ordering is properly situated, within the framework of Riemann's 1854 discoveries, the means for expressing my anti-Wiener notion of anti-entropy, as the basis for a reform of economic science, is evident. 

In the development of European culture, Plato traced science to Pythagoras and his school, and the anti-scientific, or contemplative standpoint, to the succession of Eleatics, materialists, and radical nominalists, and, of course, Plato's enemies, the Aristoteleans. The first, the scientific standpoint, chooses as its primary subject-matter, the interrelationship between the self-development of the individual cognitive processes, and the human species' increasing power to exist, relative to the whole universe with which the human cognitive processes are interacting efficiently. The second, emphasizes the relatively nominalist standpoint of formal logic, placing mankind as observer of the mere representation of the sense-perceptual actuality. 

Thus, economic science requires, that the young members of society enjoy a quality of education which emphasizes reenacting validated original discoveries of physical principle and Classical art-forms, as opposed to merely learning approved representations and procedures. Hence, the functional significance of the difference between knowledge and mere learning. In economy, the essential requirement, is that the persons employed in the economic process must be capable of revolutionizing that process. This latter efficiency is fostered by that quality of education, in which the pupil reenacts validated original discoveries of principle, instead of merely learning "the right answer," without going through the experience of reenacting the discovery. The Classical humanist form of education, as opposed to the mind-destructive modes which have become almost universal within U.S. education today, addresses the most fundamental principle of a science of physical-economy. Any brand of economic teaching which ignores this principle, such as that of Adam Smith or Karl Marx, is intrinsically a hoax. 

--The LaRouche-Riemann Principle-- 

Although my original intention in challenging Wiener's "information theory" hoax,(21) was not aimed at so ambitious a result, by late 1951, it was clear to me that we must redefine the meaning of the term "science," contrary to generally accepted, pro-Aristotelean, academic usages at that time. This was not a redefinition in merely the dictionary sense of the term, but, rather, a new functional sense of scientific practice in general. "Science" could not be defined as the sum of mankind's experimental observations of nature. To eliminate the source of most of the monstrous errors promulgated as generally accepted classroom notions of "science," it was indispensable to discard entirely the pro-Aristotelean delusion of "scientific objectivity." Science must be understood, functionally, not merely in terms of validated physical principles, but, rather, subjectively: in terms of the adducible characteristics of those individual cognitive processes, within whose sovereign domain all validated discoveries of principle were generated as otherwise impossible solutions to a devastating paradox in the existing state of established scientific belief. We must understand, that what crucial experimental methods do, is to validate those types of cognitive processes which generate experimentally validatable discoveries of physical principle. 

The key to this proposed, improved functional notion of "science," and of scientific method, lies within the science of physical economy as Leibniz had defined it, and as I had freshly defined it at that point in my work. When the economies of entire nations, or, better, humanity generally, are considered as indivisible entireties, the anti-entropic form of increase of the potential relative population-density of a society, is a measure of mankind's increase of our species' per-capita power over nature.(22) 

This relationship, between the society and the universe at large, is rooted in the referenced distinctions of the individual person's, developable, sovereign cognitive processes, the unique role of those individual cognitive processes in generating (or, replicating the generation of) discoveries of principle, such as validatable discoveries of physical principle. This defines the individual's, and the relevant society's potential relationship to nature, a potential reflected as increase of potential relative population-density. However, the actual relationship of society to nature, is located within the structured social relations which shape the effective relations, respecting ideas for practice, among the sovereign cognitive processes of the individual members of society as a whole. 

If we consider the individual cognitive processes and these structured social relations as the subjective side of man's relations to nature at large, we can match this subjective side with the adequacy of the array of physical principles, and the rate of change of that latter array. Thus, the functional relations between man and nature must be conceptualized. That is the required basis for a functional notion of the term "science." 

This combination of interacting subjective and physical development, defines the scope and content of the science of physical economy, both as Leibniz founded it during the 1671-1716 interval of his life's work, and as I have reconstructed it in connection with my refutation of the "information theory" hoax. 

To address this consideration, we must now pause, as promised above, to bring certain readers into the picture. This includes, notably, those who were victims of the influence of "New Math" and kindred pedagogical obscenities, during their secondary and university education. 

We have referenced a term here, "LaRouche-Riemann Method." Since Riemann was born ninety-four years before my birth, and died nearly seventy years before I took up the study of Gottfried Leibniz's work: Why "LaRouche-Riemann;" why not "Riemann-LaRouche"? Two important considerations demand that the former, and not the latter, must be used in an intelligible representation of the content of this discovery. The first, relatively simpler point, is that after I had made a set of discoveries of principle, I then recognized that Riemann's work supplied the necessary clues for solving those problems of measurement posed by my earlier discoveries.(23) The second consideration, is a far more profound one, a consideration on which I have reported in various published locations, including my October 2, 1996 "The Essential Role of 'Time-Reversal' in Mathematical Economics."(24) The most efficient route to understanding both of the notions underlying the usage "LaRouche-Riemann Method," begins with the subject of the student's pre-science grounding in Classical Euclidean geometry. 

Although any relatively sound representation of Euclidean geometry to secondary pupils, will suffice to provide a foundation for intelligent discussion of the elementary issues of scientific method, no further comprehension of the subject could be realized without reference to the implications of Plato's dialectical method in shaping the origins of Euclid's geometry, and in enabling us to proceed from that geometry, to higher ones: to physical geometries. 

The entire collection of Plato's dialogues must be studied, not only for the particular topics addressed, but for the single method which underlies each and all among them: the Socratic dialectical method.(25) Ask the question: Whence are derived the kinds of definitions, axioms, and postulates which underlie a formal Euclidean geometry? The Socratic method demonstrates the answer. The Socratic dialectical method exposes a rigorous approach to "smoking out" otherwise hidden assumptions, assumptions treated naively as if they were "self-evident," assumptions underlying the choices between propositions which are believed, and those which are not. Euclidean geometry, is thus largely a product of the Socratic dialectical method, which was developed, through the Hellenistic and Roman periods, under the continuing influence of Plato's Academy of Athens. This, so viewed, is the exemplar of all formal systems of thought which are premised implicitly upon propositions sharing a common basis in a single set of definitions, axioms, and postulates. 

The application of this same Socratic dialectical method, of Plato, to that geometry itself, led to discovery of new, superior geometries. The most significant such discoveries began with the seminal work founding modern experimental physical science, the De docta ignorantia of a chief organizer of the 1439-1440 Council of Florence, Cardinal Nicholas of Cusa.(26) 
 
 

Cusa's work on the matter of methods for experimental development of physical science, led directly to the work of such among his explicit followers as Luca Pacioli, Leonardo da Vinci, and Johannes Kepler. This work, together with the added materials supplied from Pacioli, Leonardo, and Kepler, was the common foundation of such Seventeenth-Century leaders in science as Blaise Pascal, Christiaan Huyghens, and Gottfried Leibniz. That approach, as enriched by, and reflected within the labors of, most notably, Lazare Carnot, Carl Gauss, and Bernhard Riemann, served as the guide to my own work in the science of physical economy. 

The vicious fallacy in permitting the replacement of competent mathematics instruction by the so-called "New Math," is demonstrated by the fact that the "New Math" evades the existence of the most important issues of geometry in particular, and mathematics in general. These are the same issues indispensable for access to higher geometries. 

Modern science has shown, that the principal errors of assumption of Classical geometry, are the following: 

1. That geometry presumed, in error, that its axiomatic notions of space and time were self-evident principles of the universe, existing independently of any experimental proof. 

2. The prevailing, erroneous axiomatic presumption among Aristotelean, neo-Aristotelean, and other philosophically reductionist commentators on this geometry, was that extension in space and time was perfectly continuous as a matter of principle. In other words, that linear extension could be subdivided infinitely to such a degree that no margin for existence of discontinuity could occur within perfect extension. 

Although the first comprehensive refutation of these two errors was supplied by Riemann's 1854 habilitation dissertation, Riemann's discovery was implicit in much of the work of Plato and his followers, such as Eratosthenes. The most devastating event in refuting perfectly continuous extension, appeared in Cusa's De docta ignorantia, as Cusa's discovery of the fact that pi was not the type of incommensurable which Archimedes' quadrature had presumed it to be, but of a higher type, named later "transcendental."(27) The error of the assumptions of Descartes, Newton, et al., on this account, was addressed by Leibniz, who used both the issue of the catenary ("hanging chain") curve, and the Huyghens-Roemer-Leibniz-Bernoulli proofs of the isochronic characteristics of refraction of light,(28) to show that a mathematics derived simply from Euclidean presumptions of extension and continuity could not map the reality of the physical universe. 
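
For reference, the catenary which Leibniz cited in that argument is itself such a curve: when Leibniz, Huygens, and Johann Bernoulli resolved the hanging-chain problem in 1691, the resulting expression (stated here in modern notation, as a sketch only, with a the parameter fixed by the chain's horizontal tension and weight per unit length) is transcendental, not constructible from the algebra of simple extension:

    % The hanging chain in modern notation: a transcendental, not an algebraic, curve.
    % Here a = T_0 / (\lambda g): horizontal tension over weight per unit length.
    y(x) = a \cosh\!\left(\frac{x}{a}\right) = \frac{a}{2}\left(e^{x/a} + e^{-x/a}\right)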

For a student who is well-grounded in both Euclid and Plato, understanding of the matter is more or less readily accessed. The shift to a "New Math" program of instruction, has devastating, disastrous effects, on this account. The use of the radical demands of Bertrand Russell's Principia Mathematica, in the manner the "New Math" ideology does so, arbitrarily denies the existence of the crucial, ontological problems of Classical geometry: the false presumption, that extension in space and time is simply, self-evidently, both linear and perfectly continuous. All fundamental progress in modern science is premised on efficient acknowledgement of the reality, that space-time extension is neither self-evidently linear, nor perfectly continuous. 

If the student is able to recognize this issue, an understanding of the relevant problems is reasonably well charted. Lacking that recognition, comprehension is most difficult, if not impossible. 

This same recognition provides us the proper distinction between a lunatic sort of "ivory tower" mathematics (including making a virtual god of statistical methods), and physical science. 

The lunatic presumes that the appearance of the physical universe is something given to us by an "ivory tower" type of mathematics, that the laws of the universe can be derived, as Russell insisted, in the Principia Mathematica, as elsewhere, from a mathematics as he defined it. In contrast to the hesychast from the ivory-tower mathematicians' virtual space-time, the scientist, such as Riemann, insists, that the function of mastering mathematics is to perfect one's ability to construct an appropriate, previously non-existent mathematics, on the occasion of any validatable discovery of physical principle which refutes the assumptions of a previously adopted mathematical physics. 

Riemann's crucial breakthrough on this account, was to assail the previously persisting delusion, that extension in space and time were self-evident notions, rather than, as they are now shown to have been, principles subject to the tests of experimental validation. For example: for Riemann, as for Carl Gauss and Wilhelm Weber, the proof, by Weber, of the existence of the Ampere "longitudinal force" in electrodynamics, already sufficed to demonstrate the non-existence of linearity in the microphysically small. As Riemann insisted, it is in the domains of the very large (astrophysics) and very small (microphysics) that we must anticipate violations of the naive notions of linearized extension in space and time. 

Out of the combined work of Leibniz, Gauss, and Riemann (most notably), a refined general principle of experimental physical science emerges. In modern agro-industrial economy, this development falls naturally into the domain of the economic tradition of France's Jean-Baptiste Colbert (Leibniz's sometime sponsor) and one among Leibniz's more notable followers in science, France's Lazare Carnot. The essential task of a theory of knowledge,(29) is to define the means by which the appropriately developed cognitive processes of the individual mind, react to ontological contradictions (metaphors) by generating experimentally validatable new principles of nature and cognition itself. 

For this purpose, we must represent the process of physical-scientific progress in terms of successive generations of valid new physical principles. Thus, as this case has been identified above, we must represent scientific knowledge, by an ordered sequence of successively more powerful hypotheses, using the term "hypothesis" here, in the Platonic sense, as typified by a coherent set of definitions, axioms, and postulates in a Euclidean geometry. In this image, the sequence is to be defined by the efficient principle which generates that succession of hypotheses. 

That principle is the efficient principle expressed by successful generation of validatable new principles. For this purpose, it is convenient to describe the process of generating the discovery of such a new principle as a four-step process: 

1. The posing of an ontological paradox. This is representable in communication as a paradoxical, confrontational juxtaposition of valid new empirical evidence with that empirically validated, previously established system of belief, which implicitly prohibits the existence of the new evidence considered. 

2. The generation of an experimentally testable new principle which generates a new system of belief consistent with all the evidence. This action, which occurs behind the opaque screen of sovereignty of the individual's cognitive processes, is not representable in any system of communication. 

3. The statement of a proposed principle of solution, expressed in terms of the paradox addressed. This is representable. 

4. The experimental design, which tests the efficiency of the discovered principle. This is representable. 

It is the second of those four steps which is troublesome. Although it is not directly representable in any deductive system, including a mathematics, it is knowable through the replication of the same four-step act of discovery by other minds, such as those of students. Furthermore, the accuracy of the experience of those minds, is verifiable in terms of the implications of the four-step replication. Thus, it is, contrary to empiricist and positivist dogma, a knowable conception, or, what Plato identifies by the term idea. In other words, persons who have replicated that discovery within their own minds, know that discovery as step two of the four-step process identified here. They may then use words, or other representable expressions to identify, by reference, the existence of that idea as the subject of their thought. 

For scientific progress, it is not a defect, but rather a superlative advantage, in such an idea, that it can not be derived by means of grammar, or any deductive device. Since the discovery of validatable principle which resolves an ontological contradiction is truthful, those who condemn ideas, as Plato defines ideas, are persons incapable of truthfulness, and therefore most untrustworthy types of scientists. 

Similarly, a competent performance of a Classical musical composition can not be accomplished by a literal reading of a printed score. In Classical composition, as opposed to the bang-bang parodies of musicality, the musical idea is of exactly the same metaphorical origin as a discovery of a validatable new physical principle. The musical idea is located in the equivalent of our Step Two of a cognitive process, here. The same principle reigns in all Classical forms of plastic and non-plastic art. It is within these qualities of idea, that our essential humanity lies, that we exist as made in the image of God. 

In science, of course, the obvious point, is that these ideas are the most efficient power in all human practice. Thus, they are superbly real, far more real than any mere object of sense-perception. All efficient physical principles are of this same ontological quality. 

In Plato, the process of generating successively more powerful hypotheses, by what we have represented as a four-step cognitive action, is referenced as an efficient, and knowable principle (idea) of higher hypothesis. That is to say, that the development of the creative powers of the cognitive processes of the student, through the successive acts of recreating discoveries by the indicated four-step method, rather than merely learning those discoveries in a text-book fashion, trains the cognitive processes to attack ontological paradoxes in a certain fashion. This developed method of attack, as expressed through successively successful applications, represents, thus, a knowable idea. This quality of knowable idea corresponds to the notion of higher hypothesis. The generalization of the improvement in higher hypothesis, provides the idea corresponding to Plato's "hypothesizing the higher hypothesis," the cognitive aspect of Plato's principle of "Becoming." 

Thus, in this light, science becomes the matter of organizing the mental and related activities of groups of scientists and others, around a task-oriented process --a mission-- of perpetuating scientific progress, in this sense, as a series of successively more powerful hypotheses represents such progress. 

This brings us to the second point, the matter of "time-reversal." 

Let us agree to describe propositions which are not-inconsistent with any among the definitions, axioms, and postulates of a formal hypothesis as theorems of that hypothesis. Thus, we have, corresponding to a fixed such hypothesis, an expandable array of theorems so defined: a theorem-lattice. Within such lattices, there is an associated notion of sequence. For example, the fact that the derivation of some proposition is conditional upon the preceding derivation of another proposition, represents a sequence. This is the epistemological form in which the notion of "time" appears, not as a self-evident, linear form of extension, but, rather, as a relative form of extension rooted in experimental physical science, rather than in a merely formal mathematics. 
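
As a purely formal illustration of that notion of sequence (a sketch with invented theorem names, not a claim about any particular geometry), a theorem-lattice can be represented as a dependency structure, in which the only admissible orderings are those consistent with each derivation's prerequisites:

    # Sketch: a theorem-lattice as a dependency structure. The names and the
    # dependencies are invented for illustration; "sequence" here means only
    # an ordering consistent with the derivations' prerequisites.
    from graphlib import TopologicalSorter  # Python 3.9+

    theorem_lattice = {
        "axioms":        set(),
        "theorem_A":     {"axioms"},
        "theorem_B":     {"axioms"},
        "proposition_C": {"theorem_A", "theorem_B"},   # C presupposes both A and B
    }

    order = list(TopologicalSorter(theorem_lattice).static_order())
    print(order)   # e.g. ['axioms', 'theorem_A', 'theorem_B', 'proposition_C']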

In contrast, an hypothesis exists, relative to its theorem-lattice, as independent of time, as seemingly "eternal." 

Hence, in the adoption of an hypothesis, we have implicitly adopted the past, present, and future propositions, theorems, events, etc., implicit in it. The case for higher hypothesis is an analogous one. Hence, the decisions we make in generating validatable principles of nature, have the form of letting the future consequences of our actions guide our present actions: apparent time-reversal. 

Such is the notion of laws of the universe. To the degree our perception of such laws is accurate to within a given number of future centuries, millennia, and so forth, the corresponding future, acting through us, is acting upon the present. This appears to be "teleology," but, as we shall now indicate, a far different type of teleology than that which is sometimes brushed against, briefly, in the undergraduate philosophy semester. 

--Kepler & The "Three-Body Problem"-- 

There are two points of caution to be emphasized at this juncture. First, we must consider the possibility, that not only do mankind's notions of laws of the universe change, but, that the laws of the universe themselves may change in a more or less analogous manner. No sane scientist would be so reckless as to propose either a "Big Bang" creation, or a universe according to Hoyle: except, as he, or she presented such a thesis in the form of a question, such as: "Let us ask ourselves why some people are lured into adopting a piece of cosmic dogma as absurd as this? What, ladies and gentlemen, is the fallacy which is expressing itself in the putting-forward of such absurdities?" 

Consider the setting for what is frequently identified as the "three-body problem." 

The idea of a universal gravitation was introduced by Johannes Kepler in 1609, in his The New Astronomy. This was a notion which he linked, there, to the phenomenon of magnetism.(30) Kepler derived an expression for gravitation from his famous three laws. Newton and his associates later plagiarized this, Kepler's discovery of gravitation, and derived the famous Newtonian "law of gravitation" as an algebraic manipulation of Kepler's original formulation.(31) 
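
To indicate how such an algebraic manipulation runs, the standard textbook reduction (a sketch, confined to the simplified case of a circular orbit; Kepler's own formulation concerned elliptical orbits) combines Kepler's third law with the centripetal acceleration of uniform circular motion, and an inverse-square dependence falls out:

    % Simplified, circular-orbit case only.
    T^2 = k\,r^3 \quad\text{(Kepler's third law)}, \qquad
    a_c = \frac{4\pi^2 r}{T^2} \quad\text{(centripetal acceleration)}
    \;\Longrightarrow\;
    a_c = \frac{4\pi^2 r}{k\,r^3} = \frac{4\pi^2}{k}\cdot\frac{1}{r^2},
    \qquad F = m\,a_c \propto \frac{m}{r^2}.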

The seemingly curious result of the English empiricists' plagiarism is, that Kepler's notion of the ordering of the solar system worked, but Newton's plagiarized, reductionist, algebraic derivation did not. The failure of Newton's method is a paradox known as "the three-body problem." The solution to that paradox follows from the seemingly "teleological" argument we have outlined immediately above. 

The crucial issue permeating that paradox is the popular classroom fallacy identified as echoing Thomas Hobbes' implied ideological blind faith in the existence of linearization in the infinitesimally small.(32) To be as brief as the subject itself permits, consider the following question. 

Reference the example of Eratosthenes' estimate for the implied size of the Earth, from his estimating the circumference of the Earth from the curvature of a measured interval along the meridian-line between Syene and Alexandria in Egypt. Compare this with the method developed, and employed by Carl Gauss, to demonstrate that the newly discovered heavenly body, Ceres, was an asteroid with the harmonic orbital characteristics which Kepler had specified for a missing planet's orbit, between the orbits of Mars and Jupiter. Compare this with a generalized notion of curved surfaces developed for astrophysics, geodesy, and geomagnetism, by Gauss. The question is, can we infer the trajectory of the entirety of a lawful motion from the curvature of an observed small interval of that trajectory? Or, in the alternative: is the orbit determined, from instant to instant, by the mechanical (e.g., "Newtonian") interaction of bodies and related forces? 

Kepler's argument, derived from the line of thinking of such adopted predecessors as Nicholas of Cusa, Luca Pacioli, and Leonardo da Vinci, was that the lawful orbits of the solar system were predetermined as knowable pre-orderings. In Kepler's work, this notion underwent expression in differing forms. However, throughout, his principle was that these orderings, which we might associate with the principle of Analysis Situs, expressed an efficiently underlying, not-entropic principle. On this account, the entirety of his astrophysics coheres with the view, that it is feasible, on principle, to derive a measured curvature of a lawful orbit within a very small interval of observation, to such effect that we can adduce the entirety of that trajectory from the characteristic curvature of least action in that small interval. 

This is the same method underlying the mathematical tactic developed by Gauss, by means of which he solved the orbit of Ceres. 
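
As a toy illustration only, and emphatically not Gauss's actual Ceres method (which reconstructs a Keplerian ellipse from a handful of geocentric sightings, a far harder problem), the following Python sketch shows the simplest case of the underlying idea: the curvature measured over a very small arc of a circular trajectory already determines the whole trajectory.

    # Toy illustration only: three observations spanning a two-degree arc of a
    # circular path suffice to recover the whole circle. This is NOT Gauss's
    # Ceres procedure; it merely illustrates "whole trajectory from the
    # curvature of a small interval" in the simplest possible case.
    import math

    def circle_through(p1, p2, p3):
        """Return (center, radius) of the circle through three non-collinear points."""
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
        ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
              + (x3**2 + y3**2) * (y1 - y2)) / d
        uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
              + (x3**2 + y3**2) * (x2 - x1)) / d
        return (ux, uy), math.hypot(x1 - ux, y1 - uy)

    # Observations at 0, 1, and 2 degrees along an orbit of radius 10:
    points = [(10 * math.cos(math.radians(t)), 10 * math.sin(math.radians(t)))
              for t in (0.0, 1.0, 2.0)]
    center, radius = circle_through(*points)
    print(center, radius)   # approximately (0, 0) and 10: the small arc yields the whole circle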

In other words, the determination of the apparent change (curvature) in a lawfully-determined trajectory, as distinct from a kinematically determined one, is of the order of an higher hypothesis, relative to any single hypothesis governing a mechanical approximation of a portion of that trajectory. 

Now, turn immediately to the case of the characteristic curvature of interaction of the human species with the universe at large. As we have indicated, the characteristic action, which distinguishes the human species from the higher apes, is the anti-entropic impact of the generation of a validatable principle within what we have located as Step Two of the Four-Step process of discovery of such a principle. This represents a change of curvature, distinguishing the human species absolutely from all other species. This determines the specific physical-space-time curvature of the human species' existence. 

This curvature is located immediately in the very small: within the cognitive processes of an individual mind, within a monad. 

The same principle serves us, as it distinguishes particular processes which are living, from particular processes which are not. What is the difference between the characteristic of a carbon atom as a functional part of a living process, and that of the same carbon atom which has moved on to become part of a non-living process? Categorically, consider the distinction between anti-entropic and entropic processes as a matter of "curvature" in the Gauss-Riemann sense of that term. 

Consider another useful illustration of the issue, before turning directly to the relevance of this to the subject of "environmentalism." 

Consider the rather commonplace, fallacious argument, that thermonuclear fusion of like-charged material is fatally resisted by mutually repulsive "Coulomb Forces" in the vicinity of atomic-nuclear distances. On what authority is it asserted, that the simple "Coulomb Force" operates throughout the atomic-nuclear scale as it appears to act on the macro scale? Wilhelm Weber's successful demonstration of the efficiency of an electrodynamical agency called the "longitudinal force," more than a hundred years ago, had already shown that the assumptions usually attributed to the "Coulomb Force" do not operate in that same fashion once a certain smallness of distance from the nucleus has been reached. 

From many analogous examples, it should have been clear to all serious scientific thinkers, centuries before this time, that lawful physical principles are expressed as such, in manners which suggest that the present action according to such a principle of lawfulness, functions as if it were a response to a future state of that same process. In other words, from the kinematic standpoint, it must appear to the alarmed empiricist or materialist ideologue, that forces are also acting through time-reversal, just as we might imagine forces to be acting, contrarily, in the present, to generate future states. 

This paradox is demystified, immediately we introduce the notion of higher hypothesis. Relative to any sequential mathematical scheme cohering with a consistent hypothesis, the relevant hypothesis is operating with relatively equal efficiency, simultaneously, in past, present, and future. Relative to any ordered sequences of hypotheses, or of the changes in physical states corresponding to such sequences of hypotheses, the implied higher hypothesis is fixed as operating, simultaneously, and efficiently, in past, present, and future. Look at the Crab Nebula, for example, with regard to the anomalous case of the attributed speed-of-light distances among the component points of that coherently changing object. 

Is this merely the present author's conjecture? Not at all. It would appear to be merely conjecture, only if one commits the blunder of accepting Aristotle's fraudulent notion of the detached observer. Once we recognize that scientific knowledge is obtained, not by contemplating the universe, but by studying how we may generate those thoughts which enable us to efficiently act to change the universe, then the principles of cognition underlying the discovery of lawful physical principles, are the epistemological basis for defining the underlying determination of validatable physical laws. 

--Examples from Physical Economy-- 

Pedagogically, the simplest and clearest, experimental demonstrations of the issues and principles, are from my field of specialization, the domain of the science of physical economy. The most economical choices of examples, reference five historical cases: The revolutionary reconstitution of France by King Louis XI (1461-1483); the first science-driver model of economic growth, that directed by France's Minister Jean-Baptiste Colbert; the invention of the steam-powered industrial revolution, by Gottfried Leibniz, during his work of 1671-1716; the 1792-1814 science-driver "crash program" devised and directed by France's Leibnizian, Lazare Carnot, the originator of the machine-tool industry, and his former teacher and collaborator Gaspard Monge of the 1794-1814 Ecole Polytechnique; and, the American model of modern industrial economy, the most successful form of economy developed to the present date, that 1861-1876 development, devised by economist Henry Carey, begun under President Abraham Lincoln, and successfully introduced to Japan, Germany, and Russia during the 1870s. For our purposes here, we sum up the principles adducible from the Carey-Lincoln, updated version of that Franklin-Hamilton model of the Leibnizian "American System of political-economy," the updated version developed in the 1861-1876 U.S.A., and then copied by Germany and Russia, in cooperation with Henry C. Carey, beginning 1876. Again, that latter was the model which made the U.S. economy the most powerful nation-state economy of the world, and the technologically most advanced, during the course of the 1861-1876 industrial revolution.(33) 

In the course of summarizing that point, we bring the discussion to focus on two crucial expressions of policies which have destroyed the U.S. economy and popular culture, including a correlated general degeneration in religious culture, during the recent thirty-odd years. We come to those cases at the appropriate point below. 

The secret of the highest rates of progress in conditions of life of a nation and its population, is typified by the program of the Ecole Polytechnique under Gaspard Monge's direction. The center of that program was the education of what were named "brigades" of adolescent students, producing, thus, the most advanced and powerful center of scientific and engineering work in the world up to that time. For the key to this success, refer to the Four-Step model of original discovery and education, which I have outlined above. Refer also, to Lazare Carnot's invention of the machine-tool principle of high-precision mass-production, which France's "Organizer of Victory" Carnot introduced, during 1792-1794, to effect the rout of all of the armies invading France at that time, and to suddenly, during that two-year period, establish the armies of France as the most powerful and technologically most advanced in the world at that time. 

Trace the combined role of a Four-Step model of Classical secondary and higher education and the machine-tool-design principle, within the setting of the Carey-Lincoln economic revolution of 1861-1876 and its emulation by Germany and by the Mendelyeev-Witte faction of industrial development in 1876-1905 Russia. Compare this with Franklin Roosevelt's economic revolution during World War II, and with the German-American aerospace program of the 1945-1966 interval.(34) See accompanying Figure 1, a flow-chart outlining the principles of a machine-tool-design driven economy. 

It is the development of the (creative) cognition of the individual student's mind, through repeated experiences of the type described by the Four-Step method, which enables those students to focus the thus-developed creative powers corresponding to Step Two, for the solution to problems posed in the form of an ontological paradox of science, or a metaphor of Classical art-forms.(35) As Figure 1 portrays, that generation of validated discoveries of principle, which occurs as a product and, largely, a by-product of such Classical-humanist modes of secondary and higher education, produces both a flourishing of new machine-tool-design principles, and also a highly adaptive labor-force, capable of mastering the newly introduced technologies. 

That example typifies the fact, that a sustainable net profit of a national economy is generated only through the anti-entropic impact of this, or related, modes of proliferation of, and investment in, the benefits of scientific and technological progress. The agency which generates that realized anti-entropy, is the agency expressed as Step Two of the Four-Step process. 

This agency, this sovereign cognitive potential of the individual person, is the location of that which defines man and woman as each made in the image of God. The passion associated with the kind of creative activity represented by Step Two, is termed agape in the Classical Greek of Plato and the Apostle Paul. Paul's I Corinthians 13, exemplifies that principle as at the center of all Christianity. Christianity is the love which is agape: both agape as the passion of characteristically distinctive human activity (i.e., creative cognition), and the fostering of that quality which is "made in the image of God," agape, within each human being. 

In contrast, it is fair to describe social theories such as the definitions of "human nature" by Thomas Hobbes, John Locke, Bernard Mandeville, and Adam Smith, as satanic. The idea of "free trade," or, as Francois Quesnay termed it, laissez-faire, is best understood by recognizing it as a relic of Manicheanism, the Bogomil form of Manicheanism most notably. This represents a denial of the efficient existence of that which defines the individual as "made in the image of God," creative cognition, and the substitution of lusty bestial alternatives, such as the Seven Deadly Sins, for agape, in the ordering of the interpersonal behavior of society. So, "free trade" is nothing other than the doctrine which Mandeville and his devotee von Hayek described it to be; they insist, that good comes only from giving unrestricted license to evil. For them, there is nothing in man but the linear extension of those passions which are associated with the Seven Deadly Sins. The essence of Friedrich von Hayek's satanic definition of "freedom," were best described by the motto, "Let the inner sow loose!" 

Thus, as institutionalized practice shapes the expressed "curvature" of the individual person in society, so is determined the characteristic feature of the trajectory which the history of that society will follow. 

Look at the general principle once again, in light of that example from the domain of political-economy. 

In determining the nature of the lawful interactions among ostensibly non-living, living, and cognitive processes, we must proceed by recognizing that these processes are distinguished from one another in terms of differing characteristic physical-space-time curvatures, notably in their infinitesimally small intervals of action. If one, then, proposes to define an interaction among processes of such characteristically, mutually distinct curvatures, by use of "models" which arbitrarily presume mechanistically linear interactions in the very small, the resulting calculation can be guaranteed to be absurd, totally false to reality. 
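Consider, purely for illustration, a minimal numerical sketch of that fallacy (the figures and the procedure are hypothetical, chosen for pedagogical simplicity, and represent neither any particular "environmental" model nor the constructive methods discussed above): take a process whose change within each small interval depends upon the state of the process itself, and compare it against a "model" which extrapolates, linearly, the increment observed during the first such interval.

    # Illustrative sketch only (hypothetical figures): a process whose rate of
    # change depends upon its own state, versus a forecast which treats the
    # first observed increment as if it were a constant, linear rate.

    def actual_process(x0, r, steps):
        # The state multiplies by (1 + r) in each interval: non-linear "in the small."
        history = [x0]
        for _ in range(steps):
            history.append(history[-1] * (1.0 + r))
        return history

    def linearized_forecast(x0, r, steps):
        # Extrapolate the first interval's increment (x0 * r) as a constant.
        return [x0 + (x0 * r) * n for n in range(steps + 1)]

    actual = actual_process(100.0, 0.05, 60)
    linear = linearized_forecast(100.0, 0.05, 60)
    print(round(actual[-1], 1), round(linear[-1], 1))   # approx. 1867.9 versus 400.0

After sixty such intervals, the linearized forecast understates the actual state by a factor of nearly five; the discrepancy is not a small correction, but grows without bound as the span of extrapolation is extended. The same blunder, compounded across many interacting processes of differing characteristic curvature, is what renders the "models" in question absurd.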

Notably, if one projects "environmental" calculations which leave out the role of human cognition in the technological development of economy, the resulting judgment on the relations between man and nature will be totally false to reality. The spread of disease, as a result of the banning of DDT; the increase of morbidity rates in populations around the world, as a result of the impact of the "Ozone Hole" hoax on refrigeration of the food-delivery chain; and the threatened accelerated increase of death-rates, globally, from the "Global Warming" fraud, are warnings of the dangers involved in linearized thinking about living processes. 

It is more than fair to sum up that point, respecting the fallacy of linearization, thus: If one assesses the impact of economy upon ecology, by reference to any of the generally accepted varieties of classroom economics doctrine today, the resulting conclusion is necessarily a fraudulent one. 

--In The Matter of Proof-- 

In the matter of what might carelessly be termed "environmental science," there are two broad classifications. 

One, is the standard of scientific proof generally accepted by specialists in the relevant fields prior to 1962-1972. Proofs from this quarter may have their problematic features, but the standard of practice from that period was "within the ball-park" of truthfulness and competence. This standard worked, not because the mathematics employed was particularly good; it worked, usually, despite bad mathematical models, because the standard applied for purposes of policy shaping, was that of crucial-experimental demonstration of principle, rather than reliance on mathematical models as such. 

The second standard, is the ideological one associated with the influential "1001 Club" which was established under Britain's consort Prince Philip and the Netherlands drone Prince Bernhard, as an adjunct to the 1961 founding of the World Wildlife Fund. The prescriptions of this second standard, are usually not merely incompetent, but outright hoaxes. Three prominent examples of such frauds are those just cited above: Rachel Carson's fraudulent allegations against DDT --for which no scientific proof was ever supplied-- F. Sherwood Rowland's "Ozone Hole" hoax, and the "Global Warming" hoax. 

One of the most revealing case-studies is found in the campaigns against the use of nuclear fission as a source of energy. In response to the critics' question --whence shall we secure the needed energy-supplies to replace nuclear-fission sources?-- the replies by the anti-nuclear propagandists were invariably either frauds or simply the foolish babbling of wild-eyed illiterates. 

The most crucial, rule-of-thumb parameters for defining the principal energy-sources of society, are power per kilogram of fuel, and "energy-flux density" in the available mode of generation of usable power. That is, in the latter case, the amount of usable energy-flow passing through a cross-sectional area per second. Kilowatts per square centimeter is one such rule-of-thumb measurement. This is a notion as old as the Ecole Polytechnique's Sadi Carnot, and as durable. The higher, and the more coherent, the organization of the energy-flux density, the more efficient the energy-flow per watt-hour transmitted. 
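For a sense of the magnitudes involved (round, hypothetical figures, for illustration only): a source delivering 1,000 megawatts of usable power through a cross-sectional area of one square meter corresponds to an energy-flux density of 1,000,000 kilowatts through 10,000 square centimeters, or 100 kilowatts per square centimeter. The same 1,000 megawatts, spread across an area a hundred times larger, yields only one kilowatt per square centimeter: the gross wattage is unchanged, but the density of the flow, in the small, is not.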

The issue is not simply crude heat-efficiency, but the relationship of energy-flux density in the very small to threshold values for certain types of physical reactions. Thus, the level of technology, and thus of average productive powers of labor, which could be achieved, is constrained by considerations of energy-flux density, related considerations of coherence, and so forth. 

Today, for example, in physics generally, the foreseeable future improvements in energy-sources, are, first, successive generations of improvement in controlled thermonuclear fusion, and, second, the calculably still-higher orders of energy-density, if it proves possible to control a matter/anti-matter reaction as an energy-source in, for example, inter-solar-system and stellar explorations. 

In these matters, the political proponents of "soft" energy-supplies are illiterate fanatics. Politically, they are dangerous illiterates. They typify a society which has substituted the "encounter group's" notion of "sensitivity," for both truthfulness and even sanity. In other words, they are essentially immoral people. If they are to be judged "sincere," then one must say that they are as "sincerely immoral," as, perhaps, the followers of Satan should be. 

Certainly, among the principal authors of the modern "anti-technology" cults, British Consort Prince Philip, the 1961 cofounder of the World Wildlife Fund and "1001 Club," is utterly evil, as also Prince Bernhard of the Netherlands, the other cofounder, who took time on the day of his wedding to a Dutch princess, to sign, "Heil Hitler," in a letter of resignation from the Nazi SS, which he sent personally, directly to Nazi Führer Adolf Hitler. Over the recent decades, such would-be Mephistopheles have succeeded in recruiting a large number of would-be Fausts. To wit: 

Generally, through the influence of foundations and other ideologically motivated institutions fitting the same paradigm as the "1001 Club," a kind of industry of environmentalist hoaxes has been established. Graduate students and others have found that the easy way to make a living, is to go on the payroll of an institution which wishes to have putative scientific support for one or more of these "environmentalist" hoaxes. 

Rowland is notable, not only because his personal celebrity was built around such corrupt practices, but because he typifies the way in which computer technology has been misused, as a substitute for science, in the concocting of the fraudulent studies produced by professionals who have prostituted themselves to making their careers as the equivalent of call-girls or street-walkers in this manner. The fact that this corrupt practice has proliferated as long as it has, lends to the "environmentalist" juvenile delinquents of yesterday's pseudo-science that bit of balding and touch of snow in the thatch which is too often mistaken by the credulous onlooker for a sign of mature judgment. Through the personal success of Rowland, and the growing cheapness of modern personal computers, the "Ozone Hole" hoax has made the fraud of the "computer model" the fashion leader of the "environmentalist" industry. 

When a person has become immoral, in such ways as we have indicated here, it is not required that we also prove them corrupt. To call the Devil wicked, it is not necessary to prove that he takes bribes. In response to the thought expressed by one questioner, doubtless a commonplace thought: to prove that Satan is evil, it is not necessary to discover that he, or a slave-owner, for example, has been bought. 

Let us conclude with a relevant observation on that concluding topic, the topic of wickedness. Too often, when a horrifying type of crime has been committed, too many speculate on what they imagine might have been the motive of Hobbesian or Lockean "self-interest" which drove the perpetrator to such hideous extremes. The word of caution to those who dupe themselves into playing such parlor games, is the fact, that sometimes a killer kills because he enjoys killing, and kills in an extraordinarily nasty way, because his impulse will not be gratified otherwise. Sometimes it is less that the victim has evoked hate, than that hate has sought out a convenient victim for its expression. No one kills out of "impersonal motives"; and, in times when the greatest degree of evil is afoot, it is less and less often the case that malice arises from perceived issues of "self-interest," and increasingly the case that the expression of malice has become, in itself, the perpetrator's "self-interest." 

Many environmentalists are honestly illiterates, of whom we might say, "They know no better." Many, like most of the "radicals" of the 1964-1968 campus ferment, were brainwashed into what they became, because of the explosions of lability, suggestibility, and desire for flight from reality, induced by such triggering factors as the 1962 missiles-crisis, the assassination of President John F. Kennedy, the nightly horror of TV footage from Vietnam, and the murder of Rev. Martin Luther King, Jr. However, those, such as Princes Philip and Bernhard, or Dame Margaret Mead, who preyed upon these victims, in order to induce in these unfortunates the aberrant states, were purely evil persons, whose motivation was malice per se. 

In any case, when the habit of rejecting truthfulness becomes a functional state of mind, the condition of moral corruption has already taken command of that personality. That evil mind then needs no special consideration to be prompted to express the quality which that mind had acquired. 

NOTES 

1. (Washington, D.C.: 21st Century Science Associates, 1992). 

2. ibid. 

3. The "politically correct" language codes introduced at some leading, present-day universities, typify of contemporary definitions of "political correctness" cum "mainstream opinion," and recall George Orwell's fictionalAnimal Farm and 1984. The non-fictional, real-life precedent for the today's "mainstream opinion" was the Josef Goebbels' Nazi Propaganda Ministry. 

4. As I emphasized to my students in each of the courses on economics which I taught at sundry campuses during 1966-1973, Marx noted that his "model" excluded consideration of "the technical composition of capital;" that "exclusion" is the formal root of the fallacy of his models of "extended reproduction" and "falling rate of profit." That admission reflects his exclusion of the relevant cognitive principle from his scrutiny. The use of the term "Manichean," to identify Hobbes, Adam Smith, the Mont Pelerin Society, et al., is neither simile, nor hyperbole; from Thomas Hobbes' Leviathan onward, the entirety of the English and British empiricist and the Franco-Austrian positivist doctrine for economics, is derived explicitly from the continuing influence of the notorious, neo-Manichean, Bogomil cult in the region of Toulouse and the Rhone. Bernard de Mandeville's Fable of the Bees, the official "Old Testament" of Friedrich von Hayek's Mont Pelerin Society, is explicit in its translation of Hobbes' "each in war against all," into that doctrine, that good comes spontaneously from awarding evil practices the license of laissez-faire. Like all varieties of Manicheanism, the premise of the argument of Hobbes, John Locke, Bernard Mandeville, Adam Smith, Jeremy Bentham, John von Neumann, and others, is that Satan rules the universe of the flesh (the material realm), while God (pending some Judgment Day) is confined to the smaller, ineffable realm of spiritual life, within the person, family, and church. Hence, the argument of these Manicheans, such as U.S. Associate Supreme Court Justice Antonin Scalia, that no moral purpose must be superimposed upon Satan's church, "the marketplace" of Michael Novak's economics theology. 

5. Lyndon H. LaRouche, Jr., "The coming Pearl Harbor effect," Executive Intelligence Review, Sept. 12, 1997. 

6. ibid. Also, Hartmut Thieme, "Lower Paleolithic hunting spears from Germany," Nature, Vol. 385, Feb. 27, 1997, p. 807.

7. The term, "LaRouche-Riemann Model," was introduced at a November 1978, New York City meeting of representatives of both Executive Intelligence Review (EIR) and the Fusion Energy Foundation (FEF). The topic of that meeting was the securing of declassified Soviet reports which showed that the Soviet design for the "hydrogen bomb" had relied upon the principles of isentropic compression derived from Bernhard Riemann's \xDCber die Fortpflanzung ebener Luftwellen von endlicher Schwingungsweite [Bernhard Riemanns Gesammelte Mathematische Werke, H. Weber, ed., (New York: Dover Publications reprint, 1953), hereinafter identified as Riemanns Werke: pp.156-175]. The focus of the discussion was the stubborn adherence to doctrines axiomatically premised upon the absurd axiomatic presumption of linearization in the infinitesimally small, prevailing among otherwise gifted circles of leading plasma physicists and others engaged in aspects of fusion-energy development. Into this discussion, the present writer pointed out two suggested practical considerations. First, that the Riemannian shock-wave effect is also characteristic of the domain of physical economies, where it is expressed in transitions to higher technological domains, and, also, collapses into lower states. Second, that the principles applicable to relevant plasma problems could be illustrated by a quarterly, computer-assisted forecasting model for the U.S. economy which EIR could produce, with cooperation from FEF scientists. The present author supplied the set of constraints to be used in transforming U.S. official data into the form needed to produce such forecasts. However, it must be recognized by all concerned, that the measurements to be made in connection with that modelling, must be interpreted from the standpoint of the implications of Riemann's 1854 habilitation dissertation, \xDCber die Hypothesen, welche der Geometrie zu Grunde liegen, Riemanns Werke, pp. 272-287. Hence "LaRouche-Riemann Model." That latter name was used for the most successful of any published quarterly forecast reports for the U.S. economy, from late 1979 through the Third Quarter of 1983. As I informed a nationwide TV audience during early 1984, the forecasting was dropped at the close of 1983, because of the wildly fraudulent, "cosmetic" statistical practices of the U.S. Government and Federal Reserve System, introduced during the closing period of 1983. The crucial issue there, "linearization in the very small," is also the crucial issue in this present report. See below. 

8. LaRouche, op. cit. Also, A. Chaitkin, Treason in America (New York: New Benjamin Franklin House, 1984).

9. There is no true art, in any form, without metaphor. Metaphor is not merely a required feature of all art; it is the common, principal subject-matter of the entirety of any and all works of art. "Classical" is rightly employed as a term derived from reference to ancient, Classical Greece, in the latter's role as the origin of all of European civilization's post-Archaic art-forms, to the present date. We are obliged to employ the qualifying term, "Classical art," because of a misguided, widespread opinion which includes works violating the Classical principle of metaphor under the rubric of "art." 

10. Über die Hypothesen, welche der Geometrie zu Grunde liegen, op. cit. 

11. On Analysis Situs: various locations. 

12. Riemanns Werke, pp. 509-538. 

13. In the Carl Friedrich Gauss Werke (Hildesheim-New York: Georg Olms Verlag, 1981), the relevant Gauss writings, as known to Bernhard Riemann during the 1850s, are to be found as follows: biquadratic residues: Vol. II, pp. 65-148; curved surfaces: Vol. IV, pp. 188-334; on the influence upon Riemann exerted by Gauss's work on hypergeometric series, see Riemann's Vorlesungen über die hypergeometrische Reihe, Riemanns Werke, pp. 69-108. Compare the latter paper of Riemann with Gauss's diagrams, and the commentator's associated text, as presented on pages 102-104 of Ludwig Schlesinger's Über Gauss' Arbeiten zur Funktionentheorie [Werke, Vol. X]. 

14. ibid. 

15. To qualify the use of Leibniz's term, Analysis Situs, here: The writer's discoveries of the 1948-1951 phase of his project of refuting Wiener, et al., defined "Analysis Situs" as follows. In examining the way in which mankind's continued existence depends upon successful interaction with the universe at large, scientific method must proceed from recognition that the evidence to be considered touches three distinct qualities of function, as these are expressed in terms of three distinct qualities of specific forms of empirical evidence. The specific forms of empirical evidence are assorted among: 1.) relations which are knowable directly through sense-perception: Macrophysics; 2.) relations in the very large, which can not be observed directly through the senses: Astrophysics; and, 3.) relations in the very small, which lie totally beyond the reach of direct activity of the senses: Microphysics. The functional distinctions, encountered in all three of the foregoing forms, are: A.) particular processes which are ostensibly non-living in themselves, including non-living organic processes; B.) particular processes which are ostensibly living; C.) cognitive processes. Thus, Analysis Situs pertains to all possible, functionally significant permutations among the nine "cells" defined by this three-by-three array. Ultimately, Analysis Situs is the notion that a single ordering-principle implicitly subsumes the ordering of all those permutations. This higher ordering-principle is equivalent to Plato's notion variously identified as "Becoming," or "hypothesizing the higher hypothesis." 

16. As indicated by Riemann, in his referenced, 1854 habilitation dissertation, we can not derive the metrical characteristics of physical space-time merely from the dimensionality of the manifold. We must also consider the non-linear colligation among the physical principles represented by these dimensions. In other words, we must measure, experimentally, the metrical characteristics of the actual physical space-time represented by the manifold. The methods employed for this purpose by Carl Gauss, as in adducing the orbit of the asteroid Ceres, exemplify the conceptual approach required. 

17. It was also established by these Greek mathematicians and astronomers, long before the frauds of the hoaxster Claudius Ptolemy, that the Earth orbited the Sun. 

18. e.g., angular differences between successive plumb-bob lines of the series of sundials. 

19. e.g., in a first estimate, the radius is assumed to lie along the plumb-bob line: as it would, were the Earth a sphere, and could one assume that the gravitational "forces" to be considered were, for practical purposes, those assumed by Isaac Newton's crude notions. 
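To make that first estimate explicit (a standard reconstruction of the classical measurement, stated here only for illustration): if two sites, separated along a meridian by a known arc-distance s, show plumb-bob (or noon-shadow) directions differing by the angle d, then, on the spherical assumption just stated, the circumference is approximately s x (360 degrees/d), and the radius is that circumference divided by 2 pi. In the account traditionally attributed to Eratosthenes, d was about 7.2 degrees --one-fiftieth of a full circle-- over a distance of about 5,000 stadia, yielding a circumference of roughly 250,000 stadia.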

20. In the case that the curvature within a very small interval of continuing (but, not necessarily "continuous") action, is non-constant, we are approaching the transition from the curvature of conic sections into the domain of hypergeometric, modular cases of "compounded," non-constant curvatures. For a simple example, the product of a cycloid and a conic section. This is a crucial, relevant point, addressed below. 

21. In referring to "Wiener's hoax," we are not attacking his useful outline of principles of design of automatic control systems; his hoax was his act of sleight-of-hand, in claiming that all human knowledge could be reduced to the mechanistic terms of such automatic control systems. This was the same blunder made by Wiener's fellow-Russell acolyte, John von Neumann, both in advancing his 1938 claims to have discovered the secret of all economy in "systems analysis," and his later emulation of Wiener's "information theory" hoax, in defending the delusion of "artificial intelligence." 

22. As I have identified the definition of economic anti-entropy above. 

23. If one wished to insist upon the strictest term, the choice would be "Leibniz-LaRouche-Riemann Method." On the basis of internal features of his work, Riemann was as wholly indebted to an adolescent grounding in Leibniz as I was. It was that commonality of grounding which led us, along different tracks of investigation, to converging conclusions, respecting the notion of a physical geometry, as distinct from a merely formal one. 

24. Executive Intelligence Review, Oct. 11, 1996. Later republished in Fidelio, Winter 1996. 

25. Which has an important relationship to the work of Heraclitus, but no principled congruence with the so-called "dialectical method" of Immanuel Kant, G.W.F. Hegel, or Karl Marx. Kant and Hegel are followers of the anti-Plato, reductionist, Aristotelean dialectic, and Marx is in the same genre. 

26. The institution of the Papacy had been wrecked by the Fourteenth Century "New Dark Age" and its aftermath. Theologian Cusa, who had been a member of the so-called Conciliar movement, through his writing on the principles of the modern nation-state, Concordantia catholica, was self-persuaded by this very line of argument that the Christian Church must be reunited around a common principle represented by a single, common spokesman. This led to the reestablishment of the formerly shattered Catholic Church itself, through the initial successes of the Council of Florence. Cusa had aimed to bring the eastern and Latin rites together in reconciliation, around agreement to the so-called "Filioque" principle of the Augustinian reading of the Nicene Creed. Through his scholarly work in Byzantine centers, Cusa turned up Byzantine documents which proved to leaders of the eastern rite, that Byzantium, according to its own documents, had been in error in opposing the Augustinian doctrine. The result was the temporary reunification of the eastern and Latin rites effected during the 1439-1440 sessions of the great ecumenical Council of Florence. In this process, Cusa's work in Greek scientific manuscripts (many among which had been lost to the west since the 1250 death of the Hohenstaufen Emperor Frederick II), led to his formulation of the principles of modern experimental physical science. 

27. To relieve some readers of the mistaken apprehension that I have overlooked certain relevant mathematical matters: I have shown elsewhere, repeatedly, that the commonly taught (and credulously believed) dictum, that the discovery of the "transcendental" character of pi was due to the successive work of Leonhard Euler, Lambert, Hermite, and Lindemann, is a myth built upon a series of frauds, beginning with Leonhard Euler's defense of Dr. Samuel Clarke's argument on this account. 

28. In reality, it is easily shown, by references to complexities of compounded orbits, that the cycloid approximates, but is not actually representative of an isochronic principle. The actual isochronic curvature brings us immediately into the domain of the catenary. 

29. i.e., epistemology. 

30. Johannes Kepler: New Astronomy, William H. Donahue, trans. (Cambridge: Cambridge University Press, 1992). 

31. Cf. Lyndon H. LaRouche, Jr., The Science of Christian Economy (Washington, D.C.: Schiller Institute, 1991), pp. 374-377, 470-473. 

32. Hobbes was educated in mathematics by Galileo Galilei, the personal lackey of the Ockhamite nominalist Paolo Sarpi, the de facto post-1582 ruler of Venice. Sarpi, whose leading allies in England at that time featured the Cecil family, and, therefore, Francis Bacon, was the actual teacher whose notions of physics were faithfully copied and presented by Galileo. Hobbes, a very, very intimate associate of Francis Bacon, applied the mechanistic misconceptions of causality which he had learned from Galileo, to social processes. This produced the mechanistic, "statistical gas theory" view of social process made infamous by Hobbes' assertion of "each in war against all." The coupling of this mechanistic notion with the method which Descartes, another Sarpi network asset, employed in plagiarizing what is known as "Cartesian geometry," is the axiomatic basis upon which depend both the notion of linearization through infinite series later defended by Dr. Samuel Clarke and Leonhard Euler, and the introduction of the "limit theorem" by Augustin Cauchy, et al., in their fanatical attacks upon the work of Gottfried Leibniz. 

33. Although the 1876 U.S. was the most advanced, and most powerful nation-state economy of the world, in totality, and per capita of labor force, the most powerful political and financial agency of the world was the British Empire and its London-centered Anglo-Dutch international financial oligarchy. In per capita values, the United Kingdom of 1876 was vastly inferior to the U.S.A., to say nothing of the basis of London's power, in the misery imposed upon its imperial and other victims abroad. But for the treasonous elements, such as the House of Morgan and the August Belmont influence, serving as British agents inside the U.S.A., London could not have succeeded in creating, "George Soros" style, the financial crisis of 1873, nor in corrupting a sufficient number of members of the U.S. Congress to pass the Specie Resumption Act and related "British gold standard" measures which kept the U.S. in chronic financially-induced economic depression-cycles during the 1877-1907 interval. 

34. Under heavy pressure from the pro-"systems analysis" forces within the U.S. "establishment," the U.S. government introduced heavy cut-backs into the U.S. aerospace program, beginning 1966-1967. U.S. aerospace progress since 1967-1969, has been chiefly, overall, coasting downhill, presently nearing absolute bottom. 

35. Even in the educational programs which have gone from bad to worse in this century's evolution of U.S. secondary and higher education, a similar benefit may occur through the personal initiative of an egregious student who rejects the generally accepted classroom and textbook methods of those institutions, and prefers to work through rediscoveries of principle independently, by some approximation of the same Four-Step method. In such a case, as experience of education during the recent five decades typifies, the result is that most of the graduates will learn to sing for their supper (a paid career), not for the benefit of music (science); a dwindling handful will be committed to truthfulness in knowledge. 
 
 
