Joined
Mar 8, 2020
Messages
617
Reputation
252
Say what you want about his theories, but he is no layman when it comes to general thought
oh yeah he's definitely clever, but he hasn't actually done anything, and if he was as revolutionary as he claims he would've had some impact on the sciences and math that he supposedly spends his time on now.

look up ramanujan. this guy has no predictions or anything to back up being the pompous gifted guy with the answer to everything he claims to be
 

HannibaI

where is clariese
HBO Manlets
Joined
Mar 7, 2020
Messages
177
Reputation
44
oh yeah he's definitely clever, but he hasn't actually done anything, and if he was as revolutionary as he claims he would've had some impact on the sciences and math that he supposedly spends his time on now.

look up ramanujan. this guy has no predictions or anything to back up being the pompous gifted guy with the answer to everything he claims to be
I have to be blunt and admited i have not looked into alot of his work. But from what i have seen and how he talks is very impressive. For an example is a remark about gorillas being smarter then somalians and skull circumference correlating with intelligence. I think these 2 indicate he has many opinions that align with my own to some degree to be intresting with his status of smartest man in america.
 

GigaLarp

ioi king
Masculinity Crew
Joined
Jan 30, 2020
Messages
1,664
Reputation
980
I have to be blunt and admited i have not looked into alot of his work. But from what i have seen and how he talks is very impressive. For an example is a remark about gorillas being smarter then somalians and skull circumference correlating with intelligence. I think these 2 indicate he has many opinions that align with my own to some degree to be intresting with his status of smartest man in america.
:glasses: :clap:
 

missing_person

Incel
HQNP
Joined
Mar 14, 2020
Messages
112
Reputation
40
That obese IQ frauding pedophile faggot? Yea sounds about right
except I'm
- not obese
- not frauding my iq like that scamming retard who got laughed at by the entire academia for his ridiculous word salads
- the only person who was constantly debunking low iq pedos and their degeneracy here
- probably the only non-faggot here as well
 

Userr

14yr/rateme/john kramer/userr out
Autists
Joined
Feb 13, 2020
Messages
17,953
Reputation
6,208
except I'm
- not obese
- not frauding my iq like that scamming retard who got laughed at by the entire academia for his ridiculous word salads
- the only person who was constantly debunking low iq pedos and their degeneracy here
- probably the only non-faggot here as well
i made this thread and it's a joke. chris langan is a fucking gigachad with a 25.5 inch skull, he isn't "obese" and ugly like you try to paint him. He isn't frauding shit, it's confirmed he's the smartest man on earth.

 

missing_person

Incel
HQNP
Joined
Mar 14, 2020
Messages
112
Reputation
40
i made this thread and it's a joke. chris langan is a fucking gigachad with a 25.5 inch skull, he isn't "obese" and ugly like you try to paint him. He isn't frauding shit, it's confirmed he's the smartest man on earth.

i know it was you

he isn't shit, just a media stunt. probably couldn't even get a phd with all his "intelligence". his ramblings are nothing but tautological babbling every literate person can produce.

and who gives a fuck about his skull when he looks like shit, completely invisible to hot women
 

Userr

14yr/rateme/john kramer/userr out
Autists
Joined
Feb 13, 2020
Messages
17,953
Reputation
6,208
i know it was you

he isn't shit, just a media stunt. probably couldn't even get a phd with all his "intelligence". his ramblings are nothing but tautological babbling every literate person can produce.
stop trolling his theories are out there and abstract but in general facts and terms he uses many corredct things like skull cicumfrence corelated with iq. somalians being dumber then gorrilas. eugenics to make a ideal world
 

missing_person

Incel
HQNP
Joined
Mar 14, 2020
Messages
112
Reputation
40
every 85 iq sfcel usually comes to the same conclusions when he attempts being smart. enough said.

and cage at him lifting. proof he's no higher than 110.
 

Userr

14yr/rateme/john kramer/userr out
Autists
Joined
Feb 13, 2020
Messages
17,953
Reputation
6,208
ever 85 iq sfcel usually comes to the same conclusions when he attempts being smart. enough said.

and cage at him lifting. proof he's no higher than 110.
its beyons sfcel and racial bias
 

missing_person

Incel
HQNP
Joined
Mar 14, 2020
Messages
112
Reputation
40
Set theory

See also: Von Neumann–Bernays–Gödel set theory



[Diagram: History of approaches that led to NBG set theory]

The axiomatization of mathematics, on the model of Euclid's Elements, had reached new levels of rigour and breadth at the end of the 19th century, particularly in arithmetic, thanks to the axiom schema of Richard Dedekind and Charles Sanders Peirce, and in geometry, thanks to Hilbert's axioms.[60] But at the beginning of the 20th century, efforts to base mathematics on naive set theory suffered a setback due to Russell's paradox (on the set of all sets that do not belong to themselves).[61] The problem of an adequate axiomatization of set theory was resolved implicitly about twenty years later by Ernst Zermelo and Abraham Fraenkel. Zermelo–Fraenkel set theory provided a series of principles that allowed for the construction of the sets used in the everyday practice of mathematics, but they did not explicitly exclude the possibility of the existence of a set that belongs to itself. In his doctoral thesis of 1925, von Neumann demonstrated two techniques to exclude such sets—the axiom of foundation and the notion of class.[60]

The axiom of foundation proposed that every set can be constructed from the bottom up in an ordered succession of steps by way of the principles of Zermelo and Fraenkel. If one set belongs to another then the first must necessarily come before the second in the succession. This excludes the possibility of a set belonging to itself. To demonstrate that the addition of this new axiom to the others did not produce contradictions, von Neumann introduced a method of demonstration, called the method of inner models, which later became an essential instrument in set theory.[60]

The second approach to the problem of sets belonging to themselves took as its base the notion of class, and defines a set as a class which belongs to other classes, while a proper class is defined as a class which does not belong to other classes. Under the Zermelo–Fraenkel approach, the axioms impede the construction of a set of all sets which do not belong to themselves. In contrast, under the von Neumann approach, the class of all sets which do not belong to themselves can be constructed, but it is a proper class and not a set.[60]
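
A minimal sketch of how the class/set distinction defuses Russell's paradox (standard NBG-style reasoning, included here for concreteness rather than quoted from von Neumann's thesis): consider the Russell class

R = { x : x is a set and x ∉ x }.

If R were a set, then R ∈ R ⟺ R ∉ R, a contradiction; in the class approach, R still exists, but the contradiction now only shows that R is a proper class, i.e., that R is not a member of any class, and no paradox arises.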

With this contribution of von Neumann, the axiomatic system of the theory of sets avoided the contradictions of earlier systems, and became usable as a foundation for mathematics, despite the lack of a proof of its consistency. The next question was whether it provided definitive answers to all mathematical questions that could be posed in it, or whether it might be improved by adding stronger axioms that could be used to prove a broader class of theorems. A strongly negative answer to whether it was definitive arrived in September 1930 at the historic mathematical Congress of Königsberg, in which Kurt Gödel announced his first theorem of incompleteness: the usual axiomatic systems are incomplete, in the sense that they cannot prove every truth which is expressible in their language. Moreover, every consistent extension of these systems would necessarily remain incomplete.[62]

Less than a month later, von Neumann, who had participated at the Congress, communicated to Gödel an interesting consequence of his theorem: that the usual axiomatic systems are unable to demonstrate their own consistency.[62] However, Gödel had already discovered this consequence, now known as his second incompleteness theorem, and he sent von Neumann a preprint of his article containing both incompleteness theorems.[63] Von Neumann acknowledged Gödel's priority in his next letter.[64] He never thought much of "the American system of claiming personal priority for everything."[65]

Von Neumann paradox

Main article: Von Neumann paradox

Building on the work of Felix Hausdorff, in 1924 Stefan Banach and Alfred Tarski proved that given a solid ball in 3‑dimensional space, there exists a decomposition of the ball into a finite number of disjoint subsets, that can be reassembled together in a different way to yield two identical copies of the original ball. Banach and Tarski proved that, using isometric transformations, the result of taking apart and reassembling a two-dimensional figure would necessarily have the same area as the original. This would make creating two unit squares out of one impossible. However, in a 1929 paper,[66] von Neumann proved that paradoxical decompositions could use a group of transformations that include as a subgroup a free group with two generators. The group of area-preserving transformations contains such subgroups, and this opens the possibility of performing paradoxical decompositions using these subgroups. The class of groups isolated by von Neumann in his work on Banach–Tarski decompositions subsequently was very important for many areas of mathematics, including von Neumann's own later work in measure theory (see below).
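
A concrete example of the kind of subgroup involved (a standard illustration, not taken from von Neumann's 1929 paper): by Sanov's theorem, the matrices

a = (1 2; 0 1)  and  b = (1 0; 2 1)   (rows separated by semicolons)

generate a free group of rank two inside SL(2, ℤ), and every element of SL(2, ℤ) acts on the plane as an area-preserving linear map. This is the sense in which the group of area-preserving transformations is "large enough" to contain the free subgroups that drive paradoxical decompositions, whereas the group of planar isometries is not.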

Ergodic theory

In a series of papers published in 1932, von Neumann made foundational contributions to ergodic theory, a branch of mathematics that involves the states of dynamical systems with an invariant measure.[67] Of the 1932 papers on ergodic theory, Paul Halmos writes that even "if von Neumann had never done anything else, they would have been sufficient to guarantee him mathematical immortality".[68] By then von Neumann had already written his articles on operator theory, and the application of this work was instrumental in the von Neumann mean ergodic theorem.[68]
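
For reference, the mean ergodic theorem mentioned here is usually stated as follows (standard modern formulation, not a quotation): for a unitary operator U on a Hilbert space H, with P the orthogonal projection onto the subspace of vectors fixed by U,

(1/N) Σ_{n=0}^{N−1} Uⁿx → Px  as N → ∞, for every x in H,

with convergence in the norm of H; time averages of the iterates converge to the projection onto the invariant vectors.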

Operator theory

Main article: Von Neumann algebra
See also: Direct integral

Von Neumann introduced the study of rings of operators, through the von Neumann algebras. A von Neumann algebra is a *-algebra of bounded operators on a Hilbert space that is closed in the weak operator topology and contains the identity operator.[69] The von Neumann bicommutant theorem shows that the analytic definition is equivalent to a purely algebraic definition as being equal to the bicommutant.[70] Von Neumann embarked in 1936, with the partial collaboration of F.J. Murray, on the general study of the classification of factors of von Neumann algebras. The six major papers in which he developed that theory between 1936 and 1940 "rank among the masterpieces of analysis in the twentieth century".[3] The direct integral was later introduced in 1949 by von Neumann.[71]
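
In symbols, the bicommutant theorem says (standard formulation, included for orientation): if M is a *-subalgebra of B(H) containing the identity operator and

M′ = { T ∈ B(H) : TS = ST for all S ∈ M }

denotes its commutant, then the closure of M in the weak operator topology equals the bicommutant M″ = (M′)′; in particular, M is a von Neumann algebra exactly when M = M″.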

Measure theory

See also: Lifting theory

In measure theory, the "problem of measure" for an n-dimensional Euclidean space Rn may be stated as: "does there exist a positive, normalized, invariant, and additive set function on the class of all subsets of Rn?"[68] The work of Felix Hausdorff and Stefan Banach had implied that the problem of measure has a positive solution if n = 1 or n = 2 and a negative solution (because of the Banach–Tarski paradox) in all other cases. Von Neumann's work argued that the "problem is essentially group-theoretic in character":[68] the existence of a measure could be determined by looking at the properties of the transformation group of the given space. The positive solution for spaces of dimension at most two, and the negative solution for higher dimensions, come from the fact that the Euclidean group is a solvable group for dimension at most two, and is not solvable for higher dimensions. "Thus, according to von Neumann, it is the change of group that makes a difference, not the change of space."[68]
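
Written out (a standard rendering of the "problem of measure" above, not a quotation), the question is whether there is a finitely additive set function μ defined on all subsets of ℝⁿ with

μ(A) ≥ 0 for every A ⊆ ℝⁿ,  μ([0,1]ⁿ) = 1,  μ(gA) = μ(A) for every isometry g,  μ(A ∪ B) = μ(A) + μ(B) for disjoint A, B.

Von Neumann's observation is that the answer depends only on the isometry group: for n ≤ 2 it is solvable (hence amenable), which allows such an invariant, finitely additive μ to exist, while for n ≥ 3 it contains a free subgroup of rank two, which rules such a μ out.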

In a number of von Neumann's papers, the methods of argument he employed are considered even more significant than the results. In anticipation of his later study of dimension theory in algebras of operators, von Neumann used results on equivalence by finite decomposition, and reformulated the problem of measure in terms of functions.[72] In his 1936 paper on analytic measure theory, he used the Haar theorem in the solution of Hilbert's fifth problem in the case of compact groups.[68][73] In 1938, he was awarded the Bôcher Memorial Prize for his work in analysis.[74]
 

missing_person

Incel
HQNP
Joined
Mar 14, 2020
Messages
112
Reputation
40
Geometry

Von Neumann founded the field of continuous geometry.[75] It followed his path-breaking work on rings of operators. In mathematics, continuous geometry is a substitute for complex projective geometry, where instead of the dimension of a subspace being in a discrete set 0, 1, ..., n, it can be an element of the unit interval [0,1]. Earlier, Menger and Birkhoff had axiomatized complex projective geometry in terms of the properties of its lattice of linear subspaces. Von Neumann, following his work on rings of operators, weakened those axioms to describe a broader class of lattices, the continuous geometries. While the dimensions of the subspaces of projective geometries are a discrete set (the non-negative integers), the dimensions of the elements of a continuous geometry can range continuously across the unit interval [0,1]. Von Neumann was motivated by his discovery of von Neumann algebras with a dimension function taking a continuous range of dimensions, and the first example of a continuous geometry other than projective space was the projections of the hyperfinite type II factor.[76][77]

Lattice theory

Between 1937 and 1939, von Neumann worked on lattice theory, the theory of partially ordered sets in which every two elements have a greatest lower bound and a least upper bound. Garrett Birkhoff writes: "John von Neumann's brilliant mind blazed over lattice theory like a meteor".[78]

Von Neumann provided an abstract exploration of dimension in completed complemented modular topological lattices (properties that arise in the lattices of subspaces of inner product spaces): "Dimension is determined, up to a positive linear transformation, by the following two properties. It is conserved by perspective mappings ("perspectivities") and ordered by inclusion. The deepest part of the proof concerns the equivalence of perspectivity with "projectivity by decomposition"—of which a corollary is the transitivity of perspectivity."[78]

Additionally, "[i]n the general case, von Neumann proved the following basic representation theorem. Any complemented modular lattice L having a "basis" of n ≥ 4 pairwise perspective elements, is isomorphic with the lattice ℛ(R) of all principal right-ideals of a suitable regular ring R. This conclusion is the culmination of 140 pages of brilliant and incisive algebra involving entirely novel axioms. Anyone wishing to get an unforgettable impression of the razor edge of von Neumann's mind, need merely try to pursue this chain of exact reasoning for himself—realizing that often five pages of it were written down before breakfast, seated at a living room writing-table in a bathrobe."[78]

Mathematical formulation of quantum mechanics

See also: von Neumann entropy, Quantum mutual information, Measurement in quantum mechanics § von Neumann measurement scheme, and von Neumann measurement scheme


Von Neumann was the first to establish a rigorous mathematical framework for quantum mechanics, known as the Dirac–von Neumann axioms, with his 1932 work Mathematical Foundations of Quantum Mechanics.[72] After having completed the axiomatization of set theory, he began to confront the axiomatization of quantum mechanics. He realized, in 1926, that a state of a quantum system could be represented by a point in a (complex) Hilbert space that, in general, could be infinite-dimensional even for a single particle. In this formalism of quantum mechanics, observable quantities such as position or momentum are represented as linear operators acting on the Hilbert space associated with the quantum system.[79]

The physics of quantum mechanics was thereby reduced to the mathematics of Hilbert spaces and linear operators acting on them. For example, the uncertainty principle, according to which the determination of the position of a particle prevents the determination of its momentum and vice versa, is translated into the non-commutativity of the two corresponding operators. This new mathematical formulation included as special cases the formulations of both Heisenberg and Schrödinger.[79] When Heisenberg was informed von Neumann had clarified the difference between an unbounded operator that was a self-adjoint operator and one that was merely symmetric, Heisenberg replied "Eh? What is the difference?"[80]
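
In this operator language the standard textbook example is position and momentum (included here for concreteness; the notation is modern, not von Neumann's):

x̂p̂ − p̂x̂ = iħ,  and consequently  σ_x · σ_p ≥ ħ/2,

so the impossibility of jointly sharp values is carried by the algebra of the operators themselves rather than added as a separate postulate.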

Von Neumann's abstract treatment permitted him also to confront the foundational issue of determinism versus non-determinism, and in the book he presented a proof that the statistical results of quantum mechanics could not possibly be averages of an underlying set of determined "hidden variables," as in classical statistical mechanics. In 1935, Grete Hermann published a paper arguing that the proof contained a conceptual error and was therefore invalid.[81] Hermann's work was largely ignored until after John S. Bell made essentially the same argument in 1966.[82] However, in 2010, Jeffrey Bub argued that Bell had misconstrued von Neumann's proof, and pointed out that the proof, though not valid for all hidden variable theories, does rule out a well-defined and important subset. Bub also suggests that von Neumann was aware of this limitation, and that von Neumann did not claim that his proof completely ruled out hidden variable theories.[83] The validity of Bub's argument is, in turn, disputed.[84] In any case, Gleason's Theorem of 1957 fills the gaps in von Neumann's approach.

Von Neumann's proof inaugurated a line of research that ultimately led, through the work of Bell in 1964 on Bell's theorem, and the experiments of Alain Aspect in 1982, to the demonstration that quantum physics either requires a notion of reality substantially different from that of classical physics, or must include nonlocality in apparent violation of special relativity.[85]

In a chapter of The Mathematical Foundations of Quantum Mechanics, von Neumann deeply analyzed the so-called measurement problem. He concluded that the entire physical universe could be made subject to the universal wave function. Since something "outside the calculation" was needed to collapse the wave function, von Neumann concluded that the collapse was caused by the consciousness of the experimenter. Von Neumann argued that the mathematics of quantum mechanics allows the collapse of the wave function to be placed at any position in the causal chain from the measurement device to the "subjective consciousness" of the human observer. Although this view was accepted by Eugene Wigner,[86] the Von Neumann–Wigner interpretation never gained acceptance amongst the majority of physicists.[87] The Von Neumann–Wigner interpretation has been summarized as follows:[88]


The rules of quantum mechanics are correct but there is only one system which may be treated with quantum mechanics, namely the entire material world. There exist external observers which cannot be treated within quantum mechanics, namely human (and perhaps animal) minds, which perform measurements on the brain causing wave function collapse.[88]
Though theories of quantum mechanics continue to evolve to this day, there is a basic framework for the mathematical formalism of problems in quantum mechanics which underlies the majority of approaches and can be traced back to the mathematical formalisms and techniques first used by von Neumann. In other words, discussions about interpretation of the theory, and extensions to it, are now mostly conducted on the basis of shared assumptions about the mathematical foundations.[72]

Von Neumann entropy

Main article: Von Neumann entropy

Von Neumann entropy is extensively used in different forms (conditional entropies, relative entropies, etc.) in the framework of quantum information theory.[89] Entanglement measures are based upon some quantity directly related to the von Neumann entropy. Given a statistical ensemble of quantum mechanical systems with the density matrix ρ, it is given by

S(ρ) = −Tr(ρ ln ρ).

Many of the same entropy measures in classical information theory can also be generalized to the quantum case, such as Holevo entropy and the conditional quantum entropy.
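
As a concrete illustration of the formula S(ρ) = −Tr(ρ ln ρ), here is a minimal sketch in Python with NumPy that evaluates it via the eigenvalues of ρ (the function name and the example states are ours, chosen for illustration):

import numpy as np

def von_neumann_entropy(rho):
    """Return S(rho) = -Tr(rho ln rho) for a density matrix rho (natural log)."""
    # A density matrix is Hermitian with non-negative eigenvalues summing to 1,
    # so the entropy is the Shannon entropy of its eigenvalue distribution.
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop (numerically) zero eigenvalues; 0 ln 0 := 0
    return float(-np.sum(evals * np.log(evals)))

# Maximally mixed qubit: entropy ln 2 ~ 0.693; pure state |0><0|: entropy 0.
print(von_neumann_entropy(np.eye(2) / 2))
print(von_neumann_entropy(np.diag([1.0, 0.0])))

Computed this way, the quantity reduces to the Shannon entropy of the eigenvalues of ρ, which is also why the classical entropy appears as a special case.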

Quantum mutual information

Quantum information theory is largely concerned with the interpretation and uses of von Neumann entropy. The von Neumann entropy is the cornerstone in the development of quantum information theory, while the Shannon entropy applies to classical information theory. This is considered a historical anomaly: given how central Shannon entropy is to classical information theory, it might have been expected to be discovered before von Neumann entropy, but the reverse occurred. Von Neumann first discovered von Neumann entropy and applied it to questions of statistical physics. Decades later, Shannon developed an information-theoretic formula for use in classical information theory and asked von Neumann what to call it; von Neumann told him to call it Shannon entropy, as it was a special case of von Neumann entropy.[90]

Density matrix

Main article: Density matrix

The formalism of density operators and matrices was introduced by von Neumann[91] in 1927 and independently, but less systematically by Lev Landau[92] and Felix Bloch[93] in 1927 and 1946 respectively. The density matrix is an alternative way in which to represent the state of a quantum system, which could otherwise be represented using the wavefunction. The density matrix allows the solution of certain time-dependent problems in quantum mechanics.
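
In modern notation (for concreteness; not a quotation from the 1927 paper), a mixture of states |ψ_i⟩ prepared with probabilities p_i is described by

ρ = Σ_i p_i |ψ_i⟩⟨ψ_i|,  with Tr ρ = 1  and  ⟨A⟩ = Tr(ρA)

for any observable A, and the time-dependent problems mentioned above are handled through the von Neumann evolution equation iħ dρ/dt = [Ĥ, ρ].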

Von Neumann measurement scheme

The von Neumann measurement scheme, the ancestor of quantum decoherence theory, represents measurements projectively by taking into account the measuring apparatus, which is also treated as a quantum object. This 'projective measurement' scheme introduced by von Neumann led to the development of quantum decoherence theories.[citation needed]
 

missing_person

Incel
HQNP
Joined
Mar 14, 2020
Messages
112
Reputation
40
Quantum logic

Main article: Quantum logic

Von Neumann first proposed a quantum logic in his 1932 treatise Mathematical Foundations of Quantum Mechanics, where he noted that projections on a Hilbert space can be viewed as propositions about physical observables. The field of quantum logic was subsequently inaugurated in a famous 1936 paper by von Neumann and Garrett Birkhoff, the first work ever to introduce quantum logics,[94] wherein von Neumann and Birkhoff first proved that quantum mechanics requires a propositional calculus substantially different from all classical logics and rigorously isolated a new algebraic structure for quantum logics. The concept of creating a propositional calculus for quantum logic was first outlined in a short section in von Neumann's 1932 work, but in 1936, the need for the new propositional calculus was demonstrated through several proofs. For example, photons cannot pass through two successive filters that are polarized perpendicularly (e.g., one horizontally and the other vertically), and therefore, a fortiori, they cannot pass if a third filter polarized diagonally is added to the other two, either before or after them in the succession; but if the third filter is added in between the other two, the photons will, indeed, pass through. This experimental fact is translatable into logic as the non-commutativity of conjunction, (A ∧ B) ≠ (B ∧ A). It was also demonstrated that the laws of distribution of classical logic, P ∨ (Q ∧ R) = (P ∨ Q) ∧ (P ∨ R) and P ∧ (Q ∨ R) = (P ∧ Q) ∨ (P ∧ R), are not valid for quantum theory.[95]

The reason for this is that a quantum disjunction, unlike the case for classical disjunction, can be true even when both of the disjuncts are false, and this is, in turn, attributable to the fact that it is frequently the case in quantum mechanics that a pair of alternatives is semantically determinate, while each of its members is necessarily indeterminate. This latter property can be illustrated by a simple example. Suppose we are dealing with particles (such as electrons) of semi-integral spin (spin angular momentum) for which there are only two possible values: positive or negative. Then, a principle of indetermination establishes that the spin, relative to two different directions (e.g., x and y), results in a pair of incompatible quantities. Suppose that the state ɸ of a certain electron verifies the proposition "the spin of the electron in the x direction is positive." By the principle of indeterminacy, the value of the spin in the direction y will be completely indeterminate for ɸ. Hence, ɸ can verify neither the proposition "the spin in the direction of y is positive" nor the proposition "the spin in the direction of y is negative." Nevertheless, the disjunction of the propositions "the spin in the direction of y is positive or the spin in the direction of y is negative" must be true for ɸ. In the case of distribution, it is therefore possible to have a situation in which A ∧ (B ∨ C) = A ∧ 1 = A, while (A ∧ B) ∨ (A ∧ C) = 0 ∨ 0 = 0.[95]
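
A minimal finite-dimensional illustration of the same failure (our example, phrased in the lattice of subspaces of ℂ²): take the one-dimensional subspaces

A = span{(1, 0)},  B = span{(1, 1)},  C = span{(1, −1)}.

Then B ∨ C is all of ℂ², so A ∧ (B ∨ C) = A, while A ∧ B = A ∧ C = {0}, hence (A ∧ B) ∨ (A ∧ C) = {0}, exactly the pattern described above.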

As Hilary Putnam writes, von Neumann replaced classical logic with a logic constructed in orthomodular lattices (isomorphic to the lattice of subspaces of the Hilbert space of a given physical system).[96]

Game theory

Von Neumann founded the field of game theory as a mathematical discipline.[97] Von Neumann proved his minimax theorem in 1928. This theorem establishes that in zero-sum games with perfect information (i.e. in which players know at each time all moves that have taken place so far), there exists a pair of strategies for both players that allows each to minimize his maximum losses, hence the name minimax. When examining every possible strategy, a player must consider all the possible responses of his adversary. The player then plays out the strategy that will result in the minimization of his maximum loss.[98]
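
In modern notation the theorem reads (standard statement for two-player zero-sum matrix games in mixed strategies, given here for orientation):

max_{x ∈ Δ_m} min_{y ∈ Δ_n} xᵀAy = min_{y ∈ Δ_n} max_{x ∈ Δ_m} xᵀAy,

where A is the payoff matrix and Δ_m, Δ_n are the sets of mixed strategies (probability vectors) of the two players; the common value is the value of the game.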

Such strategies, which minimize the maximum loss for each player, are called optimal. Von Neumann showed that their minimaxes are equal (in absolute value) and contrary (in sign). Von Neumann improved and extended the minimax theorem to include games involving imperfect information and games with more than two players, publishing this result in his 1944 Theory of Games and Economic Behavior (written with Oskar Morgenstern). Morgenstern wrote a paper on game theory and thought he would show it to von Neumann because of his interest in the subject. He read it and said to Morgenstern that he should put more in it. This was repeated a couple of times, and then von Neumann became a coauthor and the paper became 100 pages long. Then it became a book. The public interest in this work was such that The New York Times ran a front-page story.[citation needed] In this book, von Neumann declared that economic theory needed to use functional analytic methods, especially convex sets and topological fixed-point theorem, rather than the traditional differential calculus, because the maximum-operator did not preserve differentiable functions.[97]

Independently, Leonid Kantorovich's functional analytic work on mathematical economics also focused attention on optimization theory, non-differentiability, and vector lattices. Von Neumann's functional-analytic techniques—the use of duality pairings of real vector spaces to represent prices and quantities, the use of supporting and separating hyperplanes and convex sets, and fixed-point theory—have been the primary tools of mathematical economics ever since.[99]

Mathematical economics

Von Neumann raised the intellectual and mathematical level of economics in several influential publications. For his model of an expanding economy, von Neumann proved the existence and uniqueness of an equilibrium using his generalization of the Brouwer fixed-point theorem.[97] Von Neumann's model of an expanding economy considered the matrix pencil A − λB with nonnegative matrices A and B; von Neumann sought probability vectors p and q and a positive number λ that would solve the complementarity equation

pᵀ(A − λB)q = 0

along with two inequality systems expressing economic efficiency. In this model, the (transposed) probability vector p represents the prices of the goods while the probability vector q represents the "intensity" at which the production process would run. The unique solution λ represents the growth factor which is 1 plus the rate of growth of the economy; the rate of growth equals the interest rate.[100][101]

Von Neumann's results have been viewed as a special case of linear programming, where von Neumann's model uses only nonnegative matrices. The study of von Neumann's model of an expanding economy continues to interest mathematical economists with interests in computational economics.[102][103][104] This paper has been called the greatest paper in mathematical economics by several authors, who recognized its introduction of fixed-point theorems, linear inequalities, complementary slackness, and saddlepoint duality. In the proceedings of a conference on von Neumann's growth model, Paul Samuelson said that many mathematicians had developed methods useful to economists, but that von Neumann was unique in having made significant contributions to economic theory itself.[105]

Von Neumann's famous 9-page paper started life as a talk at Princeton and then became a paper in German, which was eventually translated into English. His interest in economics that led to that paper began as follows: When lecturing at Berlin in 1928 and 1929 he spent his summers back home in Budapest, and so did the economist Nicholas Kaldor, and they hit it off. Kaldor recommended that von Neumann read a book by the mathematical economist Léon Walras. Von Neumann found some faults in that book and corrected them, for example, replacing equations by inequalities. He noticed that Walras' General Equilibrium Theory and Walras' Law, which led to systems of simultaneous linear equations, could produce the absurd result that the profit could be maximized by producing and selling a negative quantity of a product. He replaced the equations by inequalities, introduced dynamic equilibria, among other things, and eventually produced the paper.[106]

Linear programming

Building on his results on matrix games and on his model of an expanding economy, von Neumann invented the theory of duality in linear programming when George Dantzig described his work in a few minutes, and an impatient von Neumann asked him to get to the point. Then, Dantzig listened dumbfounded while von Neumann provided an hour lecture on convex sets, fixed-point theory, and duality, conjecturing the equivalence between matrix games and linear programming.[107]
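
The duality conjectured in that conversation is, in modern textbook form (standard linear programming duality, stated here for orientation rather than as von Neumann's original formulation):

(P)  maximize cᵀx subject to Ax ≤ b, x ≥ 0        (D)  minimize bᵀy subject to Aᵀy ≥ c, y ≥ 0,

and whenever either problem has a finite optimum, both do and the optimal values coincide (strong duality); an optimal mixed strategy of a matrix game can be read off from such a primal–dual pair.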

Later, von Neumann suggested a new method of linear programming, using the homogeneous linear system of Paul Gordan (1873), which was later popularized by Karmarkar's algorithm. Von Neumann's method used a pivoting algorithm between simplices, with the pivoting decision determined by a nonnegative least squares subproblem with a convexity constraint (projecting the zero-vector onto the convex hull of the active simplex). Von Neumann's algorithm was the first interior point method of linear programming.[107]

Mathematical statistics

Von Neumann made fundamental contributions to mathematical statistics. In 1941, he derived the exact distribution of the ratio of the mean square of successive differences to the sample variance for independent and identically normally distributed variables.[108] This ratio was applied to the residuals from regression models and is commonly known as the Durbin–Watson statistic[109] for testing the null hypothesis that the errors are serially independent against the alternative that they follow a stationary first order autoregression.[109]
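
For residuals e₁, ..., e_T from a fitted regression, the statistic usually written down is (the standard form of the Durbin–Watson statistic, which is closely related to von Neumann's ratio rather than identical to it):

d = Σ_{t=2}^{T} (e_t − e_{t−1})² / Σ_{t=1}^{T} e_t²,

which is near 2 when the errors are serially uncorrelated and approaches 0 under strong positive first-order autocorrelation.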

Subsequently, Denis Sargan and Alok Bhargava extended the results for testing if the errors on a regression model follow a Gaussian random walk (i.e., possess a unit root) against the alternative that they are a stationary first order autoregression.[110]
 

Userr

14yr/rateme/john kramer/userr out
Autists
Joined
Feb 13, 2020
Messages
17,953
Reputation
6,208
want to know what actual geniuses (not random hillbillies) are concerned with?
stop trolling for the 100th time, he obviously is extremely high iq if you watched him do something like a gameshow where he managed to work a guessing game with strategy so strong he beat it with no more general knowledge than a normal person.

He has a lot of good ideas and discussions
 