The Method of Exterior Forms in Linear Programming

The method of exterior forms of H. Grassmann and E. Cartan is applied to solving the linear programming problem. It captures the essence of the problem in a convenient, compact form. The solution is expressed by Cramer-like rules and reduces to computing the set of values of the objective function at the vertices of the polyhedron of constraints, without any explicit calculation of the vertices themselves.

Bulletin of Mathematical Sciences and Applications, ISSN 2278-9634, Vol. 14, pp. 7-12. Submitted: 2016-01-12. Accepted: 2016-02-10. Online: 2016-02-15. doi:10.18052/www.scipress.com/BMSA.14.7. © 2016 SciPress Ltd., Switzerland. SciPress applies the CC-BY 4.0 license to works we publish: https://creativecommons.org/licenses/by/4.0/


Introduction
The linear programming problem is a rare example of a completely decidable problem, in which either the existence or the absence of a solution is established in a finite number of steps. This is due to the fact that the number of points suspected of being extrema is finite and effectively determinable. It is well known that these points are the vertices of the convex polyhedron over which the linear function is optimized [1,2,3].
A direct search of all vertices is a valid procedure, which gives detailed information about all the corner points and the values of the function being optimized. However, in large-scale problems with a sufficiently large number of constraints the quantity of corner points can be overwhelming. The famous G. B. Dantzig simplex method [1] reduces the problem to a step-by-step optimization that strictly drives the value of the function toward the extremum, and therefore does not require checking all vertices of the polyhedron.
The proposed method of exterior forms, due to the genius of H. Grassmann and E. Cartan [4,5], reduces geometry to multilinear algebra and makes it possible to write explicit formulas, similar to Cramer's rules for the solution of linear equations, that calculate the values of the objective function at the vertices of the polyhedron through determinants. Although it does not reduce the amount of calculation much in comparison with a full search (approximately by one order of magnitude), the simplicity, clarity, and easy programmability of the algorithm, as well as the ability to see the values of the objective function at all corner points, can be an advantage in applications. Together with the value of the objective function, the set of free variables for each vertex is determined automatically. The algorithm stops by choosing the extreme valid value of the objective function, that is, a value at which all coordinates of the corresponding corner point are nonnegative. It is assumed that the problem is presented in a special canonical form.

Linear Dependency and Exterior Product
In this section we recall the properties of the exterior product of vectors [4,5].
Definition 1. Let $V$ be a real $n$-dimensional linear space. The exterior algebra $\Lambda(V)$ over the space $V$ is a free graded $\mathbb{R}$-algebra with multiplication $\wedge : \Lambda(V) \times \Lambda(V) \to \Lambda(V)$, such that:
• $\Lambda(V)$ is a direct sum of linear spaces $\Lambda(V) = \oplus_{i=0}^{n} \Lambda^i(V)$, where $\Lambda^0(V) = \mathbb{R}$ and $\Lambda^1(V) = V$;
• $\Lambda^i(V) = V \wedge V \wedge \dots \wedge V$ ($i$ times), $1 < i \le n$, is the $i$-th exterior power of the space $V$, that is, the linear space generated by all the products $\alpha_1 \wedge \dots \wedge \alpha_i$, where $\alpha_k \in V$, $1 \le k \le i$, subject to constraints of associativity and bilinearity of the product $\wedge$;
• for any two subspaces $\Lambda^i(V)$, $\Lambda^j(V)$, $0 \le i, j \le n$, the value of the product $(\alpha_i, \alpha_j) \mapsto \alpha_i \wedge \alpha_j$ belongs to the subspace $\Lambda^{i+j}(V)$;
• the multiplication $\wedge$ is anticommutative, that is, for any two elements $\alpha_i \in \Lambda^i(V)$, $\alpha_j \in \Lambda^j(V)$ the following holds: $\alpha_i \wedge \alpha_j = (-1)^{i \cdot j}\, \alpha_j \wedge \alpha_i$.
The multiplication $\wedge$ is called exterior. The dimension of the space $\Lambda^i(V)$, $0 \le i \le n$, is equal to the number of basis vectors $e_{j_1} \wedge e_{j_2} \wedge \dots \wedge e_{j_i}$, $1 \le j_1 < j_2 < \dots < j_i \le n$, where $\{e_1, \dots, e_n\}$ is a basis of the space $V$; that is, $\dim(\Lambda^i(V)) = C_n^i = \frac{n!}{i!(n-i)!}$. Accordingly, the dimension of the exterior algebra $\Lambda(V)$ is $2^n$. Exterior multiplication allows the fact of linear dependence of vectors to be expressed in the form of an equality.
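The dimension count $\dim(\Lambda^i(V)) = C_n^i$, summing to $2^n$, can be checked numerically. A minimal sketch (the helper name is ours):

```python
# dim Lambda^i(V) = C(n, i) for an n-dimensional space V;
# the dimensions of all exterior powers sum to dim Lambda(V) = 2^n.
from math import comb

def exterior_power_dims(n: int) -> list[int]:
    """Return [dim Lambda^0(V), ..., dim Lambda^n(V)]."""
    return [comb(n, i) for i in range(n + 1)]

dims = exterior_power_dims(4)
print(dims, sum(dims))   # [1, 4, 6, 4, 1] 16
```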
Lemma 1. Let $\{\alpha_1, \dots, \alpha_m\}$ be a finite non-empty set of linearly independent vectors of the space $V$, $\dim(V) = n$, $m \le n$. A vector $\beta \in V$ is linearly dependent on $\alpha_1, \dots, \alpha_m$ if and only if $\beta \wedge \alpha_1 \wedge \dots \wedge \alpha_m = 0$.
Proof. If $\beta = \sum_{i=1}^{m} c_i \alpha_i$, then $\beta \wedge \alpha_1 \wedge \dots \wedge \alpha_m = \sum_{i=1}^{m} c_i\, \alpha_i \wedge \alpha_1 \wedge \dots \wedge \alpha_m = 0$. Conversely, suppose that $\beta \wedge \alpha_1 \wedge \dots \wedge \alpha_m = 0$. Add to the set $\{\alpha_1, \dots, \alpha_m\}$ new vectors $\alpha_{m+1}, \dots, \alpha_n$ to complete it to a basis of the vector space $V$. Then $\beta = \sum_{i=1}^{n} c_i \alpha_i$, and the equation $\sum_{i=m+1}^{n} c_i\, \alpha_i \wedge \alpha_1 \wedge \dots \wedge \alpha_m = 0$ implies $c_{m+1} = \dots = c_n = 0$.
Proposition 1. Suppose that $\{e_1, \dots, e_n\}$ is a basis of the linear space $V$. The exterior product of the vectors $\alpha_i = a_i^j e_j$, $i = 1, \dots, m$, $m \le n$, is equal to zero, $\alpha_1 \wedge \dots \wedge \alpha_m = 0$, if and only if all the $m \times m$ determinants $\Delta_{k_1 \dots k_m}$ of the matrix $(a_i^j)$ vanish. The proof follows from the expansion of the exterior product of the vectors through the basis ones; the coefficients of this expansion are the required determinants.
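The coordinate form of this criterion, linear dependence of $m$ vectors in $\mathbb{R}^n$ is equivalent to the vanishing of all maximal $m \times m$ minors, can be tested directly. A stdlib-only sketch (function names are ours; the Leibniz-formula determinant is adequate for the small matrices involved):

```python
# Vectors a_1,...,a_m in R^n are linearly dependent iff every
# m x m minor of the m x n matrix of their coordinates is zero.
from fractions import Fraction
from itertools import combinations, permutations

def det(matrix):
    """Exact determinant by the Leibniz formula (fine for small sizes)."""
    n = len(matrix)
    total = Fraction(0)
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):            # count inversions -> permutation sign
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        prod = Fraction(1)
        for i in range(n):
            prod *= Fraction(matrix[i][perm[i]])
        total += sign * prod
    return total

def all_minors_vanish(rows):
    """True iff every maximal m x m minor of the m x n matrix is zero."""
    m, n = len(rows), len(rows[0])
    return all(
        det([[rows[i][k] for k in cols] for i in range(m)]) == 0
        for cols in combinations(range(n), m)
    )

a1, a2 = [1, 0, 0, 2], [0, 1, 0, 3]
beta = [1, 1, 0, 5]                       # beta = a1 + a2: dependent
print(all_minors_vanish([a1, a2, beta]))          # True
print(all_minors_vanish([a1, a2, [0, 0, 1, 0]]))  # False
```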
The number of the determinants $\Delta_{k_1 \dots k_m}$, $1 \le k_1 < \dots < k_m \le n$, is $C_n^m$.

The Linear Programming Problem
The linear programming problem in the standard form is formulated in the following way [1,2,3]. It is required to find an extremum (minimum or maximum), if it exists, of a linear function
$J(x) = c_1 x_1 + \dots + c_n x_n$, (1)
subject to the constraints
$a_{j1} x_1 + \dots + a_{jn} x_n = b_j$, $j = 1, \dots, m$, (2)
$x_i \ge 0$, $i = 1, \dots, n$, (3)
where $m < n$ and equations (2) are linearly independent.
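For one fixed choice of basic columns, the corresponding corner point and its objective value follow from Cramer's rule alone, with no elimination steps. A sketch on a small standard-form instance (the problem data and variable names are ours for illustration):

```python
# Maximize J(x) = 3*x1 + 5*x2 subject to
#   x1 + x3 = 4,  x2 + x4 = 6,  3*x1 + 2*x2 + x5 = 18,  x >= 0.
from fractions import Fraction

A = [[1, 0, 1, 0, 0],
     [0, 1, 0, 1, 0],
     [3, 2, 0, 0, 1]]
b = [4, 6, 18]
c = [3, 5, 0, 0, 0]

def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

# Pick basic columns (k1, k2, k3) = (1, 2, 3) (0-based: 0, 1, 2) and
# apply Cramer's rule: x_j = det(B with column j replaced by b) / det(B).
cols = (0, 1, 2)
B = [[A[i][j] for j in cols] for i in range(3)]
delta = det3(B)
x_basic = []
for j in range(3):
    Bj = [row[:] for row in B]
    for i in range(3):
        Bj[i][j] = b[i]
    x_basic.append(Fraction(det3(Bj), delta))
J = sum(c[k] * x for k, x in zip(cols, x_basic))
print(x_basic, J)   # corner point (x1, x2, x3) = (2, 6, 2), J = 36
```

All coordinates are nonnegative, so this choice of columns yields a valid corner point.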

Further, the simple case when the linear variety determined by equations (2) passes through the origin is excluded from consideration. In that case there is only one corner point of the system of equations and inequalities (2)-(3), the origin, which is easy to check for the extremum.
Since not all $b_j$, $j = 1, \dots, m$, are zero, equations (2) can be rewritten in an equivalent form. It is known that the function $J(x)$ reaches an extremum, when it exists, at one or more corner points. The value of the objective function at the corner points of the system (2)-(3) belongs to a finite set of fractions whose numerators and denominators are determinants composed of the columns with numbers $k_1, k_2, \dots, k_m$ of the constraint matrices; $C_n^m$ is the number of elements in the latter set. The extreme value of the objective function is then taken from this set, restricted to the valid corner points, those with nonnegative coordinates.
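This determinant-ratio description of the corner-point values can be sketched end to end: enumerate all choices of $m$ columns, form the objective value of each candidate corner point as a ratio of determinants, and select the extremum over the valid (nonnegative) points. The problem data below are ours for illustration:

```python
# Maximize J = 2*x1 + 3*x2 with  x1 + x2 + x3 = 4,
# x1 + 3*x2 + x4 = 6, x >= 0  (slack form of two inequalities).
from fractions import Fraction
from itertools import combinations

A = [[1, 1, 1, 0],
     [1, 3, 0, 1]]
b = [4, 6]
c = [2, 3, 0, 0]
m, n = 2, 4

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

spectrum = []   # (objective value, basic columns, basic coordinates)
for cols in combinations(range(n), m):
    B = [[A[i][j] for j in cols] for i in range(m)]
    delta = det2(B)
    if delta == 0:
        continue            # these columns determine no corner point
    x = []
    for j in range(m):
        Bj = [row[:] for row in B]
        for i in range(m):
            Bj[i][j] = b[i]
        x.append(Fraction(det2(Bj), delta))   # Cramer's rule
    J = sum(c[k] * xk for k, xk in zip(cols, x))
    spectrum.append((J, cols, x))

# Keep only the valid (nonnegative) corner points, then take the maximum.
best = max(J for J, cols, x in spectrum if all(xk >= 0 for xk in x))
print(best)   # 9, attained at (x1, x2) = (3, 1)
```

Note that the vertices themselves appear only as by-products of the Cramer ratios; the spectrum of objective values is computed first, exactly as the method prescribes.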

Conclusion
The paper presents a new, simple method, adopted from geometry [4,5], for solving the linear programming problem using only the idea of linear dependence of vectors. The method is easy to program and, in some cases, can compete with the G. B. Dantzig simplex method widely used to solve practical problems [1,2,3].
At the first stage the algorithm calculates the spectrum of all possible values of the objective function at the vertices of the simplex, without calculating the vertices themselves, using explicit formulas via determinants. The duration of this phase is caused by the need to calculate a large number, $2 \cdot C_n^m$, of $m \times m$ determinants. The second stage sorts through the points outside the simplex until the first extreme point belonging to the simplex is found. In practice, the second step is much shorter than in G. B. Dantzig's algorithm [1].
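The first-stage work count, two $m \times m$ determinants (one numerator, one denominator) per choice of $m$ basic columns out of $n$, is easy to tabulate. A minimal sketch (the function name is ours):

```python
# First stage cost: 2 * C(n, m) determinants of size m x m,
# one numerator and one denominator per candidate corner point.
from math import comb

def first_stage_determinants(n: int, m: int) -> int:
    return 2 * comb(n, m)

print(first_stage_determinants(10, 4))   # 420
```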