Metamath Proof Explorer Theorem List (p. 205 of 426)
Color key: Metamath Proof Explorer (1-27775) | Hilbert Space Explorer (27776-29300) | Users' Mathboxes (29301-42551)
Type | Label | Description
---|---|---
Theorem | mdetf 20401 | Functionality of the determinant, see also definition in [Lang] p. 513. (Contributed by Stefan O'Rear, 9-Jul-2018.) (Proof shortened by AV, 23-Jul-2019.) |
maDet Mat | ||
Theorem | mdetcl 20402 | The determinant evaluates to an element of the base ring. (Contributed by Stefan O'Rear, 9-Sep-2015.) (Revised by AV, 7-Feb-2019.) |
maDet Mat | ||
Theorem | m1detdiag 20403 | The determinant of a 1-dimensional matrix equals its (single) entry. (Contributed by AV, 6-Aug-2019.) |
maDet Mat | ||
Theorem | mdetdiaglem 20404* | Lemma for mdetdiag 20405. Previously part of proof for mdet1 20407. (Contributed by SO, 10-Jul-2018.) (Revised by AV, 17-Aug-2019.) |
maDet Mat mulGrp RHom pmSgn g | ||
Theorem | mdetdiag 20405* | The determinant of a diagonal matrix is the product of the entries in the diagonal. (Contributed by AV, 17-Aug-2019.) |
maDet Mat mulGrp g | ||
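The formal statement above is rendered only symbolically here. As an illustrative numeric sketch of mdetdiag (the `sign` and `det` helpers below are hypothetical stand-ins for the maDet function, implementing the Leibniz sum directly; they are not part of the Metamath database):

```python
from itertools import permutations
from math import prod

def sign(p):
    # parity of a permutation tuple, via its inversion count
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    # Leibniz formula: signed sum over all permutations of column indices
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# For a diagonal matrix, only the identity permutation contributes a
# nonzero product, so the determinant is the product of the diagonal.
d = [[2, 0, 0], [0, 3, 0], [0, 0, 5]]
assert det(d) == 2 * 3 * 5
```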
Theorem | mdetdiagid 20406* | The determinant of a diagonal matrix with identical entries is the power of the entry in the diagonal. (Contributed by AV, 17-Aug-2019.) |
maDet Mat mulGrp .g | ||
Theorem | mdet1 20407 | The determinant of the identity matrix is 1, i.e. the determinant function is normalized, see also definition in [Lang] p. 513. (Contributed by SO, 10-Jul-2018.) (Proof shortened by AV, 25-Nov-2019.) |
maDet Mat | ||
Theorem | mdetrlin 20408 | The determinant function is additive for each row: the matrices X, Y, Z are identical except for the I-th row, and the I-th row of X is the componentwise sum of the I-th rows of Y and Z. In this case the determinant of X is the sum of the determinants of Y and Z. (Contributed by SO, 9-Jul-2018.) (Proof shortened by AV, 23-Jul-2019.) |
maDet Mat | ||
Theorem | mdetrsca 20409 | The determinant function is homogeneous for each row: the matrices X and Z are identical except for the I-th row, and the I-th row of X is the componentwise product of the I-th row of Z and the scalar Y. In this case the determinant of X is the determinant of Z multiplied by Y. (Contributed by SO, 9-Jul-2018.) (Proof shortened by AV, 23-Jul-2019.) |
maDet Mat | ||
Theorem | mdetrsca2 20410* | The determinant function is homogeneous for each row (matrices are given explicitly by their entries). (Contributed by SO, 16-Jul-2018.) |
maDet | ||
Theorem | mdetr0 20411* | The determinant of a matrix with a row containing only 0's is 0. (Contributed by SO, 16-Jul-2018.) |
maDet | ||
Theorem | mdet0 20412 | The determinant of the zero matrix (of dimension greater than 0) is 0. (Contributed by AV, 17-Aug-2019.) |
maDet Mat | ||
Theorem | mdetrlin2 20413* | The determinant function is additive for each row (matrices are given explicitly by their entries). (Contributed by SO, 16-Jul-2018.) |
maDet | ||
Theorem | mdetralt 20414* | The determinant function is alternating regarding rows: if a matrix has two identical rows, its determinant is 0. Corollary 4.9 in [Lang] p. 515. (Contributed by SO, 10-Jul-2018.) (Proof shortened by AV, 23-Jul-2018.) |
maDet Mat | ||
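The alternating property stated by mdetralt can likewise be illustrated numerically; `det` below is a hypothetical Leibniz-sum implementation, not the database's maDet:

```python
from itertools import permutations
from math import prod

def sign(p):
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# Two identical rows: swapping them pairs each permutation with one of
# opposite sign but equal product, so everything cancels.
m = [[1, 2, 3], [4, 5, 6], [1, 2, 3]]  # first and third rows identical
assert det(m) == 0
```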
Theorem | mdetralt2 20415* | The determinant function is alternating regarding rows (matrix is given explicitly by its entries). (Contributed by SO, 16-Jul-2018.) |
maDet | ||
Theorem | mdetero 20416* | The determinant function is multilinear (additive and homogeneous) for each row (matrices are given explicitly by their entries). Corollary 4.9 in [Lang] p. 515. (Contributed by SO, 16-Jul-2018.) |
maDet | ||
Theorem | mdettpos 20417 | Determinant is invariant under transposition. Proposition 4.8 in [Lang] p. 514. (Contributed by Stefan O'Rear, 9-Jul-2018.) |
maDet Mat tpos | ||
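Transpose invariance (mdettpos) in the same illustrative style, with the hypothetical Leibniz-sum `det`:

```python
from itertools import permutations
from math import prod

def sign(p):
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

m = [[1, 2, 0], [3, 4, 5], [0, 1, 2]]
t = [list(row) for row in zip(*m)]  # transpose
assert det(m) == det(t)
```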
Theorem | mdetunilem1 20418* | Lemma for mdetuni 20428. (Contributed by SO, 14-Jul-2018.) |
Mat | ||
Theorem | mdetunilem2 20419* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat | ||
Theorem | mdetunilem3 20420* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat | ||
Theorem | mdetunilem4 20421* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat | ||
Theorem | mdetunilem5 20422* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat | ||
Theorem | mdetunilem6 20423* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat | ||
Theorem | mdetunilem7 20424* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat RHom pmSgn | ||
Theorem | mdetunilem8 20425* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat | ||
Theorem | mdetunilem9 20426* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat | ||
Theorem | mdetuni0 20427* | Lemma for mdetuni 20428. (Contributed by SO, 15-Jul-2018.) |
Mat maDet | ||
Theorem | mdetuni 20428* | According to the definition in [Weierstrass] p. 272, the determinant function is the unique multilinear, alternating and normalized function from the algebra of square matrices of the same dimension over a commutative ring to this ring. So for any multilinear (mdetuni.li and mdetuni.sc), alternating (mdetuni.al) and normalized (mdetuni.no) function D (mdetuni.ff) from the algebra of square matrices (mdetuni.a) to their underlying commutative ring (mdetuni.cr), the function value of this function D for a matrix F (mdetuni.f) is the determinant of this matrix. (Contributed by Stefan O'Rear, 15-Jul-2018.) (Revised by Alexander van der Vekens, 8-Feb-2019.) |
Mat maDet | ||
Theorem | mdetmul 20429 | Multiplicativity of the determinant function: the determinant of a matrix product of square matrices equals the product of their determinants. Proposition 4.15 in [Lang] p. 517. (Contributed by Stefan O'Rear, 16-Jul-2018.) |
Mat maDet | ||
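Multiplicativity (mdetmul) checked numerically; `det` and `matmul` are illustrative helpers, assumed rather than taken from the database:

```python
from itertools import permutations
from math import prod

def sign(p):
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(a, b):
    # product of two square matrices of the same dimension
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
assert det(matmul(A, B)) == det(A) * det(B)
```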
Theorem | m2detleiblem1 20430 | Lemma 1 for m2detleib 20437. (Contributed by AV, 12-Dec-2018.) |
RHom pmSgn pmSgn.g | ||
Theorem | m2detleiblem5 20431 | Lemma 5 for m2detleib 20437. (Contributed by AV, 20-Dec-2018.) |
RHom pmSgn | ||
Theorem | m2detleiblem6 20432 | Lemma 6 for m2detleib 20437. (Contributed by AV, 20-Dec-2018.) |
RHom pmSgn | ||
Theorem | m2detleiblem7 20433 | Lemma 7 for m2detleib 20437. (Contributed by AV, 20-Dec-2018.) |
RHom pmSgn | ||
Theorem | m2detleiblem2 20434* | Lemma 2 for m2detleib 20437. (Contributed by AV, 16-Dec-2018.) (Proof shortened by AV, 1-Jan-2019.) |
Mat mulGrp g | ||
Theorem | m2detleiblem3 20435* | Lemma 3 for m2detleib 20437. (Contributed by AV, 16-Dec-2018.) (Proof shortened by AV, 2-Jan-2019.) |
Mat mulGrp g | ||
Theorem | m2detleiblem4 20436* | Lemma 4 for m2detleib 20437. (Contributed by AV, 20-Dec-2018.) (Proof shortened by AV, 2-Jan-2019.) |
Mat mulGrp g | ||
Theorem | m2detleib 20437 | Leibniz' Formula for 2x2-matrices. (Contributed by AV, 21-Dec-2018.) (Revised by AV, 26-Dec-2018.) (Proof shortened by AV, 23-Jul-2019.) |
maDet Mat | ||
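For 2x2 matrices the Leibniz sum has exactly two summands, giving the familiar ad - bc of m2detleib. A sketch with the hypothetical `det` helper:

```python
from itertools import permutations
from math import prod

def sign(p):
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

a, b, c, d = 3, 7, 2, 5
# identity permutation contributes +a*d, the swap contributes -b*c
assert det([[a, b], [c, d]]) == a * d - b * c  # 15 - 14 = 1
```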
Syntax | cmadu 20438 | Syntax for the matrix adjugate/adjunct function. |
maAdju | ||
Syntax | cminmar1 20439 | Syntax for the minor matrices of a square matrix. |
minMatR1 | ||
Definition | df-madu 20440* | Define the adjugate or adjunct (matrix of cofactors) of a square matrix. This definition gives the standard cofactors, however the internal minors are not the standard minors, see definition in [Lang] p. 518. (Contributed by Stefan O'Rear, 7-Sep-2015.) (Revised by SO, 10-Jul-2018.) |
maAdju Mat maDet | ||
Definition | df-minmar1 20441* | Define the matrices whose determinants are the minors of a square matrix. In contrast to the standard definition of minors, a row is replaced by 0's and one 1 instead of deleting the column and row (e.g., definition in [Lang] p. 515). By this, the determinant of such a matrix is equal to the minor determined in the standard way (as determinant of a submatrix, see df-subma 20383- note that the matrix is transposed compared with the submatrix defined in df-subma 20383, but this does not matter because the determinants are the same, see mdettpos 20417). Such matrices are used in the definition of an adjunct of a square matrix, see df-madu 20440. (Contributed by AV, 27-Dec-2018.) |
minMatR1 Mat | ||
Theorem | mndifsplit 20442 | Lemma for maducoeval2 20446. (Contributed by SO, 16-Jul-2018.) |
Theorem | madufval 20443* | First substitution for the adjunct (cofactor) matrix. (Contributed by SO, 11-Jul-2018.) |
Mat maDet maAdju | ||
Theorem | maduval 20444* | Second substitution for the adjunct (cofactor) matrix. (Contributed by SO, 11-Jul-2018.) |
Mat maDet maAdju | ||
Theorem | maducoeval 20445* | An entry of the adjunct (cofactor) matrix. (Contributed by SO, 11-Jul-2018.) |
Mat maDet maAdju | ||
Theorem | maducoeval2 20446* | An entry of the adjunct (cofactor) matrix. (Contributed by SO, 17-Jul-2018.) |
Mat maDet maAdju | ||
Theorem | maduf 20447 | Creating the adjunct of matrices is a function from the set of matrices into the set of matrices. (Contributed by Stefan O'Rear, 11-Jul-2018.) |
Mat maAdju | ||
Theorem | madutpos 20448 | The adjunct of a transposed matrix is the transpose of the adjunct of the matrix. (Contributed by Stefan O'Rear, 17-Jul-2018.) |
Mat maAdju tpos tpos | ||
Theorem | madugsum 20449* | The determinant of a matrix with a row consisting of the same element is the sum of the elements of the -th column of the adjunct of the matrix multiplied with . (Contributed by Stefan O'Rear, 16-Jul-2018.) |
Mat maAdju maDet g | ||
Theorem | madurid 20450 | Multiplying a matrix with its adjunct results in the identity matrix multiplied with the determinant of the matrix. See Proposition 4.16 in [Lang] p. 518. (Contributed by Stefan O'Rear, 16-Jul-2018.) |
Mat maAdju maDet | ||
Theorem | madulid 20451 | Multiplying the adjunct of a matrix with the matrix results in the identity matrix multiplied with the determinant of the matrix. See Proposition 4.16 in [Lang] p. 518. (Contributed by Stefan O'Rear, 17-Jul-2018.) |
Mat maAdju maDet | ||
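The identities madurid and madulid (A times its adjunct equals det(A) times the identity, on both sides) can be sketched with standard cofactors. The `minor_mat` helper below uses the textbook delete-row-and-column minors rather than the df-minmar1 construction, and all helpers are illustrative assumptions:

```python
from itertools import permutations
from math import prod

def sign(p):
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def minor_mat(m, i, j):
    # standard minor: delete row i and column j
    return [[m[r][c] for c in range(len(m)) if c != j]
            for r in range(len(m)) if r != i]

def adj(m):
    # adjugate: transpose of the cofactor matrix
    n = len(m)
    return [[(-1) ** (i + j) * det(minor_mat(m, j, i)) for j in range(n)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
dI = [[det(A) if i == j else 0 for j in range(2)] for i in range(2)]
assert matmul(A, adj(A)) == dI == matmul(adj(A), A)
```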
Theorem | minmar1fval 20452* | First substitution for the definition of a matrix for a minor. (Contributed by AV, 31-Dec-2018.) |
Mat minMatR1 | ||
Theorem | minmar1val0 20453* | Second substitution for the definition of a matrix for a minor. (Contributed by AV, 31-Dec-2018.) |
Mat minMatR1 | ||
Theorem | minmar1val 20454* | Third substitution for the definition of a matrix for a minor. (Contributed by AV, 31-Dec-2018.) |
Mat minMatR1 | ||
Theorem | minmar1eval 20455 | An entry of a matrix for a minor. (Contributed by AV, 31-Dec-2018.) |
Mat minMatR1 | ||
Theorem | minmar1marrep 20456 | The minor matrix is a special case of a matrix with a replaced row. (Contributed by AV, 12-Feb-2019.) |
Mat matRRep minMatR1 matRRep | ||
Theorem | minmar1cl 20457 | Closure of the row replacement function for square matrices: The matrix for a minor is a matrix. (Contributed by AV, 13-Feb-2019.) |
Mat minMatR1 | ||
Theorem | maducoevalmin1 20458 | The coefficients of an adjunct (matrix of cofactors) expressed as determinants of the minor matrices (alternative definition) of the original matrix. (Contributed by AV, 31-Dec-2018.) |
Mat maDet maAdju minMatR1 | ||
According to Wikipedia ("Laplace expansion", 08-Mar-2019, https://en.wikipedia.org/wiki/Laplace_expansion) "In linear algebra, the Laplace expansion, named after Pierre-Simon Laplace, also called cofactor expansion, is an expression for the determinant det(B) of an n x n -matrix B that is a weighted sum of the determinants of n sub-matrices of B, each of size (n-1) x (n-1)". The expansion is usually performed for a row of matrix B (alternatively for a column of matrix B). The mentioned "sub-matrices" are the matrices resulting from deleting the i-th row and the j-th column of matrix B. The mentioned "weights" (factors/coefficients) are the elements at position i and j in matrix B. If the expansion is performed for a row, the coefficients are the elements of the selected row. In the following, only the case where the row for the expansion contains the zero element of the underlying ring at every position except the diagonal position is considered. By this, the sum for the Laplace expansion is reduced to one summand, consisting of the element at the diagonal position multiplied with the determinant of the corresponding submatrix, see smadiadetg 20479 or smadiadetr 20481. | ||
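The expansion described above, and its one-summand special case used by smadiadetg, can be sketched numerically. All helpers (`det` as a Leibniz sum, `minor_mat` as the standard delete-row-and-column minor, `laplace` as cofactor expansion along a row) are illustrative assumptions, not database functions:

```python
from itertools import permutations
from math import prod

def sign(p):
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def minor_mat(m, i, j):
    return [[m[r][c] for c in range(len(m)) if c != j]
            for r in range(len(m)) if r != i]

def laplace(m, i):
    # cofactor expansion of the determinant along row i
    return sum((-1) ** (i + j) * m[i][j] * det(minor_mat(m, i, j))
               for j in range(len(m)))

m = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
assert all(laplace(m, i) == det(m) for i in range(3))

# Special case: if row i is all 0's except an entry e at the diagonal
# position, the expansion reduces to the single summand e * (minor det).
e = 7
m2 = [[2, 1, 0], [0, e, 0], [0, 1, 2]]
assert det(m2) == e * det(minor_mat(m2, 1, 1))
```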
Theorem | symgmatr01lem 20459* | Lemma for symgmatr01 20460. (Contributed by AV, 3-Jan-2019.) |
Theorem | symgmatr01 20460* | Applying a permutation that does not fix a certain element of a set to a second element to an index of a matrix a row with 0's and a 1. (Contributed by AV, 3-Jan-2019.) |
Theorem | gsummatr01lem1 20461* | Lemma A for gsummatr01 20465. (Contributed by AV, 8-Jan-2019.) |
Theorem | gsummatr01lem2 20462* | Lemma B for gsummatr01 20465. (Contributed by AV, 8-Jan-2019.) |
Theorem | gsummatr01lem3 20463* | Lemma 1 for gsummatr01 20465. (Contributed by AV, 8-Jan-2019.) |
CMnd g g | ||
Theorem | gsummatr01lem4 20464* | Lemma 2 for gsummatr01 20465. (Contributed by AV, 8-Jan-2019.) |
CMnd | ||
Theorem | gsummatr01 20465* | Lemma 1 for smadiadetlem4 20475. (Contributed by AV, 8-Jan-2019.) |
CMnd g g | ||
Theorem | marep01ma 20466* | Replacing a row of a square matrix by a row with 0's and a 1 results in a square matrix of the same dimension. (Contributed by AV, 30-Dec-2018.) |
Mat | ||
Theorem | smadiadetlem0 20467* | Lemma 0 for smadiadet 20476: The products of the Leibniz formula vanish for all permutations fixing the index of the row containing the 0's and the 1 to the column with the 1. (Contributed by AV, 3-Jan-2019.) |
Mat mulGrp g | ||
Theorem | smadiadetlem1 20468* | Lemma 1 for smadiadet 20476: A summand of the determinant of a matrix belongs to the underlying ring. (Contributed by AV, 1-Jan-2019.) |
Mat mulGrp RHom pmSgn g | ||
Theorem | smadiadetlem1a 20469* | Lemma 1a for smadiadet 20476: The summands of the Leibniz formula vanish for all permutations fixing the index of the row containing the 0's and the 1 to the column with the 1. (Contributed by AV, 3-Jan-2019.) |
Mat mulGrp RHom pmSgn g g | ||
Theorem | smadiadetlem2 20470* | Lemma 2 for smadiadet 20476: The summands of the Leibniz formula vanish for all permutations fixing the index of the row containing the 0's and the 1 to itself. (Contributed by AV, 31-Dec-2018.) |
Mat mulGrp RHom pmSgn g g | ||
Theorem | smadiadetlem3lem0 20471* | Lemma 0 for smadiadetlem3 20474. (Contributed by AV, 12-Jan-2019.) |
Mat mulGrp RHom pmSgn pmSgn g | ||
Theorem | smadiadetlem3lem1 20472* | Lemma 1 for smadiadetlem3 20474. (Contributed by AV, 12-Jan-2019.) |
Mat mulGrp RHom pmSgn pmSgn g | ||
Theorem | smadiadetlem3lem2 20473* | Lemma 2 for smadiadetlem3 20474. (Contributed by AV, 12-Jan-2019.) |
Mat mulGrp RHom pmSgn pmSgn g Cntz g | ||
Theorem | smadiadetlem3 20474* | Lemma 3 for smadiadet 20476. (Contributed by AV, 31-Jan-2019.) |
Mat mulGrp RHom pmSgn pmSgn g g g g | ||
Theorem | smadiadetlem4 20475* | Lemma 4 for smadiadet 20476. (Contributed by AV, 31-Jan-2019.) |
Mat mulGrp RHom pmSgn pmSgn g g g g | ||
Theorem | smadiadet 20476 | The determinant of a submatrix of a square matrix obtained by removing a row and a column at the same index equals the determinant of the original matrix with the row replaced with 0's and a 1 at the diagonal position. (Contributed by AV, 31-Jan-2019.) (Proof shortened by AV, 24-Jul-2019.) |
Mat maDet maDet subMat minMatR1 | ||
Theorem | smadiadetglem1 20477 | Lemma 1 for smadiadetg 20479. (Contributed by AV, 13-Feb-2019.) |
Mat maDet maDet matRRep minMatR1 | ||
Theorem | smadiadetglem2 20478 | Lemma 2 for smadiadetg 20479. (Contributed by AV, 14-Feb-2019.) |
Mat maDet maDet matRRep minMatR1 | ||
Theorem | smadiadetg 20479 | The determinant of a square matrix with one row replaced with 0's and an arbitrary element of the underlying ring at the diagonal position equals the ring element multiplied with the determinant of a submatrix of the square matrix obtained by removing the row and the column at the same index. (Contributed by AV, 14-Feb-2019.) |
Mat maDet maDet matRRep subMat | ||
Theorem | smadiadetg0 20480 | Lemma for smadiadetr 20481: version of smadiadetg 20479 with all hypotheses defining class variables removed, i.e. all class variables defined in the hypotheses replaced in the theorem by their definition. (Contributed by AV, 15-Feb-2019.) |
Mat maDet matRRep maDet subMat | ||
Theorem | smadiadetr 20481 | The determinant of a square matrix with one row replaced with 0's and an arbitrary element of the underlying ring at the diagonal position equals the ring element multiplied with the determinant of a submatrix of the square matrix obtained by removing the row and the column at the same index. Closed form of smadiadetg 20479. Special case of the "Laplace expansion", see definition in [Lang] p. 515. (Contributed by AV, 15-Feb-2019.) |
Mat maDet matRRep maDet subMat | ||
Theorem | invrvald 20482 | If a matrix multiplied with a given matrix (from the left as well as from the right) results in the identity matrix, this matrix is the inverse (matrix) of the given matrix. (Contributed by Stefan O'Rear, 17-Jul-2018.) |
Unit | ||
Theorem | matinv 20483 | The inverse of a matrix is the adjunct of the matrix multiplied with the inverse of the determinant of the matrix if the determinant is a unit in the underlying ring. Proposition 4.16 in [Lang] p. 518. (Contributed by Stefan O'Rear, 17-Jul-2018.) |
Mat maAdju maDet Unit Unit | ||
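Over the rationals every nonzero determinant is a unit, so the matinv formula (inverse = adjunct scaled by the inverse of the determinant) can be checked with exact `Fraction` arithmetic. The helpers are the same illustrative stand-ins as above:

```python
from fractions import Fraction
from itertools import permutations
from math import prod

def sign(p):
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def minor_mat(m, i, j):
    return [[m[r][c] for c in range(len(m)) if c != j]
            for r in range(len(m)) if r != i]

def adj(m):
    n = len(m)
    return [[(-1) ** (i + j) * det(minor_mat(m, j, i)) for j in range(n)]
            for i in range(n)]

A = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]
d = det(A)  # -2, a unit in the field of rationals
Ainv = [[entry / d for entry in row] for row in adj(A)]
I2 = [[Fraction(int(i == j)) for j in range(2)] for i in range(2)]
assert matmul(A, Ainv) == I2 and matmul(Ainv, A) == I2
```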
Theorem | matunit 20484 | A matrix is a unit in the ring of matrices iff its determinant is a unit in the underlying ring. (Contributed by Stefan O'Rear, 17-Jul-2018.) |
Mat maDet Unit Unit | ||
In the following, Cramer's rule cramer 20497 is proven. According to Wikipedia "Cramer's rule", 21-Feb-2019, https://en.wikipedia.org/wiki/Cramer%27s_rule: "[Cramer's rule] ... expresses the [unique] solution [of a system of linear equations] in terms of the determinants of the (square) coefficient matrix and of matrices obtained from it by replacing one column by the column vector of right-hand sides of the equations." The outline of the proof for systems of linear equations with coefficients from a commutative ring, according to the proof in Wikipedia (https://en.wikipedia.org/wiki/Cramer's_rule#A_short_proof), is as follows: The system of linear equations to be solved shall be given by the N x N coefficient matrix and the N-dimensional vector . Let be the matrix obtained by replacing the i-th column of the coefficient matrix by the right-hand side vector . Additionally, let be the matrix obtained by replacing the i-th column of the identity matrix by the solution vector , with . Finally, it is assumed that det is a unit in the underlying ring. With these definitions, it follows that (cramerimplem2 20490), using matrix multiplication (mamuval 20192) and multiplication of a vector with a matrix (mulmarep1gsum2 20380). By using the multiplicativity of the determinant (mdetmul 20429) it follows that det det det det (cramerimplem3 20491). Furthermore, it follows that det (cramerimplem1 20489). To show this, a special case of the Laplace expansion is used (smadiadetg 20479). From these equations and the cancellation law for division in a ring (dvrcan3 18692) it follows that det det det . This is the right-to-left implication (cramerimp 20492, cramerlem1 20493, cramerlem2 20494) of Cramer's rule (cramer 20497). The left-to-right implication is shown by cramerlem3 20495, using the fact that a solution of the system of linear equations exists (slesolex 20488).
Notice that for the special case of 0-dimensional matrices/vectors only the left to right implication is valid (see cramer0 20496), because assuming the right-hand side of the implication ( ), could be anything (see mavmul0g 20359). | ||
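The rule outlined above can be sketched over the rationals with exact arithmetic: each solution component is the determinant of the coefficient matrix with one column replaced by the right-hand side, divided by the determinant of the coefficient matrix. The `det` and `repl_col` helpers are illustrative assumptions (`repl_col` plays the role of the matRepV column replacement):

```python
from fractions import Fraction
from itertools import permutations
from math import prod

def sign(p):
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(m):
    n = len(m)
    return sum(sign(p) * prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def repl_col(m, j, v):
    # replace column j of m by the vector v
    return [[v[r] if c == j else m[r][c] for c in range(len(m))]
            for r in range(len(m))]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
b = [1, 2, 3]
d = det(A)  # 8, nonzero and hence a unit in the rationals
x = [Fraction(det(repl_col(A, j, b)), d) for j in range(3)]

# the constructed vector really solves A x = b
assert all(sum(A[i][j] * x[j] for j in range(3)) == b[i] for i in range(3))
```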
Theorem | slesolvec 20485 | Every solution of a system of linear equations represented by a matrix and a vector is a vector. (Contributed by AV, 10-Feb-2019.) (Revised by AV, 27-Feb-2019.) |
Mat maVecMul | ||
Theorem | slesolinv 20486 | The solution of a system of linear equations represented by a matrix with a unit as determinant is the multiplication of the inverse of the matrix with the right-hand side vector. (Contributed by AV, 10-Feb-2019.) (Revised by AV, 28-Feb-2019.) |
Mat maVecMul maDet Unit | ||
Theorem | slesolinvbi 20487 | The solution of a system of linear equations represented by a matrix with a unit as determinant is the multiplication of the inverse of the matrix with the right-hand side vector. (Contributed by AV, 11-Feb-2019.) (Revised by AV, 28-Feb-2019.) |
Mat maVecMul maDet Unit | ||
Theorem | slesolex 20488* | Every system of linear equations represented by a matrix with a unit as determinant has a solution. (Contributed by AV, 11-Feb-2019.) (Revised by AV, 28-Feb-2019.) |
Mat maVecMul maDet Unit | ||
Theorem | cramerimplem1 20489 | Lemma 1 for cramerimp 20492: The determinant of the identity matrix with the ith column replaced by a (column) vector equals the ith component of the vector. (Contributed by AV, 15-Feb-2019.) (Revised by AV, 28-Feb-2019.) |
Mat matRepV maDet | ||
Theorem | cramerimplem2 20490 | Lemma 2 for cramerimp 20492: The matrix of a system of linear equations multiplied with the identity matrix with the ith column replaced by the solution vector of the system of linear equations equals the matrix of the system of linear equations with the ith column replaced by the right-hand side vector of the system of linear equations. (Contributed by AV, 19-Feb-2019.) (Revised by AV, 1-Mar-2019.) |
Mat matRepV matRepV maVecMul maMul | ||
Theorem | cramerimplem3 20491 | Lemma 3 for cramerimp 20492: The determinant of the matrix of a system of linear equations multiplied with the determinant of the identity matrix with the ith column replaced by the solution vector of the system of linear equations equals the determinant of the matrix of the system of linear equations with the ith column replaced by the right-hand side vector of the system of linear equations. (Contributed by AV, 19-Feb-2019.) (Revised by AV, 1-Mar-2019.) |
Mat matRepV matRepV maVecMul maDet | ||
Theorem | cramerimp 20492 | One direction of Cramer's rule (according to Wikipedia "Cramer's rule", 21-Feb-2019, https://en.wikipedia.org/wiki/Cramer%27s_rule: "[Cramer's rule] ... expresses the solution [of a system of linear equations] in terms of the determinants of the (square) coefficient matrix and of matrices obtained from it by replacing one column by the column vector of right-hand sides of the equations."): The i-th component of the solution vector of a system of linear equations equals the determinant of the matrix of the system with the i-th column replaced by the right-hand side vector, divided by the determinant of the matrix of the system. (Contributed by AV, 19-Feb-2019.) (Revised by AV, 1-Mar-2019.) |
Mat matRepV matRepV maVecMul maDet /r Unit | ||
Theorem | cramerlem1 20493* | Lemma 1 for cramer 20497. (Contributed by AV, 21-Feb-2019.) (Revised by AV, 1-Mar-2019.) |
Mat maDet maVecMul /r Unit matRepV | ||
Theorem | cramerlem2 20494* | Lemma 2 for cramer 20497. (Contributed by AV, 21-Feb-2019.) (Revised by AV, 1-Mar-2019.) |
Mat maDet maVecMul /r Unit matRepV | ||
Theorem | cramerlem3 20495* | Lemma 3 for cramer 20497. (Contributed by AV, 21-Feb-2019.) (Revised by AV, 1-Mar-2019.) |
Mat maDet maVecMul /r Unit matRepV | ||
Theorem | cramer0 20496* | Special case of Cramer's rule for 0-dimensional matrices/vectors. (Contributed by AV, 28-Feb-2019.) |
Mat maDet maVecMul /r Unit matRepV | ||
Theorem | cramer 20497* | Cramer's rule. According to Wikipedia "Cramer's rule", 21-Feb-2019, https://en.wikipedia.org/wiki/Cramer%27s_rule: "[Cramer's rule] ... expresses the [unique] solution [of a system of linear equations] in terms of the determinants of the (square) coefficient matrix and of matrices obtained from it by replacing one column by the column vector of right-hand sides of the equations." If it is assumed that a (unique) solution exists, it can be obtained by Cramer's rule (see also cramerimp 20492). On the other hand, if a vector can be constructed by Cramer's rule, it is a solution of the system of linear equations, so at least one solution exists. The uniqueness is ensured by considering only systems of linear equations whose matrix has a unit (of the underlying ring) as determinant, see matunit 20484 or slesolinv 20486. For fields as underlying rings, this requirement is equivalent with the determinant not being 0. Theorem 4.4 in [Lang] p. 513. This is Metamath 100 proof #97. (Contributed by Alexander van der Vekens, 21-Feb-2019.) (Revised by Alexander van der Vekens, 1-Mar-2019.) |
Mat maDet maVecMul /r Unit matRepV | ||
A polynomial matrix or matrix of polynomials is a matrix whose elements are univariate (or multivariate) polynomials. See Wikipedia "Polynomial matrix" https://en.wikipedia.org/wiki/Polynomial_matrix (18-Nov-2019). In this section, only square matrices whose elements are univariate polynomials are considered. Usually, the ring of such matrices, the ring of n x n matrices over the polynomial ring over a ring , is denoted by M(n, R[t]). The elements of this ring are called "polynomial matrices (over the ring )" in the following. In Metamath notation, this ring is defined by Mat Poly1, usually represented by the class variable (or , if is already occupied): Mat with Poly1. | ||
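Ring operations in M(n, R[t]) can be sketched by representing each univariate polynomial as a coefficient list (constant term first) and doing matrix multiplication with polynomial addition and multiplication. All helpers below are illustrative assumptions for R = Z:

```python
def padd(p, q):
    # add two polynomials given as coefficient lists (low degree first)
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0)
            for i in range(n)]

def pmul(p, q):
    # multiply two coefficient-list polynomials (convolution)
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def trim(p):
    # drop trailing zero coefficients, keeping at least the constant term
    while len(p) > 1 and p[-1] == 0:
        p = p[:-1]
    return p

def pmatmul(a, b):
    # matrix multiplication with polynomial entries
    n = len(a)
    out = [[[0]] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            acc = [0]
            for k in range(n):
                acc = padd(acc, pmul(a[i][k], b[k][j]))
            out[i][j] = trim(acc)
    return out

# M = [[t, 1], [0, t]] over Z[t]
M = [[[0, 1], [1]], [[0], [0, 1]]]
M2 = pmatmul(M, M)
# M^2 = [[t^2, 2t], [0, t^2]]
assert M2 == [[[0, 0, 1], [0, 2]], [[0], [0, 0, 1]]]
```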
Theorem | pmatring 20498 | The set of polynomial matrices over a ring is a ring. (Contributed by AV, 6-Nov-2019.) |
Poly1 Mat | ||
Theorem | pmatlmod 20499 | The set of polynomial matrices over a ring is a left module. (Contributed by AV, 6-Nov-2019.) |
Poly1 Mat | ||
Theorem | pmat0op 20500* | The zero polynomial matrix over a ring represented as operation. (Contributed by AV, 16-Nov-2019.) |
Poly1 Mat |
Copyright terms: Public domain