Algorithmic Complexity/Examples/Multiplication of Square Matrices
Example of Algorithmic Complexity
Consider two square matrices $\mathbf A$ and $\mathbf B$ of order $n$.
Consider the matrix multiplication operation:
- $\ds \mathbf A \times \mathbf B := \forall i, j \in \closedint 1 n: c_{i j} = \sum_{k \mathop = 1}^n a_{i k} \circ b_{k j}$
Computing each of the $n^2$ entries directly in this way takes $n$ multiplications and $n - 1$ additions, so the usual method uses $n^3$ multiplications and $n^3 - n^2$ additions: of the order of $2 n^3$ steps in total.
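The operation count can be checked directly. The following Python sketch is not part of the ProofWiki source and is added only for illustration; the names `schoolbook_multiply`, `mults` and `adds` are illustrative. It performs the usual (schoolbook) multiplication and tallies the scalar multiplications and additions.

```python
# Schoolbook multiplication of two n x n matrices, counting the scalar
# operations performed.  Each of the n^2 entries needs n multiplications
# and n - 1 additions, giving n^3 multiplications and n^3 - n^2 additions:
# of the order of 2 n^3 steps in total.

def schoolbook_multiply(A, B):
    n = len(A)
    mults = adds = 0
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            total = A[i][0] * B[0][j]
            mults += 1
            for k in range(1, n):
                total += A[i][k] * B[k][j]
                mults += 1
                adds += 1
            C[i][j] = total
    return C, mults, adds


if __name__ == "__main__":
    n = 4
    A = [[i + j for j in range(n)] for i in range(n)]
    B = [[i * j + 1 for j in range(n)] for i in range(n)]
    C, mults, adds = schoolbook_multiply(A, B)
    print(mults, adds)   # 64 multiplications, 48 additions for n = 4
```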
However, using the Strassen algorithm, this takes a number of steps proportional to $n^{\lg 7}$, where $\lg$ denotes logarithm base $2$.
As $\lg 7 \approx 2.807 < 3$, it follows that the Strassen algorithm uses fewer steps than the usual method, for sufficiently large $n$.
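The exponent $\lg 7$ arises as follows: the Strassen algorithm partitions each matrix into four blocks of order $n / 2$ and forms the product from $7$ block multiplications rather than the $8$ of the usual method, so the number $\map M n$ of scalar multiplications satisfies $\map M n = 7 \map M {n / 2}$ with $\map M 1 = 1$, whence $\map M n = 7^{\lg n} = n^{\lg 7}$. The following Python sketch is not from the source and assumes $n$ is a power of $2$; the helpers `split`, `add`, `sub` and the multiplication counter are illustrative only. It implements one standard formulation of Strassen's seven products and counts the scalar multiplications.

```python
# Strassen multiplication of two n x n matrices (n a power of 2),
# counting the scalar multiplications.  Each level uses 7 block products
# instead of 8, so M(n) = 7 M(n/2), M(1) = 1, i.e. M(n) = n^{lg 7}.

def split(M):
    """Return the four quadrants M11, M12, M21, M22 of M."""
    n = len(M) // 2
    return ([row[:n] for row in M[:n]], [row[n:] for row in M[:n]],
            [row[:n] for row in M[n:]], [row[n:] for row in M[n:]])

def add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def sub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def strassen(A, B, counter):
    n = len(A)
    if n == 1:
        counter[0] += 1
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    # The seven Strassen block products
    M1 = strassen(add(A11, A22), add(B11, B22), counter)
    M2 = strassen(add(A21, A22), B11, counter)
    M3 = strassen(A11, sub(B12, B22), counter)
    M4 = strassen(A22, sub(B21, B11), counter)
    M5 = strassen(add(A11, A12), B22, counter)
    M6 = strassen(sub(A21, A11), add(B11, B12), counter)
    M7 = strassen(sub(A12, A22), add(B21, B22), counter)
    # Reassemble the quadrants of the product
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom


if __name__ == "__main__":
    n = 8
    A = [[1] * n for _ in range(n)]
    B = [[1] * n for _ in range(n)]
    counter = [0]
    C = strassen(A, B, counter)
    print(counter[0])   # 7^3 = 343 multiplications, versus 8^3 = 512
    print(C[0][0])      # each entry of the product equals n = 8
```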
Proof
This theorem requires a proof. You can help $\mathsf{Pr} \infty \mathsf{fWiki}$ by crafting such a proof.
Sources
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.): complexity (of an algorithm)
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.): complexity (of an algorithm)