# Savitch's theorem

In computational complexity theory, **Savitch's theorem**, proved by Walter Savitch in 1970, gives a relationship between deterministic and nondeterministic space complexity. It states that for any function $f \in \Omega(\log n)$,

$$\mathsf{NSPACE}\left(f(n)\right) \subseteq \mathsf{DSPACE}\left(f(n)^2\right).$$

In other words, if a nondeterministic Turing machine can solve a problem using $f(n)$ space, a deterministic Turing machine can solve the same problem in the square of that space bound. Although it seems that nondeterminism may produce exponential gains in time, this theorem shows that it has a markedly more limited effect on space requirements.

## Proof

The proof relies on an algorithm for STCON, the problem of determining whether there is a path between two vertices in a directed graph, which runs in $O\left((\log n)^2\right)$ space for $n$ vertices. The basic idea of the algorithm is to recursively solve a somewhat more general problem: testing the existence of a path from a vertex $s$ to another vertex $t$ that uses at most $k$ edges, where $k$ is a parameter given as input to the recursive algorithm. STCON may be solved from this problem by setting $k$ to $n$. To test for a $k$-edge path from $s$ to $t$, one may test whether each vertex $u$ can be the midpoint of the $s$–$t$ path, by recursively searching for paths of half the length from $s$ to $u$ and from $u$ to $t$.
Using pseudocode (in Python syntax) we can express this algorithm as follows:

```python
def k_edge_path(s, t, k) -> bool:
    """k initially equals n (the number of vertices)"""
    if k == 0:
        return s == t
    if k == 1:
        # a path of at most one edge: either s is already t, or there is a direct edge
        return s == t or (s, t) in edges
    for u in vertices:
        if k_edge_path(s, u, floor(k / 2)) and k_edge_path(u, t, ceil(k / 2)):
            return True
    return False
```

This search calls itself to a recursion depth of $O(\log n)$ levels, each of which requires $O(\log n)$ bits to store the function arguments and local variables at that level: $k$ and the vertices $s$, $t$, and $u$ require $O(\log n)$ bits each. The total auxiliary space complexity is thus $O\left((\log n)^2\right)$.[Note 1] Although described above in the form of a program in a high-level language, the same algorithm may be implemented with the same asymptotic space bound on a Turing machine.
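To make the pseudocode concrete, here is a self-contained, runnable version; the example graph and the parameter-passing style (threading `vertices` and `edges` explicitly) are illustrative additions, not part of the original article.

```python
from math import ceil, floor

def k_edge_path(vertices, edges, s, t, k):
    """Deterministically test whether a path of at most k edges leads from s to t."""
    if k == 0:
        return s == t
    if k == 1:
        # at most one edge: s already equals t, or a direct edge exists
        return s == t or (s, t) in edges
    for u in vertices:
        # Try u as the midpoint; only one chain of O(log n) recursive
        # frames is live at any moment, each holding O(log n) bits.
        if k_edge_path(vertices, edges, s, u, floor(k / 2)) and \
           k_edge_path(vertices, edges, u, t, ceil(k / 2)):
            return True
    return False

# A small directed graph: 0 -> 1 -> 2 -> 3, plus a dead end 1 -> 4.
vertices = range(5)
edges = {(0, 1), (1, 2), (2, 3), (1, 4)}

print(k_edge_path(vertices, edges, 0, 3, len(vertices)))  # True: 0->1->2->3
print(k_edge_path(vertices, edges, 3, 0, len(vertices)))  # False: no reverse path
```

Note that this recomputes subproblems many times, so it is far slower than breadth-first search; the point of the midpoint recursion is the space bound, not the running time.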

To see why this algorithm implies the theorem, consider the following. For any language $L \in \mathsf{NSPACE}(f(n))$, there is a Turing machine $M$ which decides $L$ in space $f(n)$. Assume w.l.o.g. that the alphabet is binary, i.e. $L \subseteq \{0,1\}^*$. For any input word $x \in \{0,1\}^*$, there is a directed graph $G_x^M$ whose vertices are the configurations of $M$ when running on the input $x$. There may be infinitely many such configurations; e.g. when $M$ keeps writing a symbol on the tape and moving the head to the right in a loop, ad infinitum, the configurations grow arbitrarily large. However, we know that at most $f(n)$ space is needed to decide whether $x \in L$, so we only care about configurations of size at most $f(n)$; call any such configuration *admissible*. There are finitely many admissible configurations, namely $2^{O(f(n))}$. Therefore, the induced subgraph $[G_x^M]$ of $G_x^M$ containing exactly the admissible configurations has $2^{O(f(n))}$ vertices. For each input $x \in \{0,1\}^n$, $[G_x^M]$ has a path from the starting configuration to an accepting configuration if and only if $x \in L$. Thus by deciding connectivity in $[G_x^M]$, we can decide membership of $x$ in $L$. By the above algorithm this can be done deterministically in space $O\left(\left(\log 2^{O(f(n))}\right)^2\right) = O\left(f(n)^2\right)$; hence $L$ is in $\mathsf{DSPACE}\left(f(n)^2\right)$.
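A key point in this argument is that the configuration graph is never stored explicitly: the deterministic simulation only needs to enumerate candidate configurations and to check, on demand, whether one configuration follows from another in a single step of $M$. The sketch below illustrates this with a toy "configuration graph" whose edges come from a `successors` function rather than a stored edge set; the encoding of configurations as small integers and the step relation are purely illustrative assumptions, not the construction for any real Turing machine.

```python
from math import ceil, floor

def reachable(configs, successors, s, t, k):
    """Midpoint reachability over an implicitly represented graph:
    edges are generated on demand by successors(), so no space is
    spent materializing the configuration graph itself."""
    if k == 0:
        return s == t
    if k == 1:
        return s == t or t in successors(s)
    for u in configs:
        if reachable(configs, successors, s, u, floor(k / 2)) and \
           reachable(configs, successors, u, t, ceil(k / 2)):
            return True
    return False

# Toy configuration graph: "configurations" are the integers 0..7, and a
# (nondeterministic) step from c may go to c+1 or c+2.
configs = range(8)

def successors(c):
    return {c + 1, c + 2} & set(configs)

start, accept = 0, 7
print(reachable(configs, successors, start, accept, len(configs)))  # True
```

With $2^{O(f(n))}$ configurations, the recursion depth is $\log 2^{O(f(n))} = O(f(n))$ and each frame stores $O(f(n))$ bits, which is exactly where the $O(f(n)^2)$ bound comes from.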

Since this holds for all $f \in \Omega(\log n)$ and all $L \in \mathsf{NSPACE}\left(f(n)\right)$, we get the statement of the theorem: for all functions $f \in \Omega(\log n)$, $\mathsf{NSPACE}\left(f(n)\right) \subseteq \mathsf{DSPACE}\left(f(n)^2\right)$.

[Note 1] There can be up to $n^2$ edges in the input graph, each edge consuming $2 \log n$ bits of space, hence the total (i.e., not merely auxiliary) space complexity is $O(n^2 \log n)$, and this bound is tight. However, edges may be represented implicitly, via the nondeterministic Turing machine under inspection.

## Corollaries

Some important corollaries of the theorem include:

- **PSPACE = NPSPACE**: This follows directly from the fact that the square of a polynomial function is still a polynomial function. It is believed that a similar relationship does not exist between the polynomial time complexity classes, P and NP, although this is still an open question.
- **NL ⊆ L²**: STCON is NL-complete, and so all languages in NL are also in the complexity class $\mathsf{L}^2 = \mathsf{DSPACE}\left((\log n)^2\right)$.

## See also

- Exponential time hypothesis
- Immerman–Szelepcsényi theorem

## Notes

1. ^ Arora & Barak (2009), p. 86
2. ^ Arora & Barak (2009), p. 92

## References

- Arora, Sanjeev; Barak, Boaz (2009), *Computational Complexity: A Modern Approach*, Cambridge University Press, ISBN 978-0-521-42426-4, Zbl 1193.68112
- Papadimitriou, Christos (1993), "Section 7.3: The Reachability Method", *Computational Complexity* (1st ed.), Addison Wesley, pp. 149–150, ISBN 0-201-53082-1
- Savitch, Walter J. (1970), "Relationships between nondeterministic and deterministic tape complexities", *Journal of Computer and System Sciences*, 4 (2): 177–192, doi:10.1016/S0022-0000(70)80006-X, hdl:10338.dmlcz/120475
- Sipser, Michael (1997), "Section 8.1: Savitch's Theorem", *Introduction to the Theory of Computation*, PWS Publishing, pp. 279–281, ISBN 0-534-94728-X

## External links

- Lance Fortnow, Foundations of Complexity, Lesson 18: Savitch's Theorem. Accessed 09/09/09.
- Richard J. Lipton, Savitch's Theorem. Gives a historical account of how the proof was discovered.
