Smoothed analysis is a way of measuring the complexity of an algorithm. It gives a more realistic analysis of an algorithm's practical performance, such as its running time, than worst-case or average-case analysis. For instance, the simplex algorithm runs in exponential time in the worst case, and yet in practice it is a very efficient algorithm. This was one of the main motivations for developing smoothed analysis.
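One standard way to state this idea (the notation below is illustrative, not taken from the text above) is to bound the running time T_A of an algorithm A on a worst-case input that has been perturbed by small Gaussian noise:

    C_A(n, \sigma) = \max_{x \in X_n} \; \mathbb{E}_{g \sim \mathcal{N}(0, \sigma^2 I)} \bigl[ T_A(x + g) \bigr]

Here X_n is the set of inputs of size n and \sigma controls the size of the perturbation: as \sigma shrinks the measure approaches worst-case complexity, and as it grows the perturbation dominates and the measure approaches average-case complexity.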
Average-case complexity
Average-case complexity is a subfield of computational complexity theory that studies the complexity of algorithms on random inputs. The study of average-case complexity has applications in the theory of cryptography. Leonid Levin presented the motivation for studying average-case complexity as follows: "Many combinatorial problems (called search or NP problems) have easy methods of checking solutions for correctness."
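In this setting the cost of an algorithm is averaged over a distribution on inputs rather than taken at the single worst input. A minimal sketch, assuming a distribution D_n over inputs of size n (the notation is illustrative, not from the text above):

    T_{\mathrm{avg}}(n) = \mathbb{E}_{x \sim D_n}\bigl[T(x)\bigr] = \sum_{|x| = n} \Pr_{D_n}(x)\, T(x)

For example, quicksort runs in expected O(n log n) time when the input ordering is uniformly random, even though its worst-case running time is Theta(n^2).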
Best, worst and average case
In computer science, best, worst and average cases of a given algorithm express what the resource usage is at least, at most and on average, respectively. Usually the resource being considered is running time, but it could also be memory or other resources. In real-time computing, the worst-case execution time is often of particular concern since it is important to know how much time might be needed in the worst case to guarantee that the algorithm will always finish on time.
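As a small illustration (a sketch, not drawn from the text above), linear search makes one comparison in the best case (the target is the first element), n comparisons in the worst case (the target is last or absent), and about n/2 comparisons on average when the target is equally likely to be in any position:

    def linear_search(items, target):
        """Return the index of target in items, or -1 if it is absent."""
        for i, value in enumerate(items):
            if value == target:   # one comparison per loop iteration
                return i
        return -1

    data = list(range(1000))
    linear_search(data, 0)     # best case: 1 comparison
    linear_search(data, 999)   # worst case: 1000 comparisons
    linear_search(data, -1)    # target absent: also 1000 comparisons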
Complexity class
In computational complexity theory, a complexity class is a set of problems of related resource-based complexity. A typical complexity class has a definition of the form: the set of problems that can be solved by an abstract machine M using O(f(n)) of resource R, where n is the size of the input.
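For instance, instantiating this template with a deterministic Turing machine as M, time as the resource R, and f(n) = n^2 yields the class usually written DTIME(n^2) (this particular instantiation is an illustration, not taken from the text above):

    \mathrm{DTIME}(n^2) = \{\, L \mid L \text{ is decided by some deterministic Turing machine in } O(n^2) \text{ time} \,\}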
Hardness of approximation
In computer science, hardness of approximation is a field that studies the algorithmic complexity of finding near-optimal solutions to optimization problems. It complements the study of approximation algorithms by proving, for certain problems, a limit on the factors with which their solution can be efficiently approximated.
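Concretely, for a minimization problem an algorithm ALG achieves approximation factor c if, on every instance x,

    \mathrm{ALG}(x) \le c \cdot \mathrm{OPT}(x), \qquad c \ge 1,

and a hardness-of-approximation result rules out small values of c for polynomial-time algorithms under an assumption such as P \ne NP. For example, it is known that MAX-3SAT cannot be approximated in polynomial time to within any factor better than 7/8 of the optimum unless P = NP. (The notation and the example are illustrations, not taken from the text above.)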
Analysis of algorithms
In computer science, the analysis of algorithms is the determination of the amount of resources (such as time and storage) necessary to execute them. Most algorithms are designed to work with inputs of arbitrary length. Usually the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps (time complexity) or storage locations (space complexity).
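As a minimal sketch (not drawn from the text above), one can count a chosen basic operation as a function of the input length n; for the doubly nested loop below the count is n(n-1)/2, a quadratic function of n:

    def count_duplicate_comparisons(items):
        """Count the element comparisons made by a naive duplicate check.
        For an input of length n this performs n*(n-1)/2 comparisons."""
        comparisons = 0
        n = len(items)
        for i in range(n):
            for j in range(i + 1, n):
                comparisons += 1          # count the comparison on the next line
                if items[i] == items[j]:
                    pass                  # a duplicate; only the count matters here
        return comparisons

    for n in (10, 100, 1000):
        print(n, count_duplicate_comparisons(list(range(n))))  # 45, 4950, 499500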
P (complexity)
In computational complexity theory, P, also known as PTIME or DTIME(n^O(1)), is one of the most fundamental complexity classes. It contains all decision problems which can be solved by a deterministic Turing machine using a polynomial amount of computation time, or polynomial time.
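Equivalently (a standard formulation, not spelled out in the text above), P is the union of the deterministic time classes over all fixed polynomial exponents:

    \mathrm{P} = \mathrm{DTIME}\bigl(n^{O(1)}\bigr) = \bigcup_{k \ge 1} \mathrm{DTIME}\bigl(n^k\bigr)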