Inequality
In mathematics, an inequality is a relation that holds between two values when they are different. The notation a ≠ b means that a is not equal to b. It does not say that one is greater than the other, or even that they can be compared in size. If the values in question are elements of an ordered set, such as the integers or the real numbers, they can be compared in size. The notation a < b means that a is less than b. The notation a > b means that a is greater than b.
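The distinction above can be sketched in Python (the values chosen here are our own illustration): integers are totally ordered, so unequal integers are always comparable, while sets under inclusion are only partially ordered, so two unequal sets may be incomparable.

```python
# Integers are totally ordered: exactly one of <, ==, > holds.
a, b = 3, 5
assert a != b and a < b

# Sets under inclusion are only partially ordered: these two sets are
# unequal, yet neither is "less than" (a strict subset of) the other.
s, t = {1, 2}, {2, 3}
assert s != t
assert not (s < t) and not (s > t)   # Python's < on sets means strict subset
```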
more from Wikipedia
X
X is the twenty-fourth letter in the ISO basic Latin alphabet.
Codomain
In mathematics, the codomain or target set of a function is the set Y into which all of the output of the function is constrained to fall. It is the set Y in the notation f: X → Y. The codomain is also sometimes referred to as the range, but that term is ambiguous as it may also refer to the image. The codomain is part of the modern definition of a function f as a triple (X, Y, F), with F a subset of the Cartesian product X × Y.
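The triple definition can be made concrete with a small sketch (the sets and the squaring function below are our own example, not from the source): the image is the set of values actually attained, and it sits inside the codomain without having to fill it.

```python
X = {-2, -1, 0, 1, 2}          # domain
Y = {0, 1, 2, 3, 4, 5}         # codomain: outputs are constrained to fall here
F = {(x, x * x) for x in X}    # graph of f(x) = x^2, a subset of X × Y

image = {y for (_, y) in F}    # the image: values actually attained
assert image == {0, 1, 4}
assert image <= Y              # the image is contained in, but smaller than, the codomain
```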
Consistency
In logic, a consistent theory is one that does not contain a contradiction. The lack of contradiction can be defined in either semantic or syntactic terms. The semantic definition states that a theory is consistent if and only if it has a model, i.e. there exists an interpretation under which all formulas in the theory are true. This is the sense used in traditional Aristotelian logic, although in contemporary mathematical logic the term satisfiable is used instead.
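The semantic definition can be checked by brute force for a tiny propositional theory (the formulas below are our own toy example): a theory is consistent exactly when some truth assignment, a model, makes every formula true.

```python
from itertools import product

# A toy theory over two propositional variables p and q.
theory = [
    lambda p, q: p or q,        # p ∨ q
    lambda p, q: not p or q,    # p → q
]

# Search all truth assignments for a model.
models = [
    (p, q)
    for p, q in product([False, True], repeat=2)
    if all(f(p, q) for f in theory)
]
assert models  # non-empty: the theory has a model, hence it is consistent

# Adding ¬q contradicts the formulas above: no assignment satisfies all three.
inconsistent = theory + [lambda p, q: not q]
assert not any(all(f(p, q) for f in inconsistent)
               for p, q in product([False, True], repeat=2))
```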
Armstrong's axioms
Armstrong's axioms are a set of axioms (or, more precisely, inference rules) used to infer all the functional dependencies on a relational database. They were developed by William W. Armstrong in his 1974 paper. The axioms are sound in that, when applied to a set of functional dependencies (denoted F), they generate only functional dependencies in the closure of that set (denoted F⁺).
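One common way to put the axioms to work is the standard attribute-closure algorithm, which applies reflexivity, augmentation and transitivity implicitly. This is a minimal sketch with a made-up schema and dependencies, not code from the source:

```python
def closure(attrs, fds):
    """Return the closure of `attrs` under the functional dependencies `fds`,
    each given as a (lhs, rhs) pair of frozensets of attributes."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If the left-hand side is already derivable, so is the right-hand side.
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

fds = [(frozenset("A"), frozenset("B")),   # A -> B
       (frozenset("B"), frozenset("C"))]   # B -> C
assert closure({"A"}, fds) == {"A", "B", "C"}   # hence A -> C is in the closure
```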
Arithmetic
Arithmetic or arithmetics is the oldest and most elementary branch of mathematics, used by almost everyone for tasks ranging from simple day-to-day counting to advanced science and business calculations. It involves the study of quantity, especially as the result of operations that combine numbers. In common usage, it refers to the simpler properties of the traditional operations of addition, subtraction, multiplication and division applied to small numbers.
Information theory
Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. It was developed by Claude E. Shannon to find fundamental limits on signal-processing operations such as compressing data and reliably storing and communicating data.
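The basic quantity behind this quantification is Shannon entropy, H = −Σ p·log₂ p, measured in bits. A minimal sketch (our own example distributions):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

assert entropy([0.5, 0.5]) == 1.0              # a fair coin carries 1 bit
assert entropy([1.0]) == 0.0                   # a certain outcome carries no information
assert abs(entropy([0.25] * 4) - 2.0) < 1e-9   # four equally likely outcomes: 2 bits
```

Entropy is also the fundamental limit Shannon proved for lossless compression: no code can use fewer bits per symbol, on average, than the source's entropy.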
Multivalued dependency
In database theory, a multivalued dependency is a full constraint between two sets of attributes in a relation. In contrast to a functional dependency, a multivalued dependency requires that certain tuples be present in a relation; it is therefore a special case of a tuple-generating dependency. Multivalued dependencies play a role in 4NF database normalization.
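The tuple-generating requirement can be sketched directly (the relation, attribute names and helper below are our own hypothetical example): an MVD X ↠ Y holds when, for every pair of tuples agreeing on X, the tuple that takes its Y-values from the first and its remaining values from the second is also present.

```python
from itertools import product

def mvd_holds(relation, x, y, z):
    """Check X ->> Y in `relation` (a list of dicts); x, y, z are disjoint
    attribute lists that together cover the schema."""
    proj = lambda t, attrs: tuple(t[a] for a in attrs)
    for t1, t2 in product(relation, repeat=2):
        if proj(t1, x) == proj(t2, x):
            # The "swapped" tuple the MVD requires to be present:
            required = dict(zip(x + y + z,
                                proj(t1, x) + proj(t1, y) + proj(t2, z)))
            if required not in relation:
                return False
    return True

# course ->> book: books and lecturers vary independently within each course.
r = [{"course": "db", "book": "b1", "lecturer": "ann"},
     {"course": "db", "book": "b2", "lecturer": "ann"},
     {"course": "db", "book": "b1", "lecturer": "bob"},
     {"course": "db", "book": "b2", "lecturer": "bob"}]
assert mvd_holds(r, ["course"], ["book"], ["lecturer"])
assert not mvd_holds(r[:3], ["course"], ["book"], ["lecturer"])   # a required tuple is missing
```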