By Matthew Perdue, Updated Aug 30, 2022
Cardinality is a foundational concept in set theory that describes the size of a set, that is, a collection of distinct objects. For a finite set, the cardinality is a non-negative integer that specifies exactly how many elements the set contains.
Although the notion is simple, correctly determining cardinality is essential for mathematicians, computer scientists, and data analysts alike. Two sets can differ in composition yet share the same cardinality, which is why the concept is used to compare the “size” of sets rather than their specific members.
Start with a concrete, finite set. Elements need not be numbers; they can be letters, symbols, or any distinct items. For instance:
R = {a, 1, 3, 7, @}
Simply count every distinct item in the set. In the example above, there are five elements, so the cardinality of set R is 5.
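In code, the same count falls out of a set data structure. The following is a minimal Python sketch, assuming the symbolic elements a and @ are represented as strings:

# Model R = {a, 1, 3, 7, @}; the non-numeric symbols become strings.
R = {"a", 1, 3, 7, "@"}
# len() reports the number of distinct elements, i.e. the cardinality.
print(len(R))  # prints 5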
The sequence in which elements appear does not affect cardinality. Rearranging the set yields the same count:
R = {a, 1, 3, 7, @}
R′ = {7, @, 3, a, 1}
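In fact, because a set is unordered, R and R′ are the same set. A short Python sketch (again treating a and @ as strings) makes the point:

R = {"a", 1, 3, 7, "@"}
R_prime = {7, "@", 3, "a", 1}  # same elements, listed in a different order
print(R == R_prime)            # True: a set has no notion of order
print(len(R), len(R_prime))    # 5 5: identical cardinality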
Moreover, two different sets can have identical cardinalities. Consider:
R = {a, 1, 3, 7, @}
S = {1, 2, b, 3, 9}
Both sets contain five elements, so Card(R) = Card(S) = 5, even though they are not equal as sets.
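The same comparison is easy to check in Python (with a, @, and b represented as strings):

R = {"a", 1, 3, 7, "@"}
S = {1, 2, "b", 3, 9}
print(len(R) == len(S))  # True: equal cardinality
print(R == S)            # False: not equal as sets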
Understanding cardinality allows you to make accurate comparisons between sets, analyze algorithmic complexity, and interpret data structures with confidence.