In the mathematical discipline of graph theory, a vertex cover (sometimes node cover) of a graph is a set of vertices such that each edge of the graph is incident to at least one vertex of the set. The problem of finding a minimum vertex cover is a classical optimization problem in computer science and is a typical example of an NP-hard optimization problem that has an approximation algorithm. Its decision version, the vertex cover problem, was one of Karp's 21 NP-complete problems and is therefore a classical NP-complete problem in computational complexity theory. Furthermore, the vertex cover problem is fixed-parameter tractable and a central problem in parameterized complexity theory.
The minimum vertex cover problem can be formulated as a half-integral linear program whose dual linear program is the maximum matching problem.
Definition
Formally, a vertex cover V′ of an undirected graph G = (V, E) is a subset of V such that every edge uv ∈ E has u ∈ V′ or v ∈ V′, that is to say, it is a set of vertices V′ where every edge of the graph has at least one endpoint in the vertex cover V′. Such a set is said to cover the edges of G. The following figure shows two examples of vertex covers, with some vertex cover marked in red.
A minimum vertex cover is a vertex cover of smallest possible size. The vertex cover number τ is the size of a minimum vertex cover, i.e. τ = |V′| for a minimum vertex cover V′. The following figure shows examples of minimum vertex covers in the previous graphs.
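As a direct illustration of the definition, the following short Python check (an illustrative sketch; the graph and sets are made up, not taken from the article) tests whether a candidate set of vertices covers every edge.

```python
# A set V' is a vertex cover iff every edge has at least one endpoint in V'.
def is_vertex_cover(edges, candidate):
    return all(u in candidate or v in candidate for u, v in edges)

# A 4-cycle 0-1-2-3-0: {0, 2} covers every edge, {0, 1} misses the edge (2, 3).
square = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(is_vertex_cover(square, {0, 2}), is_vertex_cover(square, {0, 1}))  # True False
```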
Examples
- The set of all vertices is a vertex cover.
- The endpoints of any maximal matching form a vertex cover.
- The complete bipartite graph K_{m,n} has a minimum vertex cover of size min(m, n).
Properties
- A set of vertices is a vertex cover if and only if its complement is an independent set.
- Consequently, the number of vertices of a graph is equal to its minimum vertex cover number plus the size of a maximum independent set (Gallai 1959).
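A small brute-force computation (an illustrative sketch, not from the article) checks both properties on the path a-b-c-d: the complement of every vertex cover is an independent set, and the minimum vertex cover number plus the maximum independent set size equals the number of vertices.

```python
# Verify the complement property and Gallai's identity on the path a-b-c-d
# by brute force over all vertex subsets.
from itertools import combinations

vertices = ['a', 'b', 'c', 'd']
edges = [('a', 'b'), ('b', 'c'), ('c', 'd')]

def covers(s):       # s is a vertex cover
    return all(u in s or v in s for u, v in edges)

def independent(s):  # s contains no edge
    return not any(u in s and v in s for u, v in edges)

subsets = [set(c) for r in range(len(vertices) + 1)
           for c in combinations(vertices, r)]
# The complement of every vertex cover is an independent set.
assert all(independent(set(vertices) - s) for s in subsets if covers(s))
tau = min(len(s) for s in subsets if covers(s))         # minimum vertex cover number
alpha = max(len(s) for s in subsets if independent(s))  # maximum independent set size
print(tau, alpha, tau + alpha == len(vertices))         # 2 2 True
```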
Computational problem
The minimum vertex cover problem is the optimization problem of finding a smallest vertex cover in a given graph.
If the problem is stated as a decision problem, it is called the vertex cover problem: given a graph G and a positive integer k, decide whether G has a vertex cover of size at most k.
The vertex cover problem is an NP-complete problem: it was one of Karp's 21 NP-complete problems. It is often used in computational complexity theory as a starting point for NP-hardness proofs.
ILP formulation
Assume that every vertex v has an associated cost c(v) ≥ 0. The (weighted) minimum vertex cover problem can be formulated as the following integer linear program (ILP):

minimize   Σ_{v ∈ V} c(v) · x_v
subject to x_u + x_v ≥ 1   for all {u, v} ∈ E
           x_v ∈ {0, 1}    for all v ∈ V

The variable x_v indicates whether vertex v belongs to the vertex cover (x_v = 1) or not (x_v = 0).
This ILP belongs to the more general class of ILPs for covering problems. The integrality gap of this ILP is 2, so its linear programming relaxation (obtained by replacing x_v ∈ {0, 1} with 0 ≤ x_v ≤ 1) gives a factor-2 approximation algorithm for the minimum vertex cover problem. Furthermore, the linear programming relaxation of that ILP is half-integral, that is, there exists an optimal solution for which each entry x_v is either 0, 1/2, or 1; rounding the entries with value at least 1/2 up to 1 yields a vertex cover of size at most twice the optimum.
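As a small worked example of half-integrality (a sketch assuming SciPy's linprog is available; any LP solver would do), the relaxation for the triangle K3 with unit costs has the unique optimum x = (1/2, 1/2, 1/2) of value 3/2, while the smallest integral vertex cover of K3 has size 2.

```python
# LP relaxation of vertex cover on the triangle K3, solved with SciPy's linprog.
from scipy.optimize import linprog

edges = [(0, 1), (0, 2), (1, 2)]   # K3
c = [1, 1, 1]                      # unit vertex costs
# Each constraint x_u + x_v >= 1 is rewritten as -x_u - x_v <= -1 for A_ub form.
A_ub = [[-1 if i in e else 0 for i in range(3)] for e in edges]
b_ub = [-1] * len(edges)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * 3)
print(res.x, res.fun)              # approximately [0.5 0.5 0.5] and 1.5
```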
Exact evaluation
The decision variant of the vertex cover problem is NP-complete, which means it is unlikely that there is an efficient algorithm to solve it exactly. NP-completeness can be proven by reduction from 3-satisfiability or, as Karp did, by reduction from the clique problem. Vertex cover remains NP-complete even in cubic graphs and even in planar graphs of degree at most 3.
For bipartite graphs, the equivalence between vertex cover and maximum matching described by König's theorem allows the bipartite vertex cover problem to be solved in polynomial time.
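A minimal sketch of the bipartite case, assuming the NetworkX library and its bipartite helpers hopcroft_karp_matching and to_vertex_cover (the latter converts a maximum matching into a minimum vertex cover via König's theorem); the example graph is made up.

```python
import networkx as nx
from networkx.algorithms import bipartite

# Bipartite graph with left vertices {0, 1, 2} and right vertices {'a', 'b', 'c'}.
G = nx.Graph([(0, 'a'), (0, 'b'), (1, 'b'), (2, 'c')])
left = {0, 1, 2}

matching = bipartite.hopcroft_karp_matching(G, top_nodes=left)  # maximum matching
cover = bipartite.to_vertex_cover(G, matching, top_nodes=left)  # König's theorem
print(len(matching) // 2, cover)   # matching size 3; a minimum cover of 3 vertices
```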
For tree graphs, a greedy algorithm finds a minimum vertex cover in polynomial time: repeatedly find a leaf of the tree and add its parent to the vertex cover, then delete the leaf and the parent together with all their incident edges, and continue until no edges remain.
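A minimal Python sketch of this greedy procedure (an illustration, not taken from the article; the tree is given as an adjacency list):

```python
# Leaf-based greedy algorithm for minimum vertex cover on a tree.
def tree_vertex_cover(adj):
    adj = {v: set(nbrs) for v, nbrs in adj.items()}   # work on a copy
    cover = set()
    leaves = [v for v, nbrs in adj.items() if len(nbrs) == 1]
    while leaves:
        leaf = leaves.pop()
        if len(adj[leaf]) != 1:            # its edge may already be gone
            continue
        parent = next(iter(adj[leaf]))
        cover.add(parent)                  # take the leaf's parent
        for w in list(adj[parent]):        # delete the parent's incident edges
            adj[w].discard(parent)
            if len(adj[w]) == 1:           # w has become a leaf
                leaves.append(w)
        adj[parent].clear()
    return cover

# Path a-b-c-d: the algorithm returns a minimum vertex cover of size 2.
print(tree_vertex_cover({'a': ['b'], 'b': ['a', 'c'],
                         'c': ['b', 'd'], 'd': ['c']}))
```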
Fixed-parameter tractability
An exhaustive search algorithm can solve the problem in time 2^k · n^{O(1)}, where k is the size of the vertex cover. Vertex cover is therefore fixed-parameter tractable, and if we are only interested in small k, we can solve the problem in polynomial time. One algorithmic technique that works here is called the bounded search tree algorithm, and its idea is to repeatedly choose some vertex and recursively branch, with two cases at each step: place either the current vertex or all its neighbours into the vertex cover. The algorithm for solving vertex cover that achieves the best known asymptotic dependence on the parameter runs in time O(1.2738^k + k·n). The klam value of this time bound (an estimate for the largest parameter value that could be solved in a reasonable amount of time) is approximately 190. That is, unless additional algorithmic improvements can be found, this algorithm is suitable only for instances whose vertex cover number is 190 or less. Under reasonable complexity-theoretic assumptions, namely the exponential time hypothesis, the running time cannot be improved to 2^{o(k)}, even when n is O(k).
However, for planar graphs, and more generally, for graphs excluding some fixed graph as a minor, a vertex cover of size k can be found in time 2^{O(√k)} · n^{O(1)}, i.e., the problem is subexponential fixed-parameter tractable. This algorithm is again optimal, in the sense that, under the exponential time hypothesis, no algorithm can solve vertex cover on planar graphs in time 2^{o(√k)} · n^{O(1)}.
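A minimal sketch of the basic bounded search tree from above, in the common variant that branches on the two endpoints of an uncovered edge; it achieves the simple 2^k · n^{O(1)} bound, not the refined O(1.2738^k + k·n) one.

```python
# Decide whether the graph given by `edges` has a vertex cover of size <= k.
def has_vertex_cover(edges, k):
    if not edges:
        return True                        # nothing left to cover
    if k == 0:
        return False                       # edges remain but no budget left
    u, v = edges[0]                        # pick an arbitrary uncovered edge
    # Branch: put u into the cover, or put v into the cover; in each case
    # remove every edge already covered by the chosen vertex.
    return (has_vertex_cover([e for e in edges if u not in e], k - 1) or
            has_vertex_cover([e for e in edges if v not in e], k - 1))

# A triangle has a vertex cover of size 2 but not of size 1.
triangle = [(0, 1), (1, 2), (0, 2)]
print(has_vertex_cover(triangle, 2), has_vertex_cover(triangle, 1))  # True False
```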
Approximate evaluation
One can find a factor-2 approximation by repeatedly taking both endpoints of an edge into the vertex cover, then removing them from the graph. Put otherwise, we find a maximal matching M with a greedy algorithm and construct a vertex cover C that consists of all endpoints of the edges in M. In the following figure, a maximal matching M is marked with red, and the vertex cover C is marked with blue.
The set C constructed this way is a vertex cover: suppose that an edge e is not covered by C; then M ∪ {e} is a matching and e ∉ M, which is a contradiction with the assumption that M is maximal. Furthermore, if e = {u, v} ∈ M, then any vertex cover, including an optimal vertex cover, must contain u or v (or both); otherwise the edge e is not covered. That is, an optimal cover contains at least one endpoint of each edge in M; in total, the set C is at most 2 times as large as the optimal vertex cover.
This simple algorithm was discovered independently by Fanica Gavril and Mihalis Yannakakis.
More involved techniques show that there are approximation algorithms with a slightly better approximation factor. For example, an approximation algorithm with an approximation factor of 2 − Θ(1/√(log n)) is known. The problem can be approximated with an approximation factor of 2/(1 + δ) in δ-dense graphs.
Inapproximability
No better constant-factor approximation algorithm than the above one is known. The minimum vertex cover problem is APX-complete, that is, it cannot be approximated arbitrarily well unless P = NP. Using techniques from the PCP theorem, Dinur and Safra proved in 2005 that minimum vertex cover cannot be approximated within a factor of 1.3606 for any sufficiently large vertex degree unless P = NP. Moreover, if the unique games conjecture is true then minimum vertex cover cannot be approximated within any constant factor better than 2.
Although finding the minimum-size vertex cover is equivalent to finding the maximum-size independent set, as described above, the two problems are not equivalent in an approximation-preserving way: The Independent Set problem has no constant-factor approximation unless P = NP.
Pseudo Code
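A runnable Python rendering of the factor-2 approximation described under "Approximate evaluation" (an illustrative sketch rather than formal pseudocode): repeatedly take both endpoints of an uncovered edge; the chosen edges form a maximal matching M and the chosen vertices form the cover C.

```python
# Greedy matching-based 2-approximation for vertex cover.
def approx_vertex_cover(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:   # edge not yet covered, so it
            cover.add(u)                        # extends the maximal matching;
            cover.add(v)                        # take both of its endpoints
    return cover

# A 5-cycle: the optimum has size 3; the algorithm returns at most 6 vertices
# (here 4), within the factor-2 guarantee.
cycle = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(approx_vertex_cover(cycle))
```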
Vertex cover in hypergraphs
Given a collection of sets, a set which intersects all sets in the collection in at least one element is called a hitting set, and finding a hitting set of smallest size, the minimum hitting set problem, is NP-hard. By mapping the sets in the collection onto hyperedges, this can be understood as a natural generalization of the vertex cover problem to hypergraphs, which is often just called vertex cover for hypergraphs and, in a more combinatorial context, transversal. The notions of hitting set and set cover are equivalent.
Formally, let H = (V, E) be a hypergraph with vertex set V and hyperedge set E. Then a set S ⊆ V is called a hitting set of H if, for all edges e ∈ E, it holds that S ∩ e ≠ ∅. The computational problems minimum hitting set and hitting set are defined as in the case of graphs. Note that we get back the case of vertex covers for simple graphs if the maximum size of the hyperedges is 2.
If that size is restricted to d, the problem of finding a minimum d-hitting set permits a d-approximation algorithm. Assuming the unique games conjecture, this is the best constant-factor algorithm that is possible and otherwise there is the possibility of improving the approximation to d - 1.
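The d-approximation is the direct generalization of the matching-based algorithm for graphs: as long as some set is not yet hit, take all of its at most d elements. A minimal sketch (illustrative, with a made-up instance):

```python
# d-approximation for hitting set when every set has at most d elements.
def approx_hitting_set(sets):
    hitting = set()
    for s in sets:
        if hitting.isdisjoint(s):   # this set is not hit yet,
            hitting.update(s)       # so add all of its elements
    return hitting

# Example with d = 3; the chosen sets are pairwise disjoint, so the optimum
# needs at least one element per chosen set, giving the factor-d guarantee.
print(approx_hitting_set([{1, 2, 3}, {3, 4, 5}, {5, 6, 1}]))  # e.g. {1, 2, 3}
```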
Fixed-parameter tractability
For the hitting set problem, different parameterizations make sense. The hitting set problem is W[2]-complete for the parameter OPT, that is, it is unlikely that there is an algorithm that runs in time f(OPT) · n^{O(1)} where OPT is the cardinality of the smallest hitting set. The hitting set problem is fixed-parameter tractable for the parameter OPT + d, where d is the size of the largest edge of the hypergraph. More specifically, there is an algorithm for hitting set that runs in time d^{OPT} · n^{O(1)}.
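The d^{OPT} · n^{O(1)} bound comes from a bounded search tree that generalizes the vertex cover branching above: pick a set that is not yet hit and branch on which of its at most d elements joins the hitting set. A minimal sketch:

```python
# Decide whether the set system has a hitting set of size <= k
# (branching factor <= d, depth <= k, hence roughly d^k leaves).
def has_hitting_set(sets, k):
    if not sets:
        return True                        # every set is hit
    if k == 0 or not sets[0]:
        return False                       # budget exhausted, or an empty set
    return any(
        has_hitting_set([t for t in sets if x not in t], k - 1)
        for x in sets[0]                   # branch over the chosen set's elements
    )

# The triangle viewed as a 2-uniform hypergraph has a hitting set of size 2.
print(has_hitting_set([{0, 1}, {1, 2}, {0, 2}], 2))  # True
```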
Hitting set and set cover
The hitting set problem is equivalent to the set cover problem: An instance of set cover can be viewed as an arbitrary bipartite graph, with sets represented by vertices on the left, elements of the universe represented by vertices on the right, and edges representing the inclusion of elements in sets. The task is then to find a minimum cardinality subset of left-vertices which covers all of the right-vertices. In the hitting set problem, the objective is to cover the left-vertices using a minimum subset of the right vertices. Converting from one problem to the other is therefore achieved by interchanging the two sets of vertices.
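The conversion can be sketched as transposing the incidence structure (an illustrative snippet with made-up names): a family of sets over a universe becomes, for each element, the collection of sets containing it, and a set cover of the original instance is exactly a hitting set of the transposed one.

```python
# Transpose {set_name: elements} into {element: names of sets containing it}.
def transpose(instance):
    out = {}
    for name, elements in instance.items():
        for e in elements:
            out.setdefault(e, set()).add(name)
    return out

set_cover_instance = {'S1': {1, 2}, 'S2': {2, 3}, 'S3': {3, 4}}
print(transpose(set_cover_instance))
# e.g. {1: {'S1'}, 2: {'S1', 'S2'}, 3: {'S2', 'S3'}, 4: {'S3'}}
```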
Applications
An example of a practical application involving the hitting set problem arises in efficient dynamic detection of race conditions. In this case, each time global memory is written, the current thread and set of locks held by that thread are stored. Under lockset-based detection, if later another thread writes to that location and there is not a race, it must be because it holds at least one lock in common with each of the previous writes. Thus the size of the hitting set represents the minimum lock set size to be race-free. This is useful in eliminating redundant write events, since large lock sets are considered unlikely in practice.
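A small illustrative computation (hypothetical lock names, brute force for clarity): given the locksets recorded for previous writes to one location, the minimum hitting set is the smallest set of locks sharing at least one lock with every recorded write.

```python
# Brute-force minimum hitting set over recorded locksets.
from itertools import combinations

locksets = [{'A', 'B'}, {'B', 'C'}, {'B', 'D'}]
locks = sorted(set().union(*locksets))
for size in range(len(locks) + 1):
    hit = next((set(c) for c in combinations(locks, size)
                if all(s & set(c) for s in locksets)), None)
    if hit is not None:
        print(hit)    # {'B'}: a single common lock would suffice here
        break
```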