Levenshtein distance
In information theory and computer science, the Levenshtein distance or edit distance between two strings is given by the minimum number of operations needed to transform one string into the other, where an operation is an insertion, deletion, or substitution of a single character. It is named after the Russian scientist Vladimir Levenshtein, who considered this distance in 1965. It is useful in applications that need to determine how similar two strings are, such as spell checkers.
For example, the Levenshtein distance between "kitten" and "sitting" is 3, since these three edits change one into the other, and there is no way to do it with fewer than three edits:
- kitten
- sitten (substitution of 's' for 'k')
- sittin (substitution of 'i' for 'e')
- sitting (insert 'g' at the end)
It can be considered a generalization of the Hamming distance, which is used for strings of the same length and only considers substitution edits. There are also further generalizations of the Levenshtein distance that consider, for example, exchanging two characters as an operation.
The algorithm
A commonly used bottom-up dynamic programming algorithm for computing the Levenshtein distance involves the use of an (n + 1) × (m + 1) matrix, where n and m are the lengths of the two strings. Here is pseudocode for a function LevenshteinDistance that takes two strings, str1 of length lenStr1, and str2 of length lenStr2, and computes the Levenshtein distance between them:
int LevenshteinDistance(char str1[1..lenStr1], char str2[1..lenStr2])
    // d is a table with lenStr1+1 rows and lenStr2+1 columns
    declare int d[0..lenStr1, 0..lenStr2]
    // i and j are used to iterate over str1 and str2
    declare int i, j, cost

    for i from 0 to lenStr1
        d[i, 0] := i
    for j from 0 to lenStr2
        d[0, j] := j

    for i from 1 to lenStr1
        for j from 1 to lenStr2
            if str1[i] = str2[j] then cost := 0
                                 else cost := 1
            d[i, j] := minimum(
                           d[i-1, j  ] + 1,     // deletion
                           d[i  , j-1] + 1,     // insertion
                           d[i-1, j-1] + cost   // substitution
                       )

    return d[lenStr1, lenStr2]
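The pseudocode transcribes fairly directly into Python. The following sketch is our own illustration (the function name levenshtein_distance is our choice, and 0-based string indexing replaces the 1-based indexing above); it builds the full matrix, while a space-optimized variant appears under Sample implementations below:

def levenshtein_distance(str1, str2):
    # Full-matrix version of the pseudocode above:
    # d[i][j] holds the distance between str1[:i] and str2[:j].
    len1, len2 = len(str1), len(str2)
    d = [[0] * (len2 + 1) for _ in range(len1 + 1)]
    for i in range(len1 + 1):
        d[i][0] = i  # i deletions reduce str1[:i] to the empty string
    for j in range(len2 + 1):
        d[0][j] = j  # j insertions build str2[:j] from the empty string
    for i in range(1, len1 + 1):
        for j in range(1, len2 + 1):
            cost = 0 if str1[i - 1] == str2[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len1][len2]

For example, levenshtein_distance("kitten", "sitting") returns 3, matching the example in the introduction.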
The invariant maintained throughout the algorithm is that we can transform the initial segment s[1..i] into t[1..j] using a minimum of d[i,j] operations. At the end, the bottom-right element of the array contains the answer.
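As a worked illustration (our own, reusing the example from the introduction), the completed table for s = "kitten" and t = "sitting" is shown below; each entry d[i,j] is the distance between the first i characters of "kitten" and the first j characters of "sitting", and the bottom-right entry is the answer, 3:

          s  i  t  t  i  n  g
       0  1  2  3  4  5  6  7
    k  1  1  2  3  4  5  6  7
    i  2  2  1  2  3  4  5  6
    t  3  3  2  1  2  3  4  5
    t  4  4  3  2  1  2  3  4
    e  5  5  4  3  2  2  3  4
    n  6  6  5  4  3  3  2  3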
Possible improvements
Possible improvements to this algorithm include:
- We can adapt the algorithm to use less space, O(m) instead of O(mn), since it only requires that the previous row and current row be stored at any one time.
- We can store the number of insertions, deletions, and substitutions separately, or even the positions at which they occur, which is always j.
- We can give different penalty costs to insertion, deletion, and substitution (see the sketch after this list).
- The initialization of d[i,0] can be moved inside the main outer loop.
- This algorithm parallelizes poorly, due to a large number of data dependencies. However, all the cost values can be computed in parallel, and the algorithm can be adapted to perform the minimum function in phases to eliminate dependencies.
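As an illustration of the penalty-cost point, here is a small sketch (our own, not from the original article; the parameter names ins_cost, del_cost and sub_cost are illustrative) of the same dynamic program with separate, caller-supplied costs for each operation:

def weighted_distance(s, t, ins_cost=1, del_cost=1, sub_cost=1):
    # Levenshtein distance with separate penalty costs per operation.
    d = [[0] * (len(t) + 1) for _ in range(len(s) + 1)]
    for i in range(1, len(s) + 1):
        d[i][0] = i * del_cost          # delete every character of s[:i]
    for j in range(1, len(t) + 1):
        d[0][j] = j * ins_cost          # insert every character of t[:j]
    for i in range(1, len(s) + 1):
        for j in range(1, len(t) + 1):
            sub = 0 if s[i - 1] == t[j - 1] else sub_cost
            d[i][j] = min(d[i - 1][j] + del_cost,   # deletion
                          d[i][j - 1] + ins_cost,   # insertion
                          d[i - 1][j - 1] + sub)    # substitution (or match)
    return d[len(s)][len(t)]

With the default costs of 1, weighted_distance coincides with the ordinary Levenshtein distance.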
Proof of correctness
As mentioned earlier, the invariant is that we can transform the initial segment s[1..i] into t[1..j] using a minimum of d[i,j] operations. This invariant holds since:
- It is initially true on row and column 0, because s[1..i] can be transformed into the empty string t[1..0] by simply dropping all i characters. Similarly, we can transform s[1..0] to t[1..j] by simply adding all j characters.
- The minimum is taken over three distances, each of which is feasible:
  - If we can transform s[1..i] to t[1..j-1] in k operations, then we can simply add t[j] afterwards to get t[1..j] in k+1 operations.
  - If we can transform s[1..i-1] to t[1..j] in k operations, then we can do the same operations on s[1..i] and then remove the original s[i] at the end in k+1 operations.
  - If we can transform s[1..i-1] to t[1..j-1] in k operations, we can do the same to s[1..i] and then do a substitution of t[j] for the original s[i] at the end if necessary, requiring k+cost operations.
- The number of operations required to transform s[1..n] into t[1..m] is of course the number required to transform all of s into all of t, and so d[n,m] holds our result.
This proof fails to validate that the number placed in d[i,j] is in fact minimal; this is more difficult to show, and involves an argument by contradiction in which we assume d[i,j] is smaller than the minimum of the three, and use this to show that one of the three is not minimal.
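The three feasible transformations in the proof correspond one-to-one to the three branches of a direct recursive formulation. The following sketch (our own illustration; the names recursive_distance and go are not from the original article) makes that correspondence explicit, memoizing subproblems so that it matches the table-based algorithm rather than recomputing them:

from functools import lru_cache

def recursive_distance(s, t):
    # Each branch below mirrors one of the three feasible transformations
    # used in the proof; lru_cache memoizes subproblems so the recursion
    # matches the table-based algorithm instead of running in exponential time.
    @lru_cache(maxsize=None)
    def go(i, j):
        if i == 0:
            return j                        # insert the remaining j characters of t
        if j == 0:
            return i                        # delete the remaining i characters of s
        cost = 0 if s[i - 1] == t[j - 1] else 1
        return min(go(i, j - 1) + 1,        # append t[j] (insertion)
                   go(i - 1, j) + 1,        # remove s[i] (deletion)
                   go(i - 1, j - 1) + cost) # substitute t[j] for s[i] if needed
    return go(len(s), len(t))

For example, recursive_distance("kitten", "sitting") returns 3, agreeing with the table-based algorithm.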
Upper and lower bounds
The Levenshtein distance has several simple upper and lower bounds that are useful in applications that compute many distances and compare them; a small sketch combining several of these bounds follows the list. These include:
- It is always at least the difference of the sizes of the two strings.
- It is at most the length of the longer string.
- It is zero if and only if the strings are identical.
- If the strings are the same size, the Hamming distance is an upper bound on the Levenshtein distance; otherwise the Hamming distance plus the difference in sizes is an upper bound.
- If the strings are called s and t, the number of characters found in s but not in t is a lower bound.
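These bounds are cheap to compute and can be used to skip the full dynamic program when only a rough comparison or a threshold is needed. The following sketch is our own (the helper name bounds is illustrative) and combines several of the bounds above:

def bounds(s, t):
    # Cheap lower and upper bounds on the Levenshtein distance between s and t.
    lower = abs(len(s) - len(t))                           # difference of the sizes
    lower = max(lower, sum(1 for ch in s if ch not in t))  # characters of s missing from t
    upper = max(len(s), len(t))                            # length of the longer string
    if len(s) == len(t):
        # For equal-length strings, the Hamming distance is an upper bound.
        upper = min(upper, sum(1 for a, b in zip(s, t) if a != b))
    return lower, upper

For example, bounds("kitten", "sitting") returns (2, 7), which brackets the true distance of 3.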
Sample implementations
Python
From http://hetland.org/python/distance.py by Magnus Lie Hetland
def distance(a, b):
    "Calculates the Levenshtein distance between a and b."
    n, m = len(a), len(b)
    if n > m:
        # Make sure n <= m, to use O(min(n,m)) space
        a, b = b, a
        n, m = m, n

    current = list(range(n + 1))
    for i in range(1, m + 1):
        previous, current = current, [i] + [0] * n
        for j in range(1, n + 1):
            add, delete = previous[j] + 1, current[j - 1] + 1
            change = previous[j - 1]
            if a[j - 1] != b[i - 1]:
                change = change + 1
            current[j] = min(add, delete, change)

    return current[n]
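A quick check against the example from the introduction (a usage note of our own):

>>> distance("kitten", "sitting")
3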
Haskell
Haskell's pattern matching makes a direct recursive implementation particularly concise, although without explicit memoization it recomputes subproblems and runs in exponential time:
editDistance :: String -> String -> Int
editDistance [] [] = 0
editDistance s  [] = length s
editDistance [] t  = length t
editDistance (s:ss) (t:ts) =
  minimum [ (if s == t then 0 else 1) + editDistance ss ts  -- substitution
          , 1 + editDistance ss (t:ts)                      -- deletion of s
          , 1 + editDistance (s:ss) ts                      -- insertion of t
          ]
Scheme
Uses SRFI 25 (multi-dimensional arrays) and SRFI 42 (eager comprehensions).
(define add1 (lambda (x) (+ x 1)))
(define sub1 (lambda (x) (- x 1)))

(define levenshtein-distance
  (lambda (s1 s2)
    (let* ((width (add1 (string-length s1)))
           (height (add1 (string-length s2)))
           ;; d is a height-by-width table of distances
           (d (make-array (shape 0 height 0 width) 0)))
      ;; first row and column: distance to the empty string
      (do-ec (:range x width) (array-set! d 0 x x))
      (do-ec (:range y height) (array-set! d y 0 y))
      ;; fill in the rest of the table
      (do-ec (:range x (string-length s1))
             (:range y (string-length s2))
             (array-set! d (add1 y) (add1 x)
                         (min (add1 (array-ref d y (add1 x)))   ; insertion
                              (add1 (array-ref d (add1 y) x))   ; deletion
                              (+ (array-ref d y x)              ; substitution
                                 (if (eqv? (string-ref s1 x)
                                           (string-ref s2 y))
                                     0
                                     1)))))
      (array-ref d (sub1 height) (sub1 width)))))
External links
- Levenshtein Distance, in Three Flavors, by Michael Gilleland (http://www.merriampark.com/ld.htm)
- NIST's Dictionary of Algorithms and Data Structures: Levenshtein Distance (http://www.nist.gov/dads/HTML/Levenshtein.html)
- CSE 590BI, Winter 1996 Algorithms in Molecular Biology (http://www.cs.washington.edu/education/courses/590bi/96wi/) The algorithms from lectures 2, 3 and 4 are based on the Levenshtein distance but implement a different scoring function. The Haskell example was based on these notes.
- Levenshtein Distance - visualized (http://www-igm.univ-mlv.fr/~lecroq/seqcomp/node2.html)
- Distance between strings - Levenshtein and Hamming (http://www.cut-the-knot.org/do_you_know/Strings.shtml)
- Another Edit Distance Visualization (very fast) (http://www.cs.unm.edu/~luger/ai-final/demos.html)