Talk:Simulated annealing
On the intro section
This was the original reading of the introductory section:
- Simulated annealing is a generic probabilistic heuristic approach to global optimization problems. It locates a good approximation to the global optimum in a large search space with reasonable probability. It was independently invented by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi in 1983, and by V. Cerny in 1985.
- The name comes from annealing in metallurgy, a technique involving heating and controlled cooling of a material to improve its structural properties by removing crystal defects. In this process the metal reaches its most stable configuration, minimizing its total internal energy.
Three remarks:
- Shouldn't this be called a META-heuristic? As I understand it, a heuristic is a non-guaranteed procedure that applies to a specific problem, e.g. "pick the nearest unvisited node" would be a heuristic for Euclidean TSP; whereas a meta-heuristic would be a general approach that can be used for almost any problem -- such as simulated annealing, genetic algorithms, etc.
- I've never heard of a meta-heuristic. I would not call it a heuristic, but a heuristic method (as opposed to a deterministic method, like SQP or steepest descent). moink 04:37, 2 Aug 2004 (UTC)
- The claim that simulated annealing finds a GOOD approximation to the global optimum with REASONABLE probability is rather misleading (and, strictly speaking, meaningless). Simulated annealing may satisfy some users, but it is easy to make up instances where it will deliver, say, a solution 2<sup>30</sup> times worse than the global optimum, with probability 1 − 2<sup>−30</sup>, even with a cooling period of 2<sup>30</sup> steps. There is no reason to assume that the reader's optimization problems will be of the first type, and not of the second type.
- Yes, this should be more clearly explained. We should give examples of the types of problems in which it is likely to work and those in which it is likely not to work. moink 04:37, 2 Aug 2004 (UTC)
- Metallurgical annealing does NOT "minimize the total internal energy", by a long shot. Ignoring the nuclear energy and quantum noise, it will usually lower the total bond energy by only a little bit, and stop still very far from the optimum. Even if cooling is slow enough to turn the whole piece into a single perfect crystal, if the crystal axes happen to be oriented in a non-optimum way, the chances of them flipping to the optimum orientation are nil.
Jorge Stolfi 20:34, 30 Mar 2004 (UTC)
- Feel free to fix it. moink 04:37, 2 Aug 2004 (UTC)
This statement was deleted:
- If T is infinity, it moves around randomly.
It is not correct for the "classical" transition probability function, because that version would always take a downhill move, even when T = ∞. (The phrase is valid for a physical system, though; which only underscores the bogosity of the "physical analogy" claims.)
Jorge Stolfi 03:49, 11 Jun 2004 (UTC)
- Yeah, it's more correct to say that the method was inspired by metallurgy than that it is a direct analog. moink 04:37, 2 Aug 2004 (UTC)
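For concreteness, here is a minimal Python sketch (not from the article; function names are illustrative) of the "classical" acceptance rule under discussion, so the behaviour at large T can be checked directly:
<pre>
import math
import random

def accept_probability(e_old, e_new, T):
    """'Classical' acceptance rule: a downhill move (e_new < e_old) is taken
    with probability 1 at every temperature; an uphill move is taken with
    probability exp(-(e_new - e_old) / T)."""
    if e_new < e_old:
        return 1.0
    return math.exp(-(e_new - e_old) / T)

def accept(e_old, e_new, T):
    """Randomized decision using the probability above."""
    return random.random() < accept_probability(e_old, e_new, T)

# Downhill moves are accepted with probability exactly 1 at any T, while
# uphill moves are accepted with probability approaching 1 as T grows.
for T in (0.1, 1.0, 100.0, 1e9):
    print(T, accept_probability(2.0, 1.0, T), accept_probability(1.0, 2.0, T))
</pre>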
This too was deleted:
- Nonetheless, simulated annealing can be very effective at finding good suboptimal solutions.
This is definitely POV (or, rather, a meaningless statement — it all depends on what one considers "very effective" and "good solution").
In fact, it seems that the only true claim that can be made about SA is that, unlike the greedy method, it will not get stuck in the first local minimum. One cannot even claim that it will outperform the trivial method ("generate N random states, pick the best one").
Jorge Stolfi 04:46, 11 Jun 2004 (UTC)
- No, but one can say that in a certain class of problems, it will with high probability outperform a random search. moink 04:37, 2 Aug 2004 (UTC)
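For reference, the trivial baseline mentioned above ("generate N random states, pick the best one") looks like this as a Python sketch; f and random_state are placeholders for whatever the problem supplies:
<pre>
import math
import random

def random_search(f, random_state, n):
    """Trivial baseline: draw n independent random states, keep the best.
    f is the objective to minimize; random_state() draws one candidate.
    Both are problem-specific placeholders."""
    best = random_state()
    best_e = f(best)
    for _ in range(n - 1):
        s = random_state()
        e = f(s)
        if e < best_e:
            best, best_e = s, e
    return best, best_e

# Toy usage: a 1-D objective with many local minima.
best, best_e = random_search(lambda x: x * x + 10 * math.sin(3 * x),
                             lambda: random.uniform(-10.0, 10.0), 1000)
</pre>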
I am done for a while. There is still a major problem in the article, which is the complicated interaction between the neighbor selection function and the energy-determined transition probabilities. They should perhaps be merged into a single "move" function.
Jorge Stolfi 06:48, 11 Jun 2004 (UTC)
- Great! Feel free to change it. moink 04:37, 2 Aug 2004 (UTC)
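One possible reading of the suggested merge, as a hedged Python sketch (energy and neighbour are illustrative, problem-specific callables): the neighbour selection and the temperature-dependent acceptance test are folded into a single move step.
<pre>
import math
import random

def move(state, energy, neighbour, T):
    """One merged step: pick a neighbour and decide acceptance in a single
    function, rather than exposing neighbour selection and the acceptance
    probability as separate pieces of machinery."""
    candidate = neighbour(state)
    delta = energy(candidate) - energy(state)
    if delta <= 0 or random.random() < math.exp(-delta / T):
        return candidate  # move accepted
    return state          # move rejected; stay at the current state
</pre>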
Old temp page still in use?
There's a page sitting in Simulated annealing/Temp that hasn't seen any significant activity since it was created in October 2004, and isn't linked to anywhere. Does anyone know what's up with it? Bryan 07:10, 29 May 2005 (UTC)
Credit for Metropolis et al?
Hello. Wasn't simulated annealing introduced first in the article:
Metropolis, N., A. Rosenbluth, M. Rosenbluth, A. Teller, E. Teller, "Equation of State Calculations by Fast Computing Machines", J. Chem. Phys., 21(6), 1087–1092, 1953.
?
- As I understand it, no. The Metropolis algorithm is a way of sampling from a distribution
- <math>P(x) = {1 \over Z} \exp(-\beta E(x))</math>
- for some value of inverse temperature <math>\beta</math> and for some energy function <math>E</math> that is a function of a vector <math>x</math>. It was typically used to compute expected values of functions over <math>x</math>. I believe that <math>\beta</math> was held fixed. Kirkpatrick realized (and proved) that if <math>\beta</math> increases slowly (i.e. the temperature decreases slowly) as time approaches infinity, then the expected value of <math>x</math> approaches the global minimizer of <math>E</math>. -- hike395 02:01, 4 Jun 2005 (UTC)
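To make the distinction concrete, here is a minimal Python sketch (illustrative names only, not from either paper): the Metropolis update samples from P(x) proportional to exp(−βE(x)) at fixed β, while simulated annealing reuses the same update with a schedule of increasing β (decreasing temperature).
<pre>
import math
import random

def metropolis_step(x, E, proposal, beta):
    """One Metropolis update at fixed inverse temperature beta:
    propose x' and accept with probability min(1, exp(-beta * (E(x') - E(x))))."""
    x_new = proposal(x)
    delta = E(x_new) - E(x)
    if delta <= 0 or random.random() < math.exp(-beta * delta):
        return x_new
    return x

def metropolis_sampler(x0, E, proposal, beta, n_steps):
    """Metropolis (1953) usage: beta held fixed, so the samples approximate
    P(x) = exp(-beta * E(x)) / Z, e.g. to estimate expectations over x."""
    x = x0
    samples = []
    for _ in range(n_steps):
        x = metropolis_step(x, E, proposal, beta)
        samples.append(x)
    return samples

def simulated_annealing(x0, E, proposal, betas):
    """Kirkpatrick-style usage: the same update, but beta increases
    (temperature decreases) according to a schedule, steering the chain
    toward low-energy states rather than sampling at equilibrium."""
    x = x0
    for beta in betas:  # e.g. a slowly increasing schedule
        x = metropolis_step(x, E, proposal, beta)
    return x
</pre>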