Talk:Computable number
I removed the following incorrect definition from the main "computable number" page. The set of numbers described by this definition is not even closed under addition (plus, you would get a different set of numbers in different bases). -- Cwitty
A real number is called computable if its digit sequence can be produced by some algorithm. The algorithm takes a natural number n as input and produces the n-th digit of the real number's decimal expansion as output.
- Thanks for the catch! I thought this was the original definition given by Turing, but apparently not... AxelBoldt 17:21, 23 Feb 2004 (UTC)
- AxelBoldt has pointed out (on my talk page) that the definition I removed above is similar to Turing's original definition (although, to be nitpicky, Turing used binary rather than decimal). I apologize for calling it "incorrect" above. Cwitty 02:40, 4 Mar 2004 (UTC)
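The difficulty with the digit-sequence definition that the discussion above circles around is the "carry problem": even when the digits of x and y are each produced by an algorithm, no algorithm can produce the digits of x + y after reading only finitely many input digits, because a carry can be decided arbitrarily late. A minimal sketch (my own illustration, not from Turing or the article) shows two pairs of digit streams that agree on a long prefix yet whose sums start with different digits:

```python
# Illustration of the carry problem for the digit-sequence definition.
# Two addends agree on their first N digits, yet the first digit of the
# sum differs -- so no algorithm reading a bounded prefix of the inputs
# can commit to the first output digit.

from fractions import Fraction

def from_digits(digits):
    """Exact value of 0.d1 d2 d3 ... as a Fraction."""
    return sum(Fraction(d, 10 ** (i + 1)) for i, d in enumerate(digits))

N = 20
x_a = [3] * N + [4]      # 0.333...334  (N threes, then a 4)
x_b = [3] * N + [2]      # 0.333...332  (same first N digits as x_a)
y   = [6] * (N + 1)      # 0.666...666

sum_a = from_digits(x_a) + from_digits(y)   # equals exactly 1
sum_b = from_digits(x_b) + from_digits(y)   # just below 1

print(int(sum_a))  # 1 -- integer digit of the sum is 1
print(int(sum_b))  # 0 -- integer digit of the sum is 0
```

Since N can be made as large as desired, the two inputs are indistinguishable on any fixed prefix. Note this shows the addition map is not computable *uniformly* from digit representations; the set of reals with computable expansions is a separate question.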
From the article:
- "and arguably this field contains all the numbers we ever need in practice"
This, of course, depends on physics. Does anyone know anything about the current state of research in this area (e.g. Pour-El and Richards + ensuing work)? It would be great to have a separate article on this topic. -- Pde