Talk:History of computing hardware
History of computing hardware is a featured article, which means it has been identified as one of the best articles produced by the Wikipedia community. If you see a way this page can be updated or improved without compromising previous work, feel free to contribute.
2001 talk
I doubt very much now that Harvard Mark I was fully programmable. Later models could switch conditionally from one paper roll to another, but since I don't believe it could rewind the paper rolls, no actual loops are possible and it's not Turing complete. But I'm not sure. Anybody know details about the Mark I?
Oh, and I just read that the Manchester Mark I was actually the first functional von Neumann machine, even before EDVAC -- but of course based on EDVAC's ideas. --AxelBoldt
I believe you are correct about the IBM-Harvard Mark I. This machine, by the way, was not built at or by Harvard. It was built for Harvard (and the U.S. Navy) by IBM.
My understanding is that the first operational stored program computer was the "Manchester Baby Mark I", a test machine for the Williams-tube storage technology, not the Manchester Mark I itself. The EDSAC at Cambridge appears to have preceded the Manchester Mark I as the first "practical" stored program computer in operation.
Axel: An electromechanical computer necessarily uses some electronics. Thus "electro". But I understand what you mean about the electronics/electromechanical distinction.
Aiken directed the construction of the ASCC by IBM engineers at the IBM Endicott labs. Construction was completed in 1943. It was moved to Harvard, and operation began May 1944. [1] (http://www-groups.dcs.st-andrews.ac.uk/~history/Mathematicians/Aiken.html)
As stated, the EDVAC was never completed--so all EDVAC-based computers were "before EDVAC". The "Baby" was first based on the EDVAC design that got a program running. --The Cunctator
I gotta say, the Wiki method really works--this entry has gotten amazingly better in a very short period of time. It's still a little too discursive (some of the specificity would be better in stand-alone entries), but it's highly informative and readable. --The Cunctator
Not to disagree, but there's still a whole lot missing. No mention of Whirlwind, SAGE, PLATO, to give just a few examples.
- Is that a disagreement or not? SAGE is mentioned in the history of networking... --The Cunctator
It seems like the end of the article is the original timeline table visible at the top of this page, and reads very much like a timeline. Wouldn't more of an overview and synthesis be appropriate, considering we have the (very good IMHO) other timeline?
I agree, particularly the latter part of the article has too many dates, names and details obscuring the general flow of progress. --AxelBoldt
I don't want to be argumentative, but I thought the new article didn't tell much of anything before WWII or after 1970, let alone flow of progress. The flight control system of the F14, while interesting, was hardly a landmark computer.
Yes, there was a fair amount where I just went in and pasted missing stuff from the old page. However, I feel it is more important to have date-filled placeholders than nothing at all. Now that some base data is there, anyone can go in and rewrite/rearrange it. By all means, feel free to edit as you see appropriate. The power of Wiki :-) --Alan Millar
Names, dates, and details are good things, but they need to be pushed down into more detailed articles on more specific topics. At the same time, an overview/summary/synthesis needs to be presented at this level. But my guess is it's easier to do this bottom-up rather than top-down. In other words, collect all the detailed information first, then refactor into appropriate levels of detail.
Also, should this article cover software as well as hardware? -HWR
Of course, hardware w/o software is scrap metal. The question is whether it's tangible enough to produce records. --Yooden
Anyone can refactor (a basic design feature of Wiki), but only if there is some information to refactor, so I think the bottom-up approach is necessary.
- But all the information is already on the Computing timeline page, so why repeat it here? I think this article should have a bird's eye view on Computing history, just outlining the developments, and not listing anecdotes such as ads bought by certain companies at certain sports games. --AxelBoldt
As to Swiss clocks: the essence of computing is not the addition and subtraction
of numbers, although it grew out of it and is a necessary part of it. The essence
of computing is the execution
of a sequence of instructions, and in that respect modern computers have as much in
common with Swiss clocks as the abacus. And no, I'm not recommending
removing the reference to the abacus :-) --Alan Millar
- Swiss clocks neither process information nor can be programmed. They are just fancy mechanical devices, like all mechanical clocks. I don't see any relation to the history of computing except maybe that some early mechanical calculators used similar mechanisms as mechanical clocks (why Swiss?). Also, why are they mentioned in the paragraph about programmability? --AxelBoldt
What about music boxes? They're programmed to play tunes. -HWR
They have a single sequence, as do player pianos, and player pianos can even use a different paper roll to play a different tune. In that respect, the music box mechanically is a predecessor to the Jacquard loom. The Swiss clocks had multiple sequences of actions, where a main cog would activate other cogs to order different actions. The first GOSUB? :-) --Alan Millar
Actually, there are music boxes that play tunes from interchangeable discs. I don't know the chronology of this however.
BTW, is this article restricted to the history of DIGITAL computers? Analog computers don't generally execute sequences of instructions. -HWR
"IBM decided to enter the PC market ..., with the IBM XT" is not correct -- the XT was their second machine, with the hard drive.
That's correct--I'll change it. The first one was simply called the "IBM PC". Some mention of Compaq and the beginnings of the clone market in that era seems appropriate too. --LDC
I'm afraid this entry is getting too timeline-y...but I see that others are aware of that. Looks like we need to start thinking about some more subentries...anyone have any suggestions? --The Cunctator
Unfortunately the timeline here has many inaccuracies and omissions of historical importance: 1965: IBM System 360 (first OS); 1968 first mouse/window system demo; 1973: CP/M first micro OS; 1969 Intel 4004; 1977 Commodore Pet & TRS 80; 1978 Atari 400/800; 1979 Motorola 68000 32 bit CPU (w. 16 bit data and 24 bit address bus); 1981 Commodore Vic20 & IBM PC & Xerox Star (w. GUI/Mouse/Ethernet...); 1982 Commodore 64 with 64k RAM $600 & Timex Sinclair 2K RAM $99; 1983 1 million Commodore Vic20s and 1 million Apple IIs sold; 1985 Commodore Amiga with multitasking/Color GUI/accelerated video/stereo sound/3.5" floppy $1200; 1988 7 million Commodore 64 and 128 computers sold.... --Jonathan--
Feel free to enter whatever you think is missing to Computing timeline, not to History of computing. --AxelBoldt
Ack! It's getting insanely more timeline-y! I'm thinking of paring. Please, everyone, notice Computing timeline. History of computing isn't supposed to list every computer, but discuss the intellectual development of the engineering/science of computing. --The Cunctator
Moved from /Permission-subpage:
I have obscured the email addresses in the message below in an obvious way. --AxelBoldt
Date: Wed, 22 Aug 2001 01:25:29 +0100
From: Stephen White <swhite@OBSCURED4.ox.compsoc.net>
To: Axel Boldt <Axel.Boldt@OBSCURED5.metrostate.edu>
Subject: Re: Computing history timeline for GNU encyclopedia

---- Original Message ----
> From Axel Boldt <Axel.Boldt@OBSCURED8.metrostate.edu>
> Date: Monday, 20 Aug 2001, 21:25
>
> I noticed that you have the definite computing history timeline on
> your web site. Maybe you have heard about the GNU style
> encyclopedia at http://wikipedia.com ; we currently have only a weak
> entry about computing history (in fact some of it seems to be
> illegally copied from your site). Would you consider donating your
> timeline to the Wikipedia? You can enter and edit the article about
> computing history yourself, just go to
> http://wikipedia.com/wiki/History_of_computers and click on "edit this
> page right now".

Ok.
First I'll give you permission to use whatever you want from my computing history site in the encyclopedia. I'd appreciate it if the link http://www.ox.compsoc.net/~swhite/history.html is retained for people to get the most up-to-date version of my information; however, since the GPL doesn't allow for such provisos, this will remain an informal "gentleman's agreement" and is not legally required for the inclusion of material from my site in your encyclopedia or derived works.

On the second front, I'm rather busy moving house at the end of the week and I've been planning a bit of an update to my computing history pages for a while - so I'm not sure when I'll have time to look closely at your history of computing entry and possibly update it. However, I'll leave this email in my pending folder in the hope that I'll have time to do so in the not-too-distant future.

Good luck with the project, -- Stephen White, Oxford University Computing Society System Administrator, http://ox.compsoc.net/~swhite/ PGP Key ID: 0xC79E5B6A <swhite@OBSCURED9.ox.compsoc.net>
Fantastic!!! --User:LMS
- See also : History of computing
2002 talk
Is there a reason for all the bold entries in the article? They don't seem consistent. I'd like to remove them. Aldie 15:53 Nov 29, 2002 (UTC)
- It appears that the original authors were trying to break up the long, long blocks of type by bolding some of the names. Crossheads are the way to do this. Change all the existing === h3 heads to the correct == h2 heads, debold all the names, then go back and add some === h3 heads that say things like TV Typewriter, etc. Ortolan88
2003 talk
Noyce and Kilby were independent inventors of the Integrated Circuit. Intel invented the Microprocessor, of course, but not Noyce. 169.207.117.23 21:48, 25 Nov 2003 (UTC)
This page and the timelines are incorrectly titled. They seem to be about history of technology used in computing rather than history of computing itself. Obviously, most computing until recently was done with pencil and paper, and that is not mentioned in these timelines. Would anyone object to moving this page to history of computing technology and starting a separate page that is about computing, not about machines used in computing? The statement that the "computing era" began only when computing machinery began is idiotic. Michael Hardy 21:48, 30 Nov 2003 (UTC)
Slide rules are not even mentioned on this page. Really, I'm beginning to think people trained in computer science should not be allowed in public places, in the interest of public safety. Michael Hardy 21:52, 30 Nov 2003 (UTC)
- Done. 169.207.88.95 16:42, 1 Feb 2004 (UTC)
The article titled history of computing hardware is fairly long, but no one has attempted to write a history of computing itself on Wikipedia. Such an article would treat algorithms to be executed with pencil and paper, with or without the aid of tables, as well as computing with abaci, slide rules, or machines of any kind. Michael Hardy
- I'd like that article (history of computing) to be renamed to "History of computing methods" for clarity. Tempshill 00:37, 4 Dec 2003 (UTC)
Why not move this article to History of computers, rather than the current and cumbersome History of computing hardware? Yes, before circa 1950, "computer" meant a person who did mathematical computation, and so one could argue that "History of computers" could refer to either computers (people) or the computers (machines)...but that would be a fairly trifling objection, I think. --Sewing 21:58, 17 Dec 2003 (UTC)
- Hear, hear. A good suggestion for clarity and simplicity. --Wernher 22:05, 17 Dec 2003 (UTC)
- Well, I want to change it but am reluctant to act, for 2 reasons: (1) There are a lot of pages that link to this one (which means manually changing them if I want to be a good Wikipedian); and (2) I may be wading into something I will later regret. I'll take a wait-and-see attitude for now... --Sewing 18:17, 18 Dec 2003 (UTC)
Title dispute
Originally listed at VfD
- History of computers. It is currently a redirect to History of computing hardware. I couldn't move the 2nd article to the 1st, so I removed the redirect text in the 1st article, but I still couldn't do the move. "History of computing hardware" is a cumbersome attempt by a mathematician to distinguish the history of computers from the History of computing (the article's former title), which encompasses not only computers but pen and paper as well. His point is valid, but the new title he chose for the article is unnecessarily awkward. --Sewing 17:14, 21 Dec 2003 (UTC)
- I am thinking whether History of computation is a better title than History of computing. btw There is a Timeline of computing, too. Optim 17:47, 21 Dec 2003 (UTC)
- I agree History of computing is not ideal. But isn't History of computation also awkward? Anyhow, it goes back to Michael Hardy's argument that "computing" (and "computation") is not just about computers but about mathematical techniques that precede computers. I think History of computers is the best option: it is simple and unambiguous. --Sewing 18:08, 21 Dec 2003 (UTC)
- History of computation still seems nice and more correct to me. Optim 19:01, 21 Dec 2003 (UTC)
- I think the term computation is more often (academically) used for the theoretical side of things (algorithms, complexity,etc.), computers seems better for the practical side to me. --Imran 22:17, 21 Dec 2003 (UTC)
- That's right. We can have a Computation article for the academic theoretical history and a Computers article for practical-business computing. how do u think? Optim 00:49, 22 Dec 2003 (UTC)
- Well, computer science-related articles already take care of academic and theoretical subjects. What I mean is that the current article History of computing hardware--which is about the history of computers--should be moved to the much simpler History of computers. --Sewing 01:47, 23 Dec 2003 (UTC)
- Keep, who wouldn't be interested in the history of computers? Lirath Q. Pynnor
- Move to History of computers. Mathematics is as much a part of the history of computers as it is the history of their hardware. - Mark 06:58, 30 Dec 2003 (UTC)
- I don't know why this was listed on VfD so long. It isn't really a VfD decision. It's more of a title dispute so I've listed it at Wikipedia:Current disputes over articles instead. Angela. 05:42, Jan 4, 2004 (UTC)
Jack Kilby 1957
Even though I changed the date for the IC to 1958 to conform to the Nobel laureate article, I happen to know that Kilby thought of the IC during the mass vacation at TI (which would have been in late 1957). Kilby didn't have the vacation seniority, so he came to work at an empty Texas Instruments facility. The quietness of the work environment allowed Kilby to concentrate his thoughts and invent the IC. Thus the 1958 date must be the official publication date and not the actual date of conception. 169.207.115.129 01:41, 5 Jan 2004 (UTC)
Invention of the abacus
Some sources assert that the abacus was invented in China around 3000 BC; others that it was invented by the Romans or Babylonians around 1000-500 BC and traveled east to China. At present, this Wikipedia article says it was of Chinese invention. It would be nice to come up with an account of the current opinion that is as complete, accurate, and NPOV as possible.
Turing Completion is not a good test for a computer
The article states that Turing completeness is "as good a test as any" for whether a machine is a computer. I fundamentally disagree. It is too easy to build a machine that is theoretically Turing complete. The Z3 has been shown to be theoretically Turing complete, yes. But so what! The Z3 had no conditional branching, and the proof that it was Turing complete relies on mathematical tricks defined in the 1990s. It was never intended to be used as a general purpose machine. Babbage's Analytical Engine was more flexible than the Z3. Furthermore, if the Z3 was Turing complete, I would lay money on a bet that the ABC was also Turing complete; it was functionally very similar. And what about the Colossi? The Mk II Colossi (of which 9, not 10, were built; the Mk I was later converted to a Mk II) at least had conditional branching. It too must have been "theoretically Turing complete".
It really isn't good enough to shy away from a hard definition by hiding behind the definition of Turing completeness. It has been shown that Conway's Game of Life is Turing complete. It is possible to build a universal Turing machine using only a carefully defined set of tiles and then applying Conway's rules. And what does this prove? It proves that Turing completeness is not a very difficult status to achieve.
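(The rules being leaned on in the argument above are indeed tiny. As an illustrative sketch only - the Life-to-Turing-machine construction itself is far more elaborate - Conway's update rules fit in a few lines of Python; the "blinker" pattern used here is just a standard small example.)

```python
# Sketch of Conway's Game of Life rules referred to above: a live cell
# survives with 2 or 3 live neighbours; a dead cell becomes live with
# exactly 3. The board is represented as a set of live (x, y) cells.
from itertools import product

def step(live):
    """Apply one generation of Conway's rules to a set of live cells."""
    counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                n = (x + dx, y + dy)
                counts[n] = counts.get(n, 0) + 1
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# A "blinker" (three cells in a row) oscillates with period 2:
blinker = {(0, 0), (1, 0), (2, 0)}
print(step(step(blinker)) == blinker)  # True
```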
Practical, as opposed to theoretical, Turing completeness is something very different. The first computer that could automatically exploit the fact that it was Turing complete, and could do this in a practical way and solve real problems - that was the first computer. The ENIAC does not count; it was a serial single-purpose machine. Sure, you could rebuild it like so many Lego bricks, but that is hardly a practical general purpose computer. The Manchester Mk I was the first stored-program machine, but its purpose was to prove that the Williams-Kilburn tube worked effectively as a memory, not to solve real problems. It was a research machine. The EDSAC at Cambridge was the first real computer in the modern sense. It was the first machine that could automatically exploit the fact that it was Turing complete, and it could do this in a practical way, not merely as a party trick or under laboratory conditions. It was the first machine to implement the von Neumann architecture and solve real problems. (The Manchester Mk I and maybe the BINAC preceded the EDSAC, but they never solved a real problem.)
A computer is a tool; it must be practically capable, not just theoretically capable. All the machines before EDSAC were theoretically general purpose but practically special purpose. A computer is a general purpose device. EDSAC was the first modern computer. (You may now rip me to pieces ;-) John R.Harris (http://www.virtualtravelog.net/)
- A minor comment: contrary to the commonly held belief, the Colossus computer in fact did not have conditional branching. (Or, indeed, branching of any kind - or a program of any kind, for that matter!) So it definitely was not Turing-complete. See Talk:Colossus computer for more. Noel (talk) 05:05, 1 Mar 2005 (UTC)
The role of weather prediction in the development of computing
I am trying to work in Lewis Fry Richardson's use of differential equations for predicting weather. At the time he wrote his book in 1922, computing was not practical for predicting weather, and yet I believe Atanasoff was trying to solve some meteorological problems when he invented the ABC; thus there has been a meteorological application since the first electronic computer; to this day, supercomputers are used for predicting weather. Ancheta Wis 23:08, 5 May 2004 (UTC)
See: Navier-Stokes equations for the basic equation of weather prediction Ancheta Wis 18:08, 22 May 2004 (UTC) and also Wikipedia:WikiProject Fluid dynamics. Richardson's approach is listed in Numerical ordinary differential equations. Ancheta Wis 10:12, 25 May 2004 (UTC)
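(For readers unfamiliar with the numerical-ODE methods mentioned above: Richardson's scheme advanced differential equations in small discrete time steps. A minimal sketch of that idea follows, using a generic forward Euler step on a toy decay equation - not Richardson's actual meteorological equations, which were far more complex.)

```python
# Illustrative finite-difference integration: advance dy/dt = f(t, y)
# in small steps h, the basic idea behind numerical solution of ODEs.
# The test equation dy/dt = -y is a stand-in, not a weather model.

def euler(f, y0, t0, t1, steps):
    """Integrate dy/dt = f(t, y) from t0 to t1 with the forward Euler method."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# Exponential decay from y(0) = 1; the exact value y(1) = e**-1 ~ 0.3679
approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
print(approx)  # close to 0.3679
```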
I am replying to a high school librarian's assessment of this article (http://www.syracuse.com/news/poststandard/index.ssf?/base/news-0/1093338972139211.xml): upon repeated re-reading and editing of the statements in this article, I can state categorically that the edits are made in good faith. As a professional with decades spent on technology, I have learned and experienced items which not even a professional historian could possibly have learned. Since the field has expanded every decade since the 1880's, and since technologists have not had a venue for documenting their accomplishments until the advent of Wikipedia, their work has gone unsung until now. Ancheta Wis 16:58, 26 Aug 2004 (UTC)
Italics everywhere?
Why is it that seemingly every noun in the article is in italics? Did someone get confused about how to make Wiki links? Most of the italic portions would be (I think) most appropriately either deitalicised, or made into wiki links. (Italic emphasis gratuitously added to illustrate how tiresome it is to read something formatted like that.)
Unless there's some particular reason why it's like that, I'll try to change them around a bit at some point soon. PMcM 02:27, 3 Dec 2004 (UTC)
- Michael Hardy puts the usage thus: When a noun is used in a sentence, then it is not italicized, unless the sentence is about that noun, in which case it is italicized. Here is a link to further use of italics. (http://en.wikipedia.org/wiki/Wikipedia:Manual_of_Style#Italics) Ancheta Wis 02:47, 3 Dec 2004 (UTC) Thus when I refer to logarithms of numbers (which are about a transformation of the respective numbers), I italicize to emphasize the transformation of the operations of multiplication and division into the operations of addition and subtraction.
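(The transformation referred to above - the principle behind log tables and slide rules - is just the identity log(ab) = log a + log b. A quick sketch with arbitrary illustrative values:)

```python
# Logarithms turn multiplication and division into addition and
# subtraction: log(a*b) = log(a) + log(b), log(a/b) = log(a) - log(b).
import math

a, b = 12.0, 34.0

# Multiply and divide by adding/subtracting logarithms, then exponentiating:
product = math.exp(math.log(a) + math.log(b))
quotient = math.exp(math.log(a) - math.log(b))

print(round(product, 6))   # 408.0
print(round(quotient, 6))  # 0.352941
```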
Speedy response! Just finished playing around with it.
Who is Michael Hardy? Going by the Wikipedia guidelines, I feel it more appropriate to have a lot (about 75%) of what is/was in italics in that article as wiki links.
Certainly if it was written on paper it would be more appropriate to have the visual cue of italic text, used sparingly here and there where it might be confusing otherwise, but I personally don't feel it's necessary in the majority of places it was present in the article. If you're really incredibly attached to them, please feel free to put them back in, but I think the article would be less well off without the inclusion of the links I added. Thanks. PMcM 03:06, 3 Dec 2004 (UTC)
Unrelated: Any idea why this talk page has no contents section? Is it likely to be something I have set wrong, or is it the same for others? PMcM 03:10, 3 Dec 2004 (UTC)
- It does have a contents section, you just have to look hard for it ;-) The reason is, the top of the page is filled with comments divided using horizontal rules. The TOC doesn't appear until after those. — Matt 10:39, 3 Dec 2004 (UTC)
Also, apologies for the somewhat patronising tone I used to initially raise the issue. PMcM 03:13, 3 Dec 2004 (UTC)
What to do with this tale...
I removed this:
- During World War II, Curt Herzstark's plans for a mechanical pocket calculator (see Curta) literally saved his life. In 1938, while he was technical manager of his father's company, Rechenmaschinenwerk AUSTRIA Herzstark & Co., he had already completed the design, but could not manufacture it due to the Nazi annexation of Austria. Instead, the company was ordered to make measuring devices for the German army. In 1943, perhaps influenced by the fact that his father was a liberal Jew, the Nazis arrested him for "helping Jews and subversive elements" and "indecent contacts with Aryan women" and sent him to the Buchenwald concentration camp. However, the reports of the army about the precision production of the firm AUSTRIA, and especially about the technical expertise of Herzstark, led the Nazis to treat him as an "intelligence slave". His stay at Buchenwald seriously threatened his health, but his condition improved when he was called to work in the Gustloff factory linked to the camp. There he was ordered to make a drawing of the construction of his calculator, so that the Nazis could ultimately give the machine to the Führer as a gift after the successful end of the war. The preferential treatment this allowed him ensured that he survived his stay at Buchenwald until the camp's liberation in 1945, by which time he had redrawn the complete construction from memory. See: Cliff Stoll, Scientific American 290, no. 1, pp. 92-99. (January 2004) Also see: [2] (http://www.vcalc.net/cu-bckup.htm).
While this is a fascinating tale, I'm not sure whether it's significant enough in terms of the history of computing hardware to deserve a long paragraph in an overview article on the topic. Mechanical calculators were commonplace by the 1930s, even if they weren't miniaturized. --Robert Merkel 23:38, 5 Dec 2004 (UTC)
- inserted the information into Curt Herzstark Ancheta Wis 07:13, 6 Dec 2004 (UTC)
I just noticed that the DNA computing section in History of Computing was removed. Once one concedes that the travelling salesman problem is a true computation problem, then one must also concede that a computation using DNA is a computing hardware feat. If that is so, then the recognition that DNA can form the basis for a Turing tape is part of the history (and future) of computation; thus the recognition that DNA forms a code is part of the intellectual heritage of computing and part of its future. That is why Adleman actually solved a travelling salesman problem using DNA. But if that is truly a CS item, then Gamow deserves to be mentioned, as this was part of the work that occurred before 1960. Ancheta Wis 01:43, 14 Dec 2004 (UTC)
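(Adleman's DNA experiment worked by generating a huge number of candidate paths in parallel and then filtering out the invalid ones. As a rough software analogue of that generate-and-filter strategy - done serially here, and on a small made-up graph rather than Adleman's - a path-search sketch:)

```python
# Generate-and-filter sketch of the strategy behind Adleman's DNA
# computation: enumerate candidate vertex orderings, then keep only
# those whose consecutive pairs are all edges of the directed graph.
# (DNA strands did the generation massively in parallel; this is serial.)
from itertools import permutations

def hamiltonian_paths(vertices, edges, start, end):
    """Yield every ordering of the vertices that walks only along edges."""
    middles = [v for v in vertices if v not in (start, end)]
    for perm in permutations(middles):
        path = (start,) + perm + (end,)
        if all((a, b) in edges for a, b in zip(path, path[1:])):
            yield path

# A small illustrative directed graph:
verts = {0, 1, 2, 3}
edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}
print(list(hamiltonian_paths(verts, edges, 0, 3)))  # [(0, 1, 2, 3)]
```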
- This is an overview, which means some editorial judgement needs to be made about what are the most essential points to be covered in the space available. There are any number of things this article omits or discusses only briefly. There is much that could be said about analog computers, for instance. DNA computing, while an interesting concept, has not seen wide practical adoption compared to the technologies descended from those covered on this page. Therefore, removal was IMO appropriate. --Robert Merkel 12:26, 14 Dec 2004 (UTC)
Colossus relays
I altered the following:
- "The Colossus used only vacuum tubes and had no relays."
It seems Colossus did use relays, both for buffering output and as part of its counters: [3] (http://www.codesandciphers.org.uk/lorenz/colossus.htm), [4] (http://www.codesandciphers.org.uk/virtualbp/fish/machines.htm) — Matt Crypto 09:30, 13 Dec 2004 (UTC)
American developments
Just curious; why is "American developments" a distinct section? --Khendon 14:37, 1 Mar 2005 (UTC)
- Earlier versions indeed intermixed the various projects regardless of nation. Thus the current headings are a matter of preference by the contributors.
Hm. I think it makes much more sense to have a purely chronological article. --Khendon 16:46, 1 Mar 2005 (UTC)
- I made the change in organization a while ago, to put Zuse and Colossus before the stuff on what was happening in America because it was the American work that led to ENIAC and the EDVAC design. Go back and read what it was like before and after the change was made and you'll hopefully see why I did it. Personally I think the article pays too much attention to Zuse and Colossus - they were both fascinating dead ends IMO - and if I was reorganizing the article further would considerably trim down the material on them. --Robert Merkel 23:13, 1 Mar 2005 (UTC)
Request for references
Hi, I am working to encourage implementation of the goals of the Wikipedia:Verifiability policy. Part of that is to make sure articles cite their sources. This is particularly important for featured articles, since they are a prominent part of Wikipedia. The Fact and Reference Check Project has more information. Thank you, and please leave me a message (http://en.wikipedia.org/w/wiki.phtml?title=User_talk:Taxman&action=edit&section=new) when a few references have been added to the article. - Taxman 19:33, Apr 22, 2005 (UTC)
- Taxman, this is an overview article, summarising facts found across many other articles. Hence, there's likely not to be much direct referencing here. --Robert Merkel 04:19, 23 Apr 2005 (UTC)
- Added W.J. Eckert's little orange book. It would be good to acknowledge Lewis Fry Richardson's work, but it was decades before computers arose which could implement his method. (Now it would be called a system analysis, but he invented a field here.) I don't see a suitable way to work it into the article, which is about hardware, after all. Ancheta Wis 07:46, 23 Apr 2005 (UTC)
- Great, thanks for your work, that is much better. Certainly an overview article can have references that back up its facts too. - Taxman 13:49, Apr 23, 2005 (UTC)
Heron of Alexandria
I scanned the article page and couldn't find any reference to Heron. I thought it may be a good idea to put a reference to his automated theater in the beginning, right around Wilhelm Schickard. However, I'm not sure, since the concept of computing here seems to be more calculation-based; if no one has any problems I think it would be a good addition. Heron's automated theater was a series of pegs with strings wrapped around them. Various weights were tied to the strings and controlled the movement of objects for the play. In the end it was a simple analog computer program. --Capi crimm 03:21, 24 May 2005 (UTC)
- I propose that you start a new page, History of automata or History of automatons. Stanislaw Ulam, John Von Neumann, John H. Conway, ... Stephen Wolfram were/are quite aware that the computing paradigm has automata in it. The topic is called Category:Cellular automata. However the concept of computing is tied to Leibniz' notion of expression evaluation, which means, in the case of a computing machine, we are automating human computation. When we are automating the motion of a puppet, which is one of the things that computers can do, the subject is called Automatic control, or Cybernetics or Robotics. Computer used to be a job title. Perhaps someday computers will be called controllers, as well. Come to think of it, perhaps History of controllers or History of cybernetics would be a good page for Heron's automated theater. Ancheta Wis 08:52, 24 May 2005 (UTC)
- When I followed the Cellular automata link myself, I found material on the history of cellular automata, which would work quite well on the future History of ... page to which I am referring, as well as Heron's automated theater. Would you like to start such a page? I could contribute to it as well. Ancheta Wis 08:57, 24 May 2005 (UTC) If History of Cybernetics were to be the page, then Norbert Wiener's concept of the steersman (the root meaning of cybernetics) or the pilot would come into play. Aeronautics would come into play as well, because the theory of control became pressing with the invention of the airplane. There is an extensive literature on automatic control in the IEEE Transactions. So we wouldn't just be flying blindly, to put a little pun in this. Machine vision could also be added, if the topic were to be Cybernetics.
List of books
I added a list of books for further reading. These are the ones I had on my shelf. The order may look haphazard, but I tried to put the more accessible ones at the top. I thought about ordering them by date or by author - if anyone thinks they should be that way, please feel free to change the order (and to add to the list, of course). --Bubba73 20:15, 7 Jun 2005 (UTC)
- Thanks. If you had used them to fact check material in this or other articles, please consider listing them as actual references, or better yet, citing individual facts to them. Much better than taking a (potentially unknown) Wikipedia editor's word for the material is citing it to a reliable source. Use whatever format you like, but (Smith, 2003) is fine, or you can use some form of footnote system including the invisible one. Thanks again. - Taxman Talk 23:12, Jun 7, 2005 (UTC)
- I haven't done much (if anything) to this page, but I've contributed a lot to particular computers, mainly 1946-1954. I used 6 or 7 of those books, but mainly about 3 of them. I should have put a reference for each of the edits (I did on a few) at the time, since now it is hard to know where I got what information without looking it up again.
Too much analog...
Maybe some of the new analog computer stuff should be trimmed (and placed in the appropriate article), as it leaves the article as a whole rather unbalanced. --Robert Merkel 04:15, 11 Jun 2005 (UTC)
Add more to the other sections then. Greg321 10:39, 11 Jun 2005 (UTC)
- This article is supposed to be a readable summary of the history of computing hardware. At the moment, this is like a history of the automobile that spent half its content looking at steam-powered cars. The excessive detail makes it hard to see the forest for the trees.--Robert Merkel 23:29, 11 Jun 2005 (UTC)
- While not an advocate of the current balance in the content, the current information does highlight the fact that steam-powered computation was a vision which pioneers like Babbage and William Stanley Jevons were pursuing. It is not irrelevant, as it shows that technologies such as mechanical and electrical analog computation, electronic digital computation, DNA computation, and quantum computing are all possible technologies, and certainly not the only possible forms for computing hardware. The period from 1945-1950 was important, but not the only possibility. It could have happened several other ways. The mechanical antecedents are very important, as they illuminate a path that could have been taken as early as the 1500s; only the precision of manufacture required for large-scale computing hardware was missing. Ancheta Wis 12:40, 13 Jun 2005 (UTC)
- That's an interesting point to discuss a little bit, but too much speculation as to what might have happened if history had turned out differently is likely to get unencyclopedic very quickly. --Robert Merkel 01:25, 14 Jun 2005 (UTC)
- William Stanley Jevons actually was one who dreamed of steam-powered computation, possibly as early as the time when he lived in Australia 150 years ago. His Logic Machine was exhibited in Sydney last year, BTW. Ancheta Wis 21:42, 14 Jun 2005 (UTC)
More detail on some sections
More detail could be added to the sections on electronic computation. We could add more on the role of Herman Goldstine, von Neumann, etc. for example. What I have in mind is the chance meeting of Herman Goldstine and von Neumann on the Princeton train, and how it turned into a doctoral examination on Computer Engineering for Goldstine. Another item might be how the Israelis got the von Neumann architecture first hand; that is how their first machine got built. Another item might be the use of Binary Coded Decimal in the first electronic computers. Perhaps we might sketch a little outline before actually adding in the text. Ancheta Wis 23:36, 14 Jun 2005 (UTC)
- I agree. Some of this is in other articles. See IAS machine for how Israel got the von Neumann architecture - the plans for the IAS were made freely available, and about 15 computers were based on the design, WEIZAC was one of them. Bubba73 01:10, 15 Jun 2005 (UTC)
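For readers unfamiliar with the Binary Coded Decimal representation mentioned above, here is a minimal illustrative sketch (generic BCD, not the encoding of any particular early machine): each decimal digit is stored as its own 4-bit group, rather than converting the whole number to binary.

```python
def to_bcd(n: int) -> str:
    """Encode a non-negative integer in Binary Coded Decimal:
    each decimal digit becomes a separate 4-bit group."""
    if n < 0:
        raise ValueError("BCD sketch handles non-negative integers only")
    return " ".join(format(int(digit), "04b") for digit in str(n))

# The year ENIAC became operational, digit by digit:
print(to_bcd(1945))  # -> 0001 1001 0100 0101
```

Note that 1945 in pure binary would be 11110011001; the BCD form is longer but trivially maps back to decimal digits, which is why several early decimal machines favored it.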