Talk:Remainder
Remainder in non-integer division
As I mentioned in the discussion page for modulo operation, I made an edit to that article today that clarifies that computers do not constrain a and n to be integers. Rather, the evaluation of a mod n in computers is carried out such that quotient q is an integer and remainder r is allowed to be a non-integer.
I am unsure of the ramifications of this on the remainder article. It seems that it is possible to define "remainder" more generally, not just in the context of pure integer division / the division algorithm, so I think this should be perhaps mentioned here. Please do this, if you can, or correct me if I'm wrong. Thanks - mjb 04:44, 30 Jan 2005 (UTC)
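For reference, the computing behavior described above (integer quotient, possibly non-integer remainder) can be sketched in Python; the function name mod here is purely illustrative, not taken from any particular language:

```python
import math

def mod(a, n):
    """Remainder of a divided by n: the quotient q is forced to be
    an integer (via floor), but a, n, and the remainder may be real."""
    q = math.floor(a / n)  # integer quotient
    return a - q * n       # remainder r, so that a == q*n + r

print(mod(7.5, 2))  # 1.5 (quotient 3 is an integer; remainder is not)
print(mod(7, 3))    # 1
```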
- You see, the remainder can indeed be defined for any real numbers of course. Even for complex numbers if you wish. However, mathematicians don't do that. Actually, mathematicians don't even bother to define the remainder for negative numbers (I know you had been struggling with that issue).
- You see, unlike in computer science, where the remainder is just a function, in math the remainder is a concept, albeit a very important one. It has to do with the arithmetic of natural numbers. That's why mathematicians have no use for the generalizations to the real numbers, or even to the negative integers.
- Good job on modulo operation. Oleg Alexandrov | talk 04:54, 30 Jan 2005 (UTC)
- Thanks, and thanks for your contributions and assistance throughout this process. I still think we can improve this remainder article, though. I can accept that the concept of remainder in math is only for natural numbers. It does seem, though, that there is a separate, less restrictive concept of remainder in computing. Because of this, and especially since modulo operation currently is defined in terms of the definition given in remainder, I think it may be best to at least acknowledge the existence of the schism. - mjb 07:00, 30 Jan 2005 (UTC)
Remainder (Reply to User:Mjb)
Thank you for your edits. However, you are not mathematically correct.
The word "remainder" means "leftover", "residue". As such, the remainder of division of 5 by 3 is not 5, is not 8, it is 2, and only 2. As such, your addition of the section "Other definitions" in the remainder article is not mathematically correct.
The concept of remainder is well-defined in Mathematics. The only possible ambiguity is when you talk about negative numbers, there you have a choice in sign. But that is all. Please take a look at Division algorithm and Euclidean algorithm.
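As a concrete illustration of the division-algorithm definition (a sketch only; Python's built-in divmod happens to agree with it for positive divisors):

```python
# 5 divided by 3: quotient 1, remainder 2, since 5 = 1*3 + 2 and 0 <= 2 < 3
q, r = divmod(5, 3)
print(q, r)  # 1 2
assert 5 == q * 3 + r and 0 <= r < 3
```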
I think you confuse remainder with modular arithmetic.
You are right that the remainder can be defined for real numbers, and I will add a section for this. But still the same basic truth holds: the remainder is always smaller in absolute value than the divisor.
Tomorrow I will go back to remainder and correct things. Please do not take it personally. Oleg Alexandrov | talk 16:02, 30 Jan 2005 (UTC)
- OK, you don't like the "other definitions" section. My response: I see three contradictions to your assertion that the remainder is always smaller in absolute value than the divisor and to your implication that it is not possible to define remainder in other ways:
- (1) the explanations that User:Revolver gave to me in the discussion on Talk:Modular arithmetic last summer, in which he says that this is true in "basic arithmetic" only if you define it that way;
- (2) the "Generalisations" section of Division algorithm, which says "There is nothing particularly special about the set of remainders {0, 1, ..., |d| − 1}. We could use any set of |d| integers, such that every integer is congruent to one of the integers in the set. This particular set of remainders is very convenient, but it is not the only choice."; and most importantly for me…
- (3) E.L. Lady's very clear, informal paper on the Division Algorithm theorem (http://www.math.hawaii.edu/~lee/courses/Division.pdf), which states "In fact, after a while one might notice that q can be chosen quite arbitrarily. One could choose q=0 and r=a, or q=1 and r=a−d. This is in fact typical in mathematics: when there are more unknowns than equations, that usually means that there are lots of different solutions. When one looks at the example we started with, though, one sees that q=0 and q=1 are not correct values for the quotient when 1052 is divided by 29. The problem is that we have been considering only the equation a=qd+r instead of looking at the whole theorem. The values q=0 and q=1 don't work in the above example because they give the values 1052 and 1023 for r, and these don't satisfy the condition 0≤r<d. […] There's a very important insight to be learned from what we've just done. Namely, in the Division Algorithm, having two unknowns q and r is a smokescreen. The Division Algorithm is actually a statement about only one variable q."
- Given the above references, it seems that one can define remainder as one likes. The Division Algorithm theorem does not define it that loosely; it constrains the remainder to the well-known range 0≤r<d. You seem to be saying that this is the one and only definition of remainder. It seems to me that while this is the preferred, well-known definition, it is indeed arbitrary, and generalisations can be made without invalidating the theorem. The Division Algorithm states some facts about q, d, a and r ***when*** 0≤r<d. It does not follow from this that "remainder" in general must satisfy those same constraints. This article should make a clear distinction between the general definition and that of the Division Algorithm. - mjb 18:46, 30 Jan 2005 (UTC)
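Lady's point can be checked directly: every integer q yields a pair (q, r) satisfying a = qd + r, but only one such pair also satisfies 0 ≤ r < d. A quick sketch using the numbers from the paper:

```python
a, d = 1052, 29
# The bare equation a = q*d + r holds for ANY integer q, by construction:
for q in (0, 1, 36):
    r = a - q * d
    assert a == q * d + r
    print(q, r, 0 <= r < d)  # only q = 36 gives a remainder in [0, d)
```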
- I think you have a point about the remainder thing. There is indeed one widely known definition, which is unambiguous, well-defined, and accepted by all mathematicians. Say, that is the default. But of course that does not need to be the only definition. Ultimately, the remainder can be in any set, even if in that case the word "remainder" loses its actual original meaning. I will think more of this. Oleg Alexandrov | talk 19:53, 30 Jan 2005 (UTC)
- OK, I think we are on the same page. Hooray! :) I have made some minor edits to the article today, but I think it still could use further improvement. I think the last two sections should be combined somehow, in order to emphasize that there is the one widely accepted, default definition, and then there are the more general extrapolations, of which the one involving negative numbers is just one of many possible examples. Another example would be the use of non-integers… - mjb 20:08, 30 Jan 2005 (UTC)
- I think you should not combine the last two sections. Do you remember how confused you were by these matters at the beginning? Keeping the sections separate shows the logical progression from the first definition to the last. Putting things together might obscure the point and leave the reader with the impression that the remainder operation is impossible to define in a satisfactory, unambiguous manner, which is false. What do you think? Oleg Alexandrov | talk 21:04, 30 Jan 2005 (UTC)
- The two sections I am referring to are "the case of general integers" and "other definitions". It is my understanding that you added "the case of general integers" under protest, and do not consider it to be anything more than a pointless theoretical exercise, just like the example in "other definitions". That is why I think it is safe to combine those two sections. - mjb 00:24, 31 Jan 2005 (UTC)
- I think we should not include a mention of the division of reals. You see, you and I have different agendas. My agenda is to explain how the word "remainder" is used in math. Your agenda is to explain to yourself and others the rationale behind the mod function in computing.
- In math, the definition of remainder for reals does not make sense: the reals can be divided without remainder. As such, there is no place in mathematics for this generalization. Of course, people in computer science overload certain operators beyond their original meaning, but mathematicians do not bother with that. I think you should write about the remainder of division of reals only on modulo operation, because that's a purely computing issue, and not a mathematical one. This is an opinion; other opinions welcome (you proved me wrong once). Oleg Alexandrov | talk 21:09, 30 Jan 2005 (UTC)
Remainder (continued)
When you say the remainder for reals does not make sense since reals can be divided without remainder, I think you are assuming too much. It is true that reals can be divided without remainder, but only if you require that the quotient also be real. The modulo operation in computing (at least according to my observations) finds the remainder, given that q is limited to the set of integers — even though a, n, and r can be real. Thus there is often a remainder, which of course must be real. I do not see this as violating any principles. qn+r=a is still true and, I assume, provable, for the given sets of values allowed for each variable. Yet for some reason it is offensive to you. I don't understand why. The modulo operation merely "solves for r", given real or integer a and n and certain bounds, one of which is q being restricted to the integers {…, −1, 0, 1, …}. These equations, taken with their bounds, are statements of fact; one is no more valuable than the other. Mathematicians use one set of bounds, computer scientists another. This may not always be the case in the future. - mjb 00:24, 31 Jan 2005 (UTC)
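Many languages do in fact behave this way. In Python, for example, the % operator accepts reals and returns a real remainder, while the implied quotient (floor division) is integer-valued; a small check, not a claim about every language:

```python
a, n = 8.5, 2.5
q = a // n  # floor quotient: 3.0, an integer-valued float
r = a % n   # remainder: 1.0, a real number
assert q == int(q)     # the quotient is constrained to the integers
assert a == q * n + r  # qn + r = a still holds
print(q, r)  # 3.0 1.0
```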
- OK now, my comments were too rough, I am sorry. I know how it feels, as your "wtf" subject line in the morning did not make me feel good either.
- I just feel that all that stuff is not mathematical, rather, computational, so I don't know how much it belongs on a math page.
- But what you say makes sense. Would you mind putting that in the third section, that is, in "Other definitions"? Oleg Alexandrov | talk 00:48, 31 Jan 2005 (UTC)
Proven or proved
To say "it can be proved" is correct English. See for example this Britannica article (http://www.britannica.com/eb/article?tocId=70531). For some reason, it sounds more natural to my ear than "it can be proven". That's why I will change this in remainder. Oleg Alexandrov 05:07, 10 Mar 2005 (UTC)
- OK. I think this is one of those situations where what is correct and what sounds natural are sometimes different things, depending on regional dialects. I asked several people and got several answers. Everyone felt "proved" was right, but they also all hesitated and conceded that they hear "proven" used in that context just as often. - mjb 01:48, 11 Mar 2005 (UTC)
- I see. If you feel strongly about it, you can revert to "proven". I am not a native speaker, and ultimately I can't know for sure. Oleg Alexandrov 02:08, 11 Mar 2005 (UTC)
Formula formatting
Can we discuss this before getting carried away with the edits today? — mjb 22:51, 11 May 2005 (UTC)
- You mean the removal of ×? I don't care either way. But, since Michael Hardy removed one occurrence, I thought to remove all of them for consistency. Oleg Alexandrov 23:02, 11 May 2005 (UTC)
- Yeah, I thought you were going to remove every single instance of ×, which wouldn't be appropriate, in my opinion (see below). mjb 23:25, 11 May 2005 (UTC)
Actually, I probably would have removed them all if I hadn't been in a bit of a hurry. But I'm not going to be adamant about this point. Michael Hardy 02:49, 12 May 2005 (UTC)
Style guides
- User:MathMartin/Styleguide is a rough draft that only gives one recommendation for writing formulas in HTML: it says to use "&minus;" instead of a hyphen to represent minus or negative.
- Wikipedia:How to write a Wikipedia article on Mathematics provides general advice for writing the article itself, but not much info about formulas, other than to supplement them with prose.
Are there other guides we can refer to?
No-break space
Regarding the no-break space ("&nbsp;" or the literal character)… the pros are that it
- prevents formulas from wrapping, and
- improves readability.
The cons are that:
- Wikipedia gets confused sometimes and adds an extra space, such as the one before the d in x < r ≤ x + |d| (there is no space there in the original code); and
- there are no clear rules on when and how to use them.
Is the formatting bug something you can live with? Is there a workaround?
- I myself don't care. :) I asked Michael Hardy to comment on this. Oleg Alexandrov 23:46, 11 May 2005 (UTC)
Multiplication sign
I think it is reasonable to assume that standard algebraic notation is best for formulas, so two variables q and d being multiplied are probably best written as qd when presented in a formula. That is, when citing a formula, don't use ×.
However, when explaining the formula for a general audience (not just mathematicians), or when giving examples of its application, I feel it is prudent to use the multiplication sign "×", coded as "&times;". So, I think these examples should remain as-is (although I added no-break spaces here):
- When dividing 26 by 4, 6 is the quotient and 2 is the remainder, because 26 = 6 × 4 + 2.
- −42 = 9 × (−5) + 3
Do others agree with this convention?
- Yes! Oleg Alexandrov 23:46, 11 May 2005 (UTC)
Less-than sign
Wikipedia seems to be pretty good at dealing with unescaped "<" characters, but since the MediaWiki markup also uses HTML tags, I think it would be ideal to use "&lt;" when we want to represent the less-than sign, just like in HTML and XML.
Does anyone know of a style guide that addresses this issue and makes recommendations?