1) Any real number, when divided by zero, produces a quotient and remainder (modulus) of zero.
2) Any real number multiplied by zero is equal to zero.
Therefore, it logically follows that zero divided by zero is equal to zero.
Premise 1 is false: it presupposes that you can divide by zero, and that operation is undefined. The division algorithm states that for integers a and b with b ≠ 0, a = bq + r with 0 ≤ r < |b| (and b|a, "b divides a", exactly when r = 0). It simply does not apply when b = 0. The reals are not closed under division by zero, because 0 has no multiplicative inverse.
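A minimal Python sketch of the point above: the built-in `divmod` computes the (q, r) pair of the division algorithm, and it is only defined for a nonzero divisor.

```python
# divmod(a, b) returns (q, r) satisfying a == b*q + r with 0 <= r < b,
# but only for nonzero b -- there is no (q, r) pair when b == 0.
a, b = 17, 5
q, r = divmod(a, b)
assert a == b * q + r      # 17 == 5*3 + 2

try:
    divmod(a, 0)           # division by zero: no defined quotient/remainder
except ZeroDivisionError as e:
    print("undefined:", e)
```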
0/0 is an indeterminate form; x/0 for nonzero x is simply undefined. A limiting process can be applied to an indeterminate form, but remember the epsilon-delta definition: the variable never actually reaches zero, it only gets "as close as we like."
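To illustrate the limiting process, here is a small sketch (the choice of sin(x)/x as the example is mine, not the original poster's): the expression is a 0/0 form at x = 0, yet its limit as x approaches 0 exists and equals 1, while evaluating at x = 0 itself remains undefined.

```python
import math

# sin(x)/x is a 0/0 form at x = 0, but its values approach 1 as x -> 0.
# x gets "as close as we like" to zero without ever being zero.
for x in (0.1, 0.01, 0.001, 1e-6):
    print(x, math.sin(x) / x)   # values approach 1

# Evaluating at x = 0 itself is still division by zero:
# math.sin(0) / 0  would raise ZeroDivisionError.
```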
The whole process should be restricted to integers anyway, to eliminate irrational numbers from the real set.
Right! If you divide something by 0, that's the same as dividing it by nothing. If something isn't divided, what is left is the original something, right? Therefore, 2/0 = 2.
Arithmetical division is both the taking and the making of groups.
An arithmetical quotient is the number of groups made or taken.
∴ The number of groups of nothing one can take and make from any something is absolute - indeed, the exact opposite of nothing, "−0."
Oh, spare me the mathematical BS. This is exactly why this stuff is so confounded.
Take 10 Arabs in the desert. Divide their number by 2 and you get 2 groups of 5, right? Since "0" is nothing, divide them by nothing and they are not divided, right? So there are still 10 Arabs, right?
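The grouping picture can actually be made precise, and it shows why "dividing by nothing" is not the same as "not dividing." A hedged sketch (the helper name `groups` is my own, and it assumes the group size divides the total evenly): division as repeated subtraction terminates for b = 2, but for b = 0 removing "groups of nothing" never shrinks the pile, so the number of groups has no answer at all.

```python
def groups(a, b, max_steps=1_000_000):
    """Count how many groups of size b can be taken from a.

    Assumes b divides a evenly when b > 0. Returns None if the
    process never terminates (as happens when b == 0).
    """
    count = 0
    while a > 0 and count < max_steps:
        a -= b              # take away one group of size b
        count += 1
    return count if a <= 0 else None

print(groups(10, 2))   # 5: ten split into groups of two
print(groups(10, 0))   # None: the pile never shrinks, no quotient exists
```

Note the result for b = 0 is not 10 (the "undivided" pile) and not 0: the process simply never finishes, which is what "undefined" means here.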
English has characteristic laws that don't make any sense. Mathematics is a language with characteristic laws that don't make sense either. That's why we have flaws in our thinking.
Look, -0 is the absence of zero. So what exactly is the amount of non-zero?