• silent_water [she/her]
    ·
    7 months ago

    division by 0 is not defined. it can yield multiple values, any value at all, explode to infinity, etc., but even that statement depends on taking a limit, because you can't actually divide by zero. you break basic algebraic laws if you try to include it. it's such an essential fact of algebra that you only name an element of a ring 0 if it's the additive identity and multiplies everything to zero. when you extend such a ring to a field, you define division as multiplication by the multiplicative inverse, except for the additive identity, which has no inverse at all.
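
    for concreteness, the standard argument only needs the distributive law and the fact that a field has 1 ≠ 0:

    ```latex
    \begin{aligned}
    0 \cdot a &= (0 + 0) \cdot a = 0 \cdot a + 0 \cdot a
      \;\Longrightarrow\; 0 \cdot a = 0 \quad \text{(in any ring)} \\
    1 &= 0 \cdot 0^{-1} = 0 \quad \text{(if } 0^{-1} \text{ existed)} \\
    a &= 1 \cdot a = 0 \cdot a = 0 \quad \text{(so every element collapses to } 0\text{)}
    \end{aligned}
    ```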

    your Wikipedia link is discussing limits. limits are only well-defined when every step of the limiting process is well-defined and the overall sequence actually converges. if I just write 5/0, there's no sequence - you can't say the limit diverges to infinity or resolves to a specific number because there is no limiting sequence to begin with. you need a function like sin(x)/x to produce a limit, and then you know for certain that this very particular "0/0" evaluates to 1 (ie the discontinuity at 0 is removable).
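
    roughly, the distinction looks like this (a sketch using sympy, purely as an illustration):

    ```python
    import sympy as sp

    x = sp.symbols('x')

    # sin(x)/x is undefined at x = 0, but the limiting process exists
    # and converges, so the discontinuity is removable:
    print(sp.limit(sp.sin(x) / x, x, 0))  # 1

    # a bare 5/0 comes with no limiting process at all, so there is
    # nothing to converge and the expression is simply undefined:
    try:
        5 / 0
    except ZeroDivisionError as err:
        print(err)  # division by zero
    ```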

    if you're interested in this, you're looking for ring theory. a lot of textbooks will give you the tools to prove that division by zero produces inconsistent results in any field - this is why the field axioms only demand multiplicative inverses for nonzero elements.

    • naevaTheRat@lemmy.dbzer0.com
      ·
      7 months ago

      I wasn't talking about limits. Read the stuff about extended real lines etc. There are some (2 or 3? idk not many) systems where we do define 1/0 as infinity.
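
      IEEE 754 floats are an everyday example of roughly that kind of system, something like this (NumPy used here just as an illustration):

      ```python
      import numpy as np

      # IEEE 754 extends the reals with +inf, -inf and nan, so float
      # division by zero is defined rather than being an error:
      with np.errstate(divide='ignore', invalid='ignore'):
          print(np.float64(1.0) / np.float64(0.0))    # inf
          print(np.float64(-1.0) / np.float64(0.0))   # -inf
          print(np.float64(0.0) / np.float64(0.0))    # nan, still indeterminate
      ```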

      I do think you're right within the framework of ring theory, though I haven't done much of that. It's a framework for analysing a lot of algebra and the maths we usually do, but I don't think it's the universal truth of all mathematics, though maybe it is. Am I wrong there?

      • silent_water [she/her]
        ·
        7 months ago

        sure, you can extend the real line, but you're basically defining a new value that behaves a lot like NaN in software. you have to be very careful with the operations because of the strange properties of the new element. what you get loses a lot of the basic properties you're used to from arithmetic - ie afaik it's neither a field nor a ring. it's also really misleading to say it's dividing by zero - you're changing what division by zero even means. and the infinity you get back isn't infinity in the usual sense of the supremum of the natural numbers; rather, it's a symbolic infinity.
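
        you can see the NaN-like behaviour with plain floats, which borrow exactly these conventions (a toy illustration, not a formal argument):

        ```python
        import math

        inf = math.inf

        # the new element breaks the algebra you'd expect from a ring:
        print(inf - inf)        # nan: no consistent value can be assigned
        print(0.0 * inf)        # nan: same problem
        print(inf + 1 == inf)   # True: cancellation fails, even though 1 != 0
        ```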

        basically, it's as much division by zero as -1/12 is the sum of the natural numbers. they're both true in a very particular sense but only after you change the meaning of all the words in the statement.
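
        same flavour of "true only after reinterpretation": the -1/12 is the analytically continued zeta function at -1, not the literal series (sympy sketch, just for illustration):

        ```python
        import sympy as sp

        n = sp.Symbol('n')

        # the literal series diverges:
        print(sp.Sum(n, (n, 1, sp.oo)).doit())  # oo

        # the -1/12 is zeta(-1), ie the analytic continuation of
        # sum(1/n**s) evaluated where the series itself no longer converges:
        print(sp.zeta(-1))                      # -1/12
        ```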

        said another way, you can make anything you like true by introducing new axioms. but those axioms have deep impacts on what's true in the system they generate, and it's misleading to say that the ability to introduce them makes an undefined operation sensible in a system that lacks them.

        • naevaTheRat@lemmy.dbzer0.com
          ·
          7 months ago

          Ok I suppose that's fair, most people are probably taking division to mean what it means in, what is it even called, default maths? real analysis or whatever. And while people call the operation division in other systems, it's sort of a homophone for a rather different thing that shares some characteristics.

          People should still learn about the Riemann sphere though, and division by zero is a sensible operation there. We do it all the time in quantum shit mwahahahaha.
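
          Concretely, on the Riemann sphere (the complex plane plus a single point at infinity) the usual conventions are (standard definitions, just written out):

          ```latex
          \frac{z}{0} = \infty \quad (z \neq 0), \qquad
          \frac{z}{\infty} = 0 \quad (z \neq \infty), \qquad
          \frac{0}{0},\ \frac{\infty}{\infty},\ 0 \cdot \infty,\ \infty - \infty
          \ \text{ remain undefined}
          ```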

          • silent_water [she/her]
            ·
            7 months ago

            it's a version of field theory where the rules aren't all properly explained? I wish we just taught groups, rings, and fields as soon as modular arithmetic gets introduced. it's not really that complicated and it makes sense if you have matrices, integer rings, Z, R, and Q available as examples. we just leave things poorly explained by not teaching the axioms.
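
            a five-line computation is enough to make the difference concrete, which is the kind of example I mean (a throwaway sketch):

            ```python
            # residues mod n that have a multiplicative inverse
            def units(n):
                return [a for a in range(1, n)
                        if any((a * b) % n == 1 for b in range(1, n))]

            print(units(5))   # [1, 2, 3, 4]: every nonzero element, so Z/5Z is a field
            print(units(6))   # [1, 5]: 2, 3, 4 have no inverse, so Z/6Z is only a ring
            ```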

            Riemann spheres are awesome, I just want to be careful with my language in a space where people don't even know what a field is, generally. but god I love math. I really want to go back to grad school and finish a phd - I've been settling for teaching myself with books and free online lectures.

            • naevaTheRat@lemmy.dbzer0.com
              ·
              7 months ago

              it’s a version of field theory where the rules aren’t all properly explained?

              Literally lol, that's the funniest thing I've heard in a few days. I do remember studying physics and getting a bit like "ok but wtf are numbers cause this is a mess?", studying sets in my own time, then a bit of field theory and going "Oh yes, this actually makes sense now", although I was only ever at the level of an amateur dabbler.

              I burned out of a physics PhD (the funniest name I've heard for that is "I have a post mortem in X"), but even by the time I started it I just wished I could go back and choose pure maths; it's not like there are jobs for physics anyway :P (unless you want to make bankers richer or build weapons, which both indelibly mar the soul). Algebraic topology is something I will understand one day. I just need more time, and to move closer to a uni. Even if I'm like 70 and wasting state money, I'm gonna study it some day.

              Riemann spheres are awesome, I just want to be careful with my language in a space where people don’t even know what a field is, generally.

              Yeah it's reasonable, I guess I don't want people just thoughtstopping at "you can't divide by zero" and never thinking there's anything deeper to it, instead of maybe going "well, what is division by zero? why can you do it sometimes and not other times? why do we sometimes pretend we can do it in systems where we can't?". At the end of the day I think we're on the same side of "maths is cool and people should learn more of its nuances", but I worry about people taking different things away from the same trivial remark haha.

              Good luck on getting back to it! I hope you do better than me at your studies. It's such a fun little world of order and puzzles.