Probably didn’t need me to tell you that tho

  • BynarsAreOk [none/use name]
    ·
    edit-2
    2 years ago

    I'll have to say though that education in general is way too focused on testing rather than proven practice of learned skills. Is AI the thing that will make people realize a lot of it is built on a false promise of bullshit rote memorization and not any useful practice whatsoever? Then there's the fact that a lot of majors just exist because capitalism needs a trained workforce; it's education to do jobs rather than to learn anything useful.

    Doctors and engineers won't be fearing this anytime soon, but I am not really willing to give a shit about Jane and her bachelor's degree in economics, business, or whatever PMC BS, and how ChatGPT made her education useless. I mean gee, why did it take that long to realize the course was BS anyway?

    • UnicodeHamSic [he/him]
      ·
      2 years ago

      Actually, because so much of medicine involves knowledge that is broad rather than deep, it's a prime target for AI research, especially given how much could be saved. Worth noting that in lots of places they're already automating medicine the old-fashioned way: just standardizing treatment for common issues to minimize the billable provider hours per case. In some areas that has improved the quality of and access to care. We'd expect the same of AI screening patients, which in a better world would improve things.

      • usernamesaredifficul [he/him]
        ·
        2 years ago

        AI researchers, when they talk about medical AI, think of it as a crude diagnostic tool for doctors, one that suggests possible diagnoses. AI is not a substitute for an experienced doctor's judgement.

        • TheCaconym [any]
          ·
          edit-2
          2 years ago

          OK, but isn't that the main task of a GP doctor? Pattern recognition based on symptoms, test results, and medical history?

          The actually human part of healthcare seems, mostly, to be done by nurses, not doctors.

          • usernamesaredifficul [he/him]
            ·
            edit-2
            2 years ago

            yeah, but AI is just not that sophisticated in its reasoning; like I said, it's a crude tool. And the more sophisticated AI models don't explain their reasoning well

            • TheCaconym [any]
              ·
              2 years ago

              Ah, fully agreed. Properly formatting the input data (especially ongoing symptoms) for the model would also be a nightmare; and yeah, there's a whole field about trying to pinpoint how the black box reached a decision - and it's not making much progress.

              Eventually, though? It's really one job I could see mostly automated, earlier than most others.

              • usernamesaredifficul [he/him]
                ·
                2 years ago

                Maybe. Personally, I'm dubious that such an AI would be cost-effective. But I am dubious in general of claims that AI will revolutionise our society.

        • ElHexo
          ·
          edit-2
          4 months ago

          deleted by creator

          • hexaflexagonbear [he/him]
            ·
            2 years ago

            I think the proposed use for AI (at least in medical imaging) is to tune it for a very high false-positive rate and a low false-negative rate, so that it reduces the amount of work radiologists do while missing as few actual positives as possible. That's definitely useful and doesn't seem overly dangerous, but it's not "revolutionary," so it takes a backseat when people talk about AI. I think it's actually a shame that small improvements don't get sexy coverage, since incremental improvements in science and engineering are probably as responsible for modern wonders as the initial revolutionary discoveries.
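            A minimal sketch of that screening tradeoff, with toy data and hypothetical helper names: lower the classifier's decision threshold until it catches every known positive, accept whatever false-positive rate that costs, and have radiologists review only the flagged cases.

```python
# Sketch: tune a binary classifier's threshold for near-zero false
# negatives, accepting a high false-positive rate (toy data only).

def pick_threshold(scores, labels, min_sensitivity=0.99):
    """Return the highest threshold whose sensitivity meets the target.

    scores: model scores (higher = more likely positive)
    labels: ground truth (1 = positive, 0 = negative)
    """
    positives = [s for s, y in zip(scores, labels) if y == 1]
    # Walk candidate thresholds from high to low until enough of the
    # true positives score at or above the threshold.
    for t in sorted(set(scores), reverse=True):
        caught = sum(1 for s in positives if s >= t)
        if caught / len(positives) >= min_sensitivity:
            return t
    return min(scores)

def rates(scores, labels, t):
    """Return (sensitivity, false-positive rate) at threshold t."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= t)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < t)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= t)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < t)
    return tp / (tp + fn), fp / (fp + tn)

# Toy scores: positives tend to score high but overlap with negatives.
scores = [0.95, 0.90, 0.80, 0.40, 0.85, 0.70, 0.60, 0.30, 0.20, 0.10]
labels = [1,    1,    1,    1,    0,    0,    0,    0,    0,    0]

t = pick_threshold(scores, labels, min_sensitivity=1.0)
sens, fpr = rates(scores, labels, t)
# Catching every positive forces the threshold down to 0.40, so half the
# negatives get flagged too; those flagged cases go to a radiologist.
```

            The design choice is exactly the one described above: the model is not a diagnostician, it's a filter, and the cost of never missing a positive is a pile of false alarms for humans to clear.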

    • came_apart_at_Kmart [he/him, comrade/them]
      ·
      2 years ago

      education is weird and very individual, in terms of how one internalizes the information/tools/skills presented. it's further complicated by bourgeois ideology pressuring would-be learners to do some fiscal cost/benefit analysis before, during, and after the formal process, thus limiting the critical analysis of the process instead of expanding it into something more rigorous and complex like "how can i use this to help my community?" or w/e

      i got a lot out of my education, but i went to school with people in my same program who didn't seem to get as much, or even sometimes took, in my opinion, the wrong lesson home. it seems like a lot of people, in general, think of education as merely an info dump where they get a bunch of knowledge, become a subject matter expert, and go forth into the world with that knowledge. naturally, "continuing education" developed within professional associations as a formal mechanism to cover gaps that expand over time, to make sure someone who graduated in the 80s isn't spreading bunk that has been overturned in the interim. but it really points to a cultural problem of people thinking they can or should ever be "done" learning, or done with education.

      anyway, more to the point of AI replacing the thinking labor of humans: this is nothing new. technology and labor have always existed in tension under capitalism, and under feudalism before it. personally, i'm glad i don't have to sit in an office and do long division all day. if they come up with some new way to deploy technology to fuck people like me out of an ok job path, i'll pivot. i'm in my 40s, so it wouldn't be the first time. i've been walking around expecting the next shithammer for 20+ years anyway.