I write about technology at theluddite.org

  • 36 Posts
  • 14 Comments
Joined 1 year ago
Cake day: June 7th, 2023


  • This seems like a good read. Bookmarked after a skim.

I'm currently reading "How Labor Powers the Global Economy," and after skimming the OP, this seems like exactly what HLPtGE set out to address. It's a very, very compelling reformulation of capitalist economics and the labor theory of value (LTV) that embraces the inherent chaos of prices without hand-waving them away, like most economists of all persuasions, including Marxists, do today. It's a model that borrows a lot of techniques from statistical mechanics and other physical sciences, so much so that people have started calling the approach "econophysics."


  • I do software consulting for a living. A lot of my practice is small organizations hiring me because their entire tech stack is a bunch of shortcuts taped together into one giant teetering monument to moving as fast as possible, and they managed to do all of that while still having to write every line of code.

    In 3-4 years, I'm going to be hearing from clients about how they hired an undergrad who was really into AI to do the core of their codebase and everyone is afraid to even log into the server because the slightest breeze might collapse the entire thing.

    LLM coding is going to be like every other industrial automation process in our society. We can now make a shittier thing way faster, without thinking of the consequences.


  • I am totally in favor of criticizing researchers for doing science that actually serves corporate interests. I wrote a whole thing doing that just last week. I actually fully agree with the main point made by the researchers here, that people in fields like machine vision are often unwilling to grapple with the real-world impacts of their work, but I think complaining that they use the word "object" for humans is distracting, and a bit of a misfire. "Object detection" is just the term of art for recognizing anything, humans included, and of course humans are the object that interests us most. It's a bit like complaining that I objectified humans by calling them a "thing" when I included humans in "anything" in my previous sentence.

    Again, I fully agree with much of their main thesis. This is a really important point:

    As co-author Luca Soldaini said on a call with 404 Media, even in the seemingly benign context of computer vision enabled cameras on self-driving cars, which are ostensibly there to detect and prevent collision with human beings, computer vision is often eventually used for surveillance.

    “The way I see it is that even benign applications like that, because data that involves humans is collected by an automatic car, even if you're doing this for object detection, you're gonna have images of humans, of pedestrians, or people inside the car—in practice collecting data from folks without their consent,” Soldaini said.

    Soldaini also pointed to instances when this data was eventually used for surveillance, like police requesting self-driving car footage for video evidence.

    And I do agree that sometimes, it's wise to update our language to be more respectful, but I'm not convinced that in this instance it's the smoking gun they're portraying it as. The structures that make this technology evil here are very well understood, and they matter much more than the fairly banal language we're using to describe the tech.


  • theluddite@lemmy.ml to Technology@lemmy.ml · Is SEO killing creativity?
    · edit-2 · 1 year ago

    I'm not sure if that article is just bad or playing some sort of 4D chess such that it sounds AI-written to prove its point.

    Either way, for a dive into a closely related topic, one that is obviously written by an actual human, I humbly submit my own case study on how Google's ad monopoly is directly responsible for ruining the Internet. I posted it here a week or so ago, but here it is in case you missed it and this post left you wanting.


  • Past automation technologies had the most effect on low-skilled workers. But with generative AI, the more educated and highly skilled workers who previously were immune to automation are vulnerable. According to the International Labor Organization, there are between 644 and 997 million knowledge workers globally, between 20% and 30% of total global employment. In the US, the knowledge-worker class is estimated to be nearly 100 million workers, one out of three Americans. A broad spectrum of occupations — marketing and sales, software engineering, research and development, accounting, financial advising, and writing, to name a few — is at risk of being automated away or evolving.

    I'd take that bet, even at outrageous odds. I've now won over 700 dollars betting against self-driving cars with people in the tech world, and another couple hundred against crypto. Some of that even came from my former boss. I think I've won over a grand betting against tech hype in the last 4-5 years.

    Business Insider, in the unlikely event that you read this, DM me. Let's make a bet.


  • The real problem with LLM coding, in my opinion, is something much more fundamental than whether it can code correctly or not. One of the biggest problems coding faces right now is code bloat. In my 15 years writing code, I write so much less code now than when I started, and spend so much more time bolting together existing libraries, dealing with CI/CD bullshit, and all the other hair that software projects have started to grow.

    The amount of code is exploding. Nowadays, every website uses ReactJS. Every single tiny website loads god knows how many libraries. Just the other day, I forked and built an open source project that had a simple web front end (a list view, some forms -- basic shit), and after building it, npm informed me that it had over a dozen critical vulnerabilities, and dozens more of high severity. I think the total was something like 70?
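For anyone who hasn't seen this firsthand, the report I'm describing comes from npm's built-in audit tool. A sketch of the workflow (the repository URL and project name here are hypothetical stand-ins, not the actual project I forked):

```shell
# Hypothetical example: clone and build a small web front end,
# then ask npm what it knows about the dependency tree it just pulled in.
git clone https://example.com/some-forked-project && cd some-forked-project

npm install    # installs the full transitive dependency tree
npm audit      # prints known vulnerabilities, grouped by severity

# Exits nonzero if any critical-severity issues remain -- useful in CI.
npm audit --audit-level=critical
```

The point is that none of those dozens of vulnerable packages are code anyone on the project wrote or even chose directly; they're transitive dependencies pulled in by a handful of top-level libraries.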

    Until now, all code had to be written at least once. With ChatGPT, it doesn't even need to be written once! We can generate arbitrary amounts of code whenever we want! We're going to have so much fucking code, and we have absolutely no idea how to deal with that.