• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: August 8th, 2023



  • Good point with the line! Some of the best liars are good at pretending to themselves they believe something.

    I don't think it's widely known, but it is known (there are old sneerclub posts about it somewhere), that he used to feed the people he was dating LSD and try to convince them they "depended" on him.

    First time I met him, in a professional setting, he had his (at the time) wife kneeling at his feet wearing a collar.

    Do I have hard proof he's a criminal? Probably not, at least not without digging. Do I think he is? Almost certainly.






  • As you were being pedantic, allow me to be pedantic in return.

    Admittedly, you might know something I don't, but I would describe Andrew Ng as an academic. These kinds of industry partnerships, like the one in that article you referred to, are really, really common in academia. In fact, it's how a lot of our research gets done. We can't do research if we don't have funding, and so a big part of being an academic is persuading companies to work with you.

    Sometimes companies really, really want to work with you, and sometimes you've got to provide them with a decent value proposition. This isn't just AI research either, but very common in statistics, as well as biological sciences, physics, chemistry, well, you get the idea. Not quite the same situation in humanities, but eh, I'm in STEM.

    Now, in terms of universities having the hardware, certainly these days there is no way a university will have even close to the same compute power that a large company like Google has access to. Though, "even back in" 2012 (and well before), universities had supercomputers. It was pretty common to have a resident supercomputer that you'd use. For me (my background's originally in physics), back then we had a supercomputer in our department, the only one at the university, and people from other departments would occasionally ask to run stuff on it. A simpler time.

    It's less that universities don't have access to that compute power. It's more that they just don't run server farms. So we pay for it from Google or Amazon and so on, like everyone in the corporate world, except of course the companies that run those servers (they still have to pay costs and lost revenue). Sometimes that's subsidized by working with a big tech company, but it isn't always.

    I'm not even going to get into the history of AI/ML algorithms and the role of academic contributions there, and I don't claim that the industry played no role; but the narrative that all these advancements are corporate just ain't true, compute power or no. We just don't shout so loud or build as many "products."

    Yeah, you're absolutely right that MIRI didn't try any meaningful computation experiments that I've seen. As far as I can tell, their research record is... well, staring at ceilings and thinking up vacuous problems. I actually once (when I flirted with the cult) went to a seminar that the big Yud himself delivered, and he spent the whole time talking about qualia, and then when someone asked him if he could describe a research project he was actively working on, he refused to, on the basis that it was "too important to share."

    "Too important to share"! I've honestly never met an academic who doesn't want to talk about their work. Big Yud is a big let down.



  • My worry in 2021 was simply that the TESCREAL bundle of ideologies itself contains all the ingredients needed to “justify,” in the eyes of true believers, extreme measures to “protect” and “preserve” what Bostrom’s colleague, Toby Ord, describes as our “vast and glorious” future among the heavens.

    Golly gee, those sure are all the ingredients for white supremacy these folks are playing around with. Good job there are no signs of racism... right, right?!?!

    In other news, I find it wild that big Yud has gone on an arc from "I will build an AI to save everyone" to "let's do a domestic terrorism against AI researchers." He should be careful, someone might think this is displaced rage at his own failure to make any kind of intellectual progress while academic AI researchers have passed him by.

    (Idk if anyone remembers how salty he was when AlphaGo showed up and crapped all over his "symbolic AI is the only way" mantra, but it's pretty funny to me that the very group of people he used to call incompetent are a "threat" to him now that they're successful. Schoolyard bully stuff and wotnot.)