Don’t get me wrong: I’m not defending the current US healthcare system. It’s horrible, riddled with perverse incentives, and should be mostly (if not entirely) nationalized. I’m just not sure how to justify the idea that healthcare is a “right”.

I know that people on the left sometimes draw a comparison to the right to a public defender. I’m not sure that argument really holds up, though, because you only have the right to a public defender under the specific circumstance of being prosecuted by the government for a crime. The logic there is: “if the government is going to significantly interfere with your life by arresting you and trying you for a crime, then it at least has to allow you to get legal defense from a qualified attorney, even if the government has to pay for it.” There’s not, like, a right to a publicly paid lawyer for any and all purposes.

  • TraschcanOfIdeology [they/them, comrade/them] · 2 years ago

    and all life as members of a social species, really)

    The eco-feminist tradition goes further and points out not only that other people's labor (mostly uncompensated, especially when it comes to women) keeps every one of us alive and healthy, but also that the labor of creating and nurturing life is not reciprocated by a human species that sees ecosystems as a source of resources to be extracted.