Don’t get me wrong, I’m not defending the current US healthcare system, it’s horrible and riddled with perverse incentives, and should be mostly (if not entirely) nationalized. I’m just not sure how to justify the idea that healthcare is a “right”.
I know that people on the left sometimes compare it to the right to a public defender. I’m not sure that argument really holds up, though, because you only have the right to a public defender under the specific circumstance of being prosecuted by the government for a crime. The logic there is “if the government is going to significantly interfere with your life by arresting you and trying you for a crime, then it at least has to allow you to get legal defense from a qualified attorney, even if you need the government to pay for it.” There’s not, like, a right to a publicly paid lawyer for any and all purposes.
You have a right to legal representation. Are lawyers slaves? You have a right to vote. Are politicians, poll workers, etc., slaves?
Listing off other rights doesn’t work, because the only one they think is real is the right to property, and they can’t imagine that their collection of guns isn’t, by itself, enough to keep someone from shooting them and picking their pocket.
Conservatives want to get rid of public defenders for exactly that reason. They really are evil.