You are asking the wrong person this question.
True. I guess it's a question that I meant for a broader audience than just you and me.
I think it's truly as simple as this: those vaccine mandates came around before literally everything in the US was political. Every single thing is political now. It's the only lens through which people are capable of viewing society.
That seems to be the case with large segments of the country. These days, I try very hard not to involve myself in political discussions, as I find them to be a general waste of my time. Personally, I don't find businesses and schools requiring a vaccine to be "political". It's a matter of health and wellness for the operation as a whole, and should be treated as such.