Can Colleges and Employers Legally Require You to Get Vaccinated? It's Complicated.

Some colleges and employers are mandating COVID vaccines, and some states are proposing laws to prohibit vaccine mandates, leaving the unvaccinated to wonder where they stand.

by Megan Redshaw, The Defender
April 29, 2021

A slew of colleges and universities are embracing COVID