Florida's vaccine mandate removal: What it means for public health
As Gov. Ron DeSantis's administration prepares to make Florida the first state to remove school vaccine mandates, doctors, parents and public health workers are voicing deep concern for the safety of children, their families and others who could be vulnerable in a disease outbreak.