
What Florida ending vaccine mandates could mean for rest of US

Florida’s surgeon general announced last week that the state is moving to “end all vaccine mandates.” Experts explain what this could mean for the U.S.

Dr. Joseph Ladapo said the Florida Department of Health would be working with Gov. Ron DeSantis’ office to end all mandates in …

Source: ABC News | Published: 2025-09-09T14:39:48Z

