I really just don't understand the stigma that people place on sex work. It's literally just a job. What's even the big deal?
Shame is a powerful tool for controlling people. Stigmatize a natural act, and you control how people behave. Control how people behave, and you have a legitimized social power structure.
This isn't to dismiss the history of sexually transmitted diseases, which are genuinely worth being careful about, but in the end the stigma is a way to keep a populace under the thumb. Take one example: the Sexual Revolution of the 1960s and 70s, when sex outside traditional norms was being normalized, was followed by the AIDS epidemic. But instead of treating the actual problem, AIDS, authorities doubled down on stigmatizing sex, especially homosexual relationships, using AIDS as a reason not to have sex at all. You can still see the aftereffects today; as recently as a decade ago, being gay was blamed for the supposed fall of Western (cough, US, cough) society.
The scapegoats have changed over time and from region to region, but the motive is always the same no matter where you go.