What does the term "tiger women" refer to?
"Tiger women" is a term that has been used in Western media to describe Asian women, typically casting them as aggressive, domineering, or sexually predatory. The stereotype stands in contrast to the equally reductive image of Asian women as passive and submissive toward their husbands. The term is now considered offensive and derogatory.