Sorry to be talking about sex so much lately, but this is a good question. Generally, women don't need them. Feminists say they enforce male domination. Basically, it's common knowledge that they turn men on, sometimes more than bare breasts. Of course, the same thing could be said for bikini tops. Do women need them? Why? Also, if women wear bikinis to the beach, then why would walking around in average or sexy lingerie anywhere, say at a store or at school, be wrong? Of course, some will say it's because it makes the woman look like a whore. Does anyone get turned on by seeing a bra through a white shirt? (I don't mean on some chubby guy.)