Who won the first Women’s World Cup in soccer?

The United States won the inaugural FIFA Women's World Cup, held in China in 1991, defeating Norway 2-1 in the final.