8 Movies That Changed The Way We View Women’s Health Issues

Although it can be argued that Hollywood has at times been a negative influence on women's health issues (body image and weight, anyone?), the movie industry has also played an important role in bringing other uncomfortable-to-talk-about health issues to light. Movies with topics that people were once afraid …
