How to Take Care of Your Body
American culture sends strong messages about what your body should look like, and those messages often shape how you feel about yourself. This dialogue brings together two women at different stages of life, who share the challenges they have faced in developing a healthy body image.