The term "body image" refers to how a person perceives and feels about their own body. Negative body image has been linked to poorer mental health and increased health risks. Use the following sources to learn more about the sociological and mental health aspects of body image.