Where did you learn it from? Just like this post says, we didn't come out of the womb feeling like this; it's taught. Who taught you to hate yourself?
What have you done to change this perception?
The negative thoughts we have about ourselves aren't true. We aren't meant to put ourselves down. We're meant to uplift ourselves. Dress up, do your hair, go out, take pics, etc. Do whatever it is that will help you change how you view yourself so you can start to be kind to yourself💚.
