Sid and I were watching TV last night when this stupid commercial for some scar lightening cream came on. The woman in the commercial was "so ashamed" of her scars, and this cream was able to make her feel good about herself again. You know, the typical gimmicky line of BS.
Why be ashamed of a scar? Why try to hide it, or lie about it? I don't understand.
My mother got burned pretty badly on her arm about ten years ago, and it got infected, and left a pretty funky scar. It's barely noticeable, but if you know what to look for, it's a patch of slightly lighter, bumpy skin on her forearm. And she was so ashamed of it, it bothered her so much. One day I walked in on her complaining about it, how it was going to "mark her for the rest of her life." And I was gobsmacked. I'd never considered it like that.
When I was four, I fell off a swing and broke my arm. Really badly. We're talking bones-sticking-out, arm-twisted-around broke my arm. Where the bones stuck out, where the doctors cut it open to try and repair the damage, I have a pretty spectacular zig-zag scar on my forearm, about five inches long. Stupid people have seen it and asked appropriately stupid questions (DID YOU TRY TO CUT YOUR WRIST?). But I have never been ashamed of it. I don't try to hide it, and therefore, people don't really notice it. It is just part of who I am, and most people will not mention it, indeed, will not even see it until I specifically point it out.
I told my mother that scars were nothing to be ashamed of, that they were merely something that meant we had healed from physical trauma. If anything, we should be proud of our scars, because if you believe they "mark us for life," they mark us as people who have hurt, who have been scared, who bled and cried. They are testaments to pain, and to healing.
She looked at me cockeyed, but she never really complained about her scars again. Maybe she just thought I was crazy.