Do people ever talk about morals anymore?
I'm not sure, but after a few articles I read recently, I don't think we talk about them nearly enough.
The first story was about a man who was shot on a relatively full subway car, yet no one saw the shooter because they were all too busy on their phones or computers.
I just thought that was sad.
Then I read an article about telling a "little white lie." Now, this wasn't one of those little white lies about how someone looks or anything like that.
No, this was the kind of white lie you tell someone to cover something up.
I wonder at what point a "little white lie" becomes a bald-faced lie.
Is it when someone does it to you?
What happened to just being honest with people and telling them what you think? I would much rather someone say something to me that might hurt my feelings than lie to my face.
We seem to be so wrapped up in trying to conform to what's happening in the world that we forget what is right and wrong.
Are we at the point where nothing is wrong anymore? Or are we just not bold enough to say anything about it?
I don't know, but I think it's sad.