ETHICS? What is that? For one thing, children aren't taught ethics or honesty anymore; it is all about ME, how I can get what I want, and I want it now! For the last generation, parents were told, "You cannot discipline your children; we are watching you," and the kids know it. Then, when they hit their teen years and get a job, they see adults screwing everyone around them to get ahead. Ethics went out the window a long time ago. Look at our political leaders: there are no more ethics. It is sad but true; the country's social outlook has changed. We don't demand respect from others, we are supposed to be politically correct when we speak, we are told not to voice our opinions, and we can't expect our co-workers to actually work as directed; the list goes on and on.
Re: What Happened to Ethics?