You know, I think this question is more philosophical than most people realize.
Why is it better to tell the truth? Here's what I believe.
I think the world is the way it is because people are, and have always been, unable to tell the truth in all matters. When it's easy, people have no trouble being truthful, but when it's difficult, they shy away from the truth and hide behind little falsities here and there. In trying to spare people's feelings, we end up lying, and the problem with lies is that they never stop at one. One lie leads to another and another, and before you know it, you're in a tornado of lies.
I, for one, think that it is better to tell the truth, no matter how bitter it is, than to lie to spare a person's feelings. I say what I mean and I mean what I say. I feel that if people don't want to lie, then they shouldn't put themselves in positions where they have to. But as I said, it's a philosophical question, and what I explained above is my philosophy.