Whether we should tell white lies in our daily lives has long been a subject of heated debate, and people hold very different opinions on it.
Many people believe that, in order to avoid hurting others' feelings, we sometimes need white lies in our lives. A white lie can bring hope and warmth. For example, if you tell a patient the truth about an incurable disease, he will feel hopeless; if instead you tell him a white lie, he may spend the rest of his life with more hope and calm. Others, however, argue that no one should lie, no matter what kind of lie it is, and that in any case everyone deserves to know the truth about their own situation. A white lie is still essentially a lie. When people finally learn the truth, they may feel betrayed, and that sense of betrayal can ruin a precious relationship.
In my opinion, whether we should tell a white lie really depends on the consequences it may cause. Before telling one, we had better consider what the situation is and what the consequences might be.