Truth = Fiction?

Or should I say Fiction = Truth? When it comes to reality, I prefer fiction writing to anything else. Sounds crazy, right? But if you think about it, it's the only time I read what people really feel, think, and do. More authors hit the mark when they hide behind fictional characters, because those characters face no repercussions from the public.

An author can create a character to speak their mind about sensitive topics that we won't broach unless we're in like-minded company. Characters can spout philosophical truths about how a person should act without hurting anyone's feelings. And they can take you straight to the heart of a matter without hesitation, when we ourselves are afraid to tell a friend something they should hear.

And fiction always paints a more accurate picture of real-life events (like wars or controversial exploits) because the characters are allowed to tell it like it really was or is. I'd never discount nonfiction accounts, but I still say fiction provides more truth. What do you think?