A set of ideas or beliefs that are taught as (and proclaimed as) truths by an authoritative body within a tradition. Followers of that same tradition are expected to accept these teachings, and in some cases it is expected that people outside of the tradition should accept these truths as well.
I'd like to hear why any of this should be considered good.
On a personal level, it's very dangerous to blindly follow everything someone tells you. We would never advise our children to just believe everything they're told, would we?
"Hey little Johnny, don't ever drink water. Water is filled with flouride and makes you easier to control. Instead, drink Brawndo!"
Why then do we accept such a practice from adults? Is it just because the "ideas or beliefs that are taught" come from a (proclaimed) authority figure?
Well, I'm sorry, but f#ck that. Until someone (or something) demonstrates to me that their values, opinions, or teachings can be consistently trusted, they don't get a free pass to a voice in my life. This was true even of my parents from the time I was very little. There were a few things they were right about - and on those topics I will approach them with questions or discussions. There were also many things they were wrong about... It doesn't make sense to put much stock in their advice on things they've spent their whole lives not knowing much about, does it?
All of that simply raises the question of where authority comes from... Nothing has any authority in our lives unless we choose to grant it. Some people give away their authority all willy-nilly, choosing to openly value any trash that is handed to them. I just think that's foolish. Before I eat a whole plate of something, I taste it first to see if I like it. If I don't like it, I don't eat it. It's as simple as that.