It occurs to me that before I go on using these terms much more, I should write down definitions.
So, first: Epistemic Ethics. Epistemic ethics has to do with assigning moral value to how much one cares about the quality and provenance of one's knowledge.
The idea is that being skeptical (or credulous) can make you a good (or bad) person.
Let me put it in context: Nowadays you can go into a grocery store and buy broccoli that the merchant assures you was grown in a garden without pesticides. In a country with fair laws. By farmers who weren't being oppressed. That's called provenance: where did it come from, and how did it get to you?
Implicitly, you're a bad person if you don't care where your broccoli came from. If you buy your broccoli at the other store, the one across the street, you get a little twinge of guilt and fear that it might have been grown in toxic waste by slaves (which isn't far from the truth, maybe, but that's not my point).
And yet there's a section in the first store, the one with the happy ethical broccoli, that sells herbal supplements promising to relieve various physical complaints. Most of them say what they're for. But if they do, they also have to say "These statements have not been evaluated by the FDA." That's not a very strong warning, is it? Face it, we've all come to think of the FDA as a lumbering, inept government bureaucracy (mostly because it's true). So if they haven't gotten around to finding out whether this particular flower or root will help me sleep at night, who cares?
But what the warning really should say, in most cases, is, "These statements have not been evaluated by anybody."
The FDA, flawed as it is, is an attempt to put into practice a particular attitude. It's an attitude that has evolved over the past few millennia, and has proven to help people get at the truth. It shows up in philosophy, it shows up in the US constitution, and it shows up in the scientific method. It's hard to articulate, and many people have done a better job of it than I could do here, so I will only try to sum up: it's an attitude of intellectual humility and mutual honesty. It's epistemic ethics.
When the FDA requires a pharmaceutical company to conduct rigorous double-blind clinical trials, it's attempting to enforce epistemic ethics. You can argue, with good cause, about how it's implemented and how the system has evolved, but you must admit that it's done a good job of reducing how often people can get away with making stuff up. It's introduced disincentives against proffering junk knowledge to other people.
Western medicine has flawed standards of proof, and questionable motives, yes. But it sure beats no standards of proof, and obvious motives.
And yet people have turned away from the attitude that brought about the FDA. They're forgetting the motivation because they're angry at the implementation. They're forgetting the moral root of the situation. They're buying products that bear the words "These statements have not been evaluated" as a badge of honor.
So, when I see a flower in a bottle, one aisle over from the morally upright broccoli, it bugs me. It bugs me that I might be living in a society where you can be a bad person for ignoring where your broccoli came from, but you get a free pass to ignore where your knowledge came from.
And that's a problem of epistemic ethics.