This got me thinking. No one cares whether you are getting enough nutrients unless you are vegan. If I were eating McDonald's every day or living on processed food, no one would ask me about protein, calcium, or any other nutrient. But once you tell people you eat a whole-foods, plant-based diet, they suddenly become experts on nutrition and food.
I mean, since when is eating whole plant foods like beans, fruits, vegetables, rice, nuts, and seeds not healthy? Growing up, our parents told us to eat our vegetables, but when all you eat is vegetables, it is suddenly considered unhealthy. What's up with that?