The Science of Nutrition

We are constantly exposed to information about
food, diet and nutrition from magazines, newspapers,
books, food labels, advertisements, news broadcasts,
TV shows; not only from registered dietitians,
but from all sorts of health professionals
who may have minimal knowledge of food and
nutrition, as well as chefs, fitness instructors,
gurus of all kinds and other unspecified experts.
It’s easy to accept an idea because we like
it, because it sounds right, because it makes
sense, or because we wish it were true; because
we trust the person who supports it, or because
it worked for someone else.
But none of the above reasons belong to the
realm of science. Science is based on solid
evidence and a specific method which is used
to generate and evaluate that evidence. It’s
the only objective tool we have to tell what’s
true from what isn’t, what is merely a brilliant
idea from what is actually proven by the facts,
what makes sense from what actually is, and what
we wish were true from what is true indeed.
It would be wrong to think that science
is made only of evidence and measurable
facts. With those alone, science doesn’t
advance. The beauty of science is that we
are allowed to freely use our intuition and
imagination to try to explain the facts we
observe, even when that means formulating
the most absurd and provocative hypotheses,
and designing the most revolutionary theories,
BUT what we are NOT allowed to do is to
confuse speculation with facts.
If an author wants to write a diet book, say
the purple cucumber diet, and explain why
he thinks this is the best solution for losing
weight and staying healthy forever, he has every
right to do so. But what he cannot do is
sell what is nothing more than his own
ideas as if they were proven facts or universally
accepted truths. Unfortunately, many popular
diet books do just that.
So before we even start discussing food and
nutrition, I want to give you a couple of
tools to help you be more objective when
you are faced with new information concerning
diet, food and nutrition.
The very first thing we need to understand
is the difference between anecdotal evidence
and controlled experiments.
In most popular diet books you’ll find case
reports that may sound more or less like this:
There he was, Mr. So-and-So, sick and wasting away,
his doctors had already given up on him, and
then I came along and had him follow my purple
cucumber diet, and lo and behold, within a couple
of months all of his symptoms had disappeared,
his doctors were bewildered, his body weight
was normal, his blood tests were perfect,
he had never felt so well and energized in his
whole life, and so on and so forth.
Well, this is what we call an anecdotal case,
that is, the story of one specific episode.
Even accepting the truthfulness of the story,
in science, this has no demonstrative value
whatsoever because there’s nothing to indicate
that it’s not just a coincidence.
Maybe the unexpected recovery of Mr. So-and-So
was indeed due to the purple cucumber diet,
but maybe it had nothing to do with it, and
he got better for completely different reasons.
Maybe he would have improved even without
the diet.
Imagine you have a cold, and I tell you to
wear a red t-shirt. The following day you
are feeling better, and I come and say “See?
I was right that wearing red makes the cold
go away”. You would likely answer, “What
are you talking about? That has nothing to do
with it. My cold resolved by itself because
I stayed in bed, the fact that I was wearing
a red t-shirt was just a coincidence”.
And you would be right, of course. But then
again, who knows? Maybe it WAS the red t-shirt.
Maybe it WAS the purple cucumber diet that
made Mr. So-and-So get better. How can we find
out? The tool that scientists have to answer
this question is a controlled experiment.
We could select twenty patients just like
Mr So&So, and then randomly assign half of
them to follow the purple cucumber diet, and
the other half to follow a regular diet, everything
else being the same. If patients in the purple
cucumber diet group get better significantly
more or significantly faster than patients
following the regular diet, then we can say
that there is an association between the purple
cucumber diet and the patients’ recovery.
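The random assignment described above can be sketched in a few lines of code. This is a toy illustration: the patient labels, the group size of ten per arm, and of course the purple cucumber diet itself are all hypothetical.

```python
import random

random.seed(42)  # fixed seed so the assignment is reproducible

# Twenty hypothetical patients, as in the example above.
patients = [f"patient_{i:02d}" for i in range(1, 21)]

# Shuffle, then split in half: the only systematic difference
# between the two groups is the diet each is assigned to.
random.shuffle(patients)
diet_group = patients[:10]     # follow the purple cucumber diet
control_group = patients[10:]  # follow a regular diet

# Every patient lands in exactly one group.
assert len(diet_group) == len(control_group) == 10
assert set(diet_group).isdisjoint(control_group)
```

After the follow-up period, you would compare recovery rates between the two groups with an appropriate statistical test; only then could you speak of an association.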
But bear with me one more minute. The fact
that we have now proven the existence of an
association still doesn’t imply that there
is a cause-effect link.
For example, did you know that ice-cream sales
are associated with jellyfish stings in coastal
areas? Indeed, it has been shown that on
summer days when ice-cream sales are higher,
jellyfish stings increase. Does that mean that
eating ice-cream somehow attracts jellyfish?
Maybe, but it is much more likely that
there is no direct link between these two
events, and that they are both consequences
of a third variable: heat. On hotter days,
people buy more ice-cream to relieve the
heat, and for the same reason they also
spend more time swimming in the ocean.
Because they are in the water more, the odds
of being stung by a jellyfish increase. The fact
that two events occur together, doesn’t
imply that one causes the other. In Greece,
researchers observed that there is a very
high consumption of olive oil and a very low
incidence of cardiovascular disease. Does
this prove that olive oil prevents cardiovascular
disease? Not necessarily. Maybe they use a
lot of olive oil because they grow a lot of
olives, because there is a lot of sun; and
because there is a lot of sun, they make a
lot of vitamin D in their skin, which protects
them from cardiovascular disease. So always
remember, association doesn’t imply causation.
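The ice-cream and jellyfish story can even be simulated. In the toy model below (all numbers invented for illustration), temperature drives both variables, and they come out strongly correlated even though neither has any influence on the other.

```python
import random

random.seed(0)

# 200 simulated summer days: temperature drives BOTH outcomes,
# but ice-cream sales and jellyfish stings never touch each other.
temps = [random.uniform(20, 38) for _ in range(200)]
sales = [10 * t + random.gauss(0, 20) for t in temps]
stings = [0.5 * t + random.gauss(0, 2) for t in temps]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(sales, stings)
print(f"correlation between sales and stings: {r:.2f}")
assert r > 0.5  # strong association, zero causation
```

A high correlation coefficient falls straight out of the shared dependence on temperature, which is exactly why association alone can never establish a causal link.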
Designing controlled experiments is not always
possible, especially when we want to evaluate
the effect of a whole dietary pattern over the
long term, because such trials would be prohibitively
long and expensive, and require an incredibly
large number of study subjects. In these cases,
scientists rely on epidemiological evidence,
that is studying the outcomes of a large number
of real-life cases and controls, chosen with
appropriate and objective criteria, in numbers
large enough for the outcomes to reach
statistical significance.
The big problem here is of a different kind,
and it’s what we call confounding factors.
Suppose we want to see if a vegan diet carries
a lower risk of heart disease. Well, you may
think, that’s easy: let’s just find a
bunch of vegans, say one thousand, and then
one thousand meat eaters, and let’s just
record their rate of heart disease over time
and compare them to see if it’s lower in
vegans. Well, if we do it like that, chances
are that indeed we will find out that vegans
have lower rates of heart disease. But is
it because they don’t eat meat? Most people
who go on a vegan diet are health-conscious in
the first place. They are more careful about
designing a healthful, balanced diet in general.
On top of that, they likely don’t smoke,
they don’t drink alcohol, and they are on
average more physically active. No wonder
they suffer less heart disease. But if
we want to tease out specifically the contribution
of the vegan diet in itself, we will have
to adjust for all the confounding factors.
That is, everything else being equal, same
level of physical activity, same use of alcohol
or tobacco, and so on, do vegans still have
less heart disease than meat eaters? As you
can imagine, doing such an investigation is
extremely difficult. If you are curious about
the answer to this specific question, in light
of the evidence we have, it turns out that
if meat eaters also eat enough fiber, fruit
and vegetables, their risk for heart disease
is about the same as that of vegans.
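One simple way to approximate the adjustment described above is stratification: compare disease rates between diets only within groups that share the same smoking status and activity level. Here is a minimal sketch with a handful of invented records; real epidemiological adjustment uses far larger samples and more sophisticated models.

```python
from collections import defaultdict

# Hypothetical records: (diet, smoker, physically_active, has_heart_disease).
records = [
    ("vegan",    False, True,  False),
    ("vegan",    False, True,  True),
    ("omnivore", False, True,  True),
    ("omnivore", False, True,  False),
    ("omnivore", True,  False, True),
    ("omnivore", True,  False, True),
]

def rates_by_stratum(rows):
    """Disease rate per diet, computed separately inside each
    (smoker, active) stratum, so diets are compared like-with-like."""
    strata = defaultdict(lambda: defaultdict(list))
    for diet, smoker, active, disease in rows:
        strata[(smoker, active)][diet].append(disease)
    return {
        stratum: {diet: sum(flags) / len(flags) for diet, flags in diets.items()}
        for stratum, diets in strata.items()
    }

rates = rates_by_stratum(records)
# Within the non-smoking, active stratum the two diets can now be
# compared directly: in this toy data both show a 50% rate.
assert rates[(False, True)] == {"vegan": 0.5, "omnivore": 0.5}
```

The point of the sketch is only the mechanism: once confounders are held fixed within each stratum, any remaining difference between diets is easier to attribute to the diet itself.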
But what’s important to remember now is
that one or a few studies are never enough
to draw a final conclusion about the reality
of things.
The universally accepted nutritional guidelines
that we will study in this course are never
based on just a handful of studies, but on a
large body of numerous, consistent and replicable
pieces of evidence generated at many
different levels (mechanistic, cellular,
biochemical, chemical, epidemiological,
clinical), in
a way that everything fits and points to the
same conclusion, convincingly supporting a
theory to the point that we are confident
to make practical recommendations to the population.
Unfortunately, many popular diet books and
gurus don’t follow the same rigorous standards
and make recommendations, or worse, base entire
theories, on just one or two scientific studies,
and sometimes not even that.
“The devil can cite Scripture for his purpose,”
as the saying goes. And this is true of scientific
studies published in the peer-reviewed literature
too. Making a statement and referencing a
scientific study always creates a sort of
authoritative aura, but if you hand-pick and
magnify the studies that support your ideas,
and ignore those that go against them, all you
are doing is distorting reality. If you conveniently
select the parts of those studies that you like,
or report their findings in an inflated way,
you are not being objective. If, for the studies
whose conclusions you find convenient, you
focus on the outcomes, while for those that go
against you, you focus on the flaws, limitations,
or conflicts of interest, you are not being
honest. Every study has limitations.
Simply referring to scientific studies isn’t
enough to draw conclusions. Take these two
books: The China Study and The Paleo Diet. They
propose eating models that are diametrically
opposed: one advocates a totally plant-based
diet, the other recommends meat for
breakfast. And yet both books are not only
interesting and very well written,
they are also based on solid evidence and
written by respected and honest scientists.
How is that possible? The thing is, the right
way to use scientific evidence is to ask
a question and then look for all the available
evidence, and where it is lacking, design new experiments.
That way you will find evidence pro and evidence
against, and you can weigh the evidence.
The wrong way of using the scientific evidence,
on the other hand, is to start with a pre-set
mind and ready-made conclusions, and then
look for evidence to support them. Chances
are that in the hundreds of thousands of studies
that have been performed in the history of
science, you will always find something that
agrees with your ideas. But if you focus on
that alone and ignore everything else, you
are doing a disservice to science.
Mass media and social media often report such
oversimplified versions of the information
to the point that it becomes distorted. It is one
thing to run an experiment and find that
a molecule in red meat may create a toxic
compound in your intestine; it is a very different
thing to tweet “researchers finally prove
that red meat can kill you”. If you read
a scientific paper, you’ll find plenty of
‘may’, ‘this suggests’, ‘we hypothesize’,
‘it could potentially result in’ and then
‘the limitations of this study are’, ‘more
research is needed to confirm’, and so on.
These “grey zones” usually get lost when
the news trickles down to the media, blogs
and social networks. And because of that,
it is very common to be faced with apparently
conflicting information, which only confuses
and frustrates the consumer.
“Scientists discover that chocolate prevents
cancer”, you may read one day. The following
month, you may read another headline that
says “A team of researchers from the University
of Southern California detected a toxic compound
in chocolate”. What does that mean? Is chocolate
bad or good? Which one of them is lying? How
can chocolate beat cancer one day, and be
toxic the other day? Very likely, what study
number one really found was that chocolate
consumption lowered blood concentrations of
a pro-inflammatory molecule. While this has
the potential of being a cancer-protective
outcome, it certainly doesn’t mean that
eating chocolate in itself can prevent cancer.
Study number two may have found a few micrograms
of a compound that is toxic only at much,
much higher concentrations. But this
doesn’t mean that the levels detected in
chocolate posed any practical risk for health.
Both findings were honest, it’s the headline
conclusions that were oversimplified and dramatized,
thus failing to put the information in perspective.
But who really bothers to go back and check
the original source after reading this kind
of news? Most people don’t even get past
the headline to read the whole article. A
re-tweet may be all they get about the story.
But reality is always more complex than 140
characters.
Risk assessment.
Imagine you see a guy walking down the street
wearing a helmet. Why are you wearing a helmet?
you ask him. For fear that something may fall
from the sky and hit me on the head, he replies.
You would probably think that taking this
protective measure is overkill. It is indeed
possible that a random falling object may
hit him, but it is so unlikely that it doesn’t
make much sense to worry about it. But now
imagine that after a few seconds, you see
this same guy crossing a busy street against a
red light without even bothering to look
left and right. Now you would certainly think
that something must be wrong with the guy:
he has his priorities completely wrong.
Well, when it comes to food, many of us behave
just like this guy. We are overly concerned
with issues that are far less pressing than
is generally thought, such as the potential
toxicity of food additives, food contaminants
or naturally occurring substances, residues
of pesticides, antibiotics and other agrochemicals,
or packaging migrations.
And while our perceived risk of minor problems
is exaggerated, the risk of other, much more
impactful issues concerning eating behavior
in general, such as eating too much salt or
not enough vegetables, is often underestimated.
An important tool to objectively assess the
impact of a potential danger is part of the
risk analysis process. In particular, when
we consider a danger we must take into account
its severity as well as its probability.
Take the danger of an airplane crash, for
example. Its severity would be very high,
but its probability is so low that we consider
air travel to be reasonably safe.
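The severity-times-probability idea can be written down directly. The hazards, the 0-to-10 severity scale and every number below are purely illustrative, not real risk data.

```python
# Toy risk scores: a hazard's risk combines how bad the outcome is
# (severity, arbitrary 0-10 scale) with how likely it is to occur.
# All numbers are invented for illustration only.
hazards = {
    "airplane crash":             {"severity": 10, "probability": 1e-7},
    "decades of excess salt":     {"severity": 6,  "probability": 0.3},
    "pesticide residue exposure": {"severity": 4,  "probability": 1e-6},
}

def risk(hazard):
    return hazard["severity"] * hazard["probability"]

ranked = sorted(hazards, key=lambda name: risk(hazards[name]), reverse=True)
# The everyday, high-probability hazard dominates the ranking even
# though its worst case is far less severe than a plane crash.
assert ranked[0] == "decades of excess salt"
```

This is the same reasoning as the airplane example: a very severe outcome with a tiny probability can still score far below a moderate outcome that is almost certain over a lifetime.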
The same line of reasoning applies to the
potential toxicity of, say, artificial sweeteners,
alkaloids in herbs and spices, or residues
of pesticides.
In places like the US and the European Union,
we have legislation and agencies that worry
about these issues so we don’t have to. They
perform risk assessments for naturally
occurring toxins or contaminants, they regulate
and monitor the acceptable levels of residues,
what additives can be used in each food and
in what amount, and so on, so that the levels
of all these substances are reasonably safe
for consumers.
As Paracelsus said, all things can be poisons,
depending on the dose. Even water can kill
you, if you drink 20 liters in a row. The
fact that a particular food color may be dangerous
at much higher doses than what you can possibly
get even if you eat a variety of food containing
it, doesn’t mean that this additive poses
any reasonable threat at the doses it’s
found in food. For some other substances,
like some preservatives, it is considered
that the benefit of using them far outweighs
the risk. Similarly, the benefit of eating
an apple with all its skin far outweighs the
risk of its negligible amounts of pesticide residues.
Every day we are exposed to thousands of substances
that are potentially toxic, in the food we
eat, the water we drink, the air we breathe,
the clothes we wear, the products we use to
wash our body, our house and our clothes,
and so on. The fact that some of these substances
may cause an allergic reaction or contribute
to initiating cancer in 1 in 50 million people
does not justify worrying about them more than
we worry about being struck by lightning or
dying in an airplane crash.
If you still have to learn how to balance
your lipids, control glycemic load, choose
whole grain products over refined ones, eat
legumes, limit meat consumption, avoid excess
salt, optimize fruit and vegetable consumption,
and all those things that constitute the basis
of a health-promoting and disease-preventing
diet, it doesn’t make any sense to worry
about bisphenol A in plastic bottles, the
toxicity of food preservatives, or the effect
of microwave radiation on food nutrients.
These are not the priorities. Because excess
salt, saturated and trans fats, insulin peaks
and lack of antioxidants will have already
killed you long before pesticide residues
have even started to harm you.
In conclusion, when you come across new information
concerning food and nutrition, I strongly
encourage you to always put everything in perspective.
Who is saying this? On what grounds? What’s
the evidence supporting it? How was it generated?
Is it just an anecdotal case, an epidemiological
study, or a controlled intervention trial?
How much evidence backs up this idea? Is it
just one study, or many? What is the real
impact of this finding?
Don’t trust those who ‘preach from on high’,
who tell you what you should and shouldn’t
do without explaining why and without giving
the supporting evidence. Nutrition is a science.
It cannot be based on what sounds reasonable,
what we wish were true, or what gurus or so-called
“experts” say. It cannot be based on faith,
but only on solid, convincing and repeated evidence.
I will leave you today with the words that
Carl Sagan used to describe Johannes Kepler,
who spent a large part of his life trying
to prove something that eventually turned
out to be wrong: “When he found that his
long-cherished beliefs did not agree with
the most precise observations, he accepted
the uncomfortable facts, he preferred the
hard truth to his dearest illusions. That
is the heart of science.”
