
Our Evolving Food Environment



Origins of Food Science

The origins of food science, rooted in chemistry, microbiology, biology and processing, as well as our modern understanding of nutrition, date to the end of the 1800s and were established through a series of discoveries, many of which made their researchers Nobel laureates. Louis Pasteur's studies of wine diseases led to the germ theory of disease, showing that diseases could be prevented by killing or excluding germs, and to the invention of pasteurization to stop bacterial contamination of milk and wine. Pasteur's work established the origins of food processing; he is also recognized for his role in developing vaccines for rabies, cholera and anthrax. Robert Koch, born in 1843, was the father of modern microbiology; he established Koch's postulates for determining the origins of microbial infections and was awarded the Nobel Prize in Medicine in 1905 for his investigations of tuberculosis, alongside his studies of anthrax, cholera and malaria.

Koch's postulates comprise four criteria for identifying the causative agent of a disease: (1) the microorganism must be found in diseased but not healthy individuals; (2) the microorganism must be cultured from the diseased individual; (3) inoculation of a healthy individual with the cultured microorganism must recapitulate the disease; and (4) the microorganism must be re-isolated from the inoculated, diseased individual and matched to the original microorganism. Through these postulates, Bacillus anthracis, Mycobacterium tuberculosis, and Vibrio cholerae were identified; when a cause could not be determined, Koch developed sanitation measures for rinderpest (cattle plague) and surra of cattle, and treated malaria with quinine, giving rise to the gin and tonic.
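The four postulates amount to a simple all-or-nothing decision procedure. A toy sketch (function and flag names are illustrative, not any standard API; real attribution of a pathogen requires far more evidence than four booleans):

```python
# Toy checklist encoding Koch's four postulates.
# All parameter names are hypothetical, chosen for readability.

def satisfies_kochs_postulates(found_in_diseased, absent_in_healthy,
                               culturable, inoculation_reproduces_disease,
                               reisolate_matches_original):
    """Return True only if all four criteria hold."""
    criteria = [
        found_in_diseased and absent_in_healthy,  # (1) association with disease
        culturable,                               # (2) isolation in pure culture
        inoculation_reproduces_disease,           # (3) causation upon inoculation
        reisolate_matches_original,               # (4) re-isolation and match
    ]
    return all(criteria)

# A candidate microbe that cannot reproduce the disease on inoculation
# fails the procedure, no matter how often it is found in the sick:
print(satisfies_kochs_postulates(True, True, True, False, False))  # → False
```

The strictness of criterion (3) is what makes the procedure powerful: mere association is never enough to declare causation.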

Ivan Petrovich Pavlov was born in 1849 and was awarded the Nobel Prize in Medicine in 1904 for his studies of the nervous system and the physiology of digestion, which underpin our modern understanding of the gastrointestinal tract. He is most famous for work on the conditioned reflex and the salivary response of dogs at the sight of food, but the surgical fistulas he developed to enable this work were equally important: fistulas allowed organ function to be observed continuously under normal conditions, opening a new era in the development of human physiology.

The advent of physical chemistry, with the origins of colligative properties and water activity, reaction kinetics, crystallization theory, Ostwald ripening, color science, and the term 'mole,' owes much to the polymath Friedrich Wilhelm Ostwald, who was awarded the 1909 Nobel Prize in Chemistry for his work on catalysis, chemical equilibria and reaction velocities, all of which are innately important to food structure, stability and shelf-life. Remarkably, three of his Ph.D. students also rose to fame: Jacobus Henricus van't Hoff (Nobel Prize 1901) for the laws of chemical dynamics, Svante August Arrhenius (Nobel Prize 1903) for the electrolytic theory of dissociation, and Walther Hermann Nernst (Nobel Prize 1920) for work in thermodynamics.

Concurrent with this time frame were the origins of our understanding of nutrition, as micronutrients and their role in preventing deficiency diseases were slowly discovered. Although Hippocrates first described scurvy, James Lind (1747), a Royal Navy surgeon, prevented it by feeding citrus fruits in a controlled trial aboard HMS Salisbury, leading to the issuance of lemon juice to Navy crews. The antiscorbutic nature of such foods was by then known, yet it was not until 1912 that vitamin C was postulated; it was isolated in 1928 and, in 1933, became the first vitamin to be chemically synthesized for mass production. In 1937, Albert von Szent-Györgyi Nagyrápolt was awarded the Nobel Prize for his discoveries concerning biological combustion processes, with special reference to vitamin C. Equally fascinating is the story of Christiaan Eijkman, who, along with Sir Frederick Hopkins, was awarded the Nobel Prize for vitamin theory. Eijkman completed medical and doctoral degrees on the polarization of nerves at the University of Amsterdam, then served as a health officer in the Netherlands (Dutch) East Indies, today Indonesia, until he contracted malaria and returned to Holland shortly thereafter. His return allowed him to study the emerging field of bacteriology with the then-famous Robert Koch. He was then seconded back to Indonesia as part of the Pekelharing-Winkler mission to treat colonial soldiers suffering from an outbreak of unknown origin.

The disease weakened the limbs, progressed to heart palpitations, and ended in asphyxia and death, with no fever throughout. Eight months after arriving, Pekelharing returned to Holland, leaving Eijkman behind, and later wrote that "a micrococcus isolated from the area was almost certainly the cause of the disease." Eijkman, still in Indonesia, was not convinced of the micrococcus origin, as he continually failed to satisfy Koch's four postulates in dogs. He also noticed anomalies in transmission not seen in other diseases caused by microorganisms: those susceptible included women in the final stages of pregnancy, newly enlisted soldiers, the warrior class, and students away from home, while the condition rarely presented in nobility, the richest class, or in rickshaw drivers and the poorest of society, and was completely absent in villages living traditional lifestyles. These anomalies pushed Eijkman to continue searching for the cause. He switched from dogs to chickens but still failed to link the bacterium to the disease; then, all of a sudden, both the control and treatment chicken groups began showing symptoms. Nearly convinced he had identified the cause and that simple cross-contamination was to blame, he successively reproduced the experiment at greater and greater distances until control birds developed leg weakness despite being out of range of possible cross-infection, meaning the disease must have another origin. Dismayed, he asked the chicken handler to check the notes for anything else that had changed when the birds started to get sick; remarkably, the diet had changed in the first flock to fall ill. Birds fed polished white rice developed polyneuritis, mimicking the symptoms of the disease now called beriberi, and were cured upon feeding unpolished brown rice with the 'silverskin,' or bran, still adhering.
Convinced it was something in the polishings that cured beriberi, he returned to Holland, was succeeded by Grijns, and notified prisons with beriberi outbreaks to switch to whole rice, which ultimately cured the disease, now known to be caused by a thiamine (vitamin B1) deficiency. Similarly, Minot and Murphy (Nobel Prize 1934) were credited with treating pernicious anemia, a vitamin B12 deficiency, whose structure was not determined until the work of Hodgkin (Nobel Prize 1964).

Classical nutrition slowly emerged between 1800 and 1970, establishing a relationship between food and health with a primary focus on preventing nutritional deficiencies (e.g., orange juice preventing scurvy, treating corn with ash preventing niacin deficiency). Fueled by the discovery of micronutrients, optimal nutrition emerged in the 1940s as nutritional components were identified that prevent disease, including fatty acid composition and cardiovascular disease, folic acid and spina bifida, and glycemic sugars and diabetes. Alongside the emergence of genomics in the late 1990s, molecular nutrition, which considers the individual genome and underlies personalized nutrition, still evolves today. Nutritional science has thus moved past the classical concepts of avoiding nutrient deficiencies and basic nutritional adequacy to the concept of "positive" or "optimal" nutrition tailored to the needs of the individual; endurance athletes, for example, have different nutritional requirements than a sedentary professor.


To know where we are, we must first understand where we came from.

Diet as an Environmental Evolutionary Pressure

Charles Darwin first postulated that the "organic world is a product of the operation of discoverable natural forces (evolutionary discourses), and such changes (phenotypic or genotypic) in organisms are not spastic or stochastic" (Darwin, 1859). That is, evolutionary adaptations occur when organisms experience modified external or environmental conditions, resulting in an evolutionary discourse between the environment and the organism's genetic profile.



As a population of organisms evolves within an environment, a set of genotypes (genetic characteristics) is favored; if the environment is modified, those genotypes may no longer be preferential. Hence, environmental changes (also known as evolutionary discourses) can initiate an evolutionary process over successive generations, optimizing the genome in accordance with the new environment. Relevant to food and nutritional sciences: is diet capable of acting as an evolutionary discourse resulting in discernible differences among populations? To establish that it is, a detectable genetic and phenotypic difference among populations must exist. Perry et al. (2007) eloquently made such a connection, linking diet to discernible genetic profiles of isolated populations uninfluenced by the modern global food supply: 1) high-starch consumers (Japanese, European American, and Hadza hunter-gatherers, who rely extensively on starch-rich roots and tubers); and 2) low-starch consumers (the Biaka and Mbuti rainforest hunter-gatherers, the Datog pastoralists, and the Yakut, a pastoralist and fishing society). These populations differ in the copy number (i.e., multi-allelic copy number variants, CNVs) of the gene responsible for salivary amylase, AMY1.

The AMY1 gene is relevant because it shows that a selective pressure (a high-starch diet) acted on amylase, the enzyme responsible for starch hydrolysis. These isolated populations are uninfluenced by the modernization of the global food supply, and, over generations, those consuming high-starch diets came to have, on average, a higher AMY1 copy number than those consuming low-starch diets. Hence, diet acted as a selective pressure on the CNV (Perry et al., 2007), and the higher copy number manifests phenotypically as a greater amylase concentration in saliva.


Although diet composition is a selective pressure, the question now becomes: can the processing of foods represent a large enough selective pressure to drive evolution? Although significantly more challenging to establish, a specific point in human history lends itself well to investigating the effects of food processing on human evolution. The defining point leading to the Anthropocene, and subsequently to the sixth mass extinction of species, was the mastery of fire, which allowed for the magnification of energy output and entropy in nature. This represented a significant turning point in biological evolution, marking Homo as a unique genus; the ignition of fire and its transfer date back at least 1.8 million years (Ma) (Glikson, 2013). Analyses of tooth size by Brace, Rosenberg, & Hunt (1987) clearly illustrate that dietary adaptation occurred in response to changing food environments across Australopithecus anamensis, Australopithecus afarensis and Australopithecus africanus.

Early hominids had small to moderately sized incisors and large flat molars, suggesting a dietary shift at, or near, the stem of hominid evolution (Teaford and Ungar, 2000). Since the late Pleistocene, both tooth and body size have been decreasing; however, the reduction in dentition is much more dramatic than overall body size changes, which led Brace to conclude that since "the middle Pleistocene and onset of the last glaciation, the principal function of dentition has been and still is, we argue, the processing of foods".

The driving force for the selection of reduced dentition was not the type of food consumed, but the processing (i.e., cooking) the food had undergone prior to consumption (Brace et al., 1991). Observed changes in dentition are still apparent in populations today (Brace et al., 1980), and it is well established that the introduction of fire, and the consequent cooking of foods, decreased overall dentition and human tooth size because cooked foods (e.g., gelatinized root vegetables, seeds and meat) are often softer than their raw counterparts.


Our Changing Food Environment

Throughout most of human evolution, diet remained unchanged until the Neolithic era, which represents the first of three agricultural revolutions. At the dawn of the Neolithic, some 11,000-12,000 years ago, humans began to shift from nomadic living to permanent settlements, facilitated by the advent of farming: stock breeding and the cultivation of edible plants allowed less time to be spent hunting and gathering, enabling the development of culture, arts and trades and facilitating population expansion. During the Neolithic era, access to food changed, but the basic composition of the diet remained constant. The second (British) agricultural revolution occurred between the 1600s and 1800s, when agricultural output grew faster than the population, enabling the population of England and Wales to grow from 5.5 million in 1700 to 9 million in 1801. During this time, output per agricultural worker in Britain doubled between 1500 and 1650, and again by 1850, and the rise in productivity accelerated the decline of the agricultural labor force, adding to the urban workforce. The second agricultural revolution coincided with crop rotation (for example, using clover to fix nitrogen into the soil) and deep plowing. The industrial revolution followed and facilitated the onset of the third revolution through mechanical machinery and advanced plant breeding, which increased food production well beyond the needs of the agricultural workforce. The third (Green) agricultural revolution developed modern scientific farming between 1950 and 1960, whereby technology and capital produced high-yielding seed varieties, chemical fertilizers, factory farms (consolidation of land holdings) and widespread mechanization.


Industrial Revolution

Following the second and third agricultural revolutions, excess seasonal food production spurred trade and introduced the need to preserve perishable agricultural commodities. Driven by the innovations of the time, starting with the steam engine and followed by home refrigeration, ready-to-eat meals, and advertising, the food landscape shifted from meals and dishes prepared from whole foods to the Western diet, characterized by high consumption of ultra-processed foods. Although each agricultural revolution increased access to food, the composition of grains, oilseeds and animal-based products remained unaltered until the industrial revolution. Industrialization of our modern food supply gave rise to new foods, diets and nutrient combinations unavailable to preagricultural hominins, accounting for 70% of the average consumer's caloric intake (11% dairy products, 24% whole and refined grains, 19% refined simple sugars, 18% refined vegetable oils, and 2% alcohol) (Cordain et al., 2005). These ingredients are then formulated into ultra-processed foods such as cookies, cakes, breakfast cereals, bread and bagels, pizza, ice cream, and a plethora of sauces and dressings that make up the primary components of the Western diet.

To ensure formulated foods are shelf-stable, significant amounts of monosaccharides, disaccharides, or salt are needed to reduce water activity and limit microbial growth, preventing food spoilage. The average US consumer eats ~70 kg/year of refined sugar, up from 56 kg/year in 1970 and 6.8 kg/year in 1815! The jump from 6.8 to 56 kg/year is accounted for mainly by new sources of crystalline sucrose, while the second increase coincided with the manufacture of high-fructose corn syrup (HFCS) from refined starches; HFCS consumption increased from 0.2 to 29 kg/year between 1970 and 2000. Similarly, consumption of highly refined vegetable oils has steadily climbed since the advent of expeller and solvent extraction, and today accounts for 18% of calories via cooking oils, shortenings and margarine (Cordain et al., 2005). Almost 10 g of salt is consumed daily, with 75% contained in ultra-processed foods, 15% added during cooking or directly to the food, and only 10% from whole foods, reflecting a dietary shift from potassium-rich to sodium-rich salts.
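The water-lowering effect of dissolved sugars and salt can be roughed out with Raoult's law, under which water activity is approximately the mole fraction of water. The sketch below assumes an ideal solution (real foods deviate, and models such as the Norrish equation correct for this); it also assumes NaCl contributes about two dissolved particles per formula unit:

```python
# Ideal (Raoult's law) estimate of water activity: a_w ≈ mole fraction of water.
# Illustrative only; real formulations deviate from ideality.

M_WATER, M_SUCROSE, M_NACL = 18.02, 342.30, 58.44  # molar masses, g/mol

def water_activity(g_water, g_sucrose=0.0, g_nacl=0.0):
    n_water = g_water / M_WATER
    # NaCl dissociates into ~2 ions, so it depresses a_w about twice as
    # effectively per mole as a non-dissociating sugar like sucrose.
    n_solute = g_sucrose / M_SUCROSE + 2 * (g_nacl / M_NACL)
    return n_water / (n_water + n_solute)

# A jam-like syrup, 60 g sucrose per 40 g water; the ideal model gives
# a_w near 0.93, while measured jams sit closer to 0.80 — Raoult's law
# overestimates a_w in concentrated systems.
print(round(water_activity(40, g_sucrose=60), 3))
```

The comparison in the last comment is the practical point: even a crude colligative estimate shows why heavy sugar or salt loads are the workhorses of shelf stability.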

Ancestral wheat varieties were small, difficult to harvest, and minimally digestible without prior processing (milling, refining, and cooking); they contained significantly more germ and bran, the sources of micronutrients and fiber, and less endosperm (starch). Modern elite wheat varieties were selected to produce more endosperm relative to the bran and germ layers, yielding more starch after milling. Milling is a size-reduction technology that evolved alongside wheat; ancestral wheat was crudely milled using stone mortars to produce whole flour. Initial technological advances harnessed power, first animal, then air- and water-driven mills, each drastically increasing throughput but not changing the product, as the entire grain was still used as whole flour. Sieving eventually followed, allowing different fractions to be separated. Today, flour is made by progressive rolling coupled with plansifters: the grain passes through successive rollers, each with a smaller gap, and the powders are collected and sieved successively, producing ultra-fine flour alongside other fractions. Pastry flour consists of uniform, small endosperm particles that provide superior functional properties for making cakes and bread but lack the less functional, nutrient-laden germ and bran layers. Although cereals account for 24% of calories, 21% are refined to the point of being devoid of micronutrients, leaving only 3% as whole flour. The story of milling may seem to have little relevance, but there were very real consequences. As wheat milling expanded, so did rice polishing, in which the hull and bran layers were removed, leaving white rice, the starchy endosperm. Removing the bran and germ layers prevented rancidity by eliminating the lipase enzymes that break down oils, facilitating export overseas.
Through industrialization, the cost of white and brown rice fell, increasing domestic use across parts of Asia; because rice was the staple food through the 1800s, micronutrient deficiencies began causing more and more illness. Polishing rice removed the primary source of vitamin B1 from millions of diets, causing catastrophic deaths from beriberi years before the concept of vitamins was conceived, yet ultimately leading to Eijkman's 1929 Nobel Prize in Medicine for vitamin theory. The refined ingredients of sugars, flours, oils and fats, with salt, were formulated into a smorgasbord of ultra-processed foods (UPFs) and ready-to-eat, convenient meals, the hallmark of the Western diet, each displacing meals prepared from whole foods, so much so that it has shaped the Canadian food guide!


Canada’s Food Guide

Iterations of the Canadian food guide translate the evolving science of nutritional requirements into a practical pattern of food choices incorporating variety and flexibility. First established in 1938 and continually updated until 1961, nutritional advice was provided as a series of food rules (modified in 1944 and 1949) approved by the Canadian Council on Nutrition. The food rules morphed into Canada's food guide in 1961, with iterations in 1977 and 1982, and then into Canada's food guide to healthy eating (1992 and 2019). The initial food rules aimed to guide the selection of foods providing the necessary nutrients, initially defined as 70% of the recommended dietary allowance (RDA) and increased to 100% in 1944. At that time, new riboflavin requirements altered the milk recommendation, and water and iodized salt requirements were introduced; in 1949, the wording of the milk requirement was adjusted from an absolute quantity to "at least," reflecting different energy requirements. Canada's food guide, introduced in 1961, recognized that different dietary patterns can satisfy nutritional needs; for the first time, processing was used to differentiate whole-grain cereals from highly refined ones, and meat alternatives (eggs, cheese, beans, and peas) were provided. The 1977 guide changed the category of 'milk' to 'milk and milk products' and 'meat and fish' to 'meat and meat alternatives.' Also in this version, 'fruits and vegetables' were combined into a single category and 'whole grain products' were expanded to include enriched products. Concurrently, the understanding of the link between diet and cardiovascular disease emerged, leading to two significant changes to the guide: an emphasis on balancing energy intake with energy output, and moderation of fat, sugar, salt and alcohol. Between 1977 and 1982, the primary objective changed from preventing nutrient deficiencies to reducing chronic diet-related diseases.
Another major adaptation followed in 1992, which took a total diet approach to meet both energy and nutrient requirements, recognizing that energy needs vary, moving away from absolute numbers of servings to relative ratios between the 4 categories (grain products, vegetables and fruits, milk products, and meat and alternatives).

Another major transformation of the guide occurred in 2019 to counter rising diet-related chronic diseases and conditions, rapidly becoming Canada's major public health concern. This rendition presented the largest transformation between iterations, with water and only three food categories: 'whole fruits and vegetables' comprise half of the recommended plate, surpassing 'whole grain foods' at a quarter of the plate, while the remaining quarter, 'protein foods,' amalgamates 'meat and meat alternatives' with 'milk products.' The new guidelines provide seven pillars of healthy eating: 1) be mindful of eating habits, 2) cook more often, 3) enjoy food, 4) eat meals with others, 5) use food labels, 6) be aware that marketing can influence food choices, and 7) limit highly processed foods. The first six pillars address the complex behaviors that drive consumers to select certain foods, while the seventh points to limiting highly processed foods, also known as ultra-processed foods.


Ultra-Processed Foods and the NOVA Classification of Foods

The NOVA classification differentiates foods based on their degree of processing: Group 1) unprocessed or minimally processed foods; Group 2) processed culinary or food industry ingredients; Group 3) processed foods; and Group 4) ultra-processed foods (UPFs) (Monteiro, Cannon, et al., 2018; Monteiro, Levy, Claro, Castro, & Cannon, 2010). Group 1 comprises whole foods (e.g., meat, milk, grains, legumes, nuts, fruits, and vegetables), while at the other extreme are UPFs, which are not modified foods but formulations made mostly or entirely from substances derived from foods, plus additives (Global Panel on Agriculture and Food Systems for Nutrition, 2016). The differentiating factor, the 'intact' nature of the food, was exceptionally insightful, foreshadowing the importance of physical structure to the dietary quality of foods (Fardet & Rock, 2022). This change in dietary guidelines presents significant hurdles to achieving food security, since most agricultural raw materials must be processed to ensure safe and palatable foods are not lost to deteriorative reactions along the supply chain (Jones, 2019); avoiding these foods entirely would also pose a considerable challenge, given that, globally, almost two-thirds of all energy comes from UPFs (Gibney & Forde, 2022).

Group 1: Unprocessed or minimally processed foods
- Level of processing: low
- Types of food: edible plants (seeds, fruits, leaves, stems, roots) or animals (muscle, eggs, milk), fungi, algae and water
- Unit operations: drying, crushing, grinding, roasting, boiling, non-alcoholic fermentation, pasteurization, refrigeration, chilling, freezing, packaging
- Outcome of processing: preserve natural foods to make them suitable for storage, or to make them safe, edible or more pleasant to consume
- Location of processing: significant kitchen preparation

Group 2: Processed culinary ingredients
- Level of processing: medium
- Types of food: ingredients used in stews, soups and broths, bread, jam, drinks and desserts
- Unit operations: pressing, refining, grinding, milling and drying
- Outcome of processing: used in home and restaurant kitchens to prepare, season and cook Group 1 foods
- Location of processing: industrially processed ingredients made from whole foods

Group 3: Processed foods
- Level of processing: medium/high
- Types of food: bottled vegetables, canned fish, fruits in syrup, cheeses and freshly made bread
- Unit operations: preservation or cooking methods and, in the case of bread and cheese, non-alcoholic fermentation
- Outcome of processing: Group 1 foods modified for increased durability or enhanced sensory qualities
- Location of processing: predominantly industrially processed

Group 4: Ultra-processed foods
- Level of processing: high
- Types of food: soft drinks, sweet and savory snacks, reconstituted meats, pre-prepared frozen dishes; foods made mostly or entirely from ingredients derived from foods, plus additives
- Unit operations: hydrogenation and hydrolysis, extrusion, and a multitude of processes combining ingredients into formulated foods
- Outcome of processing: branded, convenient (ready to consume), attractive (hyper-palatable) and highly profitable food products
- Location of processing: industrially processed with minimal processing at the household level
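The NOVA scheme is, at heart, a lookup from food to group. The toy sketch below encodes a few of the example foods above; the food-to-group assignments are illustrative only, since real NOVA coding considers the full ingredient list and processing history, not a name match:

```python
# Toy NOVA lookup built from a handful of example foods.
# Illustrative only: not an official classifier.

NOVA_EXAMPLES = {
    1: {"milk", "rolled oats", "frozen vegetables", "plain nuts"},
    2: {"butter", "sugar", "salt", "olive oil"},
    3: {"canned fish", "cheese", "freshly made bread", "fruits in syrup"},
    4: {"soft drink", "instant noodles", "packaged cookies"},
}

def nova_group(food):
    """Return the NOVA group (1-4) for a known example food, else None."""
    for group, examples in NOVA_EXAMPLES.items():
        if food in examples:
            return group
    return None  # unknown foods need case-by-case assessment

print(nova_group("canned fish"))  # → 3
```

Returning None for unknown foods mirrors a real difficulty of NOVA: many products sit ambiguously between groups and must be judged from their formulation.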

Healthy and Obesogenic Food Environments

Understanding food choice is extremely complex; here, aspects of Canada's food environment are introduced. For perspective, consider factors outside what typically defines a food environment: access to transportation, ethnicity, socioeconomic status, and proximity to healthy versus unhealthy food retailers and their cost differentials, confounded by the time available to prepare meals from scratch as much as by who is being fed. A food environment describes geographic access to safe food (rural, urban, community/neighborhood, down to single institutions such as universities, long-term care homes and prisons) and the quality of the food available (fast food and convenience stores versus whole foods and farmers' markets), and it is uniquely experienced by each consumer, biased by their understanding of nutrition and by affordability in both time and cost. Healthy food environments establish equitable access to healthy foods (vegetables, fruits, whole foods), requiring well-established food production and distribution networks encompassing a variety of retailers and food service outlets that provide raw through to prepared and re-packaged healthier food options. Even in such an environment, nutritional education and support infrastructure (e.g., food banks, community gardens, and senior support) are required to ensure access to healthy eating. Community obesity rates trend higher in obesogenic environments, defined as those where it is difficult, sometimes impossible, to buy, prepare and eat meals made from a variety of healthy whole foods that ensure optimal nutrient acquisition and prevent chronic, long-term diet-related diseases.
Obesogenic food environments are divided into food deserts, neighborhoods with little or no access to stores and restaurants providing healthy and affordable foods, and food swamps, which have an abundance of fast-food and junk-food outlets, convenience stores, and liquor stores alongside few healthy food retailers.


Canadian Food Regulations: in Brief

The Canadian Food Inspection Agency (CFIA) enforces the Food and Drugs Act (https://laws-lois.justice.gc.ca/eng/acts/f-27/) and other regulations developed by Health Canada, overseeing all production, sale and labeling of foods. The act defines food as "any article manufactured, sold or represented for use as food or drink for human beings, chewing gum, and any ingredient that may be mixed with food for any purpose whatever." All other products for human consumption are classified and regulated as drugs, which "include any substance or mixture of substances manufactured, sold or represented for use in the diagnosis, treatment, mitigation or prevention of a disease, disorder or abnormal physical state, or its symptoms, in human beings or animals, or restoring, correcting or modifying organic functions in human beings or animals."

Federal regulations and requirements for food labels are essential for consumers to make informed choices about healthy foods, ensuring the information is reliable and trustworthy. It may not seem immediately obvious why such strict enforcement and standardization are needed when labeling foods, but consider the historical advertisements that convoluted messaging around the healthfulness of foods. The nutrition facts table is mandatory on most packaged food products, as are the serving size, calories and core nutrients. The percent daily value (% DV) differentiates foods containing a little (5% or less) or a lot (15% or more) of a specific nutrient, and the table must contain information on the 13 core nutrients (fat, saturated fat, trans fat, cholesterol, sodium, carbohydrates, fiber, sugar, protein, vitamin A, vitamin C, calcium and iron). The label must also contain an ingredient list in descending order by weight, allergen and gluten declarations, expiration and best-before dates, the country of origin, nutrition claims, and production methods (e.g., organic, free range, natural ingredients, or irradiated, which must carry the Radura symbol).
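The % DV rule of thumb is a simple pair of thresholds and can be sketched directly (the "in between" label for the 5-15% range is an assumption for completeness, not official wording):

```python
# Health Canada's % Daily Value rule of thumb:
# 5% DV or less is "a little"; 15% DV or more is "a lot".

def dv_interpretation(percent_dv):
    if percent_dv <= 5:
        return "a little"
    if percent_dv >= 15:
        return "a lot"
    return "in between"  # hypothetical label for the middle range

print(dv_interpretation(4))   # → a little
print(dv_interpretation(20))  # → a lot
```

A consumer comparing two cereals could, for example, prefer the one where fiber reads "a lot" and sodium reads "a little".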

Health claims include "any representation in labeling and advertising that states, suggests, or implies that a relationship exists between the consumption of foods or food constituents and health" and must be approved by Health Canada before use. Function claims describe well-established, specific beneficial effects that consuming a food or constituent has on normal functions or biological activities; an example is that vitamin A aids in the development and maintenance of night vision. To make claims on glycemic carbohydrates, a food labeled free of sugar (no sugar, zero sugar, sugarless) must have less than 0.5 g of sugar, while reduced (lower, less) sugar requires at least 25% less, and at least 5 g less, sugar than a reference food. Fiber claims include improving laxation by increasing stool bulk, reducing total or low-density lipoprotein blood cholesterol levels, reducing post-prandial blood glucose or insulin levels, and providing energy-yielding metabolites through colonic fermentation. Fiber claims can be made for barley beta-glucans; bran (from barley, corn, oat, or wheat); gums (gum arabic or guar); seed coats (oat and pea hull fiber and psyllium seed husk); oligosaccharides (fructooligosaccharides, galactooligosaccharides, isomaltooligosaccharides); peel and pulp (apple, blueberry, cranberry, orange, and tomato); as well as resistant maltodextrins and starches. Function claims exist for all currently known micro- and macronutrients.

In addition to function claims, Health Canada regulates disease risk reduction claims (Food and Drugs Act, Section 5.1), statements that link a food, or a food constituent beyond the established essential micronutrients, to a reduced risk of developing a diet-related disease or condition. There are currently 16 approved claims; the first four, established in 2010, were that reduced sodium lowers hypertension; that total dietary fat, saturated fat, cholesterol and trans fat affect coronary heart disease; that low-fat diets rich in fruits and vegetables reduce cancer risk; and that diets high in calcium lower the risk of osteoporosis. The following decade saw additional claims aimed at lowering blood cholesterol, associated with plant sterols (2010), oat products (2010), psyllium (2011), unsaturated fat (2012), barley products (2012), whole ground flaxseed (2014), soy protein (2015), and PolyGlycopleX®, a polysaccharide complex containing glucomannan, xanthan gum and sodium alginate. 2014 also introduced claims for sugar-free chewing gum reducing the risk of dental caries (cavities). The most recent disease risk reduction claims came in 2016: diets high in vegetables and fruits reduce the risk of coronary heart disease; eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) lower blood triglycerides; and PolyGlycopleX® reduces post-prandial blood glucose. For most of these claims to be stated, the food must also contain at least 10% of the recommended intake of a vitamin or mineral, less than 100 mg of cholesterol per 100 g, less than 0.5% alcohol, and less than 480 mg of sodium per serving, and must be low in saturated fatty acids. Finally, in 2009, Health Canada approved limited claims on foods containing probiotic bacteria that contribute to a healthy digestive system.
Function claims regarding the physiological effects of probiotic microorganisms in foods (e.g., "promotes regularity" and "improves nutrient absorption and aids in digestion") are approved, while claims of "improving gut health" or "supporting the immune function/system" are not recommended. It is also the manufacturer's responsibility to ensure the stability and viability of the probiotic strain or mixed culture at the declared level until the best-before date. Between the regulations of foods and drugs lie natural health products (NHPs), whether in a food format or not, covering phytochemicals intended for therapeutic use and governed by the Natural Health Products Regulations.

Works Cited

Brace, C. L., Rosenberg, K. R., & Hunt, K. D. 1987. Gradual change in human tooth size in the late Pleistocene and post-Pleistocene. Evolution, 41, 705-720.
Brace, C. L., et al., 1991. What Big Teeth You Had, Grandma! Human Tooth Size, Past and Present. In M. A. Kelley & C. S. Larsen (Eds.), Advances in Dental Anthropology (pp. 33-57). New York, USA: Wiley-Liss Inc.
Brace, C. L. et al., 1980. Australian Tooth-Size Clines and the Death of a Stereotype [and Comments and Reply]. Current Anthropology, 21, 141-164.
Cordain, L., et al., 2005. Origins and evolution of the Western diet: health implications for the 21st century. The American Journal of Clinical Nutrition. 81: 341–354.
Darwin, C. 1859. On the origin of species by means of natural selection, or preservation of favoured races in the struggle for life: London : John Murray.
Glikson, A. 2013. Fire and human evolution: The deep-time blueprints of the Anthropocene. Anthropocene, 3, 89-92.
Global Panel on Agriculture and Food Systems for Nutrition. 2016. Food Systems and Diets: Facing the Challenges of the 21st Century.
Monteiro, C. A., et al., 2018. The UN Decade of Nutrition, the NOVA food classification and the trouble with ultra-processing. Public Health Nutrition, 21, 5-17.
Monteiro, C. A., et al., 2010. Increasing consumption of ultra-processed foods and likely impact on human health: evidence from Brazil. Public Health Nutrition, 14, 5-13.
Monteiro, C. A., et al., 2018. Household availability of ultra-processed foods and obesity in nineteen European countries. Public Health Nutrition, 21, 18-26.
Perry, G. H. et al., 2007. Diet and the evolution of human amylase gene copy number variation. Nature Genetics, 39, 1256.
Teaford, M. F., & Ungar, P. S. 2000. Diet and the evolution of the earliest human ancestors. Proceedings of the National Academy of Sciences, 97, 13506.