Category Archives: Uncategorized

Who’da thunk it? Paleo Diabetic made it onto a Top 80 List!

Not your average cave-woman

My little corner of teh innerwebs here made it onto Feedspot’s Top 80 Paleo Diet Blogs and Websites list. I need to peruse some of the other sites listed when time allows. You might wanna check it out, too.

Steve Parker, M.D.

Cataract Extraction Linked to Lower Risk of Dementia


From JAMA Network, December 2021:


Association Between Cataract Extraction and Development of Dementia

Question  Is cataract extraction associated with reduced risk of developing dementia?

Findings  In this cohort study assessing 3038 adults 65 years of age or older with cataract enrolled in the Adult Changes in Thought study, participants who underwent cataract extraction had lower risk of developing dementia than those who did not have cataract surgery after controlling for numerous additional risks. In comparison, risk of dementia did not differ between participants who did or did not undergo glaucoma surgery, which does not restore vision.

Meaning  This study suggests that cataract extraction is associated with lower risk [~30% less] of developing dementia among older adults.

Importance  Visual function is important for older adults. Interventions to preserve vision, such as cataract extraction, may modify dementia risk.


Details in the abstract:

Objective  To determine whether cataract extraction is associated with reduced risk of dementia among older adults.

Design, Setting, and Participants  This prospective, longitudinal cohort study analyzed data from the Adult Changes in Thought study, an ongoing, population-based cohort of randomly selected, cognitively normal members of Kaiser Permanente Washington. Study participants were 65 years of age or older and dementia free at enrollment and were followed up biennially until incident dementia (all-cause, Alzheimer disease, or Alzheimer disease and related dementia). Only participants who had a diagnosis of cataract or glaucoma before enrollment or during follow-up were included in the analyses (ie, a total of 3038 participants). Data used in the analyses were collected from 1994 through September 30, 2018, and all data were analyzed from April 6, 2019, to September 15, 2021.

Exposures  The primary exposure of interest was cataract extraction. Data on diagnosis of cataract or glaucoma and exposure to surgery were extracted from electronic medical records. Extensive lists of dementia-related risk factors and health-related variables were obtained from study visit data and electronic medical records.

Main Outcomes and Measures  The primary outcome was dementia as defined by Diagnostic and Statistical Manual of Mental Disorders (Fourth Edition) criteria. Multivariate Cox proportional hazards regression analyses were conducted with the primary outcome. To address potential healthy patient bias, weighted marginal structural models incorporating the probability of surgery were used and the association of dementia with glaucoma surgery, which does not restore vision, was evaluated.

Results  In total, 3038 participants were included (mean [SD] age at first cataract diagnosis, 74.4 [6.2] years; 1800 women [59%] and 1238 men [41%]; and 2752 [91%] self-reported White race). Based on 23,554 person-years of follow-up, cataract extraction was associated with significantly reduced risk (hazard ratio, 0.71; 95% CI, 0.62-0.83; P < .001) of dementia compared with participants without surgery after controlling for years of education, self-reported White race, and smoking history and stratifying by apolipoprotein E genotype, sex, and age group at cataract diagnosis. Similar results were obtained in marginal structural models after adjusting for an extensive list of potential confounders. Glaucoma surgery did not have a significant association with dementia risk (hazard ratio, 1.08; 95% CI, 0.75-1.56; P = .68). Similar results were found with the development of Alzheimer disease dementia.

Conclusions and Relevance  This cohort study found that cataract extraction was significantly associated with lower risk of dementia development. If validated in future studies, cataract surgery may have clinical relevance in older adults at risk of developing dementia.
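In case you're wondering how the hazard ratio translates into the "~30% less" figure above: treating the hazard ratio as a rough stand-in for relative risk, relative risk reduction ≈ 1 − HR = 1 − 0.71 = 0.29, or about a 29-30% lower risk. The 95% confidence interval of 0.62-0.83 works out the same way to roughly a 17-38% reduction.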


Steve Parker, M.D.

Another Estimate of Paleolithic Man’s Diet

Not many edible leafy greens around this time of year

I see no reason to disagree with this abstract.

Abstract

We review the evolutionary origins of the human diet and the effects of ecology and economy on the dietary proportion of plants and animals. Humans eat more meat than other apes, a consequence of hunting and gathering, which arose ∼2.5 Mya with the genus Homo. Paleolithic diets likely included a balance of plant and animal foods and would have been remarkably variable across time and space. A plant/animal food balance of 40-60% prevails among contemporary warm-climate hunter-gatherers, but these proportions vary widely. Societies in cold climates, and those that depend more on fishing or pastoralism, tend to eat more meat. Warm-climate foragers, and groups that engage in some farming, tend to eat more plants. We present a case study of the wild food diet of the Hadza, a community of hunter-gatherers in northern Tanzania, whose diet is high in fiber, adequate in protein, and remarkably variable over monthly timescales.

I Appreciate Your Support!

I published my first book in 2007 to extend my healing reach beyond the confines of the clinic and hospital room. I’m certain my writing has improved the health of many folks I’ll never know about, and that means more to me than any financial success I’ve had with the books. 

In 2020, my net profit from writing was $937.08, which is admittedly pitiful. The prior year’s net profit was $5,802.48. Pandemic effect, maybe? To lower my expenses in 2021, I’ll look into a private PO box instead of US Postal Service ($168/year), drop Amazon Prime ($129/year), and negotiate lower fees with Network Solutions.

I am blessed to have a hospitalist job that pays well. COVID-19 has caused major economic hardship for many of you, including unemployment.

My primary means of advertising has been blogging. Cross-posting on Facebook, Twitter, and LinkedIn has done almost nothing for book sales. A few years ago I could give my hospital patients a business card with links to my books, but my employer insisted I stop. 

If you care to support my writing, buy a book. If not for yourself, then for someone you care about. 

Steve Parker, M.D.

PS: All my books are here and at Smashwords.com.

PPS: Guesstimating my combined federal and state taxes at 40%, I'm left with $562.25 after paying them. And don't forget sales tax on many of the things I might buy.
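For the arithmetic: $937.08 × (1 − 0.40) ≈ $562.25.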

For How Long Did Neanderthals Breast-Feed?

For 5-6 months.

Now aren’t you glad you read this blog? Where else you gonna get this vital info?

The discovery is based on dental analysis of a whopping three Neanderthals found in Italy.

The early onset of weaning in modern humans has been linked to the high nutritional demand of brain development that is intimately connected with infant physiology and growth rate. In Neanderthals, ontogenetic patterns in early life are still debated, with some studies suggesting an accelerated development and others indicating only subtle differences vs. modern humans. Here we report the onset of weaning and rates of enamel growth using an unprecedented sample set of three late (∼70 to 50 ka) Neanderthals and one Upper Paleolithic modern human from northeastern Italy via spatially resolved chemical/isotopic analyses and histomorphometry of deciduous teeth. Our results reveal that the modern human nursing strategy, with onset of weaning at 5 to 6 mo, was present among these Neanderthals. This evidence, combined with dental development akin to modern humans, highlights their similar metabolic constraints during early life and excludes late weaning as a factor contributing to Neanderthals’ demise.

Source: Early life of Neanderthals – PubMed

Steve Parker, M.D.

Managing Diabetes on Sick Days

How old is this device?

For folks taking insulin, Diabetes Daily has a good article by endocrinologist Dr Francine Kaufman. An excerpt:

Everyone with diabetes who takes insulin needs to have a sick day plan. This is something you develop with your healthcare professional to help you manage the high and low sugar levels that can be associated with an illness. The following advice applies to people with type 1 diabetes and people with type 2 diabetes who take insulin – the advice may be different if you have type 2 diabetes and do not take insulin.

Here’s what’s covered in the article:

  • Track your important numbers in a sick log (see the sketch after this list for one way you might structure it)
  • Glucose levels
  • Ketone levels
  • Temperature
  • Fluid intake
  • Urination
  • Vomiting, diarrhea, and dehydration
  • Insulin, amount and time
  • Medications

Key messages from Dr. Kaufman

When you get sick, you are at risk of becoming dehydrated from poor intake or from excessive loss of fluids due to nausea, vomiting, diarrhea, and fever (your body may lose more water when you have a high temperature). In addition, dehydration is common in diabetes because high glucose levels (above 180-200 mg/dL) cause sugar to enter your urine, dragging an excess amount of fluid with it. Illness also puts you at risk of developing ketones, which when coupled with high glucose levels can lead to diabetic ketoacidosis (DKA), a very serious condition. How do you know if you have ketones? Good question, click here!

The purpose of your sick day plan is to try to keep your glucose levels in a safe range – to avoid dehydration and to prevent ketones from rising to a dangerous level.

Source: Zoning in on Sick Day Management: Practical Tips, Strategies, and Advice – Diabetes Daily

Steve Parker, M.D.

PS: Avoid the medical-industrial complex by getting and staying as healthy as possible. Let me help.

Are We Eff’d Up Due to Electric Light at Night?

No electricity

From the Journal of Pineal Research:

Key to the transition of humans from nomadic hunting-gathering groups to industrialized and highly urbanized societies was the creation of protected and artificially lit environments that extended the natural daylight hours and consolidated sleep away from nocturnal threats. These conditions isolated humans from the natural regulators of sleep and exposed them to higher levels of light during the evening, which are associated with a later sleep onset. Here we investigated the extent to which this delayed timing of sleep is due to a delayed circadian system. We studied two communities of Toba/Qom in Argentina, one with and the other without access to electricity. These communities have recently transitioned from a hunting-gathering subsistence to mixed subsistence systems and represent a unique model in which to study the potential effects of the access to artificial light on sleep physiology. We have previously shown that participants in the community with access to electricity had, compared to participants in the community without electricity, later sleep onsets and shorter sleep bouts. Here we show they also have a delayed dim light melatonin onset (DLMO). This difference is present during the winter but not during the spring when the influence of evening artificial light is likely less relevant. Our results support the notion that the human transition into artificially lit environments had a major impact on physiological systems that regulate sleep timing, including the phase of the master circadian clock.

Source: Access to electric light is associated with delays of the dim light melatonin onset in a traditionally hunter-gatherer Toba/Qom community – PubMed

Steve Parker, M.D.

Click pic to purchase book at Amazon.com. E-book versions available at Smashwords.com.

 

Steve Cooksey Returns to Full Carnivore Diet for Diabetes

Carnivore diet, but not raw

At Diabetes Warrior:

In this post I will be discussing my latest experiment. I am calling it “Diabetic Carnivore 2.0”. It’s 2.0 because I went ‘full-carnivore’ in 2017 for about three years, before tapering off earlier in 2020.

I’ll answer these questions in this post:

1) What is a carnivore in the context of this dietary experiment?
2) Why am I going ‘full-carnivore’ again?

*  *  *

Had we only grown lower carb, leafy green vegetables in our garden, I’d still be eating them probably … but we didn’t. We also grew higher carb vegetables and fruits like tomatoes, beets, turnips, onions and carrots.

We started out eating collards, chard and turnip green salads … all was well. Then I began easing turnips, carrots, beets, and tomatoes into our slaw. Small portions at first… but then the ‘carb creep’ happened. I would add more and more of the sugary, starchy veggies and fruits to the slaw, as well as eat more and more of them.

I only tracked my daily intake of carbs from the vegetables and fruits once. That one day, my carb totals were in the 70 gram range! Not a lot compared to ‘Standard American Diet’ but a lot compared to my typical ‘near zero carb’ meal plan.

Just like a previous high carb experiment (see this post, “Very Low Fat (and high carb) Experiment“), my body handled the sugar and starches from the vegetables pretty well at first but then the fasting blood sugars began to creep up.

Read on to see the connection to COVID-19.

In case you’re wondering, a carnivore diet is not a typical paleo diet.

Steve Parker, M.D.

Click pic to purchase book at Amazon.com. E-book versions available at Smashwords.com.

Multiple Sclerosis and the Paleo Diet

Not Dr Terry Wahls

From a recent scientific article:

Preliminary studies suggest that a modified Paleolithic diet may benefit symptoms of fatigue in progressive multiple sclerosis (MS). However, this diet restricts the consumption of eggs, dairy, and gluten-containing grains, which may increase the risk of micronutrient deficiencies. Therefore, we evaluated the nutritional safety of this diet among people with progressive MS. Three nonconsecutive 24-h dietary recalls were collected from (n = 19) progressive MS participants in the final months of a diet intervention study and analyzed using Nutrition Data System for Research (NDSR) software. Food group intake was calculated, and intake of micronutrients was evaluated and compared to individual recommendations using Nutrient Adequacy Ratios (NARs). Blood was drawn at baseline and the end of the study to evaluate biomarker changes. Mean intake of fruits and vegetables exceeded nine servings/day and most participants excluded food groups. The intake of all micronutrients from food were above 100% NAR except for vitamin D (29.6 ± 34.6%), choline (73.2 ± 27.2%), and calcium (60.3 ± 22.8%), and one participant (1/19) exceeded the Tolerable Upper Limit (UL) for zinc, one (1/19) for vitamin A, and 37% (7/19) exceeded the chronic disease risk reduction (CDRR) for sodium. When intake from supplements was included in the analysis, several individuals exceeded ULs for magnesium (5/19), zinc (2/19), sodium (7/19), and vitamins A (2/19), D (9/19), C (1/19), B6 (3/19), and niacin (10/19). Serum values of vitamins D, B12, K1, K2, and folate significantly increased compared to respective baseline values, while homocysteine and magnesium values were significantly lower at 12 months. Calcium and vitamin A serum levels did not change. This modified Paleolithic diet is associated with minimal nutritional risks. However, excessive intake from supplements may be of concern.

Source: Eating Pattern and Nutritional Risks Among People With Multiple Sclerosis Following a Modified Paleolithic Diet – PubMed
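In case the "NAR" jargon is unfamiliar: a Nutrient Adequacy Ratio, as usually defined (my gloss, not the paper's wording), is simply observed intake divided by the individual's recommended intake, expressed as a percentage: NAR = (intake ÷ recommendation) × 100. So the vitamin D figure of 29.6% means participants averaged less than a third of their recommended vitamin D from food alone.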

Click pic to purchase book at Amazon.com. E-book versions available at Smashwords.com.

How Did Paleolithic Man Trim His Nails?

He had no modern shoes, gloves, or paring knives.

From Science ABC:

Before humans developed blades or social expectations of hygiene, how did we handle the inexorably growing nails at the ends of our fingers?

The answer to this question is quite simple… the fingernails probably took care of themselves. Fingernails are largely made up of keratin, a hardened protein that is also found in the skin and hair. While keratin is hardy and durable, it is far from unbreakable, as any woman with a chipped nail will attest. Similarly, when you clip your nails with any of the clippers explained above, there is some resistance, but they are relatively easy to snip off.

Now, think back 100,000 years, when early humans behaved as hunter-gatherers, engaging in physically demanding activities to survive. Over the course of their normal days, they may have been digging tubers out of the ground, sharpening a rudimentary spear, carrying temporary shelters or trying to start a fire. With all of this manual labor, it is believed that the fingernails would have naturally been worn down and chipped away. The daily demands of survival would have kept the fingernails from growing to unruly or unmanageable lengths. As mentioned above, we see this passive maintenance in other species as well, such as dogs that are often walked on pavement, which gradually wears down their nails, thus requiring fewer nail trimmings at the vet.

If the fingernails of these early humans did break or chip, they likely solved the problem as we do today—giving them a nibble and maybe tugging off the occasional irritating hangnail. Again, we see this same behavior in other species who lick at, soften, and bite their nails when they grow too long.

The tribal elites probably didn't do as much physical labor as the proletarians, so I imagine they and others could have used flat rocks as nail files.

The linked article covers nail trimming over the last 10,000 years, too.

Source: How did ancient people cut their nails before the nail clipper was invented?

Steve Parker, M.D.

Click pic to purchase book at Amazon.com. E-book versions available at Smashwords.com.