Friday, January 5, 2018

The Four Types of Allergy



The term allergy was coined by a Viennese pediatrician, Clemens von Pirquet, in 1906, to explain the sneezing of children exposed to pollen. The clearest translation of its meaning is “altered reactivity,” and it soon became clear that allergy involves excessive and abnormal activation of the immune system.

During the 20th century, scientists identified four types of immune activation that lead to allergic reactions.7 All four types can occur in people with allergic disorders.8 Each type requires that your immune system recognize a specific allergen because of a previous exposure to it. In the first three types, that previous exposure caused your immune system to make antibodies directed against the allergen. The usual function of antibodies, which are proteins made by cells of your immune system, is to create immunity, helping you resist infection; allergy turns this protective effect into a harmful one. The fourth type of allergy, the type that occurs with nickel dermatitis, does not require antibodies to produce its effects.

Type 1 Allergy

Type 1 allergic reactions, the most common form, result from formation of an antibody called IgE (immunoglobulin E). When IgE attaches to an allergen, it stimulates mast cells to release mediators like histamine into your tissues with explosive force. Standard blood tests for allergies look for the presence of IgE antibodies directed against specific allergens. Skin tests for allergies attempt to measure the swelling produced in your skin when IgE antibodies attach to an allergen that’s being injected.

Type 1 is the kind of allergic reaction that occurs with anaphylaxis, eczema, hives, hay fever, and allergic asthma. It has two phases, early and late. Symptoms of the early phase allergic response are caused by the release of mast cell mediators. They can occur within seconds of exposure to an allergen and may last for a few hours. Some mast cell mediators also attract Eos (eosinophils) to the inflamed tissues.

Activation of Eos creates the late phase allergic response. The most potent mediators released by Eos are enzymes that damage cells. They’re capable of killing parasites and can inflict the same kind of damage on your own tissues. The late phase reaction may last for days and can cause long-lasting changes to your tissues and your immune system: damaged tissues may heal with scarring, and your immune system may shift so that your lymphocytes further increase their production of IgE antibodies. It’s a dangerous cascade that can allow allergies to spin out of control.

If you suffer from allergic eczema, you can see the difference between the early and late phase responses in your own skin. When you eat a food to which you’re allergic, your skin becomes red, somewhat swollen, and very itchy. Once this early phase reaction subsides, your skin becomes thickened and scaly, still red and itchy but with less intensity. That’s the late phase response, and if it lasts long enough, your skin does not readily return to normal.

Type 2 and Type 3 Allergy

Type 2 and Type 3 allergy depend upon another class of antibody, called IgG (immunoglobulin G), to amplify the allergic signal. IgG is the main class of antibodies circulating in your blood. IgG is essential for a normal immune response, and its deficiency predisposes people to recurrent or chronic bacterial infections. Type 2 and 3 reactions are the main mechanisms involved in drug allergies and may occur in some people with food allergy, especially when migraine headache, abdominal pain, or arthritis are the allergic symptoms.

Two factors make Type 2 or Type 3 allergy to foods or drugs hard to detect. First, IgE antibodies are not involved, so standard allergy tests, which measure IgE, will not detect this kind of allergy. Second, the onset of the allergic reaction is often delayed, sometimes occurring 24 hours or more after exposure to the triggering allergen. You have to be a really good detective to track down these reactions. The same is true for Type 4 allergy, which is known as delayed hypersensitivity.

Type 4 Allergy

Type 4 allergic reactions do not require antibodies. The triggering allergens directly activate immune cells called helper lymphocytes, which amplify the response themselves, attracting so-called killer lymphocytes to the area where the antigen is found. Killer cells are just as effective as Eos at causing tissue damage.

Type 4 reactions occur in a number of infectious diseases, such as tuberculosis, where they help to control the spread of the infection. They also contribute to the damage that occurs in several autoimmune disorders, including rheumatoid arthritis, Crohn’s disease, type 1 diabetes, multiple sclerosis, and Hashimoto’s thyroiditis.

The most common allergic disorder employing the Type 4 mechanism is poison ivy, an allergic skin rash caused by exposure to oils from plants in the genus Toxicodendron. Allergic contact dermatitis (like Flip’s allergy to nickel, for example) usually involves a Type 4 reaction. For some people, Type 4 reactions may cause asthma. As I describe in Chapter 12, up to 15 percent of asthmatic reactions may occur because of Type 4 allergy. Food allergy may also be caused by Type 4 reactions, especially when the allergic reaction affects the gastrointestinal tract or the skin.

Anaphylaxis: Allergy That Can Kill

The term anaphylaxis was coined in 1901 by the French scientist Charles Richet, who received the Nobel Prize for his research in 1913. Richet coined a new word for what he believed to be a new concept: hypersensitization, or, as he expressed it, “the opposite of a protective response.” With an anaphylactic reaction, your body is flooded with chemicals that cause instantaneous, massive swelling of the affected tissues, dilation of blood vessels, contraction of the smooth muscles that line your airways or intestines, and irritation of nerve endings. If the reaction involves your tongue, throat, or respiratory tract, you may be unable to breathe. If it involves your circulatory system, your blood pressure can drop profoundly, producing anaphylactic shock. Swelling of your face, your lips, your eyes, or any part of your skin, as well as wheezing, abdominal cramps, and diarrhea, are other symptoms that may occur with anaphylaxis.

The usual triggers for anaphylaxis are insect stings, specific foods such as peanuts, or medication such as penicillin. Emergency treatment is essential and starts with an injection of adrenaline, which raises blood pressure, constricts blood vessels and dilates bronchial tubes.

Anyone with a history of anaphylactic reactions should carry a device for rapid self-injection of adrenaline at all times and have an emergency action plan worked out with his or her personal physician.

The incidence of anaphylactic reactions has doubled over the past decade, with an estimated 1,500 fatalities a year in the United States. Yet most patients receiving emergency treatment for anaphylaxis at U.S. hospitals are discharged without an adrenaline auto-injector or a referral to an allergist, a missed opportunity to prevent further reactions.

Studies in many different countries have all reached the same conclusion: people prone to anaphylaxis are not adequately armed with adrenaline. As severe as it is, life-threatening anaphylaxis is still underdiagnosed, underreported, and undertreated.

Peanut Allergy

Peanut allergy is one prevalent cause of anaphylaxis. Peanuts contain at least 12 allergenic proteins, two of which can cause anaphylaxis in sensitive individuals. A telephone survey of more than 4,000 U.S. households in 1997 concluded that peanut or tree nut allergies affected 1.1 percent of those surveyed (which translates to about three million people in the U.S. population). A follow-up study five years later found a doubling of peanut allergy among children.14 By 2007 the prevalence of peanut allergy among schoolchildren in the United States had tripled, and researchers were using the term epidemic to describe the increase. A British study documented a tripling in the rate of allergic skin-test reactivity to peanut extract among schoolchildren during the 1990s and a doubling of clinical allergic reactions to peanuts.

The reasons for the increase in peanut allergy are not clear. Most children with peanut allergy get sick immediately on their first known exposure to peanuts. For this to happen, the child must already have been exposed to peanuts, so that his immune system became sensitized to peanut allergens. Researchers at Imperial College London attempted to identify factors that separated children with proven peanut allergy from children with other allergies or no allergies. The most significant difference was that children who developed peanut allergy had been rubbed with skin care products containing peanut oil (arachis oil) twice as often as children who did not develop peanut allergy.

Peanut oil is a common component of skin care and infant care products in the United States as well as the United Kingdom. The list of commonly used topical preparations that contain arachis oil includes Cerumol (for removing earwax), Siopel barrier cream, zinc and castor oil ointment, calamine oily lotion, Dermovate (a potent topical steroid cream used for difficult eczema), and Naseptin cream.

The British researchers also found that peanut allergy was more likely to occur if other family members ate peanuts. Their theory is that exposure to peanut allergens through the skin is the main risk factor for peanut allergy. This theory might explain why children with eczema are at increased risk for developing anaphylaxis from peanuts. The inflamed and broken skin of eczema allows increased absorption of peanut antigens through the skin.

There is at present no specific treatment that can reverse peanut allergy.

Source: The Allergy Solution: Unlock the Surprising, Hidden Truth about Why You Are Sick and How to Get Well by Leo Galland, M.D., and Jonathan Galland, J.D. (paperback, Aug 2017)

Allergies on the Brain



There’s no topic in the field of allergy as controversial among doctors as the notion that allergic reactions can have a direct impact on your brain. I’m amazed at the controversy, because I’ve seen the effects of brain allergy in so many of my patients, children and adults alike. The reactions have ranged from spaciness and lack of concentration to depression, anxiety, and mental confusion. Patients of mine with brain allergy have often been previously diagnosed with attention deficit disorder, hyperactivity, autism, and bipolar disorder. For these patients, eliminating the allergic trigger can often help relieve the mental disorder.

Research on Brain Allergy

The earliest published report on brain allergies appeared in the Southern Medical Journal in 1943. Dr. Hal Davison, an Atlanta physician, made the following observations:

For a long time it has been noted that symptoms of bizarre and unusual cerebral disturbances occur in allergic patients. . . . Later it was observed that when the allergic symptoms improved, the cerebral symptoms improved also. . . . Further observations and experiments showed that at times the cerebral symptoms could be produced at will, by feeding patients certain foods. It was also observed in rarer instances, that ingestion of a drug, inhalation of powdered substances or even odors would produce these symptoms.40

Davison then described 87 patients seen in his allergy practice over an eight-year period with symptoms that included blackouts, insomnia, confusion, and changes in personality, all clearly provoked by specific foods or inhalants. As is always the case with allergy, different triggers affected different people. One of the patients, a lawyer, had a progression of symptoms that would start with a headache, followed by itching and hives, then blurred vision, drowsiness, and impaired speech, ending with loss of consciousness. The food triggers were eggs, crab, oysters, and strawberries. Avoiding these foods completely resolved his symptoms.

Medical journals today rarely publish the kind of detailed clinical observations made by Dr. Davison, although they are real and reproducible. In 1985 I spent a day with Professor Roy John, founder of New York University’s Brain Research Laboratory and a pioneer in the creation of electronic maps of brain activity. He told me that when patients were connected to his brain mapping device and then injected with extracts of foods, molds, or chemicals to which they were allergic, the injections produced dramatic changes in brain electrical activity, accompanied by the symptoms for which the patients had initially sought care.

Later in this book, in the chapter on nasal and sinus allergies, I’ll describe experiments done in Europe in which pollen exposure provoked impairment of brain function comparable to the effects of sedative drugs or alcohol.

Allergy and ADHD

Important scientific research on food allergy and the brain comes from England. Dr. Josef Egger, a neurologist, found that food allergy could lead to ADHD.

Dr. Egger and his colleagues identified 40 children with severe ADHD whose behavior improved when they avoided specific foods.41 Half the children then underwent an allergy desensitization procedure designed by a colleague of mine, Dr. Len McEwen. They received injections of low doses of food allergens mixed with an enzyme that stimulates an immune response. The other half received injections of the carrier solution without the allergens; this was the placebo control. At the end of six months, 80 percent of the children receiving the allergen injections were no longer reactive to the foods that had caused behavioral changes. Only 20 percent of the children receiving placebo had become nonreactive to the foods they’d been avoiding. This clearly indicates that allergy—a reaction in which your immune system amplifies the response to a trigger—is an important mechanism of food-induced ADHD. Egger’s study was published in The Lancet, one of the oldest medical journals in the world and the leading medical journal in the United Kingdom. If you experience neurologic or psychiatric symptoms that you believe may be provoked by a dietary or environmental exposure, know that science is on your side. Find a doctor who respects your observations—and who understands that allergy comes in more guises than ever in our rapidly changing world.

Conclusion

In this chapter, I revealed the many and surprising ways allergy can impact health. Julia’s case showed us how a hidden allergy, in her case an allergy to sulfites found in food, can lead to unexplained joint pain, stomach pain, fatigue, and difficulty with mental focus.

For Cora, the attorney, an allergy to nightshade plants (tomatoes, peppers, and potatoes) turned out to be the surprising cause of her mouth sores, which healed when she avoided eating these foods. A mysterious case of hives was a real curveball for Bruce, the professional baseball player, until we discovered that the yeast in beer and wine was the cause.

These cases illustrate the Four Game-Changing Truths about Allergies that I believe can transform how we approach health. That is why it is so important that you bring this book with you to see your doctor, to share this information with him or her. Ultimately, it is for your doctor to evaluate and decide how the ideas in this book may inform your journey of healing.

Source: The Allergy Solution: Unlock the Surprising, Hidden Truth about Why You Are Sick and How to Get Well by Leo Galland, M.D., and Jonathan Galland, J.D. (paperback, Aug 2017)

Thursday, December 28, 2017

Honey As An Ethnoremedy

Honey In Traditional And Modern Medicine


The use of honey as a medicine is referred to in the most ancient written records, where it was prescribed by the physicians of many ancient peoples for a wide variety of ailments (Ransome 1937), and it has continued to be used in folk medicine ever since. There are abundant references to honey as medicine in ancient scrolls, tablets, and books; excavated medical tablets from Mesopotamia indicate that honey was a common ingredient in many prescriptions (Hajar 2002).
In ancient Egyptian medicine, honey was the most frequent ingredient in all the drug recipes for both internal and external use listed in the Ebers and Edwin Smith Papyri. According to the Ebers papyrus (1550 BC), it is included in 147 prescriptions in external applications. Also, according to the Smith papyrus (1700 BC), it was used in wound healing: “Thou shouldst bind [the wound] with fresh meat the first day [and] treat afterwards with grease, honey [and] lint every day until he recovers.” Honey was used for treatment of stomach pain and urinary retention and as ointment for dry skin. It was used as ointment for wounds and burns, skin irritation, and eye diseases. The Ebers Papyrus contains a description on how to make ointment from honey and how to apply it, with a note: “Notice that this is a very good therapy.” The author of the Smith Papyrus directed that honey be applied topically, with few if any other possibly active ingredients, to wounds.
In old Egypt, honey was the only active ingredient in an ointment described in the Ebers Papyrus for application to the surgical wound of circumcision. Ebers also specifies that an ointment for the ear be made of one-third honey and two-thirds oil. The concentration of honey in seven oral remedies in the Chester Beatty VI Papyrus ranges from 10% to 50%, whereas its proportion in other remedies ranges from 20% to 84%. Honey could very well have provided some kind of protection from the kinds of bacteria most likely to infect wounds, at least enough protection to permit wounds to begin healing on their own.
The ancient Egyptians were not the only people who used honey as medicine. The Chinese, Indians, ancient Greeks, Romans, and Arabs used honey in combination with other herbs and on its own to treat wounds and various other diseases.
In old Greece, the honeybee, a sacred symbol of Artemis, was an important design on Ephesian coins for almost six centuries. Aristotle (384–322 BC) was the first to describe the production of honey, and he believed that eating honey prolonged life. Hippocrates (460–377 BC) speaks about the healing virtues of honey: “cleans sores and ulcers, softens hard ulcers of the lips, heals carbuncles and running sores.” Hippocrates is quoted as saying, “I eat honey and use it in the treatment of many diseases because honey offers good food and good health.” Dioscorides (AD 40–90), a Greek physician who traveled as a surgeon with the armies of the Roman emperor Nero, compiled De Materia Medica around AD 77, which was the foremost classic source of modern botanical terminology and the leading pharmacologic text until the 15th century. In addition to excellent descriptions of nearly 600 plants and 1000 simple drugs, Dioscorides described the medicinal and dietetic value of animal derivatives such as milk and honey. Dioscorides stated that honey could be used as a treatment for stomach disease, for a wound that has pus, for hemorrhoids, and to stop coughing. “Honey opens the blood vessels and attracts moisture. If cooked and applied to fresh wounds, it seals them. It is good for deep dirty wounds. Honey mixed with salt could be dropped inside a painful ear. It will reduce the pain and swelling of the ear. It will kill lice if infested children’s skin is painted with it. It may also improve vision. Gargle with honey to reduce tonsil swelling. For coughing, drink warm honey and mix with rose oil.” Galen recommended warming up the honey or cooking it, then using it to treat hemorrhoids and deep wounds.
In ancient Rome, honey was mentioned many times by the writers Virgil, Varro, and Pliny. Virgil’s Georgics in particular is a classic that describes in detail how honey is made. During the time of Julius Caesar, honey was used as a substitute for gold to pay taxes. In the first century AD, Apicius, a wealthy Roman gourmet, wrote a series of books in which more than half the recipes included honey (Bogdanov 2009). A Roman Catholic saint, St. Ambrose, stated, “The fruit of the bees is desired of all and is equally sweet to kings and beggars and is not only pleasing but profitable and healthful, it sweetens their mouths, cures their wounds, and conveys remedies to inward ulcers.” Pliny the Elder said that mixing fish oil with honey was an excellent treatment for ulcers.
In the medieval high cultures of the Arabs, the Byzantines, and Europe, honey was important too, and in these cultures most sweet dishes contained honey.
The Compendium of Medicine by Gilbertus Anglicus is one of the largest sources of pharmaceutical and medical information from medieval Europe. Translated in the early 15th century from Latin to Middle English, the text consists of medicinal recipes with guides to diagnosis, medicinal preparation, and prognosis. The text names more than 400 ingredients. Treatments are presented roughly from “head to tail,” so to speak, beginning with headache and ending with hemorrhoids. Honey was a frequent ingredient in many of the remedies, where it was combined with other medicinal herbs commonly used at that time. Excerpts appear below:
Headache … let him use oxymel … made of honey and vinegar; two parts of vinegar and the third part of honey, mixed together and simmered.
Pimples … anoint it with clean honey, or with the powder of burnt beans and honey, or with the powder of purslane and honey mixed together.
Pennyroyal … taken with honey, cleanse the lungs and clear the chest of all gross and thick humors. (Fay Marie Getz 1991)
Germans used honey and cod liver oil for ulcerations, burns, fistulas, and boils in addition to a honey salve, which was mixed with egg yolk and flour for boils and sores (Newman 1983).
AlBasri (Ali Bin Hamzah AlBasri), a 10th-century Arab philosopher, mentioned uncooked honey for a swollen intestine, whereas cooked honey was good for inducing vomiting when a poisonous drug was ingested. For that purpose, he recommended mixing one pound of sesame oil with one-third pound of cooked honey. Al Razi (Rhazes, AD 864–932), a renowned Muslim physician famous for writing a treatise distinguishing measles from smallpox, claimed that honey ointment made of flour and honey vinegar was good for skin disease and sports nerve injuries and recommended the use of honey water for bladder wounds. His book Al Hawi (Encyclopedia of Medicine), a comprehensive medical textbook that was translated from Arabic to Latin in the 13th century and remained a standard textbook of medicine up to the 1700s, stated: “Honey is the best treatment for the gums. To keep the teeth healthy mix honey with vinegar and use as mouth wash daily. If you rub the teeth with such a preparation it will whiten the teeth. Honey does not spoil and could also be used to preserve cadavers.” Likewise, Ibn Sina (Avicenna), another famous Muslim physician whose great medical treatise, the Canon, was the standard textbook on medicine in the Arab world and Europe until the 17th century, wrote: “Honey is good for prolonging life, preserve activity in old age. If you want to keep your youth, take honey. If you are above the age of 45, eat honey regularly, especially mixed with chestnut powder. Honey and flour could be used as dressing for wounds. For lung disease, early stage of tuberculosis, use a combination of honey and shredded rose petals. Honey can be used for insomnia on occasions.”
The Hindu scripture the Veda, which was composed about 1500 BC and written down about 600 BC, speaks of “this herb, born of honey, dripping honey, sweet honey, honied, is the remedy for injuries. Lotus honey is used for eye diseases. It is used as topical eye ointment in measles to prevent corneal scarring” (Imperato and Traore 1969), “moreover it crushes insects.” In the section on the Hymn to All Magic and Medicinal Plants, honey is used as a universal remedy: “The plants … which removes disease, are full of blossoms, and rich in honey … do I call to exempt him from injury” (Bogdanov 2009).
In ancient China, honey is mentioned in the Shi Jing, the Book of Songs, written in the 6th century BC. According to Chinese medicine, honey acts according to the principles of the Earth element, acting mainly on the stomach and on the spleen. It has Yang character, acting on the Triple Heater Meridian (Shaoyang) (Bogdanov 2009).
In Central and South America, honey from stingless bees was used for ages, long before Columbus. Honey of the native stingless bees was used and regarded as a gift of the gods; it was also a sign of fertility and was given as an offering to the gods (Bogdanov 2009).
Africa also has a long tradition of using bees for honey, both in the high cultures of Mediterranean Africa and in the more primitive cultures in regions to the south. Honey is used to treat infected leg ulcers in Ghana (Ankra-Badu 1992) and earaches in Nigeria (Obi et al. 1994). Other uses include treatment of gastric ulcers and constipation (Molan 1999).

Source: "Honey in Traditional and Modern Medicine (Traditional Herbal Medicines for Modern Times)" 1st Edition by Laïd Boukraâ (Editor)

25 Days: A Proven Program to Rewire Your Brain

Why twenty-five days? you ask. Let’s just say I’m partial to numbers that finally
work in my favor.
If you’ve ever heard of the notion that death always comes in threes, I can
personally vouch for that. In my case, death came three times for me in the same
night. But instead of losing my life, the experience changed it, affecting the way I
would view health and fitness from that day forward.
It was October 4, 2004, midway through my twenty-one-year career in fitness
and nutrition, when, while I was seated at the computer, my heart—
simply—
stopped—
beating.
Thirty seconds later, I recovered on my own, only to have my heart fail again
minutes later. I had no pulse. I wasn’t breathing. I was officially dead for the second
time for about six minutes before being revived by a paramedic, who plunged a big
needle full of epinephrine into my heart and defibrillated me three times.
My heart was beating, but I had been without oxygen to my brain to the point
where my lungs had already shut down. I had a pulse but no lung activity, so they
hooked me up to a ventilator and rushed me to the hospital. That’s where my heart
quit on me a third and final time. It took a minimum of ten defibrillations to bring
me back to life before I fell into a coma for three days. But that night, I made the
history books in a way I wouldn’t wish on anyone.
I died three times in three hours and became the world’s only known medical
case to survive three consecutive sudden cardiac arrests (SCA) without any kind of
implanted defibrillator.
When I woke up, I began to pull out all of my intravenous tubes because I didn’t
understand where I was—all I knew was that I wanted to get out of there. They
sedated me and removed me from life support, but I had no short-term memory. I
didn’t know who my parents were, or my girlfriend. You could tell me something,
and ninety seconds later, I wouldn’t know what you were talking about. But it
wasn’t amnesia. It was simply the inability to retain anything. In fact, to this day, I
have a blank space in my brain and can’t recall anything from October 4 until
Thanksgiving—two months of my life are still missing from my memory.
After enduring a week’s worth of tests and having a cardio defibrillator device
implanted in my chest, I was sent home with no real answers. The medical
community was surprised that I had survived and shocked that it had found
nothing wrong with my heart or any evidence of damage. The only two things
doctors were certain about were that a “random” electrical malfunction—most likely
stress—had caused my SCAs and that my being in shape and living a healthy
lifestyle were behind the fact that I was still alive.
Even though I left the hospital with what seemed to be a normal working brain,
I knew something wasn’t quite right. Due to the lack of oxygen flow to my brain
during my SCAs, I couldn’t stay focused and even found myself suffering from
clinical depression. It wouldn’t be until much later, after being diagnosed by Jeff
Ricks, MD, one of the world’s foremost experts on mass trauma management, that
I would discover I had battled what is known medically as mild brain trauma. But
at that moment, I just knew that the way my brain was working was not working
for me.
Up until my incident, I had been working as a personal trainer for ten years and
had been working extensively with NFL and NBA athletes in their off-seasons.
During that time, I trained both myself and my clients using very strict routines:
carefully planned workouts designed to prevent plateaus by gradually changing the
intensity, specificity, and volume over the course of twelve to twenty weeks. The
diets I relied on were even more complicated, involving three separate twelve- to
twenty-week phases.
I was a measurer and a calorie counter, focused on every single nutrient level in
every single food. I even wore a watch and set alarms to remind myself to eat at
exact times, just to try to capitalize on my body’s hormonal functions around
whatever stimulus I was getting by eating a particular food. If all that sounds
confusing, trust me, it was. In fact, it was nauseating.
But after my SCAs, I was suddenly someone who had to monitor his stress, so it
was unhealthy for me to follow complex and frustrating programs anymore. I was
also still someone who couldn’t remember what he had just done minutes before.
Sometimes my watch would go off, and I wouldn’t know what meal I was on.
Sometimes I wouldn’t even know what day it was. It was unbearable and undoable,
which was why I decided to stop everything I was trying to do and simplify it. I had
to work around my brain to keep my body from falling apart.
Instead of trying to focus on exercise and diet programs lasting twelve to twenty
weeks, I started focusing on one meal at a time. One snack at a time. One workout
at a time. And for each time I ate healthy or finished a workout, I gave myself a
grade of 100 percent. At the end of the day, I would sit down and go over
everything I had done—even if I didn’t remember doing half of what was on my
list. If I managed to do everything and I scored 100 percent on every meal, snack,
and workout, I considered myself successful.
And the next day, I would do it again. And the day after that.
At the end of the week, I added up my total score to see how successful I had
been for five days straight. After five consecutive blocks of five days, I added up my
score again, just to have a sense of the past month. Eventually, as my short-term
memory slowly returned and my depression lifted, within months, I was a changed
man—both physically and mentally. I was keenly aware that something felt
better about the program compared with methods I had used in the past.
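
The scoring approach described above is simple arithmetic, and a short sketch can make it concrete. The Python below is purely illustrative and is not from the book: it assumes each meal, snack, and workout is graded 100 if completed and 0 if skipped, that a day's score is the average of its items, and that five days form a block and five blocks form the 25-day cycle.

# Hypothetical illustration of the grading scheme described above (not from the book).
# Assumptions: each meal, snack, or workout is graded 100 if completed, 0 if skipped;
# a day's score is the average of its items; five days make a block, five blocks a cycle.

from typing import List

def day_score(items: List[int]) -> float:
    """Average grade for one day's meals, snacks, and workouts."""
    return sum(items) / len(items) if items else 0.0

def block_score(days: List[List[int]]) -> float:
    """Average score across a five-day block."""
    return sum(day_score(d) for d in days) / len(days)

def cycle_score(blocks: List[List[List[int]]]) -> float:
    """Average score across five blocks, i.e. one 25-day cycle."""
    return sum(block_score(b) for b in blocks) / len(blocks)

# Example: a day with three meals, two snacks, and a workout, one snack missed.
today = [100, 100, 100, 100, 0, 100]
print(f"Today's score: {day_score(today):.0f}%")

Summing instead of averaging would work just as well; the point, as the author describes it, is simply a daily tally that rolls up into five-day blocks.
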
Beyond getting back into incredible shape, the first thing I noticed was how
calm I became. I was no longer as worried about how my meals were balanced, and
I stopped weighing and measuring everything. Instead, I took an eyeball approach
with all my servings. I knew I was still eating healthy, but I took a very general
preventive health approach to my diet, instead of the very strict, hard-line approach
I had been used to following.
I also noticed that I was no longer that person who was hard to go out to eat
with, so my friends no longer had to kill themselves trying to find restaurants that
could accommodate my crazy dietary habits. Suddenly I could eat anywhere. I
accepted that every meal wouldn’t be perfect but so long as I ate certain foods,
everything would be all right.
I returned to work as a top trainer three months after my incident and started
using 25Days with clients immediately. But to be honest, I didn’t start them on it
because of the amazing results I had seen in myself; I did it because it was the only
way I could keep track of their programs! I had them carry journals and grade
themselves at every meal, snack, and day I wasn’t training them, so I always knew
exactly what to do and where they had slipped along the way.
It made my training job easier and made their outcomes more enjoyable for
them by streamlining my approach to diet and exercise into a twenty-five-day block
of time. By having them focus on what really mattered to get results, and asking
them to grade themselves each day, it left my clients feeling equally relaxed and as if
they were kicking life in the ass each and every day. And then an interesting thing
happened.
Before my SCAs, I had always had a great success rate in getting my clients onto
the difficult-to-manage nutrition programs I was suggesting. But even with that
high success rate, those programs weren’t practically maintainable in a
real-world situation. Suddenly my clients weren’t just hitting their
fitness and weight loss goals faster and more often, they were making positive
changes within other facets of their lives—and feeling like a success every step of the
way.
So . . . Is Your Life Worth Twenty-five Days?
For me, 25Days didn’t start as a choice—it began as something I needed to do to
overcome an obstacle.
I can’t eliminate my obstacle. I see it every day when I step out of the shower
and notice the scar on my chest. I’m reminded whenever I look down at Lucky, my
heart therapy service dog who works with me twenty-four hours a day. I’m aware of
it each time I offer him my palm to lick to make sure I’m doing okay—and any
time he gets me out of harm’s way if he senses my cortisol levels going through the
roof unexpectedly.
No, I can’t eliminate my obstacle, but I have no fear of it anymore. I’ve become
stronger than my obstacle—and so can you. So tell me, what’s your obstacle?
I know you have one, or you wouldn’t be reading this. We all have some kind of
barrier to becoming the best version of ourselves. And for many, that obstacle is
usually doubt or fear of failure. Either way, it makes them feel that they can never
be successful.
So I challenge you with this: Is your life worth twenty-five days?
Is the effort of putting in just twenty-five days too much to risk to eliminate
that obstacle for the rest of your life?
If, after twenty-five days, you begin to uncover a way to be consistently healthy
so you can live a life of full potential, then isn’t it worth it to try doing away with
that obstacle? I want you to have the best life possible, and the way to do that is
through the same commonsense, straightforward, no-nonsense approach that saved
me and has been successful with all of my clients. That’s what the 25Days program
is really all about. That said, take a deep breath. Now blow it out. If you’ve failed
every other time in your life or you’ve never tried for fear of failing, I want you to
relax. This will be the time you succeed. This is the way to be able to stay healthy
for the rest of your life. This is the way to rewire your brain to make it effortless to
make the choices necessary to live the life you deserve.
This is so much easier than you think it is. Just give me twenty-five days to show
you.

Wednesday, December 27, 2017

New biomarker could lead to early detection of Alzheimer's disease


Alzheimer's disease (AD)


Researchers at Sanford Burnham Prebys Medical Discovery Institute (SBP) have identified a peptide that could lead to the early detection of Alzheimer's disease (AD). The discovery, published in Nature Communications, may also provide a means of homing drugs to diseased areas of the brain to treat AD, Parkinson's disease, as well as glioblastoma, brain injuries and stroke.

"Our goal was to find a new biomarker for AD," says Aman Mann, Ph.D., research assistant professor at SBP who shares the lead authorship of the study with Pablo Scodeller, Ph.D., a postdoctoral researcher at SBP. "We have identified a peptide (DAG) that recognizes a protein that is elevated in the brain blood vessels of AD mice and human patients. The DAG target, connective tissue growth factor (CTGF) appears in the AD brain before amyloid plaques, the pathological hallmark of AD."

"CTGF is a protein that is made in the brain in response to inflammation and tissue repair," explains Mann. "Our finding that connects elevated levels of CTGF with AD is consistent with the growing body of evidence suggesting that inflammation plays an important role in the development of AD."

The research team identified the DAG peptide using in vivo phage display screening at different stages of AD development in a mouse model. In young AD mice, DAG detected the earliest stage of the disease. If the early appearance of the DAG target holds true in humans, it would mean that DAG could be used as a tool to identify patients at early, pre-symptomatic stages of the disease when treatments already available may still be effective.

"Importantly, we showed that DAG binds to cells and brain from AD human patients in a CTGF-dependent manner" says Mann. "This is consistent with an earlier report of high CTGF expression in the brains of AD patients."

"Our findings show that endothelial cells, the cells that form the inner lining of blood vessels, bind our DAG peptide in the parts of the mouse brain affected by the disease," says Erkki Ruoslahti, M.D., Ph.D., distinguished professor at SBP and senior author of the paper. "This is very significant because the endothelial cells are readily accessible for probes injected into the blood stream, whereas other types of cells in the brain are behind a protective wall called the blood-brain barrier. The change in AD blood vessels gives us an opportunity to create a diagnostic method that can detect AD at the earliest stage possible.

"But first we need to develop an imaging platform for the technology, using MRI or PET scans to differentiate live AD mice from normal mice. Once that's done successfully, we can focus on humans," adds Ruoslahti.

"As our research progresses we also foresee CTGF as a potential therapeutic target that is unrelated to amyloid beta (Aβ), the toxic protein that creates brain plaques," says Ruoslahti. "Given the number of failed clinical studies that have sought to treat AD patients by targeting Aβ, it's clear that treatments will need to be given earlier--before amyloid plaques appear--or have to target entirely different pathways.

"DAG has the potential to fill both roles -- identifying at risk individuals prior to overt signs of AD and targeted delivery of drugs to diseased areas of the brain. Perhaps CTGF itself can be a drug target in AD and other brain disorders linked to inflammation. We'll just have to learn more about its role in these diseases".

This technology has been licensed to a startup company, AivoCode Inc.

Source: www.sbpdiscovery.org/

Canola oil linked to worsening of Alzheimer’s


Canola oil has been marketed as a healthy cooking oil choice because of its low levels of saturated fat. Now a new study has revealed that consumption of canola oil could be linked to worsening memory and learning ability in patients with Alzheimer’s disease.

The study also shows that canola oil consumption could increase the formation of the amyloid plaques within the brain that are the signature feature of Alzheimer’s disease, and could also lead to weight gain. The study, titled “Effect of canola oil consumption on memory, synapse and neuropathology in the triple transgenic mouse model of Alzheimer’s disease,” was published in the latest issue of the journal Scientific Reports.

This is the first study to show that consumption of canola oil is not a beneficial choice for the brain. Senior study investigator Domenico Praticò, professor in the Departments of Pharmacology and Microbiology and director of the Alzheimer’s Center at the Lewis Katz School of Medicine at Temple University (LKSOM) in Philadelphia, explained that canola oil has been advertised as healthy and is an easily “appealing” choice because it is cheaper than other vegetable oils. There have been few or no studies of its effects on the brain, Praticò said.

For this study, the team of researchers used a laboratory mouse model specifically designed to mimic Alzheimer’s in humans. These rodents show no symptoms of memory loss or learning problems early in life and slowly progress to symptoms of Alzheimer’s as they age. They were divided into two groups at six months of age, before they showed any signs of memory loss. One group was given a normal diet, and the other group received a diet supplemented with the equivalent of about two tablespoons of canola oil daily. All the mice were assessed after a year.

Canola oil consumption significantly increased body weight compared with the normal diet. Working memory and learning ability were also reduced in the canola oil group; to test these, scientists typically have the mice navigate a set of mazes. When the brains of all the mice were examined, amyloid beta 1-40 levels were lower in the canola oil group. This peptide is known to be protective against Alzheimer’s disease. The canola oil group also developed the amyloid plaques in the brain that are a hallmark of Alzheimer’s disease, and on examination the brain injuries were extensive.

Dr. Praticò said, “Based on the evidence from this study, canola oil should not be thought of as being equivalent to oils with proven health benefits.” He explained that the team is planning longer studies to determine the extent of the harm caused by the oil, as well as its effects on other types of dementia and memory loss.

Elisabetta Lauretti, a graduate student in Dr. Praticò’s laboratory at LKSOM and a co-author on the new study, had earlier worked with him in the same laboratory on a study of the effect of olive oil on Alzheimer’s disease, which found that olive oil was beneficial for brain health. That study, too, was published earlier this year.

Source: www.nature.com/articles/s41598-017-17373

Wash Away Stress With The Power Of Nature



IF YOU GO DOWN TO THE WOODS TODAY, you’re in for a big surprise. Why? Because your time out in nature isn’t just a nice antidote to the digital world; it has real wellbeing benefits.

Yep, spending time in green spaces is a scientifically proven wellness concept that comes with an official name: ‘forest bathing’. The Japanese coined the phrase shinrin-yoku way back in 1982 (roughly translated as ‘taking in the forest atmosphere’ or ‘forest bathing’) and have turned it into a form of therapy that’s now thought to lower blood pressure, improve mood and focus, and reduce stress. In fact, shinrin-yoku is so popular it’s now part of Japan’s national health policy, with millions being spent on research and more than 55 official forest trails being created, with plans for many more. And it’s not only the Japanese who are heading for leafy areas. In Malaysia, the concept is known as mandi embun or ‘bathing in the forest dew’ and it’s catching on in South Korea, Taiwan, Finland, and (not surprisingly) Australia.

Nature’s medicine

Research shows that immersing yourself in natural, green spaces can improve creativity, mood, memory and focus – and that’s just for starters. Hypnotherapist Edrina Rush says it’s because we’re wired to be engrossed in nature and appreciate natural surroundings – especially when there’s an abundance of greenery. “Green is the colour we see the most in nature and it also signifies balance, calm and harmony,” she explains.

There’s evidence that your pituitary gland is stimulated, your muscles are more relaxed and your blood histamine levels increase when you’re exposed to the colour green. Rush adds that going outdoors can also help to manage levels of serotonin, the neurotransmitter that regulates our mood, behaviour and appetite.

“Too much serotonin and we can become irritable and tense, but too little serotonin and we can become depressed. Breathing fresh air [with more oxygen in it] can help regulate our serotonin [which is affected by oxygen], promoting wellbeing.”

Happily, the feel-good factor triggered by forest bathing can also have a positive effect on loved ones. “They’re most likely to reap the rewards of our positive psychological gain from spending time in forests,” says psychologist Dr Saima Latif.

The numerous wellbeing benefits of nature mean therapists are beginning to take their clients outdoors. Psychologist Maz Miller from Walk Different (walkdifferent.com.au) is one therapist tapping the benefits of Australia’s beautiful natural scenery for her walk-and-talk sessions in Sydney’s south, and she says it offers a unique opportunity to help patients unwind. “Practising mindfulness with ocean sounds is very different to trying to imitate that in an office with some music,” she explains. “People open up much more [in nature], they feel more comfortable when they’re looking around.”

Take it slow

As far as wellbeing trends go, this one’s pretty easy to pull off – you simply visit a forest, park or bushland, and walk while taking in your surroundings. It’s important to note that this practice isn’t a fast-paced one – it’s all about moving mindfully, contemplating your surroundings and allowing the serene setting to ‘wash’ your soul and rejuvenate your mind and body.

“Forest bathing is one of my favourite self-love practices,” says Chloe Kerman, 36, former fashion editor-turned-shamanic healer (chloeisidora.com). “I encourage clients and friends to connect with nature by walking in silence and allowing all of their senses to pick up information.” Kerman likes to lie down at the base of a tree and meditate – a process she finds deeply relaxing. “I often leave a forest-bathing session feeling happier, relaxed, in tune and inspired with creative ideas and increased energy,” she says.

Wondering why large, leafy places evoke these feelings? One study published in the journal Public Health reveals that being in a forest setting has a positive effect on acute emotions and is especially effective at soothing chronic stress. As well as reducing feelings of anxiety, it helps lower the risk of stress-related diseases. “The forest environment lowers your blood pressure, reduces your levels of stress hormones and increases levels of serum adiponectin, which helps to prevent obesity, type 2 diabetes and cardiovascular disease,” says Dr Latif. “The positive health effects of viewing natural landscapes on stress levels and on speed of recovery from stress or mental fatigue, faster physical recovery from illness and long-term improvement of health and wellbeing are reported in research.”

Into the woods

To dip into the forest bathing trend yourself, “Take longer walks in local parks and be present to the sounds and surroundings,” says Rush. “Go where it’s less busy and leave your phone at home.” She advises walking slowly, taking time to pause and tuning in to the sounds of birds and nature. “Touch leaves and walk barefoot to feel the sensations,” she suggests, adding that it’s a good opportunity to sit and take a few deep, conscious breaths, too.

If you’re ready to explore beyond your local park, look up your nearest national park’s trails, or pick up a copy of Walks in Nature: Australia by Viola Design (Explore Australia, $29.95) for 112 tracks in and around the nation’s major cities (including foodie pit stop recommendations!). Make sure you’re wearing comfortable walking gear, including sturdy shoes or hiking boots, and take water and some snacks for the road if you’re planning on being in the bush for a while. Oh, and if you’re forest bathing alone, always make sure you tell someone where you’ll be and how long you expect the adventure to take.

Want some company on the trail?

There are several accredited forest bathing guides in Australia who can help you soak up all the wellbeing benefits from your experience. Visit natureandforesttherapy.org to search for a guide in your area.

While getting outside is obviously ideal, you don’t have to physically visit a forest to enjoy its restorative powers. A recent study by the BBC and the University of California found that you can access some of the wellbeing benefits of this trend merely by watching nature documentaries. “Just viewing a forest scene has been documented to have a very positive effect on psychological healing and recovery from stress, especially for those from urbanised environments,” Dr Latif says.

Filling your home environment with natural light, plants and flowers can also increase your connection with nature, as interior designer Olivia Heath explains. “Research tells us that when we improve that sense of nature, directly or indirectly, it can create a more calming, restful, restorative and energising space,” she says. Try filling your home with easy care indoor greenery, such as maidenhair ferns, spider plants and rubber fig trees to bring the forest into your everyday world, and get back to nature more often.

Source: www.womensfitness.com.au