The Circle of Willis in Cryonics Perfusion

Blood flows into the brain primarily via the carotid arteries and the vertebral arteries. The Circle of Willis is a circular arterial structure in the brain that connects blood flowing in from the carotid arteries with blood flowing in from the basilar artery (which is fed by the vertebral arteries).
Blood flows from the Circle of Willis into brain tissue via the anterior, middle, and posterior cerebral arteries. Many studies have shown that the Circle of Willis is incomplete in most people. A 1998 study of 150 healthy adult volunteers showed a complete Circle of Willis in only 42% of cases — more often complete in younger persons and females [RADIOLOGY; Krabbe-Hartkamp,MJ; 207(1):103-111 (1998)]. A slightly more encouraging 2002 study of 118 healthy volunteers in the 65-68 age group showed that 47% had a complete Circle of Willis [THE JOURNAL OF CARDIOVASCULAR SURGERY; Macchi,C; 43(6):887-890 (2002)].

For cryonics purposes, it has been believed that perfusion into the carotid arteries, but not the vertebral arteries, will result in incomplete perfusion of the brain if the Circle of Willis is not complete. In particular, if both posterior communicating arteries are missing, then perfusing only through the carotid arteries will result in no blood reaching the parts of the brain supplied by the posterior cerebral arteries. Both posterior communicating arteries were missing in 11% of those in the 1998 study and in 14% of those in the 2002 study cited above.

Nonetheless, a 2008 study that found a complete Circle of Willis in only 40% of 99 patients found no case of insufficient perfusion in functional tests of patients given unilateral cerebral perfusion. The authors concluded that “extracranial collateral circulation” provides an alternative pathway to the Circle of Willis for cerebral cross-perfusion [EUROPEAN JOURNAL OF CARDIOTHORACIC SURGERY; Urbanski,PP; 33(3):402-408 (2008)]. Although persons with missing posterior communicating arteries could easily have pathways to opposite sides of the brain, other variants of Circle of Willis incompleteness would be expected to prevent perfusion across hemispheres.

When the cryonics organization Alcor does a cephalic isolation (“neuro”) perfusion, the carotid arteries are initially cannulated and the vertebrals are not. Only if no flow is seen coming from the vertebral arteries while the patient is being perfused into the carotid arteries are the vertebral arteries cannulated, with the patient then perfused through both the carotids and the vertebrals. If, on the other hand, flow is seen coming from one of the vertebral arteries after perfusion of the carotids has begun, it is assumed that the Circle of Willis is complete and the vertebral arteries are clamped for the rest of the perfusion. Flow only needs to be seen in one of the vertebrals to confirm that the Circle of Willis is complete, because the vertebrals unite in the basilar artery before connecting to the Circle of Willis.

One Alcor employee has informed me that of the 15-20 neuro patients perfused by this cephalic isolation method, not once has there been an absence of flow from the vertebrals, and not once has Alcor perfused a cephalic isolation patient through the vertebral arteries. This would be slightly improbable, based on a 10-15% expected rate of both posterior communicating arteries being missing. But another Alcor employee remembers one or two cases where vertebral artery perfusion was done (which would match expectations).
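As a back-of-envelope check (my own illustration, not from the article), the binomial arithmetic behind “slightly improbable” can be sketched in Python, assuming independent patients and the 10-15% rates cited above:

```python
# Probability that none of a series of patients lacks both posterior
# communicating arteries, assuming patients are independent draws from
# the population rates cited in the studies above (an illustration only).
def p_zero_cases(n_patients: int, rate: float) -> float:
    return (1 - rate) ** n_patients

for n in (15, 20):
    for rate in (0.10, 0.15):
        print(f"{n} patients at {rate:.0%}: P(no case) = {p_zero_cases(n, rate):.2f}")
```

At a 15% rate over 20 patients, the chance of never needing vertebral cannulation is only about 4%, while at a 10% rate over 15 patients it is about 21%, which is consistent with calling the reported run of cases slightly improbable rather than impossible.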

Even when both posterior communicating arteries are present, there is another potential problem with perfusing only into the carotids and not the vertebrals — namely, loss of perfusion pressure. Perfusate entering the Circle of Willis could exit through the basilar artery (the vertebrals) instead of through the cerebral arteries. Vascular resistance in the body is reportedly only one quarter of what it is in the brain. Clamping the vertebral arteries (as is done during Alcor neuro perfusions) could prevent this problem. Blood flowing into the basilar artery need not push all of the blood in the body ahead of it, however, because arteries — and especially veins — have a large reserve capacity (a balloon-like ability to expand).

Possibly the reserve capacity of the brain would allow blood to flow into the brain as readily as into the body. Blood has about three times the viscosity of water, and vitrification solution has about twice the viscosity of blood. Higher viscosity increases vascular resistance in all blood vessels, but the effect would be greater in the brain. The “no reflow” phenomenon would also create resistance in the blood vessels, which again might be greater in the brain than in the body.
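The viscosity effect can be made concrete with the Hagen-Poiseuille relation for laminar flow in a rigid tube, where resistance scales linearly with viscosity (and with the inverse fourth power of vessel radius). The numbers below are rough assumptions built from the ratios in the text, not measured values:

```python
import math

def poiseuille_resistance(viscosity, length, radius):
    """Hagen-Poiseuille resistance of a rigid tube: R = 8*mu*L / (pi * r**4)."""
    return 8 * viscosity * length / (math.pi * radius ** 4)

# Assumed viscosities in Pa*s, using the rough ratios in the text:
water = 1.0e-3
blood = 3 * water                    # ~3x water
vitrification_solution = 2 * blood   # ~2x blood

# For the same vessel and pressure gradient, flow is inversely
# proportional to resistance, so doubling viscosity halves flow.
vessel = dict(length=0.01, radius=0.5e-3)  # arbitrary illustrative vessel
r_blood = poiseuille_resistance(blood, **vessel)
r_vs = poiseuille_resistance(vitrification_solution, **vessel)
print(f"resistance ratio (vitrification solution vs blood): {r_vs / r_blood:.1f}")
```

Under these assumptions, switching from blood to vitrification solution doubles resistance in every vessel alike; the differential effect on the brain claimed in the text would have to come from geometry (narrower vessels) or “no reflow,” not from viscosity alone.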

Prior to the use of vitrification solution, the Cryonics Institute only perfused cryonics patients through the carotid arteries — there was no attempt to perfuse into the vertebral arteries. Nonetheless, dehydration was seen in the patients, and adequate effluent flow was seen from the jugular veins. Perfusion pressures were reportedly not excessive.

Currently, CI’s funeral director has been opening the chest (median sternotomy) and attempting to clamp the subclavian arteries, as well as the descending aorta, in order to perfuse into the ascending aorta. In several cases the ascending aorta has been perforated, forcing higher cannulation, or the subclavians have been difficult to cannulate after the chest was opened. Our funeral director refused to open the chest at all for a known case of Methicillin-Resistant Staphylococcus aureus (MRSA).

It would be preferable if a case could be made for perfusing all CI patients only through the carotids. Carotid-only perfusion has been recommended for vitrification perfusions overseas, as well as for glycerol perfusions in post-mortem sign-ups. Whether vitrification solution perfused into the carotids can achieve adequate perfusion pressure in the brain — and whether adequate perfusion pressure can be verified by observing effluent from the jugular veins — remains unresolved.

It should not be too difficult to clamp the vertebral arteries by cutting near the clavicle, as CI’s funeral director did when CI began the attempt to perfuse the vertebrals as well as the carotid arteries. Nonetheless, this would result in failure to perfuse portions of the brain supplied by the posterior cerebral arteries in the 10-15% of patients who are missing both posterior communicating arteries.

First published in The Immortalist, February, 2011

Cryonics Without Cerebral Dehydration?

One of the interesting things about technological progress in cryonics is that awareness of technological problems, and the desire to solve them, is often dependent on other problems being solved first. For example, cryoprotectant toxicity became a more serious concern after it was possible to eliminate ice formation. After all, it is more important to eliminate severe (mechanical) damage caused by ice crystals than to prevent (minor) alterations of biomolecules.
One problem that is increasingly rising to the top of technological issues to be solved is the extreme dehydration caused by the perfusion of cryoprotectants.

The fact that perfusion of the brain with cryoprotectants causes substantial dehydration has been known in cryonics for a long time. While no rigorous academic studies are available on this topic, it is usually assumed that the cause of this dehydration is that most (but not all) cryoprotectants have poor blood-brain barrier (BBB) permeability.

Another line of evidence is that prolonged warm and cold ischemia eliminate this dehydration, presumably because ischemia compromises the BBB in a time-dependent manner. One ironic consequence of this is that in cryonics severe dehydration is often an indicator of good patient care (i.e., minimization or mitigation of ischemia). Perhaps because of this there has been relatively little interest in eliminating CPA-induced cerebral dehydration.

Another reason is that dehydration actually assists in removing water from the brain to facilitate vitrification, perhaps even requiring lower concentrations of cryoprotectant than are necessary for the vitrification of other organs (preliminary evidence for this exists).

Cerebral dehydration was identified as a potential form of injury in a case report for patient A-1097 (2006), but until recently the “advantages” of dehydration seemed to outweigh its potential disadvantages. More serious concerns started to emerge in the last couple of years. Electron micrographs of brains cryopreserved with M22 and other cryoprotectants show ultrastructural alterations that are presumed to be primarily due to CPA-induced dehydration. The significance of this issue was further reinforced in 2015 when a researcher from 21st Century Medicine showed electron micrographs of aldehyde-stabilized vitrified brains (vitrification after chemical fixation) that look considerably better than traditionally vitrified brains. In addition, while employed by the Cryonics Institute, Yuri Pichugin demonstrated that the extreme dehydration associated with modern vitrification solutions is not compatible with good brain slice viability.

Since most researchers in cryonics would like to see a biopreservation protocol that does an excellent job of preserving both viability and ultrastructure, eliminating this kind of injury is likely to be a rather important research goal in the next couple of years.

It is not desirable to deliberately induce ischemia to improve BBB permeability of cryoprotectants. This leaves a number of strategies to improve delivery of vitrification agents to the brain:

1. Osmotic opening of the BBB. Molecules such as mannitol have a transient effect on BBB permeability but are probably not potent enough to permit brain cryoprotection without dehydration.

2. Yuri Pichugin has discovered that detergents such as sodium dodecyl sulfate (SDS) permit cryopreservation of the brain without dehydration.

3. Not all cryoprotectants are blocked by the BBB; some can penetrate the brain. Can these cryoprotectants be used in low-toxicity vitrification solutions?

Fortunately, the tools to screen the efficacy of BBB modifying technologies for brain cryopreservation are already known in the literature. Brains can be inspected for post-perfusion morphology and weight loss/gain. BBB modifiers can be tested for viability in brain slices or even whole animals. We can compare whether the use of BBB modifying strategies raises or lowers the concentration of cryoprotectant necessary to vitrify the brain. How do BBB modifiers affect overall ultrastructure in electron micrographs? What do BBB modifiers do to other cells and the vasculature? Do BBB modifiers produce more edema in the rest of the body? Will the use of BBB modifiers allow “extracellular” cryoprotectants and ice blockers to cross the BBB or even cells?

One challenge is how to validate and authorize the use of BBB modifying strategies in human cryonics cases. We know from burr hole and CT scans of neuro patients at Alcor that severe cerebral dehydration is frequently seen in good cases with little ischemia (shrinking the brain down to almost 50% of its natural size).

CT scan of Alcor patient with cryoprotectant-induced brain dehydration

Switching to a cryoprotectant with toxicity similar to or even lower than that of M22 would be relatively straightforward, but if potent agents are used to open the BBB it will be important to choose a dosage that does not produce serious side-effects such as fulminating edema or poor cell viability.

In a 2007 Alcor article (“Securing Viability of the Brain in Cryonics”) I speculated that we should assume that viability of the brain (or slices made from such a brain) is currently lost about halfway through cryoprotectant perfusion as a consequence of cryoprotectant toxicity. As we understand it now, the need to use high concentrations of cryoprotectants also produces brain shrinking. If we want to move Alcor closer to its mandate of developing reversible human cryopreservation, both problems will need to be resolved. This will most likely involve a minor re-formulation of M22 or a novel cryoprotectant that is more “friendly” to the brain.

For 2016, my lab Advanced Neural Biosciences has made identifying such a brain-friendly cryoprotection protocol a high priority. The good news is that we already know of strategies that work. Now we need to identify protocols that maximize viability and preserve excellent ultrastructure in order to take the next step in closing the gap between cryonics and suspended animation.

Originally published as a column in Cryonics magazine, January-February, 2016

Cryofixation and Chemopreservation

The most common modern protocol for imaging brain structure at high magnification is to chemically fix the brain with aldehydes (formaldehyde, glutaraldehyde) and heavy metals like osmium and then prepare it for electron microscopy imaging. Using this method, a tremendous amount of detailed anatomical information about the structure of the brain in its healthy and pathological state has been obtained, including the effects of (prolonged) ischemia.

Almost from its inception, however, the limitations of this method have been recognized. In particular, when fixatives are introduced to the brain through the process of perfusion a number of distinct artifacts are produced, notably shrinking of the brain and a reduction of the extracellular space. While different solutions and protocols have been developed to reduce these artifacts, the gold standard for ultrastructural analysis is a method that does not use aldehydes at all: cryofixation.

In cryofixation small tissue samples are rapidly cooled (without freezing) and then prepared for electron microscopy. This method produces the most realistic images of the ultrastructure of the brain, as evidenced by papers that compared this method with aldehyde fixation or used advanced tools to understand the properties of the brain without doing electron microscopy.

Although the word “vitrification” is rarely used in the context of cryofixation, the pristine images in this method can only be achieved when ice formation is avoided through ultra-rapid cooling. Vitrification without the use of high concentrations of (toxic) cryoprotectants would be quite attractive if it could be scaled to the size of organs (or even humans!) but unfortunately this method can only be used on very small tissue samples.

The pristine images obtained from cryofixation raise some important issues. Does conventional aldehyde fixation produce only predictable distortions, or is identity-specific information irreversibly lost? What are the ultrastructural effects of the heavy metal exposure when cryofixed samples are prepared for electron microscopy? In a more general sense, to what degree can we be confident that a technology can produce a completely realistic image of the ultrastructure of the brain?

Will computer simulations of scanned fixed brains need extensive correction if they are to serve as a simulation of the brain? One clear advantage of using viability assays in addition to electron microscopy is that we can test brain slices or whole brains for resumption of function (or retention of memory) after subjecting them to experimental protocols. This is a clear advantage of the use of cryopreservation technologies over chemical fixation. In a cryonics case we can monitor the patient from the start of our procedures to the point of long-term care and collect data and viability information. In the case of chemopreservation no such feedback is possible, and taking brain biopsies for electron microscopy is all we can do to assess the effects of our preservation procedures.

It is tempting for a cryonics organization to choose the method of preservation that produces the most crisp electron micrographs. In reality, however, there are challenges and unknown issues. Cryofixation cannot be scaled to work for cryonics. What is the effect of conventional aldehyde perfusion in ischemic brains? How do aldehyde fixed brains look on the molecular level compared to cryopreserved brains? How can we know that identity-critical information is not irreversibly altered? And, last but not least, any preservation technology that renders tissue dead by conventional criteria cannot be considered as a means for achieving true human suspended animation.

Originally published as a column in Cryonics magazine, September, 2015

Medical Myopia and Brain Death

Recently someone sent me a number of papers that discussed the biophilosophical underpinnings of brain death. Medical doctors increasingly find themselves in the midst of heated debates about what constitutes death by neurological criteria. It is not hard to understand how controversies can occur in this area. Whenever a patient who satisfies the criteria for brain death shows signs of improvement or recovery, these criteria are called into question. Or, perhaps more troublesome, some people will simply not concede that a patient is dead because recovery can be envisioned. In such cases, the concept of death becomes more like a subjective “decision” than an objective property of the brain.

To someone sympathetic to cryonics these debates are mildly infuriating because it shows the reckless medical myopia with which matters of life and death are approached. When bioethicists debate what constitutes “permanent and irreversible loss of the capacity for consciousness and self-awareness” there is little recognition of the possibility that what looks hopeless and irreversible by contemporary medical technologies may be rather straightforward to repair or recover by future medical technologies. Would we abandon a patient if a cure would be available tomorrow? What about next month? Next year? 50 years?

The standard rejoinder to this position is that cryopreservation of the patient (cryonics) itself produces irreversible damage to the brain and is thus not suitable to stabilize the patient long-term until more advanced treatments are available. But how can we know what will be considered irreversible damage in the future? Should we simply pull the plug based on our guesswork about the limits of future technologies? Would it not be more prudent to let future doctors make that determination?

This does look a lot like saying that cryonics is just an argument in favor of prudence based on ignorance. A sophisticated way of saying, “well, you never know!” Not quite. If a healthy brain without damage gives rise to consciousness and identity, it follows that if the original state of the brain can be inferred from the damaged state, the capacity to restore consciousness and identity is preserved in principle. Ice formation undeniably alters the structure of the brain but it does not make the ultrastructure “disappear.” In fact, at cryogenic temperatures nothing “disappears,” a point that is not even sufficiently recognized by many cryonics advocates. Today we can do better than freezing, though, and use vitrification agents, which solidify into a glass upon cooling to cryogenic temperatures. While these vitrification agents exhibit some toxicity, at the ultrastructural level this expresses itself at most as alteration of cell membranes, protein denaturation, etc., not wholesale destruction.

Where does this leave us on the issue of brain death? For starters, looking at a monitor and concluding that the patient is dead because of the absence of organized electrical activity will tell us little about the ultrastructure of the brain (case in point, at 15 degrees Celsius even a healthy brain will show a flat EEG). It is true that in some cases of brain death absence of electrical activity corresponds to substantial decomposition of brain tissue but it is important to recognize that in many such cases the brain has been permitted to self-destruct at body temperature as a result of trauma and ischemia. When a hospital is faced with a traumatic event of such magnitude that profound cell death can be expected, the most prudent action is to quickly cool the patient and prevent “information-theoretic death.” If the capacity for consciousness and awareness resides in the neuroanatomy of the brain, the first mandate of medicine is to preserve this.

Originally published as a column in Cryonics magazine, March, 2015

The Case for Brain Cryopreservation

Cryopreservation of just the head is as old as Alcor itself. In fact, some people identify Alcor with its “neuro-preservation” option. It is important, however, to recognize that the objective of preserving the head is really to preserve what is inside the head, i.e. the brain. While I am aware of (contrived) technical arguments that prefer head preservation over brain preservation for information-theoretical reasons, I suspect that no advocate of neuro-preservation is anxious about the prospect of having only his/her brain preserved in a pristine state.

This raises an important question – one that is not immediately evident to the general public. Why not just preserve the naked brain instead? I am aware of at least three major arguments against it and I think that these arguments are based on incomplete information or a lack of imagination.

Myth 1: The isolated brain is not a stable organ and will collapse upon itself in a jellylike state if it is removed from the skull.

Answer: In human cryopreservation the brain would only be extracted at low temperatures, which provide a lot more stability to the brain. In addition, in a good case the brain will also be loaded with a cryoprotectant and exist in a dehydrated state, which will provide even more stability.

Myth 2: Removing the brain from the skull will damage the brain and will erase identity-critical information.

Answer: It is correct that morticians typically remove the brain with little regard for its ultrastructural integrity but there is no reason why a cryonics organization should engage in such traumatic brain removal. Safe brain removal protocols are technically possible and cryonics organizations have a strong incentive to develop and refine such techniques.

Myth 3: The skull is necessary to provide protection to the brain.

Answer: It is undeniable that the skull provides robust protection to the brain but from that it does not follow that a cryonics organization cannot design a long-term enclosure and maintenance method that provides strong protection of the naked brain, too.

I do not claim that brain preservation is equal in all respects to neuro-preservation. For example, extraction of the brain from the skull requires additional time after completion of cryoprotectant perfusion and during this time the brain will be exposed to high levels of cryoprotectant (strictly speaking, isolated brain perfusion is possible but this requires a very advanced surgical procedure). Keeping the brain temperature low and uniform during brain removal is also a challenge.

On the other hand, there are potential advantages as well. An isolated brain can be placed in the cryoprotectant to allow diffusion of the vitrification agent prior to cryogenic cooldown to compensate for any ischemia-induced cortical perfusion impairment. In fact, if perfusion is no longer an option, immersion of the (fixed) brain in cryoprotectant is the only means to mitigate ice formation during cryostasis. Another advantage is a decrease in long-term care costs (at least 50%), which allows for lower cryopreservation minimums.

But the most important advantage of brain preservation is that negative public perception and PR risks would be substantially lower than with neuro-preservation. Even if the procedure were a little riskier (technically speaking), one could still argue that it is safer in general because images of cryopreserved brains do not risk the kind of visceral response that neuro-preservation triggers.

I cannot do justice to all the technical, logistical, and financial issues associated with brain-only cryopreservation here but the topic requires more study for the reason alone that cryonics organizations occasionally receive fixed brains, or patients with long ischemic times, for whom immersion cryoprotection could be superior to straight freezing. Brain cryopreservation does not exist as an option yet, but it has been the reality for a number of patients.

Originally published as a column in Cryonics magazine, January, 2014

Multiple Sclerosis and Human Enhancement

Multiple sclerosis is a disease that raises a lot of interesting questions for people interested in biogerontology, human enhancement, and even cryonics. It raises questions about immunosenescence and draws attention to possible immune improvements for biological human enhancement. Biotechnologies to induce myelin repair may even be useful for the repair of cryopreserved brains. Before I discuss multiple sclerosis from these perspectives, let us take a closer look at this medical condition.

Multiple sclerosis (MS) is an inflammatory autoimmune disorder of the central nervous system that results in axonal degeneration in the brain and spinal cord. In simple terms, multiple sclerosis is a disease wherein the body’s immune system attacks and damages the myelin sheath, the fatty tissue that surrounds axons in the central nervous system. The myelin sheath is important because it facilitates the conduction of electrical signals along neural pathways. Like electrical wires, neuronal axons require insulation to ensure that they are able to transmit a signal accurately and at high speeds. It is these millions of nerves that carry messages from the brain to other parts of the body and vice versa.

More specifically, MS involves the loss of oligodendrocytes, the cells responsible for creating and maintaining the myelin sheath. This results in a thinning or complete loss of myelin (i.e., demyelination) and, as the disease advances, the breakdown of the axons of neurons. A repair process, called remyelination, takes place in early phases of the disease, but the oligodendrocytes are unable to completely rebuild the cell’s myelin sheath. Repeated attacks lead to successively less effective remyelinations, until a scar-like plaque is built up around the damaged axons.

The name multiple sclerosis refers to the scars (scleroses—better known as plaques or lesions) that form in the nervous system. These scars most commonly affect the white matter in the optic nerve, brain stem, basal ganglia, and spinal cord or white matter tracts close to the lateral ventricles of the brain. The peripheral nervous system is rarely involved. These lesions are the origin of the symptoms during an MS “attack.”

In addition to immune-mediated loss of myelin, which is thought to be carried out by T lymphocytes, B lymphocytes, and macrophages, another characteristic feature of MS is inflammation caused by a class of white blood cells called T cells, a kind of lymphocyte that plays an important role in the body’s defenses. In MS, T cells enter the brain via disruptions in the blood-brain barrier. The T cells recognize myelin as foreign and attack it, which is why these cells are also called “autoreactive lymphocytes.”

The attack of myelin starts inflammatory processes which trigger other immune cells and the release of soluble factors like cytokines and antibodies. Further breakdown of the blood–brain barrier in turn causes a number of other damaging effects such as swelling, activation of macrophages, and more activation of cytokines and other destructive proteins. These inflammatory factors could lead to or enhance the loss of myelin, or they may cause the axon to break down completely.

Because multiple sclerosis is not selective for specific neurons, and can progress through the brain and spinal cord at random, each patient’s symptoms may vary considerably. When a patient experiences an “attack” of increased disease activity, the impairment of neuronal communication can manifest as a broad spectrum of symptoms affecting sensory processing, locomotion, and cognition.

Some of the most common symptoms include: numbness and/or tingling of the limbs, like pins and needles; extreme and constant fatigue; slurring or stuttering; dragging of feet; vision problems, especially blurred vision; loss of coordination; inability to walk without veering and bumping into things; weakness; tremors; pain, especially in the legs; dizziness; and insomnia. There are many other symptoms, as well, such as loss of bowel or bladder control, the inability to process thoughts (which leads to confusion), and passing out. Some MS patients lose their vision and many lose their ability to walk. The symptoms are not necessarily the same for all patients and, in fact, an individual MS patient does not always have the same symptoms from day to day or even from minute to minute.

One of the most prevalent symptoms of MS is extreme and chronic fatigue. Assessment of fatigue in MS is difficult because it may be multifactorial, caused by immunologic abnormalities as well as other conditions that contribute to fatigue such as depression and disordered sleep (Braley and Chervin, 2010). Pharmacologic treatments such as amantadine and modafinil have shown favorable results for subjective measures of fatigue. Both drugs are well tolerated and have a mild side-effect profile (Life Extension Foundation, 2013).

It is estimated that multiple sclerosis affects approximately 85 out of every 100,000 people (Apatoff, 2002). The number of known patients is about 400,000 in the United States and about 2.5 million worldwide (Braley & Chervin, 2010). In recent years, there has been an increase in identified multiple sclerosis patients, with about 50 percent more women reporting the disease. Indeed, between two and three times as many women have MS as men. Most patients are diagnosed between the ages of 20 and 50, but MS can strike at any age (National Multiple Sclerosis Society, 2013).

Incidence of multiple sclerosis varies by geographic region and certain demographic groups (Apatoff, 2002; Midgard, 2001). There is evidence that worldwide distribution of MS may be linked to latitude (Midgard, 2001). In the U.S., for instance, there is a lower rate of MS in the South than in other regions (Apatoff, 2002). Data regarding race shows 54 percent of MS patients are white, 25 percent are black and 19 percent are classified as other (Apatoff, 2002).

There are four disease courses identified in MS:

Relapsing-Remitting: Patients have clearly defined acute attacks or flare-ups that are referred to as relapses. During the relapse, the patient experiences worsening of neurologic function—the body or mind will not function properly. The relapse is followed by either partial or total recovery, called remissions, when symptoms are alleviated. About 85 percent of MS patients fall into this category (National Multiple Sclerosis Society, 2013).

Primary-Progressive: The disease slowly and consistently gets worse with no relapses or remissions. Progression of the disease occurs over time and the patient may experience temporary slight improvements of functioning. About 10 percent of MS patients fall into this category (National Multiple Sclerosis Society, 2013).

Secondary-Progressive: The patient appears to have relapsing-remitting MS, but over time the disease becomes steadily worse. There may or may not be plateaus, flare-ups, or remissions. About half the people originally diagnosed with relapsing-remitting MS will move into this category within 10 years (National Multiple Sclerosis Society, 2013).

Progressive-Relapsing: Quick disease progression with few, if any, remissions. About 5 percent of MS patients fall into this category at diagnosis (National Multiple Sclerosis Society, 2013).

The cause(s) of multiple sclerosis remain unknown, although research suggests that both genetic and environmental factors contribute to the development of the disease (National Multiple Sclerosis Society, 2013; Compston and Coles, 2002). The current prevailing theory is that MS is a complex multifactorial disease based on a genetic susceptibility but requiring an environmental trigger, and that it causes tissue damage through inflammatory/immune mechanisms. Widely varying environmental factors have been found to be associated with the disease, ranging from infectious agents to Vitamin D deficiency and smoking. The debate these days revolves primarily around whether immune pathogenesis is primary, or acts secondarily to some other trigger (Braley & Chervin, 2010).

Risk factors for multiple sclerosis include genetics and family history, though it is believed that up to 75% of MS must be attributable to non-genetic or environmental factors. Infection is one of the more widely suspected non-genetic risk factors. A commonly held theory is that viruses involved in the development of autoimmune diseases could mimic the proteins found on nerves, making those nerves a target for antibodies. The potential roles of several viruses have been investigated including herpes simplex virus (HSV), rubella, measles, mumps, and Epstein-Barr virus (EBV). The strongest correlation between a virus and MS exists with EBV—virtually 100% of patients who have MS are seropositive for EBV (the rate in the general public is about 90%)—but potential causality remains strongly debated (Ludwin and Jacobson, 2011).

It is important to keep in mind that infectious agents such as viruses may, in fact, have nothing to do with causing MS. The association of a virus with MS is based on increased antibody response and may be an epiphenomenon of a dysregulated global immune response. “Proving” causality will require consistent molecular findings as well as consistent results from well-controlled clinical trials of virus-specific antiviral therapies (as yet to be developed). In the end, any theory concerning causality in MS should also account for the strong association with other environmental factors such as Vitamin D deficiency and smoking. Indeed, a landmark study found that, compared to those with the highest levels of vitamin D, those with the lowest blood levels were 62% more likely to develop MS. Additionally, a literature review evaluating more than 3000 MS cases and 45,000 controls indicates that smoking increases the risk of developing MS by approximately 50% (Life Extension Foundation, 2013).

Recently, researchers have pinpointed a specific toxin they believe may be responsible for the onset of MS. Epsilon toxin—a byproduct of the bacterium Clostridium perfringens—is able to permeate the blood brain barrier and has been demonstrated to kill oligodendrocytes and meningeal cells. Loss of oligodendrocytes and meningeal inflammation are both part of the MS disease process, and may be triggered by exposure to epsilon toxin.

The fact that females are more susceptible to inflammatory autoimmune diseases, including multiple sclerosis, points to the potential role of hormones in the etiology of multiple sclerosis. Interestingly, the course of disease is affected by the fluctuation of steroid hormones during the female menstrual cycle and female MS patients generally experience clinical improvements during pregnancy (Life Extension Foundation, 2013). Additionally, pregnancy appears to be protective against the development of MS. A study in 2012 demonstrated that women who have been pregnant two or more times had a significantly reduced risk of developing MS, while women who have had five or more pregnancies had one-twentieth the risk of developing MS compared to women who were never pregnant. (The increase in MS prevalence over the last few decades could reflect the fact that women are having fewer children.) A growing body of evidence supports the therapeutic potential of hormones (both testosterone and estrogens) in animal models of multiple sclerosis, but more research is needed to understand the pathways and mechanisms underlying the beneficial effects of sex hormones on MS pathology (Gold and Voskuhl, 2009).

No single test gives a definitive diagnosis for MS, and variable symptoms and disease course make early diagnosis a challenge. Most diagnoses are presumptive and are based on the clinical symptoms seen in an acute attack. Supporting evidence of these presumptions is then sought, usually from a combination of magnetic resonance imaging (MRI) of the brain, testing the cerebrospinal fluid (CSF) for antibodies, measuring the
efficiency of nerve impulse conduction, and monitoring symptoms over time.

As there is still much work to be done in understanding the nature of multiple sclerosis, a cure has yet to be discovered. Conventional medical treatment typically focuses on strategies to treat acute attacks, to slow the progression of the disease, and to treat symptoms. Corticosteroids such as methylprednisolone are the first line of defense against acute MS attacks and are administered in high doses to suppress the immune system and decrease the production of proinflammatory factors. Plasma exchange is also used to physically remove antibodies and proinflammatory factors from the blood.

The use of beta interferons is a longstanding MS treatment strategy, originally envisioned as an antiviral compound. Beta interferons reduce inflammation and slow disease progression, but the mechanism of action is poorly understood. Other immunosuppressant drugs such as Mitoxantrone and Fingolimod also slow disease progression, but are not used as first-line treatments due to their severe side effects. More recently, researchers at Oregon Health & Science University have noted that an antioxidant called MitoQ has been shown to significantly reverse symptoms in a mouse model of MS (Mao, Manczak, Shirendeb, and Reddy, 2013).

Besides pharmacological treatments, MS patients may benefit from therapies (such as physical and speech therapy) and from an optimized nutritional protocol. Supplementation with Vitamin D, Omega-3 and -6 fatty acids, Vitamin E, lipoic acid, Vitamin B12, and Coenzyme Q10 appears to be of particular potential benefit (Life Extension Foundation, 2013). Until a definitive cause for MS can be defined and a cure developed, such strategies, including hormone therapy, offer possible ways to improve quality of life over the course of disease progression.

Unlike Alzheimer’s disease, there does not appear to be a Mendelian variant of MS that will invariably produce the disease in people who have the gene. A somewhat puzzling variable is that MS predominantly tends to occur between the ages of 20 and 50. This appears to exclude approaching MS as a form of immunosenescence. After all, if MS were a function of the aging immune system, we would see progressively more cases of MS as people get older (or in AIDS patients), ultimately involving many very old people. More likely, MS is a non-age-related form of dysfunction of the immune system that is triggered by environmental factors (such as a viral infection). While many discussions about the role of viruses in debilitating diseases like Alzheimer’s and MS still suffer from an incomplete understanding of cause and effect, it seems reasonable to conclude that enhancement of the human immune system can greatly reduce disease and improve the quality of life, even in healthy humans.

One potential treatment for MS is to induce remyelination (or inhibit processes that interfere with efficient remyelination). Stem cells can be administered to produce oligodendrocyte precursor cells, which in turn produce the oligodendrocyte glial cells that are responsible for remyelination of axons. While the myelin sheaths of these remyelinated axons are not as thick as the myelin sheaths that are formed during development, remyelination can improve conduction velocity and prevent the destruction of axons. While the dominant repair strategies envisioned for cryonics involve molecular nanotechnologies that can build any biochemical structures that physical law permits, it is encouraging to know that specific stem cell therapies will be available to repair and restore myelin function in cryonics patients, as damage to myelin should be expected as a result of (prolonged) ischemia and cryoprotectant toxicity.

An interesting possibility is that remyelination therapies may also be used for human enhancement if these therapies can be tweaked to improve conduction velocity in humans or to induce certain desirable physiological responses by varying the composition and strength of the myelin sheath in various parts of the central nervous system.

References

Apatoff, Brian R. (2002). MS on the rise in the US. Neurology Alert 20(7), 55(2).

Braley, Tiffany J., Chervin, Ronald D. (2010). Fatigue in Multiple Sclerosis: Mechanisms, evaluation, and treatment. Sleep 33(8), 1061-1067.

Compston, Alastair, and Coles, Alasdair (2002). Multiple sclerosis. The Lancet 359(9313), 1221-1231.

Gold, Stefan M., and Voskuhl, Rhonda R. (2009). Estrogen and testosterone therapies in multiple sclerosis. Progress in Brain Research 175: 239-251.

Life Extension Foundation (2013). Multiple Sclerosis, in: Disease Prevention and Treatment, 5th edition, 947-956.

Ludwin, SK, and Jacobson, S. (2011). Epstein-Barr Virus and MS: Causality or association? The International MS Journal 17(2), 39-43.

Mao, Peizhong, Manczak, Maria, Shirendeb, Ulziibat P., and Reddy, P. Hemachandra (2013). MitoQ, a mitochondria-targeted antioxidant, delays disease progression and alleviates pathogenesis in an experimental autoimmune encephalomyelitis mouse model of multiple sclerosis. Biochimica et Biophysica Acta Molecular Basis of Disease 1832(12), 2322-2331.

Midgard, R. (2001). Epidemiology of multiple sclerosis: an overview. Journal of Neurology, Neurosurgery and Psychiatry 71(3), 422.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, February, 2014

Recent developments relevant to cryonics

A lot of interesting pieces related to cryonics have appeared over the last few months that I thought I would share:

Four professors conclude in MIT Technology Review that there is a significant and growing body of evidence in support of human cryopreservation: “The Science Surrounding Cryonics”

New York Times cover story by a Pulitzer Prize-winning journalist: “A Dying Young Woman’s Hope in Cryonics and a Future”

Skeptic Michael Shermer writes a piece in Scientific American called “Can Our Minds Live Forever?”

Here are three recent important peer reviewed papers:

Dr. Greg Fahy and Robert McIntyre of 21st Century Medicine describe here a new cryobiological and neurobiological technique, aldehyde-stabilized cryopreservation (ASC), which demonstrates the relevance and utility of advanced cryopreservation science for the neurobiological research community. The ASC technology is now also competing against Dr. Mikula at the Max Planck Institute in the Brain Preservation Prize.

The Grand Challenges of Organ Banking and Its Potential is described by a large group of the world’s leading cryobiology scientists: The first Organ Banking Summit was convened from Feb. 27 – March 1, 2015 in Palo Alto, CA, with events at Stanford University, NASA Research Park, and Lawrence Berkeley National Labs. Experts at the summit outlined the potential public health impact of organ banking, discussed the major remaining scientific challenges that need to be overcome in order to bank organs, and identified key opportunities to accelerate progress toward this goal. Many areas of public health could be revolutionized by the banking of organs and other complex tissues, including transplantation, oncofertility, tissue engineering, trauma medicine and emergency preparedness, basic biomedical research and drug discovery – and even space travel.

Persistence of Long-Term Memory in Vitrified and Revived Caenorhabditis elegans. Two scientists ask the question: “Can memory be retained after cryopreservation?” and then demonstrate that a form of long-term memory in C. elegans is not modified by the process of vitrification or slow freezing.

Though She Isn’t Really Ill, There’s a Little Yellow Pill…

Humans have been ingesting mind- and mood-altering substances for millennia, but it has only rather recently become possible to begin to elucidate drug mechanisms of action and to use this information, along with our burgeoning knowledge of neuroscience, to design drugs intended to have a specific effect. And though most people think of pharmaceuticals as “medicine,” it has become increasingly popular to discuss the possibilities for the use of drugs in enhancement, or improvement of “human form or functioning beyond what is necessary to sustain or restore good health” (E.T. Juengst; in Parens, 1998, p 29).

Some (transhumanists) believe that enhancement may not only be possible, but that it may even be a moral duty. Others (bioconservatives) fear that enhancement may cause us to lose sight of what it means to be human altogether. It is not the intention of this article to advocate enhancement or to denounce it. Instead, let’s review some of the drugs (and/or classes of drugs) that have been identified as the most promising for cognitive or mood enhancement. Many of the drugs we will cover can be read about in further depth in Botox for the brain: enhancement of cognition, mood and pro-social behavior and blunting of unwanted memories (Jongh, R., et al., Neuroscience and Biobehavioral Reviews 32 (2008): 760-776).

When considering potential cognitive enhancer drugs, it is most important to keep in mind that, to date, no “magic bullets” appear to exist. That is, there are no drugs exhibiting such specificity as to have only the primary, desired effect. Indeed, a general principle of trade-offs (particularly in the form of side effects) appears to exist when it comes to drug administration for any purpose, whether treatment or enhancement. Such facts may constitute barriers to the practical use of pharmacological enhancers and should be taken into consideration when discussing the ethics of enhancement.

Some currently available cognitive enhancers include donepezil, modafinil, dopamine agonists, guanfacine, and methylphenidate. There are also efforts underway to develop memory-enhancing drugs, and we will discuss a few of the mechanisms by which they are proposed to act. Besides cognitive enhancement, the enhancement of mood and prosocial behavior in normal individuals are other types of enhancement that may be effected pharmacologically, most usually by antidepressants or oxytocin. Let’s briefly cover the evidence for the efficacy of each of these in enhancing cognition and/or mood before embarking on a discussion of the general principles of enhancement and ethical concerns.

One of the most widely cited cognitive enhancement drugs is donepezil (Aricept®), an acetylcholinesterase inhibitor. In 2002, Yesavage et al. reported the improved retention of training in healthy pilots tested in a flight simulator. In this study, after training in a flight simulator, half of the 18 subjects took 5 mg of donepezil for 30 days and the other half were given a placebo. The subjects returned to the lab to perform two test flights on day 30. The donepezil group was found to perform similarly to the initial test flight, while placebo group performance declined. These results were interpreted as an improvement in the ability to retain a practiced skill. However, it seems possible that the better performance of the donepezil group could have been due to improved attention or working memory during the test flights on day 30.

Another experiment by Gron et al. (2005) looked at the effects of donepezil (5 mg/day for 30 days) on performance of healthy male subjects on a variety of neuropsychological tests probing attention, executive function, visual and verbal short-term and working memory, semantic memory, and verbal and visual episodic memory. They reported a selective enhancement of episodic memory performance, and suggested that the improved performance in Yesavage et al.’s study is not due to enhanced visual attention, but to increased episodic memory performance.

Ultimately, there is scarce evidence that donepezil improves retention of training. Better designed experiments need to be conducted before we can come to any firm conclusions regarding its efficacy as a cognitive enhancer.

The wake-promoting agent modafinil (Provigil®) is another currently available drug that is purported to have cognitive enhancing effects. Provigil® is indicated for the treatment of excessive daytime sleepiness and is often prescribed to those with narcolepsy, obstructive sleep apnea, and shift work sleep disorder. Its mechanisms of action are unclear, but it is supposed that modafinil increases hypothalamic histamine release, thereby promoting wakefulness by indirect activation of the histaminergic system. However, some suggest that modafinil works by inhibiting GABA release in the cerebral cortex.

In normal, healthy subjects, modafinil (100-200 mg) appears to be an effective countermeasure for sleep loss. In several studies, it sustained alertness and performance of sleep-deprived subjects (up to 54.5 hours) and has also been found to improve subjective attention and alertness, spatial planning, stop signal reaction time, digit-span and visual pattern recognition memory. However, at least one study (Randall et al., 2003) reported “increased psychological anxiety and aggressive mood” and failed to find an effect on more complex forms of memory, suggesting that modafinil enhances performance only in very specific, simple tasks.

The dopamine agonists d-amphetamine, bromocriptine, and pergolide have all been shown to improve cognition in healthy volunteers, specifically working memory and executive function. Historically, amphetamines have been used by the military during World War II and the Korean War, and more recently as a treatment for ADHD (Adderall®). But usage statistics suggest that it is commonly used for enhancement by normal, healthy people—particularly college students.

Interestingly, the effect of dopaminergic augmentation appears to follow an inverted-U relationship between endogenous dopamine levels and working memory performance. Several studies have provided evidence for this by demonstrating that individuals with a low working-memory capacity show greater improvements after taking a dopamine receptor agonist, while high-span subjects either do not benefit at all or show a decline in performance.

Guanfacine (Intuniv®) is an α2 adrenoceptor agonist, also indicated for treatment of ADHD symptoms in children, but one that works by increasing norepinephrine levels in the brain. In healthy subjects, guanfacine has been shown to improve visuospatial memory (Jakala et al., 1999a, Jakala et al., 1999b), but the beneficial effects were accompanied by sedative and hypotensive effects (i.e., side effects). Other studies have failed to replicate these cognitive enhancing effects, perhaps due to differences in dosages and/or subject selection.

Methylphenidate (Ritalin®) is a well-known stimulant that works by blocking the reuptake of dopamine and norepinephrine. In healthy subjects, it has been found to enhance spatial working-memory performance. Interestingly, as with dopamine agonists, an inverted-U relationship was seen, with subjects with lower baseline working memory capacity showing the greatest improvement after methylphenidate administration.

Future targets for enhancing cognition are generally focused on enhancing plasticity by targeting glutamate receptors (responsible for the induction of long-term potentiation) or by increasing CREB (known to strengthen synapses). Drugs targeting AMPA receptors, NMDA receptors, or the expression of CREB have all shown some promise in cognitive enhancement in animal studies, but few, if any, experiments have been carried out to determine effectiveness in normal, healthy humans.

Beyond cognitive enhancement, there is also the potential for enhancement of mood and pro-social behavior. Antidepressants are the first drugs that come to mind when discussing the pharmacological manipulation of mood, including selective serotonin reuptake inhibitors (SSRIs). Used for the treatment of mood disorders such as depression, SSRIs are not indicated for normal people of stable mood. However, some studies have shown that administration of SSRIs to healthy volunteers resulted in a general decrease of negative affect (such as sadness and anxiety) and an increase in social affiliation in a cooperative task. Such decreases in negative affect also appeared to induce a positive bias in information processing, resulting in decreased perception of fear and anger from facial expression cues.

Another potential use for pharmacological agents in otherwise healthy humans would be to blunt unwanted memories by preventing their consolidation. This may be accomplished by post-training disruption of noradrenergic transmission (as with the β-adrenergic receptor antagonist propranolol). Propranolol has been shown to impair the long-term memory of emotionally arousing stories (but not emotionally neutral stories) by blocking the enhancing effect of arousal on memory (Cahill et al., 1994). In a particularly interesting study making use of patients admitted to the emergency department, post-trauma administration of propranolol reduced physiologic responses during mental imagery of the event 3 months later (Pitman et al., 2002). Further investigations have supported the memory blunting effects of propranolol, possibly by blocking the reconsolidation of traumatic memories.

GENERAL PRINCIPLES

Reviewing these drugs and their effects leads us to some general principles of cognitive and mood enhancement. The first is that many drugs have an inverted U-shaped dose-response curve, where low doses improve and high doses impair performance. This is potentially problematic for the practical use of cognition enhancers in healthy individuals, especially when doses that are most effective in facilitating one behavior simultaneously exert null or detrimental effects on other behaviors.

Second, a drug’s effect can be “baseline dependent,” where low-performing individuals experience greater benefit from the drug while higher-performing individuals do not see such benefits (which might simply reflect a ceiling effect), or may, in fact, see a deterioration in performance (which points to an inverted-U model). In the case of an inverted-U model, low-performing individuals are found on the up slope of the inverted U and thus benefit from the drug, while high-performing individuals are located near the peak of the inverted U already and, in effect, experience an “overdose” of neurotransmitter that leads to a decline in performance.
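The baseline-dependence described above can be made concrete with a toy quadratic inverted-U model. This is purely an illustrative sketch: the function, the specific baseline values, and the dose are hypothetical and are not taken from any of the studies cited.

```python
# Toy inverted-U dose-response model (illustrative only).
# Performance peaks when total signaling (endogenous baseline + drug dose)
# equals some optimum; the same dose helps or hurts depending on baseline.

def performance(baseline: float, dose: float, optimum: float = 1.0) -> float:
    """Quadratic inverted U: maximal when (baseline + dose) == optimum."""
    return 1.0 - (baseline + dose - optimum) ** 2

low_span = 0.5   # hypothetical low-baseline (low working-memory span) individual
high_span = 1.0  # hypothetical individual already at the peak of the curve

dose = 0.3
low_change = performance(low_span, dose) - performance(low_span, 0.0)
high_change = performance(high_span, dose) - performance(high_span, 0.0)

print(f"low-span change:  {low_change:+.2f}")   # positive: up slope of the U
print(f"high-span change: {high_change:+.2f}")  # negative: pushed past the peak
```

The same dose produces a gain for the low-baseline individual and a loss for the high-baseline one, which is exactly the pattern reported for dopamine agonists and methylphenidate.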

Trade-offs exist in the realm of cognitive enhancing drugs as well. As mentioned, unwanted “side effects” are often experienced with drug administration, ranging from mild physiological symptoms such as sweating to more concerning issues like increased agitation, anxiety, and/or depression.

More specific trade-offs may come in the form of impairment of one cognitive ability at the expense of improving another. Some examples of this include the enhancement of long-term memory but deterioration of working memory with the use of drugs that activate the cAMP/protein kinase A (PKA) signaling pathway. Another trade-off could occur between the stability versus the flexibility of long-term memory, as in the case of certain cannabinoid receptor antagonists which appear to lead to more robust long-term memories, but which also disrupt the ability of new information to modify those memories. Similarly, a trade-off may exist between stability and flexibility of working memory. Obviously, pharmacological manipulations that increase cognitive stability at the cost of a decreased capacity to flexibly alter behavior are potentially problematic in that one generally does not wish to have difficulty in responding appropriately to change.

Lastly, there is a trade-off involving the relationship between cognition and mood. Many mood-enhancing drugs, such as alcohol and even antidepressants, impair cognitive functioning to varying degrees. Cognition-enhancing drugs may also impair emotional functions. Because cognition and emotion are intricately regulated through interconnected brain pathways, inducing change in one area may have effects in the other. Much more research remains to be performed to elucidate these interactions before we can come to any firm conclusions.

ETHICAL CONCERNS

Again, though it is not the place of this article to advocate or denounce the use of drugs for human enhancement, obviously there are considerable ethical concerns when discussing the administration of drugs to otherwise healthy human beings. First and foremost, safety is of paramount importance. The risks and side-effects, including physical and psychological dependence, as well as long-term effects of drug use should be considered and weighed heavily against any potential benefits.

Societal pressure to take cognitive enhancing drugs is another ethical concern, especially in light of the fact that many may not actually produce benefits to the degree desired or expected. In the same vein, the use of enhancers may give some a competitive advantage, thus leading to concerns regarding fairness and equality (as we already see in the case of physical performance-enhancing drugs such as steroids). Additionally, it may be necessary, but very difficult, to make a distinction between enhancement and therapy in order to define the proper goals of medicine, to determine health-care cost reimbursement, and to “discriminate between morally right and morally problematic or suspicious interventions” (Parens, 1998). Of particular importance will be determining how to deal with drugs that are already used off-label for enhancement. Should they be provided by physicians under certain conditions? Or should they be regulated in the private commercial domain?

There is an interesting argument that using enhancers might change one’s authentic identity—that enhancing mood or behavior will lead to a personality that is not really one’s own (i.e., inauthenticity), or even dehumanization—while others argue that such drugs can help users to “become who they really are,” thereby strengthening their identity and authenticity. Lastly, according to the President’s Council on Bioethics, enhancement may “threaten our sense of human dignity and what is naturally human” (The President’s Council, 2003). According to the Council, “the use of memory blunters is morally problematic because it might cause a loss of empathy if we would habitually ‘erase’ our negative experiences, and because it would violate a duty to remember and to bear witness of crimes and atrocities.” On the other hand, many people believe that we are morally bound to transcend humans’ basic biological limits and to control the human condition. But even they must ask: what is the meaning of trust and relationships if we are able to manipulate them?

These are all questions without easy answers. It may be some time yet before the ethical considerations of human cognitive and mood enhancement really come to a head, given the apparently limited benefits of currently available drugs. But we should not avoid dealing with these issues in the meantime; for there will come a day when significant enhancement, whether via drugs or technological means, will be possible and available. And though various factions may disagree about the morality of enhancement, one thing is for sure: we have a moral obligation to be prepared to handle the consequences of enhancement, both positive and negative.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, December, 2013

Brain Fitness

Book Review: The SharpBrains Guide to Brain Fitness: How to Optimize Brain Health and Performance at Any Age by Alvaro Fernandez

Of all the organs in the human body, a cryonicist should be most concerned about the health and integrity of his or her brain. Thousands of books have been written about physical health and fitness, but very few address the topic of how to keep the brain fit and healthy. Happily, interest in brain fitness, once relegated to academics and gerontologists, is now taking root across America and the world.

The importance of lifelong learning and mental stimulation as a component of healthy aging has long been recognized and touted as a way to stay mentally alert and to stave off dementia in old age. As with physical exercise, “use it or lose it” appears to apply to our brains too. And now that scientists are learning more about neuroplasticity and how brains change as a result of aging, they have begun to test the effects of various factors on brain health and cognitive ability across the lifespan.

Unfortunately, like much health-related research, the results reported by the media have often been convoluted, confusing, and even contradictory. Products developed by overzealous entrepreneurs make outlandish claims and frequently don’t deliver the purported results. Consumers and professionals alike are left wondering what works and what doesn’t when it comes to maintaining our brains in optimal working condition.

To aid all those navigating the murky waters of brain fitness, enter SharpBrains—a company dedicated to tracking news, research, technology, and trends in brain health and to disseminating information about the applications of brain science innovation. In so doing, they “maintain an annual state-of-the-market consumer report series, publish consumer guides to inform decision-making, produce an annual global and virtual professional conference,” and maintain SharpBrains.com, a leading educational blog and website with over 100,000 monthly readers.

Most recently, SharpBrains has published a book on brain fitness called The SharpBrains Guide to Brain Fitness: How to Optimize Brain Health and Performance at Any Age. A compilation and condensation of information accumulated over the lifespan of the company, The SharpBrains Guide to Brain Fitness emphasizes credible research and goes to great lengths to provide the most up-to-date research results in specific areas of brain fitness, followed by interviews with scientists doing work in those fields. The goal of the guide is to help the reader begin to “cultivate a new mindset and master a new toolkit that allow us [to] appreciate and take full advantage of our brain’s incredible properties…[by] providing the information and understanding to make sound and personally relevant decisions about how to optimize your own brain health and performance.”

The Guide begins by emphasizing that the brain’s many neuronal networks serve distinct functions including various types of memory, language, emotional regulation, attention, and planning. Plasticity of the brain is defined as its lifelong capacity to change and reorganize itself in response to the stimulation of learning and experience—the foundation upon which “brain training” to improve cognitive performance at any age, and to maintain brain health into old age, is predicated.

The difficulty of making sense of the scientific findings on brain health and neuroplasticity is discussed at length, with the finger of blame pointed squarely at the media for reporting only fragments of the research and for often reporting those results which are not most meaningful. The authors stress that “it is critical to complement popular media sources with independent resources, and above all with one’s own informed judgment.”

The following chapters go on to review what is known today about how physical exercise, nutrition, mental challenge, social engagement, and stress management can positively affect brain health. Along the way they provide dozens of relevant research results (as well as the design of each study) to support their recommendations. Reporting on all of those experiments is beyond the scope of this review, so if you are interested in examining them (and you should be!) please obtain a copy of the Guide for yourself or from your local library.

Physical exercise is discussed first because of the very strong evidence that exercise, especially aerobic (or “cardio”) exercise, slows atrophy of the brain associated with aging, actually increasing the brain’s volume of neurons (i.e., “gray matter”) and connections between neurons (i.e., “white matter”). While much of the initial research supporting the effects of exercise on the brain came from animal studies, the authors report that “several brain imaging studies have now shown that physical exercise is accompanied by increased brain volume in humans.”

Staying physically fit improves cognition across all age groups, with particularly large benefits for so-called “executive” functions such as planning, working memory, and inhibition. A 2010 meta-analysis by the NIH also concluded that physical exercise is a key factor in postponing cognitive decline and/or dementia, while other studies have found physical exercise to lower the risk of developing Parkinson’s disease, as well.

But don’t think that just any moving around will do the trick. When it comes to providing brain benefits, a clear distinction is drawn between physical activity and physical exercise. Only exercise will trigger the biochemical changes in the brain that spur neurogenesis and support neuroplasticity. It doesn’t need to be particularly strenuous, but to be most beneficial it should raise your heart rate and increase your breathing rate.

Of course, adequate nutrition is also imperative in obtaining and maintaining optimal brain health. The SharpBrains Guide to Brain Fitness primarily highlights the well-known benefits of the Mediterranean diet, which consists of a high intake of vegetables, fruit, cereals, and unsaturated fats; a low intake of dairy products, meat, and saturated fats; a moderate intake of fish; and regular but moderate alcohol consumption. But I think it is safe to say that the jury is still out on the best diet for the brain, as evidenced by the recent popularity of the Paleo diet among life extensionists. And, of course, ethnicity and genetics are important, too. The authors do stress the importance of omega-3 fatty acids and antioxidants obtained from dietary sources, stating firmly that “to date, no supplement has conclusively been shown to improve cognitive functioning, slow down cognitive decline, or postpone Alzheimer’s disease symptoms beyond placebo effect.” This includes herbal supplements such as Ginkgo biloba and St. John’s wort.

Beyond what we normally do to keep our bodies healthy, the Guide also discusses the relative effectiveness of different forms of “mental exercise.” Perhaps you’ve heard that doing crossword or Sudoku puzzles will keep you sharp and alert into old age, or that speaking multiple languages is associated with decreased risk of Alzheimer’s disease. The good news is that these things are true—to a degree. The part that is often left out is that it’s the challenge of these activities that is important. As with physical activity vs. physical exercise, mental exercise refers to the subset of mental activities that are effortful and challenging.

Puzzles and games may be challenging at first, but they (and other mental exercises) can quickly become routine and unchallenging. In order to reap the most benefit from mental exercise, the goal is to be exposed to novelty and increasing levels of challenge. Variety is important for stimulating all aspects of cognitive ability and performance, so excessive specialization is not the best strategy for maintaining long-term brain health. If you are an artist, try your hand at strategy-based games. If you’re an economist, try an artistic activity. Get out of your comfort zone in order to stimulate skills that you rarely use otherwise.

The SharpBrains Guide states that “lifelong participation in cognitively engaging activities results in delayed cognitive decline in healthy individuals and in spending less time living with dementia in people diagnosed with Alzheimer’s disease.” This is hypothesized to be because doing so builds up one’s “cognitive reserve”—literally an extra reservoir of neurons and neuronal connections—which may be utilized so that a person continues to function normally even in the face of underlying Alzheimer’s or other brain pathology. This observation raises another important point on which neuroscientists and physiologists do not yet fully agree. Will we all eventually get dementia if we live long enough without credible brain rejuvenation biotechnologies? This is a topic I would like to return to in a future installment of Cooler Minds Prevail.

Social engagement also appears to provide brain benefits. The NIH meta-analysis mentioned earlier concluded that higher social engagement in mid- to late life is associated with higher cognitive functioning and reduced risk of cognitive decline. Brain imaging studies indicate an effect of social stimulation on the volume of the amygdala, a structure that plays a major role in our emotional responses and which is closely connected to the hippocampus, which is important for memory.

Yet again, not all activity is equal. When it comes to social stimulation, “you can expect to accrue more benefits within groups that have a purpose (such as a book club or a spiritual group) compared to casual social interactions (such as having a drink with a friend to relax after work).” To keep socially engaged across the lifespan, seek out interactions that naturally involve novelty, variety, and challenge such as volunteering and participating in social groups.

“The lifelong demands on any person have changed more rapidly in the last thousand years than our genes and brains have,” The SharpBrains Guide explains in the introduction to the chapter on stress management. The result? It has become much more difficult to regulate stress and emotions. It is great that we have such amazing and complex brains, but humans are among the few animals that can get stressed from their own thoughts. And while there are some (potentially) beneficial effects of short bursts of stress, high and sustained levels of stress can have a number of negative consequences. Those of note include: increased blood cortisol levels, which can lead to sugar imbalances, high blood pressure, loss of muscle tissue and bone density, lowered immunity, and damage to the brain; reduced levels of certain neurotransmitters, such as serotonin and dopamine, which have been linked to depression; and a hampered ability to make the changes needed to reduce the stress, resulting in General Adaptation Syndrome (aka “burnout”).

Research-based lifestyle solutions to combat stress include exercise, relaxation, socialization, humor and laughter, and positive thinking. In particular, targeted, capacity-building techniques such as biofeedback and meditation are recommended to manage stress and build resilience. Mindfulness-Based Stress Reduction (MBSR) programs have provided evidence that meditative techniques can help manage stress, and research shows that MBSR can lead to decreases in the density of an area of the amygdala, a change correlated with reductions in reported stress.

So it appears that multiple approaches are necessary to develop a highly fit brain capable of adapting to new situations and challenges throughout life. “Consequently,” The SharpBrains Guide to Brain Fitness states, “we expect cross-training the brain to soon become as mainstream as cross-training the body is today, going beyond unstructured mental activity in order to maximize specific brain functions.”

There is growing evidence that brain training can work, but in evaluating what “works” we are mostly looking at two things: how successful the training program is (i.e., does it actually improve the skill(s) being trained?) and the likelihood of transfer from training to daily life. Building on an analysis of documented examples of brain training techniques that “work” or “transfer,” SharpBrains suggests the following five conditions need to be met for brain training to be likely to translate into meaningful real world improvements (condensed excerpt):

  1. Training must engage and exercise a core brain-based capacity or neural circuit identified to be relevant to real-life outcomes.
  2. The training must target a performance bottleneck.
  3. A minimum “dose” of 15 hours total per targeted brain function, performed over 8 weeks or less, is necessary for real improvement.
  4. Training must be adaptive to performance, require effortful attention, and increase in difficulty.
  5. Over the long-term, the key is continued practice for continued benefits.

Meditation, biofeedback, and/or cognitive therapy in concert with cognitive training to optimize targeted brain functions appear to be winning combinations in terms of successful techniques facilitating transfer from training to real life benefits. Top brain training software programs, based on SharpBrains’ analysis and a survey of their users, include Lumosity, Brain games, brainHQ, Cogmed, and emWave.

In the end, brain fitness needs are unique to each individual and brain fitness claims should be evaluated skeptically. SharpBrains recommends asking several questions when evaluating brain fitness claims, particularly whether there is clear and credible evidence of the program’s success documented in peer-reviewed scientific papers published in mainstream scientific journals that analyze the effects of the specific product.

Of course, your own individual experience with the product is ultimately the most important evaluation of all. If you are ready to take the plunge into the emerging brain fitness market, The SharpBrains Guide to Brain Fitness is a good place to start, and I’m sure they’d appreciate your feedback as this field continues to develop.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, August, 2013

HIV, Immunosenescence, and Accelerated Aging

After a few articles considering Alzheimer disease from several angles, I would like to switch gears this month and talk more generally about the interaction between the immune system and aging.

In his 2012 paper[1], Caleb E. Finch documents the evolution of life expectancy in the course of human history. The life expectancy at birth of our shared ape ancestor 6 million years ago is hypothesized to approximate that of a chimpanzee, 15 years. The first Homo species appeared 1-2 million years ago and had a life expectancy of ~20 years, while H. sapiens came onto the scene ~100,000 years ago and could expect about 30 years of life. But starting around 200 years ago, concurrent with industrialization, human life expectancy jumped rapidly, to somewhere between 70 and 80 years today.

As many readers are likely aware, the huge recent increases in life expectancy are commonly attributed to improvements in hygiene, nutrition, and medicine during the nineteenth and twentieth centuries that reduced mortality from infections at all ages. Finch hypothesizes, generally, that early age mortality over the course of human history is primarily due to (acute) infection, while old age mortality is primarily due to (chronic) inflammation. Further analysis of mortality rates over the last several hundred years leads him to the additional hypothesis that aging has been slowed in proportion to the reduced exposure to infections in early life. These hypotheses are supported by twentieth century examples which strongly demonstrate influences of the early life environment on adult health, such as the effects of prenatal and postnatal developmental influences (e.g., nutrition, exposure to infection) on adult chronic metabolic and vascular disorders as well as physical traits and mental characteristics. This leads Finch to suggest “broadening the concept of ‘developmental origins’ to include three groups of factors: nutritional deficits, chronic stress from socioeconomic factors, and direct and indirect damage from infections.”

Finch also considers the effects of inflammation and diet on human evolution, proposing several environmental and foraging factors that may have been important in the genetic basis for evolving lower basal mortality through interactions with chronic inflammation, in particular: dietary fat and caloric content; infections from pathogens ingested from carrion and from exposure to excreta; and noninfectious inflammagens such as those in aerosols and in cooked foods. He hypothesizes that exposure to these proinflammatory factors, which one would expect to shorten life expectancy, actually resulted in humans evolving lower mortality and longer lifespans in response to highly inflammatory environments.

A means for this, he argues, was the development of the apoE4 genotype. Noting that the apoE4 allele favors advantageous fat accumulation and is also associated with enhanced inflammatory responses, Finch argues that heightened inflammatory response and more efficient fat storage would have been adaptive in a pro-inflammatory environment and during times of uncertain nutrition. As has been discussed in prior articles in Cooler Minds Prevail, the apoE alleles also influence diverse chronic non-infectious degenerative diseases and lifespan. “Thus,” Finch concludes, “the apoE allele system has multiple influences relevant to evolution of brain development, metabolic storage, host defense, and longevity.”

With the general relationship between inflammation and the evolution of human aging and life expectancy in mind, let us now consider immune system involvement in more detail, and the relationship between HIV and immunosenescence more specifically.

Immunosenescence refers to the age-associated deterioration of the immune system. As an organism ages it gradually becomes deficient in its ability to respond to infections and experiences a decline in long-term immune memory. This is due to a number of specific biological changes such as diminished self-renewal capacity of hematopoietic stem cells, a decline in total number of phagocytes, impairment of Natural Killer (NK) and dendritic cells, and a reduction in B-cell population. There is also a decline in the production of new naïve lymphocytes and the functional competence of memory cell populations. As a result, advanced age is associated with increased frequency and severity of pathological health problems as well as an increase in morbidity due to impaired ability to respond to infections, diseases, and disorders.

It is not hard to imagine that an increased viral load leading to chronic inflammatory response may accelerate aging and immunosenescence. Evidence for this has been accumulating rapidly since the advent of antiretroviral therapies for treatment of HIV infection. An unforeseen consequence of these successful therapies is that HIV patients are living longer but a striking number of them appear to be getting older faster, particularly showing early signs of dementia usually seen in the elderly. In one study, slightly more than 10% of older patients (avg = 56.7 years) with well-controlled HIV infection had cerebrospinal fluid (CSF) marker profiles consistent with Alzheimer disease[2] – more than 10 times the prevalence in the general population at the same age. HIV patients also register higher rates of insulin resistance and cholesterol imbalances, suffer elevated rates of melanoma and kidney cancers, and experience seven times the rate of other non-HIV-related cancers. And ultimately, long-term treated HIV-infected individuals also die at an earlier age than HIV-uninfected individuals[3].

Recent research is beginning to explore and unravel the interplay between HIV infection and other environmental factors (such as co-infection with other viruses) in the acceleration of the aging process of the immune system, leading to immunosenescence. In the setting of HIV infection, the immune response is associated with abnormally high levels of activation, leading to a cascade of continued viral spread and cell death, and accelerating the physiologic steps associated with immunosenescence. Despite clear improvements associated with effective antiretroviral therapy, some subjects show persistent alterations in T cell homeostasis, especially constraints on T cell recovery, which are further exacerbated in the setting of co-infection and increasing age.

Unsurprisingly, it has been observed that markers of immunosenescence might predict morbidity and mortality in HIV-infected adults as well as the general population. In both HIV infection and aging, immunosenescence is marked by an increased proportion of CD28−, CD57+ memory CD8+ T cells with reduced capacity to produce interleukin 2 (IL-2), increased production of interleukin 6 (IL-6), resistance to apoptosis, and shortened telomeres. Levels of markers of inflammation are elevated in HIV-infected patients, and elevations in markers such as high-sensitivity C-reactive protein, D-dimer, and IL-6 have been associated with increased risk for cardiovascular disease, opportunistic conditions, or all-cause mortality[4].

But even as we are beginning to identify markers that appear to be associated with risk of poor outcome in HIV infection, it is still unclear how patients should be treated on the basis of this information. To that end, several trials are underway to evaluate the effects of modulation of immune activation and inflammation in HIV infection. At the same time, clinicians at the forefront of advancing knowledge and clinical care are performing research aimed at optimizing care for aging HIV patients.

The implications for such research may be far-reaching. In fact, many HIV clinicians and researchers think that HIV may be key to understanding aging in general. Dr. Eric Verdin states, “I think in treated, HIV-infected patients the primary driver of disease is immunological. The study of individuals who are HIV-positive is likely to teach us things that are really new and important, not only about HIV infection, but also about normal aging.”

Dr. Steven Deeks stresses the collaborative efforts of experts across fields. “I think there is a high potential for tremendous progress in understanding HIV if we can assemble a team of experts from the world of HIV immunology and the world of gerontology,” he says. “Each field can dramatically inform the other. I believe HIV is a well described, well studied, distinct disease that can be used as a model by the larger community to look at issues of aging.”

References

[1] Finch, C (2012). Evolution of the Human Lifespan, Past, Present, and Future: Phases in the Evolution of Human Life Expectancy in Relation to the Inflammatory Load. Proceedings of the American Philosophical Society, 156:1, 9-44.

[2] Mascolini, M (2013). Over 10% in Older HIV Group Fit Alzheimer’s Biomarker Risk Profile. Conference Reports for NATAP: 20th Conference on Retroviruses and Opportunistic Infections, March 3-6, 2013.

[3] Aberg, J. (2012). Aging, Inflammation, and HIV Infection. Topics in Antiviral Medicine, 20:3, 101-105.

[4] Deeks, S., Verdin, E., and McCune, J.M. (2012). Immunosenescence and HIV. Current Opinion in Immunology, 24: 1-6.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, June, 2013