Multiple Sclerosis and Human Enhancement

Multiple sclerosis is a disease that raises interesting questions for people interested in biogerontology, human enhancement, and even cryonics. It prompts questions about immunosenescence and draws attention to the immune system as a target for biological human enhancement. Biotechnologies that induce myelin repair may even be useful for the repair of cryopreserved brains. Before I discuss multiple sclerosis from these perspectives, let us take a closer look at this medical condition.

Multiple sclerosis (MS) is an inflammatory autoimmune disorder of the central nervous system that results in axonal degeneration in the brain and spinal cord. In simple terms, multiple sclerosis is a disease wherein the body’s immune system attacks and damages the myelin sheath, the fatty tissue that surrounds axons in the central nervous system. The myelin sheath is important because it facilitates the conduction of electrical signals along neural pathways. Like electrical wires, neuronal axons require insulation to ensure that they can transmit signals accurately and at high speed. These millions of myelinated nerve fibers carry messages from the brain to other parts of the body and vice versa.

More specifically, MS involves the loss of oligodendrocytes, the cells responsible for creating and maintaining the myelin sheath. This results in a thinning or complete loss of myelin (i.e., demyelination) and, as the disease advances, the breakdown of the axons of neurons. A repair process, called remyelination, takes place in the early phases of the disease, but the oligodendrocytes are unable to completely rebuild the axon’s myelin sheath. Repeated attacks lead to successively less effective remyelination, until a scar-like plaque builds up around the damaged axons.

The name multiple sclerosis refers to the scars (scleroses—better known as plaques or lesions) that form in the nervous system. These scars most commonly affect the white matter in the optic nerve, brain stem, basal ganglia, and spinal cord, or white matter tracts close to the lateral ventricles of the brain. The peripheral nervous system is rarely involved. These lesions are the origin of the symptoms during an MS “attack.”

In addition to immune-mediated loss of myelin, which is thought to be carried out by T lymphocytes, B lymphocytes, and macrophages, another characteristic feature of MS is inflammation, driven largely by T cells, lymphocytes that play an important role in the body’s defenses. In MS, T cells enter the brain via disruptions in the blood-brain barrier. The T cells recognize myelin as foreign and attack it, which is why these cells are also called “autoreactive lymphocytes.”

The attack on myelin sets off inflammatory processes that trigger other immune cells and the release of soluble factors like cytokines and antibodies. Further breakdown of the blood-brain barrier in turn causes a number of other damaging effects such as swelling, activation of macrophages, and further release of cytokines and other destructive proteins. These inflammatory factors can initiate or intensify the loss of myelin, or they may cause the axon to break down completely.

Because multiple sclerosis is not selective for specific neurons, and can progress through the brain and spinal cord at random, each patient’s symptoms may vary considerably. When a patient experiences an “attack” of increased disease activity, the impairment of neuronal communication can manifest as a broad spectrum of symptoms affecting sensory processing, locomotion, and cognition.

Some of the most common symptoms include: numbness and/or tingling of the limbs, like pins and needles; extreme and constant fatigue; slurring or stuttering; dragging of feet; vision problems, especially blurred vision; loss of coordination; inability to walk without veering and bumping into things; weakness; tremors; pain, especially in the legs; dizziness; and insomnia. There are many other symptoms, as well, such as loss of bowel or bladder control, the inability to process thoughts (which leads to confusion), and passing out. Some MS patients lose their vision and many lose their ability to walk. The symptoms are not necessarily the same for all patients and, in fact, an individual MS patient does not always have the same symptoms from day to day or even from minute to minute.

One of the most prevalent symptoms of MS is extreme and chronic fatigue. Assessing fatigue in MS is difficult because it may be multifactorial, caused by immunologic abnormalities as well as by comorbid conditions such as depression and disordered sleep (Braley and Chervin, 2010). Pharmacologic treatments such as amantadine and modafinil have shown favorable results on subjective measures of fatigue. Both drugs are well tolerated and have a mild side-effect profile (Life Extension Foundation, 2013).

It is estimated that multiple sclerosis affects approximately 85 out of every 100,000 people (Apatoff, 2002). The number of known patients is about 400,000 in the United States and about 2.5 million worldwide (Braley & Chervin, 2010). In recent years, the number of identified multiple sclerosis patients has increased, with roughly 50 percent more women reporting the disease. Indeed, between two and three times as many women have MS as men. Most patients are diagnosed between the ages of 20 and 50, but MS can strike at any age (National Multiple Sclerosis Society, 2013).

Incidence of multiple sclerosis varies by geographic region and among certain demographic groups (Apatoff, 2002; Midgard, 2001). There is evidence that the worldwide distribution of MS may be linked to latitude (Midgard, 2001). In the U.S., for instance, there is a lower rate of MS in the South than in other regions (Apatoff, 2002). Data regarding race show that 54 percent of MS patients are white, 25 percent are black, and 19 percent are classified as other (Apatoff, 2002).

There are four disease courses identified in MS:

Relapsing-Remitting: Patients have clearly defined acute attacks or flare-ups that are referred to as relapses. During a relapse, the patient experiences worsening of neurologic function—the body or mind will not function properly. The relapse is followed by partial or total recovery, called remission, during which symptoms are alleviated. About 85 percent of MS patients fall into this category (National Multiple Sclerosis Society, 2013).

Primary-Progressive: The disease slowly and consistently gets worse, with no relapses or remissions. Progression occurs over time, though the patient may experience slight temporary improvements in functioning. About 10 percent of MS patients fall into this category (National Multiple Sclerosis Society, 2013).

Secondary-Progressive: The patient appears to have relapsing-remitting MS, but over time the disease becomes steadily worse. There may or may not be plateaus, flare-ups, or remissions. About half the people originally diagnosed with relapsing-remitting MS will move into this category within 10 years (National Multiple Sclerosis Society, 2013).

Progressive-Relapsing: Quick disease progression with few, if any, remissions. About 5 percent of MS patients fall into this category at diagnosis (National Multiple Sclerosis Society, 2013).

The cause(s) of multiple sclerosis remain unknown, although research suggests that both genetic and environmental factors contribute to the development of the disease (National Multiple Sclerosis Society, 2013; Compston and Coles, 2002). The current prevailing theory is that MS is a complex multifactorial disease based on a genetic susceptibility but requiring an environmental trigger, one that causes tissue damage through inflammatory/immune mechanisms. Widely varying environmental factors have been found to be associated with the disease, ranging from infectious agents to Vitamin D deficiency and smoking. The debate these days revolves primarily around whether immune pathogenesis is primary, or acts secondarily to some other trigger (Braley & Chervin, 2010).

Risk factors for multiple sclerosis include genetics and family history, though it is believed that up to 75% of MS risk must be attributable to non-genetic or environmental factors. Infection is one of the more widely suspected non-genetic risk factors. A commonly held theory is that viruses involved in the development of autoimmune diseases could mimic the proteins found on nerves, making those nerves a target for antibodies. The potential roles of several viruses have been investigated, including herpes simplex virus (HSV), rubella, measles, mumps, and Epstein-Barr virus (EBV). The strongest correlation between a virus and MS exists with EBV—virtually 100% of patients who have MS are seropositive for EBV (the rate in the general public is about 90%)—but potential causality remains strongly debated (Ludwin and Jacobson, 2011).

It is important to keep in mind that infectious agents such as viruses may, in fact, have nothing to do with causing MS. The association of a virus with MS is based on increased antibody response and may be an epiphenomenon of a dysregulated global immune response. “Proving” causality will require consistent molecular findings as well as consistent results from well-controlled clinical trials of virus-specific antiviral therapies (yet to be developed). In the end, any theory concerning causality in MS should also account for the strong association with other environmental factors such as Vitamin D deficiency and smoking. Indeed, a landmark study found that those with the highest blood levels of vitamin D were 62% less likely to develop MS than those with the lowest levels. Additionally, a literature review evaluating more than 3,000 MS cases and 45,000 controls indicates that smoking increases the risk of developing MS by approximately 50% (Life Extension Foundation, 2013).

Recently, researchers have pinpointed a specific toxin they believe may be responsible for the onset of MS. Epsilon toxin—a byproduct of the bacterium Clostridium perfringens—is able to permeate the blood-brain barrier and has been demonstrated to kill oligodendrocytes and meningeal cells. Loss of oligodendrocytes and meningeal inflammation are both part of the MS disease process, and may be triggered by exposure to epsilon toxin.

The fact that females are more susceptible to inflammatory autoimmune diseases, including multiple sclerosis, points to the potential role of hormones in the etiology of multiple sclerosis. Interestingly, the course of disease is affected by the fluctuation of steroid hormones during the female menstrual cycle and female MS patients generally experience clinical improvements during pregnancy (Life Extension Foundation, 2013). Additionally, pregnancy appears to be protective against the development of MS. A study in 2012 demonstrated that women who have been pregnant two or more times had a significantly reduced risk of developing MS, while women who have had five or more pregnancies had one-twentieth the risk of developing MS compared to women who were never pregnant. (The increase in MS prevalence over the last few decades could reflect the fact that women are having fewer children.) A growing body of evidence supports the therapeutic potential of hormones (both testosterone and estrogens) in animal models of multiple sclerosis, but more research is needed to understand the pathways and mechanisms underlying the beneficial effects of sex hormones on MS pathology (Gold and Voskuhl, 2009).

No single test gives a definitive diagnosis for MS, and variable symptoms and disease course make early diagnosis a challenge. Most diagnoses are presumptive and are based on the clinical symptoms seen in an acute attack. Supporting evidence of these presumptions is then sought, usually from a combination of magnetic resonance imaging (MRI) of the brain, testing the cerebrospinal fluid (CSF) for antibodies, measuring the efficiency of nerve impulse conduction, and monitoring symptoms over time.

As there is still much work to be done in understanding the nature of multiple sclerosis, a cure has yet to be discovered. Conventional medical treatment typically focuses on strategies to treat acute attacks, to slow the progression of the disease, and to treat symptoms. Corticosteroids such as methylprednisolone are the first line of defense against acute MS attacks and are administered in high doses to suppress the immune system and decrease the production of proinflammatory factors. Plasma exchange is also used to physically remove antibodies and proinflammatory factors from the blood.

The use of beta interferons is a longstanding MS treatment strategy, originally envisioned as an antiviral compound. Beta interferons reduce inflammation and slow disease progression, but their mechanism of action is poorly understood. Other immunosuppressant drugs such as mitoxantrone and fingolimod also slow disease progression, but are not used as first-line treatments due to their severe side effects. More recently, researchers at Oregon Health & Science University have shown that an antioxidant called MitoQ significantly reverses symptoms in a mouse model of MS (Mao, Manczak, Shirendeb, and Reddy, 2013).

Besides pharmacological treatments, MS patients may benefit from therapies (such as physical and speech therapy) and from an optimized nutritional protocol. Supplementation with Vitamin D, Omega-3 and -6 fatty acids, Vitamin E, lipoic acid, Vitamin B12, and Coenzyme Q10 appears to be of particular potential benefit (Life Extension Foundation, 2013). Until a definitive cause for MS can be identified and a cure developed, such strategies, including hormone therapy, offer possible ways to improve quality of life over the course of disease progression.

Unlike Alzheimer’s disease, there does not appear to be a Mendelian variant of MS that will invariably produce the disease in people who carry the gene. A somewhat puzzling variable is that MS predominantly occurs between the ages of 20 and 50. This appears to rule out approaching MS as a form of immunosenescence. After all, if MS were a function of the aging immune system, we would see progressively more cases of MS as people get older (or in AIDS patients), ultimately involving many very old people. More likely, MS is a non-age-related dysfunction of the immune system that is triggered by environmental factors (such as a viral infection). While many discussions about the role of viruses in debilitating diseases like Alzheimer’s and MS still suffer from an incomplete understanding of cause and effect, it seems reasonable to conclude that enhancement of the human immune system could greatly reduce disease and improve quality of life, even in healthy humans.

One potential treatment for MS is to induce remyelination (or to inhibit processes that interfere with efficient remyelination). Stem cells can be administered to generate oligodendrocyte precursor cells, which in turn give rise to the oligodendrocytes responsible for remyelinating axons. While the myelin sheaths of these remyelinated axons are not as thick as those formed during development, remyelination can improve conduction velocity and prevent the destruction of axons. While the dominant repair strategies envisioned for cryonics involve molecular nanotechnologies that can build any biochemical structure that physical law permits, it is encouraging to know that specific stem cell therapies may become available to repair and restore myelin function in cryonics patients, as damage to myelin should be expected as a result of (prolonged) ischemia and cryoprotectant toxicity.

An interesting possibility is that remyelination therapies may also be used for human enhancement if these therapies can be tweaked to improve conduction velocity in humans or to induce certain desirable physiological responses by varying the composition and strength of the myelin sheath in various parts of the central nervous system.

References

Apatoff, Brian R. (2002). MS on the rise in the US. Neurology Alert 20(7), 55(2).

Braley, Tiffany J., Chervin, Ronald D. (2010). Fatigue in Multiple Sclerosis: Mechanisms, evaluation, and treatment. Sleep 33(8), 1061-1067.

Compston, Alastair, and Coles, Alasdair (2002). Multiple sclerosis. The Lancet 359(9313), 1221-1231.

Gold, Stefan M., and Voskuhl, Rhonda R. (2009). Estrogen and testosterone therapies in multiple sclerosis. Progress in Brain Research 175, 239-251.

Life Extension Foundation (2013). Multiple Sclerosis, in: Disease Prevention and Treatment, 5th edition, 947-956.

Ludwin, S.K., and Jacobson, S. (2011). Epstein-Barr virus and MS: Causality or association? The International MS Journal 17(2), 39-43.

Mao, Peizhong, Manczak, Maria, Shirendeb, Ulziibat P., and Reddy, P. Hemachandra (2013). MitoQ, a mitochondria-targeted antioxidant, delays disease progression and alleviates pathogenesis in an experimental autoimmune encephalomyelitis mouse model of multiple sclerosis. Biochimica et Biophysica Acta Molecular Basis of Disease 1832(12), 2322-2331.

Midgard, R. (2001). Epidemiology of multiple sclerosis: an overview. Journal of Neurology, Neurosurgery and Psychiatry 71(3), 422.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, February, 2014

Though She Isn’t Really Ill, There’s a Little Yellow Pill…

Humans have been ingesting mind- and mood-altering substances for millennia, but it has only rather recently become possible to begin to elucidate drug mechanisms of action and to use this information, along with our burgeoning knowledge of neuroscience, to design drugs intended to have a specific effect. And though most people think of pharmaceuticals as “medicine,” it has become increasingly popular to discuss the possibilities for the use of drugs in enhancement, or improvement of “human form or functioning beyond what is necessary to sustain or restore good health” (E.T. Juengst; in Parens, 1998, p. 29).

Some (transhumanists) believe not only that enhancement is possible, but that it may even be a moral duty. Others (bioconservatives) fear that enhancement may cause us to lose sight of what it means to be human altogether. It is not the intention of this article to advocate enhancement or to denounce it. Instead, let’s review some of the drugs (and/or classes of drugs) that have been identified as the most promising cognitive and mood enhancers. Many of the drugs we will cover are discussed in further depth in “Botox for the brain: enhancement of cognition, mood and pro-social behavior and blunting of unwanted memories” (de Jongh, R., et al., Neuroscience and Biobehavioral Reviews 32 (2008): 760-776).

When considering potential cognitive enhancer drugs, the most important thing to keep in mind is that, to date, no “magic bullets” appear to exist. That is, there are no drugs exhibiting such specificity as to have only the primary, desired effect. Indeed, a general principle of trade-offs (particularly in the form of side effects) appears to hold when it comes to drug administration for any purpose, whether treatment or enhancement. Such facts may constitute barriers to the practical use of pharmacological enhancers and should be taken into consideration when discussing the ethics of enhancement.

Some currently available cognitive enhancers include donepezil, modafinil, dopamine agonists, guanfacine, and methylphenidate. There are also efforts underway to develop memory-enhancing drugs, and we will discuss a few of the mechanisms by which they are proposed to act. Besides cognition, mood and prosocial behavior in normal individuals may also be enhanced pharmacologically, most usually by antidepressants or oxytocin. Let’s briefly cover the evidence for the efficacy of each of these in enhancing cognition and/or mood before embarking on a discussion of the general principles of enhancement and the ethical concerns it raises.

One of the most widely cited cognitive enhancement drugs is donepezil (Aricept®), an acetylcholinesterase inhibitor. In 2002, Yesavage et al. reported improved retention of training in healthy pilots tested in a flight simulator. In this study, after training in a flight simulator, half of the 18 subjects took 5 mg of donepezil daily for 30 days and the other half were given a placebo. The subjects returned to the lab to perform two test flights on day 30. The donepezil group was found to perform similarly to their initial test flight, while the placebo group’s performance declined. These results were interpreted as an improvement in the ability to retain a practiced skill. It seems possible, however, that the better performance of the donepezil group was instead due to improved attention or working memory during the test flights on day 30.

Another experiment, by Gron et al. (2005), looked at the effects of donepezil (5 mg/day for 30 days) on the performance of healthy male subjects on a variety of neuropsychological tests probing attention, executive function, visual and verbal short-term and working memory, semantic memory, and verbal and visual episodic memory. They reported a selective enhancement of episodic memory performance, and suggested that the improved performance in Yesavage et al.’s study was due not to enhanced visual attention but to increased episodic memory performance.

Ultimately, there is scarce evidence that donepezil improves retention of training. Better designed experiments need to be conducted before we can come to any firm conclusions regarding its efficacy as a cognitive enhancer.

The wake-promoting agent modafinil (Provigil®) is another currently available drug that is purported to have cognitive enhancing effects. Provigil® is indicated for the treatment of excessive daytime sleepiness and is often prescribed to those with narcolepsy, obstructive sleep apnea, and shift work sleep disorder. Its mechanisms of action are unclear, but it is supposed that modafinil increases hypothalamic histamine release, thereby promoting wakefulness by indirect activation of the histaminergic system. However, some suggest that modafinil works by inhibiting GABA release in the cerebral cortex.

In normal, healthy subjects, modafinil (100-200 mg) appears to be an effective countermeasure for sleep loss. In several studies, it sustained alertness and performance of sleep-deprived subjects (up to 54.5 hours) and has also been found to improve subjective attention and alertness, spatial planning, stop signal reaction time, digit-span and visual pattern recognition memory. However, at least one study (Randall et al., 2003) reported “increased psychological anxiety and aggressive mood” and failed to find an effect on more complex forms of memory, suggesting that modafinil enhances performance only in very specific, simple tasks.

The dopamine agonists d-amphetamine, bromocriptine, and pergolide have all been shown to improve cognition in healthy volunteers, specifically working memory and executive function. Historically, amphetamines were used by the military during World War II and the Korean War, and more recently as a treatment for ADHD (Adderall®). But usage statistics suggest that they are commonly used for enhancement by normal, healthy people—particularly college students.

Interestingly, the effect of dopaminergic augmentation appears to follow an inverted-U relationship between endogenous dopamine levels and working memory performance. Several studies have provided evidence for this by demonstrating that individuals with a low working-memory capacity show greater improvements after taking a dopamine receptor agonist, while high-span subjects either do not benefit at all or show a decline in performance.

Guanfacine (Intuniv®) is an α2 adrenoceptor agonist, also indicated for the treatment of ADHD symptoms in children, though it acts by enhancing noradrenergic signaling in the brain. In healthy subjects, guanfacine has been shown to improve visuospatial memory (Jakala et al., 1999a; Jakala et al., 1999b), but the beneficial effects were accompanied by sedative and hypotensive effects (i.e., side effects). Other studies have failed to replicate these cognitive enhancing effects, perhaps due to differences in dosages and/or subject selection.

Methylphenidate (Ritalin®) is a well-known stimulant that works by blocking the reuptake of dopamine and norepinephrine. In healthy subjects, it has been found to enhance spatial working-memory performance. Interestingly, as with dopamine agonists, an inverted-U relationship was seen, with subjects with lower baseline working memory capacity showing the greatest improvement after methylphenidate administration.

Future efforts to enhance cognition are generally focused on enhancing plasticity by targeting glutamate receptors (responsible for the induction of long-term potentiation) or by increasing the activity of CREB (a transcription factor known to strengthen synapses). Drugs targeting AMPA receptors, NMDA receptors, or the expression of CREB have all shown some promise for cognitive enhancement in animal studies, but few, if any, experiments have been carried out to determine their effectiveness in normal, healthy humans.

Beyond cognitive enhancement, there is also the potential for enhancement of mood and pro-social behavior. Antidepressants, including the selective serotonin reuptake inhibitors (SSRIs), are the first drugs that come to mind when discussing the pharmacological manipulation of mood. Used for the treatment of mood disorders such as depression, SSRIs are not indicated for normal people of stable mood. However, some studies have shown that administration of SSRIs to healthy volunteers resulted in a general decrease of negative affect (such as sadness and anxiety) and an increase in social affiliation in a cooperative task. Such decreases in negative affect also appeared to induce a positive bias in information processing, resulting in decreased perception of fear and anger from facial expression cues.

Another potential use for pharmacological agents in otherwise healthy humans would be to blunt unwanted memories by preventing their consolidation. This may be accomplished by post-training disruption of noradrenergic transmission (as with the β-adrenergic receptor antagonist propranolol). Propranolol has been shown to impair the long-term memory of emotionally arousing stories (but not emotionally neutral stories) by blocking the enhancing effect of arousal on memory (Cahill et al., 1994). In a particularly interesting study making use of patients admitted to the emergency department, post-trauma administration of propranolol reduced physiologic responses during mental imagery of the event 3 months later (Pitman et al., 2002). Further investigations have supported the memory blunting effects of propranolol, possibly by blocking the reconsolidation of traumatic memories.

GENERAL PRINCIPLES

Reviewing these drugs and their effects leads us to some general principles of cognitive and mood enhancement. The first is that many drugs have an inverted U-shaped dose-response curve, where low doses improve and high doses impair performance. This is potentially problematic for the practical use of cognition enhancers in healthy individuals, especially when doses that are most effective in facilitating one behavior simultaneously exert null or detrimental effects on other behaviors.

Second, a drug’s effect can be “baseline dependent,” where low-performing individuals experience greater benefit from the drug while higher-performing individuals do not see such benefits (which might simply reflect a ceiling effect), or may, in fact, see a deterioration in performance (which points to an inverted-U model). In the case of an inverted-U model, low-performing individuals are found on the up slope of the inverted U and thus benefit from the drug, while high-performing individuals are located near the peak of the inverted U already and, in effect, experience an “overdose” of neurotransmitter that leads to a decline in performance.
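
To make this concrete, here is a toy numerical sketch of the inverted-U model described above. It is purely illustrative: the quadratic performance curve and every number in it (the optimum, the curvature, the drug effect) are assumptions chosen to reproduce the qualitative pattern, not empirical values.

```python
# Toy inverted-U model: performance peaks at an optimal level of
# neurotransmitter signaling and falls off on either side.
# All values are illustrative assumptions, not empirical data.

def performance(signal, optimum=1.0, peak=100.0, curvature=40.0):
    """Quadratic inverted U: performance is best when signal == optimum."""
    return peak - curvature * (signal - optimum) ** 2

drug_boost = 0.4  # hypothetical increase in signaling caused by the drug

for label, baseline in [("low-baseline subject", 0.5),
                        ("high-baseline subject", 0.95)]:
    before = performance(baseline)
    after = performance(baseline + drug_boost)
    print(f"{label}: {before:.0f} -> {after:.0f}")

# low-baseline subject: 90 -> 100   (on the up slope: the drug helps)
# high-baseline subject: 100 -> 95  (near the peak: an effective 'overdose')
```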

Trade-offs exist in the realm of cognitive enhancing drugs as well. As mentioned, unwanted “side effects” are often experienced with drug administration, ranging from mild physiological symptoms such as sweating to more concerning issues like increased agitation, anxiety, and/or depression.

More specific trade-offs may come in the form of impairing one cognitive ability while improving another. One example is the enhancement of long-term memory but deterioration of working memory with the use of drugs that activate the cAMP/protein kinase A (PKA) signaling pathway. Another trade-off can occur between the stability and the flexibility of long-term memory, as in the case of certain cannabinoid receptor antagonists which appear to lead to more robust long-term memories, but which also disrupt the ability of new information to modify those memories. Similarly, a trade-off may exist between the stability and flexibility of working memory. Obviously, pharmacological manipulations that increase cognitive stability at the cost of a decreased capacity to flexibly alter behavior are potentially problematic in that one generally does not wish to have difficulty in responding appropriately to change.

Lastly, there is a trade-off involving the relationship between cognition and mood. Many mood-enhancing drugs, such as alcohol and even antidepressants, impair cognitive functioning to varying degrees. Cognition-enhancing drugs may also impair emotional functions. Because cognition and emotion are intricately regulated through interconnected brain pathways, inducing change in one area may have effects in the other. Much more research remains to be performed to elucidate these interactions before we can come to any firm conclusions.

ETHICAL CONCERNS

Again, though it is not the place of this article to advocate or denounce the use of drugs for human enhancement, obviously there are considerable ethical concerns when discussing the administration of drugs to otherwise healthy human beings. First and foremost, safety is of paramount importance. The risks and side-effects, including physical and psychological dependence, as well as long-term effects of drug use should be considered and weighed heavily against any potential benefits.

Societal pressure to take cognitive enhancing drugs is another ethical concern, especially in light of the fact that many such drugs may not actually produce benefits to the degree desired or expected. In the same vein, the use of enhancers may give some a competitive advantage, thus leading to concerns regarding fairness and equality (as we already see in the case of physical performance-enhancing drugs such as steroids). Additionally, it may be necessary, but very difficult, to make a distinction between enhancement and therapy in order to define the proper goals of medicine, to determine health-care cost reimbursement, and to “discriminate between morally right and morally problematic or suspicious interventions” (Parens, 1998). Of particular importance will be determining how to deal with drugs that are already used off-label for enhancement. Should they be provided by physicians under certain conditions? Or should they be regulated in the private commercial domain?

There is an interesting argument that using enhancers might change one’s authentic identity—that enhancing mood or behavior will lead to a personality that is not really one’s own (i.e., inauthenticity), or even dehumanization—while others argue that such drugs can help users to “become who they really are,” thereby strengthening their identity and authenticity. Lastly, according to the President’s Council on Bioethics, enhancement may “threaten our sense of human dignity and what is naturally human” (The President’s Council, 2003). According to the Council, “the use of memory blunters is morally problematic because it might cause a loss of empathy if we would habitually ‘erase’ our negative experiences, and because it would violate a duty to remember and to bear witness of crimes and atrocities.” On the other hand, many people believe that we are morally bound to transcend humans’ basic biological limits and to control the human condition. But even they must ask: what is the meaning of trust and relationships if we are able to manipulate them?

These are all questions without easy answers. It may be some time yet before the ethical considerations of human cognitive and mood enhancement really come to a head, given the apparently limited benefits of currently available drugs. But we should not avoid dealing with these issues in the meantime; for there will come a day when significant enhancement, whether via drugs or technological means, will be possible and available. And though various factions may disagree about the morality of enhancement, one thing is for sure: we have a moral obligation to be prepared to handle the consequences of enhancement, both positive and negative.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, December, 2013

An End to the Virus

Breakthroughs in medicine have increased substantially over the last hundred years, and most would agree that the introduction of antibiotics in 1942 has been one of the largest milestones in the history of medicine thus far. The success in treating bacterial infection has only accentuated the glaring lack of progress in developing effective therapeutics against those other enemies of the immune system, viruses. But Dr. Todd Rider and his team at MIT have dropped a bombshell with their announcement of a new broad-spectrum antiviral therapeutic, DRACO, which appears not only to cure the common cold but to halt or prevent infection by every virus against which it has been tested.

Before talking specifically about this exciting news, let us first review viral biology and why viral infections have been so difficult to treat.

As you may recall from your early education, a virus particle, or virion, consists of DNA or RNA surrounded only by a protein coat (i.e., naked virus) or, occasionally, a protein coat and a lipid membrane (i.e., enveloped virus). Viruses have no organelles or metabolism and do not reproduce on their own, so they cannot function without using the cellular machinery of a host (bacteria, plant, or animal).

Viruses can be found all throughout our environment and are easily picked up and transferred to areas where they may enter our bodies, usually through the nose, mouth, or breaks in the skin. Once inside the host, the virus particle finds a host cell to infect so it can reproduce.

There are two ways that viruses reproduce. The first way is by attaching to the host cell and entering it or injecting viral DNA/RNA into the cell. This causes the host cell to make copies of the viral DNA and to transcribe and translate that DNA to make viral proteins. The host cell assembles new viruses and releases them when the cell breaks apart and dies, or it buds the new viruses off, which preserves the host cell. This approach is called the lytic cycle.

The second way that viruses reproduce is to use the host cell’s own materials. A viral enzyme called reverse transcriptase makes a segment of DNA from the viral RNA using host materials. The DNA segment gets incorporated into the host cell’s DNA. There, the viral DNA lies dormant and gets reproduced along with the host cell. When some environmental cue occurs, the viral DNA takes over, makes viral RNA and proteins, and uses the host cell machinery to assemble new viruses. The new viruses bud off. This approach is called the lysogenic cycle; viruses that replicate through reverse transcription are called retroviruses, the best-known example being HIV. (Herpes viruses also establish dormant infections, though they are not retroviruses.)

Once free from the host cell, the new viruses can attack other cells and produce thousands more virus particles, spreading quickly throughout the body. The immune system responds quickly by producing proteins to interfere with viral replication, by releasing pyrogenic chemicals to raise body temperature, and by inducing cell death (apoptosis). In some cases, simply continuing the natural immune response is enough to eventually halt viral infection. But the virus kills many host cells in the meantime, leading to symptoms ranging from the characteristic runny nose and sore throat of a cold (rhinovirus) to the muscle aches and coughing associated with the flu (influenza virus).

Any virus can be deadly, especially to hosts with a weakened immune system, such as the elderly, small children, and persons with AIDS (though death is actually often due to a secondary bacterial infection). And any viral infection will cause pain and suffering, making treatment a very worthwhile goal. So far, the most successful approach to stopping viral infections has been prevention through the ubiquitous use of vaccines. The vaccine—either a weakened form of a particular virus or a mimic of one—stimulates the immune system to produce antibodies specific to that virus, thereby preventing infection when the virus is encountered in the environment. In another approach, antiviral medications are administered post-infection and work by targeting some of the specific ways that viruses reproduce.

However, viruses are very difficult to defeat. They vary enormously in genetic composition and physical conformation, making it difficult to develop a treatment that works for more than one specific virus. The immense number of viral types in nature makes even their classification a monumental job, as there is enormous structural diversity among viruses. Viruses have been evolving much longer than any cells have even existed and they have evolved methods to avoid detection and to overcome attempts to block replication. So, while we have made some progress in individual battles, those pesky viruses have definitely been winning the war.

Which is why the announcement of a broad-spectrum antiviral therapeutic agent is such huge news. In their paper, Rider et al. describe a drug that is able to identify cells infected by any type of virus and which is then able to specifically kill only the infected cells to terminate the infection. The drug, named DRACO (which stands for Double-stranded RNA (dsRNA) Activated Caspase Oligomerizer), was tested against 15 viruses including rhinoviruses, H1N1 influenza, polio virus, and several hemorrhagic fever viruses. And it was effective against every virus it was pitted against.

Dr. Rider looked closely at living cells’ own defense mechanisms in order to design DRACO. First, he observed that all known viruses make long strings of double-stranded RNA (dsRNA) during replication inside of a host cell, and that dsRNA is not found in human or other cells. As part of the natural immune response, human cells have proteins that latch onto dsRNA and start a biochemical cascade that prevents viral replication. But many viruses have evolved to overcome this response quite easily. So Rider combined dsRNA detection with a more potent weapon: apoptosis, or cell suicide.

Basically, the DRACO consists of two ends. One end identifies dsRNA and the other end induces cells to undergo apoptosis. When the DRACO binds to dsRNA it signals the other end of the DRACO to initiate cell suicide, thus killing the infected cell and terminating the infection. Beautifully, the DRACO also carries a protein that allows it to cross cell membranes and enter any human or animal cell. But if no dsRNA is present, it simply does nothing, leaving the cell unharmed.

An interesting question is whether any viruses are actually beneficial and whether wiping all viruses out of an organismal system may have negative consequences (as happens when antibiotic treatment eradicates both invading pathogenic bacteria and non-pathogenic flora, often leading to symptoms such as digestive upset). After his recent presentation at the 6th Strategies for Engineered Negligible Senescence (SENS) conference in September 2013, Dr. Rider fielded this question and stated quite adamantly that there are no known beneficial, symbiotic, or non-harmful viruses. This point is further emphasized in a recently published interview in which he is asked whether DRACO-triggered cell death could lead to a lesion in a tissue or organ. Rider responds that “Virtually all viruses will kill the host cell on the way out. Of the handful that don’t, your own immune system will try to kill those infected cells. So we’re not really killing any more cells with our approach than we already have been. It’s just that we’re killing them at an early enough stage before they infect and ultimately kill more cells. So, if anything, this limits the amount of cell death.”

So far, DRACO has been tested in cellular culture and in mouse models against a variety of very different virus types. Rider hopes to license DRACO to a pharmaceutical company so that it can be assessed in larger animal trials and, ultimately, human trials. Unfortunately, it may take a decade or more to complete this process and make the drug available for human therapeutic purposes, and that’s only if there is enough interest to do so. Amazingly, the DRACO project was started over 11 years ago and has barely survived during that period due to lack of interest and funding. Even now, after the DRACOs have been successfully engineered, produced, and tested, no one has yet reached out to Rider about taking them beyond the basic research stage. Let us hope that those of us who do find this work unbelievably exciting can make enough noise that Rider’s work continues to the benefit of all mankind.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, November, 2013

No More Couch Potato

In my review of The SharpBrains Guide to Brain Fitness a couple of months ago, I emphasized at length the importance of certain lifestyle choices—particularly physical exercise—in maintaining and enhancing brain health. Intuitively, we all know that physical activity is good for us. The metaphorical “couch potato” is assumed to be a person in poor health, precisely because of his or her lack of movement (and, of course, lazily consumed snacks and mind-numbing television). But even those of us who admonish the couch potato are moving our bodies a lot less these days due to an increase in the number of jobs requiring long periods of sitting. And current research is clear: all that sitting is taking a toll on our health.

So we know we need to get up and get moving. But what kind of exercise is best? So far, cardiovascular, or aerobic, exercise has received most of the attention in the literature. Because it is light-to-moderate in intensity and long in duration, aerobic exercise increases heart rate and circulation for extended periods, which is presumed to trigger biochemical changes in the brain that spur neuroplasticity—the production of new connections between neurons and even of new neurons themselves. It appears that the best regimen of aerobic exercise incorporates, at a minimum, three 30- to 60-minute sessions per week. In short, plenty of research has found that myriad positive physical and cognitive health benefits are correlated with aerobic exercise.

But what about non-aerobic exercise, such as strength training? The truth is that very little is known about the effects of non-aerobic exercise on cognitive health. What few studies exist show a positive effect of strength training on cognitive health, but the findings are definitely less conclusive than the plethora of evidence supporting aerobic exercise.

However, a lack of research should not be interpreted as negative results. I think non-aerobic exercise has received less research attention because, well, it is harder and appears less accessible than aerobic exercise. It is probably easier to get research participants to commit to a straightforward exercise regimen that doesn’t involve a lot of explanation or study to figure out. Let’s face it: pushing pedals on a stationary bike requires less mental effort than figuring out how to perform weight-bearing exercises with good form.

At worst, we may ultimately discover that non-aerobic exercise has no cognitive benefits. But let’s not throw the baby out with the bathwater: strength training does, in fact, promote a number of physical effects that are of great overall benefit to health, especially for the aging individual. Indeed, one would be remiss to omit strength training from any exercise regimen designed to promote healthy aging and a long, physically fit life.

The primary, and most obvious, effect of strength training is that of muscle development, or hypertrophy. Muscles function to produce force and motion and skeletal muscles are responsible for maintaining and changing posture, locomotion, and balance. Anyone who wishes to look and feel strong, physically capable, and well-balanced would do well to develop the appropriate muscles to reach these goals. Muscle mass declines with age, so it is smart to build a reserve of muscle in a relatively youthful state and to maintain it with regular workouts for as long as possible. Doing so will stave off the functional decline known as frailty, a recognized geriatric syndrome associated with weakness, slowing, decreased energy, lower activity, and unintended weight loss.

Those who know me know that I am very, very thin. At 5 feet 9 inches tall, I have always struggled to maintain my weight above 90 lbs.—a full 40 lbs. underweight for a woman of my height. This is almost certainly due, in large part, to genetics (my parents are both rail-thin), and no amount of eating has ever worked to put on additional pounds. Over the years, I grew more concerned about what being underweight meant in terms of disease risks as I age. In particular, dual energy x-ray absorptiometry (DEXA) scans for bone mineral density at ages 27 and 33 showed accelerated bone loss beyond what is normal for my age. I was on a trajectory for a diagnosis of osteoporosis by my mid-40s.

Besides ensuring adequate calcium intake, I knew that the best prescription for slowing down bone loss is to perform weight-bearing exercises. Strength training causes the muscles to pull on the bone, resulting in increased bone strength. Strength training also increases muscle strength and flexibility, which reduces the likelihood of falling—the number-one risk factor for hip fracture.

So I dusted off my long-unused gym pass and started strength training 3 to 4 times a week. I was too weak to even lift weights in the beginning, so I started with body weight exercises and gradually progressed to weight machines. Weight machines allow you to build strength and to gain an understanding of how an exercise works a particular muscle or group of muscles. Many machines also have a limited range of motion within which to perform the exercise, providing some guidance on how to perform the movement. As I made improvements in strength, I began reading about strength training exercises online and downloaded some apps to help me in the gym.

For a basic “how-to,” nothing beats a video. There are plenty of exercise demonstration videos on YouTube.com and several other sites, but I prefer the definitive (and straight-to-the-point) visual aids provided by Bodybuilding.com. They offer short instructional videos for just about every strength training exercise in existence. The videos also download quickly and play easily on a mobile device, in case you need a refresher in the gym.

There are a lot of great apps out there, too. My favorites so far include PerfectBody (and associated apps by the same developer), GymPact, and Fitocracy. PerfectBody provides weekly workout routines, complete with illustrated descriptions of exercises and the ability to track your progress by documenting weight lifted and number of repetitions (reps) for each exercise. It is an all-in-one fitness program for learning foundational exercises and building strength and confidence in the gym.

If you have a hard time committing to a workout schedule, GymPact may help. One of the latest in a series of apps that make you put your money where your mouth is, GymPact has you agree to go to the gym a minimum number of times per week in order to earn monetary rewards. The catch is that you are charged money if you fail to meet your pact (which helps to pay all those committed gym-goers who didn’t renege on their promises). For many, the thought of losing money can provide quite the incentive to get your tail to the gym.

Now that you’ve got exercise examples, progress tracking, and motivation to actually get to the gym, how about some fun? Fitocracy is an app that turns exercise into a game, letting you track your exercise in return for points and “level ups” like a video game. There are challenges to meet and quests to conquer, adding to the competitive game-play element. But there’s also a nice social aspect, with friends and groups enabling people to “prop” one another and to provide support and advice.

Once you start pumping iron, you may quickly realize a need for nutrition adequate to meet your new muscle-building goals. As we all know, protein is the most important nutrient for building muscle. And while I will not attempt to provide advice regarding the appropriate nutrient ratio for the calories you consume each day, I can tell you that it is generally recommended to get at least 1 gram of protein per pound of body weight per day if you want to support muscle growth.
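
As a back-of-the-envelope illustration of that rule of thumb, here is a minimal sketch. The body weights and the 25-grams-of-protein-per-scoop figure are assumptions for illustration only; check your own product’s label.

```python
# Rough daily protein target at ~1 g per lb of body weight, and what
# that would mean in scoops of whey powder. The 25 g-per-scoop figure
# is an assumption; actual products vary.

def daily_protein_target_g(body_weight_lb: float) -> float:
    """Return a daily protein target in grams, at 1 g per lb."""
    return body_weight_lb * 1.0

for weight_lb in (130, 180):
    target = daily_protein_target_g(weight_lb)
    scoops = target / 25  # assumed grams of protein per scoop
    print(f"{weight_lb} lb -> ~{target:.0f} g protein/day "
          f"(~{scoops:.1f} scoops if it all came from whey)")

# 130 lb -> ~130 g protein/day (~5.2 scoops if it all came from whey)
# 180 lb -> ~180 g protein/day (~7.2 scoops if it all came from whey)
```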

Adequate protein consumption is necessary even if you are not strength training and becomes even more important as you age. Reduced appetite and food intake, impaired nutrient absorption, and age-related medical and social changes often result in malnourishment. An insufficient intake of protein, in particular, can lead to loss of muscle mass, reduced strength, and many other negative factors leading to frailty.

It seems that whey protein provides the ultimate benefits in this arena. Whey, which is derived from milk, is a high-quality protein supplement and a rich source of branched-chain amino acids (BCAAs), which stimulate protein synthesis and inhibit protein breakdown, helping to prevent age-related muscle-wasting (i.e., sarcopenia). Besides muscle support, a growing number of studies indicate other positive, anti-aging effects of whey such as antioxidant enhancement, anti-hypertensive effects, hypoglycemic effects, and the promotion of bone formation and suppression of bone resorption. Life Extension Foundation recently reported that these effects mimic the benefits of calorie restriction without a reduction of food intake, playing roles in hormone secretion and action, intracellular signaling, and regulation of gene transcription and translation.

There are many whey protein powder supplements on the market in a variety of formulations and flavors. Whey protein isolate is quickly absorbed and incorporated into muscles, making it a good post-workout option, whereas whey protein concentrate is absorbed and incorporated more slowly, making it ideal for consumption just before bedtime. A whey protein powder may consist of isolate only, concentrate only, or both. Choose what best meets your needs and purposes.

Flavor is an important factor to consider, as well. Most major brands offer a variety of flavors such as vanilla, chocolate, strawberry, and some exotic options. Unflavored powders are sometimes available and are a great neutral protein base for mixing into (green) smoothies or other recipes. Some whey protein powders may actually include sugars to “improve” taste, so make sure to read the ingredients. Even many zero-carb powders are still quite sweet. Many brands offer sample size packets which can be very helpful in determining whether or not you like a particular flavor or overall taste prior to buying an entire container.

Lastly, consider the sources of whey protein powder ingredients carefully. Not all whey is created equal, and many commercial brands on the market derive their ingredients from dubious sources or from animals treated with hormones and living in less-than-stellar conditions. But there are many great products out there, including Life Extension’s New Zealand Whey Protein Concentrate, which is derived from grass-fed, free-range cows living healthy lives in New Zealand and not treated with growth hormone (rBST). If you have reservations about whey protein, there are also alternative protein powders derived from plants or egg white.

In summary, while the jury is still out regarding the cognitive benefits of non-aerobic exercise, such exercise is still a very important part of an overall plan to support health and longevity. Adequate nutritional support in the form of whey protein supplementation is generally indicated for its many health benefits, and is absolutely integral to muscle-building efforts. At the very least, strength training should complement brain-boosting aerobic exercise and will help to stave off bone loss and frailty as you age. So erase any preconceived notions you may have had about bodybuilding and start lifting today!

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, October, 2013

Brain Fitness

Book Review: The SharpBrains Guide to Brain Fitness: How to Optimize Brain Health and Performance at Any Age by Alvaro Fernandez

Of all the organs in the human body, a cryonicist should be most concerned about the health and integrity of his or her brain. Thousands of books have been written about physical health and fitness, but very few address the topic of how to keep the brain fit and healthy. Happily, interest in brain fitness, once relegated to academics and gerontologists, is now taking root across America and the world.

The importance of lifelong learning and mental stimulation as a component of healthy aging has long been recognized and touted as a way to stay mentally alert and to stave off dementia in old age. As with physical exercise, “use it or lose it” appears to apply to our brains too. And now that scientists are learning more about neuroplasticity and how brains change as a result of aging, they have begun to test the effects of various factors on brain health and cognitive ability across the lifespan.

Unfortunately, like much health-related research, the results reported by the media have often been convoluted, confusing, and even contradictory. Products developed by overzealous entrepreneurs make outlandish claims and frequently don’t deliver the purported results. Consumers and professionals alike are left wondering what works and what doesn’t when it comes to maintaining our brains in optimal working condition.

To aid all those navigating the murky waters of brain fitness, enter SharpBrains—a company dedicated to tracking news, research, technology, and trends in brain health and to disseminating information about the applications of brain science innovation. In so doing, they “maintain an annual state-of-the-market consumer report series, publish consumer guides to inform decision-making, produce an annual global and virtual professional conference,” and maintain SharpBrains.com, a leading educational blog and website with over 100,000 monthly readers.

Most recently, SharpBrains has published a book on brain fitness called The SharpBrains Guide to Brain Fitness: How to Optimize Brain Health and Performance at Any Age. A compilation and condensation of information accumulated over the lifespan of the company, The SharpBrains Guide to Brain Fitness emphasizes credible research and goes to great lengths to provide the most up-to-date research results in specific areas of brain fitness, followed by interviews with scientists doing work in those fields. The goal of the guide is to help the reader begin to “cultivate a new mindset and master a new toolkit that allow us [to] appreciate and take full advantage of our brain’s incredible properties…[by] providing the information and understanding to make sound and personally relevant decisions about how to optimize your own brain health and performance.”

The Guide begins by emphasizing that the brain’s many neuronal networks serve distinct functions including various types of memory, language, emotional regulation, attention, and planning. Plasticity of the brain is defined as its lifelong capacity to change and reorganize itself in response to the stimulation of learning and experience—the foundation upon which “brain training” to improve cognitive performance at any age, and to maintain brain health into old age, is predicated.

The difficulty of making sense of the scientific findings on brain health and neuroplasticity is discussed at length, with the finger of blame pointed squarely at the media for reporting only fragments of the research and for often reporting those results which are not most meaningful. The authors stress that “it is critical to complement popular media sources with independent resources, and above all with one’s own informed judgment.”

The following chapters go on to review what is known today about how physical exercise, nutrition, mental challenge, social engagement, and stress management can positively affect brain health. Along the way they provide dozens of relevant research results (as well as the design of each study) to support their recommendations. Reporting on all of those experiments is beyond the scope of this review, so if you are interested in examining them (and you should be!) please obtain a copy of the Guide for yourself or from your local library.

Physical exercise is discussed first because of the very strong evidence that exercise, especially aerobic or “cardio” exercise, slows atrophy of the brain associated with aging, actually increasing the brain’s volume of neurons (i.e., “gray matter”) and connections between neurons (i.e., “white matter”). While much of the initial research supporting the effects of exercise on the brain came from animal studies, the authors report that “several brain imaging studies have now shown that physical exercise is accompanied by increased brain volume in humans.”

Staying physically fit improves cognition across all age groups, with particularly large benefits for so-called “executive” functions such as planning, working memory, and inhibition. A 2010 meta-analysis by the NIH also concluded that physical exercise is a key factor in postponing cognitive decline and/or dementia, while other studies have found physical exercise to lower the risk of developing Parkinson’s disease, as well.

But don’t think that just any moving around will do the trick. When it comes to providing brain benefits, a clear distinction is drawn between physical activity and physical exercise. Only exercise will trigger the biochemical changes in the brain that spur neurogenesis and support neuroplasticity. It doesn’t need to be particularly strenuous, but to be most beneficial it should raise your heart rate and increase your breathing rate.
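For readers who want a concrete target, here is a rough illustration in code of the “raise your heart rate” guidance, using the common 220-minus-age estimate of maximum heart rate. Both that formula and the 50-70% moderate-intensity band are general rules of thumb, not recommendations taken from the Guide itself.

```python
# A rough sketch of a moderate-intensity aerobic heart-rate band.
# The 220-minus-age estimate and the 50-70% band are common rules of
# thumb, not figures from The SharpBrains Guide to Brain Fitness.
def moderate_intensity_zone(age: int) -> tuple:
    """Return an approximate moderate-intensity heart-rate band in bpm."""
    max_hr = 220 - age           # crude estimate of maximum heart rate
    return 0.5 * max_hr, 0.7 * max_hr

low, high = moderate_intensity_zone(60)
print(f"Target: {low:.0f}-{high:.0f} bpm")  # Target: 80-112 bpm
```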

Of course, adequate nutrition is also imperative in obtaining and maintaining optimal brain health. The SharpBrains Guide to Brain Fitness primarily highlights the well-known benefits of the Mediterranean diet, which consists of a high intake of vegetables, fruit, cereals, and unsaturated fats, a low intake of dairy products, meat, and saturated fats, a moderate intake of fish, and regular but moderate alcohol consumption. But I think it is safe to say that the jury is still out on the best diet for the brain, as evidenced by the recent popularity of the Paleo diet among life extensionists. And, of course, ethnicity and genetics are important, too. The authors do stress the importance of omega-3 fatty acids and antioxidants obtained from dietary sources, stating firmly that “to date, no supplement has conclusively been shown to improve cognitive functioning, slow down cognitive decline, or postpone Alzheimer’s disease symptoms beyond placebo effect.” This includes herbal supplements such as Ginkgo biloba and St. John’s wort.

Beyond what we normally do to keep our bodies healthy, the Guide also discusses the relative effectiveness of different forms of “mental exercise.” Perhaps you’ve heard that doing crossword or Sudoku puzzles will keep you sharp and alert into old age, or that speaking multiple languages is associated with decreased risk of Alzheimer’s disease. The good news is that these things are true—to a degree. The part that is often left out is that it’s the challenge of these activities that is important. As with physical activity vs. physical exercise, mental exercise refers to the subset of mental activities that are effortful and challenging.

Puzzles and games may be challenging at first, but they (and other mental exercises) can quickly become routine and unchallenging. In order to reap the most benefit from mental exercise, the goal is to be exposed to novelty and increasing levels of challenge. Variety is important for stimulating all aspects of cognitive ability and performance, so excessive specialization is not the best strategy for maintaining long-term brain health. If you are an artist, try your hand at strategy-based games. If you’re an economist, try an artistic activity. Get out of your comfort zone in order to stimulate skills that you rarely use otherwise.

The SharpBrains Guide states that “lifelong participation in cognitively engaging activities results in delayed cognitive decline in healthy individuals and in spending less time living with dementia in people diagnosed with Alzheimer’s disease.” This is hypothesized to be because doing so builds up one’s “cognitive reserve”—literally an extra reservoir of neurons and neuronal connections—which may be utilized so that a person continues to function normally even in the face of underlying Alzheimer’s or other brain pathology. This observation raises another important point on which neuroscientists and physiologists do not yet fully agree. Will we all eventually get dementia if we live long enough without credible brain rejuvenation biotechnologies? This is a topic I would like to return to in a future installment of Cooler Minds Prevail.

Social engagement also appears to provide brain benefits. The NIH meta-analysis mentioned earlier concluded that higher social engagement in mid- to late life is associated with higher cognitive functioning and reduced risk of cognitive decline. Brain imaging studies indicate an effect of social stimulation on the volume of the amygdala, a structure that plays a major role in our emotional responses and which is closely connected to the hippocampus, which is important for memory.

Yet again, not all activity is equal. When it comes to social stimulation, “you can expect to accrue more benefits within groups that have a purpose (such as a book club or a spiritual group) compared to casual social interactions (such as having a drink with a friend to relax after work).” To keep socially engaged across the lifespan, seek out interactions that naturally involve novelty, variety, and challenge such as volunteering and participating in social groups.

“The lifelong demands on any person have changed more rapidly in the last thousand years than our genes and brains have,” The SharpBrains Guide explains in the intro to the chapter on stress management. The result? It has become much more difficult to regulate stress and emotions. It is great that we have such amazing and complex brains, but humans are among the few animals that can get stressed from their own thoughts. And while there are some (potentially) beneficial effects of short bursts of stress, high and sustained levels of stress can have a number of negative consequences. Those of note include: increased levels of blood cortisol, which can lead to sugar imbalances, high blood pressure, loss of muscle tissue and bone density, lowered immunity, and damage to the brain; a reduction of certain neurotransmitters, such as serotonin and dopamine, which has been linked to depression; and a hampering of our ability to make the changes needed to reduce the stress, resulting in General Adaptation Syndrome (aka “burnout”).

Research-based lifestyle solutions to combat stress include exercise, relaxation, socialization, humor and laughter, and positive thinking. In particular, targeted, capacity-building techniques such as biofeedback and meditation are recommended to manage stress and build resilience. Mindfulness-Based Stress Reduction (MBSR) programs have provided evidence that meditative techniques can help manage stress, and research shows that MBSR can lead to decreases in the density of an area of the amygdala, a change correlated with reductions in reported stress.

So it appears that multiple approaches are necessary to develop a highly fit brain capable of adapting to new situations and challenges throughout life. “Consequently,” The SharpBrains Guide to Brain Fitness states, “we expect cross-training the brain to soon become as mainstream as cross-training the body is today, going beyond unstructured mental activity in order to maximize specific brain functions.”

There is growing evidence that brain training can work, but in evaluating what “works” we are mostly looking at two things: how successful the training program is (i.e., does it actually improve the skill(s) being trained?) and the likelihood of transfer from training to daily life. Building on an analysis of documented examples of brain training techniques that “work” or “transfer,” SharpBrains suggests that the following five conditions must be met for brain training to be likely to translate into meaningful real-world improvements (condensed excerpt; a toy encoding of these conditions in code follows the list):

  1. Training must engage and exercise a core brain-based capacity or neural circuit identified to be relevant to real-life outcomes.
  2. The training must target a performance bottleneck.
  3. A minimum “dose” of 15 hours total per targeted brain function, performed over 8 weeks or less, is necessary for real improvement.
  4. Training must be adaptive to performance, require effortful attention, and increase in difficulty.
  5. Over the long-term, the key is continued practice for continued benefits.
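To make the five conditions concrete, here is a toy encoding of them as a programmatic checklist. The data structure and field names are mine and purely illustrative; SharpBrains publishes no such program.

```python
from dataclasses import dataclass

@dataclass
class TrainingPlan:
    """A hypothetical brain-training plan; all fields are illustrative."""
    targets_core_capacity: bool  # condition 1: engages a relevant neural circuit
    targets_bottleneck: bool     # condition 2: aimed at a performance bottleneck
    total_hours: float           # condition 3: dose per targeted brain function
    duration_weeks: float        # condition 3: spread of that dose
    is_adaptive: bool            # condition 4: difficulty tracks performance
    ongoing_practice: bool       # condition 5: continued practice planned

def likely_to_transfer(plan: TrainingPlan) -> bool:
    """Check a plan against the five SharpBrains conditions (toy encoding)."""
    return (plan.targets_core_capacity
            and plan.targets_bottleneck
            and plan.total_hours >= 15 and plan.duration_weeks <= 8
            and plan.is_adaptive
            and plan.ongoing_practice)

# Example: a 20-hour adaptive working-memory program completed over 6 weeks
plan = TrainingPlan(True, True, 20, 6, True, True)
print(likely_to_transfer(plan))  # True
```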

Meditation, biofeedback, and/or cognitive therapy used in concert with cognitive training to optimize targeted brain functions appear to be winning combinations for facilitating transfer from training to real-life benefits. Top brain training software programs, based on SharpBrains’ analysis and a survey of their users, include Lumosity, Brain games, BrainHQ, Cogmed, and emWave.

In the end, brain fitness needs are unique to each individual and brain fitness claims should be evaluated skeptically. SharpBrains recommends asking several questions when evaluating brain fitness claims, particularly whether there is clear and credible evidence of the program’s success documented in peer-reviewed scientific papers published in mainstream scientific journals that analyze the effects of the specific product.

Of course, your own individual experience with the product is ultimately the most important evaluation of all. If you are ready to take the plunge into the emerging brain fitness market, The SharpBrains Guide to Brain Fitness is a good place to start, and I’m sure they’d appreciate your feedback as this field continues to develop.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, August, 2013

HIV, Immunosenescence, and Accelerated Aging

After a few articles considering Alzheimer disease from several angles, I would like to switch gears this month and talk more generally about the interaction between the immune system and aging.

In his 2012 paper[1], Caleb E. Finch documents the evolution of life expectancy in the course of human history. The life expectancy at birth of our shared ape ancestor 6 million years ago is hypothesized to approximate that of a chimpanzee, 15 years. The first Homo species appeared 1-2 million years ago and had a life expectancy of ~20 years, while H. sapiens came onto the scene ~100,000 years ago and could expect about 30 years of life. But starting around 200 years ago, concurrent with industrialization, human life expectancy jumped rapidly, to somewhere between 70 and 80 years today.

As many readers are likely aware, the huge recent increases in life expectancy are commonly attributed to improvements in hygiene, nutrition, and medicine during the nineteenth and twentieth centuries that reduced mortality from infections at all ages. Finch hypothesizes, generally, that early age mortality over the course of human history is primarily due to (acute) infection, while old age mortality is primarily due to (chronic) inflammation. Further analysis of mortality rates over the last several hundred years leads him to further hypothesize that aging has been slowed in proportion to the reduced exposure to infections in early life. These hypotheses are supported by twentieth century examples which strongly demonstrate influences of the early life environment on adult health, such as the effects of prenatal and postnatal developmental influences (e.g., nutrition, exposure to infection) on adult chronic metabolic and vascular disorders as well as physical traits and mental characteristics. This leads Finch to suggest “broadening the concept of ‘developmental origins’ to include three groups of factors: nutritional deficits, chronic stress from socioeconomic factors, and direct and indirect damage from infections.”

Finch also considers the effects of inflammation and diet on human evolution, proposing several environmental and foraging factors that may have been important in the genetic basis for evolving lower basal mortality through interactions with chronic inflammation, in particular: dietary fat and caloric content; infections from pathogens ingested from carrion and from exposure to excreta; and noninfectious inflammagens such as those in aerosols and in cooked foods. He hypothesizes that exposure to these proinflammatory factors, which one would expect to shorten life expectancy, actually resulted in humans evolving lower mortality and longer lifespans in response to highly inflammatory environments.

A means for this, he argues, was the development of the apoE4 genotype. Noting that the apoE4 allele favors advantageous fat accumulation and is also associated with enhanced inflammatory responses, Finch argues that heightened inflammatory response and more efficient fat storage would have been adaptive in a pro-inflammatory environment and during times of uncertain nutrition. As has been discussed in prior articles in Cooler Minds Prevail, the apoE alleles also influence diverse chronic non-infectious degenerative diseases and lifespan. “Thus,” Finch concludes, “the apoE allele system has multiple influences relevant to evolution of brain development, metabolic storage, host defense, and longevity.”

With the general relationship between inflammation and the evolution of human aging and life expectancy in mind, let us now consider immune system involvement in more detail, and the relationship between HIV and immunosenescence more specifically.

Immunosenescence refers to the age-associated deterioration of the immune system. As an organism ages it gradually becomes deficient in its ability to respond to infections and experiences a decline in long-term immune memory. This is due to a number of specific biological changes such as diminished self-renewal capacity of hematopoietic stem cells, a decline in total number of phagocytes, impairment of Natural Killer (NK) and dendritic cells, and a reduction in B-cell population. There is also a decline in the production of new naïve lymphocytes and the functional competence of memory cell populations. As a result, advanced age is associated with increased frequency and severity of pathological health problems as well as an increase in morbidity due to impaired ability to respond to infections, diseases, and disorders.

It is not hard to imagine that an increased viral load leading to chronic inflammatory response may accelerate aging and immunosenescence. Evidence for this has been accumulating rapidly since the advent of antiretroviral therapies for treatment of HIV infection. An unforeseen consequence of these successful therapies is that HIV patients are living longer, but a striking number of them appear to be getting older faster, particularly showing early signs of dementia usually seen in the elderly. In one study, slightly more than 10% of older patients (average age 56.7 years) with well-controlled HIV infection had cerebrospinal fluid (CSF) marker profiles consistent with Alzheimer disease[2] – more than 10 times the prevalence in the general population at the same age. HIV patients also register higher rates of insulin resistance and cholesterol imbalances, suffer elevated rates of melanoma and kidney cancers, and experience seven times the rate of other non-HIV-related cancers. And ultimately, long-term treated HIV-infected individuals die at an earlier age than HIV-uninfected individuals[3].

Recent research is beginning to explore and unravel the interplay between HIV infection and other environmental factors (such as co-infection with other viruses) in the acceleration of the aging process of the immune system, leading to immunosenescence. In the setting of HIV infection, the immune response is associated with abnormally high levels of activation, leading to a cascade of continued viral spread and cell death, and accelerating the physiologic steps associated with immunosenescence. Despite clear improvements associated with effective antiretroviral therapy, some subjects show persistent alterations in T cell homeostasis, especially constraints on T cell recovery, which are further exacerbated in the setting of co-infection and increasing age.

Unsurprisingly, it has been observed that markers of immunosenescence might predict morbidity and mortality in HIV-infected adults as well as the general population. In both HIV infection and aging, immunosenescence is marked by an increased proportion of CD28-, CD57+ memory CD8+ T cells with reduced capacity to produce interleukin 2 (IL-2), increased production of interleukin 6 (IL-6), resistance to apoptosis, and shortened telomeres. Levels of markers of inflammation are elevated in HIV-infected patients, and elevations in markers such as high-sensitivity C-reactive protein, D-dimer, and IL-6 have been associated with increased risk for cardiovascular disease, opportunistic conditions, or all-cause mortality[4].
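To illustrate how such a marker panel might be operationalized, here is a minimal sketch that flags the immunosenescent profile described above. The marker names follow the text, but every cutoff value is an invented placeholder, not a clinical reference range.

```python
# A toy screen for the immunosenescent phenotype described in the text.
# All cutoffs below are hypothetical placeholders for illustration only.
def immunosenescence_flags(profile: dict) -> list:
    """Return a list of marker findings consistent with immunosenescence."""
    flags = []
    if profile.get("cd28neg_cd57pos_fraction", 0.0) > 0.5:
        flags.append("expanded CD28-/CD57+ memory CD8+ T cells")
    if profile.get("il2_relative_production", 1.0) < 0.5:
        flags.append("reduced IL-2 production")
    if profile.get("il6_pg_ml", 0.0) > 5.0:
        flags.append("elevated IL-6")
    if profile.get("mean_telomere_kb", 10.0) < 6.0:
        flags.append("shortened telomeres")
    return flags

print(immunosenescence_flags({
    "cd28neg_cd57pos_fraction": 0.6,
    "il2_relative_production": 0.3,
    "il6_pg_ml": 8.2,
    "mean_telomere_kb": 5.1,
}))
```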

But even as we are beginning to identify markers that appear to be associated with risk of poor outcome in HIV infection, it is still unclear how patients should be treated on the basis of this information. To that end, several trials are underway to evaluate the effects of modulation of immune activation and inflammation in HIV infection. At the same time, clinicians at the forefront of advancing knowledge and clinical care are performing research aimed at optimizing care for aging HIV patients.

The implications for such research may be far-reaching. In fact, many HIV clinicians and researchers think that HIV may be key to understanding aging in general. Dr. Eric Verdin states, “I think in treated, HIV-infected patients the primary driver of disease is immunological. The study of individuals who are HIV-positive is likely to teach us things that are really new and important, not only about HIV infection, but also about normal aging.”

Dr. Steven Deeks stresses the collaborative efforts of experts across fields. “I think there is a high potential for tremendous progress in understanding HIV if we can assemble a team of experts from the world of HIV immunology and the world of gerontology,” he says. “Each field can dramatically inform the other. I believe HIV is a well described, well studied, distinct disease that can be used as a model by the larger community to look at issues of aging.”

References

[1] Finch, C. (2012). Evolution of the Human Lifespan, Past, Present, and Future: Phases in the Evolution of Human Life Expectancy in Relation to the Inflammatory Load. Proceedings of the American Philosophical Society, 156:1, 9-44.

[2] Mascolini, M. (2013). Over 10% in Older HIV Group Fit Alzheimer’s Biomarker Risk Profile. Conference Reports for NATAP: 20th Conference on Retroviruses and Opportunistic Infections, March 3-6, 2013.

[3] Aberg, J.A. (2012). Aging, Inflammation, and HIV Infection. Topics in Antiviral Medicine, 20:3, 101-105.

[4] Deeks, S.G., Verdin, E., and McCune, J.M. (2012). Immunosenescence and HIV. Current Opinion in Immunology, 24: 1-6.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, June, 2013

Apolipoprotein E Genotype and Viral Infections

Last month this column considered current and future progress in Alzheimer Disease (AD) diagnosis, management, and treatment. Because AD is a terrible brain disease with an increasing rate of prevalence with age, and because it represents one of – if not the – worst conditions that can afflict a person with cryopreservation arrangements, I would like to continue our consideration of this well-known and widely-feared neurodegenerative disease. Specifically, our focus will be on apolipoprotein E (apoE) and research regarding its role in the modulation of physiological responses to certain viral infections.

ApoE protein is primarily synthesized peripherally in the liver and mediates cholesterol metabolism systemically, but it is also made in the central nervous system by astroglia and microglia (non-neuronal cell types) where it transports cholesterol to neurons. In the CNS, neurons express receptors for apoE that are part of the low density lipoprotein receptor gene family. Historically, apoE has been recognized for its role in lipoprotein metabolism and its importance in cardiovascular disease. Of course, apoE carrier status is also widely known as the major factor determining one’s risk of developing late-onset Alzheimer disease (AD). But more recent research has indicated that the various isoforms of apoE may also have significant immunological impact by conferring different susceptibilities to other diseases, as well.

The human apoE gene is located on chromosome 19 and contains 79 individual single nucleotide polymorphisms (SNPs). The three major alleles of apoE, named Epsilon-2 (Ɛ2), Epsilon-3 (Ɛ3), and Epsilon-4 (Ɛ4), are determined by differences in SNPs rs429358 and rs7412. The products of these alleles are the protein isoforms apoE2, apoE3, and apoE4, which differ only by single amino acids at two residues (amino acid 112 and amino acid 158). These amino acid substitutions affect noncovalent “salt bridge” formation within the proteins, which ultimately affects the lipoprotein preference, stability, and receptor binding activity of each isoform (see Table 1).

Isoform   Amino acid 112   Amino acid 158   Relative charge   Lipoprotein preference   LDL receptor binding ability
apoE2     cysteine         cysteine          0                HDL                      low
apoE3     cysteine         arginine         +1                HDL                      high
apoE4     arginine         arginine         +2                VLDL, chylomicrons       high

Table 1. ApoE isoform amino acid differences and resulting chemical and physiological changes.

There are also two minor alleles, Epsilon-1 (Ɛ1) and Epsilon-5 (Ɛ5), which are present in less than 0.1% of the population. The three major alleles are responsible for three homozygous (Ɛ2/Ɛ2, Ɛ3/Ɛ3, Ɛ4/Ɛ4) and three heterozygous (Ɛ2/Ɛ3, Ɛ2/Ɛ4, Ɛ3/Ɛ4) genotypes. [I will pause to mention here that it is now quite easy to determine one’s genotype through services such as 23andme.com.]
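For the curious, the mapping from those two SNPs to apoE alleles is simple enough to express in a few lines of code. The rs429358/rs7412 haplotype-to-allele correspondence below is the standard one; the function itself, and the assumption that the two haplotypes are already phased, are mine for illustration only.

```python
# (rs429358 base, rs7412 base) -> apoE allele. rs429358 determines
# residue 112 (T = cysteine, C = arginine); rs7412 determines residue
# 158 (T = cysteine, C = arginine), matching Table 1.
HAPLOTYPE_TO_ALLELE = {
    ("T", "T"): "e2",
    ("T", "C"): "e3",
    ("C", "C"): "e4",
    # ("C", "T") is a rare combination not assigned here
}

def apoe_genotype(hap1, hap2):
    """Return the apoE genotype from two phased (rs429358, rs7412) haplotypes."""
    a1 = HAPLOTYPE_TO_ALLELE.get(tuple(hap1), "unknown")
    a2 = HAPLOTYPE_TO_ALLELE.get(tuple(hap2), "unknown")
    return "/".join(sorted([a1, a2]))

# Example: one e3 haplotype plus one e4 haplotype yields the e3/e4 genotype
print(apoe_genotype(("T", "C"), ("C", "C")))  # e3/e4
```

Note that consumer genotyping services report unphased genotypes, so a double heterozygote at both SNPs is ambiguous without additional information; the sketch sidesteps this by assuming phased haplotypes.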

An interesting document in the field is the literature review by Inga Kuhlmann et al. (Lipids in Health and Disease 2010, 9:8), which assesses hepatitis C, HIV, and herpes simplex disease risk by apoE genotype. An important finding is that the Ɛ4 allele is found less frequently in populations as they age (e.g., 14% of the general German population vs. 5% in centenarians), indicating that Ɛ4 is a major mortality factor in the elderly. This is assumed to be a result of the Ɛ4 allele’s well-known predisposition to Alzheimer and cardiovascular diseases.

The authors explain that “apoE4 carriers have a tendency for 5-10% higher fasting total cholesterol, LDL-cholesterol and triglyceride levels relative to homozygote Ɛ3/Ɛ3” and that this tendency towards higher lipid levels is probably responsible for the 40-50% greater cardiovascular disease risk in Ɛ4 carriers. They also point out that “although the molecular basis of the pathology is poorly understood, and likely to be in part due to apoE genotype associated differences in brain lipid metabolism, an apoE4 genotype has been highly consistently associated with the risk of an age-related loss of cognitive function, in an allele dose fashion.” This means, of course, that Ɛ4/Ɛ4 carriers are at greatest risk for cognitive dysfunction with increasing age.

In the field of immune regulation, a growing number of studies point to apoE’s interaction with many immunological processes. In their article, Kuhlmann et al. summarize the impact of the Ɛ4 allele on susceptibility to specific viral diseases. The authors review a number of studies of the effects of apoE4 genotype on hepatitis C (HCV), human immunodeficiency virus (HIV), and herpes simplex (HSV) infection and outcome in humans.

In general, apoE4 was found to be protective against hepatitis C infection relative to Ɛ3/Ɛ3 controls. Though the exact mechanisms of apoE genotype-specific effects on the HCV life cycle remain uncertain, apoE seems to be involved because “available data indicate that the outcome of chronic HCV infection is better among Ɛ4 carriers due to slower fibrosis progression.”

Concerning the possible influence of apoE genotype on HIV infection and HIV-associated dementia, the authors call attention to the fact that “cholesterol is a crucial component of the HIV envelope and essential for viral entry and assembly.” Given that apoE is essential for cholesterol transport, they hypothesize that apoE genotype influences HIV-induced effects on neurological function. Subsequent review of available research suggests that the Ɛ4 allele is associated with higher steady-state viral load and faster disease progression due to accelerated virus entry in Ɛ4 carriers, but a correlation between apoE4 and HIV-associated dementia “remains controversial and needs to be clarified by further studies.”

Lastly, a review of the literature regarding the effects of apoE4 genotype on herpes simplex virus (HSV)-1 infection and outcome in humans indicates that apoE4 enhances the susceptibility for HSV-1 “as well as the neuroinvasiveness of HSV-1 compared to other apoE variants” (i.e., HSV-1 is found more frequently in the CNS of Ɛ4 carriers). Importantly, the authors also note that “the combination of apoE4 and HSV-1 may lead to a higher risk of Alzheimer disease (AD) than either factor in isolation.”

Due to its generally being associated with higher risk of cardiovascular disease, dementia, and increased susceptibility to and/or accelerated progression of various viral infections, one may wonder why the Ɛ4 allele has not been eliminated by evolutionary selection. This may be explained, in part, by the protective and beneficial effects it exhibits in certain harmful infectious diseases, as demonstrated for hepatitis C.

The exact mechanisms by which apoE influences susceptibility to and the course of viral infection remain obscure. Because the mechanisms of HCV, HIV, and HSV infection are quite similar (i.e., all three viruses compete with apoE for cell attachment and receptor binding), it is interesting to find such divergent effects of apoE genotype among them.

The interaction between the immune system, cognition, and brain diseases such as AD remains a largely unexplored field of inquiry. Further elucidation of the mechanisms by which apoE may influence the pathogenesis of infectious viral diseases could lead to new developments in the treatment of disease based on an individual’s apoE genotype.

Aside from the role that apoE plays in susceptibility to and progression of infectious disease, there is growing interest in the role that infection or a compromised immune system plays in the development of dementia. For example, despite the successful management of HIV with antiretroviral drugs, some patients are showing signs of memory impairment and dementia at a relatively young age. Interestingly, these people also seem to show accelerated aging, which raises important questions about the relationship between the immune system, immunosenescence, and aging.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, May, 2013

Alzheimer Disease in 2020

Any terminal illness is a terrible thing; but to a cryonics member, a brain-destroying neurodegenerative disease is the worst contemporary medical “death sentence” one can receive. There are several flavors of neurodegenerative disorders, many of which primarily affect the patient’s movement, strength, coordination, or the peripheral nervous system. And there are numerous contributory mechanisms in the causation of neurodegeneration, including prion infection and toxin related disease. But the most common – and the most feared – neurodegenerative disease is one that affects not movement, but cognition.

Of course, I am speaking of Alzheimer disease (AD). Originally described in a 51-year-old woman by the Bavarian psychiatrist Alois Alzheimer in 1906, AD is now recognized by neuropathologists as the most common basis for late-life cognitive failure. Culminating in neuronal dystrophy and death leading to the progressive loss of memory and other cognitive functions (i.e., dementia), and affecting individuals of both sexes and of all races and ethnic groups at a rate of occurrence in the U.S. ranging from approximately 1.3% (age 65-74) to 45% (age 85-93), it is easy to see why AD has generated so much intense scientific interest in recent years.

In the recently published work “The Biology of Alzheimer Disease” (2012), most of what is known about AD today is described in detail in chapters covering topics such as the neuropsychological profile and neuropathological alterations in AD, biomarkers of AD, the biochemistry and cell biology of the various proteins involved in AD, animal models of AD, the role of inflammation in AD, the genetics of AD, and treatment strategies. The editors’ selection of contributions has resulted in the most comprehensive and current compendium on Alzheimer disease to date.

The book culminates in a chapter called Alzheimer Disease in 2020, where the editors extol “the remarkable advances in unraveling the biological underpinnings of Alzheimer disease…during the last 25 years,” and yet also recognize that “we have made only the smallest of dents in the development of truly disease-modifying treatments.” So what can we reasonably expect over the course of the next 7 years or so? Will we bang our heads against the wall of discovery, or will there be enormous breakthroughs in identification and treatment of AD?

Though a definitive diagnosis of AD is only possible upon postmortem histopathological examination of the brain, a thorough review of the book leads me to believe that the greatest progress currently being made is in developing assays to diagnose AD at earlier stages. It is now known that neuropathological changes associated with AD may begin decades before symptoms manifest. This, coupled with the uncertainty inherent in a clinical diagnosis of AD, has driven a search for diagnostic markers. Two particular approaches have shown the most promise: brain imaging and the identification of fluid biomarkers of AD.

Historically, imaging was used only to exclude potentially surgically treatable causes of cognitive decline. Over the last few decades, imaging has moved from this minor role to a central position of diagnostic value with ever-increasing specificity. The ability to differentiate AD from alternative or contributory pathologies is of significant value now, but the need for an earlier and more certain diagnosis will only increase as disease-modifying therapies are identified. This will be particularly true if these therapies work best (or only) when initiated at the preclinical stage. Improvements in imaging have also greatly increased our understanding of the biology and progression of AD temporally and spatially. Importantly, the clinical correlations of these changes and their relationships to other biomarkers and to prognosis can be studied.

The primary modalities that have contributed to progress in AD imaging are structural magnetic resonance imaging (MRI), functional MRI, fluorodeoxyglucose (FDG) positron emission tomography (PET), and amyloid PET. Structural MRI, which is used to image the structure of the brain, has obvious utility in visualizing the progressive cerebral atrophy characteristic of AD. Such images can be used as a marker of disease progression and as a means of measuring effective treatments (which would slow the rate of atrophy). Functional MRI, on the other hand, measures changes in the blood oxygen level-dependent (BOLD) MR signal. This signal, which can be acquired during cognitive tasks, may provide the clinician with a tool to compare brain activity across conditions in order to assess and detect early brain dysfunction related to AD and to monitor therapeutic response over relatively short time periods.

FDG PET primarily indicates brain metabolism and synaptic activity by measuring the glucose analog fluorodeoxyglucose (which can be detected by PET after labeling with fluorine-18). A large body of FDG-PET work has identified an endophenotype of AD – that is, a signature set of regions that are typically hypometabolic in AD patients. FDG hypometabolism parallels cognitive function along the trajectory of normal, preclinical, prodromal, and established AD. Over the course of three decades of investigation, FDG PET has emerged as a robust marker of brain dysfunction in AD.

Imaging of β-amyloid (Aβ) – the peptide that makes up the plaques found in the brains of AD patients – is accomplished via amyloid PET to determine brain Aβ content. Historically, this assessment has only been possible upon postmortem examination, so the utility of amyloid imaging lies in moving it from the pathology laboratory to the clinic. Because amyloid deposition begins early in the disease course, however, amyloid PET is not useful as a marker of disease progression.

The well-known hallmarks of AD, the plaques and neurofibrillary tangles first described by Alois Alzheimer in 1906, were discovered in 1985 to be composed primarily of β-amyloid and hyperphosphorylated tau protein, respectively. Advances in our knowledge of Aβ generation and tau protein homeostasis have led to substantial research into disease-modifying drugs aimed at decreasing overall plaque and tangle load in an effort to halt neurodegeneration. Such treatments will likely be most effective if started early in the disease process, making sensitive and accurate fluid biomarkers of Aβ and tau especially important.

Outside of imaging, progress in AD diagnostics stems primarily from the assessment of fluid biomarkers of AD. These biomarkers are generally procured from the cerebrospinal fluid (CSF) and blood plasma and include total tau (T-tau), phosphorylated tau (P-tau), and the 42 amino acid form of β-amyloid (Aβ42). These core biomarkers reflect AD pathology and have high diagnostic accuracy, which is especially useful in diagnosing AD in prodromal and mild cognitive impairment cases.

Because the CSF is in direct contact with the extracellular space of the brain, biochemical changes in the brain can be detected in the CSF. Assays to detect Aβ42 led to the discovery that Aβ42 in AD is decreased to approximately 50% of control levels, making the measurement of Aβ42 a useful clinical tool. Measurements of T-tau (around 300% of control in AD patients) and P-tau biomarkers (a marked increase in AD patients) in combination with Aβ42, however, provide an even more powerful diagnostic assay.
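As a toy illustration of how the three core biomarkers combine, consider the sketch below. The direction of each change follows the figures just quoted (Aβ42 decreased, T-tau and P-tau increased), but the specific cutoffs and control values are invented placeholders, not validated clinical thresholds.

```python
# A toy combination of the three core CSF biomarkers. The direction of
# each change follows the text (Abeta42 ~50% of control, T-tau ~300%,
# P-tau markedly increased); all cutoffs and control values below are
# invented placeholders, not clinical reference ranges.
def ad_biomarker_profile(abeta42, t_tau, p_tau, control):
    """Return True if all three markers, relative to control means,
    move in the AD-typical direction."""
    low_abeta = abeta42 / control["abeta42"] < 0.7   # decreased in AD
    high_ttau = t_tau / control["t_tau"] > 2.0       # increased in AD
    high_ptau = p_tau / control["p_tau"] > 1.5       # increased in AD
    return low_abeta and high_ttau and high_ptau

controls = {"abeta42": 800.0, "t_tau": 300.0, "p_tau": 50.0}  # arbitrary units
print(ad_biomarker_profile(400.0, 900.0, 95.0, controls))  # True
```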

Fluid biomarkers for AD other than Aβ and tau have been posited, but positive results have been difficult to replicate. Novel biomarkers with the most promise include the soluble amyloid precursor protein fragments sAPPβ and sAPPα, β-site APP cleaving enzyme-1 (BACE1), Aβ oligomers, and other Aβ isoforms. Additionally, neuronal and synaptic proteins as well as various inflammatory molecules and markers of oxidative stress may prove valuable as CSF biomarkers. Studies of plasma biomarkers such as those investigating plasma Aβ have yielded contradictory results, but promising novel blood biomarkers for AD may be found in certain signaling and inflammatory proteins.

Taken together, progress in brain imaging and the identification of fluid biomarkers hold great promise for improved diagnosis of AD. When combined with expected drug therapies, we may be able to delay the onset of neurodegeneration and associated cognitive impairment significantly. In the meantime, early diagnosis is helpful in stratifying AD cases, monitoring potential treatments for safety, and monitoring the biochemical effect of drugs. For cryonicists, early diagnosis can help guide treatment and end-of-life care decisions in order to optimize cryopreservation of the brain.

So – back to the original question. What can we predict about the AD landscape in 2020?

Besides continued progress in early diagnosis through brain imaging and fluid biomarkers, the authors anticipate that advances in whole-genome and exome sequencing will lead to a better understanding of all of the genes that contribute to overall genetic risk of AD. They also expect an improved ability to detect the proteins that aggregate in AD, to distinguish their different assembly forms, and to correlate the various conformations with cellular, synaptic, and brain network dysfunction. Lastly, we will continue to improve our understanding of the cell biology of neurodegeneration, of cell-cell interactions, and of inflammation, providing new insights into what is and is not important in AD pathogenesis and how it differs across individuals. This, in turn, will lead to improved clinical trials and treatment strategies.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, April, 2013

Ancient Brains

Cryonics seeks to preserve terminally ill humans in anticipation of future medical advances that may restore these patients to youthful vigor, cure their devastating diseases, and resuscitate them from cryopreservation itself. At the core of this mission lies the goal of preserving that which we know to be most important to continuity of the person him/herself: the brain.

Absent reversible cryopreservation of the brain (i.e., maintenance of viability), a cryonicist’s best hope for eventual resuscitation lies in preserving brain ultrastructure with as much fidelity as possible. Improvements in cryopreservation solutions, methodologies, and protocols from the field to the operating room have greatly enhanced our ability to meet this objective, as evidenced by microscopic evaluations of tissues vitrified in the lab. More recently, CT scans of patients after neuropreservation have provided valuable feedback as to the efficacy of cryoprotective perfusion in actual Alcor cases. Such progress bodes well for good patient outcomes.

But even our greatest attempts at optimal preservation are thwarted by issues such as long ischemic periods resulting in significant perfusion impairment or even the inability to perfuse at all. So how do we evaluate these patients in light of our objective?

Perhaps the best place to start is the extreme. Let us consider, for example, a prehistoric human brain discovered in 2008 at a construction site in York, UK. A paper published in 2011 in the Journal of Archaeological Science (“Exceptional preservation of a prehistoric human brain from Heslington, Yorkshire, UK”) provides gross and histological observations as well as preliminary results of chemical assays in order to determine the extent and cause of preservation of the brain. Low-powered reflected light microscopy and electron microscopy were performed to explore the surviving morphology and histology of the brain, while highly sensitive neuroimmunological techniques and proteomic analyses were employed to explore brain chemistry.

Examination of the skull indicated death by an abrupt trauma to the neck followed by deliberate dismemberment of the head between vertebrae C2 and C3. Significantly, the authors report “no trace of microbial activity, bacterial or fungal, with none of the porosity or ‘tunneling’ that is characteristic of putrefactive microorganisms.” Examination of the brain masses revealed recognizable sulci and gyri, but neither macroscopic nor CT evaluation could differentiate between grey and white matter.

Histological examination of the brain masses showed “a homogenous, amorphous substance that had not retained any cellular or matrix structure.” Transmission electron microscopy (TEM) also did not detect any surviving cellular structure, although it did reveal what appeared to be “numerous morphologically degraded structures characteristic of the myelin sheath of nerve fibres.”

Preliminary biomolecular analysis found that only 5% of the brain was detectable as hydrolysable amino acids, in contrast to fresh brain tissue, in which proteins represent more than one-third of dry weight. When compared with a fresh brain, the Heslington brain was also depleted in polar amino acids and enriched in hydrophobic amino acids. Very little undegraded solvent-soluble brain lipid was preserved (0.8-1.1% wet weight, compared with 17.1% for rat brain). In addition, there was an almost complete absence of phospholipids and only a trace of cholesterol, while degradation products of a wide range of lipids were found in abundance.

Ultimately, the authors determined that the preservation of this brain was due to decapitation (thus eliminating the movement of putrefying bacteria from the gut to the brain) followed by inhibition of postmortem putrefaction achieved through rapid burial into fine-grained wet sediment. They go on to argue that this type of preservation is not as unusual as one might think, citing several similar examples of preserved prehistoric human brains, almost always found in wet burial environments.

While interesting in its own right, few would argue that the Heslington brain represents a state of preservation amenable to resuscitation. The ability to infer anything beyond gross macro structure has been obliterated and the normal chemical constituents of the brain have dissolved almost completely into the surrounding environment. Clearly, much of the look of a brain can be retained while none of the person’s identity remains (or is recoverable).

Let us then look at a situation that hits a little closer to home. Published in Forensic Science International in 2007, an article entitled “Autopsy at 2 months after death: Brain is satisfactorily preserved for neuropathology” provides us with considerable food for thought. In this example, a 77-year-old woman’s whole body was stored postmortem in a 3°C cooling chamber for 2 months prior to chemical fixation of her brain at autopsy.

The authors describe moderate autolysis of internal organs of the body, indicating the start of decomposition and putrefaction, as well as reduced tissue consistency and superficial areas of disintegration of the brain. Overall gross morphology was sufficiently preserved to allow macroscopic examination and application of neuropathological methods for diagnosis of neurological disorders. Importantly, they also report that “histologically, normal brain structures including all major parenchymal cell types (neurons, astrocytes, oligodendrocytes, microglia), neuropil, axons, and myelin sheaths were preserved.”

In this case, the use of cold temperatures (3°C) drastically slowed, but did not stop, deterioration of the brain. However, enough of the brain’s chemical constituents and physical structure remained to provide the basis for possible future resuscitation. And while this woman’s brain was preserved by chemical diffusion over the course of 9 weeks (allowing for continued degradation of subcortical tissues during fixation), the use of cryogenic temperatures to quickly preserve her brain would also have been possible, as has been the case for many “straight frozen” Alcor patients received in similar condition.

Exactly where the line between recoverability and non-recoverability — resulting in information-theoretic death — lies is yet to be determined. And while we push, rightfully, for ever greater preservation methods, we do well to remember that those preserved under less-than-optimal conditions are by no means lost causes. Preserved information, even in fractured and distorted form, may well be adequate to infer the original state.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, March, 2013

Consciousness, Natural Selection, and Knowledge

Cryonics Magazine, February 2013

This is the first entry in a new series of short articles about neuroscience and its implications for the field of human cryopreservation and life extension. In this article I discuss the relationship of the brain to consciousness and knowledge acquisition before venturing into more specific and practical topics.

What is consciousness? Most of us understand the word in context, but when asked to define it we are suddenly at a loss for words or at best we offer a description that seems wholly inadequate. Scientists, philosophers, and religious scholars have debated the source, meaning, and nature of consciousness for all of recorded history. But with the rise of neuroscience over the past few decades, it now seems as though explaining the nature and mechanisms of conscious experience in neurobiological terms may be an attainable goal.

The recent work on consciousness by neuroscientists has left certain philosophers more frustrated than ever before, including the likes of Thomas Nagel and David Chalmers. They suspect that consciousness may be quite different and separate from the brain circuitry proposed to underlie it.

Consciousness has appeared to be a strange and undefinable phenomenon for a very long time. Daniel Dennett captured the feeling very nicely in the 1970s:

“Consciousness appears to be the last bastion of occult properties, epiphenomena, immeasurable subjective states — in short, the one area of mind best left to the philosophers. Let them make fools of themselves trying to corral the quicksilver of “phenomenology” into a respectable theory.”(1)

Consciousness no longer appears this strange to many researchers, but the philosophers just mentioned continue to hold that it may not be reduced to brain processes active in cognition. A common philosophical complaint is that any neurobiological theory of consciousness will always leave something out. What it will always leave out is the feeling itself — the feeling of what it is like to be aware, to see green, to smell flowers, and so on (Nagel 1974; Chalmers, 1996). These are so-called qualia — the experiences themselves — and these are what are important about consciousness. The philosopher making this argument may go on to conclude that no science can ever really explain qualia because it cannot demonstrate what it is like to see green if you have never seen green. Ultimately, they argue, consciousness is beyond the reach of scientific understanding.

By contrast, neuroscientists take for granted that consciousness will be domesticated along with the rest of cognition. Indeed, this work tends to assume that neuroscience will not only identify correlates of consciousness, but will eventually tell us what consciousness is. By and large, these neuroscientific efforts have been directed toward cortical regions of the brain, cortical pathways, and cortical activity. This is due, in part, to the prevalence of clinical studies of human patients with region-specific cortical lesions that are correlated with deficits in specific kinds of experiences. This tendency to focus on the cortex may also reflect the common knowledge that humans possess the highest level of consciousness of all animals and have proportionally more cortex than our closest relatives (and — so the supposition goes — therein lies the difference in levels of consciousness).

Another theory of consciousness, offered by Dr. Gerald M. Edelman, aims to resolve this “divorce” between science and the humanities over theories of consciousness. The premise of Edelman’s theory is that the field of neuroscience has already provided enough information about how the brain works to support a scientifically plausible understanding of consciousness. His theory attempts to reconcile the two positions described earlier by examining how consciousness arose in the course of evolution.

In his book on the topic, Second Nature: Brain Science and Human Knowledge, Edelman says:

“An examination of the biological bases of consciousness reveals it to be based in a selectional system. This provides the grounds for understanding the complexity, the irreversibility, and the historical contingency of our phenomenal experience. These properties, which affect how we know, rule out an all-inclusive reduction to scientific description of certain products of our mental life such as art and ethics. But this does not mean that we have to invoke strange physical states, dualism, or panpsychism to explain the origin of conscious qualia. All of our mental life, reducible and irreducible, is based on the structure and dynamics of our brain.”

In essence, Edelman has attempted to construct a comprehensive theory of consciousness that is consistent with the latest available neuroanatomical, neurophysiological, and behavioral data. Calling his idea Neural Darwinism, Edelman explains that the brain is a selection system that operates within an individual’s lifetime. Neural Darwinism proposes that, during neurogenesis, an enormous “primary repertoire” of physically connected populations of neurons arises. Subsequently, a “secondary repertoire” of functionally defined neuronal groups emerges as the animal experiences the world. A neural “value system,” developed over the course of evolution and believed to be made up of small populations of neurons within deep subcortical structures, is proposed to assign salience to particular stimuli encountered by the animal in order to select patterns of activity.

For example, when the response to a given stimulus leads to a positive outcome the value system will reinforce the synaptic connections between neurons that happened to be firing at that particular moment. When a stimulus is noxious, the value system will similarly strengthen the connections between neurons that happened to be firing at the time the stimulus was encountered, thus increasing the salience of that stimulus. When a stimulus has no salience, synaptic connections between neurons that fired upon first exposure to that stimulus will become weaker with successive exposures.
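The paragraph above amounts to a value-gated Hebbian learning rule, and a minimal sketch makes the logic explicit. To be clear, this is not Edelman's actual model; the update rule, constants, and variable names are illustrative only.

```python
import numpy as np

# A minimal sketch of value-gated Hebbian plasticity in the spirit of the
# paragraph above; not Edelman's actual model. `value` is nonzero for
# salient stimuli (positive or noxious) and 0 for neutral ones.
def update_weights(w, pre, post, value, lr=0.1, decay=0.02):
    """Strengthen co-active connections when the value system fires;
    otherwise let the co-active connections weaken slightly."""
    if value != 0:
        # salience of either sign reinforces whatever was co-active
        w = w + lr * abs(value) * np.outer(post, pre)
    else:
        # no salience: repeated exposure weakens the co-active synapses
        w = w - decay * np.outer(post, pre)
    return np.clip(w, 0.0, 1.0)

rng = np.random.default_rng(0)
w = rng.uniform(0, 0.1, size=(4, 4))          # post x pre synaptic weights
pre = np.array([1, 0, 1, 0])                  # presynaptic activity pattern
post = np.array([0, 1, 0, 1])                 # postsynaptic activity pattern
w = update_weights(w, pre, post, value=+1)    # rewarding outcome: strengthen
w = update_weights(w, pre, post, value=0)     # neutral stimulus: weaken
```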

Importantly, the mapping of the world to the neural substrate is degenerate; that is, no two neuronal groups or maps are the same, either structurally or functionally. These maps are dynamic, and their borders shift with experience. And finally, since each individual has a unique history, no two individuals will express the same neural mappings of the world.

This brings us to the three tenets of Edelman’s theory:

1. Development of neural circuits leads to enormous microscopic anatomical variation that is the result of a process of continual selection;

2. An additional and overlapping set of selective events occurs when the repertoire of anatomical circuits that are formed receives signals because of an animal’s behavior or experience;

3. “Reentry” is the continual signaling from one brain region (or map) to another and back again across massively parallel fibers (axons) that are known to be omnipresent in higher brains.

Edelman thus believes that consciousness is entailed by reentrant activity among cortical areas and the thalamus and by the cortex interacting with itself and with subcortical structures. He suggests that primary consciousness appeared at a time when the thalamocortical system was greatly enlarged, accompanied by an increase in the number of specific thalamic nuclei and by enlargement of the cerebral cortex — probably after the transitions from reptiles to birds and separately to mammals about a quarter of a billion years ago. Higher-order consciousness (i.e., consciousness of consciousness), on the other hand, is due to reentrant connections between conceptual maps of the brain and those areas of the brain capable of symbolic or semantic reference — and it only fully flowered in hominids when true language appeared. Regarding language and its relationship to higher-order consciousness, Edelman explains:

“We do not inherit a language of thought. Instead, concepts are developed from the brain’s mapping of its own perceptual maps. Ultimately, therefore, concepts are initially about the world. Thought itself is based on brain events resulting from the activity of motor regions, activity that does not get conveyed to produce action. It is a premise of brain-based epistemology that subcortical structures such as the basal ganglia are critical in assuring the sequence of such brain events, yielding a kind of presyntax. So thought can occur in the absence of language….

The view of brain-based epistemology is that, after the evolution of a bipedal posture, of a supralaryngeal space, of presyntax for movement in the basal ganglia, and of an enlarged cerebral cortex, language arose as an invention. The theory rejects the notion of a brain-based, genetically inherited, language acquisition device. Instead, it contends that language acquisition is epigenetic. Its acquisition and its spread across speech communities would obviously favor its possessors over nonlinguistic hominids even though no direct inheritance of a universal grammar is at issue. Of course, hominids using language could then be further favored by natural selection acting on those systems of learning that favor language skills.”

Such a theory is attractive because it does not simply concentrate on conscious perception, but it also includes the role of behavior. We do well to keep in mind that moving, planning, deciding, executing plans, and more generally, keeping the body alive, is the fundamental business of the brain. Cognition and consciousness are what they are, and have the nature they have, because of their role in servicing behavior.

An important element of Edelman’s theory that consciousness is entailed by brain activity is that consciousness is not a “thing” or causal agent that does anything in the brain. He writes that “inasmuch as consciousness is a process entailed by neural activity in the reentrant dynamic core it cannot be itself causal.” This process causes a number of “useful” illusions such as “free will.”

Edelman’s theory of consciousness has further implications for the development of brain-based devices (BBDs), which Edelman believes will be conscious in the future as well. His central idea is that the overall structure and dynamics of a BBD, whether conscious or not, must resemble those of real brains in order to function. Unlike robots executing a defined program, the brains of such devices are built to have neuroanatomical structures and neuronal dynamics modeled on those known to have arisen during animal evolution and development.

Such devices currently exist — such as the “Darwin” device under development by The Neurosciences Institute. Darwin devices are situated in environments that allow them to make movements to sample various signal sequences and consequently develop perceptual categories and build appropriate memory systems in response to their experiences in the real world.

And though Edelman recognizes that it is currently not possible to reflect the degree of complexity of the thalamocortical system interacting with a basal ganglia system, much less to have it develop a true language with syntax as well as semantics, he nevertheless suggests that a conscious device could probably someday be built.

More ambitiously, Edelman also thinks that contemporary neuroscience can contribute to a naturalized epistemology. The term “naturalized epistemology” goes back to the analytical philosopher Willard Quine and refers to a movement away from the “justification” (or foundations) of knowledge and toward the empirical processes of knowledge acquisition. Edelman is largely sympathetic to Quine’s project, but provides a broader evolutionary framework for epistemology that also admits internal states of mind (consciousness).

1 Daniel C. Dennett, “Toward a Cognitive Theory of Consciousness,” in Brainstorms: Philosophical Essays on Mind and Psychology (Montgomery, VT: Bradford Books, 1978).