Tuesday 31 October 2017

What are physiological responses to stress?


Introduction

Although the term "stress" is commonly used by the general population to refer to various responses to events that individuals find taxing, the concept involves much more. For centuries, scientific thinkers and philosophers have been interested in learning more about the interactions between the environment (stressful events), emotions, and the body. Much is now known about this interaction, although there is still much left to discover. In the late twentieth century and beyond, much was learned about how stressful events affect the activity of the body (or physiology); for example, it has been established that these physiological responses to stressors sometimes increase the risk of developing a number of diseases or exacerbate existing ones. To best understand the body’s response to stressful events (or stressors), the general sequence of events and the specific responses of various organ systems must be considered.





Almost all bodily responses are mediated at least partially by the central nervous system: the brain and spinal cord. The brain takes in and analyzes information from the external environment as well as from the internal environment (the rest of the body), and it acts to regulate the activities of the body to optimize adaptation or survival. When the brain detects a threat, a sequence of events occurs to prepare the body to fight or to flee the threat. Walter Bradford Cannon, in the early twentieth century, was the first to describe this fight-or-flight response of the body. It is characterized by generalized physiological activation. Heart rate, blood pressure, and respiration increase to enhance the amount of oxygen available to the tissues. The distribution of blood flow changes to optimize efficiency of the tissues most needed to fight or flee: blood flow to the muscles, brain, and skin increases, while it decreases in the stomach and other organs less important for immediate survival. Increased sweating and muscle tension help regulate the body’s temperature and enhance movement if action is needed. Levels of blood glucose and insulin also increase to provide added energy sources, and immune function is depressed. Brain activity increases, resulting in enhanced sensitivity to incoming information and faster reactions to this information.


Taken together, these physiological changes serve to protect the organism and to prepare it to take action to survive threat. They occur quite rapidly and are controlled by the brain through a series of neurological and hormonal events. When the brain detects a threat (or stressor), it sends its activating message to the rest of the body through two primary channels, the sympathetic nervous system (SNS) and the pituitary-adrenal axis. The sympathetic nervous system is a branch of the nervous system that has multiple, diffuse neural connections to the rest of the body. It relays activating messages to the heart, liver, muscles, and other organs that produce the physiological changes already described. The sympathetic nervous system also stimulates the adrenal gland to secrete two hormones, epinephrine and norepinephrine (also known as adrenaline and noradrenaline), into the bloodstream. Epinephrine and norepinephrine further activate the heart, blood vessels, lungs, sweat glands, and other tissues.


Also, the brain sends an activating message through its hypothalamus to the pituitary gland, at the base of the brain. This message causes the pituitary to release hormones into the bloodstream that circulate to the peripheral tissues and activate them. The primary stress hormone that the pituitary gland releases is adrenocorticotropic hormone (ACTH), which in turn acts on the adrenal gland to cause the release of the hormone cortisol. The actions of cortisol on other organs cause increases in blood glucose and insulin, among many other reactions.


In addition to isolating these primary stress mechanisms, research has demonstrated that the body secretes naturally occurring opiates—endorphins and enkephalins—in response to stress. Receptors for these opiates are found throughout the body and brain. Although their function is not entirely clear, some research suggests that they serve to buffer the effects of stressful events by counteracting the effects of the sympathetic nervous system and stress hormones.




General Adaptation Syndrome

The human body contains a very sophisticated series of mechanisms that have evolved to enhance survival. When stressors and the subsequent physiological changes that are adaptive in the short run are chronic, however, they may produce long-term health risks. This idea was first discussed in detail in the mid-twentieth century by physiologist Hans Selye, who coined the term "general adaptation syndrome" to describe the body’s physiological responses to stressors and the mechanisms by which these responses might result in disease.


Selye’s general adaptation syndrome involves three stages of physiological response: alarm, resistance, and exhaustion. During the alarm stage, the organism detects a stressor and responds with sympathetic nervous system and hormonal activation. The second stage, resistance, is characterized by the body’s efforts to neutralize the effects of the stressor. Such attempts are meant to return the body to a state of homeostasis, or balance. (The concept of homeostasis, or the tendency of the body to seek to achieve an optimal, adaptive level of activity, was developed earlier by Cannon.) Finally, if the resistance stage is prolonged, exhaustion occurs, which can result in illness. Selye referred to such illnesses as diseases of adaptation. In this category of diseases, he included hypertension, cardiovascular disease, kidney disease, peptic ulcer, hyperthyroidism, and asthma.


Selye’s general adaptation syndrome has received considerable attention as a useful framework within which to study the effects of stressors on health, but there are several problems with his theory. First, it assumes that all stressors produce characteristic, widespread physiological changes that differ only in intensity and duration. There is compelling evidence, however, that different types of stressors can produce very different patterns of neural and hormonal responses. For example, some stressors produce increases in heart rate, while others can actually cause heart rate deceleration. Thus, Selye’s assumption of a nonspecific stress response must be questioned.


Also, Selye’s theory does not take into account individual differences in the pattern of response to threat. Research during the later twentieth century demonstrated that individuals vary widely in their physiological responses to identical stressors. Such differences may result from genetic or environmental influences. For example, some studies have demonstrated that normotensive offspring of hypertensive parents are more cardiovascularly responsive to brief stressors than individuals with normotensive parents. Although the genes responsible for hypertension might have been passed on from the hypertensive parents, these children might also have different socialization or learning histories that contribute to their exaggerated cardiovascular reactivity to stressors. Whatever the mechanism, this research highlights the fact that individuals and organ systems vary in the degree to which they respond to stress.




Stress and Illness

Coinciding with the scientific community’s growing acknowledgment that stressful events have direct physiological effects, much interest has developed in understanding the relations between these events and the development or maintenance of specific diseases. Probably the greatest amount of research has focused on the link between stress and heart disease, the primary cause of death in the United States. Much empirical work also has focused on gastrointestinal disorders, diabetes, and pain (for example, headache and arthritis). Researchers are beginning to develop an understanding of the links between stress and immune function. Such work has implications for the study of infectious disease (such as flu and mononucleosis), cancer, and acquired immunodeficiency syndrome (AIDS).


A number of types of research paradigms have been employed to study the effects of stressors on health and illness. Longitudinal studies have identified a number of environmental stressors that contribute to the development or exacerbation of disease. For example, one study of more than four thousand residents of Alameda County, California, spanning two decades, showed that a number of environmental stressors such as social isolation were significant predictors of mortality from all causes. Other longitudinal investigations have linked stressful contexts such as loud noise, crowding, and low socioeconomic status with the onset or exacerbation of disease.


A major drawback of such longitudinal research is that no clear conclusions can be drawn about the exact mechanism or mechanisms by which a stressor affects health. Although it is possible, in the Alameda County study, that the relationship between social isolation and disease was mediated by the sympathetic nervous system/hormonal mechanisms already discussed, individuals who are isolated also may be less likely to engage in health care behaviors such as eating healthy diets, exercising, and maintaining preventive health care. Thus, other research paradigms have been used to try to clarify the causal mechanisms by which stressors may influence particular diseases. For example, many scientists use laboratory stress procedures to investigate the influence of brief, standardized stressors on physiology. This type of research has the advantage of being more easily controlled. That is, the researcher can manipulate one or a small number of variables (for example, noise) in the laboratory and measure the physiological effects. These effects are then thought to mimic the physiological effects of such a variable in the natural environment.


This research primarily is conducted to ask basic questions about the relations between stressors, physiology, and subsequent health. The findings also have implications, however, for prevention and intervention. If a particular stressor is identified that increases risk of a particular disease, prevention efforts could be developed to target the populations exposed to this stressor. Prevention strategies might involve either modifying the stressor, teaching people ways to manage more effectively their responses to it, or both.


During the last two or three decades, applied researchers have attempted to develop intervention strategies aimed at controlling the body’s physiological responses to stress. This work has suggested that a number of stress management strategies can actually attenuate physiological responsivity. Most strategies teach the individual some form of relaxation (such as deep muscle relaxation, biofeedback, hypnosis, or meditation), and most of this work has focused on populations already diagnosed with a stress-related disease, such as hypertension, diabetes, or ulcer. The techniques are thought to produce their effects by two possible mechanisms: lowering basal physiological activation (or changing the level at which homeostasis is achieved) and providing a strategy for more effectively responding to acute stressors to attenuate their physiological effects. Research has not proceeded far enough to make any statements about the relative importance of these mechanisms. Indeed, it is not clear whether either mechanism is active in many of the successful intervention studies. Although research does indicate that relaxation strategies often improve symptoms of stress-related illnesses, the causal mechanisms of such techniques remain to be clarified.




The Mind-Body Connection

The notion that the mind and body are connected has been considered since the writings of ancient Greece. Hippocrates described four bodily humors (fluids) that he associated with differing behavioral and psychological characteristics. Thus, the road was paved for scientific thinkers to consider the interrelations between environment, psychological state, and physiological state (that is, health and illness). Such considerations developed most rapidly in the twentieth century, when advancements in scientific methodology permitted a more rigorous examination of the relationships among these variables.


In the early twentieth century, Cannon was the first to document and discuss the fight-or-flight response to threatening events. He also reasoned that the response was adaptive, unless prolonged or repeated. In the 1940s, two physicians published observations, consistent with Cannon’s, of an ulcer patient who had a gastric fistula, which enabled the doctors to observe directly the contents of the stomach. They reported that stomach acids and bleeding increased when the patient was anxious or angry, thus documenting the relations between stress, emotion, and physiology. Shortly after this work was published, Selye began reporting his experiments on the effects of cold and fatigue on the physiology of rats. These physical stressors produced enlarged adrenal glands and small thymus and lymph glands (involved in immune system functioning) as well as increased ulcer formation.


Psychiatrists took this information, along with the writings of Sigmund Freud, to mean that certain disease states might be associated with particular personality types. Efforts to demonstrate the relationship between specific personality types and physical disease endpoints culminated in the development of a field known as psychosomatic medicine. Research, however, does not support the basic tenet of this field, that a given disease is linked with specific personality traits; thus, psychosomatic medicine has not received much support from the scientific community. The work of clinicians and researchers in psychosomatic medicine paved the way for late twentieth-century conceptualizations of the relations between stress and physiology. Most important, biopsychosocial models that view people’s health status in the context of the interaction between their biological vulnerability, psychological characteristics, and socio-occupational environment have been developed for a number of physical diseases.


Future research into individual differences in stress responses will further clarify the mechanisms by which stress exerts its effects on physiology. Once these mechanisms are identified, intervention strategies for use with patients or for prevention programs for at-risk individuals can be identified and implemented. Clarification of the role of the endogenous opiates in the stress response, for example, represents an important dimension in developing new strategies to enhance individual coping with stressors. Further investigation of the influence of stressors on immune function should also open new doors for prevention and intervention.


Much remains to be learned about why individuals differ in their responses to stress. Research in this area will seek to determine the influence of genes, environment, and behavior on the individual, elucidating the important differences between stress-tolerant and stress-intolerant individuals. Such work will provide a better understanding of the basic mechanisms by which stressors have their effects, and should lead to exciting new prevention and intervention strategies that will enhance health and improve the quality of life.




Bibliography


Craig, Kenneth D., and Stephen M. Weiss, eds. Health Enhancement, Disease Prevention, and Early Intervention: Biobehavioral Perspectives. New York: Springer, 1990. Print.



Everly, George S., Jr., and Jeffrey M. Lating. A Clinical Guide to the Treatment of the Human Stress Response. 3rd ed. New York: Springer, 2013. Print.



Fink, George, et al., eds. Encyclopedia of Stress. 2nd ed. 4 vols. Boston: Academic, 2007. Print.



Karren, Keith J., et al. Mind/Body Health: The Effects of Attitudes, Emotions, and Relationships. 4th ed. San Francisco: Pearson/Benjamin Cummings, 2009. Print.



Ogden, Jane. Health Psychology. New York: Open UP, 2012. Print.



Rice, Virginia Hill, ed. Handbook of Stress, Coping, and Health: Implications for Nursing Research, Theory, and Practice. 2nd ed. Thousand Oaks: Sage, 2012. Print.



Selye, Hans. The Stress of Life. Rev. ed. New York: McGraw-Hill, 1978. Print.



Sher, Leo, ed. Psychological Factors and Cardiovascular Disorders: The Role of Stress and Psychosocial Influences. New York: Nova Science, 2009. Print.

What are genital warts?


Definition

Genital warts are growths or bumps that appear on the vulva; in or around the
vagina or anus; on the cervix, penis, scrotum, groin, or thigh; or, rarely, in the
mouth or throat. The warts may be raised or flat, single or multiple, small or
large. Some may cluster to form a cauliflower-like shape. This condition is one of
the most common sexually transmitted diseases (STDs).


Causes

Genital warts are caused by the human papillomavirus (HPV). HPV is a family of more than
eighty common viruses. Many types of HPV cause harmless skin warts that are often
found on the fingers or feet. Only a few types are thought to cause genital
warts.


HPV is easily spread during oral, genital, or anal sex with an infected partner. About two-thirds of people who have sex with a partner who has genital warts will also develop them. Warts can take several weeks or months to appear. Most people will be exposed to a form of HPV at some point in their lives, but not everyone will become infected or develop symptoms.




Risk Factors

Risk factors for HPV and genital warts include having multiple sexual partners, having a first male sexual partner who has had two or more previous sexual partners, sex without condoms, sex at an early age, skin-to-skin contact with an infected partner, a previous history of genital warts, pregnancy, smoking, and taking oral contraceptives. Persons age fifteen to thirty years are at higher risk.




Symptoms

Genital warts often look like fleshy, raised growths. They may have a cauliflower shape and often appear in clusters. In women, warts may be found in the area of the vulva, inside or around the vagina or anus, and on the cervix. In men, warts are less common. If present, they are usually found on the tip or shaft of the penis, on the scrotum, or around the anus. The following symptoms may also occur in both women and men: bleeding, itching, irritation, burning, and a secondary bacterial infection with redness, tenderness, or pus.


Complications of HPV include cancer. Most strains of HPV that produce genital warts do not cause cancer, but certain strains may cause cervical cancer. Less common are cancers of the vulva, anus, or penis. It is important for women to have yearly Pap tests, which can detect any HPV-related problems.


Genital warts may get larger during pregnancy and could make urination difficult. Warts in or near the vaginal opening may also block the birth canal during delivery.




Screening and Diagnosis

A doctor can diagnose genital warts by looking at them. If external warts are found on a woman, her cervix is usually also checked. In all patients, the doctor may use a special solution to help find lesions that do not have classic features.


An abnormal Pap test may indicate HPV, but the doctor will order more accurate tests, such as a colposcopy, to diagnose HPV. A colposcopy is an examination with a special magnifying device (a colposcope) that allows the doctor to see whether warts are present on the cervix or in the vagina. The doctor may take a tissue sample (biopsy) and test it. During an HPV test, a swab of cells from the affected area can be checked for certain types of HPV.




Treatment and Therapy

Treatment, which depends on the size and location of the warts, helps the symptoms but does not cure the virus. The virus stays in the body, and warts or other problems may recur.


Treatments may include topical medications, which the doctor may recommend applying to the affected areas. These include imiquimod cream, podophyllum resin, podofilox solution, 5-fluorouracil cream, and trichloroacetic acid.


Other treatment options include cryosurgery (freezing the wart),
electrocautery (burning the wart), and laser treatment, all of which destroy the
warts. These methods are used on small warts and on large warts that have not
responded to other treatment. A large wart can also be removed surgically. For
warts that keep coming back, an antiviral drug, called alpha-interferon, can be
injected into the wart.




Prevention and Outcomes

The only way to completely prevent HPV from spreading is to avoid physical contact with an infected partner. Latex condoms may help reduce the spread of HPV infection and genital warts. Condoms are not 100 percent effective, however, because they do not cover the entire genital area. Other ways to prevent infection include abstaining from sex, having a monogamous relationship, and getting regular checkups for STDs. Women should get regular Pap tests, starting at age eighteen years or at the start of sexual activity.


The vaccine Gardasil protects against four types of HPV. Studies have
shown that the vaccine reduced the number of precancerous cervical cell changes
for up to three years after the shot. The vaccine is routinely given to girls ages
eleven to twelve years, and a “catch-up” vaccine is given to young women who have
not been vaccinated. The U.S. Food and Drug Administration has also
approved the use of Gardasil in males ages nine to twenty-six years.


Genital warts are rare in children. This diagnosis may indicate sexual abuse, which should be reported to the authorities.




Bibliography


Behrman, Richard E., Robert M. Kliegman, and Hal B. Jenson, eds. Nelson Textbook of Pediatrics. 18th ed. Philadelphia: Saunders/Elsevier, 2007.



Centers for Disease Control and Prevention. “Genital Warts: Sexually Transmitted Diseases Treatment Guidelines 2010.” Available at http://www.cdc.gov/std/treatment/2010/genital-warts.htm.



_______. “HPV Vaccines.” Available at http://www.cdc.gov/hpv/vaccine.html.



Dunne, E. F., and L. E. Markowitz. “Genital Human Papillomavirus Infection.” Clinical Infectious Diseases 43 (2006): 624.



EBSCO Publishing. DynaMed: Condyloma acuminatum. Available through http://www.ebscohost.com/dynamed.



Hanna, E., and G. Bachmann. “HPV Vaccination with Gardasil: A Breakthrough in Women’s Health.” Expert Opinion on Biological Therapy 6 (2006): 1223-1227.



Henderson, Gregory, and Batya Swift Yasgur. Women at Risk: The HPV Epidemic and Your Cervical Health. New York: Putnam, 2002.



Lowy, D. R., and J. T. Schiller. “Papillomaviruses and Cervical Cancer: Pathogenesis and Vaccine Development.” Journal of the National Cancer Institute Monographs 23 (1998): 27-30.



McCance, Dennis J., ed. Human Papilloma Viruses. New York: Elsevier Science, 2002.



McLemore, M. R. “Gardasil: Introducing the New Human Papillomavirus Vaccine.” Clinical Journal of Oncology Nursing 10 (2006): 559-560.



Markowitz, Lauri E., et al. “Quadrivalent Human Papillomavirus Vaccine: Recommendations of the Advisory Committee on Immunization Practices (ACIP).” Morbidity and Mortality Weekly Report 56 (March 23, 2007): 1-24.



“New Vaccine Prevents Cervical Cancer.” FDA Consumer 40 (2006): 37.



“Quadrivalent Vaccine Against Human Papillomavirus to Prevent High-Grade Cervical Lesions.” New England Journal of Medicine 356 (2007): 1915-1927.



U.S. Food and Drug Administration. “FDA Approves New Indication for Gardasil to Prevent Genital Warts in Men and Boys.” Available at http://www.fda.gov.



Winer, R. L., et al. “Risk of Female Human Papillomavirus Acquisition Associated with First Male Sex Partner.” Journal of Infectious Diseases 197 (2008): 279-282.

Why has Dee assumed African dress, hairstyle, and name?


Dee has assumed an African-style dress and name as well as natural hair because she has become concerned with her heritage.  Of her new name, she tells Mama, "'I couldn’t bear it any longer, being named after the people who oppress me.'"  Mama reminds her that she was named after her Aunt Dicie, who was named after her mother, who was named after her mother, but she cannot trace the name back further than the Civil War.  This seems to confirm, to Dee, that the name had something to do with slavery and thus is not part of her real heritage or something worth preserving (despite the fact that several strong women in her family have had the name).  Thus, she seems to miss the point that her name is a mark of her heritage.


Dee does not think of heritage as something that one makes use of every day; to her, it is something to be preserved, not used.  It's as if Dee wants to prove that she has a certain kind of authenticity, and so she wants her grandmother's butter dish, the churn top, and the handmade quilts, not so she can use them, as her family does, but so she can display them and do "'something artistic'" with them.  With her name, her dress, and her hair, she seems to feel as though she is (re)creating a more authentic version of herself, but her lack of knowledge about where these things have come from or who made them indicates that she really doesn't have the most sincere motivation.

What is grieving in psychopathology?


Introduction

Much of life depends on successful adaptation to change. When that change is experienced as a loss, the emotional and cognitive reactions are properly referred to as "grief." When the specific loss is acknowledged by a person’s culture, the loss is often met with rituals, behaviors that follow a certain pattern, sanctioned and choreographed within the culture. The term “bereavement” is applied to the loss of a significant person (such as a spouse, parent, child, or close relative or friend). In this case the grief and the sanctioned rituals are referred to as "mourning," although some writers use the terms “mourning” and “grieving” as synonyms.









Reaction to a loss often depends on whether the loss is experienced as central as opposed to peripheral to the self, with more central losses exerting greater impact on a person’s ability to function. People who have experienced an important loss may experience obsessive thoughts about who or what was lost, a sense of unreality, a conviction that they were personally responsible for the loss, a sense that there is no help or hope, a belief that they are bad people, and a sense that they are unable to concentrate and remember. Searching for and even perceiving the lost person (often in a dream) are not uncommon.


People’s emotional response may at first be engrossing. Shock, anger, sadness, guilt, anxiety, or even numbness are all possible reactions. Crying, fatigue, agitation, or even withdrawal are not unusual. Some people find it difficult to accept or absorb the reality of the loss in a reaction known as "denial." Depending on cultural, family, and individual traditions, some people suppress, repress, and deny part of their awareness, grief reaction, or both.


The centrality of loss within the self is also related to its circumstances. If people are prepared for the loss, they have made themselves less vulnerable to the loss of that person, place, or object in a process known as "anticipatory grief." This is experienced, for example, by those caring for terminally ill patients, as well as by the patients themselves. Although some argue that grief that is anticipated may be less challenging than that following an unexpected loss, the experience of grieving for someone and caring for that person at the same time can be extremely challenging.




History of Grief Studies

The study of grief as a scholarly concern began with an essay, “Mourning and Melancholia,” written in 1917 by the Austrian founder of psychoanalysis, Sigmund Freud. In it, Freud proposed that hysteria (a disorder of emotional instability and dissociation) and melancholia are symptoms of pathological grief. He indicated that painful dejection, loss of the ability to love, inhibition of activity, and decrease in self-esteem that continue beyond the normal time are what distinguish melancholia from mourning (the pathological from the normal). In melancholia, it is the ego (or self) that becomes poor and impoverished. In the pathological case, the damage to the self becomes permanent instead of remaining a temporary and reversible deprivation.


The study of grief as a normal process of loss evolved over two-and-a-half decades. It was not until 1944 that psychiatrist Erich Lindemann published a study based mostly on interviews with relatives of victims of the Cocoanut Grove nightclub fire in Boston in 1942. He characterized five different aspects of the grief reaction. Each of the five was believed by Lindemann to be normal. Each would give way as the individual readjusted to the environment without the deceased, formed new relationships, and released the ties of connection with the deceased. Morbid or pathological grief reactions were seen as distortions of the normal patterns. A common distortion had to do with delay in reacting to the death. In these cases, the person would either deny the death or continue to maintain composure and show little or no reaction to the death’s occurrence. Other forms of distorted reactions were overactivity, acquisition of symptoms associated with the deceased, social isolation, repression of emotions, and activities that were detrimental to the person’s social status and economic well-being. Examples of such detrimental activities might be getting drunk, being promiscuous, giving away all one’s money, and quitting one’s job.


In the early 1950s, British psychoanalyst and physician John Bowlby began to study loss in childhood, usually with children who were separated from their mothers. His early generalization summarized the child’s response in a threefold way: protest, despair, and detachment. From his later work, which included adult mourning, he came to the conclusion that mourning follows a similar pattern whether it takes place in childhood, adolescence, or adulthood. In his later work, he specified wide time frames for the first and second phases and expanded his threefold description of the process to identify four phases of mourning. All four phases overlap, and people may go back for a while to a previous phase. These phases were numbing, which may last from a few hours to a week and may be interrupted by episodes of intense distress or anger; yearning and searching for the lost figure, which may last for months and even for years; disorganization and despair; and reorganization to a greater or lesser degree.


What was new and interesting about the fourth phase is Bowlby’s introduction of a positive ending to the grieving process. This is the idea of reorganization, a positive restructuring of the person and the individual’s perceptual field. This is a striking advance beyond Lindemann’s notion that for the healthy person, the negative aspects of grieving would be dissipated in time.


Meanwhile, in the mid-1960s, quite independently of Bowlby, Swiss-born psychiatrist Elisabeth Kübler-Ross was interviewing terminally ill patients in Chicago. She observed closely, listened sensitively, and reported on their experiences in an important book, On Death and Dying (1969). Her work focused on the experiences of the terminally ill.


Kübler-Ross was there as patients first refused to believe the prognosis, as they got angry at themselves and at others around them, as they attempted to argue their way out (to make a deal with God or whoever might have the power to change the reality), as they faced their own sadness and depression, and finally as they came to a sense of acceptance about their fate. (Her idea of acceptance is similar to Bowlby’s concept of reorganization.) From her interviews, she abstracted a five-stage process in which terminally ill patients came to deal with the loss of their own lives: denial and isolation, anger, bargaining (prayer is an example), depression, and acceptance.




The Process

The grief process is complex and highly individualized. It is seldom as predictable and orderly as the stages presented by Bowlby and Kübler-Ross might imply. Studies conducted by psychologist Janice Genevro in 2003 and Yale University researchers in 2007, as well as a survey of Canadian hospices in 2008, contradict the stage theory of grief altogether and suggest that grief is actually a complex mix of recurring emotions and symptoms that gradually subside. The duration of intense grief is quite variable, which can be a source of frustration to bereaved individuals who simply want to know when it will end. Some people take a while to fully realize their loss; in a process known as "denial" or "disbelief," the grieving itself may be delayed. In a normal grief process, bereaved people eventually reach acceptance and accomplish a reassessment and reorganization of their lives.


Grieving is the psychological, biological, and behavioral way of dealing with the stress created when a significant part of the self, or a prop for the self, is taken away. Austrian-born endocrinologist Hans Selye made a career of defining stress and examining the positive and negative effects it may have on a person. He defined stress as “the nonspecific response of the body to any demand made upon it.” Clearly, any significant change calls for adjustment and thus involves stress. Selye indicated that what counts is the severity of the demand, and this depends on the perception of the person involved.




Complicated Grief or Depression?

Researchers and practitioners are beginning to understand the antecedents and consequences of complicated grief. To some extent, the likelihood of complicated grief depends on the nature of the loss. Losses that are unexpected and those involving sudden or violent death or suicide are especially problematic, as are those associated with childhood abuse or neglect. Individuals who are socially isolated, who were abused or neglected as children, who had a difficult emotional relationship with the deceased, or who lack resilience are particularly vulnerable to complicated grief. Prior history of mental illness, religion, gender, age, and social support are other factors. Between 10 and 20 percent of individuals experiencing a loss exhibit complicated grief reactions. Apart from its negative emotional attributes, this type of grief reaction is associated with higher rates of illness and suicide. In 2013, the American Cancer Society estimated that major clinical depression develops in up to 20 percent of bereaved persons, diagnosable after two months of extreme symptoms such as delusions, hallucinations, feelings of worthlessness, or dramatic weight loss.


Clinical trials suggest that cognitive behavioral therapy or complicated grief treatment may be helpful for those experiencing complicated grief. There is also limited evidence for using antidepressant medications for treating complicated grief, though the outcomes were not as good as those for people with clinical major depression unrelated to grief.


The Diagnostic and Statistical Manual of Mental Disorders (DSM) long stated that clinicians should rule out grief due to a recent loss (within the first few weeks after the death) before making a diagnosis of depression or an adjustment disorder. The fifth edition (DSM-5), published in 2013, eliminates this "bereavement exclusion," generating a great deal of controversy. The American Psychiatric Association states that the reasons for the change are that bereavement often lasts one to two years, not less than two months, and that major depression can be triggered by bereavement, particularly in those who have personal or family histories of depression. Proponents argue that bereavement is merely another stressor like unemployment or divorce and therefore should be considered similarly in diagnosing a patient. Many critics warn that normal grief reactions will be pathologized and patients given unnecessary treatment, particularly antidepressants.




Cultural and Social Influences

Because loss is such a regular part of life, a person’s reaction to it is likely to be regulated by family and cultural influences. Religious and cultural practices have developed to govern the expected and acceptable ways of responding to loss. Many of these practices provide both permission for and boundaries to the expression of grief. They provide both an opportunity to express feelings and a limit to their expression. Often a religion or culture will stipulate the rituals that must be observed, how soon they must be concluded, how long they must be extended, what kind of clothing is appropriate, and what kinds of expressions are permissible and fitting. They also provide a cognitive framework in which the loss may be understood and, perhaps, better accepted—for example, framing the loss as God’s will.


The funeral home industry has been subject to criticism for profiting from the ubiquity of death. In part as a result, several organizations have sprung up to deliver affordable alternatives to traditional funeral arrangements, such as cremation, home-based funeral services, and green (environmentally sensitive) burials.


Toward the end of the twentieth century, professional interest in grief and grief counseling began to grow. The Association for Death Education and Counseling was founded in 1976 to provide a forum for educators and clinicians addressing this concern. Major journals such as Omega and Death Studies provide sources for research on grief and loss.




Bibliography


American Cancer Society. Coping with the Loss of a Loved One. Atlanta: American Cancer Society, 4 Feb. 2013. PDF file.



Bowlby, John. Loss: Sadness and Depression. London: Tavistock Inst., 1980. Print.



Doka, Kenneth J. "Grief and the DSM: A Brief Q&A." HuffPost Healthy Living. TheHuffingtonPost.com, 29 May 2013. Web. 21 May 2014.



Freud, Sigmund. “Mourning and Melancholia.” Collected Papers. Vol. 4. London: Hogarth, 1956. Print.



"Grief, Bereavement, and Coping with Loss." National Cancer Institute. US Dept. of Health and Human Services, National Institutes of Health, 6 Mar. 2013. Web. 21 May 2014.



Harvey, John H., ed. Perspectives on Loss: A Sourcebook. Philadelphia: Taylor, 1998. Print.



Konigsberg, Ruth Davis. "New Ways to Think about Grief." Time. Time, 29 Jan. 2011. Web. 21 May 2014.



Lindemann, Erich. “Symptomatology and Management of Acute Grief.” American Journal of Psychiatry 101 (1944): 141–48. Print.



Marrone, Robert. Death, Mourning, and Caring. Pacific Grove: Brooks/Cole, 1997. Print.



Mitford, Jessica. The American Way of Death Revisited. Rev. ed. New York: Knopf, 1998. Print.



Parkes, Colin Murray, and Holly G. Prigerson. Bereavement: Studies of Grief in Adult Life. 4th ed. New York: Routledge, 2010. Print.



Worden, J. William. Grief Counseling and Grief Therapy: A Handbook for Mental Health Professionals. 3rd ed. New York: Springer, 2008. Print.

Monday 30 October 2017

What is conditioning?


Introduction

Learning refers to any change in behavior or mental processes associated with experience. Traditionally, psychologists interested in learning have taken a behavioral approach, which involves studying in detail the relationship between environmental events and resulting behavioral changes. Though the behavioral approach typically involves studying the behavior of nonhuman subjects in controlled laboratory environments, the results of behavioral research have often found wide application in human contexts. Since the early twentieth century, behavioral psychologists have extensively studied two primary forms of learning: classical and operant conditioning.











Classical Conditioning

Classical conditioning is also referred to as associative learning or Pavlovian conditioning, after its primary founder, the Russian physiologist Ivan Petrovich Pavlov. Pavlov’s original studies involved examining digestion in dogs. The first step in digestion is salivation. Pavlov developed an experimental apparatus that allowed him to measure the amount of saliva the dog produced when presented with food. Dogs do not need to learn to salivate when food is given to them—that is an automatic, reflexive response. However, Pavlov noticed that, with experience, the dogs began to salivate before the food was presented, suggesting that new stimuli had acquired the ability to elicit the response. To examine this unexpected finding, Pavlov selected specific stimuli, which he systematically presented to the dog just before food was presented. The classic example is the ringing of a bell, but there was nothing special about the bell per se. Dogs do not salivate in response to a bell ringing under normal circumstances. What made the bell special was its systematic relationship to the delivery of food. Over time, the dogs began to salivate in response to the ringing of the bell even when the food was not presented. In other words, the dog learned to associate the bell with food so that the response (salivation) could be elicited by either stimulus.


In classical conditioning terminology, the food is the unconditioned stimulus (US). It is unconditioned (or unlearned) because the animal naturally responds to it before the experiment is begun. The sound of the bell ringing is referred to as the conditioned stimulus (CS). It is not naturally effective in eliciting salivation—learning is required in order for it to do so. Salivating in response to food presentation is referred to as the unconditioned response (UR) and salivating when the bell is rung is referred to as the conditioned response (CR). Though it would seem that saliva is saliva, it is important to differentiate the conditioned from the unconditioned response, because these responses are not always identical. More important, one is a natural, unlearned response (the UR) while the other requires specific learning experiences to occur (the CR).


Classical conditioning is not limited to dogs and salivation. Modern researchers examine classical conditioning in a variety of ways. What is important is the specific pairing of some novel stimulus (the CS) with a stimulus that already elicits the response (the US). One common experimental procedure examines eye-blink conditioning in rabbits, where a brief puff of air to the eye serves as the US and the measured response (UR) is blinking. A tone, a light, or some other initially ineffective stimulus serves as the CS. After many pairings in which the CS precedes the air puff, the rabbit will begin to blink in response to the CS in the absence of the air puff. Another common behavior that is studied in classical conditioning research is conditioned suppression. Here a CS is paired with an aversive US, such as a mild electric shock. Presentation of the shock disrupts whatever behavior the animal is engaged in at the time, and with appropriate pairing over time the CS comes to do so as well. A final example that many humans can relate to is taste-aversion learning. Here a specific taste (CS) is paired with a drug or procedure that causes the animal to feel ill (US). In the future, the animal will avoid consuming (CR) the taste (CS) associated with illness (US). Taste aversions illustrate the fact that not all forms of conditioning are created equal. To learn a conditioned eye-blink or salivation response requires many CS-US pairings, while taste aversions are often learned with only one pairing of the taste and illness.




Underlying Factors

Psychologists have long studied the factors that are necessary and sufficient for producing classical conditioning. One important principle is contiguity, which refers to events occurring closely together in space or time. Classical conditioning is most effective when the CS and US are more contiguous, though precisely how closely together they must be presented depends on the type of classical conditioning observed. Taste-aversion conditioning, for example, will occur over much longer CS-US intervals than would be effective with other conditioning arrangements. Nevertheless, the sooner illness (US) follows taste (CS), the stronger the aversion (CR) will be.


Though seemingly necessary for classical conditioning, contiguity is not sufficient. A particularly clear demonstration of this fact is seen when the CS and US are presented at the exact same moment (a procedure called simultaneous conditioning). Though maximally contiguous, simultaneous conditioning is an extremely poor method for producing a CR. Furthermore, the order of presentation matters. If the US is presented before the CS, rather than after it as is usually the case, then inhibitory conditioning will occur. Inhibitory conditioning is seen in experiments in which behavior can change in two directions. For example, with a conditioned suppression procedure, inhibitory conditioning is seen when the animal increases, rather than decreases, its ongoing behavior when the CS is presented.


These findings have led modern researchers to focus on the predictive relationship between the CS and the US in classical conditioning. An especially successful modern theory of classical conditioning, the Rescorla-Wagner model, suggests that the CS acquires associative strength in direct proportion to how much information it provides about the upcoming US. In addition to providing a quantitative description of the way in which a CR is learned, the Rescorla-Wagner model has predicted a number of counterintuitive conditioning phenomena, such as blocking and overshadowing. Taken as a whole, the newer theoretical conceptions of classical conditioning tend to view the learning organism less as a passive recipient of environmental events than as an active analyzer of information.
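The model's core update rule is simple enough to sketch in a few lines. In the sketch below (a minimal illustration; the learning-rate values and trial counts are arbitrary choices, not taken from the original papers), each conditioned stimulus present on a trial gains associative strength in proportion to the prediction error: the difference between the US magnitude (lambda) and the summed strength of all stimuli present. Run in two phases, it reproduces the blocking effect the model predicts.

```python
# A minimal sketch of the Rescorla-Wagner update rule:
#   delta_V = alpha * beta * (lambda - V_total)
# where V_total sums the strengths of all CSs present on a trial.
# Parameter values (alpha=0.3, beta=1.0, 50 trials) are arbitrary.

def rw_trial(strengths, present, lam, alpha=0.3, beta=1.0):
    """Apply one Rescorla-Wagner update to every CS present on the trial."""
    v_total = sum(strengths[cs] for cs in present)
    error = lam - v_total              # prediction error drives learning
    for cs in present:
        strengths[cs] += alpha * beta * error
    return strengths

# Blocking demonstration: pretrain A alone, then train the AB compound.
blocked = {"A": 0.0, "B": 0.0}
for _ in range(50):                    # phase 1: A -> US
    rw_trial(blocked, ["A"], lam=1.0)
for _ in range(50):                    # phase 2: AB -> US
    rw_trial(blocked, ["A", "B"], lam=1.0)

# Control: B trained in compound from the start, with no pretraining of A.
control = {"A": 0.0, "B": 0.0}
for _ in range(50):
    rw_trial(control, ["A", "B"], lam=1.0)

print(blocked["B"], control["B"])      # B learns far less when A was pretrained
```

Because stimulus A already predicts the US almost perfectly by the end of phase 1, the prediction error on the compound trials is near zero, so B gains almost no strength in the blocked condition, while in the control condition A and B share the available associative strength.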


Does classical conditioning account for any human behaviors? At first glance, these processes might seem a bit simplistic to account for human behaviors. However, some common human reactions are quite obviously the result of conditioning. For instance, nearly everyone who has had a cavity filled will cringe at the sound of a dentist’s drill, because the sound of the drill (CS) has been paired in the past with the unpleasant experience of having one’s teeth drilled (US). Cringing at the sound of the drill would be a conditioned response (CR). Psychologists have found evidence implicating classical conditioning in a variety of important human behaviors, from the emotional effects of advertising to the functioning of the immune system to the development of tolerance in drug addiction.




Operant Conditioning

At about the same time that Pavlov was conducting his experiments in Russia, an American psychologist named Edward L. Thorndike was examining a different form of learning that has come to be called instrumental or operant conditioning. Thorndike’s original experiments involved placing cats in an apparatus he designed, which he called a puzzle box. A plate of food was placed outside the puzzle box, but the hungry cat was trapped inside. Thorndike designed the box so that the cat needed to make a particular response, such as moving a lever or pulling a cord, for a trap door to be released, allowing escape and access to the food outside. The amount of time it took the cat to make the appropriate response was measured. With repeated experience, Thorndike found that it took less and less time for the cat to make the appropriate response.


Operant conditioning is much different from Pavlov’s classical conditioning. As was stated before, classical conditioning involves learning “what goes with what” in the environment. Learning the relationship changes behavior, though behavior does not change the environmental events themselves. Through experience, Pavlov’s dogs began to salivate when the bell was rung, because the bell predicted food. However, salivating (the CR) did not cause the food to be delivered. Thorndike’s cats, on the other hand, received no food until the appropriate response was made. Through experience, the cats learned about the effects of their own behavior on environmental events. In other words, they learned the consequences of their own actions.


To describe these changes, Thorndike postulated the law of effect. According to the law of effect, in any given situation an animal may do a variety of things. The cat in the puzzle box could walk around, groom itself, meow, or engage in virtually any type of feline behavior. It could also make the operant response, the response necessary to escape the puzzle box and gain access to the food. Initially, the cat may engage in any of these behaviors and may produce the operant response simply by accident or chance. However, when the operant response occurs, escape from the box and access to the food follows. In operant conditioning terminology, food is the reinforcer (Sr, or reinforcing stimulus), and it serves to strengthen the operant response (R) that immediately preceded it. The next time the animal finds itself in the puzzle box, its tendency to produce the operant response will be a bit stronger as a consequence of the reinforcement. Once the response is made again, the animal gains access to the food again—which strengthens the response further. Over time, the operant response is strengthened, while other behaviors that may occur are not strengthened and thus drop away. So, with repeated experience, the amount of time that it takes for the animal to make the operant response declines.
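The positive-feedback dynamic described above, in which a reinforced response becomes more likely and therefore gets reinforced still more often, can be sketched in a toy simulation (the response names, initial strengths, and increment are hypothetical illustrations, not Thorndike's data):

```python
import random

# Toy law-of-effect simulation: responses are chosen in proportion to their
# current strength, and only the reinforced response is strengthened.

def trial(strengths, reinforced="press lever", increment=0.5):
    """Pick one response in proportion to its strength; strengthen it if reinforced."""
    total = sum(strengths.values())
    pick = random.uniform(0, total)
    for response, s in strengths.items():
        pick -= s
        if pick <= 0:
            break
    if response == reinforced:
        strengths[response] += increment
    return response

random.seed(1)
strengths = {"walk": 1.0, "groom": 1.0, "meow": 1.0, "press lever": 1.0}
for _ in range(500):
    trial(strengths)

# After training, the reinforced response dominates the repertoire,
# while the unreinforced behaviors keep their initial strength.
share = strengths["press lever"] / sum(strengths.values())
print(round(share, 2))
```

The unreinforced behaviors never lose strength outright in this sketch; they simply stand still while the operant response grows, which mirrors the way competing behaviors "drop away" in relative terms.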




Skinnerian Conditioning

In addition to changing the strength of responses, operant conditioning can be used to mold entirely new behaviors. This process is referred to as shaping, and it was described by American psychologist B. F. Skinner, who further developed the field of operant conditioning. Suppose that the experiment’s objective was to train an animal, such as a laboratory rat, to press a lever. The rat could be given a piece of food (Sr) each time it pressed the lever (R), but it would probably be some considerable time before it would do so on its own. Lever pressing does not come naturally to rats. To speed up the process, the animal could be “shaped” by reinforcing successive approximations of lever-pressing behavior. The rat could be given a food pellet each time that it was in the vicinity of the lever. The law of effect predicts that the rat would spend more and more of its time near the lever as a consequence of reinforcement. Then the rat may be required to make some physical contact with the lever, but not necessarily press it, to be rewarded. The rat would make more and more contact with the lever as a result. Finally, the rat would be required to make the full response, pressing the lever, to get food. In many ways, shaping resembles the childhood game of selecting some object in the room without saying what it is and guiding guessers by saying “warmer” as they approach the object and saying nothing at all as they move away from it. Before long, the guessers will use the feedback to zero in on the selected object. In a similar manner, feedback in the form of reinforcement allows the rat to “zero in” on the operant response.


Skinner also examined situations in which reinforcement was not given for every individual response but was delivered according to various schedules of reinforcement. For example, the rat may be required to press the lever a total of five times (rather than once) to get the food pellet, or the reinforcing stimulus may be delivered only when a response occurs after a specified period of time. These scenarios correspond to ratio and interval schedules. Interval and ratio schedules can be either fixed, meaning that the exact same rule applies for the delivery of each individual reinforcement, or variable, meaning that the rule changes from reinforcer to reinforcer. For example, in a variable ratio-five schedule, a reward may be given after the first five responses, then after seven responses, then after three. On average, each five responses would be reinforced, but any particular reinforcement may require more or fewer responses.
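The difference between fixed- and variable-ratio schedules can be made concrete with a short simulation (a sketch with made-up numbers; a real operant chamber records an animal's behavior rather than running a loop, and real variable schedules differ in how the requirements are generated):

```python
import random

def run_schedule(n_responses, next_requirement):
    """Count reinforcers earned over n_responses under a ratio schedule.

    next_requirement() returns how many responses the next reinforcer costs.
    """
    earned, remaining = 0, next_requirement()
    for _ in range(n_responses):
        remaining -= 1
        if remaining == 0:
            earned += 1
            remaining = next_requirement()
    return earned

random.seed(0)
fixed_ratio_5 = lambda: 5                        # FR-5: every 5th response pays
variable_ratio_5 = lambda: random.randint(1, 9)  # VR-5: averages 5, unpredictable

print(run_schedule(100, fixed_ratio_5))     # exactly 20 reinforcers
print(run_schedule(100, variable_ratio_5))  # varies around 20
```

Under the fixed schedule the payoff is perfectly predictable; under the variable schedule the same average rate of reinforcement arrives at unpredictable intervals, which is the property that sustains such persistent responding.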


To understand how large an impact varying the schedule of reinforcement can have on behavior, one might consider responding to a soda machine versus responding to a slot machine. In both cases the operant response is inserting money. However, the soda machine rewards (delivers a can of soda) according to a fixed-ratio schedule of reinforcement. Without reward, one will not persist very long in making the operant response to the soda machine. The slot machine, on the other hand, provides rewards (delivers a winning payout) on a variable-ratio schedule. It is not uncommon for people to empty out their pockets in front of a slot machine without receiving a single reinforcement.




Superstitious Pigeons

As with classical conditioning, exactly what associations are learned in operant conditioning has been an important research question. For example, in a classic 1948 experiment, Skinner provided pigeons with food at regular intervals regardless of what they were doing at the time. Six of his eight pigeons developed stereotyped (consistent) patterns of behavior as a result, even though their behavior had no actual effect on food delivery. According to the law of effect, some behavior would be occurring just prior to food delivery, and this behavior would be strengthened simply by chance pairing with reinforcement. This would increase the strength of the response, making it more likely to occur when the next reward was delivered—strengthening the response still further. Ultimately, one behavior would come to dominate in that experimental context. Skinner referred to this phenomenon as superstition. One need only observe the behavior of baseball players approaching the plate or basketball players lining up for a free-throw shot to see examples of superstition in human behavior.


Superstition again raises the issue of contiguity—simply presenting reinforcement soon after the response is made appears to strengthen it. However, later studies, especially a 1971 experiment conducted by J. E. R. Staddon and V. Simmelhag, suggested that it might not be quite that simple. Providing food rewards in superstition experiments changes a variety of responses, including natural behaviors related to the anticipation of food. In operant conditioning, animals are learning more than the simple contiguity of food and behavior; they are learning that their behavior (R) causes the delivery of food (Sr). Contiguity is important, but is not the whole story.


In addition, psychologists have explored the question “What makes a reinforcer reinforcing?” That is, is there some set of stimuli that will “work” to increase the behaviors they follow in every circumstance? The answer is no: there is no set of rewards that will always increase behavior in all circumstances. American psychologist David Premack showed that reinforcement is relative rather than absolute. Specifically, Premack suggested that behaviors in which an organism is more likely to engage serve to reinforce behaviors in which it is less likely to engage. In a specific example, he examined children given the option of playing pinball or eating candy. Some children preferred pinball and spent more of their time playing the game than eating the candy. The opposite was true of other children. Those who preferred pinball would increase their candy-eating behavior (R) to gain access to the pinball machine (Sr). Those who preferred eating candy would increase their pinball-playing behavior (R) to gain access to candy (Sr). Behaviors that a child initially preferred were effective in reinforcing behaviors that the child was less likely to choose—but not the other way around.




Negative Consequences

Positive or rewarding outcomes are not the only consequences that govern behavior. In many cases, people respond to avoid negative outcomes or stop responding when doing so produces unpleasant events. These situations correspond to the operant procedures of avoidance and punishment. Many psychologists have advocated using reinforcement rather than punishment to alter behavior, not because punishment is necessarily less effective in theory but because it is usually less effective in practice. For punishment to be effective, it should be (among other things) strong, immediate, and consistent. This can be difficult to accomplish in practice. In crime, for example, many offenses may have occurred without detection prior to the punished offense, so punishment is not certain. It is also likely that an individual’s court hearing, not to mention his or her actual sentence, will be delayed by weeks or even months, so punishment is not immediate. First offenses are likely to be punished less harshly than repeated offenses, so punishment gradually increases in intensity. In the laboratory, such a situation would produce an animal that would be quite persistent in responding, despite punishment.


In addition, punishment can produce unwanted side effects, such as the suppression of other behaviors, aggression, and the learning of responses to avoid or minimize punishing consequences. Beyond this, punishment requires constant monitoring by an external authority, whereas reinforcement typically does not. For example, parents who want to punish a child for a dirty room must constantly inspect the room to determine its current state. The child certainly is not going to point out a dirty room that will cause punishment. On the other hand, if rewarded, the child will bring the clean room to the parents’ attention. This is not to suggest that punishment should necessarily be abandoned as one tool for controlling behavior. Rather, the effectiveness of punishment, like reinforcement, can be predicted on the basis of laboratory results.




Interactions and Biological Constraints

Though the distinction between classical and operant conditioning is very clear in principle, it is not always so clear in practice. This makes sense if one considers real-life learning situations. In many circumstances events in the environment are associated (occur together) in a predictable fashion, and behavior will have consequences. This can be true in the laboratory as well, but carefully designed experiments can be conducted to separate out the impact of classical and operant conditioning on behavior.


In addition, the effectiveness of both classical and operant conditioning is influenced by biological factors. This can be seen both in the speed with which classically conditioned taste aversions (as compared with other CRs) are learned and in the stimulation of natural food-related behaviors in operant superstition experiments. Related findings have demonstrated that the effects of rewarding behavior can be influenced by biology in other ways that may disrupt the conditioning process. In an article published in 1961, Keller and Marian Breland described their difficulties in applying the principles of operant conditioning to their work as animal trainers in the entertainment industry. They found that when trained with food reinforcement, natural behaviors would often interfere with the trained operant response—a phenomenon they called instinctive drift. From a practical point of view, their research suggested that to be successful in animal training, one must select operant responses that do not compete with natural food-related behaviors. From a scientific point of view, their research suggested that biological tendencies must be taken into account in any complete description of conditioning processes.




Applications of Conditioning Technology

Beyond being interesting and important in its own right, conditioning research also serves as a valuable tool in the psychological exploration of other issues. In essence, conditioning technology provides a means for asking animals questions—a way to explore interesting cognitive processes such as memory, attention, reasoning, and concept formation under highly controlled laboratory conditions in less complex organisms.


Another area of research is the field of behavioral neuroscience, or psychobiology, a field that combines physiological and behavioral approaches to uncover the neurological mechanisms underlying behavior. For example, the impact of various medications and substances of abuse on behavior can be observed by administering drugs as reinforcing stimuli. It is interesting to note that animals will produce operant responses to receive the same drugs to which humans become addicted. However, in animals, the neurological mechanisms involved in developing addictions can be studied directly, using both behavioral and physiological experimental techniques in a way that would not be possible with human subjects, due to ethical considerations.


In addition, the principles of classical and operant conditioning have been used to solve human problems in a variety of educational and therapeutic settings, a strategy called applied behavior analysis. The principles of operant conditioning have been widely applied in settings in which some degree of control over human behavior is desirable. Token economies are situations in which specified behaviors, such as appropriate classroom behavior, are rewarded according to some schedule of reinforcement. The reinforcers are referred to as tokens because they need not have any rewarding value in and of themselves, but they can be exchanged for reinforcers at some later time. According to the principles of operant conditioning, people should increase the operant response to gain the reinforcers, and if the token economy is developed properly, that is exactly what occurs. If token economies sound rather familiar, it is for good reason. Money is an extremely potent token reinforcer for most people, who perform operant responses (work) to receive token reinforcers (money) that can later be exchanged for primary reinforcers (such as food, clothing, shelter, or entertainment).


Finally, learning principles have been applied in clinical psychology in an effort to change maladaptive behaviors. Some examples include a procedure called systematic desensitization, in which the principles of classical conditioning are applied in an effort to treat phobias (irrational fears), and social skills training, in which operant conditioning is used to enhance communication and other interpersonal behaviors. These are only two examples of useful applications of conditioning technology to treat mental illness. Such applications suggest the need for ongoing research into basic conditioning mechanisms. We must fully understand conditioning principles to apply them appropriately in the effort to understand and improve the human condition.




Bibliography


Domjan, Michael, and Barbara Burkhard. The Principles of Learning and Behavior. 5th ed. Belmont, Calif.: Wadsworth, 2006. Print.



Lavond, David G., and Joseph E. Steinmetz. Handbook of Classical Conditioning. New York: Springer, 2003. Print.



Mazur, James E. Learning and Behavior. 7th ed. New York: Psychology, 2013. eBook Collection (EBSCOhost). Web. 30 Nov. 2015.




Psychology of Learning and Motivation. San Diego: Academic, 2015. eBook Collection (EBSCOhost). Web. 30 Nov. 2015.



Schachtman, Todd R., and Steve S. Reilly. Associative Learning and Conditioning Theory: Human and Non-Human Applications. New York: Oxford UP, 2011. eBook Collection (EBSCOhost). Web. 30 Nov. 2015.



Schwartz, Barry, ed. Psychology of Learning: Readings in Behavior Theory. New York: W. W. Norton, 1984. Print.



Skinner, B. F. Beyond Freedom and Dignity. 1971. Reprint. Indianapolis, Ind.: Hackett, 2002. Print.



Sunday 29 October 2017

What are natural treatments for diabetes?


Introduction


Diabetes has two forms. In the type that usually develops in childhood (type 1), the insulin-secreting cells of the pancreas are destroyed by an autoimmune process (possibly triggered by a viral infection), and blood levels of insulin drop nearly to zero. In type 2 diabetes (which usually develops in adults), insulin remains plentiful, but the body does not respond normally to it. (This is only an approximate description of the difference between the two types.) In both forms of diabetes, blood sugar reaches toxic levels, causing injury to many organs and tissues.


Conventional treatment for type 1 diabetes includes insulin injections and careful dietary monitoring. Type 2 diabetes may respond to lifestyle changes alone, such as increasing exercise, losing weight, and improving diet. Various oral medications are also often effective for type 2 diabetes, although insulin injections may be necessary in some cases.







Principal Proposed Natural Treatments

Several alternative methods may be helpful when used under medical supervision as an addition to standard treatment. They may help stabilize, reduce, or eliminate medication requirements or may correct nutritional deficiencies associated with diabetes. However, because diabetes is a dangerous disease with many potential complications, alternative treatment for diabetes should not be attempted as a substitute for conventional medical care. Other natural treatments may be helpful for preventing and treating complications of diabetes, including peripheral neuropathy, cardiac autonomic neuropathy, retinopathy, and cataracts.



Treatments for improving blood sugar control. The following treatments might be able to improve blood sugar control in type 1 or type 2 diabetes, or both. However, for none of these is the evidence strong. The mere fact of joining a study tends to improve blood sugar control in people with diabetes, even before any treatment is begun. Presumably, the experience of being enrolled in a trial causes participants to watch their diet more closely. This indicates that for diabetes, as for all conditions, the use of a double-blind, placebo-controlled method is essential. Only if the proposed treatment proves more effective than placebo can it be considered to work in its own right.


For those persons for whom a natural treatment for diabetes works, it is essential to reduce their medications to avoid hypoglycemia. For this reason, medical supervision is necessary.



Chromium. Chromium is an essential trace mineral that plays a
significant role in sugar metabolism. Some evidence suggests that chromium
supplementation may help bring blood sugar levels under
control in type 2 diabetes, but it is far from definitive.


A four-month study reported in 1997 followed 180 Chinese men and women with type 2 diabetes, comparing the effects of 1,000 micrograms (mcg) chromium, 200 mcg chromium, and placebo. The results showed that HbA1c (glycated hemoglobin) values (a measure of long-term blood sugar control) improved significantly after two months in the group receiving 1,000 mcg, and in both chromium groups after four months. Fasting glucose (a measure of short-term blood sugar control) was also lower in the group taking the higher dose of chromium.


A double-blind, placebo-controlled trial of seventy-eight people with type 2 diabetes compared two forms of chromium (brewer’s yeast and chromium chloride) with placebo. This rather complex crossover study consisted of four eight-week intervals of treatment in random order. The results in the sixty-seven participants who completed the study showed that both forms of chromium significantly improved blood sugar control. Positive results were also seen in other small, double-blind, placebo-controlled studies of people with type 2 diabetes. However, several other studies have failed to find chromium helpful for improving blood sugar control in type 2 diabetes. These contradictory findings suggest that the benefit, if it exists, is small.


A combination of chromium and biotin might be more effective. Following positive results in a small pilot trial, researchers conducted a double-blind study of 447 people with poorly controlled type 2 diabetes. One-half of the participants were given placebo and the rest were given a combination of 600 micrograms (mcg) of chromium (as chromium picolinate) and 2 mg of biotin daily. All participants continued to receive standard oral medications for diabetes. During the ninety-day study period, participants who were given the chromium-biotin combination showed significantly better glucose regulation than participants who were given placebo. The relative benefit was clear in levels of fasting glucose and in levels of HbA1c (glycated hemoglobin).


One placebo-controlled study of thirty women with gestational
diabetes (diabetes during pregnancy) found that
supplementation with chromium (at a dosage of 4 or 8 mcg chromium picolinate for
each kilogram of body weight) significantly improved blood sugar control. Chromium
has also shown some promise for helping diabetes caused by corticosteroid
treatment.
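The per-kilogram regimen above can be converted into a total daily dose with simple arithmetic. The sketch below assumes an illustrative 70-kilogram body weight, which is not a figure from the study:

```python
# Total daily chromium dose from a per-kilogram regimen
# (4 or 8 mcg of chromium picolinate per kg of body weight, as above).
# The 70 kg body weight is an assumed example, not taken from the study.

def total_daily_dose_mcg(dose_per_kg_mcg: float, body_weight_kg: float) -> float:
    """Return the total daily dose in micrograms (mcg)."""
    return dose_per_kg_mcg * body_weight_kg

weight_kg = 70.0  # assumed for illustration
for per_kg in (4, 8):
    total = total_daily_dose_mcg(per_kg, weight_kg)
    print(f"{per_kg} mcg/kg x {weight_kg:g} kg = {total:g} mcg/day")
# → 4 mcg/kg x 70 kg = 280 mcg/day
# → 8 mcg/kg x 70 kg = 560 mcg/day
```

At an assumed 70 kg, the two regimens work out to 280 and 560 mcg per day, which is in the same range as the fixed doses used in the other chromium trials described above.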



Ginseng. In double-blind studies performed by a single research group, the use of American ginseng (Panax quinquefolius) appeared to improve blood sugar control. In some studies, the same researchers subsequently reported possible benefit with Korean red ginseng, a specially prepared form of P. ginseng.


A different research group found benefits with ordinary P. ginseng. However, in other studies, ordinary P. ginseng seemed to worsen blood sugar control rather than improve it. (Another research group found potential benefit.) It seems possible that certain ginsenosides (found in high concentrations in some American ginseng products) may lower blood sugar, while others (found in high concentration in some P. ginseng products) may raise it. It has been suggested that because the actions of these various ginseng constituents are not well defined, ginseng should not be used to treat diabetes until more is known.



Aloe. The succulent aloe plant has been valued since
prehistoric times as a topical treatment for burns, wound infections, and other
skin problems. Today, evidence suggests that oral aloe might be useful for type 2
diabetes.


Evidence from two human trials suggests that aloe gel (the gel of the aloe vera plant, and not the leaf skin, which constitutes the drug aloe) can improve blood sugar control. A single-blind, placebo-controlled trial evaluated the potential benefits of aloe in either seventy-two or forty people with diabetes. (The study report appears to contradict itself.) The results showed significantly greater improvements in blood sugar levels among those given aloe over the two-week treatment period.


Another single-blind, placebo-controlled trial evaluated the benefits of aloe in people who had failed to respond to the oral diabetes drug glibenclamide. Of the thirty-six people who completed the study, those taking glibenclamide and aloe showed definite improvements in blood sugar levels over forty-two days compared with those taking glibenclamide and placebo. While these are promising results, large studies that are double-blind rather than single-blind will be needed to establish aloe as an effective treatment for improving blood sugar control.



Cinnamon.
Cinnamon has been widely advertised as an effective
treatment for type 2 diabetes and for high cholesterol. The primary basis for this
claim is a single study performed in Pakistan. In this forty-day study, sixty
people with type 2 diabetes were given cinnamon at a dose of 1, 3, or 6 grams (g)
daily. The results reportedly indicated that the use of cinnamon improved blood
sugar levels by 18 to 29 percent, total cholesterol by 12 to 26 percent, LDL (bad)
cholesterol by 7 to 27 percent, and triglycerides by 23 to 30 percent. These
results were said to be statistically significant compared to the beginning of the
study and to the placebo group.


However, this study has some odd features. The most important feature is that the study found no significant difference in benefit among the various doses of cinnamon. This is called lack of a “dose-related effect,” and it generally casts doubt on the results of a study.


In an attempt to replicate these results, a group of Dutch researchers performed a carefully designed six-week, double-blind, placebo-controlled study of twenty-five people with type 2 diabetes. All participants were given 1.5 g of cinnamon daily. The results failed to show any detectable effect on blood sugar, insulin sensitivity, or cholesterol profile. Furthermore, a double-blind study performed in Thailand enrolling sixty people, again using 1.5 g of cinnamon daily, also failed to find benefit. However, a double-blind study of seventy-nine people that used 3 g instead of 1.5 g daily did find that cinnamon improved blood sugar levels. In addition, a small study evaluated cinnamon for improving blood sugar control in women with polycystic ovary disease, and it too found evidence of benefit. Regarding type 1 diabetes, a study of seventy-two adolescents failed to find benefit with cinnamon taken at a dose of 1 g daily.


A meta-analysis (formal statistical review) of all published evidence concluded that cinnamon has no effect on blood sugar levels in people with diabetes. The evidence regarding cinnamon as a treatment for diabetes is highly inconsistent, suggesting that if cinnamon is indeed effective, its benefits are minimal at most.



Other treatments studied for their effect on blood sugar control.
The food spice fenugreek might also help control blood sugar, but the
supporting evidence is weak. In a two-month double-blind study of twenty-five
people with type 2 diabetes, the use of fenugreek (1 g daily of a standardized
extract) significantly improved some measures of blood sugar control and insulin
response compared with placebo. Triglyceride levels decreased and HDL (good)
cholesterol levels increased, presumably because of the enhanced insulin
sensitivity. Similar benefits have been seen in animal studies and open human
trials. However, it is possible that the effects of fenugreek come from its
dietary fiber content.


A few preliminary studies suggest that the Ayurvedic (Indian) herb
gymnema may help improve blood sugar control. It might be
helpful for mild cases of type 2 diabetes when taken alone or with standard
treatment (under a doctor’s supervision in either case).


Studies in rats with and without diabetes suggest that high doses of the mineral vanadium may have an insulin-like effect, reducing blood sugar levels. Based on these findings, preliminary studies involving humans have been conducted, with some promising results. However, of 151 studies reviewed, none was of sufficient quality to judge if vanadium is beneficial in type 2 diabetes. The researchers did find that vanadium was often associated with gastrointestinal side effects. Furthermore, there may be some cause for concern given the high doses of vanadium used in some of these studies.


The following herbs are proposed for helping to control blood sugar, but the supporting evidence regarding their potential benefit is, in all cases, at best preliminary; for some, there are as many negative results as positive: Agaricus blazei, berberine (goldenseal), black tea, caiapo, cod protein, cayenne, Coccinia indica (also known as C. cordifolia), garlic, green tea, guggul, holy basil (Ocimum sanctum), maitake, milk thistle, nopal cactus (Opuntia streptacantha), onion, oolong tea, oligomeric proanthocyanidins, Salacia oblonga, Salvia hispanica (a grain), and salt bush. Additionally, the supplements arginine, carnitine, coenzyme Q10 (CoQ10), dehydroepiandrosterone (DHEA), glucomannan, lipoic acid, melatonin with zinc, and vitamin E might also help control blood sugar levels to a slight degree.


One placebo-controlled study found hints that the use of medium-chain
triglycerides by people with type 2 diabetes might improve insulin sensitivity and
aid weight loss. The herb bitter melon (Momordica charantia) is
widely advertised as effective for diabetes, but the scientific basis for this
claim is limited to animal studies, uncontrolled human trials, and other
unreliable forms of evidence. The one properly designed (that is, double-blind,
placebo-controlled) study of bitter melon failed to find benefit. Conjugated
linoleic acid (CLA) has shown promise in preliminary trials. However, other
studies have found that CLA might worsen blood sugar control.


One study found that insulin metabolism in 278 young, overweight persons
improved on a calorie-restricted diet rich in fish oil from
seafood or supplements compared with those on a diet low in fish oil. Though
preliminary, the results suggest that fish oil may help delay the onset of
diabetes in susceptible persons. In another study of fifty people with type 2
diabetes, 2 g per day of purified omega-3 fatty acids (fish oil) significantly lowered triglyceride levels. However, it had no effect on blood sugar control.


Other herbs traditionally used for diabetes that might possibly offer some benefit include Anemarrhena asphodeloides, Azadirachta indica (neem), Catharanthus roseus, Cucurbita ficifolia, Cucumis sativus, Cuminum cyminum (cumin), Euphorbia prostrata, Guaiacum coulteri, Guazuma ulmifolia, Lepechinia caulescens, Medicago sativa (alfalfa), Musa sapientum L. (banana), Phaseolus vulgaris, Psacalium peltatum, Rhizophora mangle, Spinacia oleracea, Tournefortia hirsutissima, and Turnera diffusa.


Combination herbal therapies used in Ayurvedic medicine have also shown some
promise for improving blood sugar control. One study attempted to test the
effectiveness of whole-person Ayurvedic treatment involving exercise, Ayurvedic
diet, meditation, and Ayurvedic herbal treatment. However, minimal benefits were
seen.


A double-blind study of more than two hundred people evaluated the
effectiveness of a combination herbal formula used in traditional Chinese herbal
medicine (Coptis formula). This study evaluated Coptis
formula with and without the drug glibenclamide. The results hint that Coptis
formula may enhance the effectiveness of the drug but that it is not powerful
enough to treat diabetes on its own. Another randomized trial, this one lacking a
control group, found no added benefit for Tai Chi in the treatment of blood
glucose and cholesterol levels among fifty-three people with type 2 diabetes
during a six-month period.


One study claimed to find evidence that creatine
supplements can reduce levels of blood sugar. However,
because dextrose (a form of sugar) was used as the “placebo” in this trial, the
results are somewhat questionable. In another study, the herb Tinospora crispa did not work, and it showed the potential to cause liver injury.


One study found hints that the supplement DHEA might improve insulin sensitivity. However, a subsequent and more rigorous study failed to find benefits. Relatively weak evidence hints that genistein (an isoflavone extracted from soy) might help control blood sugar.


It has been suggested that if a child has just developed diabetes, the
supplement niacinamide (a form of niacin, also called vitamin
B3) might slightly prolong what is called the honeymoon period. This
is the interval during which the pancreas can still make some insulin and the
body’s need for insulin injections is low. However, the benefits (if any) appear
to be minor. A cocktail of niacinamide plus antioxidant vitamins and minerals has
also been tried, but the results were disappointing. Niacinamide has also been
tried for preventing diabetes in high-risk children. According to most studies,
fructo-oligosaccharides (also known as prebiotics) do not improve blood sugar
control in people with type 2 diabetes.



Massage therapy has shown some promise for enhancing blood sugar control in children with diabetes. A review of nine clinical trials found insufficient evidence to support the traditional Chinese practice of qigong as beneficial for treatment of type 2 diabetes.




Treating Nutritional Deficiencies

Both diabetes and the medications used to treat it can cause people to fall short of various nutrients. Making up for these deficiencies (through either diet or the use of supplements) may or may not help with diabetes specifically, but it should make a person healthier overall. One double-blind study, for example, found that people with type 2 diabetes who took a multivitamin-multimineral supplement were less likely to develop an infectious illness than those who took placebo.


People with diabetes are often deficient in magnesium, and inconsistent
evidence hints that magnesium supplementation may enhance
blood sugar control. People with either type 1 or type 2 diabetes may also be
deficient in the mineral zinc. Vitamin C levels have been found to be
low in many people on insulin, even though these persons were consuming seemingly
adequate amounts of the vitamin in their diets. Deficiencies of taurine and
manganese have also been reported. The drug metformin can cause vitamin
B12 deficiency. Taking extra calcium may prevent this.




Prevention


Niacinamide. Evidence from a large study conducted in New Zealand suggests that the supplement niacinamide might reduce the risk of diabetes in children at high risk. In this study, more than twenty thousand children were screened for diabetes risk by measuring certain antibodies in the blood (ICA antibodies, believed to indicate risk of developing diabetes). It turned out that 185 of these children had detectable levels. About 170 of these children were then given niacinamide for seven years (not all parents agreed to give their children niacinamide or to have them stay in the study for that long). About ten thousand other children were not screened, but they were followed to see if they developed diabetes.


The results were positive. In the group in which children were screened and given niacinamide if they were positive for ICA antibodies, the incidence of diabetes was reduced by almost 60 percent. These findings suggest that niacinamide is an effective treatment for preventing diabetes. (The study also indicates that tests for ICA antibodies can very accurately identify children at risk for diabetes.)


An even larger study that attempted to replicate these results in Europe (the European Nicotinamide Diabetes Intervention Trial) failed to find benefit. This study screened 40,000 children at high risk and selected 552. The results were negative. The rate of diabetes onset was not statistically different in the group given niacinamide compared with those given placebo. Another study also failed to find benefit.



Dietary changes. The related terms “glycemic index” and “glycemic load” indicate the tendency of certain foods to stimulate insulin release. It has been suggested that foods that rank high on these scales, such as white flour and sweets, might tend to exhaust the pancreas and therefore lead to type 2 diabetes. For this reason, low-carbohydrate and low-glycemic-index diets have been promoted for the prevention of type 2 diabetes. However, the results from studies on this question have been contradictory and far from definitive.
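The standard formula behind these terms multiplies a food's glycemic index by its grams of available carbohydrate per serving and divides by 100 to give its glycemic load. A minimal sketch of that calculation, using rough illustrative GI and carbohydrate values (assumptions for the example, not figures from this article):

```python
# Glycemic load (GL) is conventionally computed as:
#     GL = (glycemic index x grams of available carbohydrate per serving) / 100
# The GI and carbohydrate figures below are rough illustrative values.

def glycemic_load(glycemic_index: float, carb_grams: float) -> float:
    """Return the glycemic load for one serving of a food."""
    return glycemic_index * carb_grams / 100

# Example foods (approximate GI and carbs per typical serving; assumed values)
foods = {
    "white bread (1 slice)": (75, 14),
    "lentils (1 cup)":       (32, 30),
}
for name, (gi, carbs) in foods.items():
    print(f"{name}: GL = {glycemic_load(gi, carbs):.1f}")
# → white bread (1 slice): GL = 10.5
# → lentils (1 cup): GL = 9.6
```

The formula shows why the two scales can diverge: a high-GI food eaten in a small carbohydrate serving can carry a lower glycemic load than a low-GI food eaten in a large one.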


There is no question, however, that people who are obese have a far greater tendency to develop type 2 diabetes than those who are relatively slim; therefore, weight loss (especially when accompanied by an increase in exercise) is clearly an effective step for prevention. One review suggests that a weight decrease of 7 to 10 percent is enough to provide significant benefit.



Other natural treatments. Studies investigating the preventive
effects of antioxidant supplements have generally been disappointing. In an
extremely large double-blind study, the use of vitamin E at
a dose of 600 international units every other day failed to reduce the risk of
type 2 diabetes in women. Another large study, which enrolled male smokers, failed
to find benefit with beta-carotene, vitamin E, or the two taken together. Another
large study of female health professionals who were more than forty years old with
or at high risk for cardiovascular disease found that long-term supplementation
(an average of just more than nine years) with vitamin C, vitamin E, or
beta-carotene did not significantly reduce the risk of developing diabetes
compared with placebo. In a smaller (but still sizable) trial involving a subgroup
of these same women, supplementation with vitamins B6 and
B12 and folic acid also did not reduce risk of type 2 diabetes.


Several observational studies suggest that vitamin D may also help prevent diabetes. However, studies of this type are far less reliable than double-blind trials. One observational study failed to find that high consumption of lycopene reduced risk of developing type 2 diabetes.




Supplements to Use Only with Caution

In a double-blind, placebo-controlled study of sixty overweight men, the use of conjugated linoleic acid (CLA) unexpectedly worsened blood sugar control. These findings surprised researchers, who were looking for potential diabetes-related benefits with this supplement. Other studies corroborate this as a potential risk for people with type 2 diabetes and for overweight people without diabetes. Another study, however, failed to find this effect. Nonetheless, people with type 2 diabetes or who are at risk for it should not use CLA except under physician supervision.


Unexpected results also occurred in a study of vitamin E. For various theoretical reasons, researchers expected that the use of vitamin E (either alpha tocopherol or mixed tocopherols) by people with diabetes would reduce blood pressure; instead, the reverse occurred. People with diabetes should probably monitor their blood pressure if they take high-dose vitamin E supplements.


There are equivocal indications that the herb ginkgo might alter insulin release or insulin sensitivity in people with diabetes. The effect, if it exists, appears to be rather complex; the herb may cause some increase in insulin output and, yet, might actually lower insulin levels overall through its effects on the liver and perhaps on oral medications used for diabetes. Until this situation is clarified, people with diabetes should use ginkgo only under the supervision of a physician.


Despite hopes to the contrary, it does not appear that selenium
supplements can help prevent type 2 diabetes, but rather
might increase the risk of developing the disease. Contrary to earlier concerns,
vitamin B3 (niacin) and fish oil appear to be safe for people with
diabetes. A few early case reports and animal studies had raised concerns that
glucosamine might be harmful for persons with diabetes, but subsequent studies
have tended to allay these worries.


Finally, if any herb or supplement does in fact successfully decrease blood sugar levels, this could lead to dangerous hypoglycemia. A doctor’s supervision is strongly suggested.




Bibliography


Ahuja, K. D., et al. “Effects of Chili Consumption on Postprandial Glucose, Insulin, and Energy Metabolism.” American Journal of Clinical Nutrition 84 (2006): 63-69.



Altschuler, J. A., et al. “The Effect of Cinnamon on A1C Among Adolescents with Type 1 Diabetes.” Diabetes Care 30 (2007): 813-816.



Basu, R., et al. “Two Years of Treatment with Dehydroepiandrosterone Does Not Improve Insulin Secretion, Insulin Action, or Postprandial Glucose Turnover in Elderly Men or Women.” Diabetes 56 (2007): 753-766.



Boshtam, M., et al. “Long Term Effects of Oral Vitamin E Supplement in Type II Diabetic Patients.” International Journal for Vitamin and Nutrition Research 75 (2006): 341-346.



Bryans, J. A., P. A. Judd, and P. R. Ellis. “The Effect of Consuming Instant Black Tea on Postprandial Plasma Glucose and Insulin Concentrations in Healthy Humans.” Journal of the American College of Nutrition 26 (2007): 471-477.



Elder, C., et al. “Randomized Trial of a Whole-System Ayurvedic Protocol for Type 2 Diabetes.” Alternative Therapies in Health and Medicine 12 (2006): 24-30.



Lee, M. S., et al. “Qigong for Type 2 Diabetes Care.” Complementary Therapies in Medicine 17 (2009): 236-242.



Li, Y., T. H. Huang, and J. Yamahara. “Salacia Root, a Unique Ayurvedic Medicine, Meets Multiple Targets in Diabetes and Obesity.” Life Sciences 82 (2008): 1045-1049.



Mackenzie, T., L. Leary, and W. B. Brooks. “The Effect of an Extract of Green and Black Tea on Glucose Control in Adults with Type 2 Diabetes Mellitus.” Metabolism 56 (2007): 1340-1344.



Pi-Sunyer, F. X. “How Effective Are Lifestyle Changes in the Prevention of Type 2 Diabetes Mellitus?” Nutrition Reviews 65 (2007): 101-110.



Ramel, A., et al. “Beneficial Effects of Long-Chain N-3 Fatty Acids Included in an Energy-Restricted Diet on Insulin Resistance in Overweight and Obese European Young Adults.” Diabetologia 51 (2008): 1261-1268.



Shidfar, F., et al. “Effects of Omega-3 Fatty Acid Supplements on Serum Lipids, Apolipoproteins, and Malondialdehyde in Type 2 Diabetes Patients.” Eastern Mediterranean Health Journal 14 (2008): 305-313.



Song, Y., et al. “Effects of Vitamins C and E and Beta-Carotene on the Risk of Type 2 Diabetes in Women at High Risk of Cardiovascular Disease.” American Journal of Clinical Nutrition 90 (2009): 429-437.



Ward, N. C., et al. “The Effect of Vitamin E on Blood Pressure in Individuals with Type 2 Diabetes.” Journal of Hypertension 25 (2007): 227-234.

How can a 0.5 molal solution be less concentrated than a 0.5 molar solution?

The answer lies in the units being used. "Molar" refers to molarity, a unit of measurement that describes how many moles of a solu...