Neuroplasticity And Epilepsy: The Effect of Pathological Activity on Neural Circuitry

Epilepsy is a brain disorder characterized by periodic and unpredictable seizures
mediated by the rhythmic firing of large groups of neurons. It seems likely that
abnormal activity generates plastic changes in cortical circuitry that are critical
to the pathogenesis of the disease.

Brain Plasticity: What Is It?

What is brain plasticity?

Does it mean that our brains are made of plastic?

Of course not.

Plasticity, or neuroplasticity, describes how experiences reorganize neural pathways in the brain. Long-lasting functional changes in the brain occur when we learn new things or memorize new information. These changes in neural connections are what we call neuroplasticity.

To illustrate the concept of plasticity, imagine the film in a camera. Pretend that the film represents your brain. Now imagine using the camera to take a picture of a tree. When a picture is taken, the film is exposed to new information — that of the image of a tree. In order for the image to be retained, the film must react to the light and “change” to record the image of the tree. Similarly, in order for new knowledge to be retained in memory, changes in the brain representing the new knowledge must occur.

To illustrate plasticity in another way, imagine making an impression of a coin in a lump of clay. In order for the impression of the coin to appear in the clay, changes must occur in the clay — the shape of the clay changes as the coin is pressed into the clay. Similarly, the neural circuitry in the brain must reorganize in response to experience or sensory stimulation.

  • Neuroplasticity includes several different processes that take place throughout a lifetime – Neuroplasticity does not consist of a single type of morphological change, but rather includes several different processes that occur throughout an individual’s lifetime. Many types of brain cells are involved in neuroplasticity, including neurons, glia, and vascular cells.
  • Neuroplasticity has a clear age-dependent determinant – Although plasticity occurs over an individual’s lifetime, different types of plasticity dominate during certain periods of one’s life and are less prevalent during other periods.
  • Neuroplasticity occurs in the brain under two primary conditions:
    1. During normal brain development, when the immature brain first begins to process sensory information, through adulthood (developmental plasticity and plasticity of learning and memory).
    2. As an adaptive mechanism to compensate for lost function and/or to maximize remaining functions in the event of brain injury.
  • The environment plays a key role in influencing plasticity – In addition to genetic factors, the brain is shaped by the characteristics of a person’s environment and by the actions of that same person.

Developmental Plasticity: Synaptic Pruning

Electrical Trigger for Neurotransmission



Gopnik et al. (1999)[Gopnik, A., Meltzoff, A., Kuhl, P. (1999). The Scientist in the Crib: What Early Learning Tells Us About the Mind. New York, NY: HarperCollins Publishers.] describe neurons as growing telephone wires that communicate with one another. Following birth, the brain of a newborn is flooded with information from the baby’s sense organs. This sensory information must somehow make it back to the brain, where it can be processed. To do so, nerve cells must make connections with one another, transmitting the impulses to the brain. Continuing with the telephone-wire analogy, like the basic telephone trunk lines strung between cities, the newborn’s genes instruct the “pathway” from a particular nerve cell to the correct area of the brain. For example, nerve cells in the retina of the eye send impulses to the primary visual area in the occipital lobe of the brain and not to the area of language comprehension (Wernicke’s area) in the left posterior temporal lobe. The basic trunk lines have been established, but the specific connections from one house to another require additional signals.
Over the first few years of life, the brain grows rapidly. As each neuron matures, it sends out multiple branches (axons, which send information out, and dendrites, which take in information), increasing the number of synaptic contacts and laying the specific connections from house to house, or in the case of the brain, from neuron to neuron. At birth, each neuron in the cerebral cortex has approximately 2,500 synapses. By the time an infant is two or three years old, the number of synapses is approximately 15,000 per neuron (Gopnik et al., 1999), about twice that of the average adult brain. As we age, old connections are deleted through a process called synaptic pruning.

Synaptic pruning eliminates weaker synaptic contacts while stronger connections are kept and strengthened. Experience determines which connections will be strengthened and which will be pruned; connections that have been activated most frequently are preserved. Neurons must have a purpose to survive: through a process called apoptosis, neurons that do not receive or transmit information become damaged and die. Ineffective or weak connections are “pruned” in much the same way a gardener would prune a tree or bush, giving the plant the desired shape. It is plasticity that enables the process of developing and pruning connections, allowing the brain to adapt itself to its environment.
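The “use it or lose it” logic of pruning can be sketched as a toy model. This is purely illustrative: the synapse names, activation counts, and threshold below are invented for the example, not measured biology.

```python
def prune_synapses(activation_counts, threshold):
    """Keep synapses activated at least `threshold` times; prune the rest.

    Frequently used connections survive and are retained; rarely used
    ones are eliminated, mirroring the "use it or lose it" rule of
    developmental pruning.
    """
    kept = {s: n for s, n in activation_counts.items() if n >= threshold}
    pruned = sorted(s for s, n in activation_counts.items() if n < threshold)
    return kept, pruned

# One toy neuron with five synapses and how often experience activated each:
counts = {"s1": 120, "s2": 3, "s3": 47, "s4": 0, "s5": 15}
kept, pruned = prune_synapses(counts, threshold=10)
print(sorted(kept))  # ['s1', 's3', 's5'] – the frequently activated synapses survive
print(pruned)        # ['s2', 's4'] – weak contacts are pruned away
```

The essential point the sketch captures is that pruning is experience-driven: only the activation tallies, not any intrinsic property of the synapse, decide what survives.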

Wiring of Brain

Plasticity of Learning and Memory

It was once believed that as we aged, the brain’s networks became fixed. In the past two decades, however, an enormous amount of research has revealed that the brain never stops changing and adjusting. 

Learning, as defined by Tortora and Grabowski (1996)[Tortora, G. and Grabowski, S. (1996). Principles of Anatomy and Physiology. (8th ed.), New York: HarperCollins College Publishers.], is the ability to acquire new knowledge or skills through instruction or experience.

Memory is the process by which that knowledge is retained over time.

The capacity of the brain to change with learning is plasticity.

So how does the brain change with learning?

According to Drubach (2000)[Drubach, D. (2000). The Brain Explained, Upper Saddle River, NJ: Prentice-Hall, Inc.], there appear to be at least two types of modifications that occur in the brain with learning:

  1. A change in the internal structure of the neurons, the most notable being in the area of synapses.
  2. An increase in the number of synapses between neurons.

Initially, newly learned data are “stored” in short-term memory, which is a temporary ability to recall a few pieces of information. Some evidence supports the concept that short-term memory depends upon electrical and chemical events in the brain, as opposed to structural changes such as the formation of new synapses. One theory of short-term memory states that memories may be caused by “reverberating” neuronal circuits — that is, an incoming nerve impulse stimulates the first neuron, which stimulates the second, and so on, with branches from the second neuron synapsing with the first. After a period of time, information may be moved into a more permanent type of memory, long-term memory, which is the result of anatomical or biochemical changes that occur in the brain (Tortora and Grabowski, 1996).
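The “reverberating circuit” idea can be illustrated with a minimal sketch. It is a deliberate simplification: real circuits involve many neurons, synaptic delays, and activity that decays, none of which is modelled here.

```python
def reverberate(n_neurons, cycles):
    """Trace activity around a closed loop of neurons.

    An input impulse excites neuron 0; each neuron then excites the
    next, and the last feeds back onto the first, so the signal
    "reverberates" around the loop. The memory persists as ongoing
    activity rather than as a structural change.
    """
    active = 0                 # index of the currently firing neuron
    firing_order = [active]
    for _ in range(cycles):
        active = (active + 1) % n_neurons   # excitation passes to the next neuron
        firing_order.append(active)
    return firing_order

print(reverberate(3, 6))  # [0, 1, 2, 0, 1, 2, 0] – the loop sustains the signal
```

Because the trace is activity-based, it vanishes the moment the loop stops firing; consolidation into long-term memory requires the anatomical or biochemical changes described above.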


Injury-induced Plasticity: Plasticity and Brain Repair

During brain repair following injury, plastic changes are geared towards maximizing function in spite of the damaged brain. In studies involving rats in which one area of the brain was damaged, brain cells surrounding the damaged area underwent changes in their function and shape that allowed them to take on the functions of the damaged cells. Although this phenomenon has not been widely studied in humans, data indicate that similar (though less effective) changes occur in human brains following injury.


The importance of neuronal plasticity in epilepsy is indicated most clearly by an animal model of seizure production called kindling. To induce kindling, a stimulating electrode is implanted in the brain, often in the amygdala, a component of the limbic system that makes and receives connections with the cortex, thalamus, and other limbic structures, including the hippocampus. The name amygdala is the Latin form of the Greek word for almond, describing the structure’s almond-like shape. Research has shown that the amygdalae play a primary role in the processing of memory, decision-making, and emotional reactions.

  • In mammals, it is easily found under the rhinal fissure, closely related to the lateral olfactory tract. This almond-like structure, ranging from 1–4 cm and averaging about 1.8 cm, has extensive connections with the rest of the brain.

At the beginning of such an experiment, weak electrical stimulation, in the form of a low-amplitude train of electrical pulses, has no discernible effect on the animal’s behaviour or on the pattern of electrical activity in the brain (laboratory rats or mice have typically been used for such studies). As this weak stimulation is repeated once a day for several weeks, it begins to produce behavioural and electrical indications of seizures. By the end of the experiment, the same weak stimulus that initially had no effect now causes full-blown seizures. This phenomenon is essentially permanent; even after an interval of a year, the same weak stimulus will again trigger a seizure. Thus, repetitive weak activation produces long-lasting changes in the excitability of the brain that time cannot reverse. The word kindling is therefore quite appropriate: A single match can start a devastating fire.
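The logic of kindling can be captured in a toy numerical model. The units and the linear drop in threshold are illustrative assumptions, not measured physiology; the point is only that each stimulation leaves a permanent trace, so a once-harmless stimulus eventually crosses threshold.

```python
def kindle(stimulus, initial_threshold, sensitization, days):
    """Toy kindling model (arbitrary units).

    Each daily stimulation permanently lowers the seizure threshold by
    `sensitization`; a seizure fires whenever the stimulus meets or
    exceeds the current threshold. Because the threshold never
    recovers, a stimulus that was once harmless eventually triggers
    seizures, and keeps doing so.
    """
    threshold = initial_threshold
    first_seizure_day = None
    for day in range(1, days + 1):
        if first_seizure_day is None and stimulus >= threshold:
            first_seizure_day = day
        threshold -= sensitization  # the long-lasting plastic change
    return first_seizure_day, threshold

first_day, final_threshold = kindle(stimulus=10, initial_threshold=30,
                                    sensitization=1, days=30)
print(first_day)        # 21 – the weak stimulus first triggers a seizure on day 21
print(final_threshold)  # 0 – excitability remains permanently elevated
```

Note that the model never restores the threshold, mirroring the observation that even after a year the same weak stimulus will still trigger a seizure.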

The changes in the electrical patterns of brain activity detected in kindled animals resemble those in human epilepsy. The behavioural manifestations of epileptic seizures in human patients range from mild twitching of an extremity, to loss of consciousness and uncontrollable convulsions. Although many highly accomplished people have suffered from epilepsy (Alexander the Great, Julius Caesar, Napoleon, Dostoyevsky, and Van Gogh, to name a few), seizures of sufficient intensity and frequency can obviously interfere with many aspects of daily life. Moreover, uncontrolled convulsions can lead to excitotoxicity.

Up to 1% of the population is afflicted, making epilepsy one of the most common neurological problems.

Modern thinking about the causes (and possible cures) of epilepsy has focussed on where seizures originate and the mechanisms that make the affected region hyperexcitable.

Most of the evidence suggests that abnormal activity in small areas of the cerebral cortex (called foci) provides the trigger for a seizure that then spreads to other synaptically connected regions. For example, a seizure originating in the thumb area of the right motor cortex will first be evident as uncontrolled movement of the left thumb that subsequently extends to other more proximal limb muscles, whereas a seizure originating in the visual association cortex of the right hemisphere may be heralded by complex hallucinations in the left visual field. The behavioural manifestations of seizures therefore provide important clues for the neurologist seeking to pinpoint the abnormal region of cerebral cortex. Epileptic seizures can be caused by a variety of acquired or congenital factors, including cortical damage from trauma, stroke, tumors, congenital cortical dysgenesis (failure of the cortex to grow properly), and congenital vascular malformations. One rare form of epilepsy, Rasmussen’s encephalitis, is an autoimmune disease that arises when the immune system attacks the brain, using both humoral (i.e., antibodies) and cellular (lymphocytes and macrophages) agents that can destroy neurons. Some forms of epilepsy are heritable, and more than a dozen distinct genes have been demonstrated to underlie unusual types of epilepsy. However, most forms of familial epilepsy (such as juvenile myoclonic epilepsy and petit mal epilepsy) are caused by the simultaneous inheritance of more than one mutant gene.

No effective prevention or cure exists for epilepsy. Pharmacological therapies that successfully inhibit seizures are based on two general strategies.

One approach is to enhance the function of inhibitory synapses that use the neurotransmitter GABA [gamma-aminobutyric acid (γ-aminobutyric acid), the chief inhibitory neurotransmitter in the mammalian central nervous system; it plays the principal role in reducing neuronal excitability throughout the nervous system and, in humans, is also directly responsible for the regulation of muscle tone]; the other is to limit action potential firing by acting on voltage-gated Na+ channels. Commonly used antiseizure medications include carbamazepine, phenobarbital, phenytoin (Dilantin®), and valproic acid.

These agents, which must be taken daily, successfully inhibit seizures in 60–70% of patients. In a small fraction of patients, the epileptogenic region can be surgically excised. In extreme cases, physicians resort to cutting the corpus callosum to prevent the spread of seizures (most of the “split-brain” subjects were patients suffering from intractable epilepsy).
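The two pharmacological strategies described above (enhancing GABAergic inhibition, and limiting Na+-dependent firing) can be caricatured in a toy excitability model. This is a minimal sketch: the linear form and every number are assumptions chosen for illustration, not pharmacology.

```python
def firing_rate(excitation, inhibition, na_availability):
    """Toy excitability model (arbitrary units).

    Net drive is excitation minus GABAergic inhibition, scaled by the
    fraction of voltage-gated Na+ channels available to support action
    potentials. Rates cannot go below zero.
    """
    return max(0.0, (excitation - inhibition) * na_availability)

baseline   = firing_rate(excitation=10.0, inhibition=4.0, na_availability=1.0)
gaba_boost = firing_rate(10.0, 8.0, 1.0)  # strategy 1: stronger inhibition
na_limit   = firing_rate(10.0, 4.0, 0.5)  # strategy 2: fewer Na+ channels firing
print(baseline, gaba_boost, na_limit)     # 6.0 2.0 3.0 – both strategies cut firing
```

Either lever reduces output firing, which is the shared goal of the two drug classes; they simply act on different terms of the excitability balance.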

One of the major reasons for controlling epileptic activity is to prevent the more permanent plastic changes that would ensue as a consequence of abnormal and excessive neural activity.

Brain: corpus callosum

    The corpus callosum (Latin for “tough body”) is by far the largest bundle of nerve fibers in the entire nervous system. Its population has been estimated at 200 million axons; the true number is probably higher, as this estimate was based on light microscopy rather than on electron microscopy. That number can be contrasted with 1.5 million axons for each optic nerve and 32,000 for the auditory nerve. Its cross-sectional area is about 700 square millimeters, compared with a few square millimeters for the optic nerve. It joins the two cerebral hemispheres, along with a relatively tiny fascicle of fibers called the anterior commissure, as shown. The word commissure signifies a set of fibers connecting two homologous neural structures on opposite sides of the brain or spinal cord; thus the corpus callosum is sometimes called the great cerebral commissure.
    Until about 1950 the function of the corpus callosum was a complete mystery. On rare occasions, the corpus callosum in humans is absent at birth, in a condition called agenesis of the corpus callosum. Occasionally it may be completely or partially cut by the neurosurgeon, either to treat epilepsy (thus preventing epileptic discharges that begin in one hemisphere from spreading to the other) or to make it possible to reach a very deep tumor, such as one in the pituitary gland, from above. In none of these cases had neurologists and psychiatrists found any deficiency; someone had even suggested (perhaps not seriously) that the sole function of the corpus callosum was to hold the two cerebral hemispheres together. Until the 1950s we knew little about the detailed connections of the corpus callosum. It clearly connected the two cerebral hemispheres, and on the basis of rather crude neurophysiology it was thought to join precisely corresponding cortical areas on the two sides. Even cells in the striate cortex were assumed to send axons into the corpus callosum to terminate in the exactly corresponding part of the striate cortex on the opposite side.
    In 1955 Ronald Myers, a graduate student studying under psychologist Roger Sperry (Roger Wolcott Sperry) at the University of Chicago, did the first experiment that revealed a function for this immense bundle of fibers. Myers trained cats in a box containing two side-by-side screens onto which he could project images, for example a circle onto one screen and a square onto the other. He taught a cat to press its nose against the screen with the circle, in preference to the one with the square, by rewarding correct responses with food and punishing mistakes mildly by sounding an unpleasantly loud buzzer and pulling the cat back from the screen gently but firmly. By this method the cat could be brought to a fairly consistent performance in a few thousand trials. (Cats learn slowly; a pigeon will learn a similar task in tens to hundreds of trials, and we humans can learn simply by being told. This seems a bit odd, given that a cat’s brain is many times the size of a pigeon’s. So much for the sizes of brains.)
    Not surprisingly, Myers’ cats could master such a task just as fast if one eye was closed by a mask. Again not surprisingly, if a task such as choosing a triangle or a square was learned with the left eye alone and then tested with the right eye alone, performance was just as good. This seems not particularly impressive, since we too can easily do such a task. The reason it is easy must be related to the anatomy: each hemisphere receives input from both eyes, and a large proportion of cells in area 17 receive input from both eyes. Myers now made things more interesting by surgically cutting the optic chiasm in half, by a fore-and-aft cut in the midline, thus severing the crossing fibers but leaving the uncrossed ones intact—a procedure that takes some surgical skill. Thus the left eye was attached only to the left hemisphere and the right eye to the right hemisphere. The idea now was to teach the cat through the left eye and test it with the right eye: if it performed correctly, the information necessarily would have crossed from the left hemisphere to the right through the only route known, the corpus callosum. Myers did the experiment: he cut the chiasm longitudinally, trained the cat through one eye, and tested it through the other—and the cat still succeeded. Finally, he repeated the experiment in an animal whose chiasm and corpus callosum had both been surgically divided. The cat now failed. Thus he established, at long last, that the callosum actually could do something—although we would hardly suppose that its sole purpose was to allow the few people or animals with divided optic chiasms to perform with one eye after learning a task with the other.[Source: David Hubel’s Eye, Brain, and Vision]

The corpus callosum is a thick, bent plate of axons near the center of this brain section, made by cutting apart the human cerebral hemispheres and looking at the cut surface.

Here the brain is seen from above. On the right side an inch or so of the top has been lopped off. We can see the band of the corpus callosum fanning out after crossing, and joining every part of the two hemispheres. (The front of the brain is at the top of the picture.)

The Corpus Callosum Defined

Imagine for a moment two people who think and behave in very similar ways yet perceive the world a bit differently from one another.

What if they could share their thoughts, then modify them into a single world view based on both perceptions?

This may seem weird, but our brain works this way thanks to the corpus callosum.

Located near the center of the brain, this structure is the largest bundle of nerve fibers that connects the left and right cerebral hemispheres, much like a bridge. Traffic flows in both directions, but instead of vehicles travelling over the gap, it is information.

Corpus callosum

The corpus callosum is near the center of the brain and is covered by the cerebral hemispheres.

Split Brain Patients

Until the early 1950s, the function of the corpus callosum had eluded scientists. No one knew what it did, except to connect the two cerebral hemispheres. By the 1960s, scientists at least knew that nerve fibers within the callosum connected corresponding areas of the two hemispheres, but did not yet understand the complexity involved. However, this limited knowledge was used in an attempt to help patients who suffered from severe and constant seizures.

Normally, electrical activity in the brain flows down specific pathways. This is not so during seizures: the electrical discharges can end up anywhere in the brain and stimulate the uncoordinated muscular activity that many people associate with a seizure. Cutting the corpus callosum stops the spread of this activity from one hemisphere to the other, and Roger Sperry and his colleagues studied the patients who underwent this surgery. The procedure was a last-ditch effort to normalize the lives of seizure patients, and it was very effective. However, there were a few unexpected results.

After surgery, some patients exhibited contrary behaviours, such as pulling their pants on with one arm while simultaneously pulling them off with the other. Another patient began to shake his wife aggressively with his left hand while his right hand intervened to stop the attack. These results prompted a plethora of investigations, which eventually led to the understanding that each hemisphere tends to specialize in certain activities, e.g., speech (left side) or emotional reactivity (right side).

After the patient’s callosum was cut, the attack on his wife was instigated by the right hemisphere (via his left hand) because the left hemisphere (right hand) didn’t realize what was happening soon enough to prevent it. Such a conflict ordinarily would have been resolved in the patient’s brain before the external behaviour was produced.

To a greater or lesser degree, both hemispheres contribute to the initiation of a particular behaviour. It is the corpus callosum that provides the communication pathway to coordinate these activities and helps to incorporate them into daily life. Without this brain structure, we would literally have two separate personalities in our heads, each with its own agenda. [Source: Jay Mallonee – Jay is a wildlife biologist, college professor and writer. His master’s degree is in neurobiology and he has studied animal behaviour since 1976.]


  1. Scheffer, I. E. and S. F. Berkovic (2003) The genetics of human epilepsy. Trends Pharm. Sci. 24: 428–433.
  2. Engel, J. Jr. and T. A. Pedley (1997) Epilepsy: A Comprehensive Textbook. Philadelphia: Lippincott-Raven Publishers.
  3. McNamara, J. O. (1999) Emerging insights into the genesis of epilepsy. Nature 399:

Neuroscience, 3rd edition

Editors: Dale Purves, George J Augustine, David Fitzpatrick, Lawrence C Katz, Anthony-Samuel LaMantia, James O McNamara, and S Mark Williams.

Sunderland (MA): Sinauer Associates; 2004.
ISBN 0-87893-725-0


Epilepsy: Still an Enigma For Common People

“It starts with the sensation of a light switch being pulled violently behind my eyes. I lose cognitive control quickly. I can’t focus on even a simple task, and I forget what I’m doing while I’m in the middle of doing it. I could pick up a pen, then forget why I’m holding it. As the weight behind my eyes intensifies, my eyes roll into my head and start to flutter so rapidly it feels like they are going to pop out. This can last for a split second, a few hours, a few days, or a week.

I have epilepsy, a neurological disorder characterised by recurring, unprovoked seizures. The episodes I described are seizures — they are simply misfiring neurones. Sometimes, these seizures are affected by the season. Other times, they are more easily triggered by stress.

I had my first Tonic-Clonic Seizure in December 1996, my Grade Six year. I fell unconscious into a snow bank in the parking lot of my elementary school. Many people are more familiar with this seizure’s older name, Grand Mal.

I was diagnosed with Generalized Seizures that same year. This news left my family and teachers confused. I had never shown any signs of what they thought of as epilepsy, the violent shaking on the ground portrayed in movies and on TV. No one realised I had been having seizures for many years. Instead, they misread my childhood behaviour as misbehaving.

I frequently blacked out for split seconds in elementary school. The blackouts were likely Absence Seizures, a type of seizure that looks like daydreaming. Even though the blackouts happened on a regular basis, they were almost impossible to spot with an untrained eye.

In a Grade Three art class, I blacked out and knocked over a cup of water that contained a few paint brushes. At the time, no one realised it was a Partial Seizure. When I came to, my teacher asked me why I had done that. She told me it was a disturbance to the class and I needed to watch my behaviour.

I had never even heard the word “seizure” before the age of 11. Without a reference point, these incidents in school seemed normal to me. As far as I was concerned, I didn’t have seizures; I just needed to control my behaviour so I would stop getting in trouble at school.

There was no information about raising a child with Generalized Seizures available to families living with epilepsy in the late-’90s. My family and my teachers didn’t know anyone with epilepsy who could help them figure it out. There were no community epilepsy agencies in our area at the time. Without available resources, we felt left in the dark.

Without a clear understanding of epilepsy as I grew up, it became difficult for me to talk about it with others in my life. As a young adult, I would often avoid discussing it with boyfriends, new employers, and new friends.” [Source: Undiagnosed Epilepsy Made People Think I Was Acting Out]

Globally, one in 100 people is diagnosed with epilepsy. It is one of the most common neurological conditions worldwide, yet public knowledge of it is extremely limited. Many people with epilepsy never talk publicly about their diagnosis for fear of discrimination. Portrayals of seizures and seizure first aid on television are often inaccurate, and myths and misconceptions about epilepsy persist.

Epilepsy is a chronic disorder characterised by recurrent seizures, which may vary from a brief lapse of attention or muscle jerks to severe and prolonged convulsions. The seizures are caused by sudden, usually brief, excessive electrical discharges in a group of brain cells (neurones). In most cases, epilepsy can be successfully treated with anti-epileptic drugs.



Painting by Sir Charles Bell (1809) showing opisthotonus in a patient suffering from tetanus. Opisthotonus (धनुर्वात), Tetanus (धनुस्तम्भ).

Imitators of Epilepsy

  • Fainting (syncope)
  • Mini-strokes (transient ischemic attacks or TIAs)
  • Hypoglycemia (low blood sugar)
  • Migraine with confusion
  • Sleep disorders, such as narcolepsy and others
  • Movement disorders: tics, tremors, dystonia
  • Fluctuating problems with body metabolism
  • Panic attacks
  • Nonepileptic (psychogenic) seizures


What is a Seizure?

A seizure is a brief disruption in normal brain activity that interferes with brain function.

The brain is made up of billions of cells called neurones which communicate by sending electrical messages. Brain activity is a rhythmic process characterised by groups of neurones communicating with other groups of neurones. During a seizure, large groups of brain cells send messages simultaneously (known as “hypersynchrony”) which temporarily disrupts normal brain function in the regions where the seizure activity is occurring.
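The idea of hypersynchrony can be illustrated with a toy spike raster. The data and the 75% criterion below are invented for the example; they are not a clinical definition of a seizure.

```python
def synchronous_bins(spikes, fraction):
    """Return the time bins in which at least `fraction` of the neurons
    fire together – a crude marker of hypersynchronous activity.

    `spikes[t][n]` is 1 if neuron n fired in time bin t, else 0.
    """
    n_neurons = len(spikes[0])
    return [t for t, time_bin in enumerate(spikes)
            if sum(time_bin) / n_neurons >= fraction]

# Four neurons over five time bins; two bins show mass simultaneous firing.
spikes = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 1],   # all four neurons fire together
    [0, 0, 1, 0],
    [1, 1, 1, 0],   # three of four fire together
]
print(synchronous_bins(spikes, fraction=0.75))  # [2, 4]
```

During normal rhythmic activity, only a small fraction of any recorded population fires within one bin; the sketch flags the bins where that fraction becomes abnormally large.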

Seizures can cause temporary changes or impairments in a wide range of functions. Any function the brain has can potentially be affected by a seizure, including behaviour, sensory perception (vision, hearing, taste, touch, smell), attention, movement, emotion, language function, posture, memory, alertness, and/or consciousness. Not all seizures are the same: some may affect only one or two discrete functions, while others affect a wide range of brain functions.

Most people associate a seizure with a loss of consciousness and rhythmic jerking movements. Some seizures do cause convulsive body movements and a loss of consciousness, but not all. There are many different kinds of seizures. A temporary uncontrollable twitching of a body part could be due to a seizure. A sudden, brief change in feeling or a strange sensation could be due to a seizure.

Most seizures are brief events that last from several seconds to a couple of minutes and normal brain function will return after the seizure ends. Recovery time following a seizure will vary. Sometimes recovery is immediate as soon as the seizure is over. Other types of seizures are associated with an initial period of confusion afterwards. Following some types of seizures, there may be a more prolonged period of fatigue and/or mood changes.

What is the Difference Between a Seizure and Epilepsy?

A seizure is a brief episode caused by a transient disruption in brain activity that interferes with one or more brain functions.

Epilepsy is a brain disorder associated with an increased susceptibility to seizures.

When a person experiences a seizure, it does not necessarily indicate that they have epilepsy; there are many possible reasons a seizure could happen. When someone has been diagnosed with epilepsy, it indicates that they have had seizures (usually two or more) and are considered to have an increased risk of future seizures due to a brain-related cause.

Causes of Epilepsy

Just as there are many different types of epilepsy there are many different causes too, which include:

  • a brain injury or damage to the brain – Anything that can injure the brain is a potential cause of epilepsy, including head trauma; stroke; brain injury during birth; neurodegenerative diseases; brain tumours; and many others. Epilepsy may begin weeks, months or years after an injury to the brain.
  • structural abnormalities that arise during brain development – Sometimes these structural changes in the brain are visible on a brain scan (such as an MRI); other times there may be subtle changes in brain structure that are not easy to detect with current imaging techniques. Epilepsy due to a structural abnormality may begin early in life, during adolescence or in adulthood.
  • genetic factors – Some genetic causes of epilepsy are inherited, and there may be other family members with epilepsy, while other genetic factors that cause epilepsy occur at random.
  • infections, including brain abscess, meningitis, encephalitis, and HIV/AIDS
  • a combination of two or more of the above factors

For many people with epilepsy, the cause of their seizures is unknown. It is hoped that research and new developments in diagnostic testing will provide more answers for people with epilepsy and their families.

Epilepsy that does not get better after two or three anti-seizure drugs have been tried is called “medically refractory epilepsy.” In this case, the doctor may recommend surgery to:

  • Remove the abnormal brain cells causing the seizures.
  • Place a vagal nerve stimulator (VNS). This device is similar to a heart pacemaker. It can help reduce the number of seizures.

Opportunity is Here And Now: A Lesson That Can be Learned From Pandit Madan Mohan Malaviya

I came across a very interesting article in a newspaper published in India, the ‘New Indian Express’, and thought I should share it on my blog.

 “Opportunity is here and now”

It’s not the lack of opportunities that prevent a person from succeeding in life.

Not using what you have on hand is a roadblock that impedes one’s success in his/her endeavours.

I do not wish to work under someone. I want to run my own business.

For that purpose, I need capital and experience. So I am working as a sales representative in an organisation. But in this profession, I have to go from house to house; organisation to organisation.

In order to meet a person, I have to wait for hours together. Even to get an appointment to see someone, I have to visit the place several times. Many look down upon me as a ‘nuisance’ and treat me so.

Because of this, my self-esteem has suffered severe blows.

Sometimes I wonder why I should get into any business at all.

Why can’t I get employed as a clerk in a company?

When someone praises us as capable, clever, what do we think?

This person says I am intelligent. He thinks I know a lot. He calls me clever. He compares me with Chanakya. He feels that I am highly skilful. He is astounded at my intelligence…

In this manner, we create a heap of a thousand words of praise from a single word of appreciation.

Similarly, if someone calls us a fool, we add a thousand words of insult and feel depressed.

We read every possible interpretation into the word and create meanings where none exist!

Pandit Madan Mohan Malaviya was the founder of the Banaras Hindu University (B.H.U) in North India.

B.H.U is one of the largest residential universities in Asia, with over 20,000 students.

Pandit Madan Mohan Malaviya

In the first decade of the 20th century, the country (India) was in abysmal depths and independence appeared like a distant dream. The total number of colleges in the country had gone up from 27 in 1857 to just 72 in 1882, and the total number of B.A. graduates in twenty-five years was 3,284. Literacy stood at an unbelievably low level of about 6% in 1900, and educational facilities were meagre. All five universities that existed at the time, in Calcutta, Bombay, Madras, Lahore and Allahabad, were mainly examining universities. While India had only five universities, the U.K. had eighteen, France fifteen, Italy twenty-one, Germany twenty-two, and the USA 134.

Under these conditions, it would have looked foolhardy and vain to think or even dream of a ‘post-independence India’. However, at that time, Malaviyaji started looking beyond the current milieu and even the forthcoming independence. He looked from the depths of that dark abyss into the distant future and visualised a ‘Resurgent Modern India’!

Malaviyaji realised that a ‘Modern India’ could be built by engineers, doctors, scientists and artists only when they were imbued with high character, probity and honour. He strongly felt that all of them could be nurtured in a beautiful, big garden called the ‘University’, which should be an extension and modified version of the gurukul. Hence, in order to meet the immense future needs of the ‘Resurgent Modern India’, he visualised a ‘Modern University’ that combined the best thought and culture of the East with the best science and technology of the West.

“Everyone knows that there is no greater beggar than Pandit Malaviyaji on the face of the earth. He has never begged for himself; by the grace of God he has never been in want, but he became a voluntary beggar for causes he has made his own, and God has always filled his bowl in an over-flowing measure. But he has an insatiable appetite, and although he got the crore he wanted he is still asking for more.”

Mahatma Gandhi, Silver Jubilee Address, 21st Jan 1942.

Pandit Madan Mohan Malaviya worked with determination to start the University.

There was a funding crisis, but he did not get disheartened. He went from town to town and met many rich men and traders to collect donations.

Malaviyaji undertook frequent tours to request and persuade the rulers and rich men of various states to donate to the noble cause.

It took the Prince of Beggars, as Malaviyaji was popularly known, nearly two years, from 15th July 1911 to 28th April 1913, to collect the minimum required amount of Rs (I.N.R) 50 lakhs to start serious negotiations with the representatives of the British Government of India. Not only the rich but also the poor and the lowly responded generously to Pandit Madan Mohan Malaviya’s call. It is said that a woman offered her bangles for the cause and a courtesan her day’s earnings!

He went to the Nizam of Hyderabad to request him for funds.

The last Nizam was well known for his huge wealth and jewellery collection; he had been the richest man in the world until the end of his reign.

The Nizam was furious,

“How dare you come to me for funds… that too for a Hindu University?” he roared with anger, took off his footwear and flung it at Malaviya.

Malaviya picked up the footwear and left silently. He went directly to the marketplace and began to auction it. As it was the Nizam’s footwear, many came forward to buy it. The price went up. When the Nizam heard of this, he became uneasy. He thought it would be an insult if his footwear were bought by someone for a pittance. So he sent one of his attendants with the instruction,

“Buy that footwear, no matter what the bidding price!”

Thus, Malaviya managed to sell the Nizam’s own footwear back to him for a huge amount. He used that money to build the Banaras Hindu University!

The Benares Hindu University Act was passed on 1st October 1915 and came into force on 1st April 1916. The foundation stone was laid on 4th February 1916 by H.E. Lord Hardinge, the then Viceroy and Governor General of India. The first colleges to be started were the Central Hindu College (Oct 1917), the College of Oriental Learning (July 1918), the Teachers Training College (Aug 1918) and the Engineering College (Aug 1919).

I only wish to tell this to all those young men who are without an ideal or a goal in life.

Do you know what prevents a person from succeeding?

It is not his lack of skills or qualifications.

It is not what you have, but how you use what you have, that makes the difference in your life.

The greatness of a vision depends mostly on its far-sightedness, its clarity, its magnitude, and its wide canvas.

Normally, the farther one looks into the future, the hazier the picture.

While the ordinary see nothing but the dark clouds, the visionary sees a bright star shining in the distance.

He or she then paints it for others with full clarity on a wide canvas.

Do not give up under the impression that ‘Opportunity is No Where!’ Take that sentence in the right spirit that ‘Opportunity is Now Here!’ and move forward with your life!

“No aspect of Malaviyaji is hidden from me. I am well aware of his simplicity, purity, tenderness, and love. From all these virtues of his, you must take as much as you (students and teachers) can. If someone cannot take the warmth of the Sun, even being in the open, it is not the fault of the Sun. The Sun gives warmth to one and all. If someone does not want to take it and shivers in the cold, then what can the Sun do? Being so close to Malaviyaji, if you cannot learn from his life simplicity, sacrifice, patriotism, large-heartedness, universal love and other virtues, then who can be more unlucky than you?”

Mahatma Gandhi

“Patriotism and service to the motherland are food for Malaviyaji. He can never, ever leave it, just as it is impossible for him to leave the daily recitation of the Bhagwad Gita. Patriotism and service to the motherland together are the breath of life for him. That is why, till he breathes, he will ceaselessly continue to serve the motherland and humanity.”

Mahatma Gandhi, from monthly periodical ‘VishwaJyoti’, Jan 1962.

When I returned to my Country, I first went to Lokamanya Tilak. He appeared tall like the Himalayas. I thought it was not possible for me to scale the heights and returned. Then I went to Deshbandhu Gokhale. He appeared deep like the Ocean. I saw that it was not possible for me to gauge the depth and returned. Finally, I went to Mahamana Malaviya and he appeared like the pure flow of Ganga. I saw it was possible to take bath in the sacred flow.      

Mahatma Gandhi

His personality cannot be condensed in a few words. Mahatma Gandhi called him “praatah smaraniya”, a pious person whose name, when remembered in the morning, would lift one out of the mire of one’s sordid self. Gandhiji compared Tilak to the lofty Himalayas, Gokhale to the deep seas and Malaviyaji to the crystal-clear sacred river in which he decided to have his ablution! Malaviyaji’s gentle, sweet, soft and graceful nature was a true reflection of his abundant love for humanity. A British official commented that Malaviyaji ‘wore the white flower of a blameless life’.

Edgar Snow, a journalist, wrote that his personality radiated ‘the sweetness and simplicity of a child, yet his words carried the strength and conviction of a man with a settled philosophy of life’. For all his sweetness he could still be tougher than the toughest when occasion demanded it. Dr S. Radhakrishnan said, “Pandit Malaviyaji is a Karmayogi. He is not only a representative of Hinduism but the soul of Hinduism. He has strived all through his life for the Hindu ideals, and we see in him the combination of idealism and practical wisdom… While preserving the imperishable treasures of our past, he is keen on moving forward with the times.”

Malaviyaji visualised the importance of education and the hardships of students early in life. In 1903 he set up the ‘MacDonald Hindu Boarding House’ in Allahabad to accommodate 230 students, by collecting a public donation of Rs (I.N.R) 1.3 lakhs. This appears to have been the precursor of his grand vision of the Banaras Hindu University, which he built up from a vision in 1900 to a reality in 1916. These examples show his keen analysis of a problem and his ability to think of a workable solution, motivate a team to work, collect a large amount of funds for a public cause and realise the dream.

Gopal Krishna Gokhale said,

“Malaviyaji’s sacrifice is a real one. Born in a poor family, he started earning thousands monthly. He tasted luxury and wealth but giving heed to the call of the nation, renouncing all he again embraced poverty”.

Adapted from Swami Sukhabodhananda, The New Indian Express, published Sep 27, 2012.


Neural Stem Cells – Promise and Perils

One of the most highly publicized issues in biology over the past several years has been the use of stem cells as a possible way of treating a variety of neurodegenerative conditions, including Parkinson’s, Huntington’s, and Alzheimer’s diseases.

Amidst the social, political, and ethical debate set off by the promise of stem cell therapies, an issue that tends to get lost is…

What, exactly, is a stem cell?

Neural stem cells are an example of a broader class of stem cells called somatic stem cells. These cells are found in various tissues, either during development or in the adult.

All somatic stem cells share two fundamental characteristics:

  1. they are self-renewing, and
  2. upon terminal division and differentiation they can give rise to the full range of cell classes within the relevant tissue.

Thus, a neural stem cell can give rise to another neural stem cell or to any of the main cell classes found in the central and peripheral nervous system (inhibitory and excitatory neurons, astrocytes, and oligodendrocytes; Figure A).

A neural stem cell is therefore distinct from a progenitor cell, which is incapable of continuing self-renewal and usually has the capacity to give rise to only one class of differentiated progeny.

  • An oligodendroglial progenitor, for example, continues to give rise to oligodendrocytes until its mitotic capacity is exhausted;
  • a neural stem cell, in contrast, can generate more stem cells as well as a full range of differentiated neural cell classes, presumably indefinitely.

Neural stem cells, and indeed all classes of somatic stem cells, are distinct from embryonic stem cells.

Embryonic stem cells (also known as ES cells) are derived from pre-gastrula embryos. ES cells also have the potential for infinite self-renewal and can give rise to all tissue and cell types throughout the organism, including germ cells that can generate gametes (recall that somatic stem cells can only generate tissue-specific cell types).

There is some debate about the capacity of somatic stem cells to assume embryonic stem cell properties.

Some experiments with hematopoietic and neural stem cells indicate that these cells can give rise to appropriately differentiated cells in other tissues; however, some of these experiments have not been replicated.

The ultimate therapeutic promise of stem cells—neural or other types—is their ability to generate newly differentiated cell classes to replace those that may have been lost due to disease or injury.

Such therapies have been imagined for some forms of diabetes (replacement of the islet cells that secrete insulin) and some hematopoietic diseases. In the nervous system, stem cell therapies have been suggested for replacing dopaminergic cells lost to Parkinson’s disease and replacing lost neurons in other degenerative disorders.

While intriguing, this projected use of stem cell technology raises some significant perils.

  • These include ensuring the controlled division of stem cells when they are introduced into mature tissue, and
  • identifying the appropriate molecular instructions to achieve differentiation of the desired cell class.

Clearly, the latter challenge will need to be met with a fuller understanding of the signalling and transcriptional regulatory steps used during development to guide differentiation of relevant neuron classes in the embryo.

At present, there is no clinically validated use of stem cells for human therapeutic applications in the nervous system. Nevertheless, some promising work in mice and other experimental animals indicates that both somatic and ES cells can acquire distinct identities if given appropriate instructions in vitro (i.e., prior to introduction into the host), and if delivered into a supportive host environment.

For example, ES cells grown in the presence of platelet-derived growth factor, which biases progenitors toward glial fates, can generate oligodendroglial cells that can myelinate axons in myelin-deficient rats. Similarly, ES cells pretreated with retinoic acid matured into motor neurons when introduced into the developing spinal cord (Figure below).

While such experiments suggest that a combination of proper instruction and correct placement can lead to appropriate differentiation, there are still many issues to be resolved before the promise of stem cells for nervous system repair becomes a reality.

Schematic of the injection of fluorescently labeled embryonic stem (ES) cells into the spinal cord of a host chicken embryo. ES cells integrate into the host spinal cord and apparently extend axons. The progeny of the grafted ES cells are seen in the ventral horn of the spinal cord; they have motor neuron-like morphologies, and their axons extend into the ventral root. (From Wichterle et al., 2002.)

Source: Neuroscience, 3rd edition
Editors: Dale Purves, George J Augustine, David Fitzpatrick, Lawrence C Katz, Anthony-Samuel LaMantia, James O McNamara, and S Mark Williams.
Sunderland (MA): Sinauer Associates; 2004.
ISBN 0-87893-725-0

What is your definition of ethics?

Ethics, described briefly as the norms by which acceptable and unacceptable behaviours are measured, has been the concern, and perhaps the great dilemma, of sentient humans since Socrates subjected it to philosophical inquiry almost 2,500 years ago. Socrates believed, without universal acceptance, that the most pertinent issues people must deal with are how we live our lives, which actions are and are not righteous, and how people should live together peacefully and harmoniously.


A vast parade of philosophers, religious leaders, politicians, professors, and self-help gurus have followed Socrates’ lead through the ensuing centuries; it’s a popular and enduring subject, perhaps because it is so complex, intriguing, and pervasive in every facet of our lives.


Today, ethics dominates our news in the form of anti-ethics. The headlines in newspapers and the lead stories on TV, radio, and Internet news are typically about such abhorrent behaviour as lying, stealing, revenge, convictions for corruption, gratuitous murder, and misuse of public or other people’s funds for personal gain. Readers, viewers, and listeners can hardly be faulted for thinking that we live in a corrupt society, exactly what Socrates and others did not want or envision. Perhaps the anti-ethical stance of the media is the most anti-ethical part of our society.
Nevertheless, the battle for a more ethical society rages unchecked and unabated.

Ethics In Leadership


Bad circumstances are not excuses for making bad choices.

Values and ethics are not designed just for good times, but also to get you through bad times. They are like the laws of the land: you need them when circumstances are good, but they are even more valuable in protecting you from the bad.

Most choices are not ethical choices.

For example, what clothes to buy or what TV to get are personal choices based on what is appropriate for your situation. They are not ethical choices. Personal choices are subjective, not objective. Even though these are not ethical issues, they certainly involve responsibility.


Ethical choices reflect the objective choice between right and wrong. That is why your conscience hurts when you make an unethical choice but does not hurt when you make a wrong personal choice: because in ethical matters there is a clear right choice.


Just as with a mathematics test: who takes it and what answer they give will vary, but what makes an answer right is not the choice itself, but its actual correctness.

Being a nice person is not the same thing as being a good and ethical person.

A person can be socially nice yet be a cheat and a liar. That makes him nice but unethical. Niceness merely reflects social acceptability; nice does not mean good.

Unfortunately, many of our choices today seem to be based on:

1. our desire for convenience, comfort and pleasure;
2. our feelings – the criterion is to feel good rather than to do what is responsible;
3. social fads and ads – the philosophy that everyone else is doing it, so why shouldn’t I?

It is a common belief that ethics and ethical choices are confusing. The big question is: to whom? Only to those with unclear values.




Those who believe that ethics cannot be generalised but varies with every situation come up with justifications and keep changing their ethics from situation to situation, and from person to person.

This is called SITUATIONAL ETHICS. This is the ethics of convenience rather than conviction.



There is harmony and inner peace to be found in following a moral compass that points in the same direction, regardless of fashion or trend. – Ted Koppel

Why do we have standards?

Standards are a measure. One metre in Europe is one metre in Asia. One kilogramme of flour is one kilogramme of flour wherever you go.


People who do not want to adhere to any moral standards keep changing the definition of morality, saying that nothing is right or wrong and that one’s thinking makes it so. They put the onus on interpretation rather than on their behaviour.

They feel “my behaviour is okay, your interpretation was faulty.”


For example, Adolf Hitler could have believed he was right.

But the big question is, “Was he right?”

Giving food to the hungry is right, but giving food every time a person becomes hungry is not; teach them how to learn and earn instead.

The generalisation sets the benchmark; the exception is the situation.

For example, murder is wrong. That is a general statement, a generalised truth, and an ethical standard – unless it is in self-defence.

This doesn’t mean it is okay to murder if the weather is good or if you feel like it.

Our standard of ethics is revealed by the advisors we hire, the superiors we choose to work with, the friends we like to hang out with, the suppliers we choose, the buyers we deal with, as much as how we spend our leisure time. 


Opinions vary from culture to culture. But values such as fairness, justice, integrity and commitment are universal and eternal. They have nothing to do with culture. Never has there been a time when society has not respected courage over cowardice.

Ethics and justice involve the following:

  • empathy
  • fairness
  • compassion for the injured, the ill, and the aged
  • the larger interests of society

Just because a majority of people agree on something doesn’t make it right.

If the citizens of a country voted to disenfranchise all blue-eyed people, that doesn’t make it a right decision.


Basic ethics are pretty universal. Just as freedom without discipline leads to destruction, a society without a set of principles destroys itself. If values were so subjective, no criminal would be in jail.

A society becomes good or bad based on the ethical values of its individuals, and what gives a society its strength is its underlying ethical values.

People who believe in the relativity of ethics get stuck in their own paradox. They say, “Everything is relative.” But that statement is itself an absolute statement; it is self-contradictory. The distinction between right and wrong, dishonesty and honesty, presupposes their existence. Changing terminology does not change the meaning, just as changing labels does not change the contents. Low moral values become more accepted by giving them new names, though the result is the same.

Sometimes even the media glamorises immorality— liars are called extroverts with imagination.

The price of apathy is to be ruled by evil men. – Plato

To educate a man in mind and not in morals is to educate a menace to society. – Theodore Roosevelt



When Michael Sovern, the President of Columbia University, resigned in 1993, a reporter asked him if there was any task left incomplete.

“Yes,” replied Sovern. “It sounds complacent, but there is really only one.”

He referred to the lack of instruction in ethics:

“The average undergraduate, however, gets no training in these areas. Most educators are afraid to touch the subject. The subject of ethics is usually left to parents to address. The result is that young people who need moral and ethical training more than ever are getting less than ever. Morals and ethics are not religion. They are logical, sensible principles of good conduct that we need for a peaceful society.”
[Source: John Beckley, “Isn’t it time to Wake Up?” In the Best of….Bits and Pieces, Economics Press, Fairfield, NJ, 1994, p.129]



Let no man be sorry he has done good, because others have done evil! If a man has acted right, he has done well, though alone; if wrong, the sanction of all mankind will not justify him.

– Henry Fielding

Most will agree that legality and ethics are not the same thing. What is ethical may or may not be legal, and vice versa.

For example:

1. An insurance salesperson, more concerned with getting a larger commission than with selling the best policy for a particular client, sells an unsuitable policy. This may be legal, but it is unethical.
2. A young executive drives over the speed limit, trying to reach the hospital with his bleeding child in the back seat of his car. Hardly anyone would question the breaking of the law in this situation. It would be unethical not to get medical help to save the child’s life, even if it meant breaking the law.

Legality establishes minimum standards, whereas ethics and values go beyond them. Ethics and values are about fairness and justice. They are not about pleasing or displeasing people; they are about respecting people’s needs and rights.


There are many kinds of desires:

  • the desire for success;
  • the desire to do one’s duty, even at the cost of pleasure;
  • the desire for purpose – something worth dying for, which gives meaning to life.

What good is it if you gain the whole world and lose your conscience? A purposeless life is a living death.

What is your purpose? Do you have one? Purpose brings passion. Find or create a purpose and then pursue it with passion and perseverance.

Every day we need to ask ourselves: “Am I getting any closer to my purpose in life?”


Adapted from ‘You Can Win’ by Shiv Khera.

Questioning the Use of the Trendelenburg Position in the Management of Acute Hypotension


Trendelenburg or Head Down Tilt (HDT) Position


In the Trendelenburg position, the body is laid flat on the back with the feet higher than the head by 15-30 degrees, in contrast to the reverse Trendelenburg position, where the body is tilted in the opposite direction.


Acute hypotension is an extremely common condition seen in a wide variety of patients and is defined as a condition in which the blood pressure is decreased to a point that is inadequate for normal tissue perfusion and oxygenation.

Hypotension may be a sign of shock or may progress to a shock state. Its causes are multifactorial, with etiologies such as fluid volume deficits (hypovolemic), decreased cardiac output (cardiogenic), inadequate vascular tone (vasodilatory), or the iatrogenic effects of certain classes of medications. The severity of the condition is related to the volume of circulating blood. If blood does not reach the vital organs, perfusion is compromised, resulting in tissue hypoxia and damage to the brain, heart and kidneys.

The mortality rate for shock is extremely high, reaching 60% to 80% in cardiogenic shock. For this reason, it is imperative that interventions target resolving severe hypotension or shock as quickly and effectively as possible. One intervention commonly used to manage severe hypotension is Trendelenburg positioning, defined as a position in which the head is low and body and legs are on an inclined or raised plane.

Theoretically, it shifts abdominal organs upward and out of the pelvis and increases blood flow to the brain in case of hypotension or shock. However, use of this intervention is controversial, and many experts in the medical and nursing field question its efficacy. In addition, some clinicians and nurses concede that the intervention has harmful effects that may actually worsen patient outcomes. [Trendelenburg Positioning to Treat Acute Hypotension: Helpful or Harmful. Clinical Nurse Specialist: July/August 2007 – Volume 21 – Issue 4 – pp 181-187]

It is important to seek answers to the questions surrounding the efficacy of Trendelenburg positioning because it is frequently used, in the belief that it improves patients’ outcomes, by nurses and other health care givers, who are patient advocates and are often the first to recognise a deteriorating patient condition.

The support for this intervention is anecdotal and can be traced to the 1800s. The position is named for the surgeon who originally described its use in 1890, Dr Friedrich Trendelenburg, who studied medicine in Scotland and Berlin before becoming a professor of surgery and surgeon-in-chief in Leipzig, Germany. His fascination with urology linked his name to the positioning technique, now commonly known as the Trendelenburg position or head-down-tilt (HDT) position.

Dr. Trendelenburg used this head down, elevated body position to surgically manage strangulated hernias, bladder stones, and various gynaecological problems. Friedrich Adolf Trendelenburg placed patients supine with the head of the bed tilted 45 degrees downward to aid visualisation of abdominal organs for surgical procedures.

However, it was not until World War I that Walter Cannon, an American physiologist, introduced the position as a treatment of shock. He promoted the technique as a way to increase venous return to the heart, increase cardiac output, and improve blood flow to vital organs.

Today, some clinicians use this position, now called the Trendelenburg position, to treat hypotensive episodes. They believe this position shifts intravascular volume from the lower extremities and abdomen to the upper thorax, heart, and brain, improving perfusion to these areas.

But as far back as the 1960s, researchers found undesirable effects of the Trendelenburg position, including decreased blood pressure, engorged head and neck veins, impaired oxygenation and ventilation, increased aspiration risk, and greater risk of retinal detachment and cerebral oedema.

Evidence shows that while this position shifts fluid, it adversely engorges the right ventricle, causing it to become dilated, which further reduces cardiac output and blood pressure. It also impairs lung function by compromising pulmonary gas exchange. Abdominal contents shift upward, increasing pressure on and limiting movement of the diaphragm and reducing lung expansion. Lung compliance, vital capacity, and tidal volumes decrease while the work of breathing increases. The result is impaired gas exchange—hypercarbia and hypoxemia. Evidence also suggests that when obese patients are placed in Trendelenburg position, lung resistance increases significantly and pulmonary gas exchange worsens.

The Trendelenburg position has little, if any, positive effect on cardiac output and blood pressure. It impairs pulmonary gas exchange and increases the aspiration risk. The evidence doesn’t support its use to treat hypotension. However, evidence-based practice does support elevating the lower extremities—without using a head-down tilt position—to mobilize fluid from the lower extremities to the core during hypotensive episodes. Sometimes called a modified Trendelenburg position, this position has been found to support blood pressure without the negative consequences of the traditional Trendelenburg position.

An extensive search of Ovid Medline, CINAHL, and Evidence-Based Medicine Review Multifile was conducted to identify pertinent articles. The inclusion criteria were the use of HDT of greater than or equal to 10 degrees and patients under general anesthesia. Six articles were identified and critically appraised. The data compiled in this systematic review suggest there is an increase in cardiac preload with no consequent increase in cardiac output or performance. The data also suggest there are multiple negative consequences of HDT on pulmonary function, including a decrease in functional residual capacity, an increase in atelectasis, and a decrease in oxygenation. The review concluded that there is a lack of clear evidence to support the use of HDT as a treatment for acute hypotension. In the controlled environment of the surgical setting, head-down tilt should be utilized judiciously and for as short a duration as possible. The HDT position should be avoided in patients who are obese, have pre-existing obstructive pulmonary disorders, have New York Heart Association class III heart failure, or have other significant cardiopulmonary dysfunction. [Carter, Aaron T., “The Cardiopulmonary Consequences of the Trendelenburg Position in Patients Under General Anesthesia” (2010). School of Physician Assistant Studies. Paper 214.]

Ensuring that healthcare practices are based on the best evidence can improve patient safety. To safely and effectively manage acutely ill patients, clinicians must evaluate traditional practices and systems.

Reference: Questioning Common Nursing Practices: What Does the Evidence Show? Am Nurs Today. 2013;8(3).

Scientific and Medical Publishing: Role and Definition of Authors

What do the editors of medical journals talk about when they get together?

So far today, it’s been a fascinating but rather grim mixture of research that can’t be replicated, dodgy authorship, plagiarism and duplicate papers, and the general rottenness of citations as a measure of scientific impact.

We’re getting to listen in and join the editors’ discussion in Chicago in 2013, at the 7th International Congress on Peer Review and Biomedical Publication. They assemble once every four years to chew over academic research on scientific publishing and debate ideas. The tradition was started by the Journal of the American Medical Association (JAMA) in Chicago in 1989, and the congress still goes by the name of its original pre-eminent concern, “peer review and biomedical publication.” But the academic basis of peer review is now only a small part of what’s discussed.

The style hasn’t changed in all these years, and that’s a good thing. As JAMA contributing deputy editor Drummond Rennie said, most medical conferences go on and on, “clanking rustily forward like a Viking funeral,” their multiple concurrent sessions rendering a shared, ongoing discussion impossible.

The congress hurtled off to an energetic start with John Ioannidis, the epidemiologist and agent provocateur who wrote “Why most published research findings are false.” He pointed to the very low rate of successful replication of genome-wide association studies (not much over 1%) as evidence of very deep-seated problems in discovery science.

Half or more of replication studies are done by the authors of the original research: “It’s just the same authors trying to promote their own work.” Industry, he says, is becoming more concerned with replicability of research than most scientists are. Ioannidis cited a venture capital firm that now hires contract research organizations to validate scientific research before committing serious funds to a project.

Why is there so much irreproducible research? Ioannidis points to the many sources of bias in research. He and Chavalarias trawled through more than 17 million articles in PubMed and found discussion of 235 different kinds of bias. There is so much bias, he said, that it makes one of his dreams – an encyclopedia of bias – a supremely daunting task.

[Cartoon: Authorship issues and postdocs – “No, it is my wife’s turn to be first author on your paper.”]

What would help?

Ioannidis said we need to go back to considering what science is about: “If it is not just about having an interesting life or publishing papers, if it is about getting closer to the truth, then validation practices have to be at the core of what we do.” He suggested three ways forward:

  1. We have to get used to small genuine effects and not expect (and fall for) excessive claims.
  2. We need to have – and use – research reporting standards.
  3. We should register research, from protocols through to datasets.

Isuru Ranasinghe, in a team from Yale, looked at un-cited and poorly cited research in cardiovascular research. The proportion isn’t changing over time, but the overall quantity is rising rather dramatically as the biomedical literature grows: “1 in 4 journals have more than 90% of their content going un-cited or poorly cited five years down the track.” Altogether, about half of all articles don’t have an influence – if you judge it by citation.

Earlier, though, there was a lot of agreement from the group on the general lousiness of citation as a measure and influence on research. Tobias Opthof, presenting his work on journals pushing additional citation of their own papers, called citation impact factors “rotten” and “stupid”. Elizabeth Wager pulled no punches at the start of the day, reporting on analyses of overly prolific authors: surely research has to be about doing research, not just publishing a lot of articles. Someone who publishes too many papers, she argued, could be of even more concern than someone who does research, but publishes little. Incentives and expectations of authorship really no longer serve us well – if they ever did. [Bad research rising: The 7th Olympiad of research on biomedical publication]

One of the highlights of graduate school is publishing your very first papers in peer-reviewed journals. But what a novice scientist should not have to fret over is which colleagues should be included as authors and whether any norms are being broken. Two things should be avoided: including as authors those who did not substantially contribute to the work, and excluding those who deserve authorship. There have been controversial instances where breaking these authorship rules caused uncomfortable situations. None of us would want someone writing a letter to a journal arguing that they deserved authorship. Nor is it comfortable to see someone squirm out of authorship, arguing they had minimal involvement, when an accusation of fraud has been levelled against a paper. Determining who should be an author can be difficult.

The cartoon above highlights the complexity and arbitrariness of authorship – and the perception that there are many instances of less than meritorious inclusion.

Journals do have their own guidelines, and many now require statements about contributions, but even these can be vague, still making it difficult to assess how much individuals actually contributed. We usually reiterate the criteria from Weltzin et al. (2006)[Weltzin, J. F., Belote, R. T., Williams, L. T., Keller, J. K. & Engel, E. C. (2006) Authorship in ecology: attribution, accountability, and responsibility. Frontiers in Ecology and the Environment, 4, 435-441]. There are four criteria to evaluate contribution:

  1. Origination of the idea for the study. This would include the motivation for the study, developing the hypotheses and coming up with a plan to test hypotheses.
  2. Running the experiment or data collection. This is where the blood, sweat and tears come in.
  3. Analysing the data. Basically moving from a database to results, including deciding on the best analyses, programming (or using software) and dealing with inevitable complexities, issues and problems.
  4. Writing the paper. Putting everything together can sometimes be the most difficult and external motivation can be important.

The basic requirement for authorship is that one of these steps would not have been possible without the person in question, or that the person contributed significantly to more than one of them. Such a requirement means that undergraduates assisting with data collection do not meet the threshold for authorship. Obviously these criteria are idealized, and different types of studies (e.g., theory or methodological papers) do not necessarily involve all of these activities. Regardless, authors must have contributed in a meaningful way to the production of the research and should be able to defend it. All authors need to sign off on the final product. [Navigating the complexities of authorship: Part 1 – inclusion]

While this system is idealized, there are still complexities making authorship decisions difficult or uncomfortable.

I recently came across an article, “Authorship: An Evolving Concept.”

It deals with the role and definition of authorship and the need to differentiate between an “author” and a “contributor”.

As most of us often write articles or studies to be published in journals or magazines, I thought it necessary to share it here in full; simply providing a link would have let it go unnoticed.

Authorship confers credit and has important academic, social, and financial implications. Authorship also implies responsibility and accountability for published work.

[Cartoon: Manuscript rejection – “Where are those editors these days?”]

Authorship: An Evolving Concept

By Steph Fairbairn, Leanne Kelly, Selina Mahar, and Reinier Prosée, editorial coordinators, Health Learning, Research & Practice, Wolters Kluwer

The role and definition of authorship in scientific and medical publishing has become increasingly complicated in recent years.

In most other forms of publishing – social sciences, humanities, legal – we assume that three, perhaps four, authors collaborated in the writing of the work. However, the nature of scientific research and reporting means that “authorship” no longer fits into a neat category.

To elaborate, a researcher who didn’t write the text of a paper can still be considered an author if he or she contributed substantially to the conception of the work or the analysis of the data. Access to the Internet has made sharing information and collaborating on projects far simpler, and many authors now work closely with colleagues in different countries.

With such a proliferation of collaboration and co-authorship in academic writing, it becomes harder to differentiate between a “contributor” and an “author.” Moreover, the pressures of funding applications, securing tenure at an academic institution, and the requirement to meet publication quotas all play their part in pushing contributors to demand co-authorship credit.

[Image: plagiarism. With thanks to Google Images.]

Plagiarism is the copying or paraphrasing of other people’s work or ideas into your own work without full acknowledgement.

ICMJE Guidelines

The International Committee of Medical Journal Editors (ICMJE) formulated a set of guidelines to define authorship. [The New ICMJE Recommendations (August 2013). The International Committee of Medical Journal Editors.]

One of the most important changes in the document is the addition of a fourth criterion for authorship to emphasize each author’s responsibility to stand by the integrity of the entire work.

Authorship requires:

  • Substantial contributions to: the conception or design of the work; or the acquisition, analysis, or interpretation of data for the work; AND
  • Drafting the work or revising it critically for important intellectual content; AND
  • Final approval of the version to be published; AND
  • Agreement to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. [Defining the Role of Authors and Contributors. The International Committee of Medical Journal Editors.]

Authorship involves not only credit for the work but also accountability. The addition of a fourth criterion was motivated by situations in which individual authors have responded to inquiries regarding scientific misconduct involving some aspect of the study or paper by denying responsibility (“I didn’t participate in that part of the study or in writing that part of the paper; ask someone else”). Each author of a paper needs to understand the full scope of the work, know which co-authors are responsible for specific contributions, and have confidence in co-authors’ ability and integrity. When questions arise regarding any aspect of a study or paper, the onus is on all authors to investigate and ensure resolution of the issue.

By accepting authorship of a paper, an author accepts that any problem related to that paper is, by definition, his or her problem. Given the specialized and myriad tasks frequently involved in research, most authors cannot participate directly in every aspect of the work. Still, ICMJE holds that each author remains accountable for the work as a whole by knowing who did what, by refraining from collaborations with co-authors whose integrity or quality of work raises concerns, and by helping to resolve questions or concerns if they arise. For example, a clinician who merits authorship in part through design of a study and care of its participating patients should have full confidence in the work of co-authors with expertise in biostatistics, and must agree as a condition of authorship to ensure resolution of questions regarding the analysis should they arise. This new criterion better balances credit with responsibility, and establishes the expectation that editors may engage all authors in helping to determine the integrity of the work.

The authorship criteria are not intended as a means to disqualify colleagues who otherwise meet them by denying those colleagues the opportunity to satisfy criteria 2 or 3. All individuals who meet the first criterion should therefore have the opportunity to participate in the review, drafting, and final approval of the manuscript. As always, the decision about who should be an author on a given article is the responsibility of the authors, not of the editors of the journal to which the work has been submitted.
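Taken together, the four ICMJE criteria form a simple conjunctive rule: a contributor qualifies as an author only if all four hold. The sketch below is purely illustrative (the field names and the `qualifies_as_author` helper are ours, not part of any ICMJE tooling):

```python
from dataclasses import dataclass

@dataclass
class Contributor:
    name: str
    substantial_contribution: bool  # criterion 1: conception/design, or data acquisition/analysis/interpretation
    drafted_or_revised: bool        # criterion 2: drafting or critical revision
    approved_final_version: bool    # criterion 3: final approval of the published version
    accountable_for_work: bool      # criterion 4: accountability for the work as a whole

def qualifies_as_author(c: Contributor) -> bool:
    # All four criteria are joined by "AND": failing any one of them
    # makes the person a non-author contributor to be acknowledged.
    return (c.substantial_contribution
            and c.drafted_or_revised
            and c.approved_final_version
            and c.accountable_for_work)

funder = Contributor("A. Funder", False, False, False, False)  # funding only
analyst = Contributor("B. Analyst", True, True, True, True)    # meets all four
print(qualifies_as_author(funder))   # False
print(qualifies_as_author(analyst))  # True
```

Note how strict the rule is: substantial contribution alone (criterion 1) is necessary but not sufficient.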

Non-Author Contributors

Contributors who meet fewer than all 4 of the above criteria for authorship should not be listed as authors, but they should be acknowledged. Examples of activities that alone (without other contributions) do not qualify a contributor for authorship are:

  • acquisition of funding; general supervision of a research group or general administrative support;
  • and writing assistance, technical editing, language editing, and proofreading.

Those whose contributions do not justify authorship may be acknowledged individually or together as a group under a single heading (e.g. “Clinical Investigators” or “Participating Investigators”), and their contributions should be specified (e.g., “served as scientific advisors,” “critically reviewed the study proposal,” “collected data,” “provided and cared for study patients”, “participated in writing or technical editing of the manuscript”).

[Cartoon: Co-author list – “You should spend the next week typing down the names of all co-authors on your paper.”]

Some researchers have argued that these guidelines are unfairly strict, but they were created to safeguard authorship as a mark of scientific integrity. Readers do not want a meaningless list of names – they want to know who is chiefly responsible. [Scott T. Changing authorship system might be counterproductive. BMJ 1997; p. 744]

In this way, adhering to the ICMJE definition ensures that only those who are “chiefly responsible” are recognized and held accountable. Some authors, however, take issue with the ICMJE guidelines, not just because they require authors to be involved in every stage of the manuscript’s production, but because they wish to acknowledge the important contributions of their colleagues. In their editorial “The Men Who Stare at Science,” Goetze and Rehfeld argue that senior scientists should “grab the pen (keyboard) more often,” as writing “is essential to ones results and to harbour new ideas.” [Goetze, Jens P.; Rehfeld, Jens F. The men who stare at science. Cardiovascular Endocrinology 2015; published ahead of print.]

[Cartoon: Postdoc workload – “Just work till midnight; you need to relax too.”]

Historical overview

Taking a broad look at the history of authorship, even going back to the Classical period, you can see how ideas of authorship have only recently become intertwined with ideas of ownership and uniqueness (see The origins of our current understanding of authorship). In Laws, Plato argues that we should “eliminate everything we mean by the word ownership,” which includes intellectual property. Plato rejected the notion of uniqueness and believed that new knowledge is something that we relearn. [Hamilton E, and Cairns H (Translators). Plato. The Collected Dialogues: Including the Letters. Princeton, New Jersey: Princeton University Press; 1961.]

Not every Classical author shared this belief, however, and some took more credit for their work. Herodotus, for example, starts his famous Histories by mentioning that “Herodotus, from Halicarnassus, here displays his enquiries.” [Holland T (Translator). Herodotus: The Histories. London: Penguin Classics; 2013.]

Herodotus is keen to outline clear rules regarding the correct citation of sources, but in the Classical period plagiarism was common as authors and orators shared the same sources and borrowed from one another.

[Anderson J. Plagiarism, Copyright Violation and Other Thefts of Intellectual Property: An Annotated Bibliography with a Lengthy Introduction. Jefferson, North Carolina and London: McFarland & Company, Inc., Publishers; 1998.]

Current Understanding of Authorship

During the Renaissance, the idea of an author’s ownership of a text came into being, particularly with the Statute of Anne (1710), which conferred ownership on authors rather than publishers; it is no surprise that this development coincided with the rise of the printing press. This early form of copyright did not apply to content, but it was an important step toward the idea of intellectual property developed in the Romantic period. The Romantic Movement emphasized the importance of the individual, which led to intellectual and creative copyright laws being consolidated during the 19th century. [Velagic Z, Hasenay D. Understanding textual authorship in the digital environment: lessons from historical perspectives. Proceedings of the Eighth International Conference on Conceptions of Library and Information Science, Copenhagen, Denmark; 2013]

It wasn’t until postmodernist critiques of literary theory, in the middle of the 20th century, that ideas of individualism were challenged. In particular, Roland Barthes rejected the Romantic idea of individualism and ownership. In Barthes’ now infamous essay “The Death of the Author” (1967), he argued that authorial intention should be separated from the text. Barthes decentred the author, going against the traditional theory that an author’s history and experience could be used to enrich our understanding of his or her work.

Current author trends

The debate over authorship and contributorship was reignited in March 2015, when G3: Genes|Genomes|Genetics published a paper on the genomics of the fruit fly with over 1,000 listed authors.

[Leung, W. et al. Drosophila Muller F Elements Maintain a Distinct Set of Genomic Properties Over 40 Million Years of Evolution. G3: Genes|Genomes|Genetics. 2015.]

According to Barthes’ theory, if the “author” is simply representative of his or her institution, or academic background, why not include all those directly involved in its creation? 

[Woolston, C. Fruit-fly paper has 1,000 authors. Nature. 2015.]

Each undergraduate student contributed to the analysis of data, which is one of the major tenets of authorship according to the ICMJE. If we understand the author as the progenitor of the article, then it follows that each person listed as a co-author contributed to the authorship of the paper, however small the contribution. To take this one step further, the identity of each co-author eventually becomes subsumed into the first author when the paper is cited as “W. Leung et al.”, and the number of contributors becomes incidental because of how papers are traditionally cited with the use of “et al.”

Throughout history, writing has commonly been regarded as an individual act. People like to associate one paper, or idea, with one name. Examples of this include Edward Jenner and the production of the first vaccines, Alexander Fleming and the discovery of penicillin, and Marie Curie and the development of radiotherapy. In recent years, however, as scientific papers are increasingly authored through collaborative efforts rather than individuals, this has opened up the dilemma of first authorship. In 1996 it was suggested that the tradition of citing authors should be restructured to parallel film credits and create a hierarchy of authorship, contributors, and acknowledgements.

[Godlee F; Definition of “authorship” may be changed; BMJ. 1996 Jun 15;312(7045):1501-2.]

This concept would not redefine authorship but instead recognize important contributions in another way. While this idea is attractive, it doesn’t solve the problem of who to list as an author and who to list as a contributor.



Let’s Build a Culture of Integrity Instead!


One potential solution was recently proposed by BioMed Central: “Author Contributorship Badges,” a method of showing the exact contribution each author made to a paper. [BioMed Central first publisher to implement Author Contributorship Badges, a new system which improves how publishers credit scientists. BioMed Central. 2015]

BioMed Central chose to roll this scheme out in its open-access, open-data journal, GigaScience. All papers published from October 1, 2015, will include the badge system (see First paper published by BioMed Central with Author Contributorship Badges). While authors are still listed in the traditional format, a link to the “Open Badges” appears on the website, and ten potential roles in the creation of an article are represented by ten badges, such as “Data Curation,” “Methodology,” and “Writing Review.”


BioMed Central Implements Author Contributorship Badges


Each badge has a list of authors who contributed to that specific role, and an author can be listed under more than one role. Amye Kenall, associate publisher at BioMed Central, states: “Author Contributorship Badges enable people and organisations to capture the types of skills, knowledge and behaviours that we value, but often find difficult to recognise with traditional credentials.”

The badge system embraces the ICMJE definition of authorship in a refreshing format. Each point in the ICMJE definition has at least one badge. Should it prove successful, the badge system could be a significant turning point in how authors and publishers define authorship.
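Structurally, the badge scheme amounts to a many-to-many mapping between contribution roles and authors. A hypothetical sketch of that structure (author names and role assignments are invented; only the badge names come from the article, and this is not GigaScience’s actual data format):

```python
# Hypothetical badge assignments: each badge (role) maps to the
# authors credited with it, and one author may hold several badges.
badges = {
    "Data Curation": ["Author A", "Author B"],
    "Methodology": ["Author B", "Author C"],
    "Writing Review": ["Author D"],
}

def roles_of(author: str, badge_map: dict) -> list:
    """Return every badge under which the given author is listed."""
    return [role for role, names in badge_map.items() if author in names]

print(roles_of("Author B", badges))  # ['Data Curation', 'Methodology']
```

The inversion (from author-ordered byline to role-ordered credit) is what makes individual contributions visible at a glance.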


GWATCH: a web platform for automated gene association discovery analysis

Svitin, A., Malov, S., Cherkasov, N., Geerts, P., Rotkevich, M., Dobrynin, P., & … O’Brien, S. J. (2014). GWATCH: a web platform for automated gene association discovery analysis. GigaScience, 3:18. doi:10.1186/2047-217X-3-18

The future of authorship

One of the most significant changes in the publishing industry has been the shift toward digital media and the steady decline of print. Authors are no longer being asked to write a finite article for a journal. For example, when an author contributes an article to a journal, the article will be published in the print and digital versions, shared on social media, and potentially used in promotional material.

This notion of multiple destinations is even more evident when considering blogs. When an author writes a blog, he or she is writing with the knowledge that the work can be shared, critiqued, and linked in numerous ways, making it not just a blog post or a text but part of a huge textual network.

A text is no longer a finished article; it is an “ongoing conversation,” a fluid movement with a number of versions and stages. [Fitzpatrick, K; The Digital Future of Authorship: Rethinking Originality; Culture Machine; 2011, Vol 12]

It lives under the assumption that any text, whether online-only or complementary to a print component, should constantly be changing. When a text is put on the Internet, particularly in blog form, it is immediately visible for public consumption and critique. A blog creates a forum for all views, and the result combines numerous views on one topic while adding commentary to create a new text.

The fluid nature of blogs and other online formats has introduced the idea of “versioning.”

This is traditionally defined as “the creation and management of multiple releases of a product, all of which have the same general function but are improved, upgraded, or customized.”

[Versioning Definition. 2007.]

The same, or an alternate, author takes an article and makes changes. He or she adds to it, improving it and creating a timelier and more informative piece; more authors can also be added to the text.

Versioning allows readers to see a scientific process not just through the words of a text, but through the progression of the text itself. With this change of process, the act of writing becomes less about the act itself, or the completion of a piece of work, and more about development and discovery. This, in turn, could mean that authors will no longer be defined by specific works, but by one work as a whole.
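The versioning idea above can be pictured as an append-only revision history in which each release records its contributors, so the progression of the text, and of its authorship, stays visible. A toy illustration, not any real platform’s data model:

```python
from dataclasses import dataclass, field

@dataclass
class Version:
    number: int
    authors: list
    summary: str

@dataclass
class Article:
    title: str
    history: list = field(default_factory=list)

    def revise(self, authors, summary):
        # Revisions are appended, never overwritten, so the
        # "ongoing conversation" remains inspectable.
        self.history.append(Version(len(self.history) + 1, authors, summary))

    def contributors(self):
        # All authors across all versions, in order of first appearance.
        seen = []
        for version in self.history:
            for author in version.authors:
                if author not in seen:
                    seen.append(author)
        return seen

post = Article("On Authorship")
post.revise(["Author A"], "initial draft")
post.revise(["Author A", "Author B"], "expanded with reader commentary")
print(post.contributors())  # ['Author A', 'Author B']
```

Because the history is never collapsed, the question “who is an author?” turns into a query over the revision log rather than a one-time byline decision.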

However, the prospect of a more fluid style of writing and authorship will inevitably raise a number of potential problems, chief among them plagiarism. The traditional safeguard is that all those involved in the writing of a paper are named as authors, with due credit given for anything they may have borrowed or used in their text. With a more fluid, ever-evolving text, plagiarism (whether intentional or unintentional) becomes difficult to police and perhaps unavoidable. The idea of a constantly reworked text also raises a number of questions about the validity of the work and the contributions of different authors –

are the authors involved sufficiently in the work to be considered as such?

Could they be considered as “curators” instead?

Is the work more about quantity than quality?

Who is chosen as the “first author” after so many changes to a paper?

How will original authors feel about their works being up for adaptation and public consumption?

Most importantly, with articles constantly changing, how will publishers and readers assure their legitimacy?

As we move further into the digital age, these questions require discussion in order to redefine the concept of authorship. In many ways, we are trying to embrace the new freedoms that digital media allows, maintain strong traditions in print, and identify the most modern definition of authorship all at once. Although the “Author Contributorship Badges” offer an appealing solution, it is, after all, online-only. What is certain is the need for the academic and publishing communities to continue their discussion of the definition of authorship, ensuring clarity and flexibility in an increasingly digital age. In the meantime, the ICMJE guidelines provide a definition of authorship that guarantees recognition, both by authors and for authors. In time, they will surely be modified to reflect digital trends, but for now, they clearly delineate what it means to be an author.

Reference: Authorship: An Evolving Concept. Steph Fairbairn, Leanne Kelly, Selina Mahar, and Reinier Prosée, editorial coordinators, Health Learning, Research & Practice, Wolters Kluwer.