There is nothing more GLOBAL WALL STREET than creating yet another QUASI-GOVERNMENTAL agency to take control of all public policy discussion from WE THE PEOPLE----now for our public K-12 schools. No one is more raging far-right wing global Wall Street than Senator Ferguson, who represents a poor, black community in Baltimore being harmed most by these global Wall Street education policies.
As we discuss this week public policy around THE STUDY OF THE HUMAN BRAIN-----this is ground zero for the pathway our human brain development takes, because the global 1% are starting global corporate education at pre-K-----at 2-3 years old-----controlling the earliest formation of thought and behavior in our children. As a CLINTON/OBAMA global 5% Wall Street player, FERGUSON couldn't care less about the 99% of people----
he is a SHOW ME THE MONEY MAN---GET RID OF GLOBAL WALL STREET POLS AND PLAYERS.
Maryland media fills their air time with reports making all these far-right wing pols sound left social progressive-----when they are SOCIAL PROGRESSIVE POSING.
TEDCO is that quasi-governmental agency that now controls all education public policy just as MARYLAND PUBLIC SERVICES COMMISSION has all power over our ONCE PUBLIC UTILITIES. As with the Maryland Public Services Commission, TEDCO is filled with corporate appointments that push all policies written by global Wall Street so passing these two bills pushed by our Baltimore City pols running as DEMOCRATS kills the voice of 99% of citizens in education policy discussion.
TEDCO Investment in Universities:
What TEDCO investment in universities does is officially make our once PUBLIC AND NON-PROFIT PRIVATE UNIVERSITIES into global corporate RESEARCH AND DEVELOPMENT DEPARTMENTS. Global Wall Street now owns our university students and all their academic work thanks to global Wall Street pols in Maryland. Below we see our pensions being used to fund these global corporate R and D projects----subsidizing global Wall Street profits when our state and city pensions used to subsidize our local communities' services, programs, and development. Our Maryland Clinton/Bush/Obama pols say---NO, ALL SUBSIDY IN BALTIMORE MUST BOOST GLOBAL WALL STREET PROFIT-----
Pension System Making Investments:
Citizens shouting against the loss of voice in education policies K-university---these are the bills passed that take our 99% of voices out of public policy discussion----get rid of pols that vote for these kinds of bills. TECHNICAL.LY is a media outlet promoting global technology ------not local citizen-based small business jobs and services.
Apr. 12, 2016 12:57 pm
How 4 tech-related bills fared in the 2016 Maryland General Assembly session
Tax credits for angel investors and incubators once again failed. Here's the full rundown.
With Sine Die falling at midnight, the 2016 Maryland General Assembly session is in the books. While this year’s session didn’t have the drama that ridesharing regulations created in 2015, bills to help tech and innovation were still on the docket.
The Greater Baltimore Committee (GBC) flagged a handful throughout the session, and here’s how they fared:
1. Angel Investor Tax Credit:
- Did Not Pass
2. One Maryland Tax Credit:
- Did Not Pass
3. TEDCO Investment in Universities:
4. Pension System Making Investments:
This is exactly what SMART PHONES are doing, and that hacking is not for the benefit of 99% of citizens---it not only modifies our human behavior, it taps into every action we make all day long---recording it for MEGA-DATA. This then creates ARTIFICIAL INTELLIGENCE databases, growing the ability of robotics to replace humans in all areas----even critical thinking and those much needed NERDS AND GENIUSES.
Americans must WAKE UP to the fact that global Wall Street is far-right corporate fascist, meaning they have no morals, ethics, values, or care for humanity----they are only able to think of maximizing corporate profits and power. A nation built on freedom and liberty for citizens has the OPPOSITE as elected officials.
What is "brain hacking"? Tech insiders on why you should care
Silicon Valley is engineering your phone, apps and social media to get you hooked, says a former Google product manager. Anderson Cooper reports
- 2017 Apr 09
- Correspondent Anderson Cooper
Have you ever wondered if all those people you see staring intently at their smartphones -- nearly everywhere, and at all times -- are addicted to them? According to a former Google product manager you are about to hear from, Silicon Valley is engineering your phone, apps and social media to get you hooked. He is one of the few tech insiders to publicly acknowledge that the companies responsible for programming your phones are working hard to get you and your family to feel the need to check in constantly. Some programmers call it “brain hacking” and the tech world would probably prefer you didn’t hear about it. But Tristan Harris openly questions the long-term consequences of it all and we think it’s worth putting down your phone to listen.
Tristan Harris, a former Google product manager
Tristan Harris: This thing is a slot machine.
Anderson Cooper: How is that a slot machine?
Tristan Harris: Well every time I check my phone, I’m playing the slot machine to see, “What did I get?” This is one way to hijack people’s minds and create a habit, to form a habit. What you do is you make it so when someone pulls a lever, sometimes they get a reward, an exciting reward. And it turns out that this design technique can be embedded inside of all these products.
The rewards Harris is talking about are a big part of what makes smartphones so appealing. The chance of getting likes on Facebook and Instagram. Cute emojis in text messages. And new followers on Twitter.
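The "slot machine" mechanic Harris describes is, in behavioral terms, a variable-ratio reward schedule: the payoff is unpredictable, which conditions checking far more strongly than a fixed reward would. A minimal Python sketch of the idea (purely illustrative; the probability and reward names here are invented, not any company's actual code):

```python
import random

def check_app(rng, reward_probability=0.3):
    """One 'pull of the lever': sometimes an exciting reward, usually nothing."""
    if rng.random() < reward_probability:
        # The reward itself is trivial; its unpredictability is what hooks you.
        return rng.choice(["3 new likes", "1 new follower", "a cute emoji"])
    return None

rng = random.Random(0)  # seeded so the run is repeatable
results = [check_app(rng) for _ in range(10)]
print(sum(r is not None for r in results), "rewards in 10 checks")
```

Because the user can never predict which check will pay off, every check feels potentially rewarding, which is exactly the habit loop Harris compares to a slot machine.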
Tristan Harris: There’s a whole playbook of techniques that get used to get you using the product for as long as possible.
Anderson Cooper: What kind of techniques are used?
“...every time I check my phone, I’m playing the slot machine to see, ‘What did I get?’ This is one way to hijack people’s minds and create a habit, to form a habit.” Tristan Harris

Tristan Harris: So Snapchat’s the most popular messaging service for teenagers. And they invented this feature called “streaks,” which shows the number of days in a row that you’ve sent a message back and forth with someone. So now you could say, “Well, what’s the big deal here?” Well, the problem is that kids feel like, “Well, now I don’t want to lose my streak.” But it turns out that kids actually when they go on vacation are so stressed about their streak that they actually give their password to, like, five other kids to keep their streaks going on their behalf. And so you could ask when these features are being designed, are they designed to most help people live their life? Or are they being designed because they’re best at hooking people into using the product?
Anderson Cooper: Is Silicon Valley programming apps or are they programming people?
Tristan Harris: Inadvertently, whether they want to or not, they are shaping the thoughts and feelings and actions of people. They are programming people. There’s always this narrative that technology’s neutral. And it’s up to us to choose how we use it. This is just not true.
Anderson Cooper: Technology’s not neutral?
Tristan Harris: It’s not neutral. They want you to use it in particular ways and for long periods of time. Because that’s how they make their money.
It’s rare for a tech insider to be so blunt, but Tristan Harris believes someone needs to be. A few years ago he was living the Silicon Valley dream. He dropped out of a master’s program at Stanford University to start a software company. Four years later Google bought him out and hired him as a product manager. It was while working there he started to feel overwhelmed.
Tristan Harris: Honestly, I was just bombarded in email and calendar invitations and just the overload of what it’s like to work at a place like Google. And I was asking, “When is all of this adding up to, like, an actual benefit to my life?” And I ended up making this presentation. It was kind of a manifesto. And it basically said, you know, “Look, never before in history have a handful of people at a handful of technology companies shaped how a billion people think and feel every day with the choices they make about these screens.”
“Inadvertently, whether they want to or not, they are shaping the thoughts and feelings and actions of people. They are programming people.” Tristan Harris

His 144-page presentation argued that the constant distractions of apps and emails are “weakening our relationships to each other,” and “destroying our kids’ ability to focus.” It was widely read inside Google, and caught the eye of one of the founders, Larry Page. But Harris told us it didn’t lead to any changes, and after three years he quit.
Tristan Harris: And it’s not because anyone is evil or has bad intentions. It’s because the game is getting attention at all costs. And the problem is it becomes this race to the bottom of the brainstem, where if I go lower on the brainstem to get you, you know, using my product, I win. But it doesn’t end up in the world we want to live in. We don’t end up feeling good about how we’re using all this stuff.
Anderson Cooper: You call this a “race to the bottom of the brain stem.” It’s a race to the most primitive emotions we have? Fear, anxiety, loneliness, all these things?
Tristan Harris: Absolutely. And that’s again because in the race for attention I have to do whatever works.
Tristan Harris: It absolutely wants one thing, which is your attention.
Now he travels the country trying to convince programmers and anyone else who will listen that the business model of tech companies needs to change. He wants products designed to make the best use of our time, not just grab our attention.
Anderson Cooper: Do you think parents understand the complexities of what their kids are dealing with, when they’re dealing with their phone, dealing with apps and social media?
Tristan Harris: No. And I think this is really important. Because there’s a narrative that, “Oh, I guess they’re just doing this like we used to gossip on the phone,” but what this misses is that your telephone in the 1970s didn’t have a thousand engineers on the other side of the telephone who were redesigning it to work with other telephones and then updating the way your telephone worked every day to be more and more persuasive. That was not true in the 1970s.
Anderson Cooper: How many Silicon Valley insiders are there speaking out like you are?
Tristan Harris: Not that many.
We reached out to the biggest tech firms but none would speak on the record, and some didn’t even return our phone calls. Most tech companies say their priority is improving user experience, something they call “engagement.” But they remain secretive about what they do to keep people glued to their screens. So we went to Venice, California, where the body builders on the beach are being muscled out by small companies that specialize in what Ramsay Brown calls “brain hacking.”
Anderson Cooper speaks with Ramsay Brown, the cofounder of Dopamine Labs
Ramsay Brown: A computer programmer who now understands how the brain works knows how to write code that will get the brain to do certain things.
Ramsay Brown studied neuroscience before co-founding Dopamine Labs, a start-up crammed into a garage. The company is named after the dopamine molecule in our brains that aids in the creation of desire and pleasure. Brown and his colleagues write computer code for apps used by fitness companies and financial firms. The programs are designed to provoke a neurological response.
“A computer programmer who now understands how the brain works knows how to write code that will get the brain to do certain things.” Ramsay Brown

Anderson Cooper: You’re trying to figure out how to get people coming back to use the screen?
Ramsay Brown: When should I make you feel a little extra awesome to get you to come back into the app longer?
The computer code he creates finds the best moment to give you one of those rewards, which have no actual value, but Brown says trigger your brain to make you want more. For example, on Instagram, he told us sometimes those likes come in a sudden rush.
Ramsay Brown: They’re holding some of them back for you to let you know later in a big burst. Like, hey, here’s the 30 likes we didn’t mention from a little while ago. Why that moment--
Anderson Cooper: So all of a sudden you get a big burst of likes?
Ramsay Brown: Yeah, but why that moment? There’s some algorithm somewhere that predicted, hey, for this user right now who is experimental subject 79B3 in experiment 231, we think we can see an improvement in his behavior if you give it to him in this burst instead of that burst.
When Brown says “experiments,” he’s talking generally about the millions of computer calculations being used every moment by his company and others to constantly tweak your online experience and make you come back for more.
Ramsay Brown: You’re part of a controlled set of experiments that are happening in real time across you and millions of other people.
Anderson Cooper: We’re guinea pigs?
Ramsay Brown: You’re guinea pigs. You are guinea pigs in the box pushing the button and sometimes getting the likes. And they’re doing this to keep you in there.
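Brown's description of likes being held back and then delivered together can be sketched as a simple notification queue. This is an assumed design for illustration only; `BurstNotifier` and its fixed threshold are invented, not Instagram's real (and presumably algorithmically timed) batching logic:

```python
from collections import deque

class BurstNotifier:
    """Silently queue incoming likes, then release them all at once."""

    def __init__(self, burst_size=5):
        self.burst_size = burst_size  # how many likes to hold back before the burst
        self.pending = deque()

    def on_like(self, who):
        self.pending.append(who)
        if len(self.pending) >= self.burst_size:
            return self.flush()  # the sudden "rush" of likes
        return []  # the user sees nothing yet

    def flush(self):
        burst = list(self.pending)
        self.pending.clear()
        return burst

notifier = BurstNotifier(burst_size=3)
notifier.on_like("ana")         # silent
notifier.on_like("ben")         # silent
print(notifier.on_like("cam"))  # prints ['ana', 'ben', 'cam']
```

Per Brown's account, the real systems replace the fixed `burst_size` with a per-user prediction of the moment a burst will most increase engagement.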
The longer we look at our screens, the more data companies collect about us, and the more ads we see. Ad spending on social media has doubled in just two years to more than $31 billion.
Ramsay Brown: You don’t pay for Facebook. Advertisers pay for Facebook. You get to use it for free because your eyeballs are what’s being sold there.
Anderson Cooper: That’s an interesting way to look at it, that you’re not the customer for Facebook.
“You don’t pay for Facebook. Advertisers pay for Facebook. You get to use it for free because your eyeballs are what’s being sold there.” Ramsay Brown

Ramsay Brown: You’re not the customer. You don’t sign a check to Facebook. But Coca-Cola does.
Brown says there’s a reason texts and Facebook use a continuous scroll, because it’s a proven way to keep you searching longer.
Ramsay Brown: You spend half your time on Facebook just scrolling to find one good piece worth looking at. It’s happening because they are engineered to become addictive.
Anderson Cooper: You’re almost saying it like there’s an addiction code.
Ramsay Brown: Yeah, that is the case. That since we’ve figured out, to some extent, how these pieces of the brain that handle addiction are working, people have figured out how to juice them further and how to bake that information into apps.
Larry Rosen: Dinner table could be a technology-free zone.
While Brown is tapping into the power of dopamine, psychologist Larry Rosen and his team at California State University Dominguez Hills are researching the effect technology has on our anxiety levels.
Larry Rosen: We’re looking at the impact of technology through the brain.
Rosen told us when you put your phone down, your brain signals your adrenal gland to produce a burst of a hormone called cortisol, which has an evolutionary purpose. Cortisol triggers a fight-or-flight response to danger.
Anderson Cooper: How does cortisol relate to a mobile device, a phone?
Larry Rosen: What we find is the typical person checks their phone every 15 minutes or less, and half of the time they check their phone there is no alert, no notification. It’s coming from inside their head telling them, “Gee, I haven’t checked in on Facebook in a while. I haven’t checked on this Twitter feed for a while. I wonder if somebody commented on my Instagram post.” That then generates cortisol, and it starts to make you anxious. And eventually your goal is to get rid of that anxiety, so you check in.
So the same hormone that made primitive man anxious and hyperaware of his surroundings to keep him from being eaten by lions is today compelling Rosen’s students and all of us to continually peek at our phones to relieve our anxiety.
Larry Rosen: When you put the phone down you don’t shut off your brain, you just put the phone down.
Anderson Cooper: Can I be honest with you right now? I haven’t paid attention to what you’re saying because I just realized my phone is right down by my right foot and I haven’t checked it in, like 10 minutes.
Larry Rosen: And it makes you anxious.
Anderson Cooper: I’m a little anxious.
A computer tracks minute changes in Anderson Cooper’s heart rate and perspiration
Larry Rosen: Yes.
We found out just how anxious in this experiment conducted by Rosen’s research colleague Nancy Cheever.
Nancy Cheever: So the first thing I’m going to do is apply these electrodes to your fingers.
While I watched a video, a computer tracked minute changes in my heart rate and perspiration. What I didn’t know was that Cheever was sending text messages to my phone which was just out of reach. Every time my text notification went off, the blue line spiked – indicating anxiety caused in part by the release of cortisol.
Nancy Cheever: Oh, that one is…that’s a huge spike right there. And if you can imagine what that’s doing to your body. Every time you get a text message you probably can’t even feel it, right? Because it’s such a, um, small amount of arousal.
Anderson Cooper: That’s fascinating.
Their research suggests our phones are keeping us in a continual state of anxiety in which the only antidote is the phone.
Anderson Cooper: Is it known what the impact of all this technology use is?
Larry Rosen: Absolutely not.
Anderson Cooper: It’s too soon.
Larry Rosen: We’re all part of this big experiment.
Anderson Cooper: What is this doing to a young mind or a teenager?
Larry Rosen: Well there’s some projects going on where they’re actually scanning teenagers’ brains over a 20-year period and looking to see what kind of changes they’re finding.
Gabe Zichermann: Here’s the reality. Corporations and creators of content have, since the beginning of time, wanted to make their content as engaging as possible.
Gabe Zichermann has worked with dozens of companies – including Apple and CBS – to make their online products more irresistible. He’s best known in Silicon Valley for his expertise in something called “gamification,” using techniques from video games to insert fun and competition into almost everything on your smartphone.
Gabe Zichermann: So one of the interesting things about gamification and other engaging technologies, is at the same time as we can argue that the neuroscience is being used to create dependent behavior those same techniques are being used to get people to work out, you know, using their Fitbit. So all of these technologies, all the techniques for engagement can be used for good, or can be used for bad.
“Asking technology companies, asking content creators to be less good at what they do feels like a ridiculous ask.” Gabe Zichermann

Zichermann is now working on software called ‘Onward’ designed to break users’ bad habits. It will track a person’s activity and can recommend they do something else when they’re spending too much time online.
Gabe Zichermann: I think creators have to be liberated to make their content as good as possible.
Anderson Cooper: The idea that a tech company is not going to try to make their product as persuasive, as engaging as possible, you’re just saying that’s not gonna happen?
Gabe Zichermann: Asking technology companies, asking content creators to be less good at what they do feels like a ridiculous ask. It feels impossible. And also it’s very anti-capitalistic, this isn’t the system that we live in.
Ramsay Brown and his garage start-up Dopamine Labs made a habit-breaking app as well. It’s called “Space” and it creates a 12-second delay -- what Brown calls a “moment of Zen” before any social media app launches. In January, he tried to convince Apple to sell it in their App Store.
Ramsay Brown: And they rejected it from the App Store because they told us any app that would encourage people to use other apps or their iPhone less was unacceptable for distribution in the App Store.
Anderson Cooper: They actually said that to you?
Ramsay Brown: They said that to us. They did not want us to give out this thing that was gonna make people less stuck on their phones.
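The idea behind "Space", a fixed pause inserted before a social app actually opens, is simple enough to sketch. This is a hypothetical reconstruction of the concept only; the function name and parameters are invented, not the app's real code:

```python
import time

def delayed_launch(app_name, delay_seconds=12, sleep=time.sleep):
    """Pause before launching, giving the user a moment to reconsider."""
    print(f"Taking a breath before opening {app_name}...")
    sleep(delay_seconds)  # the 12-second "moment of Zen"
    return f"{app_name} launched"

# delay_seconds=0 here just so the demo returns immediately
print(delayed_launch("Instagram", delay_seconds=0))
```

The `sleep` parameter is injectable so the delay can be replaced in testing; the design simply interposes friction between the impulse to open the app and the reward of opening it.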
For those not knowing, the National Science Foundation drive in research funding surrounding a DECADE OF THE BRAIN-----at the end of Bush Sr and the beginning of Clinton-----dedicated much funding to mapping out the human brain and its functions. Medical scientists like myself loved the discoveries, knowing these sciences would provide much needed medical relief to citizens tied to neurological disease vectors. What we saw were advances in common disorders from Alzheimer's to Parkinson's-----from various kinds of depression to PTSD brain injuries. All Federally funded, bringing more benefit to WE THE PEOPLE in our health outcomes.
What we are seeing from the Affordable Care Act is the opposite-----all those common disease vectors like Alzheimer's and Parkinson's are being declared COSTLY TO TREAT-----while the medical products tied to mental health and alcohol and drug abuse are being funded heavily through the ACA. As US citizens lose more and more ability to be fully insured with the best of health insurance, they do not access the fruits of all this Federal BRAIN RESEARCH. We are told now that it is not pragmatic for Americans to have a life span so long it brings with it these degenerative neurological disease vectors.
We shared an article that stated citizens in states across the nation are not seeing that Affordable Care Act heavy funding for mental illness as was sold by Congress----where did that money go?
We are seeing much of BRAIN RESEARCH controlled by DOD-----Homeland Security-----being used more for that MIND CONTROL OF ARTIFICIAL INTELLIGENCE and Trans-humanism.
The article below is long, but please glance through to see what our DECADE OF THE BRAIN national science funding targeted-----and compare how these goals are unfolding in our Affordable Care Act vs what we are seeing in the goals of ARTIFICIAL INTELLIGENCE.
Friday, February 26, 2010
A Decade after The Decade of the Brain
In this seven-part series, directors of neuroscience-related institutes at the National Institutes of Health examine how brain research has progressed since 2000—the decade after The Decade of the Brain. Here in part one, we hear from Dr. Nora D. Volkow of the National Institute on Drug Abuse.
In 1990, Congress designated the 1990s the “Decade of the Brain.” President George H. W. Bush proclaimed, “A new era of discovery is dawning in brain research.” During the ensuing decade, scientists greatly advanced our understanding of the brain.
The editors of Cerebrum asked the directors of seven brain-related institutes at the National Institutes of Health (NIH) to identify the biggest advances, greatest disappointments, and missed opportunities of brain research in the past decade—the decade after the “Decade of the Brain.” We also asked them what looks most promising for the coming decade, the 2010s. Our experts focused on research that might change how doctors diagnose and treat human brain disorders.
We hear from Nora D. Volkow, director of the National Institute on Drug Abuse; Thomas R. Insel, director of the National Institute of Mental Health; Story Landis, director of the National Institute of Neurological Disorders and Stroke; Kenneth R. Warren, acting director of the National Institute on Alcohol Abuse and Alcoholism; Paul A. Sieving, director of the National Eye Institute; James F. Battey, director of the National Institute on Deafness and Other Communication Disorders; and Richard J. Hodes, director of the National Institute on Aging.
Challenges and Opportunities in Drug Addiction Research
By Nora D. Volkow, M.D., National Institute on Drug Abuse
Neuroscience is at a historic turning point. Today, a full decade after the “Decade of the Brain,” a continuous stream of advances is shattering long-held notions about how the human brain works and what happens when it doesn’t. These advances are also reshaping the landscapes of other fields, from psychology to economics, education and the law.
Until the Decade of the Brain, scientists believed that, once development was over, the adult brain underwent very few changes. This perception contributed to polarizing perspectives on whether genetics or environment determines a person’s temperament and personality, aptitudes, and vulnerability to mental disorders. But during the past two decades, neuroscientists have steadily built the case that the human brain, even when fully mature, is far more plastic—changing and malleable—than we originally thought.1 It turns out that the brain (at all ages) is highly responsive to environmental stimuli and that connections between neurons are dynamic and can rapidly change within minutes of stimulation.
Neuroplasticity is modulated in part by genetic factors and in part by dynamic, epigenetic changes that influence the expression of genes without changing the DNA sequence. Epigenetic processes are of particular clinical interest because their external triggers (such as early parental care, diet, drug abuse and stress) can affect a person’s vulnerability to many diseases, including psychiatric disorders. In addition, in contrast to genetic sequence differences, epigenetic alterations are potentially reversible, and thus amenable to public health policy interventions.
It also has become increasingly clear that the human brain is particularly sensitive to social stimuli, which likely has accelerated the rate of human brain evolution. Humans have evolved a complex neuronal circuitry in large areas in the brain to process complex social information (such as predicting others’ reactions and emotions) and to respond appropriately. New research has revealed that social stimuli (such as parenting style and early-life stress) can epigenetically modify the expression of genes that influence brain morphology and function including the sensitivity of an individual to stressful stimuli.2 In the future, this knowledge will enable us to tailor personalized prevention interventions that are based on information on how genetics and epigenetics affect brain function and behavior. For example, a recent study showed that a prevention intervention based on improving parenting style reduced the risk for substance use disorders only in adolescents with a particular variant of a gene that recycles the chemical serotonin back into the neurons, which is a variant that results in greater sensitivity to social adversity.3
In the coming decade, insights about what underlies neuroplasticity, combined with technological advances that allow us to “see” with greater precision the human brain in action, are bound to revolutionize the way we view learning and the methods we use to educate young people. New research will also show us how to help people overcome or compensate for many of the deficits associated with drug abuse, addiction and other mental disorders.4
For example, scientists are using imaging technologies in neurofeedback programs that train people to voluntarily recalibrate their neural activity in specific areas of the brain, allowing them to gain unprecedented control over, for example, pain perception5 or emotional processing.6 During drug addiction treatment, this approach could greatly reduce the risk of relapse by enabling a patient to control the powerful cravings triggered by a host of cues (e.g., people, things, places) that have become tightly linked, in the brain of the user, to the drug experience.
Other promising advances stem from ongoing research and development of direct communication pathways between a brain and external computer devices, the so-called brain-computer interfaces (BCI). In a recent study, one version of BCI appeared to help paralyzed stroke victims regain some movement control.7 In the next decade, forms of BCI might help people with a variety of neuropsychiatric conditions that have proved resistant to traditional treatments. For example, early evidence suggests that BCI training could benefit patients with epilepsy or attention-deficit/hyperactivity disorder (ADHD) that is unresponsive to drugs.8
As we build on these rapid advances in neuroscience research, we must keep a watchful eye on their vast social and political implications.
For example, neurologists have started to uncover the molecular components and neural circuitry that underlie the learning process.9 We also are learning how to use transcranial magnetic stimulation (TMS), a noninvasive method to modulate the activity within a neural circuit, more effectively.10 Should we use this knowledge to better educate young people and teach new skills to seniors, or should we use these tools only to treat people with neuropsychiatric disorders? As we begin to understand how parenting styles affect the development and function of the brain, how far should we go to protect children from the long-term and deleterious effects of bad parenting?
Recent progress in brain research and associated fields has been impressive, and we are sure to witness further acceleration in the pace of neuroscientific discovery in the next couple of decades. Indeed, we are entering a new era in which our technologies are beginning to affect our lives in profound ways. We are bound to recast our relationship with our brains and, in the process, to redraw the boundaries of human evolution.
Understanding Mental Disorders as Circuit Disorders
Thomas R. Insel, M.D., National Institute of Mental Health
When the Decade of the Brain began in 1990, scientists had developed both drug and behavioral treatments for most mental disorders, but their understanding of these disorders was primitive. Two decades later, neuroscientists are finally uncovering the brain processes involved in mental disorders. There is great promise for development of more effective treatments in the upcoming decade.
In 1990, most theories of the causes of mental disorders were based on investigations of treatments, rather than on scientific insight about how diseases arise. By 2000, we had developed more treatments—including best-selling second-generation antipsychotics and antidepressants—but we were no further along in our understanding of the causes. During the so-called Decade of the Brain, there was neither a marked increase in the rate of recovery from mental illness, nor a detectable decrease in suicide or homelessness—each of which is associated with a failure to recover from mental illness. To reduce the occurrence and death toll of mental disorders, we will need a more thorough understanding of why these mysterious illnesses occur.
People frequently cite the 1990s as the era for redefining mental disorders as brain disorders. While this conceptual shift was important, we now realize the greater importance of developing new tools: imaging techniques for quantitative studies of brain structure, function and chemistry, as well as other comprehensive tools for mapping DNA and RNA. What do we mean by comprehensive? Rather than focusing on four or five neurotransmitters, researchers at the turn of the 21st century were able to investigate thousands of genes to yield an unbiased survey of the biology of mental disorders. These advances ushered in a decade of discovery that brings us to 2010.
If scientists introduced mental disorders as brain disorders in the Decade of the Brain, researchers in the past ten years have demonstrated the importance of specific brain circuits. Unlike neurological disorders, which often involve areas of tissue damage or cell loss, mental disorders have begun to appear more like circuit disorders, with abnormal conduction between brain areas rather than loss of cells.
Neuroimaging technology has revealed that specific brain pathways, mostly located in the prefrontal cortex, are involved in major mental disorders. Deep brain stimulation, a procedure in which neurologists manipulate certain pathways via electric current, has shown promise as a treatment for depression and obsessive-compulsive disorder, on the heels of its successful use as a treatment for neurological motor disorders such as Parkinson’s. In the past couple of years, via a new technology called optogenetics, neuroscientists have used light to manipulate circuits in experimental animals with millisecond precision and cellular resolution. Thus, for the first time, researchers can conduct specific tests of theories about brain circuits and behavior.
What causes a circuit disorder? Although this will be a major question for the next decade, we already have some intriguing ideas. Mental disorders such as schizophrenia and mood and anxiety disorders are mostly diseases of early life; their onset tends to occur during adolescence or early adulthood, when the brain is still developing. For example, a person with schizophrenia usually experiences a psychotic break in early adulthood, which is a time when the number of cortical synapses is being pruned. The disorder might result from the excessive loss of synapses in a critical cortical pathway when the normal process overshoots.
Since 2005, scientists studying our genes, the proteins they produce and their functions have started to identify some of the key factors that increase the risk of mental disorders, from autism to schizophrenia. The candidates include a long list of previously unknown proteins that have one thing in common: They are important for healthy brain development. Indeed, if the Decade of the Brain redefined mental disorders as brain disorders, recent research suggests that mental disorders are really developmental brain disorders, caused by disruptions in the circuitry map of the developing brain.
During this next decade, expect to see the full roster of candidates as scientists begin to describe the key variations in sequences of genes that produce altered proteins and dysfunctional circuitry. Neuroscientists already have powerful tools to move from the study of molecules to circuits and, ultimately, to behavior. How will we translate this emerging knowledge into better treatments? The answer for psychiatry will likely be the same as the answer in the rest of medicine: Basic discoveries regarding genes and proteins will point the way to molecular and cellular mechanisms, which in turn will yield new targets for treatment and prevention.
In some ways, psychiatry has been the victim of its early success, as medications found by accident in the 1960s delayed the search for fundamental mechanisms of disease that could yield new targets and new treatments. After two decades of progress, clinical neuroscientists are finally beginning to understand what underlies a few mental disorders. In the upcoming decade, which we can perhaps call the Decade of Translation, we can look forward to seeing this new understanding translate to improved treatments that will finally reduce the occurrence and death rates of these disabling illnesses.
Basic Science and Gene Findings Drive Research
By Story Landis, Ph.D., National Institute of Neurological Disorders and Stroke
Remarkable advances during the Decade of the Brain set the stage for the decade that just ended, and recent findings make us optimistic that progress will accelerate. Basic neuroscience research in the 1990s was an important part of this momentum. The following is just a sample of the many important findings during the Decade of the Brain: In 1991, researchers discovered the molecular basis of olfaction—our sense of smell—which made the olfactory system as attractive as the visual system for exploring neural development and sensory processing. The identification of molecules in multiple systems that guide axons—fibers involved in communication between neurons—led to a new understanding of how connections form during development. Finally, molecular and biochemical studies showed that synapses—the junctions at which nerves communicate—are complex molecular machines rather than simple structures, as earlier images from electron microscopes suggested.
Although neuroscientists believe that advances in basic science ultimately will improve our understanding of neurological disease and thereby will guide treatments, genetics has had the most immediate impact. The discovery in 1991 that Kennedy’s disease, a motor neuron disorder, is caused by a specific gene mutation was the first of a stream of significant findings, including gene mutations involved in Huntington’s disease, amyotrophic lateral sclerosis (ALS, or Lou Gehrig’s disease) and Rett syndrome. In addition to identifying mutations responsible for these undeniably familial disorders, investigators discovered a mutation in the alpha-synuclein gene in a large European family with Parkinson’s disease. This finding was particularly noteworthy since the consensus in the field had been that Parkinson’s had environmental causes. Now more Parkinson’s researchers have expanded their search to include genetics, which accounts for the disease in some patients and may contribute to it in others.
In the past decade, the list of single-gene defects that contribute to neurological disorders grew at an extraordinary pace, leading to almost an embarrassment of riches. For example, at least 15 identified genes cause spinocerebellar ataxias, and almost as many additional genes are suspected culprits. Classification of ataxias by genetic profile has replaced the clinical classification based on time of onset, rate of progression and subtleties of the clinical exam. In addition, researchers have identified at least six additional Parkinson’s-related genes; together, these genes appear to underlie the disease in as many as 35 percent of patients with Parkinson’s disease.
Gene identification has immediately benefited patients. For many of the rare neurological disorders, patients and their families might have spent many years and thousands of dollars in their search for a diagnosis. For some diseases, an inexpensive genetic test can now bring that odyssey to a rapid and conclusive end.
We hope for more than the ability to diagnose, however. Implicit in the discovery of a causative gene is the belief that this knowledge will quickly lead to a better understanding of disease processes, and this in turn will yield better treatments. But to date this translation has proved to be much more difficult than we had imagined. For example, in 1987, researchers discovered that mutations in the dystrophin gene cause Duchenne muscular dystrophy, a disorder that results in the death of affected boys, usually before age 20. Despite two decades of research and the availability of both mouse and dog models, the only treatment currently in use is corticosteroids.
Similarly, scientists identified the genetic defect in Huntington’s disease in 1993; today, no treatments slow the disease’s progression, and those that address symptoms in the middle to late stages are not particularly effective. Scientists still debate whether aggregates formed in the brain from the Huntington’s gene’s mutant protein are toxic or protective. We have, however, become much more sophisticated about defining and testing targets for therapeutics development, and we now have at our disposal exciting new technologies, such as small interfering RNA (siRNA), to turn our knowledge of gene mutations into treatments.
Other basic science developments offer significant promise for the current decade. One such advance is human pluripotent stem cell technology. Using recipes for specific classes of neural cells, scientists can use human embryonic stem cells to generate thousands of dopamine neurons, motor neurons or myelin-producing cells suitable for studying and potentially treating Parkinson’s disease, ALS or multiple sclerosis, respectively. In 2006 scientists learned how to turn back the developmental clock of mouse skin cells to make them into induced pluripotent stem (iPS) cells that closely resemble embryonic stem cells. A year later researchers extended this remarkable technology to human skin cells. In proof-of-principle studies, investigators have turned skin cells from patients with ALS or Parkinson’s disease into induced pluripotent stem cells. Then they have differentiated the cells into motor neurons or dopamine neurons. Scientists are already using such cells in the search for disease-modifying drugs. Many investigators believe that well down the road, human embryonic stem and iPS cells may be useful in replacing nervous system cells lost or damaged by neurological disorders.
Tackling the Mysteries of Alcohol Dependence
By Kenneth R. Warren, Ph.D., National Institute on Alcohol Abuse and Alcoholism
Why does drinking alcohol have such profound effects on people’s behavior? Why does alcohol dependence develop and persist in some people but not in others? Scientists attempt to answer these questions by studying the brain, where alcohol intoxication and dependence begin. During the past decade, advances in technology have helped us better understand how alcohol changes the brain and how those changes influence alcohol-related behaviors. In the coming decade, this knowledge will help researchers develop drug and other interventions that can reduce the high social, personal and economic costs of alcohol-related problems.
Research supported by NIAAA in the early 1990s demonstrated that people who abuse alcohol for a long time experience lasting changes within the brain’s limbic system, which supports emotion and motivation. These changes, which we call neuroadaptation, involve multiple neurotransmitters and other brain chemicals. Neuroadaptation can result in heightened anxiety and distress during abstinence; the drinker can alleviate this discomfort for a short time by drinking more. This may help explain why people with alcohol dependence steadily increase the amount they drink.
As a person’s dependence on alcohol grows, the affected neurotransmitter systems change from those that are involved in the brain’s reward system to those that cause negative effects such as anxiety, sweating and tremors. It appears that people with alcohol dependence continue to drink despite recurring health and social problems because of a vicious cycle: They are drinking in an attempt to avoid the unpleasant effects of drinking. In the future, alcohol scientists hope to use their understanding of how neuroadaptation occurs to develop targeted medications for treating alcohol dependence.
Researchers have identified stress as a probable trigger for relapse into alcohol dependence. Alcohol neuroscientists have identified several brain-cell receptors that influence resilience to stress and may be involved in susceptibility to alcohol dependence. For example, researchers found that mice lacking a receptor that mediates stress responses voluntarily drank much less alcohol and were more sensitive to its sedative effects than normal mice. In one study, people who had recently gone through alcohol detoxification took a drug that targets this same receptor. They reported fewer alcohol cravings and improved overall well-being.1 This finding might lead to a new treatment for some types of alcohol dependence, which is a central part of the NIAAA’s mission.
Scientists also are seeking ways to combat underage drinking, a major public health challenge worldwide. For example, researchers conducting studies on animals have found that adolescents are less sensitive than adults to the negative effects of intoxication, including sleepiness, hangover and impaired coordination. That means it takes more alcohol for teens to begin to experience the negative effects that adults recognize as signs that they have had too much to drink. On the other hand, researchers conducting studies on humans have found that adolescents are more sensitive than adults to alcohol’s impairment of memory and social inhibition.2 These findings suggest that adolescents are particularly prone to alcohol-related consequences, such as teenage drinking and driving accidents and lasting cognitive deficits.3 In addition, the earlier drinking begins in adolescence, the greater the risk of alcohol use disorders in adulthood.4 Our next challenge, therefore, is to learn how drinking may interfere with normal adolescent brain development at the cellular and molecular level, as well as how this interference may lead to cognitive impairment and alcohol use disorders. Then we can investigate interventions that will protect people of all ages.
During the Decade of the Brain, scientists developed imaging and electrical recording techniques that allow today’s researchers to study how alcohol affects different brain systems and structures. We can also see, in real time, how both the motivation to drink and alcohol itself change the human brain. For example, using functional magnetic resonance imaging (fMRI), scientists can track how the desire to use alcohol changes specific brain regions. Scientists using magnetic resonance spectroscopy (MRS) can monitor chemical and metabolic changes that may cause alcohol’s short-term pleasurable effects (intoxication) and long-term detrimental effects (dependence).
Furthermore, about half of a person’s risk of developing alcoholism is based on his or her genetic makeup,5 and real-time recording techniques also are helping scientists to identify genetic risk factors. For instance, using event-related potentials (ERPs), researchers have identified unusual brainwaves that appear in the brains of children of alcoholics before they have taken their first drink.6 Researchers also have found that certain genetic markers linked to alcohol dependence also are associated with psychiatric disorders such as antisocial personality disorder and attention-deficit/hyperactivity disorder. This finding suggests that these illnesses have genetic connections.7 In order to investigate the interface of genetics and neuroimaging, the NIAAA has promoted imaging research that may clarify how genes associated with alcohol dependence affect the brain. This new field of imaging genetics offers a powerful research tool to help us understand the genes that underlie alcohol-related disorders.
Understanding the effect of alcohol on the brain through discoveries in neuroscience is integral to understanding why people get into trouble from alcohol use and figuring out how to prevent and reduce alcohol-related problems. During the next decade, animal and human studies using increasingly sophisticated technology will provide information that may help bring us closer to these important goals.
Using Breakthroughs in Visual Neuroscience to Treat Diseases
By Paul A. Sieving, M.D., Ph.D., National Eye Institute
Advances in visual neuroscience during the past 10 years are generating a lot of excitement. The ability to record simultaneously the activity of different clusters of neurons in the eye has greatly improved our understanding of how our neural circuits process and integrate visual signals. For example, recording the impulses from clusters of retinal ganglion cells, which transmit visual input from the eye to the brain, allows researchers to characterize completely the information presented to the visual parts of the brain.
The next research front will involve investigating how neurons interconnect into circuits that control visually guided behavior, such as when we alter our path to avoid an obstacle we see. In the next decade, three recent technical advances will help us learn more about this neural circuitry. First, scientists can now see complex, interconnected brain structures using the Brainbow technique, which employs genetically coded fluorescent proteins that can mark hundreds of neurons with unique colors. Second, two-photon imaging technology can display the dynamic interactions between neurons in real time. Finally, scientists can implant into the brain a grid containing 100 electrodes that deliver signals to a computer, which measures the activity of individual neurons within a larger group.
Scientists can use information about the activity of individual neurons therapeutically. Because the organization of the nonhuman primate visual system is very close to that of humans, what we learn through ethical studies of nonhuman primates brings us closer to human medical applications. Studies using an array of electrodes implanted in the brain show that monkeys can use their visual system to control an artificial limb remotely, by mental control alone.1 If this ability holds true in humans, it could dramatically improve sensory substitution treatments used for a range of human injuries—for example, better devices for people who have lost limbs due to war or disease. In addition, learning about faulty nerve circuits in the visual system will provide insight into other types of circuit disorders, such as chronic pain and epilepsy.
We have tremendous opportunities to translate what we have learned about visual circuits in the past decade into treatments for neurodegenerative diseases affecting vision, such as retinitis pigmentosa and macular degeneration. These diseases target photoreceptor cells in the retina, which normally process light that becomes an image when electrochemical signals are transmitted through the retina and optic nerve to the brain. The degeneration or death of photoreceptor cells causes loss of vision and blindness, but the other parts of the transmission process—the second-stage retinal neurons—remain intact. Researchers are testing several methods to activate retinal cells by bypassing the nonfunctional photoreceptor cells.
For example, through funding from the National Eye Institute and the U.S. Department of Energy, scientists have developed an artificial retina chip—an electrode array that receives signals interpreted as electrical impulses from a camera. When the array is transplanted into the eye, it stimulates the remaining retinal circuits and transmits the impulses to the brain to enable it to visualize what the camera sees. An alternative strategy involves a microchip with tiny solar cells that convert light energy into electrochemical impulses. And optogenetics, which draws on advances in nanotechnology, uses pulses of light to specifically activate genetically engineered ion channels in retinal cells to initiate the visual pathway.
Researchers are also investigating how to restore vision via cell-based therapies such as stem cell technology and gene therapy. For example, scientists can now induce stem cells to develop into retinal cells, and these cells function correctly when transplanted into an animal with retinal degeneration. The most promising results come from recent clinical trials using gene therapy to treat people with Leber’s congenital amaurosis. People with this condition lack an enzyme required for vitamin A metabolism, and the resultant degeneration of photoreceptor cells causes vision loss. When researchers delivered the missing enzyme to the remaining intact photoreceptors, the patients’ visual sensitivity increased. This was one of the first examples of safe and effective gene therapy.
These advances are beginning to help people with limited vision see better. They are also shedding light on treatments for other neurological disorders. New technologies and scientific breakthroughs afford significant opportunities to expand our knowledge about the visual system and develop applications with therapeutic potential.
Advances in Genetics and Devices Are Helping People with Communication Disorders
By James F. Battey Jr., M.D., Ph.D., National Institute on Deafness and Other Communication Disorders
During the past decade, scientists have made astonishing advances in the NIDCD’s mission areas of hearing, balance, smell, taste, voice, speech and language. Numerous discoveries have expanded our knowledge base amid one of the most exciting periods in the history of communication research.
Genetics ranks high on the list of areas in which we’ve made significant progress. Before the Decade of the Brain, we knew that deafness could be inherited, but we knew little about the genes involved. Twenty years later, we’ve identified hundreds of genetic mutations linked to inherited hearing loss, with more than 80 genes mapped just in the past 10 years. Further study has shown us the functions of many of the proteins that these genes encode and has revealed molecular pathways essential for normal hearing.
Similar explorations in speech and language have turned up genetic mutations that are responsible for delayed language development in young children and that also play a supporting role in dyslexia and some cases of autism. This kind of discovery, which reveals common neural pathways in speech, reading and language development, could be the key to freeing thousands of children now locked inside their own worlds.
Another exciting gene discovery, the result of a collaboration across the National Institutes of Health and internationally, recently identified the first genetic mutations responsible for stuttering, which places this speech disorder squarely in the medical world. Researchers are currently working with animal models to understand how this gene influences the neural circuits that control expressive language.
Combating hearing loss by regenerating hair cells, small sensory cells in the inner ear, also is showing promise. Our ability to hear relies on these hair cells, and defects in them or damage to them cause hearing loss. Although fish, amphibians, and birds are able to grow new hair cells, humans and other mammals can’t. Scientists are trying to understand the molecules and genes involved in hair cell regeneration in animals, with hopes of learning how to mimic the process in humans. Research in hair cell regeneration could one day offer a powerful treatment option, if not a cure, for hearing loss.
Beyond genetic discoveries, we continue to focus on the development of devices that bring sound into the worlds of people who are profoundly deaf or hard of hearing. The cochlear implant, one of the most groundbreaking biomedical achievements of the past 30 years, uses direct electrical stimulation of the auditory nerve via implanted electrodes to bypass inner ear damage and provide a sense of sound. Although cochlear implants have helped close to 200,000 people worldwide, most still have problems clearly hearing conversations in noisy environments. Scientists are currently looking at how to better localize sound by using advanced signal processing techniques and improved electrode design.
Hearing aid users have similar problems in noisy environments. An ingenious solution has emerged from the study of the ears of a parasitic fly, Ormia ochracea, which is extraordinarily successful at localizing sound. Using the lessons from this research, scientists are developing a miniature directional microphone that can zero in on a single voice and make communication in noisy places more effective.
As this new decade begins, we’re applying the technology of cochlear implants to the development of other potential neural prostheses for hearing, balance and speech. These include auditory brainstem implants, which reconnect the ear to the brain in people whose auditory nerves have been surgically removed; vestibular implants to normalize balance by electrically stimulating the vestibular nerve; and brain-computer interfaces to help patients with locked-in syndrome translate thought into synthesized speech.
In smell and taste research, we’ll focus less on the nose and the tongue and more on the brain, tackling questions about how the brain interprets sensory data and mapping the functional organization of the neural circuits that mediate these senses. We are just beginning to understand the complicated neural networks that turn objects and words into speech, but newer imaging techniques, such as voxel-based morphometry, will allow us to localize brain function at a much finer spatial resolution than fMRI and will become a powerful tool for researchers to see which areas of the brain are active during speech and word retrieval.
I am certain that we will end this new decade with a far better understanding of how language and speech are processed in the brain. We’ll also have more sensitive, individually tailored and effective technologies for people with hearing loss. Finally, our continued studies in genetics, and the rapid accumulation of knowledge about genes and their functions, mean that the era of precise genotype-based diagnosis may be at hand for many of the communication disorders we study.
It Takes a Village: Large-Scale Studies Prove Vital to Alzheimer’s Disease Research
By Richard J. Hodes, M.D., National Institute on Aging
During the next 25 years, the number of Americans aged 65 and older is expected to double to about 72 million. Many people thrive as they age, but others experience cognitive decline wrought by Alzheimer’s disease and other dementias. Today, as many as 5.1 million Americans may have Alzheimer’s disease, the most common form of dementia. Unless we can cure or prevent it, Alzheimer’s prevalence may triple by 2050.
These dire projections lend urgency to research into this devastating disease. In the past decade, researchers and clinicians working across diverse disciplines have made important discoveries about the molecular changes that take place in the brains of people with Alzheimer’s disease, identified genetic risk factors, and pointed to lifestyle and environmental factors—such as diet and exercise—that may contribute to the onset and progression of the disorder.
Along with these advances, however, came a humbling appreciation of the complexity of Alzheimer’s. More than a century since Alois Alzheimer first described abnormal deposits of beta-amyloid and tau proteins in the brain of a woman with dementia, researchers are still asking if these hallmark plaques and tangles are the causes or the results of the disease process. While it is difficult to predict when we will have the answers, we may gain great insight from large-scale, collaborative studies.
Researchers in government, academia and private industry are joining forces to discover the genetic and environmental risk factors involved in Alzheimer’s. One such success story is the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a partnership launched in 2004 and primarily supported by the National Institute on Aging (NIA), joined by other NIH institutes and private partners. ADNI scientists are developing imaging and biomarker profiles of the changes that signal the onset of Alzheimer’s, sometimes long before symptoms appear.
These new biomarker tools—from brain scans to blood and cerebrospinal fluid tests—will enable us to detect and follow the progression of Alzheimer’s during clinical trials. And the scientists’ efforts are beginning to show results. In spring 2009, ADNI reported that certain cerebrospinal fluid biomarkers may help us both predict who is at risk of developing the disease and learn how the disease responds to various therapies. These data, involving hundreds of volunteers, are available to qualified researchers worldwide, thus further driving the collaborative nature of the research and strengthening our chances of finding answers quickly. Ultimately, we hope that these technologies will prove useful in everyday clinical practice so that we can implement therapies or preventative measures as soon as possible.
We anticipate that a similar collaborative approach will tell us more about Alzheimer’s risk factor genes. Researchers have shown that three genes cause the rare, early-onset form of Alzheimer’s that occurs in some families. However, only one of the other 30,000 genes in our DNA is linked to increased risk for the more widespread, late-onset form that commonly occurs after age 65. Scientists are eager to identify additional risk factor genes.
Genome-wide association studies, which use methods that can rapidly test up to a million sites in one person’s genes, will help scientists find those elusive genetic variations. Since 2007, several international research groups conducting association studies have identified variants of the SORL1, CLU, PICALM and CR1 genes that may play a role in the risk of late-onset Alzheimer’s.
To build the large bank of DNA samples needed for future association studies, the NIA supports the Alzheimer’s Disease Genetics Consortium, which collects and analyzes biological samples from tens of thousands of people with and without the disease. The consortium freely shares its data and analyses with others in the research community to help spur advances in our understanding of the genetic mechanisms at work and to help scientists identify new pathways to prevention or treatment.
The past decade is also marked by advances in translational research—applying knowledge gained in the laboratory as quickly as possible to new tests or therapies in a clinical setting. The NIA currently supports 60 grants aimed at identifying and developing effective therapies for the treatment of Alzheimer’s. The work is varied, from finding new compounds that will modify beta-amyloid production or clear it from the brain to reformulating existing drugs and naturally occurring compounds used to treat other diseases. These studies allow the NIA to capture new and creative therapeutic approaches and to “seed” promising drug discovery and preclinical development programs.
The success of these and many other efforts relies on another vital partner in Alzheimer’s research: the many volunteer research participants, including patients in clinical trials. Both our recent progress and our growing confidence for future advances rely heavily on this generosity of spirit. Collaboration is key to translating discoveries into safe and effective therapies that will benefit us all.
Many scientists are as concerned with the direction brain research will take as with the ability of all new science to be used for good AND BAD. Global Stanford University is that far-right neo-conservative IVY LEAGUE that embraces ONE WORLD ONE GOVERNANCE EXTREME WEALTH EXTREME POVERTY----so we should watch what all that Federal funding is creating in PRODUCTS.
Many questions remained. “How are you going to get the light deep into the brain?” he said. “How are you going to target these genes? Will it control behavior? Will you be able to turn on or off behaviors?”
Many people were concerned about where genetic cloning of animals and humans would lead----and this is another concern: all these kinds of medical advances happened during CLINTON/BUSH/OBAMA with no public discussion of ethics and morals----no boundaries in research------all geared toward PRODUCT and not the advancement of human health.
“A computer programmer who now understands how the brain works knows how to write code that will get the brain to do certain things.” ----Ramsay Brown
Brain Control in a Flash of Light
The Map Makers
By JAMES GORMAN APRIL 21, 2014
SAN DIEGO — Dr. Karl Deisseroth is having a very early breakfast before the day gets going at the annual meeting of the Society for Neuroscience. Thirty thousand people who study the brain are here at the Convention Center, a small city’s worth of badge-wearing, networking, lecture-attending scientists.
For Dr. Deisseroth, though, this crowd is a bit like the gang at Cheers — everybody knows his name. He is a Stanford psychiatrist and a neuroscientist, and one of the people most responsible for the development of optogenetics, a technique that allows researchers to turn brain cells on and off with a combination of genetic manipulation and pulses of light.
He is also one of the developers of a new way to turn brains transparent, though he was away when some new twists on the technique were presented by his lab a day or two earlier.
“I had to fly home to take care of the kids,” he explained. He went home to Palo Alto to be with his four children, while his wife, Michelle Monje, a neurologist at Stanford, flew to the conference for a presentation from her lab. Now she was home and here he was, back at the conference, looking a bit weary, eating eggs, sunny side up, and talking about the development of new technologies in science.
A year ago, President Obama announced an initiative to invest in new research to map brain activity, allocating $100 million for the first year. The money is a drop in the bucket compared with the $4.5 billion the National Institutes of Health spends annually on neuroscience, but it is intended to push the development of new techniques to investigate the brain and map its pathways, starting with the brains of small creatures like flies.
Cori Bargmann of Rockefeller University, who is a leader of a committee at the National Institutes of Health setting priorities for its piece of the brain initiative, said optogenetics was a great example of how technology could foster scientific progress.
“Optogenetics is the most revolutionary thing that has happened in neuroscience in the past couple of decades,” she said. “It is one of the advances that made it seem this is the right time to do a brain initiative.”
Dr. Deisseroth, 42, who has won numerous prizes and received plenty of news media attention for his work on optogenetics, is quick to point out that there is no sole inventor for this technology.
“It’s not as if one person had a eureka moment,” he said. “The time had come, and it was a question of who had put the resources and effort and people” on the task, and who would get there first. But it was he and his colleagues, Edward Boyden and Feng Zhang, who took those previous discoveries and devised a practical way to turn neurons on and off with light.
Ehud Isacoff, of the University of California, Berkeley, who recently wrote about the development of the technique, said that Dr. Deisseroth “was incredibly important in getting all the parts to come together.”
The reason optogenetics has transformed neuroscience is that it allows scientists to go beyond observation. In neuroscience, as in all science, it is crucial to be able to make and test predictions.
“You want to be able to play the piano,” said Dr. Bargmann, paraphrasing Rafael Yuste, a Columbia University neuroscientist and one of the people who proposed creating a brain activity map. The tools of optogenetics are allowing scientists to perform the neuroscientific equivalent of “Chopsticks” in the brains of laboratory animals — to find and control, for example, neurons that control a kind of aggression in fruit flies.
The hope is that scientists can work their way up to the level of Chopin — and that this tool and others like it will uncover deep mechanisms of brain function that hold true not only for flies and mice, but for the ultimate neuroscientific puzzle, the human brain.
Karl Deisseroth was not always headed for a career in the laboratory, although his father, an oncologist, and his mother, who trained as a chemist, both exposed him to the world of science. “My first love was writing,” he said.
That was still the case in his first years at Harvard, when he took courses in creative writing and seriously considered pursuing a literary life. Eventually, however, interest in science took over. He majored in biochemistry and went on to Stanford for a medical degree and a Ph.D., expecting to become a neurosurgeon. In interviews at the San Diego meeting, and earlier at his Stanford lab, he explained what changed him.
Brain surgery “was the first clinical rotation I did; I was that certain that was what I wanted to do,” he said. But his next stop was psychiatry. “It was a completely transformative thing,” he said.
It was eye-opening, he said, “to sit and talk to a person whose reality is different from yours” — to be face to face with the effects of bipolar disorder, “exuberance, charisma, love of life, and yet, how destructive”; of depression, “crushing — it can’t be reasoned with”; of an eating disorder literally killing a young, intelligent person, “as if there’s a conceptual cancer in the brain.”
He saw patient after patient suffering terribly, with no cure in sight. “It was not as if we had the right tools or the right understanding.” But, he said, that such tools were desperately needed made it more interesting to him as a specialty. He stayed with psychiatry, but adjusted his research course, getting in on the ground floor in a new bioengineering department at Stanford. He is now a professor of both bioengineering and psychiatry.
With his own lab, in concert with other researchers, he began to pursue two projects. The one for which he was hired was low risk, involving stem cells and methods to enhance the growth of neurons. The second was the possibility of using light to control brain cells.
Scientists work with opsins, proteins derived from microbes that are activated by blue light. The researchers are trying to change the way light pulses switch brain cells on and off. Credit Soo Yeun Lee, Andre Berndt and Karl Deisseroth
That was high risk, but not because it was an unknown idea; quite the opposite. Despite many barriers to success, it was a crowded field.
The Changeable Opsins
At the heart of all optogenetics are proteins called opsins. They are found in human eyes, in microbes and other organisms. When light shines on an opsin, it absorbs a photon and changes.
When he came into the field, Dr. Deisseroth said, “Microbial opsins had been studied since the ’70s.” Thousands of papers had been published. So the basics of the chemicals were well known.
“People talked and thought about the possibility of putting them into neurons as a control tool, and everybody thought that it might work but it would be unlikely to be very effective, unlikely to work very well, because these opsins come from organisms that are very distant and separated from mammals evolutionarily,” he said.
The genes to make the opsins needed to be inserted into the neurons, and several more steps were necessary so the system would work.
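The basic control loop the article describes — a light-gated channel depolarizes a neuron past its firing threshold whenever the light pulse is on — can be sketched as a toy simulation. This is a minimal illustration, not the labs' actual models: the neuron is a textbook leaky integrate-and-fire unit, and all constants (membrane time constant, light-driven drive, thresholds) are made-up illustrative values.

```python
def simulate(light_on, t_total=100.0, dt=0.1):
    """Return spike times (ms) for a toy light-gated neuron.

    light_on: function of time (ms) returning True when blue light is applied.
    All parameters below are illustrative assumptions, not measured values.
    """
    v_rest, v_thresh, v_reset = -70.0, -50.0, -70.0  # membrane voltages, mV
    tau = 10.0    # membrane time constant, ms
    drive = 4.0   # depolarizing drive while opsin channels are open, mV/ms
    v = v_rest
    spikes = []
    t = 0.0
    while t < t_total:
        # Leaky integration: decay toward rest, plus light-gated current.
        i_light = drive if light_on(t) else 0.0
        v += dt * ((v_rest - v) / tau + i_light)
        if v >= v_thresh:       # threshold crossed: record a spike and reset
            spikes.append(round(t, 1))
            v = v_reset
        t += dt
    return spikes

# A blue-light pulse from 20 ms to 60 ms: the cell fires only during the pulse.
pulse = lambda t: 20.0 <= t < 60.0
spikes = simulate(pulse)
```

In darkness the voltage simply sits at rest and no spikes occur; during the pulse the cell charges up and fires periodically, which is the "switch neurons on and off with light" behavior the article attributes to channelrhodopsin-2.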
By the early 2000s there had also been an improvement in engineering viruses that were effective in smuggling the opsin genes into nerve cells, but caused no harm. Research intensified.
“There were, to my knowledge, maybe six or seven people actually trying” to get this idea of light control of neurons to work, he said.
In 2005 Dr. Deisseroth; Dr. Boyden and Dr. Zhang, both of whom now have their own labs at M.I.T.; and Ernst Bamberg of the Max Planck Institute of Biophysics and Georg Nagel at the University of Würzburg published a paper showing that an opsin called channelrhodopsin-2 could be used to turn on mammalian neurons with blue light.
This was the breakthrough research, but it had antecedents. In 2002 Gero Miesenböck, now at Oxford, and Boris Zemelman, now at the University of Texas, proved that optogenetics could work. Both were then at Memorial Sloan-Kettering Cancer Center. They reported their success using opsins from the fruit fly to turn on mouse neurons that had been cultured in the lab.
Dr. Isacoff reviewed the development of optogenetics recently after the awarding of the 2013 European Brain Prize to six people, including Dr. Deisseroth and Dr. Boyden, for work on optogenetics. The other winners were Dr. Bamberg, Dr. Nagel, Dr. Miesenböck and Peter Hegemann at Humboldt University in Berlin. He wrote of Dr. Miesenböck’s work, “If one had to identify the paper that launched the thousand ships of optogenetics, this is it.”
But although this was a breakthrough and a proof that light could be used to control neurons, Dr. Miesenböck and Dr. Zemelman’s work was not picked up as a tool by the neuroscience community because, Dr. Isacoff wrote, of the limited effectiveness of light in stimulating the neurons, and because it was hard to adapt to different biological systems.
Dr. Deisseroth’s group, said Dr. Isacoff, turned instead to microbial opsins, building on the work of Dr. Bamberg, Dr. Nagel and Dr. Hegemann. They figured out how to get one of these opsins safely into mammalian neurons so that the neurons would respond strongly to light. That made all the difference.
“The methods that are widely used now are the ones that Karl developed,” Dr. Bargmann said. “He flipped the switch that made them practical.”
Shortly thereafter the lab of Stefan Herlitze of Ruhr University Bochum, in Germany, collaborating with Dr. Hegemann and Lynn Landmesser of Case Western Reserve University, reported a similar finding. Dr. Deisseroth pointed out, however, that the initial paper was just the beginning. It involved only cells in culture. Many questions remained. “How are you going to get the light deep into the brain?” he said. “How are you going to target these genes? Will it control behavior? Will you be able to turn on or off behaviors?”
Seeing Clear Through a Brain
Clarity is a technique that can make a brain transparent so that networks of neurons that receive and send information can be highlighted in stunning color and viewed in all their three-dimensional complexity without slicing up the organ.
Those questions have now been answered through a great deal of work in Dr. Deisseroth’s lab and in others’. Hundreds of papers have been published. Many researchers are using and developing techniques, which, Dr. Isacoff wrote, “have been used to study brain waves, sleep, memory, hunger, addiction, aggression, courtship, sensory modalities, and motor behavior.”
And Now Clarity
In 2013, while continuing the work on developing optogenetic techniques, the Deisseroth lab produced another technique that Dr. Deisseroth has high hopes for. He and Kwanghun Chung, now an assistant professor at M.I.T. with his own lab, managed to turn whole mouse brains transparent, with a method called Clarity.
This is not a technique for living brains. They infused mouse brain tissue with a hydrogel, a substance well known to chemists but not one previously used in neuroscience. The method leaves the brain tissue not only transparent, but also still available for biochemical tests.
The lab is now working on making a whole preserved human brain transparent; it was a presentation on this work that Dr. Deisseroth had missed during his shuttle parenting in San Diego.
The long-term goal of his work continues to be to find a way to help people with severe mental illness or brain diseases, and he has recently proposed ways that optogenetics, Clarity and other techniques may be turned to this aim.
He still treats patients. “I don’t think a day goes by that I’m not looking at results and thinking how to apply them clinically,” he said.
Optogenetics is a crucial tool in understanding function. Clarity, on the other hand, is an aid to anatomical studies, basic mapping of structure, which, he says, is as important to understand as activity.
“I’ve administered electroconvulsive therapy — I know we can administer this therapy and cause a general seizure,” he says, in which the activity of the whole brain is disrupted.
“Within a few minutes, the whole person comes back. Where does it come back from? From the structure,” he said.