In a world of proliferating professions, S. Matthew Liao has a singular title: neuroethicist. Dr. Liao, 40, the director of the bioethics program at New York University, deploys the tools of philosophy, history, psychology, religion and ethics to understand the impact of neuroscientific breakthroughs.

Define neuroethics.

It's a subspecialty of bioethics. Until very recently, the human mind was a black box. But here we are in the 21st century, and now we have all these new technologies with opportunities to look inside that black box, a little. With functional magnetic resonance imaging, fMRI, you can get pictures of what the brain is doing during cognition. You see which parts light up during brain activity. Scientists are trying to match those lights with specific behaviors. At the same time this is moving forward, there are all kinds of drugs being developed and tested to modify behavior and the mind. So the question is: Are these new technologies ethical? A neuroethicist can look at the downstream implications of these new possibilities. We help map the conflicting arguments, which will, hopefully, lead to more informed decisions. What we want is for citizens and policy makers to be thinking in advance about how new technologies will affect them. As a society, we don't do enough of that.

Give us an example of a technology that entered our lives without forethought.

The Internet. It has made us more connected to the world's knowledge. But it has also reduced our actual human contact with one another.
From the day I was born, my brain developed according to the stimuli it received. My senses of vision, touch, taste, smell were all slightly heightened in compensation for the lack of input from my ears, helping me to create a world I could understand.
My mother worked full time with me, playing a set of activities she called "the game". I was a child and didn't understand the real reason for playing it, but it taught me to read, write, lipread and speak, if not to hear in the traditional sense of the word. What I do hear is filtered through digital hearing aids that amplify what little sound I can perceive.
A month ago, for the first time, I made the change from external technology to internal technology. I became a full-time cyborg, free of charge on the NHS.
They cut away a flap of skin behind my left ear, drilled a tiny hole into my skull between the two main nerves of the face, the ones that control taste and facial movement, and inserted an electrode into my cochlea, connected to a small magnet and circuit board under the skin.
They’re going to switch me on in a few days — and if it’s all working as it should, my auditory cortex will be bombarded by a range of electronic noises. Over time, I may come to understand these sounds as consonants, music, even the spoken word.
This is what it will sound like, apparently.
Even if I can make sense of those sounds, it won’t be “hearing” in the normal sense of the word. My ears have had the same level of input for the last 30 years of my life — and now I’ve physically rewired one of them to receive a completely different signal.
In all the recent blue-sky thinking on Wired.co.uk and elsewhere about the future of the human race (coprocessors for the brain, enhanced-spectrum bionic eyes, artificial legs, even the possibility of interfacing with computers directly), people forget one thing: what it feels like, what it's like to live with it every day, whether it makes you feel more, or less, yourself.
I'm also wary of augmentation and body enhancement becoming the norm. We have a fluid definition of what a disability is, and what isn't. If certain people with access to this technology start engineering themselves to have greater physical or mental abilities, then where does that leave ordinary people? Differently abled? Disabled? Or, by comparison, less abled? In giving up perfectly usable eyes, the end result of millions of years of evolution, to install digital eyes that can project images onto the retina, are we really putting ourselves at an advantage?
If I’d been born into a deaf family, all of us signing, my brain developing to become fluent in sign language and developing a deaf identity so strong and complete that I saw deafness as “normal” and hearing as “abnormal” — I wouldn’t have had this implant.
The cochlear implant, in crossing the line from external wearable technology to permanent fixture, becomes a technology that is potentially in conflict with human values, rather than a testament to them. Many deaf people see the cochlear implant as a symbol of medical intervention aimed at suppressing and ultimately eradicating the deaf community and deaf culture by "fixing" deaf people one implant at a time. That includes implanting children at an early age so that they acquire spoken language rather than sign.
More than any year before, 2012 was the year neuroscience exploded into pop culture. From mind-controlled robot hands to cyborg animals to TV specials to triumphant books, brain breakthroughs were tearing up the airwaves and the internets.
From all the thrilling neurological adventures we covered over the past year, we’ve collected five stories we want to make absolutely sure you didn’t miss. Now, no matter how scientific our topic is, any Top 5 list is going to turn out somewhat subjective.
For one thing, we certainly didn't cover every neuroscience paper published in 2012 – we like to pick and choose the stories that seem most interesting to us, and leave the whole "100 percent daily coverage" free-for-all to excellent sites like ScienceDaily. As you may've also noticed, we tend to steer clear of headlines like "Brain Region Responsible for [X] Discovered!" because – as Ben talks about with Matt Wall in this interview – those kinds of discoveries are usually as vague and misleading as they are overblown by the press. Instead, we've chosen to focus on five discoveries that carry some of the most profound implications of any research published this past year – both for brain science, and for our struggle to understand our own consciousness. So on that note, here – in countdown order – are the five discoveries that got us the most pumped up in 2012!
5. A Roadmap of Brain Wiring
4. Laser-Controlled Desire
3. Programmable Brain Cells
2. Memories on Disc
1. Videos of Thoughts (via http://theconnecto.me/2012/12/the-top-5-neuroscience-breakthroughs-of-2012/)
More than a century after it was first identified, Harvard scientists are shedding new light on a little-understood neural feedback mechanism that may play a key role in how the olfactory system works in the brain.
In a December 19 paper in Neuron, Venkatesh Murthy, Professor of Molecular and Cellular Biology, and his team describe for the first time how that feedback mechanism works, identifying where the signals go and which types of neurons receive them. Three scientists from the Murthy lab were involved in the work: Foivos Markopoulos, Dan Rokni and David Gire.
“The image of the brain as a linear processor is a convenient one, but almost all brains, and certainly mammalian brains, do not rely on that kind of pure feed-forward system,” Murthy explained. “On the contrary, it now appears that the higher regions of the brain which are responsible for interpreting olfactory information are communicating with lower parts of the brain on a near-constant basis.”
Though researchers have known about the feedback system for decades, key questions about its precise workings, such as which neurons in the olfactory bulb receive the feedback signals, remained a mystery, partly because scientists simply didn’t have the technological tools needed to activate individual neurons and individual pathways.
“One of the challenges with this type of research is that these feedback neurons are not the only neurons that come back to the olfactory bulb,” Murthy explained. “The challenge has always been that there’s no easy way to pick out just one type of neuron to activate.”
To do it, Murthy and his team turned to a technique called optogenetics.
Using a virus genetically modified to produce a light-sensitive protein, Murthy and his team marked specific neurons so that they became active when hit with laser light. The researchers were then able to trace the feedback mechanism from the brain's processing centers back to the olfactory bulb.
Reaching that level of precision was critical, Murthy explained, because while the olfactory bulb contains many "principal" neurons that are responsible for sending signals on to other parts of the brain, it is also packed with interneurons, which appear to play a role in formatting olfactory information as it comes into the brain.
Your brain has at least four different senses of location — and perhaps as many as 10 — and each is different, according to new research from the Kavli Institute for Systems Neuroscience at the Norwegian University of Science and Technology. The brain has a number of "modules" dedicated to self-location, the researchers found. Each module contains its own internal GPS-like mapping system that keeps track of movement, along with other characteristics that distinguish one module from another.

"We have at least four senses of location," says Edvard Moser, director of the Kavli Institute. "Each has its own scale for representing the external environment, ranging from very fine to very coarse. The different modules react differently to changes in the environment. Some may scale the brain's inner map to the surroundings, others do not. And they operate independently of each other in several ways."

This is also the first time that researchers have been able to show that a part of the brain that does not directly respond to sensory input, called the association cortex, is organized into modules. The research was conducted using rats. (via The many maps of the brain | KurzweilAI)
During the past few years, there has been a dramatic increase in research examining the role of memory in imagination and future thinking. This work has revealed striking similarities between remembering the past and imagining or simulating the future, including the finding that a common brain network underlies both memory and imagination. Here, we discuss a number of key points that have emerged during recent years, focusing in particular on the importance of distinguishing between temporal and nontemporal factors in analyses of memory and imagination, the nature of differences between remembering the past and imagining the future, the identification of component processes that comprise the default network supporting memory-based simulations, and the finding that this network can couple flexibly with other networks to support complex goal-directed simulations. This growing area of research has broadened our conception of memory by highlighting the many ways in which memory supports adaptive functioning.