Tissue engineering research at MIT is now largely focused on creating tissue that can be used in the lab to model human disease and test potential new drugs. MIT professor Sangeeta Bhatia recently developed the first stem-cell-derived liver tissue model that can be infected with the hepatitis C virus. She has also designed thin slices of human liver tissue that can be implanted in mice, enabling rapid studies of potential drugs. Like other human tissues, liver is difficult to grow outside the human body because cells tend to lose their function when they lose contact with neighboring cells. “The challenge is to grow the cells outside the body while maintaining their function after being removed from their usual microenvironment,” says Bhatia, the John and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science.

Human on a chip

In a large-scale project recently funded by the Defense Advanced Research Projects Agency (DARPA), several MIT faculty members are working on a “human-on-a-chip” system that scientists could use to study up to 10 human tissue types at a time. The goal is to create a customizable system of interconnected tissues, grown in small wells on a plate, allowing researchers to analyze how each tissue responds to different drugs. “If they’re developing a drug for Alzheimer’s, they may want to examine the uptake by the intestine, the metabolism by the liver, and the toxicity on heart tissue, brain tissue or lung tissue,” says Linda Griffith, the S.E.T.I. Professor of Biological and Mechanical Engineering at MIT and leader of the … (via Tissue engineering at MIT: where it’s going | KurzweilAI)
MIT team builds most complex synthetic biology circuit yet
Unlike electronic circuits on a silicon chip, biological circuits inside a cell cannot be physically isolated from one another. “The cell is sort of a burrito. It has everything mixed together,” says Christopher Voigt, an associate professor of biological engineering at MIT.
Because all the cellular machinery for reading genes and synthesizing proteins is jumbled together, researchers have to take care that the proteins controlling one part of a synthetic circuit don’t interfere with the other parts.
The biggest question for would-be cyborgs is: How are you going to power all those brain implants? And now it looks like some MIT engineers may have stumbled upon the answer. They have developed a fuel cell that can run on your brain’s own glucose — a breakthrough that could result in powerful neural prosthetics that restore and control a number of bodily functions. Here’s how it would work — plus why this breakthrough could combine with two other recent developments to make a cyborg future much closer than it was before. The glucose fuel cell isn’t an entirely new idea. Back in the 1970s, scientists showed that a pacemaker could be powered using your body’s own sugar, but lithium-ion batteries proved more practical. Moreover, glucose fuel cells at the time required enzymes to work, which didn’t bode well for long-term implantation in the body. (via Brain Implants Powered by Spinal Fluid: Another Huge Step Towards Our Cyborg Future)
In today’s manufacturing plants, the division of labor between humans and robots is quite clear: Large, automated robots are typically cordoned off in metal cages, manipulating heavy machinery and performing repetitive tasks, while humans work in less hazardous areas on jobs requiring finer detail. But according to Julie Shah, the Boeing Career Development Assistant Professor of Aeronautics and Astronautics at MIT, the factory floor of the future may host humans and robots working side by side, each helping the other in common tasks. Shah envisions robotic assistants performing tasks that would otherwise hinder a human’s efficiency, particularly in airplane manufacturing. (via Robotic assistants may adapt to humans in the factory, thanks to new algorithm)
Anyone else see The Avengers? Just like in Iron Man 1 and 2, Tony Stark has the coolest interactive 3-D displays. He can pull a digital wire frame out of a set of blueprints or wrap an exoskeleton around his arm. Those moments aren’t just sci-fi fun; they’re full of visionary ideas to explore and manipulate objects in 3-D space. Except for one thing: How would Stark feel all of these objects to move them around? In reality, he’d be touching nothing but air. Jinha Lee, from the Tangible Media Group of the MIT Media Lab, in collaboration with Rehmi Post, has been playing with the idea of manipulating real floating objects in 3-D space to create a truly tactile user interface. His prototype is called the ZeroN, and it will drop your jaw when you see it working for the first (and second and third) time. (via MIT Creates Amazing UI From Levitating Orbs | Co.Design: business innovation design)
Conversations between people include a lot more than just words. All sorts of visual and aural cues indicate each party’s state of mind and make for a productive interaction. But a furrowed brow, a gesticulating hand, and a beaming smile are all lost on computers. Now, researchers at MIT and Tufts are experimenting with a way for computers to gain a little insight into our inner world. Their system, called Brainput, is designed to recognize when a person’s workload is excessive and then automatically modify a computer interface to make it easier. The researchers used a lightweight, portable brain monitoring technology, called functional near-infrared spectroscopy (fNIRS), that determines when a person is multitasking. Analysis of the brain scan data was then fed into a system that adjusted the user’s workload at those times. A computing system with Brainput could, in other words, learn to give you a break. (via A Computer Interface that Takes a Load Off Your Mind – Technology Review)
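The loop Brainput closes — read a physiological signal, infer the user’s workload, then dial the interface back — can be sketched in a few lines. This is only an illustration of that feedback idea: the signal values, the threshold, and the helper names below are all hypothetical, not the researchers’ actual classifier or code.

```python
# Minimal sketch of a Brainput-style adaptive loop (hypothetical names/values).

def infer_workload(fnirs_samples):
    """Crude stand-in for the real fNIRS classifier: treat the mean
    signal level as a proxy for multitasking load."""
    return sum(fnirs_samples) / len(fnirs_samples)

def adjust_interface(load, threshold=0.7):
    """Return how many concurrent tasks the UI should present,
    shedding tasks when the inferred load is high."""
    return 1 if load > threshold else 3

# Pretend readings taken while the user is juggling several tasks.
samples = [0.9, 0.8, 0.85]
load = infer_workload(samples)
print(adjust_interface(load))  # high load -> interface drops to one task
```

The point of the sketch is the direction of control: instead of the user adapting to the computer, the brain signal drives the computer to adapt to the user.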