Computing hardware is composed of a series of binary switches; they’re either on or off. The other piece of computational hardware we’re familiar with, the brain, doesn’t work anything like that. Rather than being on or off, individual neurons exhibit brief spikes of activity, and encode information in the pattern and timing of these spikes. The differences between the two have made it difficult to model neurons using computer hardware. In fact, the recent, successful generation of a flexible neural system required that each neuron be modeled separately in software in order to get the sort of spiking behavior real neurons display. But researchers may have figured out a way to create a chip that spikes. The people at HP labs who have been working on memristors have figured out a combination of memristors and capacitors that can create a spiking output pattern. Although these spikes appear to be more regular than the ones produced by actual neurons, it might be possible to create versions that are a bit more variable than this one. And, more significantly, it should be possible to fabricate them in large numbers, possibly right on a silicon chip. (via “Neuristor”: Memristors used to create a neuron-like behavior | Ars Technica)
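The per-neuron software modeling the article mentions can be illustrated with a leaky integrate-and-fire neuron, one of the simplest spiking models. This is an illustrative sketch of that general technique, not the HP memristor circuit or the specific simulation the article refers to:

```python
# Minimal leaky integrate-and-fire neuron: a sketch of the kind of per-neuron
# software model the article describes, NOT HP's memristor-capacitor circuit.

def simulate_lif(current=1.5, threshold=1.0, leak=0.1, dt=0.01, steps=2000):
    """Integrate an input current with leak; emit a spike at each threshold crossing."""
    v = 0.0                # membrane potential (arbitrary units)
    spike_times = []
    for step in range(steps):
        t = step * dt
        v += dt * (current - leak * v)   # integrate input, minus a small leak
        if v >= threshold:               # threshold crossing -> spike
            spike_times.append(t)
            v = 0.0                      # reset after the spike
    return spike_times

spikes = simulate_lif()
print(f"{len(spikes)} spikes; first few at {spikes[:3]}")
```

Note that a constant input produces perfectly regular spikes, echoing the article's point about the memristor version; adding noise to `current` is one way to get the more variable timing real neurons show.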
Posts Tagged ‘Computing’
Welcome to your everyday world of “Big Data,” the infinite sea of facts, products, books, maps, conversations, references, opinions, trends, videos, advertisements, surveys — all of the sense and nonsense that is literally at your fingertips, 24/7, every day from now on. Eric Schmidt, Google’s executive chairman, estimates that humans now create in two days the same amount of data that it took from the dawn of civilization until 2003 to create. Micro-bursts of technological innovation over the past decade have created a supernova of new data and a virtually limitless capacity to create and store it, shaping everyday lives across the planet. What may be most fascinating, experts agree, is that this change has come so quietly and seamlessly. “An extraordinary knowledge revolution [is] sweeping, almost invisibly, through business, academia, government, health care and everyday life,” says Rick Smolan, co-creator of a new book, The Human Face of Big Data.
Tags: analytics, Big Data, cloud, commerce, Computing, lifestyle, Mobile, new intelligence
A supernova of new data over the past decade is shaping everyday lives across the planet.
Tags: avatar, Computing, Emotions, Future, Software, tech
You move, he moves. You smile, he smiles. You get angry, he gets angry. “He” is the avatar you chose. Faceshift, from EPFL’s Computer Graphics and Geometry Laboratory, now offers a software program that could save time for the designers of animation or video games. Thibaut Weise, founder of the start-up, smiles and nods. On the screen his avatar, a fantasy creature, directly reproduces his gestures. This system could enhance the future of video games or even make video chats more fun. One tool is required: a camera with motion and depth sensors in the style of Microsoft Kinect or Asus Xtion, well known to gamers. On first use, the software needs only ten minutes to learn the user’s face. The user reproduces several basic expressions requested by the program: smile, raise eyebrows, etc. “The more movement is incorporated into the program’s 50 positions, the more realistic are the results,” explains Weise, whose start-up is currently based at the Technopark in Zurich. You can then get into the skin of your character and animate it by moving yourself. “It’s almost like leaving your body to enter that of your avatar,” jokes the young entrepreneur. (via Software enables avatar to reproduce our emotions in real time)
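The “50 positions” Weise mentions are reminiscent of blendshape animation, where a tracked expression is reconstructed as the neutral face plus a weighted mix of basis expressions. A minimal sketch of that general idea, with toy made-up data — not Faceshift’s actual pipeline:

```python
# Illustrative blendshape mixing: the tracked face is the neutral face plus a
# weighted sum of expression offsets (smile, raised brows, ...). This sketches
# the general technique, not Faceshift's proprietary software.

def mix_blendshapes(neutral, shapes, weights):
    """Return vertex positions for the given per-expression weights (0..1)."""
    result = list(neutral)
    for name, weight in weights.items():
        for i, (shaped, base) in enumerate(zip(shapes[name], neutral)):
            result[i] += weight * (shaped - base)   # add the weighted offset
    return result

# Toy "faces" of three vertex coordinates each (hypothetical data).
neutral = [0.0, 0.0, 0.0]
shapes = {"smile": [0.0, 1.0, 0.0], "brows_up": [0.5, 0.0, 0.0]}

print(mix_blendshapes(neutral, shapes, {"smile": 0.5, "brows_up": 1.0}))
# -> [0.5, 0.5, 0.0]
```

In a real system the weights would come from the depth camera’s tracking of the user’s face, updated every frame.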
Tags: Computing, Future, Predictions, Technology
Get ready for a time when computers know our world – and our future – better than we do. One of my favourite Isaac Asimov stories, Franchise, imagines an election in which computing is sufficiently advanced for the preferences of an entire country to be predicted on the basis of just one voter’s actions. We’re not quite at that stage yet. But we may be on the right path. For perhaps the greatest geek triumph of the 2012 presidential elections was the unlikely figure of statistician Nate Silver, whose FiveThirtyEight blog – which algorithmically assessed hundreds of polls based on their historical accuracy – managed to successfully predict the result in 50 out of 50 states. His analysis – like every political story – divides opinion. To my mind, though, his work shines a light on a bigger story about our future relationship with technology, and in particular on a vision of progress where there’s an increasingly clear divide between those endeavours that can safely be left to humans, and those where machines and mathematics are preferable. It’s something that is already happening: from the automated exploration of Mars and the use of unmanned drone aircraft for reconnaissance and remote assassination, to the analysis of probabilities and prediction. We do what we’re best at, and leave the rest to the machines. Some things have ever been thus, but never has the story of human enhancement been quite so closely entwined with the story of human redundancy. In Silver’s case, the people who may face immediate redundancy are those professional political pundits whose speculations saturate the media at election time – or at least replacement by suitably ideologically varied Silver-like figures next time around. In the longer term, though, the wholesale replacement of speculation with massively data-led science may be in order – not to mention the transformation of what it means to plan as well as to predict a political campaign.
(via BBC – Future – Technology – All hail the prediction machines)
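The core idea of weighting polls by their historical accuracy can be sketched in a few lines. This is a toy illustration of the general idea with invented numbers — not Nate Silver’s actual model:

```python
# Toy accuracy-weighted poll average: polls with a smaller historical error get
# more weight. A sketch of the idea behind FiveThirtyEight-style aggregation,
# with made-up figures -- not the real FiveThirtyEight methodology.

def weighted_average(polls):
    """polls: list of (candidate_share_pct, historical_error_pct) tuples."""
    weights = [1.0 / err for _, err in polls]    # smaller past error -> bigger weight
    total = sum(weights)
    return sum(share * w for (share, _), w in zip(polls, weights)) / total

# Hypothetical polls: (reported share %, that pollster's average past error %).
polls = [(52.0, 2.0), (48.0, 4.0), (51.0, 1.0)]
print(round(weighted_average(polls), 2))
# -> 50.86
```

The most historically accurate poll (1% past error) dominates the average, which is the point: the aggregate leans on the pollsters that have earned trust.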
Many people cite Albert Einstein’s aphorism “Everything should be made as simple as possible, but no simpler.” Only a handful, however, have had the opportunity to discuss the concept with the physicist over breakfast. One of those is Peter G. Neumann, now an 80-year-old computer scientist at SRI International, a pioneering engineering research laboratory. As an applied-mathematics student at Harvard, Dr. Neumann had a two-hour breakfast with Einstein on Nov. 8, 1952. What the young math student took away was a deeply held philosophy of design that has remained with him for six decades and has been his governing principle of computing and computer security. For many of those years, Dr. Neumann (pronounced NOY-man) has remained a voice in the wilderness, tirelessly pointing out that the computer industry has a penchant for repeating the mistakes of the past. He has long been one of the nation’s leading specialists in computer security, and early on he predicted that the security flaws that have accompanied the pell-mell explosion of the computer and Internet industries would have disastrous consequences. (via Rethinking the Computer at 80 – NYTimes.com)
The sound of 20 quadrillion calculations happening every second is dangerously loud. Anyone spending more than 15 minutes in the same room with the Titan supercomputer must wear earplugs or risk permanent hearing damage. The din in the room will not come from the computer’s 40,000 whirring processors, but from the fans and water pipes cooling them. If the dull roar surrounding Titan were to fall silent, those tens of thousands of processors doing those thousands of trillions of calculations would melt right down into their racks. Titan is expected to become the world’s most powerful supercomputer when it comes fully online at the US Oak Ridge National Laboratory, in Tennessee, in late 2012 or early 2013. But on this afternoon in mid-October, Titan isn’t technically Titan yet. It’s still a less-powerful supercomputer called Jaguar, which the US Department of Energy (DoE) has operated and continuously upgraded since 2005. Supercomputing power is measured in Flops (floating point operations per second), and Jaguar was the first civilian supercomputer to break the “petaflop barrier” of one quadrillion operations per second (a quadrillion is a one followed by 15 zeroes). In June 2010 it was the fastest supercomputer on Earth. (via BBC – Future – Technology – Building Titan: The ‘world’s fastest’ supercomputer)
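The units in the piece are easy to check with a couple of lines of arithmetic, using the figures the article itself gives:

```python
# Unit check for the article's figures: a petaflop is 10**15 floating point
# operations per second (a quadrillion: a one followed by 15 zeroes), so
# "20 quadrillion calculations every second" is 20 petaflops.

PETA = 10 ** 15

titan_flops = 20 * PETA        # Titan: 20 quadrillion calculations per second
petaflop_barrier = 1 * PETA    # the barrier Jaguar broke as of June 2010

print(titan_flops / PETA, "petaflops")                 # 20.0 petaflops
print(titan_flops / petaflop_barrier, "x the barrier") # 20.0 x the barrier
```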