Posts Tagged ‘Augmented Reality’

Posted: December 27, 2012 by Wildcat in Uncategorized

Racism is ugly to confront, and, like most people, I’ve got plenty of personal stories. My grandmother, bless her heart, was a wonderful grandmother, but like many Jewish people of her generation, she was incredibly racist, afraid of black people she didn’t know. This fear caused her anxiety when she got the urge to go to a favorite restaurant. She loved the food, but, as she would derisively say, so did the schvartze (Yiddish slur for a black person). What if she didn’t have to see the black people at all? This possibility is what worries me about our augmented-reality future, which is (mostly) anticipated with optimism. If grandma had lived to see ubiquitous augmented reality, I suspect she’d put it to dehumanizing use, leaving for the restaurant with her goggles on (a less obtrusive artifact than the Coke-bottle glasses she actually wore), programming them to make all dark-skinned people look like variations of Larry David and Rhea Perlman. As Brian Wassom — who regularly writes on augmented reality — notes, if apps can “recognize a particular shade of melanin, and replace it with another,” racists could one day “live in their own version of…utopia.” (via Augmented-Reality Racism – Evan Selinger – The Atlantic)

Posted: December 9, 2012 by Wildcat in Uncategorized

The project, titled ‘Touch the Train Window’, was created by the Japanese audiovisual collective Salad (aka Particle at Rest). It converts a train window into an augmented-reality screen, letting users place virtual objects over the actual scenery and watch them go by in real time as the train moves. The installation was built using Microsoft Kinect, GPS tracking, a projector, an iPhone, and openFrameworks. The innovative idea definitely makes a long train journey more fun. Watch the clip below to see how the interactive window works. (via Change The Passing Scenery On This AR Train Window [Video] – PSFK)
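The excerpt doesn’t explain how a placed object stays pinned to the passing scenery, but the basic trick reduces to scrolling each object by the distance the train has travelled, derived from GPS. Below is a minimal Python sketch of that idea, not the collective’s openFrameworks code; the calibration constants, the place_overlay/project_x helpers and the 20 m/s toy speed are all assumptions made for illustration.

```python
from dataclasses import dataclass

# Assumed calibration: how many projected pixels correspond to one metre of
# scenery at the window's typical viewing distance (made-up value).
PIXELS_PER_METRE = 40.0
WINDOW_WIDTH_PX = 1280

@dataclass
class Overlay:
    """An object the passenger 'placed' on the window by touching it."""
    world_x_m: float  # position along the track (metres) at placement time
    y_px: int         # vertical pixel position on the window

def place_overlay(touch_x_px: int, touch_y_px: int, train_position_m: float) -> Overlay:
    """Convert a touch (e.g. detected by the Kinect) into a scenery-anchored object."""
    return Overlay(world_x_m=train_position_m + touch_x_px / PIXELS_PER_METRE,
                   y_px=touch_y_px)

def project_x(overlay: Overlay, train_position_m: float):
    """Screen x for the overlay given the train's current position (from GPS).
    Returns None once the object has scrolled off the window."""
    x_px = (overlay.world_x_m - train_position_m) * PIXELS_PER_METRE
    return int(x_px) if -100 <= x_px <= WINDOW_WIDTH_PX + 100 else None

# Toy loop: a train moving at an assumed constant 20 m/s, one frame every 0.5 s.
train_pos_m, speed_mps, dt = 0.0, 20.0, 0.5
obj = place_overlay(touch_x_px=900, touch_y_px=300, train_position_m=train_pos_m)
for _ in range(4):
    train_pos_m += speed_mps * dt
    print("overlay x:", project_x(obj, train_pos_m), "y:", obj.y_px)
```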

Posted: November 25, 2012 by Wildcat in Uncategorized

You might think that your feeling of satiation when eating is due simply to your stomach filling up. According to the Hirose Tanikawa Group at the University of Tokyo, however, the visual perception of food also has something to do with it – the more food a person sees themselves eating, the sooner they feel full. With that in mind, the team has created a prototype dieting system that uses augmented reality to trick people into thinking their food items are larger than they actually are. Users wear a camera-equipped head-mounted display and handle their food against a chroma-key-blue background – it is hoped that in a commercial version of the technology, any background (such as a tabletop) will suffice. The headgear could also likely be replaced by something considerably lighter and smaller, such as Google Glass. The camera’s video signal is processed by software that identifies hand-held food items and enlarges them in the display relative to the user’s hand. A deformation algorithm likewise makes the hand appear to open wider, as if it were naturally holding the larger piece of food. (via Augmented reality system could be a boon to dieters)
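As a rough illustration of the enlargement step (cutting out whatever is held in front of the chroma-key-blue background and scaling it up in the display), here is a hedged Python/OpenCV sketch. It is not the Tokyo group’s software: the HSV range, the fixed SCALE factor and the simple paste-back are assumptions, and the real system additionally separates the hand from the food and deforms the hand so the grip still looks natural.

```python
import cv2
import numpy as np

SCALE = 1.5  # assumed magnification; the real system presumably varies this

def enlarge_food(frame_bgr: np.ndarray) -> np.ndarray:
    """Crude stand-in for the dieting display: everything that is NOT chroma-key
    blue is treated as 'hand + food', cut out, enlarged and pasted back."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    background = cv2.inRange(hsv, (100, 120, 70), (130, 255, 255))  # assumed blue range
    foreground = cv2.bitwise_not(background)

    ys, xs = np.nonzero(foreground)
    if xs.size == 0:
        return frame_bgr  # nothing held in front of the background
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1

    patch = frame_bgr[y0:y1, x0:x1]
    mask = foreground[y0:y1, x0:x1]
    big = cv2.resize(patch, None, fx=SCALE, fy=SCALE, interpolation=cv2.INTER_LINEAR)
    big_mask = cv2.resize(mask, None, fx=SCALE, fy=SCALE, interpolation=cv2.INTER_NEAREST)

    # Paste the enlarged patch back, roughly centred on the original region.
    out = frame_bgr.copy()
    cy, cx = (y0 + y1) // 2, (x0 + x1) // 2
    h, w = big.shape[:2]
    top, left = max(cy - h // 2, 0), max(cx - w // 2, 0)
    bottom, right = min(top + h, out.shape[0]), min(left + w, out.shape[1])
    m = big_mask[: bottom - top, : right - left] > 0
    out[top:bottom, left:right][m] = big[: bottom - top, : right - left][m]
    return out
```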

Augmented Vision: A Plethora of Wearable Displays Comes Our Way

Posted: November 22, 2012 by Wildcat in Uncategorized

Augmented reality will become very interesting when the barriers to creating custom objects, animations, apps and experiences are drastically lowered. As with Flash or the App Store, AR becomes compelling when these experiences become very personal or shared between friends. What has to change? Tools to create AR experiences such as notifications, real-time data feeds, location context, alerts and images need to be created and made easily accessible. Current tools (Layar, raw code, existing data) require steep learning curves, knowledge of programming and, at best, data-parsing skills. How do we know when we have it? Once a platform or program exists that lets many people create AR experiences, we’ll have a massive influx of horrible apps and bad designs. We’ll also have a number of unique experiences and programs that only work in AR, or that take advantage of the unique aspects and shape of the AR experience (heads-up display, machine vision, overlays, notifications, context, etc.). Some of them will be silly, others serious. But mostly silly. A lot of entertainment. Most will be short-lived and kitschy, but will prove a point. There will be some art involved. (via RealityAugmented — Augmented reality, cyborg anthropology, new aesthetics, coevolution technologies.)

Disclosure: I am part of the blogging team at Reality Augmented

Posted: November 6, 2012 by Wildcat in Uncategorized

Up to now, mechanics carrying out complex repairs have relied mostly on information from handbooks to guide them. But leafing through books tends to break concentration, and repairs take longer. The situation is by no means improved by using PCs or laptops to call up the information; mechanics still need to click their way through page after page to find what they need. Another disadvantage is that tools have to be put to one side in order to deal with the book or computer. Researchers at the Fraunhofer Center for Organics, Materials and Electronic Devices Dresden COMEDD have been working for several years on interactive HMDs (head-mounted displays) based on OLED technology for just such applications. These displays offer access to what is known as “augmented reality”, enhancing the real world with additional visual information. Navigating through this augmented reality used to require data gloves or a joystick. Now COMEDD scientists, working together with their colleagues from the Fraunhofer Institute for Optronics, System Technologies and Image Exploitation IOSB in Karlsruhe and near-the-eye technologies specialist TRIVISIO, have succeeded in developing data glasses fitted with displays that can be controlled by the movements of the human eye; a photodiode detects the wearer’s eye movements. Mechanics wearing such glasses can assess the damage while using their eyes to turn the pages of the virtual instruction manual. The system will be on display at the joint Fraunhofer booth in Hall A5, Booth 121, at the electronica trade fair in Munich, November 13–16. (via Novel Data Glasses with an OLED microdisplay: Looking for information? | ZeitNews)
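The excerpt does not say how the photodiode signal is turned into commands, so the Python sketch below only illustrates one plausible pattern: treat a sustained, above-threshold reading as a deliberate glance and emit a page-turn event. The threshold, dwell time and event name are assumptions, not COMEDD’s published method.

```python
from collections import deque

# Hypothetical thresholds: the excerpt only says a photodiode picks up eye
# movements; the actual COMEDD signal processing is not described.
GLANCE_THRESHOLD = 0.7   # normalised reading taken to mean a far-right glance
DWELL_SAMPLES = 15       # samples (~0.5 s at 30 Hz) the glance must be held

def page_turn_events(samples):
    """Yield 'next_page' whenever the (normalised) photodiode signal stays above
    GLANCE_THRESHOLD for DWELL_SAMPLES consecutive readings - a stand-in for the
    deliberate eye gesture that flips the virtual manual's pages."""
    window = deque(maxlen=DWELL_SAMPLES)
    armed = True
    for s in samples:
        window.append(s)
        if armed and len(window) == DWELL_SAMPLES and min(window) > GLANCE_THRESHOLD:
            yield "next_page"
            armed = False          # require the gaze to return before re-triggering
        elif s < GLANCE_THRESHOLD:
            armed = True

# Example: a brief glance (ignored) followed by a held glance (one page turn).
signal = [0.1] * 10 + [0.8] * 5 + [0.1] * 10 + [0.9] * 20 + [0.1] * 5
print(list(page_turn_events(signal)))   # -> ['next_page']
```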

Posted: July 31, 2012 by Wildcat in Uncategorized

Canon begins selling a next-generation form of virtual reality technology known as mixed reality (MR) this month, IEEE Spectrum reports. Canon’s MR adds computer-generated virtual objects to the real world in real time, at full scale, and in three dimensions. It is initially targeted at engineering groups involved in designing and building new products. It uses a video see-through head-mounted display (HMD) with two video cameras, one for each eye, to capture video from the real world, which is sent via cable (attached to the HMD) to a computer for integration with the computer-generated graphics or computer-aided-design data to be overlaid. The synthesized video is then sent back to twin SXGA-resolution displays in the HMD, which reflect the images through an optical system in the helmet and then into the eyes. Canon will market the MR technology as a complete system, first in Japan, then overseas — possibly as early as the end of the year in the United States, priced at around US $125,000 for the basic system. (via Trying out Canon’s mixed-reality tech | KurzweilAI)
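The pipeline described here (per-eye camera capture, composition with computer-generated graphics, and output to the twin SXGA displays) boils down to an alpha-blend per eye. The Python/NumPy sketch below illustrates that loop; composite is a generic over-operator, and capture_frame, render_cg and display are hypothetical placeholders rather than Canon’s actual SDK.

```python
import numpy as np

SXGA = (1024, 1280)  # per-eye display resolution class mentioned in the article

def composite(camera_bgr: np.ndarray, cg_bgra: np.ndarray) -> np.ndarray:
    """Alpha-blend a rendered CG layer (with alpha channel) over the camera frame,
    which is the essence of video see-through mixed reality."""
    alpha = cg_bgra[..., 3:4].astype(np.float32) / 255.0
    blended = cg_bgra[..., :3] * alpha + camera_bgr * (1.0 - alpha)
    return blended.astype(np.uint8)

def frame_step(capture_frame, render_cg, display, head_pose):
    """One illustrative frame of the per-eye loop; real systems track head pose
    and render the CAD model from each eye's viewpoint before compositing."""
    for eye in ("left", "right"):
        cam = capture_frame(eye)              # HxWx3 BGR frame from that eye's camera
        cg = render_cg(eye, head_pose, SXGA)  # HxWx4 BGRA render of the virtual objects
        display(eye, composite(cam, cg))

# Tiny synthetic demonstration of composite() with placeholder data.
cam = np.random.randint(0, 256, (*SXGA, 3), dtype=np.uint8)
cg = np.zeros((*SXGA, 4), dtype=np.uint8)
cg[200:400, 300:600] = (0, 255, 0, 200)   # a semi-transparent green block
print(composite(cam, cg).shape)            # (1024, 1280, 3)
```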