Samsung demonstrates camera sensors hooked to IBM’s brain-imitating silicon

A few years ago, IBM launched its TrueNorth project: a neuromorphic (brain-like) processor designed to hew more closely to what we know about the biology of the human brain, and hopefully to capture some of its capabilities. Unlike most processors, TrueNorth is a manycore design with 4,096 cores per chip. Each core simulates 256 neurons, and each neuron can be connected to up to 256 other neurons via artificial "synapses."
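To make the architecture concrete, here is a minimal sketch of one spiking "core" in that style: leaky integrate-and-fire neurons wired by a sparse synapse matrix. This is an illustrative toy, not IBM's design; the neuron count matches TrueNorth's 256-per-core figure, but the threshold, leak, weight, and connectivity values are assumptions.

```python
import numpy as np

N = 256            # neurons per core (per the TrueNorth spec)
THRESHOLD = 1.0    # firing threshold (arbitrary units, assumed)
LEAK = 0.9         # membrane potential decay per tick (assumed)

rng = np.random.default_rng(0)
synapses = rng.random((N, N)) < 0.05   # sparse binary connectivity
weights = synapses * 0.3               # uniform synaptic weight (assumed)

potential = np.zeros(N)
spikes = np.zeros(N, dtype=bool)
spikes[:8] = True                      # seed a few input spikes

for tick in range(10):
    # Each spiking neuron delivers its weight to every neuron it
    # synapses onto; potentials decay a little each tick.
    potential = LEAK * potential + weights.T @ spikes
    spikes = potential >= THRESHOLD    # fire on crossing the threshold
    potential[spikes] = 0.0            # reset neurons that fired
    print(tick, int(spikes.sum()), "neurons fired")
```

The key point the sketch illustrates is that computation is event-driven: between spikes, a neuron does nothing but leak, which is part of why such chips can run on so little power.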

TrueNorth can consume as little as 70mW of power, but we haven't seen many commercial demonstrations that showcase what the chip can do. That changed last week, when Samsung demonstrated a new camera sensor hooked up to TrueNorth and designed to exploit its strengths.

Samsung describes the technology as follows:

Conventional vision sensors see the world as a series of frames. Successive frames contain enormous amounts of redundant information, wasting memory access, RAM, disk space, energy, computational power and time. In addition, each frame imposes the same exposure time on every pixel, making it difficult to deal with scenes containing very dark and very bright regions.

The Dynamic Vision Sensor (DVS) solves these problems by using patented technology that works like your own retina. Instead of wastefully sending entire images at fixed frame rates, only the local pixel-level changes caused by movement in a scene are transmitted – at the time they occur. The result is a stream of events at microsecond time resolution, equivalent to or better than conventional high-speed vision sensors running at thousands of frames per second. Power, data storage and computational requirements are also drastically reduced, and sensor dynamic range is increased by orders of magnitude due to the local processing.

This is, broadly, how the retina works. Individual retinal cells respond when they are struck by photons and stimulated; the eye does not transmit complete static images to the brain at a fixed rate. (A separate phenomenon, persistence of vision, helps explain why we perceive film and television as continuous motion rather than as a series of static snapshots.) Samsung's analogy is that a pixel which stays silent until the scene changes behaves much like a cell in the retina, which doesn't signal until or unless it has something to signal about.
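The event-based idea described above can be sketched in a few lines. This is an illustrative model, not Samsung's actual DVS pipeline: each pixel emits an event only when its log intensity changes by more than a contrast threshold, so a static scene produces no output at all. The threshold value and event format here are assumptions.

```python
import numpy as np

THRESHOLD = 0.2  # assumed log-intensity contrast threshold

def dvs_events(prev_frame, frame, t):
    """Return (y, x, polarity, t) events for pixels that changed enough."""
    delta = np.log1p(frame.astype(float)) - np.log1p(prev_frame.astype(float))
    ys, xs = np.nonzero(np.abs(delta) > THRESHOLD)
    polarity = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return [(y, x, p, t) for y, x, p in zip(ys, xs, polarity)]

# A static scene produces no events; one brightening pixel produces one.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200                       # a single pixel brightens
print(dvs_events(prev, prev, t=0.0))   # no change -> no events
print(dvs_events(prev, curr, t=0.001)) # only the changed pixel reports
```

Because only changed pixels report, the output is a sparse, timestamped event stream rather than a stack of mostly redundant frames, which is where the memory, bandwidth, and power savings come from.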

According to Samsung, the TrueNorth chip is what allowed the company to build a camera that can capture up to 2,000 frames per second while drawing just 300mW of power. That's vastly less than a conventional processor would need, though still dwarfed by the efficiency of the human brain; chips like TrueNorth are a major step forward, but the gap between them and our biological hardware remains enormous. Samsung believes the combination of TrueNorth and the Dynamic Vision Sensor could be useful for self-driving cars, 3D mapping, and gesture recognition. There's no word on whether we'll see a consumer version come to market.
