News Story
Aloimonos receives NSF grant for robots with vision that find objects
ISR-affiliated Professor Yiannis Aloimonos has been awarded a three-year, $550K NSF Cyber-Physical Systems Methods and Tools grant, "Robots with Vision that Find Objects."
The objective of this research is the development of methods and software that will allow robots to detect and localize objects using Active Vision and to develop descriptions of their visual appearance in terms of shape primitives. The approach is bio-inspired and consists of three novel components. First, the robot will actively search the space of interest using an attention mechanism consisting of filters tuned to the appearance of objects. Second, an anthropomorphic segmentation mechanism will be used: the robot will fixate on a point within the attended area and segment the surface containing the fixation point, using contours and depth information from motion and stereo. Finally, a description of the segmented object will be developed in terms of the contours of its visible surfaces and a qualitative description of their 3D shape.
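For illustration only, the following minimal Python sketch walks through the three stages described above: an attention map built from appearance-tuned filters, segmentation grown from a fixation point using depth, and a qualitative description of the segmented surface. All function names, the region-growing scheme, the synthetic image and depth map, and the thresholds are assumptions made for this sketch; they are not the project's actual methods or software.

```python
import numpy as np

def attention_map(image, filt):
    """Stage 1 (attention): saliency as circular cross-correlation of an
    appearance-tuned filter with the image (a stand-in for the filter bank)."""
    F_img = np.fft.fft2(image)
    F_filt = np.fft.fft2(filt, s=image.shape)
    return np.abs(np.fft.ifft2(F_img * np.conj(F_filt)))

def segment_from_fixation(depth, fixation, depth_tol=0.05):
    """Stage 2 (segmentation): grow a region around the fixation point over
    pixels of similar depth, standing in for segmentation from contours plus
    depth recovered from motion and stereo."""
    h, w = depth.shape
    mask = np.zeros_like(depth, dtype=bool)
    ref = depth[fixation]
    stack = [fixation]
    while stack:
        y, x = stack.pop()
        if 0 <= y < h and 0 <= x < w and not mask[y, x] and abs(depth[y, x] - ref) < depth_tol:
            mask[y, x] = True
            stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return mask

def describe_shape(mask, depth):
    """Stage 3 (description): a crude qualitative summary of the segmented surface."""
    d = depth[mask]
    return {"area_px": int(mask.sum()),
            "mean_depth": float(d.mean()),
            "depth_spread": float(d.max() - d.min())}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((64, 64))                   # synthetic camera image
    depth = np.ones((64, 64))
    depth[20:40, 20:40] = 0.5                      # a closer square stands in for an object
    filt = rng.random((8, 8))                      # placeholder appearance-tuned filter
    saliency = attention_map(image, filt)
    fixation = np.unravel_index(np.argmax(saliency), saliency.shape)  # fixate on the peak
    mask = segment_from_fixation(depth, fixation)
    print(describe_shape(mask, depth))
```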
The intellectual merit of the proposed approach comes from the bio-inspired design and the interaction of visual learning with advanced behavior. The availability of filters will allow the triggering of contextual models that work in a top-down fashion, meeting the bottom-up, low-level processes at some point. Thus, the approach defines, for the first time, the meeting point where perception happens.
The broader impacts of the proposed effort stem from the general usability of the proposed components. Adding top-down attention and segmentation capabilities to robots that can navigate and manipulate will enable many technologies, for example household robots, assistive robots for the care of the elderly, and robots for manufacturing, space exploration and education.
Published September 15, 2010