Ted
Homepage: http://amp.ece.cmu.edu/people/Ted/
Office: Porter Hall B8
Lab: Porter Hall B6
Phone: 412-268-7102
Fax: 412-268-3890
[Research Interests] [Project]
Real-time Dynamic Image-Based Rendering (DIBR) has recently gained interest in the research community. Having multiple images from distinct vantage points, synchronized in time, allows us to reconstruct scene geometry without a priori knowledge. In this project, the cameras are mobile, so they can reconfigure themselves automatically according to the scene content in order to achieve optimal rendering quality.
Although target recognition has been an active research topic for decades, traditional recognition algorithms are all based on static images. As computing power has increased, we are now able to harness the power of tracking with multiple images and/or sensors in real time. One challenge is to optimize coordination among nodes for successful hand-off of tracking responsibilities from camera to camera. Moreover, tracking will not only allow us to follow objects, but may prove to be a useful tool for recognition. One such technique for video-based recognition can be found here: [Project - Video Based Face Recognition]. Being able to successfully hand off tracking from one camera to another facilitates numerous real-world applications. For example, existing cameras at traffic intersections could become a useful police tool for tracking stolen vehicles.
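To make the hand-off idea concrete, here is a minimal sketch of one possible hand-off rule, assuming each camera reports a detection confidence for the tracked object. The names (`Camera`, `choose_tracker`) and the confidence-margin rule are illustrative assumptions, not the project's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    confidence: float  # detection confidence for the tracked object, 0..1

def choose_tracker(cameras, current, margin=0.2):
    """Hand off only when another camera sees the target clearly better
    than the current one, to avoid rapid back-and-forth switching."""
    best = max(cameras, key=lambda c: c.confidence)
    if best is not current and best.confidence > current.confidence + margin:
        return best
    return current

cams = [Camera("north", 0.9), Camera("south", 0.4)]
print(choose_tracker(cams, cams[1]).name)  # hands off to "north"
```

The margin acts as hysteresis: without it, two cameras with nearly equal confidence would trade the target back and forth every frame.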
Geiger-Mode 3D LADAR offers numerous advantages over coherent detection and traditional 2D imagery. A Geiger-mode sensor is capable of detecting the arrival of a single photon; hence the term "Geiger," because it literally counts photons. Given the ability to detect a single photon, a laser, and high-speed CMOS circuitry as a timing mechanism, it is possible to create a digital time-of-flight sensor. The laser transmits a temporally narrow pulse of light. The pulse is detected and the "stopwatch" is started. The photons then propagate to and from the target. When a photon arrives back at the APD array, the CMOS circuitry clocks the time (equivalent to pressing "stop" on the stopwatch). Since the speed of light is constant, we can calculate the distance the photon has traveled, and hence a 3D position. Of course, this is a gross simplification of the electronics required by such a system. For more details and one interesting application, see High-resolution 3D imaging laser radar flight test experiments.
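The range calculation behind the "stopwatch" analogy is simple: the photon covers the camera-to-target distance twice, so the range is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name and example timestamp are illustrative, not from the actual system):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Range to target from a round-trip photon time of flight.
    The photon travels out and back, so halve the total path length."""
    return C * round_trip_seconds / 2.0

# A photon returning 2 microseconds after the laser pulse
# corresponds to a target roughly 300 m away.
print(range_from_time_of_flight(2e-6))  # ~299.79 m
```

Combined with the known pointing direction of each APD pixel, this range gives the 3D position of the surface that reflected the photon.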
The Mobile Camera Array Using a 6x8 array of webcams, the mobile camera array can render virtual, novel views in real time (5-10 fps). A unique advantage of this array is that each camera can move with two degrees of freedom: it can translate horizontally or pan to improve the rendering quality. This mobile camera array opens up a wide range of novel research topics. See the mobile camera array project page for more information!
ICTrack As computers become more powerful, they make possible more sophisticated and more powerful tracking algorithms. The knowledge gap between the novice and the expert continues to widen, and even for experts there can be a significant time cost to instantiate an algorithm, even if only for comparison purposes. Enter ICTrack. Our goal is twofold. For the novice, we create a flexible interface for manipulating advanced algorithms without having to know the sometimes pedantic details. For the expert, we provide a framework for developing new algorithms and testing them versus existing techniques. See the ICTrack project page for more information!
This website is maintained by Devi Parikh.