January 14, 2016

New EPICS team: Image Processing & Analysis

Priority: No

We have just created a new EPICS team, Image Processing & Analysis (IPA). It will feature three projects; the descriptions and requirements are below. The team meets Wednesdays, 8:30 – 10 am, and is open to juniors and seniors. For seniors, it can be used to satisfy senior design (either the one-semester or the two-semester option). For juniors, we are especially looking for students who would want to continue these projects for senior design.


Students on this team will be expected to have knowledge of signal and system methods (e.g., ECE 301) and a working knowledge of programming (MATLAB is acceptable, but knowledge of Python, C, or C++ is desirable). In addition, the course requires instructor approval. Interested students should email epics@purdue.edu and include a short (one- or two-sentence) explanation of their interest in the team/projects.


Image Processing & Analysis Projects:


  • Infant Sleep Detection Using Automated Video Analysis
    • Community Partner: Developmental Studies Laboratory, Purdue University
    • Description: Sleep problems are common in children with autism spectrum disorder (ASD), but assessing their sleep can be challenging. This project aims to improve home-based sleep studies for children with, or at elevated risk for, ASD. The goal is to determine when a child is asleep or awake by designing an automated system built on image/video processing tools. Using video recordings of young children sleeping, the team will (1) measure activity during sleep, (2) compare the results of (1) to an ankle-worn accelerometer, and (3) determine sleep/wake thresholds using standard behavioral codes for awake and asleep.
    • Faculty:
      • Professor Edward Delp, School of Electrical and Computer Engineering
      • Professor A. J. Schwichtenberg, Department of Human Development & Family Studies
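As a rough illustration of steps (1) and (3) above, an activity measure can be computed by frame differencing and then thresholded into sleep/wake labels. This is only a sketch: the frame representation (grayscale NumPy arrays), the threshold values, and the function names are assumptions for illustration, not the project's actual method — in practice the thresholds would be calibrated against the accelerometer and behavioral codes.

```python
import numpy as np

def activity_signal(frames, diff_threshold=15):
    """Per-interval activity: fraction of pixels whose grayscale
    intensity changes by more than diff_threshold between
    consecutive frames (values and representation are illustrative)."""
    activity = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        activity.append(np.mean(diff > diff_threshold))
    return np.array(activity)

def classify_sleep_wake(activity, wake_threshold=0.05):
    """Label each interval 'wake' when activity exceeds a threshold,
    otherwise 'sleep'; the threshold is a placeholder to be tuned
    against accelerometer data and behavioral codes."""
    return np.where(activity > wake_threshold, "wake", "sleep")
```

In a real system the per-interval activity series, rather than raw pixels, is what gets compared against the accelerometer trace.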


  • Image Based Human Interaction Analysis
    • Community Partner: Department of Speech, Language, and Hearing Sciences, College of Health and Human Sciences, Purdue University
    • Description: Early social interaction events such as “touch” may have an important influence on infants’ speech perception. It is important for researchers to study “touch” interactions between caregivers and infants so they can determine how touch impacts infant language development. In this project, we are interested in automatically detecting and identifying touch events in video captured by an RGB camera and a depth sensor. The team will design a system that (1) uses an RGB camera and a depth sensor to acquire image data, (2) processes the RGB and depth data, and (3) uses the combined RGB and depth data to automatically extract instances of potential touch events.
    • Faculty:
      • Professor Fengqing Maggie Zhu, School of Electrical and Computer Engineering (ECE)
      • Professor Edward Delp, School of Electrical and Computer Engineering (ECE)
      • Professor Amanda Seidl, Department of Speech, Language, and Hearing Sciences (SLHS)
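One way step (3) could combine the two modalities is to flag a frame as a potential touch when the caregiver and infant regions overlap in the image plane and their depths agree. The sketch below is illustrative only: the body masks are assumed to come from a hypothetical upstream segmentation step, and the depth tolerance is a made-up parameter, not a value from the project.

```python
import numpy as np

def potential_touch(caregiver_mask, infant_mask, depth, depth_tol=50):
    """Flag a potential touch event when the two (hypothetical,
    pre-segmented) body masks overlap in the image and the regions'
    median depths agree within depth_tol sensor units (e.g. mm)."""
    overlap = caregiver_mask & infant_mask
    if not overlap.any():
        return False  # no image-plane contact, no touch candidate
    # Depth agreement rules out apparent overlap at different distances
    d_caregiver = np.median(depth[caregiver_mask])
    d_infant = np.median(depth[infant_mask])
    return abs(d_caregiver - d_infant) <= depth_tol
```

Frames flagged this way would then be candidates for closer inspection, since image-plane overlap alone can be caused by occlusion rather than actual contact.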


  • The PhenoSorg Team
    • Community Partner: US Department of Energy
    • Description: Modern farming techniques now include collecting information from both ground-based and airborne sensors (e.g., drones). This team will work closely with a recently funded research project sponsored by the US Department of Energy. In this project we are interested in using imaging sensors (cameras) and 3D sensors to acquire data on plant characteristics throughout the growing season, and in developing models to predict the ultimate biomass yield of a crop. The particular crop we are using for our study is sorghum. We are collecting these images from drones and ground-based sensors. The images are analyzed to estimate phenotypic properties of the sorghum plants, including plant height, number and area of leaves, chemistry-related properties, and other traits. The team will work on two tasks: analyzing the sensor data and designing the ground-based sensor platform. Note: due to FAA restrictions, this project will not include operating airborne drones.
    • Faculty:
      • Professor Edward Delp, School of Electrical and Computer Engineering
      • Professor Melba Crawford, School of Civil Engineering and Director of Laboratory for Applications of Remote Sensing (LARS)
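To give a flavor of the sensor-data analysis task, one common way to estimate plant height from a 3D point cloud over a plot is to take the difference between a high and a low percentile of the points' heights. This is a minimal sketch under assumed inputs (a list of z-coordinates for one plot); the percentile cutoffs are illustrative, not the project's method.

```python
import numpy as np

def estimate_plant_height(z_values, ground_percentile=5, canopy_percentile=95):
    """Estimate plant height for one plot from 3D-point z-coordinates:
    ground level is a low percentile of z, canopy top a high percentile,
    and the height is their difference. Percentiles (rather than min/max)
    make the estimate robust to outlier points."""
    z = np.asarray(z_values, dtype=float)
    ground = np.percentile(z, ground_percentile)
    canopy = np.percentile(z, canopy_percentile)
    return max(canopy - ground, 0.0)
```

The same per-plot summary statistics (height, canopy cover, and so on) are the kind of phenotypic features that would feed the biomass-prediction models mentioned above.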