Semester project 2016

Motion study for bimanual catching of a flying object
Nadir Benjamin Ramtoula (MT)

Bimanual catching of a flying object is an intriguing problem of visuo-motor coordination. In robotics, the task has many potential applications (e.g. dynamic human-robot interaction with large objects, or catching a falling object in dangerous situations), yet it remains challenging because it requires fast motor control under a time constraint, driven by perceived visual information. Humans coordinate their two arms in a remarkably smooth and efficient way, especially considering how slow human brains are compared to the computational capabilities of today's robots. It is still largely unknown how humans coordinate their arms and generate the desired trajectories in fast adaptive scenarios such as catching a flying ball with both hands. The simple and efficient strategy underlying human motion has great potential for robotic applications, where it could inform the design of fast and effective controllers.

The purpose of this study is to analyze the characteristics of the hand trajectories with respect to the temporal constraint present in the catching of a flying object. The goal of this project is to study: (i) how the remaining time before catching (time to contact, TTC) affects the trajectories of the hands, (ii) whether humans estimate the TTC and use it in motion generation, and (iii) how the trajectories of the two hands are coordinated for a given TTC. To achieve these goals, the student should perform a set of experiments in which the motions of the two arms are recorded with a motion capture system under different catching scenarios. The student is expected to familiarize himself/herself with related topics, including human motion in interceptive actions, through reading the literature.
Once the data are collected from the experiments, the student should analyze them in terms of the trajectory, movement onset time, movement duration, tangential velocity at impact, velocity normal to the object, etc., and examine the effect of the TTC on the hand trajectories in these respects. One possible direction to explore is to compare the trajectories for catching a moving object with the trajectories for reaching toward a stationary object.
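As a rough starting point, the kinematic measures listed above could be extracted from a recorded hand trajectory along the following lines. This is only a minimal sketch: the function name, the speed-threshold onset criterion, and the use of end-of-trajectory speed as a proxy for velocity at impact are illustrative assumptions, not part of the project specification.

```python
import numpy as np

def analyze_trajectory(t, pos, onset_thresh=0.05):
    """Extract basic kinematic measures from one hand trajectory.

    t   : (N,) time stamps in seconds
    pos : (N, 3) hand positions in metres (e.g. from motion capture)
    onset_thresh : speed threshold in m/s for detecting movement
                   onset -- an assumed heuristic, to be tuned.
    """
    vel = np.gradient(pos, t, axis=0)       # (N, 3) velocity estimate
    speed = np.linalg.norm(vel, axis=1)     # tangential speed profile
    moving = speed > onset_thresh
    onset_idx = np.argmax(moving)                    # first sample above threshold
    end_idx = len(t) - 1 - np.argmax(moving[::-1])   # last sample above threshold
    return {
        "onset_time": t[onset_idx],
        "movement_duration": t[end_idx] - t[onset_idx],
        "peak_speed": speed.max(),
        "speed_at_end": speed[end_idx],     # proxy for tangential velocity at impact
    }
```

The same measures computed for catching versus stationary-reaching trials would then support the comparison suggested above.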
Project: Semester Project
Period: 20.09.2016 – 01.02.2017
Section(s): EL IN ME MT
Type: 20% theory, 10% software, 70% experiments
Knowledge(s): MATLAB, C++, machine learning
Subject(s): motion study, machine learning
Responsible(s): Seyed Sina Mirrazavi Salehian, Kevin Gonyop Kim
Smart Adaptive Control of Venetian Blinds and Electric Lighting System based on novel High Dynamic Range (HDR) Vision Sensor
Camille Lechot (CH)

Smart control of the dynamic façade and electric lighting system of an occupied building is a challenging task that requires considering numerous factors such as façade geometry, sun-shading characteristics, and the occupants' visual and thermal comfort. Moreover, one-size-fits-all solutions have revealed serious deficiencies with regard to user acceptance of the technology. A well-designed sun control and shading system can dramatically reduce a building's peak heat gain and cooling requirement, improve the natural lighting quality of building interiors, and reduce the rejection rate of the automated system. Recently, a novel approach for assessing and integrating glare indices and non-image-forming effects of light in building automation has been developed using a calibrated High Dynamic Range (HDR) imaging sensor (Figure 1). Researchers have shown that human-building interactions (namely, expressed "wishes") are the best source of information for personalizing the automation system and adapting it to each occupant's needs. The goal of this project is to develop a human-based controller for commanding electric lighting and shading based on the lighting conditions. To achieve this, the student first needs to become familiar with the literature on control strategies for venetian blinds and electric lighting. Then, based on the data collected from human actions (the wishes), a suitable decision-making controller for commanding the electric lighting and the shading is to be proposed. Finally, if time permits, the performance of the proposed system will be systematically verified. Some funding might be at disposal for tests in case of promising preliminary outcomes. This project is carried out in collaboration with LESO-PB. Please contact both responsibles for more information.
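One deliberately simple decision-making scheme that the proposed controller could start from is to recall the occupant's past "wishes" by nearest-neighbour lookup over the sensed lighting conditions. The sketch below is a hypothetical baseline: the feature set (e.g. HDR-derived illuminance), the function name, and the choice of k are illustrative assumptions, not part of the project brief.

```python
import numpy as np

def predict_blind_position(logged_states, logged_wishes, current_state, k=3):
    """k-nearest-neighbour recall of occupant 'wishes'.

    logged_states : (N, D) past sensed conditions (e.g. illuminance
                    from the HDR sensor, sun position) -- the feature
                    choice here is an assumption, to be refined.
    logged_wishes : (N,) blind positions the occupant requested under
                    those conditions (0 = open, 1 = closed).
    current_state : (D,) current sensed conditions.
    Returns the mean requested position over the k most similar past states.
    """
    dist = np.linalg.norm(logged_states - current_state, axis=1)
    nearest = np.argsort(dist)[:k]          # indices of the k closest past states
    return logged_wishes[nearest].mean()
```

A learned model (e.g. a classifier or regressor over the same logged wishes) would be the natural next step once such a baseline is in place.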
Project: Semester Project
Period: 01.02.2017 – 01.07.2017
Section(s): IN MA ME MT
Type: 30% theory, 20% software, 50% testing
Knowledge(s):
Subject(s): Human-Building interaction, Machine learning
Responsible(s): Seyed Sina Mirrazavi Salehian, Ali Motamed (ali.motamed@epfl.ch)