In this project we investigate and implement body-movement tracking for sound modulation. The idea began when one of the students wanted the ability to pitch-bend like a guitar player while playing the piano. The basic idea is to track the movement of a motion sensor (in this project, via the Android sensors API) and modulate the sound the musician is playing according to the sensor data.

The first challenge was connecting all the devices to the computer; the backbone of the connection is UDP over a Wi-Fi network. The next challenge was understanding how to transform movement signals into sound modulation: we implemented several filters and time-series algorithms to give the player the best possible control while suppressing the noise of playing without modulation intent. The next challenge was building a real-time application that processes the inputs accordingly and connects the output generically, so the musician can choose their preferred synthesizer. The code was written efficiently in Python (response time ≈ 0 µs), and the output is forwarded to a virtual MIDI driver, which behaves like a physical MIDI device and can connect to any software-based synthesizer. The most challenging task was integrating the entire system: Android, Python, the virtual MIDI driver, the synthesizer, and physical MIDI.
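The Wi-Fi transport described above can be sketched as a small UDP receiver on the computer side. The comma-separated packet format, host, and port below are assumptions for illustration; the project's actual wire format may differ.

```python
import socket


def parse_packet(data: bytes) -> tuple[float, float, float]:
    """Parse one datagram of the (assumed) form b"ax,ay,az" into
    three accelerometer floats."""
    ax, ay, az = (float(v) for v in data.decode("ascii").split(","))
    return ax, ay, az


def run_receiver(host: str = "0.0.0.0", port: int = 5005) -> None:
    """Listen for sensor datagrams sent by the phone over Wi-Fi."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind((host, port))
        while True:
            # One sensor reading per datagram keeps latency low:
            # a lost packet is simply superseded by the next one.
            data, _addr = sock.recvfrom(1024)
            print(parse_packet(data))
```

UDP is a natural fit here because stale sensor readings are worthless for real-time modulation; retransmission (as TCP would do) only adds latency.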
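The filtering stage can be illustrated with a minimal sketch: a low-pass (exponential moving average) filter followed by a dead zone, so that ordinary playing jitter produces no modulation while deliberate movements pass through. The `alpha` and `dead_zone` values are illustrative defaults, not the project's actual tuning.

```python
class MotionFilter:
    """Exponential moving average plus a dead zone.

    Small deviations (normal playing noise) are gated to zero;
    deliberate, sustained movement produces a modulation signal.
    """

    def __init__(self, alpha: float = 0.2, dead_zone: float = 0.05):
        self.alpha = alpha          # smoothing factor in (0, 1]
        self.dead_zone = dead_zone  # gate threshold on the smoothed signal
        self.state = 0.0

    def update(self, sample: float) -> float:
        # Low-pass the raw sensor sample to suppress high-frequency noise.
        self.state = self.alpha * sample + (1.0 - self.alpha) * self.state
        # Gate out small deviations so resting hands cause no modulation.
        return 0.0 if abs(self.state) < self.dead_zone else self.state
```

A larger `alpha` tracks fast gestures more responsively but lets through more jitter; the dead zone is what realizes "no modulation without modulation intent".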
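The final step, converting the filtered signal into MIDI pitch bend, might look like the sketch below. It maps a tilt angle to the 14-bit pitch-bend range; the `mido` library, the `loopMIDI` port name, and the 45° full-scale tilt are assumptions for illustration, not necessarily what the project used.

```python
def tilt_to_pitchwheel(tilt: float, max_tilt: float = 45.0) -> int:
    """Map a tilt angle in degrees to a MIDI pitch-bend value in the
    signed 14-bit range [-8192, 8191] used by mido's 'pitchwheel' message.
    max_tilt (full-bend angle) is an illustrative choice."""
    tilt = max(-max_tilt, min(max_tilt, tilt))  # clamp to the usable range
    value = round(tilt / max_tilt * 8191)
    return max(-8192, min(8191, value))


# Sending the value to a virtual MIDI port (e.g. loopMIDI on Windows),
# which a software synthesizer then reads as if it were a hardware
# controller, might look like:
#
#     import mido
#     out = mido.open_output("loopMIDI Port")  # port name is an assumption
#     out.send(mido.Message("pitchwheel", pitch=tilt_to_pitchwheel(12.0)))
```

Because the virtual driver presents a standard MIDI port, the same output path works with any synthesizer the musician prefers, which is what makes the output connection generic.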