Robotic clothing assistance is a highly challenging problem involving close interaction between the robot, non-rigid clothing material, and an assisted person whose posture varies. The relationship between the human and the cloth needs to be estimated accurately to ensure that the robot successfully completes the clothing task. In this presentation, a system for the real-time estimation of the human-cloth relationship will be demonstrated. The proposed system relies on a low-cost depth sensor, which provides color and depth information of the environment without requiring an elaborate setup. An efficient algorithm will be presented that estimates the parameters representing the topological relationship between the human and the clothing article. At the core of the approach are (1) a low-dimensional representation of the human-cloth relationship using topology coordinates and (2) a unified ellipse-fitting algorithm for the compact representation of the state of clothing articles. Experimental results will be shown that illustrate the robustness of these feature representations. Furthermore, the performance of the proposed method will be demonstrated through its application to real-time clothing assistance tasks, and its accuracy will be assessed by comparing the estimates with ground truth. Finally, some future directions and potential improvements to the proposed method will be discussed.
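For illustration only, the sketch below shows one plausible way to fit an ellipse to 2D points sampled from a clothing-article opening (e.g., a collar or sleeve edge projected onto the depth sensor's image plane), using OpenCV's cv2.fitEllipse. The function name fit_cloth_opening_ellipse and the point-extraction step are hypothetical assumptions, not the presented system's unified ellipse-fitting algorithm, which may differ in its formulation.

```python
import numpy as np
import cv2


def fit_cloth_opening_ellipse(points_2d):
    """Fit an ellipse to 2D points sampled from a cloth opening.

    points_2d: (N, 2) array of image-plane coordinates, e.g. obtained by
    projecting the segmented edge of a collar or sleeve from the depth data
    (hypothetical preprocessing step). Returns the centre (cx, cy), the
    semi-axis lengths (major, minor) in pixels, and the rotation angle in
    degrees, or None if there are too few points for a stable fit.
    """
    if len(points_2d) < 5:  # cv2.fitEllipse requires at least 5 points
        return None
    pts = np.asarray(points_2d, dtype=np.float32).reshape(-1, 1, 2)
    (cx, cy), (d1, d2), angle = cv2.fitEllipse(pts)
    # cv2.fitEllipse returns full axis lengths (diameters); convert to semi-axes.
    major, minor = max(d1, d2) / 2.0, min(d1, d2) / 2.0
    return (cx, cy), (major, minor), angle


if __name__ == "__main__":
    # Synthetic check: noisy samples on an ellipse centred at (320, 240).
    t = np.linspace(0.0, 2.0 * np.pi, 100)
    xs = 320 + 80 * np.cos(t) + np.random.randn(100) * 2.0
    ys = 240 + 40 * np.sin(t) + np.random.randn(100) * 2.0
    print(fit_cloth_opening_ellipse(np.stack([xs, ys], axis=1)))
```

In a real-time setting, such a per-frame fit would yield a compact, fixed-size parameterisation of the clothing state (centre, axes, orientation) that can be tracked across frames; the hedged sketch above is intended only to make that idea concrete.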