The proposed method is a preliminary step toward a human-machine interaction system in which a robot arm mimics the movements of a human arm observed through a camera setup. To achieve this goal, the posture of a model joint that simulates a human arm is determined by estimating its bending and yaw angles from captured images. The image analysis steps consist of noise suppression via median filtering, thresholding, and connected component analysis, which isolate the markers placed on the joint. The relative positions of these markers are then used to determine the unknown bending and yaw angles of the model joint. This information is passed on to a PUMA 760 robot arm to complete the task. Preliminary simulation results indicate that the proposed system can be deployed in a real environment in which a human arm is mimicked by a machine equipped with a visual sensor.
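The pipeline above (median filtering, thresholding, connected component analysis, then angle estimation from marker centroids) can be sketched as follows. This is a minimal illustrative implementation, not the authors' exact code: the 3x3 median window, the threshold value, the two-marker geometry, and all function names are assumptions made for the example.

```python
# Illustrative sketch of the marker-based angle estimation pipeline.
# The 3x3 window, threshold of 128, and two-marker planar geometry are
# assumptions for this example, not details taken from the paper.
import numpy as np
from collections import deque

def median_filter3(img):
    """3x3 median filter for impulsive-noise suppression (edges copied)."""
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.median(img[y-1:y+2, x-1:x+2])
    return out

def connected_components(mask):
    """Label 4-connected foreground regions; return their centroids."""
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=int)
    centroids, cur = [], 0
    for y in range(h):
        for x in range(w):
            if mask[y, x] and labels[y, x] == 0:
                cur += 1
                labels[y, x] = cur
                q, pts = deque([(y, x)]), []
                while q:
                    cy, cx = q.popleft()
                    pts.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = cur
                            q.append((ny, nx))
                centroids.append(np.array(pts, dtype=float).mean(axis=0))
    return centroids

def joint_angle(c0, c1):
    """Planar angle (degrees) of the segment between two marker centroids."""
    dy, dx = c1[0] - c0[0], c1[1] - c0[1]
    return np.degrees(np.arctan2(dy, dx))

# Synthetic test image: two bright square markers on a dark background.
img = np.zeros((40, 40), dtype=float)
img[5:9, 5:9] = 255.0      # marker A
img[25:29, 25:29] = 255.0  # marker B

img = median_filter3(img)           # suppress impulsive noise
mask = img > 128                    # thresholding
cents = connected_components(mask)  # locate each marker
angle = joint_angle(cents[0], cents[1])
print(len(cents), round(angle, 1))  # → 2 45.0
```

In a real setup the bending angle would be recovered analogously from a second marker pair seen by the camera, and the resulting angles forwarded to the robot controller.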