Fuzzy based Computing for Disabled using 3D Head Movement

Pradeep V

Abstract


Over the past three decades, many authors have proposed replacements for the mouse for people with motor disabilities who cannot use the typical input devices of a personal computer. Camera-based systems reduce the overhead of head-mounted devices by using a web camera as the mouse. The research challenge lies in tracking the facial features of different users across varying head poses through the camera and accurately converting them into cursor movement and click events. Because existing systems can move the cursor only diagonally, the usual GUI elements such as menus and scroll bars, which require purely horizontal or vertical movement, are difficult to operate. Unintentional head movements also frequently cause existing systems to lose track of the facial features. The proposed system uses a standard web camera to capture three-dimensional head rotation. The positions of the nasal bridge and nose tip are used to move the mouse pointer vertically, while the inner corners of the left and right eyes, together with the nose tip, are used to move it horizontally. Unintentional head motions are disregarded to prevent the loss of facial features, and fuzzy logic maps only the intended head movements to pointer movement. The fuzzy controller takes the rate, direction, and distance of head movement as inputs to determine how the mouse pointer travels, classifying stable, purposeful, and sudden head movements as weak, fair, and strong, respectively. By discarding the weak and strong head motions, the algorithm maps only the fair head movements to cursor movement on the screen. The proposed system achieves both horizontal and vertical cursor movement, and its results are significant compared with the existing system. It also successfully ignores the slight head movement captured by the web camera when the user holds the head stable, and feature loss is completely avoided.
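The fuzzy gating described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the triangular membership functions, the speed thresholds, and the gain are all assumed values, and only the movement rate (not direction and distance) is fuzzified, purely to show how "fair" movements alone would be mapped to a cursor step.

```python
# Illustrative sketch: classify a head movement as weak / fair / strong
# from its normalized speed, and map only "fair" (purposeful) movements
# to a cursor displacement. All membership shapes, thresholds, and the
# gain below are hypothetical, not the paper's actual parameters.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_head_movement(speed):
    """Fuzzify a normalized head-movement speed in [0, 1]."""
    memberships = {
        "weak":   tri(speed, -0.2, 0.0, 0.3),   # stable head / slight jitter
        "fair":   tri(speed,  0.2, 0.5, 0.8),   # purposeful movement
        "strong": tri(speed,  0.7, 1.0, 1.2),   # sudden, unintentional jerk
    }
    return max(memberships, key=memberships.get), memberships

def cursor_delta(dx, dy, speed, gain=10.0):
    """Return a cursor step for the unit direction (dx, dy), or (0, 0)
    when the movement is classified as weak or strong."""
    label, m = classify_head_movement(speed)
    if label != "fair":
        return (0, 0)            # ignore jitter and sudden jerks
    scale = gain * m["fair"]     # stronger "fair" membership -> larger step
    return (round(dx * scale), round(dy * scale))

if __name__ == "__main__":
    print(cursor_delta(1.0, 0.0, 0.05))  # jitter: (0, 0)
    print(cursor_delta(1.0, 0.0, 0.5))   # purposeful: (10, 0)
    print(cursor_delta(1.0, 0.0, 0.95))  # jerk: (0, 0)
```

In a full system the same idea would be applied per axis, with the vertical direction taken from the nasal bridge and nose tip and the horizontal direction from the inner eye corners and nose tip, as the abstract describes.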



DOI: https://doi.org/10.31449/inf.v49i15.5399

This work is licensed under a Creative Commons Attribution 3.0 License.