Automatic Thai Phiang or Flute-Playing Robot Reading Notes with Image Processing Technology

International Journal of Electronics and Communication Engineering
© 2025 by SSRG - IJECE Journal
Volume 12 Issue 6
Year of Publication: 2025
Authors: Thongchai Thongyoo, Keeradit Saipattalung, Panudach Srikrajang, Sarawut Puttaraksa
How to Cite?

Thongchai Thongyoo, Keeradit Saipattalung, Panudach Srikrajang, Sarawut Puttaraksa, "Automatic Thai Phiang or Flute-Playing Robot Reading Notes with Image Processing Technology," SSRG International Journal of Electronics and Communication Engineering, vol. 12, no. 6, pp. 195-205, 2025. Crossref, https://doi.org/10.14445/23488549/IJECE-V12I6P115

Abstract:

This research aimed to study the characteristics of flute playing and note-reading techniques with image processing technology. The concept was to integrate robotics technology with Thai music to help preserve Thai culture. A robot was designed with a structure similar to a human flute-playing posture, together with a simulated head of the “Ngo Pa” character (an ethnic Ngo person) in keeping with the theme of the Thai song “Rojana-uey”. The opening and closing of the flute holes were controlled by a pneumatic system, with a Raspberry Pi board as the central processor working with a webcam, a touch-screen monitor, an 8-channel relay, a 24-volt power supply, and air-controlled solenoid valves. The findings revealed that: 1. Note reading with image processing technology was most accurate for Thai when using the Adobe Thai font in bold and italics, which was 100% accurate, while English note reading was 100% accurate for almost all font types and character styles, except for the LilyUPC font, which had lower accuracy; 2. Airspeed testing showed that the speed suitable for playing the Thai Phiang Or flute is 1.14 meters per second; 3. Analysis of the airflow inside the Thai Phiang Or flute by Computational Fluid Dynamics (CFD) gave values between 0 and 0.948 meters per second; 4. Sound similarity testing, comparing spectrum values between human-blown and robot-blown sounds, found that the Fa note had the highest similarity at 98.27%, followed by the Mi note at 97.20% and the Sol note at 94.23%; the lowest similarity was 56.56%.
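As an illustration of the note-reading and actuation pipeline summarized above, the following is a minimal Python sketch. It assumes OpenCV for webcam capture, Tesseract (via pytesseract) as the OCR engine, and the RPi.GPIO library for driving the 8-channel relay; the abstract does not name these libraries, and the GPIO pin numbers and note-to-hole fingering map below are hypothetical placeholders rather than the paper's actual configuration.

# Hypothetical sketch: read a printed note name from the webcam and actuate the
# corresponding solenoid valves through an 8-channel relay on a Raspberry Pi.
# Library choices (OpenCV, pytesseract, RPi.GPIO), pin numbers, and the
# fingering map are assumptions; the paper only states that a Raspberry Pi,
# a webcam, an 8-channel relay, and pneumatic solenoid valves are used.

import time
import cv2                      # webcam capture and image preprocessing
import pytesseract              # OCR engine (assumed; not named in the paper)
import RPi.GPIO as GPIO         # relay control on the Raspberry Pi

# Hypothetical BCM pin numbers for the 8 relay channels.
RELAY_PINS = [5, 6, 13, 19, 26, 16, 20, 21]

# Hypothetical mapping from a recognised note name to the open (1) / closed (0)
# state of each finger hole; the real fingering chart comes from the paper.
NOTE_TO_HOLES = {
    "Do":  [1, 1, 1, 1, 1, 1, 1, 0],
    "Re":  [1, 1, 1, 1, 1, 1, 0, 0],
    "Mi":  [1, 1, 1, 1, 1, 0, 0, 0],
    "Fa":  [1, 1, 1, 1, 0, 0, 0, 0],
    "Sol": [1, 1, 1, 0, 0, 0, 0, 0],
    "La":  [1, 1, 0, 0, 0, 0, 0, 0],
    "Ti":  [1, 0, 0, 0, 0, 0, 0, 0],
}

def setup_gpio():
    """Configure the relay channels; many relay boards are active-low."""
    GPIO.setmode(GPIO.BCM)
    for pin in RELAY_PINS:
        GPIO.setup(pin, GPIO.OUT, initial=GPIO.HIGH)

def read_note(frame):
    """Binarise the camera frame and OCR the printed note name (Thai or English)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return pytesseract.image_to_string(binary, lang="tha+eng").strip()

def play_note(note, hold_s=0.5):
    """Switch the relay channels so the pneumatic actuators cover or uncover holes."""
    pattern = NOTE_TO_HOLES.get(note)
    if pattern is None:
        return
    for pin, open_hole in zip(RELAY_PINS, pattern):
        GPIO.output(pin, GPIO.LOW if open_hole else GPIO.HIGH)
    time.sleep(hold_s)

if __name__ == "__main__":
    setup_gpio()
    cap = cv2.VideoCapture(0)    # webcam facing the printed score
    try:
        ok, frame = cap.read()
        if ok:
            play_note(read_note(frame))
    finally:
        cap.release()
        GPIO.cleanup()

In a fuller system, the recognised text would be matched against the Thai or English note names whose font accuracy was tested in the paper, and each relay output would switch a solenoid valve feeding the pneumatic actuator that covers one finger hole.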

Keywords:

Phiang Or flute, Image processing, Robotics technology, Computational fluid dynamics.
