System ID: U0026-1305201919205000
Title (Chinese): 基於音樂節奏之情感分析應用於舞台燈光模型之研究
Title (English): A stage lighting color model based on music rhythm and emotion
University: National Cheng Kung University (成功大學)
Department (Chinese): 工業設計學系
Department (English): Department of Industrial Design
Academic year: 107
Semester: 1
Year of publication: 108 (2019)
Author (Chinese): 張淳翔
Author (English): Chun-Hsiang Chang
Student ID: P36054111
Degree: Master
Language: English
Pages: 83
Oral defense committee: Advisor: 蕭世文 (Shih-Wen Hsiao); Members: 郭炳宏, 吳昌祚
Keywords (Chinese): 音樂節奏, 情緒辨識, 舞台燈光顏色, 數量化一類
Keywords (English): music rhythm schema, music emotion recognition, emotion color recognition, quantification I
Subject classification:
Abstract (Chinese, translated): With advances in technology, performance venues keep upgrading their sound and stage lighting equipment to increase their appeal, and new automated stage lighting systems allow more varied and spectacular lighting patterns. However, existing stage lighting control still relies on a lighting engineer: before a formal performance, lighting modes are chosen to match the atmosphere and the music, based on the performers' rehearsal. This process costs considerable time and effort, and the lighting choices depend on the engineer's expertise and stage experience. According to the literature, an intelligent lighting color-matching system has been developed that takes the audio file of a piece of music as input and, by analyzing and detecting the rhythm emotion and pulse of the music, derives the most suitable lighting colors. However, existing music emotion detection models still judge mainly by melodic pitch; few emotion models take beat and rhythm as the entry point for an in-depth study of emotion. This study therefore takes music rhythm as its basis and, by analyzing rhythm emotion, aims to construct a model for a lighting color-matching system.

MIDI files of drum-set rhythms serve as the system input. Rhythm features are extracted from them as criteria for judging musical emotion and are mapped onto the Thayer emotion plane to study their correlation with musical emotion, reflecting the links between different rhythm variations and musical emotions. Next, a color model is introduced into the Thayer emotion plane to study the relationship between rhythm emotions and colors, and a support vector machine (SVM) is used to construct a color-emotion map, yielding a model that relates emotions to stage lighting colors. Finally, this study integrates the two models and analyzes them with Quantification Theory Type I to determine how music rhythm affects stage lighting color, establishing a color-matching model that maps music rhythm and emotion to stage lighting colors and fully characterizes the relationships among rhythm, emotion, and color.
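As a rough illustration of the SVM color-emotion map described in the abstract, the sketch below trains an RBF-kernel classifier on points of the Thayer valence-arousal plane. All data here are made up (a toy rule assigning one color name per quadrant); the thesis builds its map from experimental color-emotion ratings and Lab color values, not from this rule.

```python
# Minimal sketch of an SVM color-emotion map on the Thayer plane.
# NOTE: the training data below are synthetic (one color per quadrant);
# the thesis derives its data from color-emotion experiments.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical samples: (valence, arousal) pairs in [-1, 1]^2.
X_train = rng.uniform(-1, 1, size=(200, 2))

# Toy labeling rule: one lighting color per Thayer quadrant.
y_train = np.where(
    X_train[:, 0] >= 0,
    np.where(X_train[:, 1] >= 0, "yellow", "green"),
    np.where(X_train[:, 1] >= 0, "red", "blue"),
)

# RBF-kernel SVM, the classifier named in the abstract.
clf = SVC(kernel="rbf", gamma="scale", C=1.0)
clf.fit(X_train, y_train)

# Query a high-valence, high-arousal point (an upbeat, energetic rhythm).
print(clf.predict([[0.8, 0.7]])[0])
```

In the thesis's pipeline the query point would come from the rhythm-emotion model rather than being chosen by hand, and the predicted label would be a region in a color space rather than a color name.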
Abstract (English): With the development of science and technology, entertainment venues and live houses seek to enhance their competitiveness by introducing new sound and stage lighting equipment. However, stage lighting control still relies on the operation of a lighting engineer. The stage lighting mode, including its various colors, is selected by the lighting engineer during the rehearsal before the official performance. Not only does this process take time and effort, but the choice of lighting also depends on having a professional and experienced lighting engineer. Thus, a methodology for stage lighting color selection would help.
Music emotion recognition (MER) has developed considerably in recent years. However, existing music emotion detection models are still based on melodic pitch or on timbre; there is rarely an emotion model that uses beat and rhythm as an entry point for in-depth emotional exploration. This study develops a smart lighting color-matching system that takes the audio file of the desired music as input and analyzes the rhythm emotion and mood of the music to obtain the most suitable lighting colors. Rhythm features are mapped onto the Thayer emotion model to study the correlation between music and emotion, reflecting how different music rhythms relate to musical emotions. A support vector machine and a color map based on the Thayer model are then used to study the relationship between rhythm emotions and colors, constructing a color-emotion map. Finally, the results above are integrated and analyzed with the Quantification Theory Type I method to determine the influence of different rhythm emotions on stage lighting color, establishing a color-matching model that relates the rhythm and emotion of the music to the color of the stage lighting.
Table of Contents
Abstract (Chinese) I
SUMMARY II
ACKNOWLEDGEMENTS III
TABLE OF CONTENTS IV
LIST OF TABLES VII
LIST OF FIGURES VIII
CHAPTER 1 INTRODUCTION 1
1.1 Background 1
1.1.1 Music rhythm and emotions 2
1.1.2 Emotion classification and identification 3
1.1.3 Emotion within Colors 4
1.2 Motivation 5
1.3 Purpose 5
1.4 Limitation 6
1.5 Research framework 7
CHAPTER 2 LITERATURE REVIEW 9
2.1 Music rhythm and percussion 9
2.1.1 Music rhythm 9
2.1.2 The Setting of drum set 11
2.1.3 Drum tab 13
2.2 Connection between Music and emotion 15
2.3 Emotion Models 16
2.4 Music feature abstraction 20
2.5 Connection between lighting color and emotion 22
2.6 Stage lighting 24
2.6.1 Lighting Style 24
2.6.2 Stage lighting equipment 24
2.7 Color model 28
2.7.1 RGB color model 29
2.7.2 HSL color model 30
2.7.3 CIE Lab color model 32
2.7.4 CIE Lab conversion formula 34
CHAPTER 3 THEORETICAL FRAMEWORK 35
3.1 Rhythm Feature recognition 35
3.1.1 Audio analysis and processing 35
3.1.2 Principal Components Analysis PCA 36
3.2 Support Vector Machine SVM 37
3.2.1 Kernel Function 38
3.3 Multidimensional Quantification I 40
CHAPTER 4 RESEARCH PROCEDURES 44
4.1 Rhythm sample selection and feature extraction 45
4.2 Experiment of evaluating music emotions of rhythm 46
4.3 Experiment of stage lighting color matching 48
4.3.1 Color set selection 48
4.3.2 Color emotion experiment 50
4.4 Establish the correlation model between music rhythm and emotion 50
4.4.1 Rhythm feature and types 51
4.4.2 Rhythm emotion experiment result analysis and data normalization 53
4.5 Establish the correlation model between rhythm emotion and color 54
4.5.1 Color map 54
4.5.2 Colors Classification 55
4.6 Establish the correlation model between music rhythm and color 55
4.6.1 Color model selection 56
4.6.2 Correlation analysis 57
4.6.3 Residual analysis and residual fix 58
CHAPTER 5 RESULT AND DISCUSSION 59
5.1 Music rhythm and music emotion 59
5.1.1 Music rhythm and valence 61
5.1.2 Music rhythm and arousal 62
5.2 Music emotion and color 63
5.3 Correlation between music rhythm and color 67
5.3.1 Correlation between music rhythm and L 67
5.3.2 Correlation between music rhythm and a 68
5.3.3 Correlation between music rhythm and b 69
5.3.4 Residual Analysis and Residual fix 70
CHAPTER 6 CONCLUSION AND RECOMMENDATION 73
6.1 Research conclusion 73
6.2 Recommendation 75
REFERENCE 76

References
Adnan, M. B., Muhammad, M., Syed, M. A., & Bilal, K. (2016). Human emotion recognition and analysis in response to audio music using brain signals. Computers in Human Behavior, 65, 267-275.
Ajmera, J., McCowan, I., & Bourlard, H. (2003). Speech/music segmentation using entropy and dynamism features in a HMM classification framework. Speech Communication, 40(3), 351-363. doi:10.1016/S0167-6393(02)00087-0
Bigand, E., Vieillard, S., Madurell, F., Marozeau, J., & Dacquet, A. (2005). Multidimensional scaling of emotional responses to music: The effect of musical expertise and of the duration of the excerpts. Cognition & Emotion, 19(8), 1113-1139.
Caivano, J. L. (1994). Color and sound: Physical and psychophysical relations. Color Research and Application, 19(2), 126-133.
Clarke, E., & Cook, N. (2004). Empirical Musicology. Oxford: Oxford University Press.
Curwen, C. (2018). Music-colour synaesthesia: Concept, context and qualia. Consciousness and Cognition, 61, 94-106.
Demaine, E. D., Gomez-Martin, F., Meijer, H., Rappaport, D., Taslakian, P., Toussaint, G. T., Winograd, T., & Wood, D. R. (2009). The distance geometry of music. Computational Geometry, 42(5), 429-454.
Desain, P., & Honing, H. (2003). The formation of rhythmic categories and metric priming. Perception, 32, 341-365.
Dhanalakshmi, P., Palanivel, S., & Ramalingam, V. (2009). Classification of audio signals using SVM and RBFNN. Expert Systems with Applications, 36(3), 6069-6075. doi:10.1016/j.eswa.2008.06.126
Dixon, S. (1997). Beat induction and rhythm recognition. Advanced Topics in Artificial Intelligence, 311-320.
Dousty, M., Daneshvar, S., & Haghjoo, M. (2011). The effects of sedative music, arousal music, and silence on electrocardiography signals. Journal of Electrocardiology, 44(3), 396.e1-396.e6.
Duda, R. O., Hart, P. E., & Stork, D. G. (2000). Pattern Classification (2nd ed.). New York: Wiley.
Ekman, P. (1972). Universals and cultural differences in facial expression of emotion. In Nebraska Symposium on Motivation (pp. 207-283). Lincoln, NE: University of Nebraska Press.
Farnsworth, P. R. (1954). A study of the Hevner adjective list. The Journal of Aesthetics and Art Criticism, 13(1), 97-103.
Feng, Y., Zhuang, Y., & Pan, Y. (2003). Popular music retrieval by detecting mood. In Proceedings of the 26th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Toronto, ON, Canada.
Fitch, W. T. (2013). Rhythmic cognition in humans and animals: Distinguishing meter and pulse perception. Frontiers in Systems Neuroscience, 7, 68.
Fitch, W. T. (2016). Music, dance, meter and groove: A forgotten partnership. Frontiers in Systems Neuroscience, 10, 64.
Honing, H. (2006a). Evidence for tempo-specific timing in music using a web-based experimental setup. Journal of Experimental Psychology: Human Perception and Performance, 32(3), 780-786.
Honing, H. (2011a). Musical Cognition: A Science of Listening. New Brunswick, NJ: Transaction Publishers.
Honing, H. (2011b). The Illiterate Listener: On Music Cognition, Musicality and Methodology. Amsterdam: Amsterdam University Press.
Honing, H. (2013). Structure and interpretation of rhythm in music. In The Psychology of Music (3rd ed., pp. 369-404).
Hsiao, S. W. (1994). Fuzzy set theory on car-color design. Color Research & Application, 19(3), 202-213.
Hsiao, S. W., Chiu, F. Y., & Hsu, H. Y. (2008). A computer-assisted colour selection system based on aesthetic measure for colour harmony and fuzzy logic theory. Color Research & Application, 33(5), 411-423.
Hsiao, S. W., Hsu, C. F., & Tang, K. W. (2013). A consultation and simulation system for product color planning based on interactive genetic algorithms. Color Research & Application, 38(5), 375-390.
Hsiao, S. W., & Tsai, C. J. (2015). Transforming the natural colors of an image into product design: A computer-aided color planning system based on fuzzy pattern recognition. Color Research & Application, 40(6), 612-625.
Hsiao, S. W., Chen, S. K., & Lee, C. H. (2017). Methodology for stage lighting control based on music emotions. Information Sciences, 412-413, 14-35.
Huron, D. (1992). The ramp archetype and the maintenance of passive auditory attention. Music Perception, 10(1), 83-92.
Huss, M., Verney, J. P., Fosker, T., Mead, N., & Goswami, U. (2011). Music, rhythm, rise time perception and developmental dyslexia: Perception of musical meter predicts reading and phonology. Cortex, 47(6), 674-689.
Juslin, P. N. (2000). Cue utilization in communication of emotion in music performance: Relating performance to perception. Journal of Experimental Psychology: Human Perception and Performance.
Juslin, P. N., & Laukka, P. (2004). Expression, perception, and induction of musical emotions: A review and a questionnaire study of everyday listening. Journal of New Music Research, 33(3), 217-238. doi:10.1080/0929821042000317813
Juslin, P. N., & Timmers, R. (2010). Expression and communication of emotion in music performance. In Handbook of Music and Emotion: Theory, Research, Applications (pp. 453-489). Oxford: Oxford University Press.
Kaya, N., & Epps, H. (2004). Relationship between color and emotion: A study of college students. College Student Journal, 38, 396-405.
Knoblich, G., & Jordan, J. S. (2003). Action coordination in groups and individuals: Learning anticipatory control. Journal of Experimental Psychology: Learning, Memory, and Cognition, 29, 1006-1016.
Knoeferle, K. M., Paus, V. C., & Vossen, A. (2017). An upbeat crowd: Fast in-store music alleviates negative effects of high social density on customers' spending. Journal of Retailing, 93(4), 541-549.
Kontukoski, M., Paakki, M., Thureson, J., Uimonen, H., & Hopia, A. (2016). Imagined salad and steak restaurants: Consumers' colour, music and emotion associations with different dishes. International Journal of Gastronomy and Food Science, 4, 1-11.
Kotz, S. A., Ravignani, A., & Fitch, W. T. (2018). The evolution of rhythm processing. Trends in Cognitive Sciences, 22(10), 896-910.
Kreutz, G. (2000). Basic emotions in music. In Proceedings of the 6th International Conference on Music Perception and Cognition.
Krumhansl, C. L. (1997). An exploratory study of musical emotions and psychophysiology. Canadian Journal of Experimental Psychology, 51(4), 336-353. doi:10.1037/1196-1961.51.4.336
Krumhansl, C. L. (2000). Rhythm and pitch in music cognition. Psychological Bulletin, 126(1), 159.
Laurier, C., & Herrera, P. (2007). Audio music mood classification using support vector machine. In Proceedings of the 8th International Conference on Music Information Retrieval, Vienna, Austria.
Laurier, C., Meyers, O., Serrà, J., Blech, M., & Herrera, P. (2009). Music mood annotator design and integration.
Lee, C. H., Shih, J. L., Yu, K. M., & Lin, H. S. (2009). Automatic music genre classification based on modulation spectral analysis of spectral and cepstral features. IEEE Transactions on Multimedia, 11(4), 670-682. doi:10.1109/TMM.2009.2017635
Li, C. H. (2010). An automatic method for selecting the parameter of the RBF kernel function to support vector machines. Honolulu, HI.
Li, T., & Ogihara, M. (2003). Detecting emotion in music. In Proceedings of the 4th International Conference on Music Information Retrieval, Baltimore, MD, USA.
Li, T., & Ogihara, M. (2004, May 17-21). Content-based music similarity search and emotion detection. In Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, Toulouse, France.
Li, T., & Ogihara, M. (2006). Toward intelligent music information retrieval. IEEE Transactions on Multimedia, 8(3), 564-574. doi:10.1109/TMM.2006.870730
Lindstrom, E., & Juslin, P. N. (2003). Expressivity comes from within your soul: A questionnaire study of students' perception of musical expressivity. Research Studies in Music Education, 20, 23-47.
London, J. (2012). Hearing in Time: Psychological Aspects of Musical Meter (2nd ed.). Oxford: Oxford University Press. ISBN 978-0-19-974437-4.
Lu, L., Liu, D., & Zhang, H. J. (2006). Automatic mood detection and tracking of music audio signals. IEEE Transactions on Audio, Speech, and Language Processing, 14(1), 5-18. doi:10.1109/TSA.2005.860344
Mandel, M. I., Poliner, G. E., & Ellis, D. P. W. (2006). Support vector machine active learning for music retrieval. Multimedia Systems, 12(1), 3-13. doi:10.1007/s00530-006-0032-2
Marks, L. E. (1997). On colored-hearing synesthesia: Cross-modal translations of sensory dimensions. In S. Baron-Cohen & J. E. Harrison (Eds.), Synaesthesia: Classic and Contemporary Readings.
Miller, M. C. (1997). Color for Interior Architecture (1st ed.). New York, NY.
Nagamachi, M. (1995). Kansei engineering: A new ergonomic consumer-oriented technology for product development. International Journal of Industrial Ergonomics, 15(1), 3-11.
Nalini, N. J., & Palanivel, S. (2015). Music emotion recognition: The combined evidence of MFCC and residual phase. Egyptian Informatics Journal, 17, 1-10.
Ou, L. C., Luo, M. R., Woodcock, A., & Wright, A. (2004). A study of colour emotion and colour preference. Part I: Colour emotions for single colours. Color Research & Application, 29(3), 232-240.
Parncutt, R. (2007). Can researchers help artists? Music performance research for music students. Music Performance Research, 1(1), 1-25. ISSN 1775-9219.
Peretz, I., Gagnon, L., & Bouchard, B. (1998). Music and emotion: Perceptual determinants, immediacy, and isolation after brain damage. Cognition, 68(2), 111-141. doi:10.1016/S0010-0277(98)00043-2
Picard, R. W., & Cosier, G. (1997). Affective intelligence - the missing link? BT Technology Journal, 15(4), 150-161.
Pridmore, R. W. (1992). Music and color: Relations in the psychophysical perspective. Color Research and Application, 17(1), 57-61. doi:10.1002/col.5080170110
Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161-1178. doi:10.1037/h0077714
Sakai, K., Hikosaka, O., & Nakamura, K. (2004). Emergence of rhythm during motor learning. Trends in Cognitive Sciences, 8, 547-553.
Scholes, P. (1977). Metre and Rhythm. London and New York: Oxford University Press. ISBN 0-19-311306-6.
Sebba, R. (1991). Structural correspondence between music and color. Color Research and Application, 16(2), 81-88. doi:10.1002/col.5080160206
Sen, A., & Srivastava, M. (1990). Regression Analysis: Theory, Methods, and Applications. New York: Springer.
Seo, C., Lee, K. Y., & Lee, J. (2001). GMM based on local PCA for speaker identification. Electronics Letters, 37(24), 1486-1488.
Shao, B., Wang, D. D., Li, T., & Ogihara, M. (2009). Music recommendation based on acoustic features and user access patterns. IEEE Transactions on Audio, Speech, and Language Processing, 17(8), 1602-1611. doi:10.1109/TASL.2009.2020893
Shi, Y. Y., Zhu, X., Kim, H. G., & Eom, K. W. (2006). A tempo feature via modulation spectrum analysis and its application to music emotion classification. In Proceedings of the IEEE International Conference on Multimedia and Expo, Toronto, Canada.
Silva, A. G., Guida, H. L., Antônio, A. S., Marcomini, R. S., Fontes, A. M. G. G., Abreu, L. C., Roque, A. L., Silva, S. B., Raimundo, R. D., Ferreira, C., & Valenti, V. E. (2014). An exploration of heart rate response to differing music rhythm and tempos. Complementary Therapies in Clinical Practice, 20(2), 130-134.
Solomatine, D. P., & Shrestha, D. L. (2004). AdaBoost.RT: A boosting algorithm for regression problems. In Proceedings of the IEEE International Joint Conference on Neural Networks.
Sordo, M., Laurier, C., & Celma, O. (2007). Annotating music collections: How content-based similarity helps to propagate labels. In Proceedings of the 8th International Conference on Music Information Retrieval, Vienna, Austria.
Swaminathan, S., & Schellenberg, E. G. (2015). Current emotion research in music psychology. Emotion Review, 7(2), 189-197.
Tay, F. E. H., & Cao, L. (2001). Application of support vector machines in financial time series forecasting. Omega, 29(4), 309-317.
Thayer, R. E. (1996). The Origin of Everyday Moods: Managing Energy, Tension, and Stress. Oxford: Oxford University Press.
Toiviainen, P., Luck, G., & Thompson, M. R. (2010). Embodied meter: Hierarchical eigenmodes in music-induced movement. Music Perception, 28(1), 59-70.
Umapathy, K., Krishnan, S., & Jimaa, S. (2005). Multi-group classification of audio signals using time-frequency parameters. IEEE Transactions on Multimedia, 7(2), 308-315. doi:10.1109/TMM.2005.843363
Valdez, P., & Mehrabian, A. (1994). Effects of color on emotions. Journal of Experimental Psychology: General, 123(4), 394-409. doi:10.1037/0096-3445.123.4.394
Vapnik, V. (1995). The Nature of Statistical Learning Theory. New York: Springer.
Vapnik, V., Golowich, S., & Smola, A. (1997). Support vector method for function approximation, regression estimation, and signal processing. Cambridge, MA: MIT Press.
Vieillard, S., Peretz, I., Gosselin, N., Khalfa, S., Gagnon, L., & Bouchard, B. (2008). Happy, sad, scary and peaceful musical excerpts for research on emotions. Cognition & Emotion, 22(4), 720-752. doi:10.1080/02699930701503567
Xu, C. S., Maddage, N. C., & Shao, X. (2005). Automatic music classification and summarization. IEEE Transactions on Speech and Audio Processing, 13(3), 441-450. doi:10.1109/TSA.2004.840939
Yang, Y. H., Lin, Y. C., Su, Y. F., & Chen, H. H. (2008). A regression approach to music emotion recognition. IEEE Transactions on Audio, Speech, and Language Processing, 16(2), 448-457. doi:10.1109/TASL.2007.911513
Zentner, M., & Eerola, T. (2010). Rhythmic engagement with music in infancy. Proceedings of the National Academy of Sciences, 107, 5768-5773.
Zhu, X., Shi, Y. Y., Kim, H. G., & Eom, K. W. (2006). An integrated music recommendation system. IEEE Transactions on Consumer Electronics, 52(3), 917-925.
曾國雄 (1991). 多變量解析與其應用 [Multivariate analysis and its applications]. Taipei: 華泰書局.
張維哲 (1992). 人工神經網路 [Artificial neural networks]. Taipei: 全欣資訊圖書股份有限公司.
蔡振家 (2011). 音樂認知心理學 [Psychology of music cognition]. Taipei: National Taiwan University Press.
Sacks, O. (2008). Musicophilia: Tales of Music and the Brain (廖月娟, Trans.). Taipei: 天下文化.
Full-text access rights
  • The author agrees to authorize on-campus browsing/printing of the electronic full text, available from 2019-10-03.
  • The author agrees to authorize off-campus browsing/printing of the electronic full text, available from 2019-10-03.

