   The electronic thesis has not yet been authorized for public access; for the print copy, please consult the library catalog.
(Note: if no record is found, or the holdings status shows "closed stacks, not public," the thesis is not in the stacks and cannot be accessed.)
System ID U0026-2706201416000800
Title (Chinese) 使用整體趨勢擴展技術提升多模式整合法之預測準確率
Title (English) Using Mega-Trend-Diffusion Technique to Improve the Forecasting Accuracy of Meta-Model Method
University National Cheng Kung University (成功大學)
Department (Chinese) 工業與資訊管理學系
Department (English) Department of Industrial and Information Management
Academic Year 102
Semester 2
Publication Year 103 (ROC calendar; 2014)
Author (Chinese) 陳惠昭
Author (English) Hui-Chao Chen
Student ID R36001160
Degree Master's
Language Chinese
Pages 63
Committee Advisor: 利德江
Committee member: 李賢得
Committee member: 吳植森
Keywords (Chinese) 多模式整合法, 預測數值整合法, 整體趨勢擴展技術
Keywords (English) mega-trend-diffusion, numerical prediction ensemble method, meta-model ensemble method
Subject Classification
Abstract (Chinese, translated) Traditional pattern-recognition systems learn from data with a single algorithm, which cannot accommodate data exhibiting different behavior patterns, and machine-learning methods therefore developed vigorously over the past decades. In the last ten years, single learning algorithms have reached a bottleneck, and none can yet satisfy data with differing behavioral characteristics, so many researchers have tried combining multiple learning algorithms for further improvement; this is called the multi-model ensemble method. Such research, however, concentrates mainly on the model-combination process, and comparatively little addresses how to combine the results: classification problems are typically combined by majority voting, while for numerical prediction most studies, such as Yu (2011), use simple averaging. Based on the mega-trend-diffusion (MTD) technique, this study therefore proposes a rational procedure for generating predicted values under small-sample learning, computing each model's prediction under a minimum-error objective so as to obtain the robustness of a multi-model system. The models combined in this study are linear regression, back-propagation neural network, support vector regression, and the M5' model tree. To verify its effectiveness, the proposed method is compared with simple averaging and with each individual model's predictions, using three cases for validation. The experimental results show that the proposed method does reduce prediction error.
Abstract (English) Single learning algorithms have been developed over the past few decades, but they cannot satisfy all kinds of data. Many studies have therefore tried to combine a variety of learning algorithms to improve learning accuracy, an approach called the multi-model ensemble method; however, these studies focus on the combination process rather than on how to combine the results. This thesis presents an approach based on the mega-trend-diffusion (MTD) technique, a rational numerical-prediction procedure that minimizes error by finding a compromise solution among the models. The models used include multiple regression, back-propagation networks, support vector regression, and the M5' model tree. The proposed approach is compared with the simple average method and with the individual models, and three case studies are used to illustrate the details of this research. The empirical results show that the proposed method reduces the models' mean absolute percentage error.
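The mega-trend-diffusion step underlying the abstract can be illustrated with a short sketch. This is a minimal illustration based on the MTD formulation published in Li et al. (2007), not the thesis's actual implementation; the function names and the diffusion threshold of 10⁻²⁰ are assumptions, and the MAPE helper reflects the evaluation metric named in the English abstract.

```python
import math

def mtd_bounds(samples, theta=1e-20):
    """Mega-trend-diffusion domain expansion for a small sample
    (after Li et al., 2007): returns an expanded (lower, upper)
    range around the observed data."""
    lo, hi = min(samples), max(samples)
    u_set = (lo + hi) / 2.0                      # centre of the data range
    n_l = sum(1 for x in samples if x < u_set)   # points below the centre
    n_u = sum(1 for x in samples if x > u_set)   # points above the centre
    skew_l = n_l / (n_l + n_u)                   # left/right skewness weights
    skew_u = n_u / (n_l + n_u)
    n = len(samples)
    mean = sum(samples) / n
    var_hat = sum((x - mean) ** 2 for x in samples) / (n - 1)
    spread = -2.0 * var_hat * math.log(theta)    # log(theta) < 0, so positive
    lower = u_set - skew_l * math.sqrt(spread / n_l)
    upper = u_set + skew_u * math.sqrt(spread / n_u)
    # The expanded domain must still contain the raw data.
    return min(lower, lo), max(upper, hi)

def mape(actual, predicted):
    """Mean absolute percentage error, the evaluation metric of the thesis."""
    return sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual) * 100
```

For a symmetric small sample such as `[1, 2, 3, 4, 5]`, the expanded bounds lie well outside the raw minimum and maximum, which is what allows artificial samples to be drawn for small-data-set learning.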
Table of Contents
Chinese Abstract I
Abstract II
Acknowledgements VII
Table of Contents VIII
List of Tables XI
List of Figures XII
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation 3
1.3 Research Objectives 6
1.4 Research Process 6
Chapter 2 Literature Review 8
2.1 Ensemble Methods 8
2.1.1 Single-Model Ensemble Methods 9
2.1.2 Multi-Model Ensemble Methods 14
2.2 Prediction Models 15
2.2.1 Linear Regression 15
2.2.2 Back-Propagation Neural Network 16
2.2.3 Support Vector Regression 17
2.2.4 M5' Model Tree 18
2.3 The Information Diffusion Concept 19
Chapter 3 Research Method 23
3.1 The Bagging Ensemble Procedure 23
3.2 Prediction Models 24
3.2.1 Linear Regression 24
3.2.2 Back-Propagation Neural Network 24
3.2.3 Support Vector Regression 26
3.2.4 M5' Model Tree 28
3.3 Mega-Trend-Diffusion Technique 28
3.4 Procedure of the Proposed Method 33
Chapter 4 Empirical Validation 35
4.1 Experimental Environment 35
4.1.1 Experimental Design 35
4.1.2 Prediction Error Metrics 35
4.1.3 Hypothesis Testing 36
4.1.4 Modeling Software 36
4.2 Case Descriptions 37
4.2.1 Case 1: Cell-Process Misalignment Problem 37
4.2.2 Case 2: Photo-Spacer Height Problem in the Color Filter (CF) Process 40
4.2.3 Case 3: MLCC Passive-Component Characteristic Prediction Problem 41
4.4 Experimental Results 44
4.4.1 Case 1 Results 44
4.4.2 Case 2 Results 47
4.4.3 Case 3 Results 51
Chapter 5 Conclusions and Suggestions 58
5.1 Conclusions 58
5.2 Suggestions 59
References 60
(1) Chinese References 60
(2) English References 60
References
葉怡成 (2003). 類神經網路模式應用與實作 [Applications and implementation of neural network models] (in Chinese).
Acharya, N., U. C. Mohanty and L. Sahoo (2013). "Probabilistic multi-model ensemble prediction of Indian summer monsoon rainfall using general circulation models: A non-parametric approach." Comptes Rendus Geoscience 345(3): 126-135.
Breiman, L. (1996). "Bagging predictors." Machine Learning 24(2): 123-140.
Breiman, L., J. H. Friedman, R. A. Olshen and C. J. Stone (1984). Classification and Regression Trees. Belmont, CA: Wadsworth International Group.
Bryll, R., R. Gutierrez-Osuna and F. Quek (2003). "Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets." Pattern Recognition 36(6): 1291-1302.
Byon, E., A. K. Shrivastava and Y. Ding (2010). "A classification procedure for highly imbalanced class sizes." IIE Transactions 42(4): 288-303.
Chikamoto, Y., M. Kimoto, M. Ishii, T. Mochizuki, T. T. Sakamoto, H. Tatebe, Y. Komuro, M. Watanabe, T. Nozawa, H. Shiogama, M. Mori, S. Yasunaka and Y. Imada (2013). "An overview of decadal climate predictability in a multi-model ensemble by climate model MIROC." Climate Dynamics 40(5-6): 1201-1222.
Cortes, C. and V. Vapnik (1995). "Support-vector networks." Machine Learning 20(3): 273-297.
Dietterich, T. G. (2000). Ensemble Methods in Machine Learning. Proceedings of the First International Workshop on Multiple Classifier Systems, Springer-Verlag: 1-15.
Drucker, H., C. J. Burges, L. Kaufman, A. Smola and V. Vapnik (1997). "Support vector regression machines." Advances in neural information processing systems: 155-161.
Efron, B. and R. J. Tibshirani (1993). An Introduction to the Bootstrap, New York: Chapmen & Hall.
Huang, C. (1997). "Principle of information diffusion." Fuzzy Sets and Systems 91(1): 69-90.
Huang, C. and C. Moraga (2004). "A diffusion-neural-network for learning from small samples." International Journal of Approximate Reasoning 35(2): 137-161.
Jang, J. S. R. (1993). "ANFIS: adaptive-network-based fuzzy inference system." IEEE Transactions on Systems, Man and Cybernetics, 23(3): 665-685.
Jha, A., R. Chauhan, M. Mehra, H. R. Singh and R. Shankar (2012). "miR-BAG: bagging based identification of microRNA precursors." PLoS One 7(9): e45782.
Kuncheva, L. I. (2004). Combining Pattern Classifiers: Methods and Algorithms. Hoboken, NJ: Wiley.
Li, D.-C., C.-J. Chang, C.-C. Chen and W.-C. Chen (2012a). "A grey-based fitting coefficient to build a hybrid forecasting model for small data sets." Applied Mathematical Modelling 36(10): 5101-5108.
Li, D.-C., C.-W. Liu and W.-C. Chen (2012b). "A multi-model approach to determine early manufacturing parameters for small-data-set prediction." International Journal of Production Research 50(23): 6679-6690.
Li, D.-C., C.-S. Wu, T.-I. Tsai and F. M. Chang (2006). "Using mega-fuzzification and data trend estimation in small data set learning for early FMS scheduling knowledge." Computers & Operations Research 33(6): 1857-1869.
Li, D.-C., C.-S. Wu, T.-I. Tsai and Y.-S. Lina (2007). "Using mega-trend-diffusion and artificial samples in small data set learning for early flexible manufacturing system scheduling knowledge." Computers & Operations Research 34(4): 966-982.
Li, D.-C., C. Wu and F. M. Chang (2005). "Using data-fuzzification technology in small data set learning to improve FMS scheduling accuracy." The International Journal of Advanced Manufacturing Technology 27(3-4): 321-328.
Osawa, T., H. Mitsuhashi, Y. Uematsu and A. Ushimaru (2011). "Bagging GLM: Improved generalized linear model for the analysis of zero-inflated data." Ecological Informatics 6(5): 270-275.
Quinlan, J. R. (1992). Learning with continuous classes. Proceedings of the 5th Australian Joint Conference on Artificial Intelligence: 343-348.
Reformat, M. and R. Yager (2008). "Building ensemble classifiers using belief functions and OWA operators." Soft Computing 12(6): 543-558.
Roiger, R. and M. Geatz (2003). Data mining: A tutorial-based primer, Addison Wesley New York.
Sánchez A, V. D. (2003). "Advanced support vector machines and kernel methods." Neurocomputing 55(1–2): 5-20.
Todorovski, L. and S. Džeroski (2003). "Combining classifiers with meta decision trees." Machine Learning 50(3): 223-249.
Wang, Y. and I. H. Witten (1997). Inducing model trees for continuous classes. In Proceedings of the Poster Papers, Ninth European Conference on Machine Learning.
Yu, Q. (2011). "Weighted bagging: a modification of AdaBoost from the perspective of importance sampling." Journal of Applied Statistics 38(3): 451-463.
Ma, Y. and V. Cherkassky (2003). Multiple model classification using SVM-based approach. Proceedings of the International Joint Conference on Neural Networks 4: 1581-1586.
Full-Text Access Permission
  • The author consents to on-campus browsing/printing of the electronic full text, to be made publicly available from 2024-12-31.

