A CORRECTION ON TANGENTBOOST ALGORITHM

Year 2018, Volume: 67, Issue: 2, 1 - 10, 01.08.2018
https://doi.org/10.31801/cfsuasmas.495775

Abstract

TangentBoost is a robust boosting algorithm that combines a loss function with weak classifiers. In addition, TangentBoost penalizes not only misclassifications but also the margins of correct classifications in order to obtain more stable classifiers. Although the method performs well in object tracking, the propensity scores are obtained improperly in the algorithm, which causes observations to be mislabeled in statistical classification. In this paper, a correction to the TangentBoost algorithm is proposed. After the correction, a simulation study is carried out with the new algorithm. The results show that the correction is useful for binary classification.
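As a rough sketch of the quantities the abstract refers to, the Python snippet below evaluates the tangent loss and the propensity (posterior probability) implied by its link, assuming the forms φ(v) = (2 arctan v − 1)² and η = arctan(f) + 1/2 given in Masnadi-Shirazi et al. (2010) from the reference list; the clipping of η into [0, 1] is only an illustration of the kind of correction at issue here, not necessarily the authors' exact algorithm.

```python
import numpy as np

# A minimal sketch, assuming the tangent loss phi(v) = (2*arctan(v) - 1)**2 and
# the link eta = arctan(f) + 1/2 of Masnadi-Shirazi et al. (2010). The clipping
# step is a hypothetical correction for illustration only.

def tangent_loss(margin):
    """Tangent loss evaluated at the margin v = y * f(x)."""
    return (2.0 * np.arctan(margin) - 1.0) ** 2

def propensity(f):
    """Posterior (propensity) estimate implied by the tangent link.

    Without clipping, arctan(f) + 0.5 leaves [0, 1] whenever |f| > tan(0.5),
    which is the kind of improper propensity score the abstract refers to.
    """
    return np.clip(np.arctan(f) + 0.5, 0.0, 1.0)  # keep the estimate inside [0, 1]

def predict_label(f, threshold=0.5):
    """Assign binary labels in {-1, +1} by thresholding the clipped propensity."""
    return np.where(propensity(f) >= threshold, 1, -1)

if __name__ == "__main__":
    scores = np.array([-3.0, -0.4, 0.0, 0.4, 3.0])  # example classifier outputs f(x)
    print(tangent_loss(scores))   # losses at margins v = f(x) for y = +1
    print(propensity(scores))     # clipped posterior estimates
    print(predict_label(scores))  # resulting binary labels
```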

References

  • Duda, R. O., Hart, P. E. and Stork, D. G., Pattern Classification, 2nd Edition, New York.
  • Fan, L. and Yang, Y., A loss function analysis for classification methods in text categorization, Proc. ICML, (2003), 472-479.
  • Schapire, R. E., The Boosting Approach to Machine Learning: An Overview, MSRI Workshop on Nonlinear Estimation and Classification, 2002.
  • Masnadi-Shirazi, H., Mahadevan, V. and Vasconcelos, N., On the design of robust classifiers for computer vision, In Computer Vision and Pattern Recognition (CVPR), 2010 IEEE Conference on, (2010), pp. 779-786.
  • Debruyne, M., Robust Support Vector Machine Classification, Leuven Statistical Day, (2008).
  • Ma, Y., Li, L., Huang, X. and Wang, S., Robust Support Vector Machine Using Least Median Loss Penalty, Proceedings of the 18th IFAC World Congress, (2011).
  • Wu, Y. and Liu, Y., Robust Truncated Hinge Loss Support Vector Machines, Journal of the American Statistical Association, (2007), Vol. 102, No. 479.
  • Huang, X., Shi, L. and Suykens, J. A., Support vector machine classifier with pinball loss, IEEE Transactions on Pattern Analysis and Machine Intelligence, (2014), 36(5), 984-997.
  • Masnadi-Shirazi, H., The Design of Bayes Consistent Loss Functions for Classification, PhD Dissertation, University of California, San Diego, 223 p., 2011.
  • Kobetski, M. and Sullivan, J., Improved Boosting Performance by Exclusion of Ambiguous Positive Examples, In ICPRAM, (2013).
  • Freund, Y. and Schapire, R. E., A decision-theoretic generalization of on-line learning and an application to boosting, In European Conference on Computational Learning Theory, (1995), (pp. 37), Springer Berlin Heidelberg.
  • Friedman, J., Hastie, T. and Tibshirani, R., Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors), The Annals of Statistics, (2000), (2), 337-407.
  • Sano, N., Suzuki, H., and Koda, M., A robust boosting method for mislabeled data. Journal of the Operations Research Society of Japan, (2004), 47(3), 182-196.
  • Miao, Q., Cao, Y., Xia, G., Gong, M., Liu, J. and Song, J., RBoost: label noise-robust boosting algorithm based on a nonconvex loss function and the numerically stable base learners, IEEE Transactions on Neural Networks and Learning Systems, (2016), 27(11), 2216-2228.
  • Kanamori, T., Takenouchi, T., Eguchi, S., and Murata, N., The most robust loss function for boosting, In Neural Information Processing, (2004), 496-501, Springer Berlin/Heidelberg.
  • Rosset, S., Robust boosting and its relation to bagging, In Proceedings of the Eleventh ACM SIGKDD International Conference on Knowledge Discovery in Data Mining, (2004), pp. 249.
  • Freund, Y., A more robust boosting algorithm, arXiv preprint arXiv:0905.2138, (2009).
  • Kanamori, T., Takenouchi, T., Eguchi, S., and Murata, N., Robust loss functions for boosting. Neural computation, (2007), 19(8), 2183-2244.
  • Savage, L. J., The Elicitation of Personal Probabilities and Expectations, Journal of the American Statistical Association, (1971), 66:783-801.
  • Dettling, M. and Bühlmann, P., Boosting for tumor classification with gene expression data, Bioinformatics, (2003), 19(9), 1061-1069.
  • Tuszynski, J., http://svitsrv25.epfl.ch/R-doc/library/caTools/html/LogitBoost.html, 2013.
  • Toka, O., Gudermannian Loss Function and Gudermannian Binary Classification Method, Hacettepe University PhD Thesis, 2016.
  • Ocal, N., Ercan, M. K., and Kadioglu, E. Corporate Ratings and a Model Proposition for the Manufacturing Industry at Borsa Istanbul. International Journal of Financial Research, (2015), 6(3), 13.
  • Lichman, M., UCI Machine Learning Repository [http://archive.ics.uci.edu/ml]. Irvine, CA: University of California, School of Information and Computer Science, 2013.
  • Shapiro, A. D., Structured Induction in Expert Systems, Addison-Wesley, 1987. The book is based on Shapiro's Ph.D. thesis at the University of Edinburgh, "The Role of Structured Induction in Expert Systems".
  • Asuncion, A., and Newman, D.J., UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science, 2007.
  • Quinlan, R., Simplifying decision trees, International Journal of Man-Machine Studies, (1987), pp. 221-234.
  • Current address: Onur Toka, Hacettepe University, Department of Statistics, 06800, Ankara, TURKEY.
  • E-mail address: onur.toka@hacettepe.edu.tr ORCID: http://orcid.org/0000-0002-4025-4537
  • Current address: Meral Cetin, Hacettepe University, Department of Statistics, 06800, Ankara, TURKEY.
  • E-mail address: meral@hacettepe.edu.tr ORCID: http://orcid.org/0000-0003-0247-7120
There are 30 references in total.

Details

Other ID: JA94FS58MR
Section: Research Article
Authors

Onur Toka

Meral Çetin

Publication Date: August 1, 2018
Submission Date: August 1, 2018
Published in Issue: Year 2018, Volume: 67, Issue: 2

How to Cite

APA Toka, O., & Çetin, M. (2018). A CORRECTION ON TANGENTBOOST ALGORITHM. Communications Faculty of Sciences University of Ankara Series A1 Mathematics and Statistics, 67(2), 1-10. https://doi.org/10.31801/cfsuasmas.495775
AMA Toka O, Çetin M. A CORRECTION ON TANGENTBOOST ALGORITHM. Commun. Fac. Sci. Univ. Ank. Ser. A1 Math. Stat. August 2018;67(2):1-10. doi:10.31801/cfsuasmas.495775
Chicago Toka, Onur, and Meral Çetin. “A CORRECTION ON TANGENTBOOST ALGORITHM”. Communications Faculty of Sciences University of Ankara Series A1 Mathematics and Statistics 67, no. 2 (August 2018): 1-10. https://doi.org/10.31801/cfsuasmas.495775.
EndNote Toka O, Çetin M (August 1, 2018) A CORRECTION ON TANGENTBOOST ALGORITHM. Communications Faculty of Sciences University of Ankara Series A1 Mathematics and Statistics 67 2 1–10.
IEEE O. Toka and M. Çetin, “A CORRECTION ON TANGENTBOOST ALGORITHM”, Commun. Fac. Sci. Univ. Ank. Ser. A1 Math. Stat., vol. 67, no. 2, pp. 1–10, 2018, doi: 10.31801/cfsuasmas.495775.
ISNAD Toka, Onur - Çetin, Meral. “A CORRECTION ON TANGENTBOOST ALGORITHM”. Communications Faculty of Sciences University of Ankara Series A1 Mathematics and Statistics 67/2 (August 2018), 1-10. https://doi.org/10.31801/cfsuasmas.495775.
JAMA Toka O, Çetin M. A CORRECTION ON TANGENTBOOST ALGORITHM. Commun. Fac. Sci. Univ. Ank. Ser. A1 Math. Stat. 2018;67:1–10.
MLA Toka, Onur, and Meral Çetin. “A CORRECTION ON TANGENTBOOST ALGORITHM”. Communications Faculty of Sciences University of Ankara Series A1 Mathematics and Statistics, vol. 67, no. 2, 2018, pp. 1-10, doi:10.31801/cfsuasmas.495775.
Vancouver Toka O, Çetin M. A CORRECTION ON TANGENTBOOST ALGORITHM. Commun. Fac. Sci. Univ. Ank. Ser. A1 Math. Stat. 2018;67(2):1-10.

Communications Faculty of Sciences University of Ankara Series A1 Mathematics and Statistics.


This work is licensed under a Creative Commons Attribution 4.0 International License.