INVERSE LOGISTIC REGRESSION CLASSIFICATION MODEL FOR PREDICTING FEATURE MATRIX VALUES

Subhradeep Biswas, Mohammad Zaiyan Alam, Subia Ansari

Abstract


This paper proposes a solution that leverages the Logistic Regression classification model to predict the feature matrix values required for a sample to qualify as a specific class (from the response vector). It provides a mechanism to obtain the value of a specific attribute of the feature matrix when the other attribute values are known and the user aims to reach a specific class. The Logistic Regression model calculates the logit, or score, for each class in the response vector. The proposed solution uses the per-feature weights learned by the Logistic Regression model to calculate the value of the unknown feature that lets the complete feature vector (the derived value of the unknown attribute together with the known values of the other attributes) reach the target response class.
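
As a minimal sketch of the idea (not the authors' implementation), the Python snippet below inverts a binary logistic regression model: because the logit is linear in the features, logit = b + sum_i w_i * x_i, the single unknown feature can be isolated algebraically once a target logit is chosen (for example 0, the decision boundary toward the desired class). The function name solve_unknown_feature, the scikit-learn training step, and the synthetic data are assumptions for illustration only.

import numpy as np
from sklearn.linear_model import LogisticRegression

def solve_unknown_feature(weights, intercept, known_values, unknown_idx, target_logit=0.0):
    # Solve intercept + sum_i weights[i] * x[i] = target_logit for the one unknown x[unknown_idx].
    # known_values must have one entry per feature; the entry at unknown_idx is ignored.
    w_k = weights[unknown_idx]
    if np.isclose(w_k, 0.0):
        raise ValueError("The unknown feature's weight is ~0; it cannot move the class score.")
    # Logit contribution of the intercept and the known features.
    partial = intercept + sum(w * x for i, (w, x) in enumerate(zip(weights, known_values))
                              if i != unknown_idx)
    return (target_logit - partial) / w_k

# Assumed demonstration data: two features, binary response vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 2 * X[:, 1] > 0).astype(int)

model = LogisticRegression().fit(X, y)
weights, intercept = model.coef_[0], model.intercept_[0]

# Feature 0 is known (value 1.5); find the value of feature 1 that places the sample
# on the decision boundary (logit = 0) of the target class.
x1 = solve_unknown_feature(weights, intercept, known_values=[1.5, 0.0], unknown_idx=1)
print("Required value of the unknown feature:", x1)

Choosing a target logit other than 0, for instance log(p / (1 - p)) for a desired probability p, gives the feature value needed to reach that probability of the target class.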


Keywords


Inverse Classifier; Inverse Classification Model; Logistic Regression Model; Determine Feature Matrix
