
Logistic Regression Mathematical Equation

Starting from the logistic model

$$P(\hat{Y}) = \frac{e^{\beta_0 + \beta_1 X}}{1 + e^{\beta_0 + \beta_1 X}}$$

multiply both sides by the denominator:

$$\Rightarrow P(\hat{Y}) + P(\hat{Y})\,e^{\beta_0 + \beta_1 X} = e^{\beta_0 + \beta_1 X}$$

$$\Rightarrow P(\hat{Y}) = \left[1 - P(\hat{Y})\right] e^{\beta_0 + \beta_1 X}$$

$$\Rightarrow \frac{P(\hat{Y})}{1 - P(\hat{Y})} = e^{\beta_0 + \beta_1 X} \tag{1.1}$$

The left-hand side is called the odds. Taking the logarithm of both sides:

$$\Rightarrow \log\frac{P(\hat{Y})}{1 - P(\hat{Y})} = \beta_0 + \beta_1 X$$

The left-hand side is called the log odds (or logit). Logistic regression is grouped with linear models because the right-hand side is the basic formula for Linear Regression: the log odds is linear in X. The coefficient $\beta_1$ means that a one-unit increase in X is associated with a change of $\beta_1$ units in the log odds of Y = 1.
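The derivation above can be checked numerically: inverting the log-odds equation gives the sigmoid, and increasing X by one unit shifts the log odds by exactly $\beta_1$. A minimal sketch, using made-up coefficients (`beta0`, `beta1` are illustrative, not fitted to any data):

```python
import math

def sigmoid(z):
    """Logistic function: maps a log-odds value z to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for illustration only.
beta0, beta1 = -3.0, 0.5

def predict_proba(x):
    """P(Y-hat) = e^(b0 + b1*x) / (1 + e^(b0 + b1*x)), written via the sigmoid."""
    return sigmoid(beta0 + beta1 * x)

def log_odds(p):
    """log(p / (1 - p)) -- the left-hand side of the logit equation."""
    return math.log(p / (1.0 - p))

# A one-unit increase in x changes the log odds by exactly beta1.
p1 = predict_proba(4.0)
p2 = predict_proba(5.0)
print(log_odds(p2) - log_odds(p1))  # ≈ 0.5, i.e. beta1
```

This makes the coefficient interpretation concrete: the probabilities `p1` and `p2` differ nonlinearly, but their log odds differ by the constant $\beta_1$.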

Updated 2021-02-12

Tags

Data Science