Deploying to gh-pages from @ 67dfdfe 🚀
rickecon committed Nov 10, 2023
1 parent d214084 commit a7a9129
Showing 3 changed files with 19 additions and 19 deletions.
4 changes: 2 additions & 2 deletions basic_empirics/BasicEmpirMethods.html
@@ -940,7 +940,7 @@ <h2> Contents </h2>
</div>
</div>
<div class="cell_output docutils container">
<div class="output stderr highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>/tmp/ipykernel_2762/3993614049.py:4: SettingWithCopyWarning:
<div class="output stderr highlight-myst-ansi notranslate"><div class="highlight"><pre><span></span>/tmp/ipykernel_2788/3993614049.py:4: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
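This SettingWithCopyWarning is pandas flagging an assignment into what may be a view of another DataFrame rather than an independent copy. A minimal sketch of the pattern and its usual fixes, with hypothetical data and column names rather than the notebook's actual code:

```python
import pandas as pd

# Hypothetical DataFrame, purely to illustrate the warning's cause and fix.
df = pd.DataFrame({"country": ["USA", "CAN", "MEX"], "gdp": [21.4, 1.7, 1.3]})

# This pattern can trigger SettingWithCopyWarning: `sub` may be a view of `df`,
# so assigning a new column to it is ambiguous.
sub = df[df["gdp"] > 1.5]
# sub["big"] = True  # <- would raise the warning

# Two common fixes: take an explicit copy, or assign on the original with .loc.
sub = df[df["gdp"] > 1.5].copy()        # independent copy, safe to modify
df.loc[df["gdp"] > 1.5, "big"] = True   # write directly into the original
```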
@@ -1600,7 +1600,7 @@ <h2> Contents </h2>
Model: OLS Adj. R-squared: 0.608
Method: Least Squares F-statistic: 171.4
Date: Fri, 10 Nov 2023 Prob (F-statistic): 4.16e-24
- Time: 08:14:56 Log-Likelihood: -119.71
+ Time: 08:34:48 Log-Likelihood: -119.71
No. Observations: 111 AIC: 243.4
Df Residuals: 109 BIC: 248.8
Df Model: 1
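The excerpt above appears to be part of a statsmodels OLS regression summary (only the run timestamp changed in this deploy). For reference, a minimal, self-contained sketch of the call that produces such a table, using simulated data rather than the chapter's dataset:

```python
import numpy as np
import statsmodels.api as sm

# Simulated data standing in for the chapter's dataset (111 observations,
# one regressor), only to illustrate the API that prints a summary table.
rng = np.random.default_rng(0)
x = rng.normal(size=111)
y = 1.0 + 2.0 * x + rng.normal(size=111)

X = sm.add_constant(x)          # add the intercept column
results = sm.OLS(y, X).fit()    # ordinary least squares fit
print(results.summary())        # summary table like the excerpt above
```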
32 changes: 16 additions & 16 deletions basic_empirics/LogisticReg.html
@@ -1141,29 +1141,29 @@ <h2> Contents </h2>
<section id="interpreting-coefficients-log-odds-ratio">
<span id="sec-loglogitinterpret"></span><h4><span class="section-number">13.2.2.4. </span>Interpreting coefficients (log odds ratio)<a class="headerlink" href="#interpreting-coefficients-log-odds-ratio" title="Permalink to this heading">#</a></h4>
<p>The odds ratio in the logistic model provides a nice way to interpret logit model coefficients. Let <span class="math notranslate nohighlight">\(z\equiv X^T\beta = \beta_0 + \beta_1 x_{1,i} + ...\beta_K x_{K,i}\)</span>. The logistic model is stated as the probability that the binary categorical dependent variable equals one, <span class="math notranslate nohighlight">\(y_i=1\)</span>.</p>
<div class="amsmath math notranslate nohighlight" id="equation-f6e74078-e905-42f7-a486-7317612d994b">
<span class="eqno">(13.11)<a class="headerlink" href="#equation-f6e74078-e905-42f7-a486-7317612d994b" title="Permalink to this equation">#</a></span>\[\begin{equation}
<div class="amsmath math notranslate nohighlight" id="equation-8fc74fbe-b7bd-4f5b-81d3-5347c2435265">
<span class="eqno">(13.11)<a class="headerlink" href="#equation-8fc74fbe-b7bd-4f5b-81d3-5347c2435265" title="Permalink to this equation">#</a></span>\[\begin{equation}
P(y_i=1|X,\theta) = \frac{e^z}{1 + e^z}
\end{equation}\]</div>
<p>Given this equation, we know that the probability of the dependent variable being zero <span class="math notranslate nohighlight">\(y_i=0\)</span> is just one minus the probability above.</p>
<div class="amsmath math notranslate nohighlight" id="equation-63df07e2-a99c-4dfe-a268-ebd0bf4935c9">
<span class="eqno">(13.12)<a class="headerlink" href="#equation-63df07e2-a99c-4dfe-a268-ebd0bf4935c9" title="Permalink to this equation">#</a></span>\[\begin{equation}
<div class="amsmath math notranslate nohighlight" id="equation-39d03220-d236-4391-896c-d4ca43300d38">
<span class="eqno">(13.12)<a class="headerlink" href="#equation-39d03220-d236-4391-896c-d4ca43300d38" title="Permalink to this equation">#</a></span>\[\begin{equation}
P(y_i=0|X,\theta) = 1 - P(y_i=1|X,\theta) = 1 - \frac{e^z}{1 + e^z} = \frac{1}{1 + e^z}
\end{equation}\]</div>
<p>The odds ratio is a common way of expressing the probability of an event versus all other events. For example, if the probability of your favorite team winning a game is <span class="math notranslate nohighlight">\(P(win)=0.8\)</span>, then we know that the probability of your favorite team losing that game is <span class="math notranslate nohighlight">\(P(lose)=1-P(win)=0.2\)</span>. The odds ratio is the ratio of these two probabilities.</p>
<div class="amsmath math notranslate nohighlight" id="equation-4daf6d79-8a28-43f0-a7ea-ab45f64d7cdf">
<span class="eqno">(13.13)<a class="headerlink" href="#equation-4daf6d79-8a28-43f0-a7ea-ab45f64d7cdf" title="Permalink to this equation">#</a></span>\[\begin{equation}
<div class="amsmath math notranslate nohighlight" id="equation-d42be9db-6ddf-4105-8b82-c29213390469">
<span class="eqno">(13.13)<a class="headerlink" href="#equation-d42be9db-6ddf-4105-8b82-c29213390469" title="Permalink to this equation">#</a></span>\[\begin{equation}
\frac{P(win)}{P(lose)} = \frac{P(win)}{1 - P(win)} = \frac{0.8}{0.2} = \frac{4}{1} \quad\text{or}\quad 4
\end{equation}\]</div>
<p>The odds ratio tells you that your team is four times as likely to win as to lose. A gambler would say that your odds are 4-to-1. Another way of saying it is that your team will win four out of five times and lose one out of five times.</p>
<p>In the logistic model, the odds ratio simplifies nicely.</p>
<div class="amsmath math notranslate nohighlight" id="equation-18cc280d-8b86-40c7-a87f-44fd0f334972">
<span class="eqno">(13.14)<a class="headerlink" href="#equation-18cc280d-8b86-40c7-a87f-44fd0f334972" title="Permalink to this equation">#</a></span>\[\begin{equation}
<div class="amsmath math notranslate nohighlight" id="equation-a43a33df-4990-48f2-93bb-4d1cfe81c61e">
<span class="eqno">(13.14)<a class="headerlink" href="#equation-a43a33df-4990-48f2-93bb-4d1cfe81c61e" title="Permalink to this equation">#</a></span>\[\begin{equation}
\frac{P(y_i=1|X,\theta)}{1 - P(y_i=1|X,\theta)} = \frac{\frac{e^z}{1 + e^z}}{\frac{1}{1 + e^z}} = e^z
\end{equation}\]</div>
<p>If we take the log of both sides, we see that the log odds ratio is equal to the linear predictor <span class="math notranslate nohighlight">\(z\equiv X^T\beta = \beta_0 + \beta_1 x_{1,i} + ...\beta_K x_{K,i}\)</span>.</p>
<div class="amsmath math notranslate nohighlight" id="equation-a3e73411-6b51-4185-afaf-33357aee1137">
<span class="eqno">(13.15)<a class="headerlink" href="#equation-a3e73411-6b51-4185-afaf-33357aee1137" title="Permalink to this equation">#</a></span>\[\begin{equation}
<div class="amsmath math notranslate nohighlight" id="equation-8708766c-5003-4636-97ef-d197cb5d34ff">
<span class="eqno">(13.15)<a class="headerlink" href="#equation-8708766c-5003-4636-97ef-d197cb5d34ff" title="Permalink to this equation">#</a></span>\[\begin{equation}
\ln\left(\frac{P(y_i=1|X,\theta)}{1 - P(y_i=1|X,\theta)}\right) = z = \beta_0 + \beta_1 x_{1,i} + ...\beta_K x_{K,i}
\end{equation}\]</div>
<p>So the interpretation of the coefficients <span class="math notranslate nohighlight">\(\beta_k\)</span> is that a one-unit increase in the variable <span class="math notranslate nohighlight">\(x_{k,i}\)</span> increases the log odds of <span class="math notranslate nohighlight">\(y_i=1\)</span> by <span class="math notranslate nohighlight">\(\beta_k\)</span>, which multiplies the odds by <span class="math notranslate nohighlight">\(e^{\beta_k}\)</span> (approximately a <span class="math notranslate nohighlight">\(100\times\beta_k\)</span> percent increase in the odds when <span class="math notranslate nohighlight">\(\beta_k\)</span> is small).</p>
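One way to check this interpretation numerically is to fit a logit model and exponentiate a coefficient: exp(beta_k) is the factor by which the odds of y_i=1 are multiplied for a one-unit increase in x_{k,i}. A sketch using statsmodels on simulated data (illustrative only, not the chapter's example):

```python
import numpy as np
import statsmodels.api as sm

# Simulated binary-outcome data, only to illustrate the odds-ratio reading of
# logit coefficients; the chapter's own data are not reproduced here.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
z = -0.5 + 0.8 * x                   # true linear predictor
p = np.exp(z) / (1 + np.exp(z))      # logistic probability P(y_i = 1)
y = rng.binomial(1, p)

X = sm.add_constant(x)
res = sm.Logit(y, X).fit(disp=0)

beta_k = res.params[1]
print("change in log odds per unit of x:", beta_k)
print("odds multiplier per unit of x:", np.exp(beta_k))
# For small beta_k, (exp(beta_k) - 1) * 100 is close to 100 * beta_k,
# i.e. roughly a beta_k * 100 percent change in the odds of y_i = 1.
print("approx. percent change in odds:", (np.exp(beta_k) - 1) * 100)
```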
@@ -1175,18 +1175,18 @@ <h2> Contents </h2>
<p>The multinomial logit model is a natural extension of the logit model. In contrast to the logit model in which the dependent variable has only two categories, the multinomial logit model accommodates <span class="math notranslate nohighlight">\(J\geq2\)</span> categories in the dependent variable. Let <span class="math notranslate nohighlight">\(\eta_j\)</span> be the linear predictor for the <span class="math notranslate nohighlight">\(j\)</span>th category.
<span class="math notranslate nohighlight">\( \eta_j\equiv \beta_{j,0} + \beta_{j,1}x_{1,i} + ...\beta_{j,K}x_{K,i} \quad\forall y_i = j \)</span></p>
<p>The multinomial logit model gives the probability of <span class="math notranslate nohighlight">\(y_i=j\)</span> relative to some reference category <span class="math notranslate nohighlight">\(J\)</span> that is left out.</p>
<div class="amsmath math notranslate nohighlight" id="equation-35d8f320-74c1-4f3d-b5e4-38a73d908198">
<span class="eqno">(13.16)<a class="headerlink" href="#equation-35d8f320-74c1-4f3d-b5e4-38a73d908198" title="Permalink to this equation">#</a></span>\[\begin{equation}
<div class="amsmath math notranslate nohighlight" id="equation-25201e9d-bffd-4676-aaf1-719ae1ae98c5">
<span class="eqno">(13.16)<a class="headerlink" href="#equation-25201e9d-bffd-4676-aaf1-719ae1ae98c5" title="Permalink to this equation">#</a></span>\[\begin{equation}
Pr(y_i=j|X,\theta) = \frac{e^{\eta_j}}{1 + \sum_{v=1}^{J-1}e^{\eta_v}} \quad\text{for}\quad 1\leq j\leq J-1
\end{equation}\]</div>
<p>Once the <span class="math notranslate nohighlight">\(J-1\)</span> sets of coefficients are estimated, the probability of the final <span class="math notranslate nohighlight">\(J\)</span>th category is determined residually by the following expression.</p>
<div class="amsmath math notranslate nohighlight" id="equation-28b4d2d6-f5bd-41c9-8ff7-5c1bc19a56e6">
<span class="eqno">(13.17)<a class="headerlink" href="#equation-28b4d2d6-f5bd-41c9-8ff7-5c1bc19a56e6" title="Permalink to this equation">#</a></span>\[\begin{equation}
<div class="amsmath math notranslate nohighlight" id="equation-eb40ee8c-8a2f-4eab-882c-0eb7285d33e7">
<span class="eqno">(13.17)<a class="headerlink" href="#equation-eb40ee8c-8a2f-4eab-882c-0eb7285d33e7" title="Permalink to this equation">#</a></span>\[\begin{equation}
Pr(y_i=J|X,\theta) = \frac{1}{1 + \sum_{v=1}^{J-1}e^{\eta_v}}
\end{equation}\]</div>
<p>The analogous log odds ratio interpretation applies to the multinomial logit model.</p>
<div class="amsmath math notranslate nohighlight" id="equation-fb2b90c3-be20-4e3e-8c5e-3004b522ceec">
<span class="eqno">(13.18)<a class="headerlink" href="#equation-fb2b90c3-be20-4e3e-8c5e-3004b522ceec" title="Permalink to this equation">#</a></span>\[\begin{equation}
<div class="amsmath math notranslate nohighlight" id="equation-150f6b5e-01ae-4ad0-9b7c-cb3ae77dfa78">
<span class="eqno">(13.18)<a class="headerlink" href="#equation-150f6b5e-01ae-4ad0-9b7c-cb3ae77dfa78" title="Permalink to this equation">#</a></span>\[\begin{equation}
\ln\left(\frac{Pr(y_i=j|X,\theta)}{Pr(y_i=J|X,\theta)}\right) = \eta_j = \beta_{j,0} + \beta_{j,1}x_{1,i} + ...\beta_{j,K}x_{K,i} \quad\text{for}\quad 1\leq j \leq J-1
\end{equation}\]</div>
<p>This is the log odds ratio of <span class="math notranslate nohighlight">\(y_i=j\)</span> relative to <span class="math notranslate nohighlight">\(y_i=J\)</span>. The interpretation of the <span class="math notranslate nohighlight">\(\beta_{j,k}\)</span> coefficient is the predicted change in the log odds ratio of <span class="math notranslate nohighlight">\(y_i=j\)</span> to <span class="math notranslate nohighlight">\(y_i=J\)</span> from a one-unit increase in variable <span class="math notranslate nohighlight">\(x_{k,i}\)</span>, or approximately a <span class="math notranslate nohighlight">\(100\times\beta_{j,k}\)</span> percent change in the odds ratio itself.</p>
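These J-1 coefficient vectors can be estimated with statsmodels' MNLogit, which treats the lowest-valued category as the left-out reference. A minimal sketch on simulated three-category data (illustrative only, not the chapter's example):

```python
import numpy as np
import statsmodels.api as sm

# Simulated outcome with J = 3 categories; category 0 plays the role of the
# left-out reference category J in the text.
rng = np.random.default_rng(0)
x = rng.normal(size=(600, 2))
eta1 = 0.5 + x @ np.array([1.0, -0.5])    # arbitrary true coefficients
eta2 = -0.2 + x @ np.array([-0.8, 0.7])
denom = 1 + np.exp(eta1) + np.exp(eta2)
probs = np.column_stack([1 / denom, np.exp(eta1) / denom, np.exp(eta2) / denom])
y = np.array([rng.choice(3, p=p_i) for p_i in probs])

X = sm.add_constant(x)
res = sm.MNLogit(y, X).fit(disp=0)
print(res.params)         # one column of coefficients per non-reference category
# Exponentiating a coefficient gives the multiplicative effect of a one-unit
# increase in that regressor on the odds of category j versus the reference.
print(np.exp(res.params))
```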
2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.
