HW5 MLE and OLS #53
Ok, so I don't really know what I did because I changed multiple things, but I now have
Hi @isabelalmazan. It looks like you are getting a negative sigma.
When you use the constrained optimizer, it gives you the hessian in a "sparse" format. You need to write

However, when I looked at this on mine, it looks like using the unconstrained optimizer gives a better result. (Graphically the fit looks better.) One way to get the unconstrained optimizer to work is to put an absolute value sign on sigma. I did so like this:

```python
import numpy as np

def loglik(params, data):
    """Compute log-likelihood of parameters.

    data should be provided in `args`.
    The data should be a pandas.Series or a numpy array.
    """
    mu, sigma = params
    f = lambda x: (1 / (x * np.abs(sigma) * np.sqrt(2 * np.pi)) *
                   np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2)))
    # f = lambda x: (1 / (x * sigma * np.sqrt(2 * np.pi)) *
    #               np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2)))
    f_array = f(data)
    ll = np.sum(np.log(f_array))
    return ll
```
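To make the suggestion above concrete, here is a minimal sketch of how the unconstrained fit and the sparse-hessian conversion might look. The synthetic `data`, the starting values, and the choice of `L-BFGS-B` are all assumptions for illustration, not part of the assignment; with `L-BFGS-B`, `result.hess_inv` is a `LinearOperator`-style object that `.todense()` converts to a regular array.

```python
import numpy as np
from scipy import optimize

# Hypothetical synthetic data standing in for the homework's income data
rng = np.random.default_rng(0)
data = rng.lognormal(mean=11.3, sigma=0.2, size=500)

def neg_loglik(params, data):
    """Negative lognormal log-likelihood; minimize() minimizes, so negate."""
    mu, sigma = params
    sigma = np.abs(sigma)  # same absolute-value trick as above
    ll = np.sum(np.log(1 / (data * sigma * np.sqrt(2 * np.pi)))
                - (np.log(data) - mu)**2 / (2 * sigma**2))
    return -ll

result = optimize.minimize(neg_loglik, x0=np.array([11.0, 0.5]),
                           args=(data,), method="L-BFGS-B")
mu_hat, sigma_hat = result.x[0], np.abs(result.x[1])

# result.hess_inv is stored in a compact operator form; convert it to a
# dense array before using it (e.g., for standard errors).
H_inv = result.hess_inv.todense()
```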
I added the absolute value, but I'm still getting

Here is my code from Q3 and Q4 (m and s stand for mu and sigma):

```python
import math
import numpy as np

def PDF(x, m, s):
    return (1 / (x * np.abs(s) * (math.sqrt(2 * np.pi)))) * math.exp(-((math.log(x) - m)**2) / (2 * (s**2)))

def log_like(x, m, s):
    pdf_vals = PDF(x, m, s)
    ln_pdf_vals = np.log(pdf_vals)
    log_lik_val = ln_pdf_vals.sum()
    return log_lik_val
```

Thanks for your help!
I don't know why you're getting a negative value for sigma. What happens if you remove the constraints?

This is what I get. In the first method, I don't use constraints---but I have

Here are my results for Q7:
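Since the thread keeps contrasting constrained and unconstrained fits, here is one hedged alternative (my suggestion, not something from the thread): instead of constraint objects or an `abs()` hack, pass `bounds` to keep sigma strictly positive. The data and starting values below are hypothetical.

```python
import numpy as np
from scipy import optimize

rng = np.random.default_rng(1)
data = rng.lognormal(mean=11.3, sigma=0.2, size=500)  # hypothetical data

def neg_loglik(params, data):
    mu, sigma = params
    return -np.sum(np.log(1 / (data * sigma * np.sqrt(2 * np.pi)))
                   - (np.log(data) - mu)**2 / (2 * sigma**2))

# Bounding sigma away from zero keeps it positive without an abs() trick
res = optimize.minimize(neg_loglik, x0=[11.0, 0.5], args=(data,),
                        method="L-BFGS-B",
                        bounds=[(None, None), (1e-6, None)])
```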
It seems just a little off to me. I'm not sure what could be the cause. Does the estimate for
Though, I understand that there might be some small difference, possibly due to different optimizers. I'll make sure @PhilipCaoChicago knows
@bariscangoc. Looks close, but not the same as mine. It's probably close enough that I wouldn't worry about it. What are your starting values for the optimizer?
I used
Looks good to me. I just wanted to make sure that you weren't using something like `mu = 11.3` and `sigma = 0.2`.
I've noticed as well that the constraints make my results a little weird. BTW, I see
Aren't we truncating the lognormal at 150k, so that the integral of the pdf up to 150k is equal to 1?
No. However, I see the confusion. In Q3 I said

But all I meant is that the limits of the plot should be 0 to 150,000. I meant for the distribution to be a standard lognormal. Sorry for the confusing wording.
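The distinction being drawn here is worth making concrete: a truncated lognormal renormalizes the pdf by the CDF at the cutoff, while simply plotting a standard lognormal on (0, 150,000) changes nothing about the density. The parameter values below are hypothetical, chosen only to illustrate the difference.

```python
import numpy as np
from scipy import stats

mu, sigma, cutoff = 11.3, 0.2, 150_000  # hypothetical values for illustration
dist = stats.lognorm(s=sigma, scale=np.exp(mu))  # standard lognormal

x = 120_000.0
plain = dist.pdf(x)                          # untruncated pdf (what Q3 intended)
truncated = dist.pdf(x) / dist.cdf(cutoff)   # renormalized so mass on (0, cutoff] is 1
```

Since `dist.cdf(cutoff) < 1`, the truncated density is always slightly larger than the plain one below the cutoff.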
Thanks for the clarification! |
Thanks, I fixed it! |
@isabelalmazan. I don't think it will change anything, but you might change
@isabelalmazan What if you use my code directly? (See #53 (comment).) You can then see the rest of the code here: #53 (comment). The function

(I just realized that I had basically posted all of it here anyway. :) )
Hi Jeremy,
I got up to Q6 on the MLE and OLS section of the homework, but the results I'm getting from the MLE are really far off from the estimates we started off with: