Conversation

HumphreyYang
Member

This PR fixes typos in navy_captain and adds a section discussing non-IID likelihood ratio processes with models generated by two Markov chains.


github-actions bot commented Aug 6, 2025

@github-actions github-actions bot temporarily deployed to pull request August 6, 2025 03:27 Inactive
@mmcky mmcky requested a review from Copilot August 6, 2025 23:10
Contributor

@Copilot Copilot AI left a comment

Pull Request Overview

This PR addresses typos in the navy_captain lecture and adds a substantial new section to likelihood_ratio_process covering non-IID likelihood ratio processes with Markov chain models.

  • Corrects typos and improves clarity in the navy captain lecture
  • Updates variable names from Greek letters (α, β) to Latin letters (A, B) for consistency
  • Adds a comprehensive section on Markov chain likelihood ratio processes with theoretical analysis and simulations

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

File Description
lectures/navy_captain.md Fixes typos, updates references to wald_friedman_2, changes variable names from α/β to A/B, and corrects mathematical formulations
lectures/likelihood_ratio_process.md Adds new section on Markov chain models with KL divergence rate analysis and simulation code
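The KL divergence rate analysis mentioned for the Markov chain section can be sketched as follows. This is a minimal illustration of the standard formula (the expected per-step log likelihood ratio under the stationary distribution of the first chain), not the lecture's actual code; the function name `kl_divergence_rate` and the example matrices are assumptions:

```python
import numpy as np

def kl_divergence_rate(P_f, P_g):
    """KL divergence rate between two ergodic Markov chains:
    sum_i pi_f[i] * sum_j P_f[i,j] * log(P_f[i,j] / P_g[i,j]),
    where pi_f is the stationary distribution of P_f."""
    P_f, P_g = np.asarray(P_f, float), np.asarray(P_g, float)
    # Stationary distribution: left eigenvector of P_f for eigenvalue 1
    vals, vecs = np.linalg.eig(P_f.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    pi = pi / pi.sum()
    # Infinite rate if P_f puts mass on a transition that P_g rules out
    if np.any((P_f > 0) & (P_g == 0)):
        return np.inf
    mask = P_f > 0
    log_ratios = np.zeros_like(P_f)
    log_ratios[mask] = np.log(P_f[mask] / P_g[mask])
    return float(np.sum(pi[:, None] * P_f * log_ratios))

P_f = np.array([[0.9, 0.1], [0.2, 0.8]])
P_g = np.array([[0.7, 0.3], [0.4, 0.6]])
print(kl_divergence_rate(P_f, P_g))  # ≈ 0.108
```

A positive rate means the likelihood ratio process favoring the true chain grows exponentially, which is what the lecture's simulations would be expected to show.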

Comment on lines +1860 to +1867
```python
if np.any((P_f > 0) & (P_g == 0)):
    return np.inf

valid_mask = (P_f > 0) & (P_g > 0)

log_ratios = np.zeros_like(P_f)
log_ratios[valid_mask] = np.log(P_f[valid_mask] / P_g[valid_mask])
```


Copilot AI Aug 6, 2025


The check correctly returns infinity when P_g has zero entries where P_f is positive, since the KL divergence is infinite in that case. However, the remaining log(0) cases should be handled consistently: where P_f == 0 the term contributes nothing under the convention 0 * log(0/q) = 0, and that convention should be made explicit rather than left implicit in the mask.

Suggested change

```diff
-if np.any((P_f > 0) & (P_g == 0)):
-    return np.inf
-valid_mask = (P_f > 0) & (P_g > 0)
-log_ratios = np.zeros_like(P_f)
-log_ratios[valid_mask] = np.log(P_f[valid_mask] / P_g[valid_mask])
+# Use np.errstate to handle log(0) cases gracefully
+with np.errstate(divide='ignore', invalid='ignore'):
+    # Where P_f == 0, set log ratio to 0 (since 0 * log(0/q) = 0 by convention)
+    log_ratios = np.where(P_f > 0, np.log(P_f / P_g), 0.0)
```
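The two conventions under discussion (0 · log(0/q) = 0, and an infinite divergence whenever P_f puts mass where P_g has none) can be packaged in one self-contained helper. This is a hedged sketch combining both approaches, not the lecture's code; `elementwise_kl` is a hypothetical name:

```python
import numpy as np

def elementwise_kl(P_f, P_g):
    """sum(P_f * log(P_f / P_g)) with the conventions
    0 * log(0/q) = 0, and inf if P_f > 0 anywhere P_g == 0."""
    P_f, P_g = np.asarray(P_f, float), np.asarray(P_g, float)
    # P_f assigns probability to an outcome P_g rules out: divergence is infinite
    if np.any((P_f > 0) & (P_g == 0)):
        return np.inf
    # Masking implements 0 * log(0/q) = 0 without triggering log(0) warnings
    mask = (P_f > 0) & (P_g > 0)
    log_ratios = np.zeros_like(P_f)
    log_ratios[mask] = np.log(P_f[mask] / P_g[mask])
    return float(np.sum(P_f * log_ratios))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
print(elementwise_kl(p, q))  # ≈ 0.5108
```

Masking sidesteps the floating-point warnings entirely, so the `np.errstate` context manager in the suggestion becomes unnecessary.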


@mmcky
Contributor

mmcky commented Aug 7, 2025

@HumphreyYang this doesn't inspire confidence either

> Updates variable names from Greek letters (α, β) to Latin letters (A, B) for consistency

@HumphreyYang HumphreyYang merged commit 36bc72e into main Aug 8, 2025
7 checks passed
@HumphreyYang HumphreyYang deleted the navy_captain_tom branch August 8, 2025 05:42