Embedding Regression: Models for Context-Specific Description and Inference

Paper and related materials for Rodriguez, Spirling and Stewart (2021). The abstract for the paper is as follows:

Political scientists commonly seek to make statements about how a word’s use and meaning varies over circumstances—whether that be time, partisan identity, or some other document-level covariate. A promising avenue is the use of domain-specific word embeddings, which simultaneously allow for statements of uncertainty and statistical inference. We introduce the à la Carte on Text (conText) embedding regression model for this purpose. We extend and validate a simple model-based linear method of refitting pre-trained embeddings to local contexts that requires minimal input data. It outperforms well-known competitors for studying changes in meaning across groups and time. Our approach allows us to speak descriptively of systematic differences across covariates in the way that words are used, and to comment on whether a particular use is statistically significantly different to another. We provide evidence of excellent relative performance of the model, and show how it might be used in substantive research.
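
For readers less familiar with the method, the à la carte (ALC) induction step the abstract refers to can be sketched as follows (our notation here, not necessarily the paper's): a word's context-specific embedding is a learned linear transformation of the summed pre-trained embeddings of the words appearing around it.

```latex
% Sketch of the a la carte (ALC) induction step (our notation).
% C_w : the observed contexts of target word w in the local corpus
% v_{w'} : the pre-trained embedding of a context word w'
% A : a linear map estimated once from the embeddings' pre-training corpus
\[
  u_w = \frac{1}{|\mathcal{C}_w|} \sum_{c \in \mathcal{C}_w} \sum_{w' \in c} v_{w'},
  \qquad
  \hat{v}_w = \mathbf{A}\, u_w .
\]
```

Embedding regression then models these context-specific embeddings as a function of document-level covariates, which is what allows the statements of uncertainty and statistical inference described above.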

You can find the paper here and a non-technical explainer here.

R software (the conText package) for fitting our models is here, along with a vignette and links to data sets.
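
As a rough sketch of what fitting an embedding regression looks like in R, the snippet below follows the package's quick-start materials; the example objects (cr_sample_corpus, cr_glove_subset, cr_transform) and the exact argument names are assumptions based on one package version and may differ in the version you install, so please defer to the vignette.

```r
# install.packages("conText")   # or the development version from GitHub
library(conText)
library(quanteda)

# Tokenize a sample corpus of congressional speeches
# (cr_sample_corpus, cr_glove_subset and cr_transform are example objects
#  assumed to ship with conText; names may vary across versions).
toks <- tokens(cr_sample_corpus, remove_punct = TRUE)

# Embedding regression: does the use of "immigration" vary by party and gender?
set.seed(42)
model <- conText(formula = immigration ~ party + gender,
                 data = toks,
                 pre_trained = cr_glove_subset,    # pre-trained embeddings
                 transform = TRUE,
                 transform_matrix = cr_transform,  # a la carte transformation
                 bootstrap = TRUE, num_bootstraps = 100,
                 permute = TRUE, num_permutations = 100,
                 window = 6)

# Normed coefficients with bootstrapped standard errors and permutation p-values
model@normed_coefficients
```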

Comments are very welcome: please send us an email, or open an "Issue" here.
