Papers on Dimensionality Reduction #279
…would be willing to give a talk about this topic!
Hi @tchitra - what chapter is closest to you?
New York!
@tchitra Can you add a small description for each paper? I'm interested in this subject too.
@tcfuji Here are short summaries:
@tchitra Cool! Good luck explaining Banach spaces (treading into 2nd-year graduate-level math)! 👍
@tcfuji Thank you. We'll definitely keep you in the loop on speaking. In terms of this PR, we're all for adding these papers, but the copyrights/perms on them may cause issues. Do you mind adding their links to the README of each of the categories... w/ your short summaries? Here's one example: https://github.com/papers-we-love/papers-we-love/tree/master/experimental_algorithmics#included-papers. Thanks!
@tchitra just checking in to see if you got my last message regarding the permissions issues.
Dimensionality Reduction is interesting to me, too. What needs to be done here? I see the link to the FJLT at Princeton, and the other two can be links to arXiv (in addition to the backup in-repo). These are all under the Dimensionality Reduction section. Then there are 2 more papers, in Sublinear Algorithms. What's intended to happen with them?
@NewAlexandria sounds good! The Sublinear ones can stay as pdfs... but we've also been putting up links to binaries in the READMEs for each category, e.g. https://github.com/papers-we-love/papers-we-love/blob/master/cryptography/README.md... the one w/ the 📜 means link & pdf. I'd love to include @tchitra's summaries, #279 (comment), in the readme as well... along w/ the papers. From reading around, there's been talk of GitHub supporting something like MathJax, but nothing yet.
@NewAlexandria were you still planning to finish adding what's needed here? |
@zeeshanlakhani Sorry I haven't gotten back to you guys on the ... @NewAlexandria: The sublinear papers include count-min-sketch and a simple ...
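For readers following along, a minimal toy version of the count-min sketch named above can look like this (my own illustration, not code from any of the linked papers; the salted `hash` calls stand in for the pairwise-independent hash functions the real data structure calls for):

```python
import random

class CountMinSketch:
    """Approximate frequency counter over a depth x width grid of counters."""

    def __init__(self, width=256, depth=4, seed=0):
        rng = random.Random(seed)
        self.width = width
        self.table = [[0] * width for _ in range(depth)]
        # One salt per row; hashing (salt, item) gives depth distinct hash functions.
        self.salts = [rng.getrandbits(32) for _ in range(depth)]

    def _indexes(self, item):
        for salt in self.salts:
            yield hash((salt, item)) % self.width

    def add(self, item, count=1):
        for row, idx in enumerate(self._indexes(item)):
            self.table[row][idx] += count

    def estimate(self, item):
        # Taking the min over rows never underestimates the true count;
        # it overestimates only when an item collides in every row.
        return min(self.table[row][idx]
                   for row, idx in enumerate(self._indexes(item)))
```

The sublinear-space guarantee comes from the grid being much smaller than the number of distinct items: estimates are upper bounds whose error shrinks as width and depth grow.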
@zeeshanlakhani Let me know if there is anything else you would like me to contribute and/or if there is a certain subset of the aforementioned topics that you would prefer be covered well within the repo.
@tchitra thanks. On the licenses though, the Fast Johnson-Lindenstrauss Transform paper, for example, has an explicit copyright on the bottom:
I have added the README changes in #308. Merge this branch first, as that branch is rebased. Hope this works across forks, but I'll rebase again if not. |
@zeeshanlakhani requested I get this merged. The SIAM paper was not removed by @NewAlexandria, so my plan is to merge this, then merge #308, and then remove the paper in another pull request... Not ideal, but it gets this cleaned up.
Papers on Dimensionality Reduction
I've added a few papers on the theory of dimensionality reduction. A lot of this work is used to theoretically justify the performance of a variety of machine learning techniques and heuristics such as random forests, approximate least-squares and compressed sensing. The papers included are:
I'd be willing to give a talk on one of these papers (presumably not the last one, though, as it is quite deep and hard to cover in a short talk!)
[0] https://www.cs.princeton.edu/~chazelle/pubs/FJLT-sicomp09.pdf
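For a rough feel of the distance-preservation guarantee these papers study, here is a toy Python sketch of a plain Gaussian random projection in the spirit of the Johnson-Lindenstrauss lemma (my own illustration, not the FJLT of [0], which achieves the same kind of guarantee faster via a preconditioned sparse projection):

```python
import math
import random

def random_projection(vectors, k, seed=0):
    """Project d-dimensional vectors down to k dimensions with a random
    Gaussian matrix. Scaling by 1/sqrt(k) keeps squared norms (and hence
    pairwise distances) approximately preserved, per the JL lemma."""
    rng = random.Random(seed)
    d = len(vectors[0])
    # k x d matrix with i.i.d. N(0, 1) entries, scaled by 1/sqrt(k).
    R = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
         for _ in range(k)]
    return [[sum(row[j] * v[j] for j in range(d)) for row in R]
            for v in vectors]
```

The point of the theory is that k can be chosen on the order of log(n)/eps^2, independent of the original dimension d, while distorting pairwise distances by at most a (1 +/- eps) factor with high probability.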