

[WIP] Make inline and helper functions in coordinate descent to support fused types #6905

Closed

Conversation

@yenchenlin (Contributor) commented Jun 18, 2016

According to #5464, the current implementation of ElasticNet and Lasso in scikit-learn constrains the input to np.float64, which wastes memory when the user's data is float32.

This trivial PR first changes some inline and helper functions in the Cython implementation of coordinate descent to support fused types.
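To illustrate the memory cost that motivates this change, here is a small pure-Python sketch (not scikit-learn code) using the standard-library `array` module; the sizes simply reflect that float64 storage is twice as wide as float32, which is the waste #5464 describes when float32 input is upcast:

```python
from array import array

# Hypothetical illustration: a float32 input that a float64-only
# estimator would silently upcast, doubling its memory footprint.
n = 1_000_000

as_float32 = array('f', [0.0]) * n  # the dtype the user passed
as_float64 = array('d', [0.0]) * n  # the dtype a float64-only path forces

bytes_f32 = len(as_float32) * as_float32.itemsize
bytes_f64 = len(as_float64) * as_float64.itemsize

print(bytes_f32)  # 4000000
print(bytes_f64)  # 8000000 -- twice the footprint for the same values
```

With Cython fused types, the same helper can accept both `float` and `double` memoryviews, so no upcast (and no copy) is needed.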

@yenchenlin yenchenlin changed the title [MRG] Make inline and helper functions in coordinate descent to support fused types [WIP] Make inline and helper functions in coordinate descent to support fused types Jun 18, 2016
@yenchenlin yenchenlin closed this Jun 18, 2016
@jnothman (Member) commented Jun 18, 2016

Did you mean to close this?

@yenchenlin (Contributor, Author) commented Jun 21, 2016

Sorry, I've created #6913 to replace this.
