[WIP] Make inline and helper functions in coordinate descent to support fused types #6905

Closed

Conversation

@yenchenlin
Contributor

yenchenlin commented Jun 18, 2016

As noted in #5464, the current implementation of ElasticNet and Lasso in scikit-learn constrains the input to np.float64, which wastes memory when the data is float32.

As a first step, this small PR changes some inline and helper functions in the Cython implementation of coordinate descent to support fused types.
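
For illustration only, here is a minimal sketch of the kind of change this describes, not the actual diff: Cython's built-in `floating` fused type lets a single declaration specialize to both float32 and float64. The `fmax` helper below is a stand-in example.

```cython
# Minimal sketch, not the actual PR diff: a helper rewritten with Cython's
# built-in `floating` fused type so it accepts float32 and float64 inputs.
from cython cimport floating

cdef inline floating fmax(floating x, floating y) nogil:
    # Cython generates a float and a double specialization at compile time,
    # so float32 callers no longer need their data upcast to float64.
    if x > y:
        return x
    return y
```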

@yenchenlin yenchenlin changed the title from [MRG] Make inline and helper functions in coordinate descent to support fused types to [WIP] Make inline and helper functions in coordinate descent to support fused types Jun 18, 2016

@yenchenlin yenchenlin closed this Jun 18, 2016

@jnothman
Member

jnothman commented Jun 18, 2016

Did you mean to close this?

@yenchenlin
Contributor

yenchenlin commented Jun 21, 2016

Sorry, I've created #6913 to replace this.
