
WIP: Add a metapackage to select BLAS #525

Closed
wants to merge 6 commits into from

Conversation

jakirkham
Member

Adds a metapackage to enable the use of OpenBLAS. It is called blas_openblas. Have tried to fill in the standard pieces thus far, but it feels a bit verbose for how simple the recipe is.

@conda-forge-linter

Hi! This is the friendly automated conda-forge-linting service.

I wanted to let you know that I linted all conda-recipes in your PR (recipes/blas_openblas) and found some lint.

Here's what I've got...

For recipes/blas_openblas:

  • The home item is expected in the about section.
  • The license item is expected in the about section.

@conda-forge-linter

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipes/blas_openblas) and found it was in an excellent condition.


test:
commands:
- conda list blas_openblas
Member Author

So, this is a challenging package to test because it doesn't do much, but I tried to test it anyway and this seems like a simple test. However, it fails because the test environment has not been activated. Here is the full error message.

TEST START: blas_openblas-1.0.0-0
Fetching package metadata: ........
.Solving package specifications: .........

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    blas_openblas-1.0.0        |                0          634 B  file:///opt/conda/conda-bld/linux-64/

The following NEW packages will be INSTALLED:

    blas_openblas: 1.0.0-0 file:///opt/conda/conda-bld/linux-64/

+ conda list blas_openblas
Error: Error: environment does not exist: /opt/conda/envs/_build
#
# Use 'conda create' to create an environment before listing its packages.
TESTS FAILED: blas_openblas-1.0.0-0

Member Author

Opened this issue ( conda/conda-build#910 ) with regards to this error.


test:
commands:
- conda list -n _test blas_openblas
Member Author

Setting the environment here feels like a horrific hack and quite fragile. Unfortunately, not setting it runs into this issue ( conda/conda-build#910 ). So, not really sure there is another option ATM.

@msarahan
Member

msarahan commented May 2, 2016

as I said:

It doesn't really enable use of OpenBLAS - it just keeps the packages in the system all using OpenBLAS for their blas needs.

For example, numpy, scipy, and scikit-learn all use BLAS of some sort. People may compile each of them with any BLAS library. Features try to line these all up. In some cases, the only gain is disk space and perhaps memory footprint. In other cases, mixing different builds might be harmful. This is the case with the MS Visual C Runtime. Memory that is allocated in one runtime must be deallocated in that same runtime - or else crashes happen. Bad behavior can be more subtle, too. Using features, we keep all of the libraries that Python uses compiled with the same VS version to avoid these crashes.

Features have limitations. In particular, you are not able to mix different programs in a given environment if they haven't been compiled with the same compiler. This is only really a problem when shared libraries are involved, though. For now, the only solution to work around needing a program (say, Python 2.7/vc9) in a different feature environment (say Python 3.5/vc14) is to install it in a separate environment and use filesystem paths to use it.

@jakirkham
Member Author

I understand that. What I was looking for in that question is how would you say this in one sentence or less.

@msarahan
Member

msarahan commented May 2, 2016

Sorry, IMHO it is not a one sentence concept. That's why we need a link to somewhere else.

@jakirkham
Member Author

Agreed. However, we want to always have a summary here. So, we need to think of what fits there if what I wrote can't go there.

Could we do enable -> "enable"? Or could we use "encourage"? Something totally different?

@jakirkham
Member Author

@mcg1969, would really appreciate your feedback on this before we go too far. We are trying to use features with BLASes (I don't really see another way ATM and that is how MKL is used too). Should we also track nomkl here?

@mcg1969
Contributor

mcg1969 commented May 2, 2016

I do have very strong opinions here :-) but I am slammed today. I will get to it ASAP

@mcg1969
Contributor

mcg1969 commented May 3, 2016

The problem with features is they are binary: mkl used to cause packages to prefer MKL. nomkl, on the other hand, now causes packages to prefer, well, non-MKL, which now happens to be openblas. What happens if I manage to include both mkl and nomkl in my environment? Well, conda gets to do whatever it wants, really. And what happens if I'm interested in using Accelerate on a Mac? Or if I want to use ACML?

And what happens if I want to use MKL with Python and OpenBLAS with R?

I can simulate the behavior I really want with metapackages. To start, I'll name the package python_blas, not just BLAS. I create multiple builds of this package with different build strings: openblas, mkl, acml, accelerate, etc. (I only have to build the ones I need; if someone else offers a different option, they can add it themselves). I give every one of them the same version number, say 0, except for whichever one I want selected as the default, which I give 1.

Now, whenever I build a Python package that depends on BLAS, I make it depend on a particular build of python_blas; e.g., python_blas * openblas. Mutual exclusion of packages with the same name ensures that only one python_blas is active in an environment.

I can handle the actual BLAS dependency in several ways.

  1. I could make numpy depend both on openblas and python_blas * openblas. Seems clumsy.
  2. I could make numpy depend solely on python_blas * openblas, and make openblas a dependency of python_blas-0-openblas. The problem here is: what do I do if I need to control BLAS versions?
  3. I could discard the simple 0/1 versioning of python_blas in favor of mirroring the actual OpenBLAS package; e.g., python_blas-2.3-openblas would depend on openblas 2.3. That would have weird effects when I have parallel version tracks for mkl.
  4. I could do option 2, but when I need a specific version of OpenBLAS, I do option 1.

Because I've named it python_blas, I don't have a problem now if I want to build a parallel set of options for R, such as r_blas.

Is this a horrible hack? You betcha. Does it give us desirable behavior? Yep. If we like how it works, can we add syntactic sugar to conda so that it doesn't look so hackish? Absolutely.
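The selection rule sketched above can be illustrated with a toy model (plain Python, not conda's actual resolver; the candidate packages are invented for illustration): every variant shares a version except the chosen default, which gets a higher one, so an unpinned solve prefers the default while a build-string pin selects a specific variant.

```python
from typing import NamedTuple


class Pkg(NamedTuple):
    name: str
    version: int
    build: str


# All variants at version 0, except the default, which gets version 1.
candidates = [
    Pkg("python_blas", 0, "mkl"),
    Pkg("python_blas", 0, "accelerate"),
    Pkg("python_blas", 1, "openblas"),  # the chosen default
]


def select(pool, build=None):
    """Pick one build: filter by build string if pinned, else take the
    highest version, which is how the default wins an unpinned solve."""
    matches = [p for p in pool if build is None or p.build == build]
    return max(matches, key=lambda p: p.version)


print(select(candidates).build)         # openblas (default wins unpinned)
print(select(candidates, "mkl").build)  # mkl (pin overrides the default)
```

Mutual exclusion comes for free: since all variants share the name python_blas, any real solver can install at most one of them per environment.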

@jakirkham
Member Author

jakirkham commented May 3, 2016

I think I follow what you are saying here @mcg1969 and I appreciate you taking time to comment. The idea sounds quite nice actually (though I might tweak it a little) and I think its behavior is closer to what we really want.

Could you maybe draw up some simple example recipes so it might be a little easier to follow? Maybe just a bare-bones recipe for the feature and a bare-bones recipe for the package. One feature case should be sufficient to start, I think.

@mcg1969
Contributor

mcg1969 commented May 3, 2016

Sure, I'll work on it. Stay tuned

@jakirkham
Member Author

jakirkham commented May 3, 2016

Awesome. Thanks.

If you don't have enough time, I could try drawing up what I understand you to mean and we could iterate. However, you want to go about it.

@mcg1969
Contributor

mcg1969 commented May 3, 2016

Honestly, if I consider the specific case for BLAS & Python, I'd love to see this be implemented not as a metapackage but rather as middleware. That is to say, python_blas actually exposes a standard BLAS API that any Python package like NumPy, SciPy, etc. uses. But what's done is done.

@mcg1969
Contributor

mcg1969 commented May 3, 2016

Before I implement an example, care to give me a gut feel about which approach you like the best of the 4 I enumerated in my (edited) comment?

And note that I'm being deliberate here to put the BLAS type in the build string and not the version. This is because I don't want to have to rely on lexicographic order to decide which BLAS is the default.

@jakirkham
Member Author

While it is clumsy, I'm kind of liking 1. We can always use jinja templates to make this feel less clumsy. I'm having trouble getting comfortable with the trade-offs in the other options.

@mcg1969
Contributor

mcg1969 commented May 3, 2016

OK, 1 is probably my second choice, but I'm partial to 4. Since they're close I'll do that here. Here's a meta.yaml for the OpenBlas version of python_blas. Seems to me that this would be easy to manage with templates.

package:
  name: python_blas
  version: 0 # 1 to choose the default
build:
  number: 0
  string: openblas # or mkl, or acml, or accelerate
requirements:
  run:
    - openblas # or mkl, or acml, or accelerate, or whatever

Will the build number get appended to the build string? I don't recall offhand, but I hope not.

And a package that uses BLAS:

package:
  name: blas_caller
  version: 1.2.3
requirements:
  build:
    - python
    - python_blas * openblas
    - openblas >1.25 # only needed if I need to control the openblas version
  run:
    - python
    - python_blas * openblas
    - openblas >1.25 # only needed if I need to control the openblas version

To implement Option 1, you just drop the openblas dependency from the metapackage, and of course it now becomes mandatory in blas_caller.
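For comparison, a sketch of what that Option 1 metapackage might look like under the same assumptions as the snippets above (hedged reconstruction, not a recipe from this PR): the metapackage becomes a pure marker and the BLAS dependency moves entirely into the caller.

```
package:
  name: python_blas
  version: 0 # 1 to choose the default
build:
  number: 0
  string: openblas # or mkl, or acml, or accelerate
# Option 1: no run dependency on openblas here; blas_caller must now
# depend on both python_blas * openblas and openblas itself.
```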

@jakirkham
Member Author

I see. I guess I wasn't quite following your description of 4. I was worried there was more maintenance burden in there somewhere. It doesn't actually look so bad now that I'm seeing it.

@jakirkham
Member Author

Will the build number get appended to the build string? I don't recall offhand, but I hope not.

I tested this and it does not. Though it doesn't show up in the package name at all, which doesn't seem good.

@conda-forge-linter

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipes/blas) and found it was in an excellent condition.

@jakirkham
Member Author

jakirkham commented May 3, 2016

Alright, so I have tried to revise this recipe with the suggestions that you have provided @mcg1969. Please let me know what you think.

As the build number is not included in the filename, I felt that this needed to be handled another way, so I added some jinja to configure this. Maybe it is too much, but it just turns every build number increase into a .postN release of the version.
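A hypothetical reconstruction of the kind of template this describes (the actual jinja lives in the PR diff and is not quoted here; names and values are illustrative): the build number is folded into a .postN suffix on the version so that rebuilds sort after the base release.

```
{% set version = "1.0.0" %}
{% set build_num = 2 %}

package:
  name: blas
  # build_num > 0 yields e.g. 1.0.0.post2, which sorts after 1.0.0
  version: {{ version }}{% if build_num > 0 %}.post{{ build_num }}{% endif %}

build:
  number: {{ build_num }}
  string: openblas
```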


test:
commands:
- conda list -n _test blas
Member Author

Setting the environment here feels like a horrific hack and quite fragile. Unfortunately, not setting it runs into this issue ( conda/conda-build#910 ). So, not really sure there is another option ATM.

@jakirkham jakirkham changed the title WIP: Add a metapackage to select OpenBLAS WIP: Add a metapackage to select BLAS May 3, 2016
@jakirkham
Member Author

So, this doesn't seem to be winning out against build numbers. I switched to building NumPy 1.10.4 as there is a package with build number 1 there. I built a version of NumPy using build number 0. Here is what happens.

$ conda create --use-local -n npenv blas=1.0.0=openblas numpy=1.10
Using Anaconda Cloud api site https://api.anaconda.org
Fetching package metadata: ........
Solving package specifications: .........

Package plan for installation in environment /opt/conda/envs/npenv:

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    numpy-1.10.4               |           py35_1         5.9 MB  defaults

The following NEW packages will be INSTALLED:

    blas:       1.0.0-openblas file:///opt/conda/conda-bld/linux-64/
    libgcc:     5.2.0-0        defaults                             
    mkl:        11.3.1-0       defaults                             
    numpy:      1.10.4-py35_1  defaults                             
    openblas:   0.2.18-1       conda-forge                          
    openssl:    1.0.2h-0       defaults                             
    pip:        8.1.1-py35_1   defaults                             
    python:     3.5.1-0        defaults                             
    readline:   6.2-2          defaults                             
    setuptools: 20.7.0-py35_0  defaults                             
    sqlite:     3.9.2-0        defaults                             
    tk:         8.5.18-0       defaults                             
    wheel:      0.29.0-py35_0  defaults                             
    xz:         5.0.5-1        conda-forge                          
    zlib:       1.2.8-0        conda-forge                          

@jakirkham
Member Author

Admittedly, this is probably a collision so maybe we need a better example to test this on. I'll try building my own copies of 1.11.0.

@jakirkham
Member Author

To avoid collisions, I noted that the most recent version of numpy from defaults is 1.11.0 with build number 0. So, I took my recipe for numpy and changed the build number to 1 and built it (no other builds exist in my environment). Then I stripped the blas package, added the openblas package instead, and bumped the build number to 2. After building both of these and trying to install them in a fresh environment, it picked the higher build number for numpy.

$ conda create --use-local -n npenv blas=1.0.0=openblas numpy=1.11
Using Anaconda Cloud api site https://api.anaconda.org
Fetching package metadata: ........
Solving package specifications: .........

Package plan for installation in environment /opt/conda/envs/npenv:

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    numpy-1.11.0               |           py35_2         6.8 MB  file:///opt/conda/conda-bld/linux-64/

The following NEW packages will be INSTALLED:

    blas:       1.0.0-openblas file:///opt/conda/conda-bld/linux-64/
    libgcc:     5.2.0-0        defaults                             
    numpy:      1.11.0-py35_2  file:///opt/conda/conda-bld/linux-64/
    openblas:   0.2.18-1       conda-forge                          
    openssl:    1.0.2h-0       defaults                             
    pip:        8.1.1-py35_1   defaults                             
    python:     3.5.1-0        defaults                             
    readline:   6.2-2          defaults                             
    setuptools: 20.7.0-py35_0  defaults                             
    sqlite:     3.9.2-0        defaults                             
    tk:         8.5.18-0       defaults                             
    wheel:      0.29.0-py35_0  defaults                             
    xz:         5.0.5-1        conda-forge                          
    zlib:       1.2.8-0        conda-forge                          

@mcg1969
Contributor

mcg1969 commented May 6, 2016

I suppose we could change it, but I do like the idea of semantic versioning here. We may try to bake stuff in that makes the build process go smoothly and it would be nice to separate enhancements from bug fixes and the like.

Well yes, but again, these are metapackages that are intended to emulate behavior that might eventually be baked into conda directly. They're not intended to be built as normal packages or behave like them.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

So, this doesn't seem to be winning out against build numbers.

I'm not sure why it should be expected to. We haven't changed conda's versioning rules here.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

The build number for the blas package should not really matter. What mattered here is that you have a higher build number for numpy.

@jakirkham
Member Author

The build number for the blas package should not really matter.

Yeah, didn't change that.

What mattered here is that you have a higher build number for numpy.

The higher one doesn't have the build string.

I'm not sure why it should be expected to. We haven't changed conda's versioning rules here.

So, let me understand. The purpose of the build string will be to select between numpy (or other) packages that have the same version and build number, but a different build string.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

The higher one doesn't have the build string.

conda doesn't use the build string during the resolution process. Its only purpose is to make the filenames unique. The build number is all that conda considers when prioritizing packages.

Remember, this isn't going to behave like a feature. Installing blas is not going to force the other packages to depend on it. All this is for is to make sure that any packages that do depend on blas all depend on the same one.
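A toy illustration of that prioritization rule (plain Python, not conda's actual code; the candidate builds are invented, loosely echoing the numpy experiment above): the ordering key considers version and build number only, and the build string plays no part.

```python
# (version, build_number, build_string) for two hypothetical numpy builds
builds = [
    ("1.11.0", 1, "py35_openblas_1"),
    ("1.11.0", 2, "py35_2"),
]


def priority(pkg):
    """Rank a build by version then build number; the build string is
    ignored, serving only to keep filenames unique."""
    version, build_number, _build_string = pkg
    return tuple(int(x) for x in version.split(".")), build_number


best = max(builds, key=priority)
print(best[2])  # py35_2: the higher build number wins regardless of string
```

This is why the variant-tagged build lost: at equal versions, nothing about the string "openblas" raises its priority over a plain build with a larger build number.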

@mcg1969
Contributor

mcg1969 commented May 6, 2016

All the more reason why we don't want to think of these as features, but rather as variants.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

This is all going to be easier to sort out once the channel collision fix is released (4.1). When that happens, you will be able to stop gaming your version and build numbers to compete with defaults.

That said, you may find that it will be worthwhile to consider backporting these metadata tricks to earlier builds, to avoid weird cases where conda mixes and matches earlier builds with newer ones.

@pelson
Member

pelson commented May 6, 2016

This is a pretty lengthy and hugely valuable thread which I'm keen to see continuing, but I'd like to exercise a little caution in pinning conda-forge's ability to deliver numpy, scipy etc. on an interface which is quite so non-standard. Whilst I don't believe anybody here actually likes the way features have been used for blas/mkl for numpy in defaults, the fact remains that we are going to need to maintain compatibility with the defaults channel, even if we go a completely different route. Pragmatically, to me that means initially mirroring the approach that has been taken on defaults and finding a long-term, friendly API which gives us solutions to the usecases that we require. To ground that a little, it is very conceivable to me that we end up solving the "multiple blas" or "multiple vc" issue by using multiple environments rather than adding the complexity of supporting multiple variants within a single environment.

As I said, this conversation has huge value, but I don't think we need a conclusion (and conda PRs) to be able to do something useful for conda-forge in the short term.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

Well, I can assure you that there is plenty of buy-in within Continuum about the concept of variants. We may have some issues to work out when it comes to making a clean transition from the current feature-based selection approach. But variants have many uses outside of BLAS.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

To ground that a little, it is very conceivable to me that we end up solving the "multiple blas" or "multiple vc" issue by using multiple environments rather than adding the complexity of supporting multiple variants within a single environment.

This may reflect a bit of a misunderstanding about what we're trying to accomplish here. This isn't really about supporting multiple variants in the same environment---this is primarily about making sure we don't accidentally mix variants.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

The misunderstanding may arise from my desire to support different BLAS versions for different programs---for example, MKL for Python and OpenBLAS for R. But nothing we're doing here prevents that.

@pelson
Member

pelson commented May 6, 2016

The misunderstanding may arise from my desire to support different BLAS versions for different programs---for example, MKL for Python and OpenBLAS for R. But nothing we're doing here prevents that.

Indeed not. But that is prevented in the existing features approach, and to my understanding that is the reason a python_blas package has been mentioned in this thread (even if it isn't still on the table).

Healthy debate brings with it completely orthogonal ideas which as individuals we can easily miss - you may be interested in https://github.com/pelson/conda-execute/issues/20#issuecomment-217570352 as just one of those orthogonal ideas for dealing with variants across the process boundary (this is effectively the idea of child environments, which I don't think is completely new).

In order to avoid descending into completely blue sky activity we must try to remember what we are trying to achieve - we need a safe (and ideally convenient) way of packaging blas so that we can build on top of it to provide a suitable numpy, scipy etc.. Right now conda-forge does not have these keystone packages, yet defaults does. Whilst we all agree that the solution there isn't ideal (i.e. using features) I don't want conda-forge's capabilities to be limited by a future design.

As I said initially, this conversation is hugely valuable, and I'm hopeful it will lead to an improved conda for both conda-forge and defaults to benefit from. I just don't think it is healthy for conda-forge to necessarily wait for that to be implemented.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

Indeed not. But that is prevented in the existing features approach,

I'm pretty sure that's not the case. That is to say: if we were forced to stick with features, we could still have different BLAS libraries in the same environment, with Python linking to one and R linking to another.

to my understanding is the reason that a python_blas package has been mentioned in this thread (even if it isn't still on the table).

Well yes, but even if we were to standardize on the use of a blas name for the Python BLAS variants, we could still do r_blas for the R BLAS variants. So I'm really not worried about the specific names chosen.

@mcg1969
Contributor

mcg1969 commented May 6, 2016

I certainly agree that conda forge should not get too far ahead of defaults here. But what I'm saying is that if we have unity on this proposal, and I think we can get there, then we can perform the necessary evolution in both channels, and soon.

@jakirkham
Member Author

So, as someone who has tried to use features in various ways to package BLASes I can say from experience (please forgive the liberal analogy) it is like walking down a dark alley in Detroit. We don't want to do it and when we do it anyways we hear a voice telling us we never should have done it.

This build string approach (which we have called variants) sounds very experimental, but it isn't so wild in the end. It is interesting, but it is effective and simple. This is what I love about @mcg1969's thoughts on this. We are tweaking the incredibly powerful conda solver in a very slight way and gaining quite a bit from it.

We should not be so fearful here as this is really what we will want in the end IMHO. There is some syntactic sugar and some tweaks to the solver sure, but they shouldn't distract us from the elegance and simplicity that is already here ready for the taking. The future is now gentlemen. Let's embrace it.

@pelson
Member

pelson commented May 8, 2016

@mcg1969 - if we had the ability to express "conflicts" (in the RPM sense), would the analysis in this PR be different? Would it be easier at that point to be able to state that openblas "conflicts" with mkl? I appreciate that it still has the issue of not being able to install both into the same environment, but I don't think that needs to be a driving use case.

@mcg1969
Contributor

mcg1969 commented May 8, 2016

Actually, if you added these variant metapackages as dependencies of the various BLAS packages themselves (that is, openblas, mkl, etc.), then you would get a sort of "conflicts" behavior. It would make it impossible to install more than one BLAS into the same environment.

We have multiple cases, however, where we want to be able to install the packages side by side but allow only one to be depended upon in a given context (Python, R, standalone, etc.) So we don't want to be locked into providing just a "conflicts" behavior.
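A hedged sketch (hypothetical recipe fragment, not proposed in this PR) of what baking that "conflicts" behavior into the BLAS packages themselves would look like: each real BLAS package carries a run dependency on its own variant of the metapackage.

```
package:
  name: openblas
  version: 0.2.18
requirements:
  run:
    # mutual exclusion: anything requiring blas * mkl now cannot be
    # installed alongside openblas in the same environment
    - blas * openblas
```

As noted above, this is exactly the lock-in to avoid if BLAS libraries should remain installable side by side.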

@pelson
Member

pelson commented May 12, 2016

We have multiple cases, however, where we want to be able to install the packages side by side but allow only one to be depended upon in a given context (Python, R, standalone, etc.) So we don't want to be locked into providing just a "conflicts" behavior.

I really don't want that to be a driving usecase. I do not believe the machinery should, and ever will, exist within conda to deal with that in a single environment correctly. The usecase which kills the discussion for me is Jupyter's desire to install two versions of python into the same environment - py3 for Jupyter itself, and py2 for a py2k kernel. Process fork compatibility (is there a better name?) can be solved very well by having sibling or child/parent sub-environments and I believe that to be the easiest and most reliable way to express and manage these dependencies.

The discussion on this use case belongs in conda/conda IMO. We shouldn't be hacking our variant names with namespaces (blas vs python_blas vs r_blas) to achieve the desired result - that should be designed into conda directly.


@jakirkham - as the author of it, is the purpose of this PR to have this debate, or is it a critical piece of being able to move forwards with a numpy recipe?


@mcg1969 - can we start looking at this problem from the other end? How do we want this to look for our users? I think it is perfectly reasonable to invent syntaxes here which could become part of conda. Here is a starter for 10 (I haven't thought this through thoroughly, and may be overloading [):

# Gives me the numpy that conda-forge recommends by default (openblas)
conda install numpy

# Allows me to choose the variant
conda install numpy[openblas]

# **Must** install a compatible scipy without me knowing that it needs openblas.
conda install scipy

# Removes the old numpy, and changes all other packages which use the openblas variant (i.e. scipy)
conda install numpy[mkl]

@mcg1969
Contributor

mcg1969 commented May 12, 2016

I really don't want that to be a driving usecase. I do not believe the machinery should, and ever will, exist within conda to deal with that in a single environment correctly.

I'm afraid this isn't an option for Continuum. Conda-forge may never support it, and that is fine, but we must.

@jakirkham
Member Author

as the author of it, is the purpose of this PR to have this debate, or is it a critical piece of being able to move forwards with a numpy recipe?

The purpose was and still is (for me at least) to find the quickest and most correct way to move NumPy, SciPy, scikit-learn, and so many other BLAS-dependent packages forward. I'm eager to stop maintaining my own builds of these and this is the critical piece. Also, I think it is ready as is. Though I would really like some other people that use BLAS in conda to take a look and share some thoughts on it.

@mcg1969
Contributor

mcg1969 commented May 12, 2016

Longer answer, now that I'm home from travel...

I really don't want that to be a driving usecase. I do not believe the machinery should, and ever will, exist within conda to deal with that in a single environment correctly.

As I've indicated, we have use cases where this capability is necessary. We can debate the syntactic sugar necessary to pull it off, but it is imperative that we do not artificially prevent different BLAS or CRT implementations from living side-by-side in the same environment---for instance, by embedding some sort of conflicts functionality inside the openblas and mkl packages themselves.

If the conda-forge project chooses to enforce a single variant across all platforms (Python, R, standalone), that's fine, but it needs to be done in such a way that it will not hinder our efforts to do differently.

The usecase which kills the discussion for me is Jupyter's desire to install two versions of python into the same environment - py3 for Jupyter itself, and py2 for a py2k kernel.

It seems very odd to allow a single difficult usecase like this to "kill the discussion". In fact, I personally have no desire to enable two versions of Python to be installed in the same environment, even if in theory, it could be done. What I cannot accept is for this particular usecase to prevent me from installing Python and R in the same environment.

The discussion on this use case belongs in conda/conda IMO. We shouldn't be hacking our variant names with namespaces (blas vs python_blas vs r_blas) to achieve the desired result - that should be designed into conda directly.

We haven't made any final movements yet on the conda syntactic sugar that helps manage variants within conda. Let's certainly continue that discussion. But I don't want to bake anything into conda until we've figured out the behavior we need.

@mcg1969
Contributor

mcg1969 commented May 12, 2016

can we start looking at this problem from the other end? How do we want this to look for our users?

We're already talking about something here. I'm going to continue the discussion there.

@jakirkham
Member Author

Let's keep the discussion over there for now.

We have opted to go a different route ( #643 ) for now. This pulls ideas from here, but ends up needing to rely on features to some extent. It is what @pelson and I found to work ok for the near term given the current state of affairs. This is already in use at conda-forge now and we will be building it out.

Ultimately we would like to fix how we handle BLAS and this conversation is still very important to us, but we needed to do something quickly. By doing so, I hope this will relieve some pressure as we try to figure out the right way to solve this problem in the long term.

@jakirkham jakirkham closed this May 21, 2016
@jakirkham jakirkham deleted the add_blas_openblas branch May 21, 2016 19:40
@jakirkham jakirkham mentioned this pull request May 24, 2016