Remove WSL requirement, provide native Windows toolchain #18
Comments
I must agree with this sentiment. I don't want LLMs wrapped in 6000 pounds of Linux crud I have to run. Take a look at LlamaSharp (https://github.com/SciSharp/LLamaSharp): all I need are CUDA drivers and a Llama 2 model from HuggingFace. It comes with a working ASP.NET solution and Semantic Kernel connectors, supports multiple GPUs, and can be dropped into any C# UI framework. Far more useful in native Windows. That's what people want: Windows.
I did a simple model with the same result, but I think the wizard gets lost because of where the system puts the files. I installed a project under one directory, the model ended up under a mount directory, and when queried for relaunch, the project was looking for a different path. It looks like the setup installed all the required pieces (CUDA libraries for an NVIDIA GPU, Miniconda, and dependency libraries for Python) into the default Ubuntu WSL instance. IMHO, asking Microsoft to have a better toolchain is unfair. Most AI work uses Linux-based Python libraries, and it's great to have a tool that sets everything up (Miniconda, CUDA libs, and dependency Python libs) plugged into WSL Ubuntu as a start.
@eric-vanartsdalen I don't think it's unfair. Python/Conda/PyTorch are cross-platform. macOS has its own native toolchain and doesn't require a Linux subsystem. Windows deserves to have a native toolchain as well.
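The cross-platform claim is easy to check from Python itself. A minimal sketch (the function name and the choice of tools to probe are my own, not anything from Windows AI Studio; `shutil.which` just reports what is on `PATH`):

```python
import platform
import shutil


def describe_toolchain():
    """Report which parts of a native Python toolchain are visible on this OS."""
    return {
        "os": platform.system(),         # 'Windows', 'Linux', or 'Darwin'
        "python": platform.python_version(),
        "conda": shutil.which("conda"),  # None if conda is not on PATH
    }


if __name__ == "__main__":
    print(describe_toolchain())
```

The same script runs unchanged on Windows, macOS, and Linux, which is the point: the toolchain itself does not force a Linux subsystem.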
@niutech Respectfully, don't get me wrong: I'd love to see things just work on Windows.
Don't save the project inside a WSL instance; I think there's a bug filed already for this issue, #28. The WSL instance is used to create Conda environments and for CUDA; the model(s) are on the Windows filesystem. So in step 1, if you select Use
Yes, with a Windows path it worked fine. Thank you.
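For anyone hitting the same relaunch problem: the distinction the wizard apparently trips over is whether a path lives in the WSL filesystem or the Windows one. A hedged heuristic sketch (the helper name and rules are mine, not from the product; it assumes the usual `\\wsl$` / `\\wsl.localhost` UNC forms and treats `/mnt/<drive>/...` as a Windows location seen from inside WSL):

```python
def looks_like_wsl_path(path: str) -> bool:
    """Heuristic: does this path point inside a WSL instance rather than
    the Windows filesystem?"""
    normalized = path.replace("/", "\\")
    # UNC paths into a WSL distro, e.g. \\wsl$\Ubuntu\home\...
    if normalized.startswith("\\\\wsl$\\") or normalized.startswith("\\\\wsl.localhost\\"):
        return True
    # A bare POSIX path outside /mnt/ points at the Linux filesystem;
    # /mnt/c/... is a Windows drive mounted into WSL, so it does not count.
    if path.startswith("/") and not path.startswith("/mnt/"):
        return True
    return False
```

With a check like this, a setup wizard could warn before saving a project to a location that won't resolve the same way on relaunch from Windows.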
I would focus on making the best application rather than chasing down all the dependencies to get all of these tools to smoothly run on Windows natively. |
Got to agree with the sentiment here. Windows' biggest strength was that it always maintained its own vendored toolchains and dependencies for what it delivered or published, so it was possible to rely on just the native Windows stack for almost everything. Unfortunately, this has been changing over the years: instead of vendored dependencies, more and more dependencies are now swiped wholesale from other FOSS projects over which Microsoft has little control. Native Windows versions of many of these dependencies exist, so there is really no reason to have to load up Linux crud just for this. If we wanted to play with the Linux dependencies, we would be playing with them in a Linux VM. But we are on Windows, and we WANT to be on Windows, along with the Windows versions of these dependencies.
It's comical that Windows AI Studio requires Linux under the hood. Can't you provide a native Windows toolchain? Python, Conda, PyTorch, llama.cpp, and QLoRA should run natively on Windows.