
Remove WSL requirement, provide native Windows toolchain #18

Open
niutech opened this issue Dec 14, 2023 · 9 comments

Comments

@niutech

niutech commented Dec 14, 2023

It's comical that Windows AI Studio requires Linux under the hood. Can't you provide a native Windows toolchain? Python, Conda, PyTorch, llama.cpp, QLoRA should run natively in Windows.
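
As a rough illustration, here is a minimal sketch of the kind of check a native toolchain would enable, assuming the CUDA build of PyTorch has been installed directly on Windows via pip or Conda (no WSL involved; this is not something Windows AI Studio ships today):

```python
# Minimal sketch: confirm the toolchain runs natively on Windows.
# Assumes a Windows Python install plus the CUDA build of PyTorch.
import platform
import torch

print(platform.system())          # expected: "Windows", not a WSL "Linux"
print(torch.__version__)          # e.g. a +cuXXX build
print(torch.cuda.is_available())  # True if the native CUDA stack is wired up
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```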

@PaulaScholz

I must agree with this sentiment. I don't want LLMs dragging along 6,000 pounds of Linux crud that I have to run. Take a look at LlamaSharp: https://github.com/SciSharp/LLamaSharp. All I need are CUDA drivers and a Llama2 model from HuggingFace. It comes with a working ASP.NET solution and Semantic Kernel connectors, works with multiple GPUs, and can be put into any C# UI framework. Far more useful in native Windows. That's what people want. Windows.

@meetrais

I am getting the message below after the Model Fine Tuning project completes and I relaunch the project:
"Workspace does not exist"

[screenshot: Windows-AI-Error]

@eric-vanartsdalen

I did a simple model and got the same result, but I think the wizard gets lost because of where the system puts the files...

I installed a Project under:
/home/{user}/{project_folder}/{project_name}...
The setup & download occurred...

Where the model ended up was under a mount directory:
/mnt/c/home/{user}/{project_folder}/{project_name}

When queried for relaunch, the project was looking for:
/mnt/home/{user}/{project_folder}
This could be solved by the wizard creating a symbolic link in the project folder pointing to the actual install location, linking from the home path rather than the /mnt mount. A rough sketch of one way to read that is below.
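
Purely hypothetical sketch (run inside the WSL distro): link the path the relaunch wizard queries to the directory the model actually landed in. The {placeholders} match the paths above and need real names.

```python
# One reading of the symlink workaround, run inside the WSL distro.
# The {placeholders} mirror the paths above; substitute real names.
# Creating directories under /mnt may need elevated permissions.
from pathlib import Path

actual = Path("/mnt/c/home/{user}/{project_folder}")  # where the model actually landed
expected = Path("/mnt/home/{user}/{project_folder}")  # where the relaunch goes looking

expected.parent.mkdir(parents=True, exist_ok=True)
if not expected.exists():
    expected.symlink_to(actual)  # relaunch now resolves to the real files
```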

It looks like the setup installed all the required pieces (CUDA libraries for an NVIDIA GPU, Miniconda, and the dependent Python libraries) into WSL, pointing at the default Ubuntu distro.

IMHO, asking Microsoft to provide a better toolchain is unfair. Most AI work uses Linux-based Python libraries. Having a tool that sets everything up (Miniconda, CUDA libs, and the dependent Python libs) and plugs into WSL Ubuntu is a great start.

@niutech
Author

niutech commented Dec 18, 2023

@eric-vanartsdalen I don't think it's unfair. Python/Conda/PyTorch are cross-platform. MacOS has its own native toolchain and doesn't require a Linux subsystem. Windows deserves to have a native toolchain as well.

@eric-vanartsdalen

@niutech Respectfully - Don't get me wrong, I'd love to see things just work on Windows.
However, Windows is not a Unix-like OS the way MacOS is. Many of the AI models were born out of Linux work and Python libraries. (I've struggled previously to get environments working, only to find that some of the lower-level Python libraries they reference don't have a Windows equivalent or are broken, Conda included.) I also realize this is a preview, so who knows what the future brings...

@zcobol

zcobol commented Dec 19, 2023

Don't save the project inside a WSL instance. I think there's a bug filed already for this issue, #28

The WSL instance is used to create Conda environments and for CUDA; the model(s) live on the Windows filesystem. So in step 1, if you select /home/<username>/project it will be saved as C:\home\<username>\project and will result in a "Workspace does not exist" error.

Use Local to select a folder while the remote connection is up. For example, if the project is saved in C:\project it will be mounted under /mnt/c/project in step 3.
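
To make that mapping concrete, here is a small illustrative sketch of the Windows-path to WSL-mount translation (plain string handling, not anything Windows AI Studio itself ships):

```python
# Illustrative only: how a Windows path maps to its WSL mount point.
from pathlib import PureWindowsPath, PurePosixPath

def to_wsl_mount(win_path: str) -> PurePosixPath:
    """Map e.g. C:\\project to /mnt/c/project."""
    p = PureWindowsPath(win_path)
    drive = p.drive.rstrip(":").lower()        # "C:" -> "c"
    return PurePosixPath("/mnt", drive, *p.parts[1:])

print(to_wsl_mount(r"C:\project"))             # /mnt/c/project
```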

@meetrais

Don't save the project inside a WSL instance. […] Use Local to select a folder when remote connection is up.

Yes, with a Windows path it worked fine. Thank you.

@FruityWelsh

I would focus on making the best application rather than chasing down all the dependencies needed to get all of these tools running smoothly on Windows natively.

@etna

etna commented Dec 28, 2023

Got to agree with the sentiment here.

Windows' biggest strength was how it always maintained its own vendored toolchains and dependencies for whatever it delivered or published, so it was possible to rely on just the native Windows stack for almost everything.

Unfortunately, this has been changing over the years. Instead of vendored dependencies, more and more dependencies are now just swiped wholesale from FOSS projects over which Microsoft has little control.

Many of these dependencies exist for Windows. There is really no reason to have to load up Linux crud just for this. If we wanted to play with the Linux dependencies, we would be playing with them in a Linux VM. But we are on Windows. And we WANT to be on Windows, along with the Windows versions of these dependencies.
