

Question on GPU #15

Open
Mouss94 opened this issue May 17, 2016 · 5 comments

Comments

@Mouss94

commented May 17, 2016

Hello! First, thank you for this great project.
I would like to know whether TensorFlow uses the GPU on the Raspberry Pi.
Thank you.

@samjabrahams

Owner

commented May 17, 2016

Thanks for the question! Unfortunately, at this time TensorFlow isn't compatible with the GPU on the Raspberry Pi, as TensorFlow only supports NVIDIA CUDA graphics cards.

One day (hopefully), a group of people will find a reasonable way to map OpenCL/CUDA calls onto the RPi GPU!

In the meantime, if you'd like to create custom deep learning code for the Raspberry Pi's GPU, a good place to get started is this post by Pete Warden.
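A quick way to confirm this limitation for yourself is to ask TensorFlow which devices it can see. The sketch below uses `device_lib.list_local_devices()`, TensorFlow's internal device-listing helper; on a Raspberry Pi the result contains only a CPU entry. This is an illustrative sketch, guarded so it degrades gracefully where TensorFlow isn't installed:

```python
# Sketch: list the device types TensorFlow can use. On a Raspberry Pi
# this shows only a CPU entry, confirming the GPU is not visible to TF.
try:
    from tensorflow.python.client import device_lib
    device_types = [d.device_type for d in device_lib.list_local_devices()]
except ImportError:
    device_types = []  # TensorFlow is not installed in this environment

print(device_types)
```

If `'GPU'` never appears in that list, TensorFlow will silently place every op on the CPU.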

@danbri


commented Apr 12, 2017

Presumably XLA is part of the answer.

https://twitter.com/danbri/status/839431030201266177 -> https://developers.googleblog.com/2017/03/xla-tensorflow-compiled.html

XLA (Accelerated Linear Algebra), a compiler for TensorFlow. XLA uses JIT compilation techniques to analyze the TensorFlow graph created by the user at runtime, specialize it for the actual runtime dimensions and types, fuse multiple ops together and emit efficient native machine code for them - for devices like CPUs, GPUs and custom accelerators (e.g. Google’s TPU).

-> https://www.tensorflow.org/performance/xla/
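For reference, in the TensorFlow 1.x API that post describes, XLA JIT compilation is switched on through the session config. The following is a hedged sketch, assuming a TensorFlow 1.x install (`ConfigProto` was removed in 2.x, which the guard accounts for); `ON_1` is the per-session JIT level:

```python
# Sketch: enable XLA JIT compilation for a TensorFlow 1.x session, as
# described in the XLA announcement post. With this set, compatible ops
# in the graph are fused and compiled to native machine code at runtime.
try:
    import tensorflow as tf
    config = tf.ConfigProto()
    config.graph_options.optimizer_options.global_jit_level = (
        tf.OptimizerOptions.ON_1)
    # A session created with tf.Session(config=config) would then use XLA.
    xla_configured = True
except (ImportError, AttributeError):
    # TensorFlow 1.x is not available here (ConfigProto is absent in 2.x).
    xla_configured = False
```

Note that XLA here still targets CPUs, CUDA GPUs, and custom accelerators; it does not by itself add a backend for the Pi's VideoCore GPU.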

@samjabrahams

Owner

commented Apr 13, 2017

That's a pretty fun idea! I can imagine it being a good exercise, and useful if implemented correctly. For me personally it'll have to stay on the back burner, but I'll leave some links that might be useful for moving forward:

I'll reopen this thread to increase discoverability.

@samjabrahams samjabrahams reopened this Apr 13, 2017

@arunmrao


commented Jul 20, 2017

I hate to see NVIDIA having a monopoly on TensorFlow GPU support, and I'd like to explore helping break that. Any links or resources are appreciated.

@bafu


commented Oct 17, 2017

@samjabrahams py-videocore is also an interesting RPi GPGPU library supporting Python.
