[RFC] Android Clients #136
Development of an Android FL client to further enhance the simulation of FL on mobile devices.

Approach

Comments
Very impressive research! I did a bit of investigation myself into developing Android apps using Kotlin, and I failed to find any other alternatives, which suggests that what you proposed may very likely be the only way forward. One example I investigated is PyTorch Mobile: while it supports inference on both Android and iOS, it does not support training (for now). In terms of developing the apps, however, it may be much easier and more time-efficient to follow PyTorch Live's approach, which uses only JavaScript and React Native. It is cross-platform, so no separate iOS apps need to be developed in Swift, and JavaScript may also be easier to develop and maintain. I will investigate further to see whether it is feasible to support training while using PyTorch Live.
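For context on the inference-only limitation: on Android, PyTorch Mobile exposes a Java/Kotlin API that loads a TorchScript model and runs the forward pass, with no backward pass or optimizer surface. A minimal sketch of that path, assuming a TorchScript model file and a flat float input (the file path and tensor shape are placeholders):

```kotlin
import org.pytorch.IValue
import org.pytorch.Module
import org.pytorch.Tensor

// Minimal sketch of PyTorch Mobile's inference-only path on Android.
// The model path would point to a TorchScript model exported from Python;
// the tensor shape here is a placeholder.
fun runInference(modelPath: String, input: FloatArray): FloatArray {
    val module = Module.load(modelPath)                               // load a TorchScript model
    val inputTensor = Tensor.fromBlob(input, longArrayOf(1, input.size.toLong()))
    val output = module.forward(IValue.from(inputTensor)).toTensor()  // forward pass only
    return output.dataAsFloatArray                                    // no backward pass or optimizer is exposed
}
```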
As I understand it, PyTorch Mobile on iOS uses torch's C++ front-end API, so we can do native programming in the part of the app that involves torch. Android only has a limited Java interface that does not expose the backward pass, so the solution there is still the C++ API. PyTorch Live on Android and iOS both depend on torch lite, on which I cannot find details, and its React Native interface is very crude right now. To do any part of the training we would still have to do native programming, if that is even possible on torch lite. It would also mean losing numpy, and we would have to maintain two separate client code bases. This is why I lean heavily towards porting the current Python client to Android as a starting point. I also doubt the feasibility of running multiple iOS clients in a test, as the emulation does not scale well.
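To make the "native programming" part concrete: training on Android would have to be bridged from Kotlin into C++ code built against the libtorch front end, roughly along these lines. This is only a sketch; the library name `plato_trainer` and the `trainOneRound` signature are hypothetical, not an existing API.

```kotlin
// Hypothetical JNI bridge: the Kotlin side only marshals data, while the
// forward/backward passes and the optimizer step would be implemented in C++
// against the libtorch front-end API.
object NativeTrainer {
    init {
        // Hypothetical shared library built with the PyTorch C++ front end.
        System.loadLibrary("plato_trainer")
    }

    // Runs one local training round natively and returns the updated model weights.
    external fun trainOneRound(
        modelBytes: ByteArray,
        samples: FloatArray,
        labels: IntArray,
        epochs: Int,
        learningRate: Float
    ): ByteArray
}
```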
Makes a lot of sense. Would you mind helping to check what FedML uses for its Android and iOS clients?
FedML uses deeplearning4j for Android; no iOS source code is available.
Thank you. I am guessing that this is completely different from PyTorch and cannot build upon what PyTorch has to offer? And if that is correct, can I infer that Chaquopy is a better alternative, assuming, of course, that we are only interested in Android?
I don't know how different they are, but interop will most likely be very hard to achieve. The "best" way, I assume, is to use the PyTorch C++ API in the Android/iOS app, as it is the most efficient and lightweight solution, but it will also take a long time to develop. The solution with Chaquopy is not perfect: it is not open source, and it bundles a Python runtime and runs the Python scripts on it. But it is the solution requiring the least development effort. So it really depends on what the goal and timeline are.
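For a feel of the Chaquopy path: the app bundles a Python interpreter and calls into the existing Python code directly. A minimal Kotlin sketch, assuming the current client were packaged as a module named `plato_client` with a `start_client` entry point (both names are hypothetical placeholders):

```kotlin
import android.content.Context
import com.chaquo.python.Python
import com.chaquo.python.android.AndroidPlatform

// Sketch of driving the existing Python client through Chaquopy's embedded runtime.
// "plato_client" and "start_client" are hypothetical placeholders.
fun startFederatedClient(context: Context, serverAddress: String) {
    if (!Python.isStarted()) {
        Python.start(AndroidPlatform(context))    // boot the bundled Python interpreter
    }
    val py = Python.getInstance()
    val client = py.getModule("plato_client")     // the (ported) Python client script
    client.callAttr("start_client", serverAddress)
}
```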
While I am still a bit undecided, I have thought about it seriously and I am currently leaning towards using the PyTorch C++ front-end API. Not only will it be more efficient, it also supports both iOS (including Metal GPU support) and Android. Being cross-platform is important, since it reduces the development time needed when we move from Android to iOS. Since this development is planned with the goal of future real-world deployment tests using Plato, the fact that multiple iOS simulators cannot be launched as easily as Android emulators may not be too much of a concern. The Chaquopy solution is limited to Android only (a major limitation, since all iOS development would have to start from scratch with the PyTorch C++ front-end API anyway), and, as you said, it is not open source, which is another drawback. The benefit is that development can be done more quickly, but perhaps it would be best to get it right first, rather than getting something working but not exactly right. Existing client-side code cannot be reused if we use the PyTorch C++ API, but the server-side code can still be used. What do you think?
I would view the Chaquopy solution as a way to get a look and feel of how Plato would work in a multi-host, mobile environment. If you want a preview, it's a good way to get one quickly; it could be one to two weeks of work, most likely involving only minor modifications to the current, tried-and-tested client plus some environment settings. There can be unforeseen roadblocks, though, as it is not open source and we have no way of knowing about any issues. Using the PyTorch C++ front end would be a complete solution, but it would take significantly more time, perhaps 12 to 24 months or more, to develop two apps to a stable state. It would involve adopting socket.io, replacing numpy and the other missing libraries, developing native FL client libraries for the apps, reimplementing every learning method natively, iterating over the code structure and design, and repeated testing.
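As a rough illustration of what "adopting socket.io" would involve on the native side, the client would talk to the existing server through the Socket.IO Java client. A Kotlin sketch follows; the event names are placeholders and do not reflect Plato's actual protocol.

```kotlin
import io.socket.client.IO
import io.socket.client.Socket

// Sketch of the transport a native client would need: the Socket.IO Java client
// talking to the existing server. Event names are placeholders, not Plato's protocol.
fun connectToServer(serverUrl: String): Socket {
    val socket = IO.socket(serverUrl)
    socket.on(Socket.EVENT_CONNECT) {
        socket.emit("register", "client-1")   // announce this client to the server
    }
    socket.on("payload") { args ->
        // args[0] would carry the serialized global model pushed by the server
    }
    socket.connect()
    return socket
}
```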
All very true and very convincing. Well, it seems that we'll have to go with the Chaquopy solution first, get it done, and then try the PyTorch C++ front end.
Closed due to low priority. |