
connecting to the ev3 #63

Closed
bravespacemonkey opened this issue May 8, 2014 · 18 comments

@bravespacemonkey

Hi,
I'm new to Linux (Ubuntu) and I just can't connect to my device (EV3), and I don't really understand the "nfs" and "usb" pages here.

I looked at https://github.com/mindboards/ev3dev/wiki/Setting-Up-Linux-USB-Ethernet-Networking
and I don't know what cdc_eem and cdc_subset are or how to get them.
Also, when I try to run ssh root@192.168.2.100 it doesn't connect (does it matter which directory I'm in?).

Thanks for the future help!

@dlech
Member

dlech commented May 8, 2014

Here is my advice:

  • Don't worry about NFS; there are other ways to transfer files to the EV3.
  • Don't worry about cdc_eem and cdc_subset - they are installed by default in Ubuntu.
  • Use g_cdc instead of g_ether (set in the ev3dev.rc.local file). Technically both work in Linux, but g_cdc works better.
  • Ask an experienced Ubuntu-using friend to help you, or ask via IRC in #ev3dev on freenode.net.
  • Try out the latest test release, which (hopefully) makes connecting a little bit easier (documentation for getting started with the test release is here).

@bravespacemonkey
Author

Thank you, I managed to follow the instructions and just got my brick to turn its green light on and off.
Thank you!

P.S. I'm not quite sure how to program the robot. Do I just write a program and run it on the brick through ssh?
Is there an example program for me to reference?

Thanks again for helping with the installation, your advice really helped!

@dlech
Member

dlech commented May 8, 2014

I'm not quite sure how to program the robot. Do I just write a program and run it on the brick through ssh?

We are still figuring that out ourselves, so you get to be a pioneer. There is some info on the wiki that tells how to use sensors and motors and things like that. Most of it applies to the latest test image that I linked above.

On Ubuntu, I found that you can open folders on the EV3 directly in the file manager by pressing Ctrl+L and typing sftp://root@192.168.0.100. This way, I have been able to edit files directly on my host machine and then run them on the EV3 (via ssh).
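If you want a first program to try, here is a minimal sketch you can copy to the brick and run in an ssh session with python blink.py. The LED name under /sys/class/leds/ is a guess and may differ on your image, so check ls /sys/class/leds/ on the brick first:

```python
#!/usr/bin/env python
# Minimal first program for the EV3: blink one of the green LEDs by
# writing to its sysfs brightness file.
# NOTE: the LED name below is a guess -- run `ls /sys/class/leds/` on the
# brick and substitute whatever actually appears there.
import time

LED = "/sys/class/leds/ev3:green:left/brightness"  # placeholder path

for _ in range(5):
    with open(LED, "w") as f:
        f.write("255")   # full brightness
    time.sleep(0.5)
    with open(LED, "w") as f:
        f.write("0")     # off
    time.sleep(0.5)
```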

@dlech dlech closed this as completed May 8, 2014
@cavenel

cavenel commented May 9, 2014

Hi,

To complete what dlech said about programming the EV3, you can find some Python examples in my git repository:
https://github.com/cavenel/ev3dev_examples
In particular, have a look at the pyev3/ev3.py file here:
https://github.com/cavenel/ev3dev_examples/blob/master/python/pyev3/ev3.py
It provides all the basic functions for driving motors and reading sensors from simple Python calls.
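Purely as an illustration of the kind of program the wrapper is meant to enable (the class and method names below are made up for the sketch; the real API is whatever is defined in pyev3/ev3.py):

```python
# Hypothetical usage sketch -- NOT the actual pyev3 API.
# Check pyev3/ev3.py for the real class and method names.
from pyev3.ev3 import Motor          # hypothetical import

left = Motor(port="B")               # hypothetical: motor on output port B
left.run(speed=50)                   # hypothetical: start turning
# ... robot does something useful ...
left.stop()                          # hypothetical: stop the motor
```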

You can find a demo of the robot running the Rubik's cube example here:
https://www.youtube.com/watch?v=HuKsfp19yF0

The model is based on MindCub3r, by David Gilday. See here for more information: http://www.mindcuber.com/mindcub3r/mindcub3r.html
The brick is running the latest version of ev3dev, and the code is the one in my repository.

Hope that can help!

@rhempel
Member

rhempel commented May 10, 2014

Hey, I am no Python expert, so I appreciate your work on this. I am so impressed with this I can't tell you! The solving algorithm is running entirely on the brick, right? And the little text to speech thing at the end where the solver says how long it took is a nice touch.
Do you feel that doing this on the ev3dev distribution was a good experience? Did you feel that not having an IDE slowed you down? How neat is it that you can do this from a tablet?
Great job!

@cavenel

cavenel commented May 10, 2014

Hi,

Thank you for your nice message!
I have two different solving algorithms: one runs on the brick and gives a solution in around 60 steps (see http://cubex.sourceforge.net/). The other gives a solution in around 20 steps, but takes around 500 MB of memory, so I need to run it on a computer (see http://www.cube20.org/src/).
The little text-to-speech bit uses espeak, so all credit goes to G33kDude ;-)
I loved working on the ev3dev distribution, as I am a Linux user. I use my own IDEs and really appreciate the freedom that ev3dev gives in the choice of tools!
I haven't used a tablet yet, but I think I will look into it soon, with Bluetooth or Wi-Fi connections...

And of course, thank you again for all the very nice work on ev3dev!

@walshbp

walshbp commented May 13, 2014

Hi Cavenel,

First let me say your examples are awesome. Thank you for all the hard work you put into them. I like what you have done with your EV3 Python wrapper. It looks very reusable, and I am looking forward to playing with it in the future.

My daughter and I had fun building the MindCub3r this weekend. We are running into some problems correctly scanning the colors of the cube. My cube has different colors than yours, so I updated the colors in your get_color function, using averaged values that I calculated by solving a side and having the robot scan it. The biggest problem I'm having is the color sensor telling red from orange. I noticed on David's website that he has custom firmware because red and orange "cannot be distinguished by the standard color mode provided by the standard LEGO MINDSTORMS EV3 software." I'm wondering if this is something that should be ported back to ev3dev.

Thoughts?

Bryan

@dlech
Member

dlech commented May 13, 2014

I noticed on David's website that he has custom firmware because red and orange "cannot be distinguished by the standard color mode provided by the standard LEGO MINDSTORMS EV3 software."

When working on the driver for the EV3 color sensor, I found that the checksums in the "raw" mode were incorrect, and I had to add a workaround to the driver to accommodate that; otherwise we would not have a raw mode at all. So, what I took from Gilday's comment is that he had to make a custom firmware with a similar fix in order to use the raw mode of the color sensor. He also created an EV3-G block for raw color mode, since one did not already exist.

I see that @cavenel is already using the raw mode, so I don't think there is anything else we can do on the driver level.

@cavenel

cavenel commented May 13, 2014

Ha ha! I didn't think anyone would try my code for MindCub3r! That's awesome, thanks!
I totally understand the problem with the scan; it was not very robust on my Rubik's cube, but I had a very special one...
You should have a look at my conversation with David Gilday on YouTube: https://www.youtube.com/watch?v=HuKsfp19yF0
By the way, good job @dlech on finding the workaround by yourself, the other David thought it was his code ;-)

I will try to implement a new scan soon that takes David's comments on YouTube into account. I'll let you know if I have something nice. The main idea is to convert RGB to HSL, then use saturation and lightness to determine white (high L, low S), sort the other colors by hue, and divide the sorted colors into 5 groups for the remaining 5 colors.
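Roughly, the grouping step could look like this (just a sketch, not the code in my repository; the 0-255 scaling and thresholds are assumptions, and red sitting near the hue wrap-around still needs extra care):

```python
# Sketch of the HSL-based grouping described above. Assumes 54 stickers
# with RGB values already scaled to 0..255; real scan values may need
# different scaling, and hue wrap-around (red) needs extra handling.
import colorsys

def classify(stickers):
    """stickers: list of 54 (r, g, b) tuples -> list of group labels 0..5"""
    hls = [colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
           for r, g, b in stickers]
    labels = [None] * len(stickers)

    # White: high lightness, low saturation (take the 9 best candidates).
    by_whiteness = sorted(range(len(hls)),
                          key=lambda i: hls[i][1] - hls[i][2],  # L - S
                          reverse=True)
    for i in by_whiteness[:9]:
        labels[i] = 0

    # Remaining 45 stickers: sort by hue and cut into 5 groups of 9.
    rest = sorted((i for i in range(len(hls)) if labels[i] is None),
                  key=lambda i: hls[i][0])  # hue
    for rank, i in enumerate(rest):
        labels[i] = 1 + rank // 9
    return labels
```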

Christophe

@rhempel
Member

rhempel commented May 13, 2014

This. This is exactly why open source and collaboration work. I have learned so much from everyone by opening up this project. Thanks to @dlech, I have learned about guestfish and other tools for dealing with disk images. He has also straightened me out on more than one occasion on Linux-y driver issues. @fdetro and @dlech worked on the audio capabilities of ev3dev to make it much better than the stock LEGO firmware. And now we're getting lessons on colour detection.
Thanks to all of you for sharing your knowledge and letting everyone else benefit!

@ameysutavani

@cavenel first off, amazing work there....
This project is a very good combination of various programming scenarios for ev3dev and Python, so it would be awesome if you could make a step-by-step tutorial (for the programming part only) so that novice users like me can learn more.

Thanks in advance. :D

@ameysutavani

I was also thinking we could create example models (like a basic line follower, etc.) so that people can learn how to use ev3dev's capabilities by replicating them.

@jsvalderrama

I agree! It would be very useful to have a tutorial for making the robot do a === (I don't remember the name) loop. I find the cube code quite complicated for a beginner, so I know that understanding how to chain the commands would be a good start. It would be even better if later on we could share our own experiences and explain the functions we add, so that everyone would profit from each other's experience.


@dlech
Member

dlech commented May 13, 2014

I don't remember the name

PID?
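For anyone wondering what that would look like, a proportional-only line follower (the "P" in PID) is only a few lines. Every sysfs path and attribute name below is a placeholder; the real names depend on your ev3dev version (and, as noted further down, they are still changing):

```python
# Sketch of a proportional-only line follower ("the P in PID").
# All sysfs paths and attribute names below are placeholders -- check
# your ev3dev image for the real color-sensor and tacho-motor entries.
import time

COLOR = "/sys/class/msensor/sensor0/value0"     # placeholder: reflected light
LEFT  = "/sys/class/tacho-motor/tacho-motor0"   # placeholder: left motor
RIGHT = "/sys/class/tacho-motor/tacho-motor1"   # placeholder: right motor

TARGET = 50   # reflected-light reading on the edge of the line
KP = 0.7      # proportional gain, tune experimentally
BASE = 30     # base motor power

def read(path):
    with open(path) as f:
        return int(f.read())

def set_power(motor, power):
    with open(motor + "/duty_cycle_sp", "w") as f:   # placeholder attribute
        f.write(str(int(power)))

while True:
    error = read(COLOR) - TARGET      # how far off the line edge we are
    turn = KP * error                 # steer proportionally to the error
    set_power(LEFT, BASE + turn)
    set_power(RIGHT, BASE - turn)
    time.sleep(0.01)
```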

@cavenel

cavenel commented May 13, 2014

Well, for sure the MindCub3r implementation was not written for beginners. Actually, when I wrote it, it was more for me than for anyone else, to see what was possible with ev3dev! But if it can be used as an example of what is doable, why not :-)

But I could do a simple tutorial on how to use my Python wrapper... A simple example on the Ev3rstorm would be better than MindCub3r for that.
Maybe I'll put that on my GitHub someday.

@jsvalderrama

Could you please help by writing the basic movement instructions, with comments: the robot's start-up, moving straight ahead, turning left, and so on.

I think it would help to start building documentation, such as a tutorial, where everyone could add further instructions. That way we could grow the set of possible actions for the robot and let everyone move the robot as they want. What do you think about it?


@rhempel
Member

rhempel commented May 14, 2014

Please bear in mind that some of the motor attribute files will change. For example, there is no reason to have brake, coast, and hold in separate attributes. There may be a difference in the way we handle transitioning to relative mode from absolute mode, and the concept of speed and power setpoints may change.
If you're doing instructions and tutorials (always a good idea), please consider submitting them to the ev3dev wiki and keeping them in sync with what's on your own wiki. The more material there is on the ev3dev wiki, the more critical mass we have to attract new users.

@cavenel

cavenel commented May 21, 2014

Thank you for the reminder @rhempel. Indeed, I will wait for a more final version before taking the time to write tutorials.

@walshbp: I updated my code so that the scan works for every cube and most lighting conditions. I took some of David G.'s advice and programmed a small clustering algorithm to make it a bit more robust. Let me know if it helps :) https://github.com/cavenel/ev3dev_examples/blob/master/python/pyev3/rubiks.py
