- Sensor Information
- Helpful Links
- LEGO EV3 MATLAB Documentation
- Connect to EV3
- Sensor Information
- Test Key Control
- Autonomous Control
- Goal of the Project
- Meet the Team
- December 1 Update
- December 3 Update
- December 4 Update
- Demonstration
- Sensor Port 1: N/A
- Sensor Port 2: N/A
- Sensor Port 3: Ultrasonic Sensor
- Sensor Port 4: Color/Light Sensor
- Drive motors connected to Ports A and B
- Crane motor connected to Port C
- https://www.mathworks.com/help/supportpkg/legomindstormsev3io/ref/readcolor.html (`readColor` color-sensor documentation)
- https://education.lego.com/en-us/lessons/mindstorms-ev3/object-detection#Planitem2 (how to start with automatic object detection)
- https://makecode.mindstorms.com/reference/sensors/ultrasonic/on-event (how to set up a collision alarm)
Note: to move the robot straight forward, Motor B needs about 3 less power than Motor A (the motors are not perfectly matched).
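In practice, that offset can be baked into a small helper. This is a sketch assuming the course's Brick utility class; the 3-point offset comes from our own testing and may differ on other robots.

```matlab
% Drive straight by compensating for the mismatched motors.
% Assumes a connected Brick object ('brick') from the course utility files.
function driveStraight(brick, power)
    brick.MoveMotor('A', power);
    brick.MoveMotor('B', power - 3);   % B runs ~3 lower to track straight
end
```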
You can find all current MATLAB documentation for the LEGO EV3 utility files implemented in this class at the following link:
You may also find the following files useful for programming your robot:
- Installing MATLAB and the EV3 utility files on your personal computer (Personal Machine Software, EV3 Toolbox)
Make sure the robot's battery is installed and connected and that the brick is turned on.
Also make sure Bluetooth is enabled on both the robot and the device you are connecting from.
- On Windows or Mac, navigate to your Bluetooth settings and set up a new device
- Locate the name of your EV3 power brick (found at the top of the brick's screen)
- Connect
- Once connected, you will be prompted for a code; refer to your robot's screen
- Input the code
- Open the "ConnectToEV3.m" file

Run ConnectToEV3; it should connect MATLAB to the robot, play a sound, and display the battery power.
Congratulations! Your Mindstorms EV3 robot should now be set up.
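A minimal sketch of what a ConnectToEV3-style script can look like, assuming the course's EV3 utility files (the `ConnectBrick` helper and method names come from those files and may differ in your version):

```matlab
% Hypothetical connection script; 'EV3NAME' is the name shown at the
% top of your brick's screen.
brick = ConnectBrick('EV3NAME');   % pair with the brick over Bluetooth
brick.playTone(50, 440, 500);      % play a tone to confirm the connection
```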
The first sensor-test file is touchSwitch, which was used to test the touch sensor. It:
- includes set-up syntax
- different modes
- output values
- uses and practical applications
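A hypothetical touch-sensor read, assuming the course Brick class (the `TouchPressed` method name comes from the course utility files and may differ):

```matlab
% Poll the touch sensor on Port 1 and stop the drive motors on contact.
if brick.TouchPressed(1)
    brick.StopMotor('A');
    brick.StopMotor('B');
end
```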
colorSensor
- includes how to set up the color sensor
- four different modes included:
  - ambientLight
  - colorCode
  - colorRGB
  - lightReflect
- returns values to illustrate practical applications
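As a sketch of the colorCode mode (assuming the course Brick class; the color-to-number mapping below is the standard EV3 one and should be verified against your sensor):

```matlab
% Read the color under the sensor on Port 4.
% EV3 color codes: 0=none, 1=black, 2=blue, 3=green, 4=yellow,
%                  5=red, 6=white, 7=brown
color = brick.ColorCode(4);
if color == 5             % red tile: stop for the traffic rule
    brick.StopMotor('A');
    brick.StopMotor('B');
end
```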
Ultrasonic
- The main sensor used for autonomous driving
- Shows the setup and use of `brick.UltrasonicDist(port)`
- includes code used in the autonomous program
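For example (assuming the course Brick class, with the ultrasonic sensor on Port 3 as listed above; verify the distance unit against your utility files):

```matlab
% Read the distance to the nearest obstacle from the sensor on Port 3.
distance = brick.UltrasonicDist(3);
if distance < 15
    disp('Wall ahead: turn');
end
```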
This file tests the key controls, using the keyboard to control the robot.
- Keys move the robot in all directions
- Keys 'w' and 's' raise and lower the crane, respectively
- This was our first method of controlling the robot
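A sketch of what such a key-control loop can look like, assuming the course's `InitKeyboard`/`CloseKeyboard` helpers and Brick class (the specific key bindings and powers here are illustrative):

```matlab
global key;
InitKeyboard();                         % opens a figure that captures keystrokes
while 1
    pause(0.1);
    switch key
        case 'uparrow'                  % drive forward
            brick.MoveMotor('A', 50);
            brick.MoveMotor('B', 47);   % B slightly lower to drive straight
        case 'w'                        % raise the crane
            brick.MoveMotor('C', 20);
        case 's'                        % lower the crane
            brick.MoveMotor('C', -20);
        case 0                          % no key held: stop everything
            brick.StopMotor('A');
            brick.StopMotor('B');
            brick.StopMotor('C');
        case 'q'                        % quit
            break;
    end
end
CloseKeyboard();
```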
Learning autonomous control was one of the most challenging parts of this Mindstorms EV3 project.
Working on the while loops was a struggle: we knew what we wanted the robot to do, but we were telling it to do something else. A lot of extra work had to be put into the robot to allow it to run on its own.
- Make sure that you run ConnectToEV3 first
- Once done, locate the Main.m file within the automation folder
- Press F5 on the program, or click Run in the Editor tab
- In the pop-up window, press the up arrow on your keyboard to start
Challenge with the While Loop
- This is the basis for our while loop and understanding of it:
```matlab
global key;
InitKeyboard();
startMoving = 0;

while 1
    pause(0.1);
    distance = brick.UltrasonicDist(3);
    switch key
        case 'uparrow'   % on the up arrow, the auto-driving begins
            while startMoving == 0
                distance = brick.UltrasonicDist(3);
                if distance > 15
                    % path is clear: drive forward
                    brick.MoveMotor('A', 50);
                    brick.MoveMotor('B', 47);   % B runs 3 lower to track straight
                    numRightTurns = 0;
                    numLeftTurns = 0;
                elseif distance < 15
                    % wall ahead: pivot left, then re-check the distance
                    brick.MoveMotor('A', 27);
                    brick.MoveMotor('B', -27);
                    pause(0.875);
                    brick.StopMotor('A');
                    brick.StopMotor('B');
                    numLeftTurns = 1;
                    distance = brick.UltrasonicDist(3);
                    disp(distance);
                end
            end
    end
end
```
- Our first large problem was with the startMoving variable at the beginning. Without it, the while loop would not run the way we wanted it to and would stop before the program moved through the loop
- The next was the order in which to run through the while loop, much like exception handling in Java, where the order in which you place certain blocks matters. We needed to make sure the robot moved forward before it checked for new locations
- The next thing we needed to work toward was getting the robot to move in a straight line, since the motors and wheels are not centered correctly
So, what are we trying to do?
- Our first goal in this project was to learn how to work as a team and to delegate our mission within the group
- After this, we had people design the robot, work through a Gantt chart, and learn the MATLAB documentation
- Our first milestone was getting the robot to work under remote control. This was not too difficult and was good for the team, as we made small design changes while learning how the sensors worked
- From here, the group made many small design changes that ultimately led to the final design of the robot, which would be used for autonomous control
Our Goal
- Successfully retrieve a handicapped figure through a maze and return to the start while avoiding walls and obeying traffic laws
How Did We Achieve This Goal?
- By working as a team and delegating our work together to ensure that the project was completed successfully and on time
Eric Wu
- Worked on design as well as making sure that the engineering design of the robot fit well with the programming of it
Jordan Post
- Worked on design with Eric making sure that the design of the robot was up to code and was able to work through the maze
Mark Ashinhust
- Worked mainly on the code of the robot (as well as this README), and made sure that everything worked well and on time as needed
Sammy Arenson
- Documentation lead; also reviewed code to ensure we were progressing, and held the team accountable and together
The main trouble with the autonomous driving was getting the robot to drive forward without any assistance. Though this was not completely achieved, manual intervention is rarely necessary.
- Robot makes it through maze without human touch
- New implementation: the robot stops when the color on the ground is red, so points for stopping should be earned (Final Rubric)
- Robot makes the necessary turns: right, left, 180 degrees, and backwards
- Implement stopping on green and attempting to pick up the person in the wheelchair
- Maintain turning and straight-driving capabilities
- A way to detect when the robot has been moving against a wall
- What to do when the color on the ground is blue? (implementation TBD)
- December 5th Final Demonstration
Functionality for the robot works when moving straight and turning. However, we hit trouble when turning toward a wall, since there were no sensors to account for those situations. Therefore, I mounted touch sensors on the front right and front left of the robot to detect when the robot hits a wall; it then backs up and turns slightly in the opposite direction of the wall.
- Touch Sensors to know when the robot encounters a wall
- Robot moves straight
- Sensors work
- touch1 is connected to PORT 1 and touch2 is connected to PORT 2
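The back-up-and-turn reaction can be sketched as follows (assuming the course Brick class; `backUpAndTurn` is a hypothetical helper, and the powers and timings are illustrative):

```matlab
% Check the two front touch sensors (touch1 on Port 1, touch2 on Port 2)
% and steer away from whichever wall was hit.
if brick.TouchPressed(1)          % front-right bumper hit
    backUpAndTurn(brick, 'left');
elseif brick.TouchPressed(2)      % front-left bumper hit
    backUpAndTurn(brick, 'right');
end

function backUpAndTurn(brick, direction)
% Back up briefly, then pivot slightly away from the wall.
    brick.MoveMotor('A', -40);
    brick.MoveMotor('B', -40);
    pause(0.5);
    if strcmp(direction, 'left')
        brick.MoveMotor('A', -27);
        brick.MoveMotor('B', 27);
    else
        brick.MoveMotor('A', 27);
        brick.MoveMotor('B', -27);
    end
    pause(0.4);
    brick.StopMotor('A');
    brick.StopMotor('B');
end
```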
- What to do when the robot reaches a blue tile
- What to do when the robot reaches a green tile? Move to remote control?
- How do we implement the robot to get through the maze when there are certain obstacles?
- Can our robot pick up the wheelchair?
The robot works as well as it can with the information provided. It can get through most of the maze without instruction but needs some help. It may achieve a good score; however, it will not be fully autonomous.
The main rules of the course made the final run a little difficult; however, we were able to get the robot through the course successfully, including a pick-up and a drop-off point. Our robot did not run in a fully autonomous style, but it did complete at least 1/3 of the course autonomously. Videos and pictures will be added soon.
- Scored a 97/100
- 1st place in the robot battles