Sponsors: National Aeronautics and Space Administration (NASA); NASA Goddard Institute for Space Studies (GISS); NASA New York City Research Initiative (NYCRI)

Sponsors: National Aeronautics and Space Administration (NASA); NASA Goddard Institute for Space Studies (GISS); NASA New York City Research Initiative (NYCRI); NASA Goddard Space Flight Center (GSFC)

Contributors: Dr. Haim Baruh, Ph.D., P.I.; Alexey Titovich, Graduate Student; Kelvin Quarcoo, High School Teacher; David Kelly, High School Student; Rutgers University (RU)

System
The convoy is made up of three main parts: the lead car, the first follower, and the second follower. The lead car is a normal RC car with relatively few changes; it is the car that the autonomous cars will follow. A red ping pong ball with an LED light inside is attached to the back of the lead car. The two follower vehicles, the autonomous vehicles, track and follow this ping pong ball: as the ball moves within their range of sight, they make corrections and follow it.

Goals
Develop a working convoy of autonomous vehicles using light detection technology. Look into other forms of autonomous tracking, including GPS and accelerometers.

Troubleshooting
The first several weeks of research were spent troubleshooting the CMUcam2, the Arduino microprocessor, and the Motorshield, and getting them to work independently. Once we accomplished that goal, we began integrating two parts at a time and got them working with each other. Once that was done, we fully integrated all three major parts. The first major step was reading the manuals for the separate parts and previous years' work on similar projects. The next step involved working with our Arduinos (previously Arduino BTs, i.e. Bluetooth, but we have since upgraded to an Arduino Duemilanove because it is twice as powerful) and learning how to upload basic programs. Afterwards, we began working with our motor controllers and integrating them with the Arduinos.
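The motor-control integration described above ultimately drives the Motorshield with a scaled, capped speed command; the loop() code shown later on the poster uses a gain of 12 and a cap of 200. A minimal sketch of that mapping as a standalone function (the function name is mine; the gain and cap are taken from that code):

```cpp
#include <algorithm>
#include <cassert>

// Map the distance error dx (metres past the equilibrium following
// distance) to a Motorshield PWM speed command, mirroring the
// output = (int)(12 * dx) line in the posted loop() code, clamped to
// the project's cap of 200 (Motorshield speeds run 0-255).
int speedCommand(float dx, float gain = 12.0f, int cap = 200) {
    int output = static_cast<int>(gain * dx);
    return std::max(0, std::min(output, cap));
}
```

On the real vehicle this value would be passed to the Motorshield's setSpeed() before calling run(FORWARD); keeping it as a pure function makes the scaling easy to test off the car.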
Once that was successfully completed, we turned to our CMUcam2, which proved to be one of the hardest challenges. We began by focusing the camera through its graphical user interface and sending it commands through a terminal emulator called HyperTerminal; both of these tasks were done on a computer. Finally, we began sending it commands from the Arduino instead of a computer. After much work, all of these tasks were accomplished, integration between all three components was successful, and programming could begin.

Future Goals
Continue refining autonomous tracking based on light detection technology. Further research other forms of autonomous tracking, including GPS and accelerometers. Leave a working blueprint for future research.

References
Chukrallah, Bashir, David Laslo, Michael Ma, Sean Murphy, and Stefan Novak. Autonomous Vehicle Control Systems. Rutgers University, 1 May. Web.
Gartzman, Steve, Marifae Tibay, Thien Win, Steve Agudelo, Christian Cuentas, and Adekola Adesina. A Convoy of Autonomous Vehicles. Rutgers University, 24 Apr. Web.
Henlich, Jonathan D. "Mobile Robot Navigation." Information Systems Engineering (1997). Print.
Rowe, Anthony, Charles Rosenberg, and Illah Nourbakhsh. CMUcam2 Vision Sensor: User Guide. Print.

Abstract
An autonomous vehicle is any vehicle that can drive itself from one location to another without human intervention. When put in a convoy, several autonomous vehicles can follow one lead vehicle and mimic its movements. This can have many uses in the private, commercial, and government sectors. By using a CMUcam2, an Arduino microprocessor, a Motorshield, and three remote-controlled (RC) cars, we were able to create a prototype autonomous convoy based on light detection technology. One of the three RC cars remained unchanged; this car would act as the lead vehicle, the vehicle that the autonomous vehicles would follow. The other two would be interfaced with a CMUcam2, an Arduino microprocessor, and a Motorshield.
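The tracking data the CMUcam2 hands back over serial arrives as an ASCII "T" packet; per the CMUcam2 user guide listed in the references, its default form is `T mx my x1 y1 x2 y2 pixels confidence` (blob centroid, bounding box, pixel count, confidence). A hedged sketch of parsing one such line (the struct and function names are mine, not from the project):

```cpp
#include <cstdio>
#include <cassert>

// Fields of a CMUcam2 color-tracking ("T") packet, per the user guide in
// the references: blob centroid (mx, my), bounding box (x1, y1, x2, y2),
// tracked pixel count, and a confidence value.
struct TPacket {
    int mx, my, x1, y1, x2, y2, pixels, confidence;
};

// Parse one ASCII T-packet line; returns true only if all 8 fields match.
bool parseTPacket(const char* line, TPacket& p) {
    return std::sscanf(line, "T %d %d %d %d %d %d %d %d",
                       &p.mx, &p.my, &p.x1, &p.y1, &p.x2, &p.y2,
                       &p.pixels, &p.confidence) == 8;
}
```

A parser like this is what the project's data()/convert() steps would amount to: the bounding box feeds the steering math and the pixel count feeds the distance estimate.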
These cars would become the two autonomous follower vehicles. The autonomous vehicles worked in several steps. First, the Arduino would send a command to the CMUcam2 telling it to track a certain color. Next, it would send another command asking for information about the color being tracked. This information would be sent back to the Arduino, where it would be processed and used to determine the turning angle and velocity. After the calculations, the Arduino would send commands to the servo motor controlling the angle of the front wheels and to the Motorshield controlling the rear wheels. The Motorshield would read the commands and vary the voltage going to the DC motor driving the rear wheels, thus controlling the speed of the vehicle.

Programming
Once all the components had been integrated, programming could begin. The program had to send a command to the camera telling it to track a certain color. Next, the program would take in data from the camera, store it, and convert it to a usable form. It would then process the data and send it out to the servo motor and the motor controller. Below is a section of the code.
void loop() {
  camera.print("TC \r");   // tell the CMUcam2 to track the configured color
  delay(time);
  test();
  ack();
  data();
  convert();
  if (x1 > 0) {
    ///// STEERING /////
    hi = (float) y2 - y1;
    wi = (float) x2 - x1;
    D = ho * hs / (2.0 * hi * tanthetah);   // distance from blob height
    dx = D - Deq;                           // error from equilibrium distance
    dxchange = dxold - dx;
    Vx = dxchange / (time / 1000);
    xi = ((((float) x2 - x1) / 2) + (float) x1) - (ws / 2) + xcalib;
    xo = (2 * D * tanthetaw * xi) / ws;
    dy = xo;
    dychange = dyold - dy;
    Vy = dychange / (time / 1000);
    alpha = atan((2 * tanthetaw * xi) / ws);  // steering angle (radians)
    alphadeg = (alpha / 3.1416) * 180;
    delta = (int) alphadeg;
    deltachange = 2 * (deltaold - delta);
    pos = posold + deltachange;
    posold = pos;
    deltaold = delta;
    dxold = dx;
    dyold = dy;
    ///// VELOCITY /////
    V = sqrt(pow(Vx, 2) + pow(Vy, 2));
    ///// FORWARD /////
    if (dx > 0.2) {
      steering.write(pos);
      delay(10);
      output = (int)(12 * dx);
      if (output >= 200) {
        motor.setSpeed(200);
        delay(10);
      } else if (output < 200) {
        motor.setSpeed(output);
        delay(10);
      }
      motor.run(FORWARD);
    }
    ///// BACKWARD /////
    else if (dx < -0.2) {
      posdiff = pos - 90;
      posback = 90 - posdiff;    // mirror the steering angle when reversing
      steering.write(posback);
      delay(10);
      output = (int)(-8 * dx);   // dx is negative here; negate so the speed is positive
      if (output >= 200) {
        motor.setSpeed(200);
        delay(10);
      } else if (output < 200) {
        motor.setSpeed(output);
        delay(10);
      }
      motor.run(BACKWARD);
    }
    ///// EQUILIBRIUM DISTANCE /////
    if (dx >= -0.2 && dx <= 0.2) {
      motor.run(RELEASE);
    }
    Serial.println(alphadeg);
    Serial.println(dx);
  }
}

The two follower vehicles are nearly identical; the only difference between them is that the first follower carries its own red ping pong ball with an LED, which the second follower tracks. There are three main parts to the follower vehicles: the CMUcam2, the Arduino microprocessor, and the Motorshield. The CMUcam2 is a camera attached to the front of each vehicle that tracks and follows the red ping pong ball based on its color. The camera sends information about the location of the ball and its pixel count to the Arduino. The location is used to calculate the turning angle, and the pixel count is used to calculate the distance between the cars.
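The turning-angle calculation is the alpha = atan((2*tanthetaw*xi)/ws) line in the loop() code above. Isolated as a pure function (parameter names mirror the code's variables; the numeric values in the tests are illustrative assumptions, not the project's calibration):

```cpp
#include <cmath>
#include <cassert>

// Steering angle from the blob's horizontal offset, mirroring the
// alpha = atan((2*tanthetaw*xi)/ws) line in the posted loop() code.
//   xi        - blob centre offset from the image centre, in pixels
//   ws        - image width in pixels
//   tanThetaW - tangent of half the camera's horizontal field of view
float steeringAngleDeg(float xi, float ws, float tanThetaW) {
    float alpha = std::atan((2.0f * tanThetaW * xi) / ws);
    return (alpha / 3.1416f) * 180.0f;   // same pi constant as the code
}
```

A centred blob yields zero angle, and offsets to either side give symmetric left/right corrections, which is what the servo position update in the loop relies on.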
The Arduino will process the information and send commands to the servo motor controlling the front wheels and to the Motorshield. The Motorshield takes the commands from the Arduino and varies the voltage to the DC motor controlling the rear wheels, thus controlling the speed.
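The pixel-count-to-distance step works through the blob's apparent height: the loop() code computes D = ho*hs/(2.0*hi*tanthetah), a pinhole-camera relation. A sketch as a pure function (names mirror the code's variables; the numeric values in the tests are illustrative assumptions):

```cpp
#include <cassert>

// Distance to the ball from its apparent size, mirroring the
// D = ho*hs/(2.0*hi*tanthetah) line in the posted loop() code.
//   hi        - blob height in pixels (y2 - y1 from the tracking packet)
//   ho        - real height of the ball (e.g. in metres)
//   hs        - image height in pixels
//   tanThetaH - tangent of half the camera's vertical field of view
float distanceToBall(float hi, float ho, float hs, float tanThetaH) {
    return (ho * hs) / (2.0f * hi * tanThetaH);
}
```

The distance error dx = D - Deq against the equilibrium following distance Deq is then what selects between the FORWARD, BACKWARD, and RELEASE branches in the loop.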

