Autonomous Mobile Robotics

Chapter 1
Introduction to Project
1.1 Project Vision
Robotics is a key part of technological advancement in the human lifestyle. The defining property of a robot is that it always follows a sense-think-act cycle. Automation in robot control and processing is the key to progress in the field, and semi or fully artificial intelligence is the pioneering area of robot development and manufacture. This project, "Autonomous mobile robotics", is an attempt to gain insight into this field. The essential hardware and software requirements of such a robot form the central study of this project.
1.2 Project Details
The main idea of the project is to simulate a robot and put the sense-think-act definition into action. Many areas need automatic surveillance, so the goal is a robot that requires minimal human interaction and works on real-time, self-correcting control. The robot follows a coordinated path, avoiding obstacles detected by an arrangement of ultrasonic sensors and staying on course using GPS. A rotating ultrasonic sensor provides object detection in the surrounding environment. A Raspberry Pi acts as the central control unit, handling the on-board remote communication link, movement control and path correction very effectively. Remote connectivity is provided over the internet, and the system delivers real-time environment information and camera visuals to a remote monitoring system. This is helpful for many applications such as surveillance, navigation and environment scouting.
Chapter 2
Robotics and Recent Trends
2.1 Robotics
The world is evolving to be more advanced day by day. Technological advancement is an essential part of the human lifestyle: it enables us to do critical and hazardous jobs and processes in a more reliable, efficient and fast manner. Robotics is one of the foundation pillars of the future. Designs and sketches of robots existed centuries ago, but those machines were purely mechanical in nature. With the advancement of semiconductor devices, the age of electronically controlled robots arrived, and robot design entered day-to-day activity around the mid-1960s. From car manufacturing to the self-driving car, there is now a wide spectrum of robots, each carrying out its assigned task very efficiently.
2.2 First Industrial Robot
The robot named "Unimate", installed in 1961, was the first industrial robot, working at a General Motors car manufacturing plant. Its job was to handle heated die-castings and weld car bodies. Its control commands were stored on a magnetic drum. Unimate had six programmable axes of motion and a weight capacity of 500 lbs.
Fig. 2(a): UNIMATE
Fig. 2(b): Car Robot Assembly Line
By the end of the 1970s, General Motors was able to achieve a production rate of 110 cars per hour, double that of its rivals, thanks to its robot-automated plant. That was revolutionary in the automobile industry and a big step towards robotics in the manufacturing sector.
2.3 Robot at Expedition
Meet Stanford University's OceanOne, a humanoid robot with haptic feedback that works underwater on deep-ocean expeditions. La Lune, the flagship of King Louis XIV, sank 20 miles off the southern coast of France in 1664, and for centuries no human had explored its ruins or the treasures and artifacts the ship once carried. In April 2016 the seemingly impossible was done: OceanOne began its mission to the wreck and artifacts of La Lune. Thanks to the haptic feedback of the robot's hands, its pilot could sense the contours and weight of a vase, while the robot's body and arms adjusted themselves to keep steady as it worked. The artifacts were brought to the surface after more than 300 years. It was a great moment for archaeologists, engineers and scientists, and it is hoped that underwater robots will one day take on highly skilled underwater tasks too dangerous for human scuba divers.
Fig. 2(c): OceanOne Underwater Expedition
2.4 Automatic Driving Car
In the middle of the last decade, the Defense Advanced Research Projects Agency (DARPA) announced its grand challenge for self-driving vehicles. Major corporations such as General Motors, Google, Mercedes-Benz, Toyota, Audi, Volvo, Tesla Motors, Bosch, Continental Automotive Systems, IAV, Autoliv Inc., Nissan, Renault, Hyundai Motor Company, Peugeot, Local Motors and AKKA Technologies, along with major universities such as Stanford University, the University of Parma (VisLab) and Oxford University, are developing automatic driving cars. Google has long been a pioneer of the self-driving car: it started its self-driving car project in 2009 and has completed 2,400,000 km of on-road testing. Its first purpose-built prototype vehicle was launched in December 2014.
Fig. 2(d): First Test Model, 2009
Fig. 2(e): Real Built Prototype, 2014
2.5 Personal Autonomous Robot
Unmanned aerial vehicles (UAVs) are commercially available nowadays. Take DJI, for example, a Chinese technology company founded in 2006 that manufactures UAVs for aerial visuals and flight platforms. Its product series called Phantom is among the most advanced drones in the consumer market. It is fitted with various sensors, a camera and a long-lasting battery, and it can avoid indoor as well as outdoor obstacles very effectively. The on-board navigation system can control the drone's position, speed and path through various control platforms. Combined with the Mission Hub from flylitchi, the desired path of the drone at the desired height can be planned with just a few settings, and objects can even be tracked using the drone's on-board camera and image processing. None of this needs any coding; everything is in the form of a graphical user interface (GUI).
Fig. 2(f): Phantom Drone
Fig. 2(g): Litchi Autonomous Flight Planner GUI
2.6 Personal Robotics Development Platform
ArduPilot is an open-source autopilot platform able to control multirotor drones, cars, boats or any other robotic platform. It does the job of a ground control unit: the robot's sensors and motors can be controlled through it. With appropriate code and hardware connections, a robotic platform can become fully or semi-automatic. For an autonomous waypoint-navigation robot, a module called Pixhawk is used for control. Mission Planner is a GUI application that can plan the movement of the robotic platform using commands generated in the form of path waypoints.
Fig. 2(h): Pixhawk Hardware Module
Fig. 2(i): Mission Planner GUI
Chapter 3
Components Assembly
3.1 Sensors and Processing
Various kinds of sensors and components can be used in building an autonomous robot. An array of sensors taking measurements in a synchronized manner is essential. In a real automatic driving car, a combination of various sensors is used for control decisions and error correction.
Take the figure below as an example: the fundamental sensor is the laser scanner, which gives the car awareness of the physical state of the surrounding environment. GPS is used for map coordination on the road, and the camera is used for image processing and object detection. From this complex web of sensor connections, the car's system builds a complete real-time 2D and 3D analysis of its surroundings.
Fig. 3(a): Sensor Placement on an Automatic Driving Car
3.2 Sensors Management
The main sensor in a self-driving car is a surround-scanning laser light detection and ranging (LIDAR) system. For a robot at the development stage this is expensive and complex, so the alternative idea in this project is to make a surround-scanning device using an ultrasonic sensor. Self-driving cars also use an array of cameras for image processing and detection, and a camera can also serve as a navigation device for indoor robot control. The GPS sensor is useful for navigating the robot on roads.
3.3 Simple Process Flow
Fig. 3(b): Process Flow in a Simple Robot
The figure above shows the basic input and output data flow of the robot, sketched in code below. The components required to build the robot are listed and explained in detail afterwards.
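This loop can be captured in a few lines of code. Below is a minimal sketch in Python; the sensor, decision and motor functions are hypothetical stand-ins for the components described in the rest of this chapter, not the project's actual code.

import time

def read_sensors():
    # Hypothetical stand-in: the real robot would query the HC-SR04,
    # the GPS module and the camera described below.
    return {"range_cm": 100.0}

def decide(readings):
    # Hypothetical rule: stop when an obstacle is closer than 20 cm.
    return "stop" if readings["range_cm"] < 20.0 else "forward"

def actuate(command):
    # Hypothetical stand-in: the real robot would drive the motors
    # through the L293D driver described below.
    print(command)

while True:
    actuate(decide(read_sensors()))   # sense -> think -> act
    time.sleep(0.05)                  # roughly 20 loop iterations per second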
3.4 List of Basic Components for the Robot Design, for Current or Future Use:
• Raspberry Pi
• Adafruit Ultimate GPS module
• Ultrasonic sensor HC-SR04
• Stepper motor
• L293D shield
• Camera
3.4.1 Raspberry Pi
The Raspberry Pi is a development board built around a system on chip (SoC). It supports various operating systems for its development environment, such as Raspbian, Ubuntu MATE, Snappy Ubuntu Core, Windows 10 IoT Core, RISC OS, Debian and Arch Linux ARM, with both a command-line interface and a GUI. Internet connectivity is a feature of the board, four USB ports are provided for external peripherals, and 40 general-purpose input/output (GPIO) pins are available for connecting various electronic components. The latest version of the board includes on-board Wi-Fi and Bluetooth. It draws about 800 mA at a 5 V supply through its micro-USB port, approximately 4 W, so it is ideal for embedded circuit design. A picture of a Raspberry Pi controlled rover robot is given below.
Fig. 3(c): Raspberry Pi 2 Model B
Fig. 3(d): Raspberry Pi Rover
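As a minimal illustration of how the 40 GPIO pins are used from code, the sketch below toggles one pin with the RPi.GPIO library; the pin number 18 is an arbitrary choice for this example.

import time
import RPi.GPIO as GPIO

LED_PIN = 18                      # arbitrary example pin (BCM numbering)

GPIO.setmode(GPIO.BCM)            # address pins by BCM GPIO number
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    for _ in range(10):           # blink ten times
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()                # release the pins on exit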
3.4.2 Adafruit Ultimate GPS
This GPS hardware module, manufactured by Adafruit, has built-in data-logging capability and is built around the MTK3339 chipset. It has a standard ceramic patch antenna with an excellent -166 dBm tracking sensitivity and can track up to 22 satellites on 66 channels. Its position accuracy is better than 3 meters, and its velocity accuracy is about 0.1 m/s. It delivers 10 location updates per second. The internal flash can store about 16 hours of data and appends new data automatically, so nothing is lost if power fails. The default output of the module is NMEA 0183 at 9600 baud. It can operate at 3.3-5 V and draws only 20 mA during navigation.
Fig. 3(e): Adafruit Ultimate GPS Module
NMEA stands for National Marine Electronics Association. The NMEA protocol frame is explained with the help of the image below.
Fig. 3(f): NMEA Protocol Frame
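To make the frame concrete, the sketch below pulls latitude and longitude out of a GGA sentence. The sample sentence is a commonly cited textbook example, not output captured from this module.

# Parse latitude/longitude from a sample NMEA GGA sentence.
sentence = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"

def dm_to_decimal(value, hemisphere):
    # NMEA encodes angles as ddmm.mmmm; convert to decimal degrees.
    degrees = int(float(value) / 100)
    minutes = float(value) - degrees * 100
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

fields = sentence.split(",")
latitude = dm_to_decimal(fields[2], fields[3])    # 48.1173 N
longitude = dm_to_decimal(fields[4], fields[5])   # 11.5167 E
print(latitude, longitude)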
3.4.3 Ultrasonic Sensor HC-SR04
The HC-SR04 is the most widely used ultrasonic ranging sensor. It provides a sensing range of 2-400 cm with an accuracy of 3 mm. As the figure shows, it contains separate transmitting and receiving sections. It operates on 5 V at 15 mA, and its measuring angle is 15 degrees. A 10 µs TTL pulse is applied to the trigger pin of the module, and the output from the receiving section is produced at the echo pin. The range is calculated from the time interval between sending the trigger signal and receiving the echo signal. Since ultrasonic signals travel at 340 m/s, the formula Distance = (Time x Speed of Sound) / 2 gives the range to the obstacle.
Fig. 3(g): HC-SR04 Ultrasonic Sensor
The transmitted pulse is reflected from the obstacle and captured back at the sensor as an echo. As explained above, signal timing is critical for precise ranging. It works the same way a bat hunts its prey.
Fig. 3(h): Ultrasonic Signal Feedback
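The timing described above translates directly into code. Below is a minimal ranging sketch for the Raspberry Pi using the RPi.GPIO library; the trigger and echo pin numbers are arbitrary choices for this example.

import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24               # example BCM pin numbers

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_range_cm():
    GPIO.output(TRIG, True)       # send the 10 us trigger pulse
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    start = stop = time.time()
    while GPIO.input(ECHO) == 0:  # wait for the echo pulse to start
        start = time.time()
    while GPIO.input(ECHO) == 1:  # wait for the echo pulse to end
        stop = time.time()
    # Distance = (Time x Speed of Sound) / 2, with sound at 34300 cm/s
    return (stop - start) * 34300 / 2

print("%.1f cm" % read_range_cm())
GPIO.cleanup()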
3.4.4 Stepper Motor
A stepper motor is a brushless DC motor that moves in discrete steps. Its coils are energized in groups in a sequence of input pulses, and with each pulse the motor shaft rotates by one step angle. The figure below shows the internal structure of a stepper motor. Driving a bipolar stepper motor requires two full H-bridges, since the current through each phase must be reversible.
Fig. 3(i): Inner Part of a Stepper Motor
As can be seen, the magnet is surrounded by the coil assembly in a stepper motor.
3.4.5 L293D Motor Driver
The L293D is one of the most popular and cheapest motor driver ICs on the market. Timing is critical for precise stepper motor rotation, and the L293D provides it very effectively. It can also run two DC motors simultaneously. When an enable input pin is high, the associated driver is enabled: its outputs become active and work in phase with their inputs. The IC has a very wide supply voltage range of 4.5 V to 36 V.
Fig. 3(j): IC L293D
• Pin Configuration of the IC
Fig. 3(k): Pin Configuration of the IC L293D
The IC can drive a 4-wire stepper motor. Coil 1 is connected to the motor A +/- terminals and coil 2 to the motor B +/- terminals. Four GPIO signals control the coils: GPIO 1 and 2 control coil 1, and GPIO 3 and 4 control coil 2. The motor supply voltage is provided at pin 8 of the IC. The circuit diagram for driving a 4-wire stepper motor with the L293D is given below. The motor runs from a 9 V supply, and the control signals are applied at input pins 2 and 7 for coil 1 and pins 10 and 15 for coil 2.
• Schematic using Fritzing:
Fig. 3(l): Circuit Diagram of Stepper Motor Using L293D Driver
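In code, driving the motor means walking the four L293D inputs through the full-step sequence. Below is a sketch under the assumption that four Raspberry Pi GPIO lines are wired to the IC's inputs; the GPIO numbers are arbitrary choices for this example.

import time
import RPi.GPIO as GPIO

COIL_PINS = (17, 27, 22, 5)   # example BCM pins wired to L293D inputs 2, 7, 10, 15

# Full-step sequence: each row is one step; cycling through the rows
# rotates the shaft, and reversing the order reverses the motor.
SEQUENCE = (
    (1, 0, 1, 0),
    (0, 1, 1, 0),
    (0, 1, 0, 1),
    (1, 0, 0, 1),
)

GPIO.setmode(GPIO.BCM)
for pin in COIL_PINS:
    GPIO.setup(pin, GPIO.OUT)

def step(steps, delay=0.005):
    for i in range(steps):
        for pin, level in zip(COIL_PINS, SEQUENCE[i % 4]):
            GPIO.output(pin, level)
        time.sleep(delay)     # the delay sets the rotation speed

step(200)                     # one revolution for a 1.8-degree-per-step motor
GPIO.cleanup()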
3.4.6 Camera
The camera in a robot is no longer just for gathering visuals. Self-driving cars use an array of cameras for image processing and detection, and a camera can also serve as a navigation device for indoor robot control, where GPS is of no help. Real-time image processing is used in robots to detect objects and find the path. Together with the LIDAR system and the other sensors, image processing is done in both 2D and 3D.
Fig. 3(m): Object Detection Using a Camera
Sign boards, road lanes and nearby objects are detected by the on-board camera mounted on an autonomous driving car, which also generates an object map. Take the example below as a reference.
Fig. 3(n): How the Camera in an Autonomous Car Sees the World
Chapter 4
GPS Waypoints
4.1 GPS Data from Hardware
To parse the raw data properly from the GPS module into a central computer such as the Raspberry Pi, "gpsd", the GPS daemon, is used. It acts as a bridge between the actual GPS hardware and the application software; it also handles parsing errors and exposes a well-defined interface to any GPS module.
4.1.1 gpsd
To see the hardware working on the Raspberry Pi, the connections below are made:
Raspberry Pi         | GPS module (Adafruit Ultimate GPS)
5V (pin 04)          | VIN
GND (pin 06)         | GND
TX (pin 08, GPIO14)  | RX
RX (pin 10, GPIO15)  | TX
Table 1: Pin connections
• Schematic using Fritzing:
Fig. 4(a): Interfacing the GPS Module with the Raspberry Pi
Some software installation has to be done first; the steps are listed below:
• Install the gpsd software package by typing in a terminal:
$ sudo apt-get install gpsd gpsd-clients python-gps
• After the setup, run the commands below to restart gpsd against the GPS module:
$ sudo killall gpsd
$ sudo gpsd /dev/ttyAMA0 -F /var/run/gpsd.sock
• To test it, run:
$ cgps -s
• The output is a table of all the GPS quantities.
Fig. 4(b): GPS Data Table
Hence we obtain the GPS hardware output with the help of gpsd. Applications such as navit, pyGPS, OpenCPN, roadmap, LiveGPS, roadnav, Kismet, GpsDrive, gpeGPS, gpsd-ais-viewer, viking, geohist, tangogps, foxtrot, geoclue, obdgpslogger, gpsd-navigator, qlandkartegt, gpredict and firefox/mozilla use gpsd.
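The fix can also be read from our own code through the python-gps client installed above. A minimal sketch, assuming gpsd is already running as shown:

from gps import gps, WATCH_ENABLE

session = gps(mode=WATCH_ENABLE)          # connect to the local gpsd daemon
while True:
    report = session.next()
    if report['class'] == 'TPV':          # TPV reports carry the position fix
        lat = getattr(report, 'lat', None)
        lon = getattr(report, 'lon', None)
        if lat is not None and lon is not None:
            print(lat, lon)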
4.2 Waypoint Extraction
Now we have to determine the coordinates used as navigation references. This cannot be done directly: first we determine the source and destination points of the course, and then the simple steps listed below lead us to precise turn-by-turn waypoint coordinates.
• Open the site: www.maps.google.com
• Go to Your places in the side bar
• Select Maps
• Go to Create Map
• Select the source and destination points in driving directions, as shown in the figure.
Fig. 4(c): Selecting Source and Destination Using Google Maps
• Select Export to KML from the options
Fig. 4(d): Dialog Box for Downloading the KML File
• Check the box and select Download
• Open the site: gpsvisualizer.com
• Upload the KML file saved earlier by providing its file path.
Fig. 4(e): Screenshot of the GPS Visualizer Homepage
• Select the output format you want. A wide range of output options is available for waypoints: a text file, a plot file, a picture file, a GPX file and more. Our choice is a text file as the reference set of waypoints.
• The expanded output text file looks like the figure below. It can be downloaded and linked to any compatible software code.
Fig. 4(f): Waypoints in Text Form
The output file contains the coordinates of the waypoints, which can be used in a program whenever needed. Proper indexing is needed to navigate the robot along the path formed by the waypoints; each waypoint is a turn-by-turn navigation point, which helps with precise left or right movement along the path.
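As an illustration of how the program might consume the list, the sketch below computes the distance and initial bearing from one waypoint to the next with the standard haversine formulas; the two coordinates are arbitrary example values, not points from the figure.

import math

# Example (latitude, longitude) pairs; in practice these are read from
# the exported GPS Visualizer text file.
waypoints = [(23.0300, 72.5800), (23.0315, 72.5812)]

def distance_m(p1, p2):
    # Haversine great-circle distance in metres (Earth radius 6371000 m).
    lat1, lon1, lat2, lon2 = map(math.radians, p1 + p2)
    a = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))

def bearing_deg(p1, p2):
    # Initial bearing from p1 to p2: 0 = north, measured clockwise.
    lat1, lon1, lat2, lon2 = map(math.radians, p1 + p2)
    y = math.sin(lon2 - lon1) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2) -
         math.sin(lat1) * math.cos(lat2) * math.cos(lon2 - lon1))
    return math.degrees(math.atan2(y, x)) % 360

print(distance_m(waypoints[0], waypoints[1]))   # metres to the next waypoint
print(bearing_deg(waypoints[0], waypoints[1]))  # heading the robot should steer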
4.3 The Rotating Ultrasonic Radar Sensor
As noted in section 3.2, the surround-scanning LIDAR used in self-driving cars is expensive and complex for a robot at the development stage, so this project instead builds a surround-scanning device from an ultrasonic sensor. A 360-degree rotating, surround-sensing ultrasonic sensor is mounted on top of a stepper motor, arranged so that the sensor's wires do not get tangled while rotating. A picture of this designed hardware is below.
Fig. 4(g): Rotating Ultrasonic Radar Sensor Model
The ultrasonic sensor terminals are connected through a 3-point audio jack, which provides the three needed terminals: Vcc, echo and trigger. The device is still at the development stage; the challenge is to control the RPM of the stepper motor and take ultrasonic sensor readings simultaneously without error. The voltage needed to drive the stepper motor is 9 V or above. The interfacing of the two was implemented on a simple development board for testing; the test circuit diagram and its limitations are discussed below.
4.3.1 Testing
All component wiring is illustrated in the circuit diagram. The stepper motor is run by the L293D motor driver IC, and the battery supply voltage powers the control signals at the stepper motor wire terminals.
• Schematic using Fritzing:
Fig. 4(h): Circuit Diagram of the Rotating Ultrasonic Sensor Using a Stepper Motor
As described in section 3.4.3, a 10 µs TTL pulse at the trigger pin starts a measurement, and the range is calculated from the interval between the trigger signal and the echo signal using Distance = (Time x Speed of Sound) / 2, with sound travelling at 340 m/s.
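One way to combine the two devices in software is to take one range reading per motor step, building up a list of (angle, distance) pairs. The sketch below assumes the step() and read_range_cm() helpers from the earlier sketches are defined in the same program, and that the motor has a 1.8-degree step angle.

STEPS_PER_REV = 200              # a 1.8-degree motor: 200 steps per revolution

def scan():
    # One full sweep: advance one step, then take one range reading.
    readings = []
    for i in range(STEPS_PER_REV):
        step(1)
        angle = i * 360.0 / STEPS_PER_REV
        readings.append((angle, read_range_cm()))
    return readings

for angle, range_cm in scan():
    print(angle, range_cm)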
The whole 360-degree area is not covered, because the support shaft of the audio jack acts as an obstacle. This is explained next.
4.3.2 Dark Zone
The figure shows the top view of the device. The area between the two lines forms an angle of around 90 degrees, so the ultrasonic sensor effectively measures a 270-degree surround. When neither the trigger nor the echo terminal has an obstacle nearby, the sensor behaves normally; the problem arises when the rotating ultrasonic sensor senses the support shaft of the audio jack as an obstacle. The sensor then reports the area marked between the lines as an obstacle within a range of 5 cm.
Fig. 4(i): Top View of the Rotating Ultrasonic Radar Sensor Model
The observed solution is to move the support shaft further away relative to the ultrasonic sensor, so that the angle between the two black lines can be reduced.
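Until the geometry is fixed, the blind sector can at least be masked in software. The sketch below drops readings whose angle falls inside the dark zone; the sector bounds are example values, since the real bounds depend on the mounting geometry.

DARK_ZONE = (135.0, 225.0)   # example bounds of the ~90-degree blind sector

def valid_readings(scan_data):
    # Keep only (angle, range) pairs that fall outside the dark zone.
    lo, hi = DARK_ZONE
    return [(a, r) for a, r in scan_data if not (lo <= a <= hi)]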
