This paper introduces a biped robot for target recognition that can track and approach an object of a preset color through a camera. The system is built on the Android operating system and an Arduino microcontroller. It uses OpenCV4Android, the OpenCV vision library for Android, to analyze the captured images and locate objects of the preset color, and then sends instructions over a Bluetooth link to the Arduino controller, which drives the robot's servos to approach the target. The design goal of this system is to combine an Android device and an Arduino controller into a biped target-locating robot. While moving, the robot uses the OpenCV4Android vision library to search for objects of a given color (in HSV format).
In experiments, the robot can track an object of a specific shape and color and keep facing it. The robot's components are manufactured by 3D printing, and its actuators are servos controlled by the Arduino microcontroller to make the robot walk and steer. The biped robot system consists of an Android device and its software, an Arduino microcontroller system, an ultrasonic ranging module, and the biped leg mechanism. First, the system detects an object within a specified color range through the camera on the Android device, extracts the object's shape and contour, and computes the coordinates (x, y) of the midpoint of the object's visible face in the camera coordinate system. When the object's coordinates do not lie on the central axis of the camera coordinate system, the robot turns until the central axis points directly at the object, and then stops turning. An Android device serves as the robot's host computer (a smartphone running Android is used in this work).
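The turn-until-centered behavior reduces to comparing the object's x-coordinate against the frame's central axis. The paper's implementation runs as Android Java on the host device; the following standalone Python sketch only illustrates the decision rule, and the dead-band parameter `tolerance_px` is an assumption, not a value from the paper:

```python
def steering_command(object_x, frame_width, tolerance_px=20):
    """Decide how the robot should turn so the camera's central
    axis points at the detected object.

    object_x     -- x-coordinate of the object's midpoint in the frame
    frame_width  -- width of the camera frame in pixels
    tolerance_px -- assumed dead-band around the axis (not from the paper)
    """
    center_axis = frame_width / 2
    offset = object_x - center_axis
    if abs(offset) <= tolerance_px:
        return "stop_turning"  # object lies on the central axis
    return "turn_right" if offset > 0 else "turn_left"

# Example: a 640-pixel-wide frame with the object at x = 500
print(steering_command(500, 640))  # object right of axis -> turn_right
```

A small dead-band is used so the robot does not oscillate when the object sits almost exactly on the axis.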
The object recognition program is written with the Android vision library OpenCV4Android in the Android development environment ADT, and the servos in the robot's legs are controlled over the phone's Bluetooth connection to drive the robot forward or make it turn. The color discrimination and localization program installed on the Android device is written in ADT together with the OpenCV vision library. OpenCV is an open-source, cross-platform vision library originally developed by Intel Corporation [1].
First, the Android platform version of the OpenCV library is loaded in ADT, and a CameraBridgeViewBase instance is created in the main program file MainActivity.java in the src folder to open the camera. Then, in the onCameraFrame callback, the current frame captured by the camera is stored in RGB format in a matrix variable named mRgba. The detection threshold is applied with the OpenCV matrix function inRange, and the set of object contours is then found with the OpenCV findContours function. Finally, the contour set is converted into contour point matrices by iteration, and the program computes the midpoint coordinates of each contour. Testing shows that the localization works well. Figure 1 shows that the program can detect the outline of an object of the specified color in the image (here, the outline of a green object), draw the outline as a curve, and draw its bounding box with straight lines. Finally, the program computes the upper-left and lower-right endpoints of the contour and derives the coordinates of the contour's center point from these two endpoints.
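The endpoint-and-midpoint step above is simple arithmetic over the contour points. The production code uses OpenCV's Java API on Android; this pure-Python sketch of the same computation is only illustrative:

```python
def contour_center(points):
    """Given a contour as a list of (x, y) points, find the upper-left
    and lower-right endpoints of its bounding box and return the
    midpoint between them, as in the paper's localization step."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    upper_left = (min(xs), min(ys))
    lower_right = (max(xs), max(ys))
    cx = (upper_left[0] + lower_right[0]) / 2
    cy = (upper_left[1] + lower_right[1]) / 2
    return cx, cy

# A rectangular contour from (10, 20) to (50, 60) has its center at (30, 40)
print(contour_center([(10, 20), (50, 20), (50, 60), (10, 60)]))  # (30.0, 40.0)
```

In the Java implementation this corresponds to taking the extrema of the contour's point matrix and averaging them.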
The Android device, as the host computer, uses the BluetoothAdapter class of the Android library to communicate with the Bluetooth module on the microcontroller. When the Android device pairs with the module (the default pairing code of the Bluetooth device is 0000 or 1234), the app on the host creates a BluetoothSocket, which is used to communicate with the Bluetooth module on the Arduino controller. The system connects the Arduino controller and the Android device over a serial link through a Bluetooth chip, model HC-06. The chip's TXD and RXD pins are cross-connected to the serial RX and TX pins of the controller through DuPont wires. Arduino is a convenient, flexible, and powerful open-source microcontroller platform based on AVR microcontrollers. The board used in this work is the Arduino MEGA2560, whose processor core is the ATmega2560. It connects over USB and provides 54 digital input/output channels, of which 15 can be used as PWM outputs. Its capabilities make it well suited to driving the servos in the joints of a biped robot. Fig. 2 is the wiring diagram of the servos. There are five servos in the robot's leg structure. With this many actuators, powering the servos from the Arduino's on-board supply alone leaves the drive current insufficient, causing the servos to jitter as they rotate. The system therefore powers the servos from a 6 V, 12 Ah battery, which effectively eliminates this jitter. The battery's positive and negative terminals are connected to the power and ground pins of the servos through DuPont wires and a breadboard, and each servo's signal line is connected to a PWM output of the Arduino, from which it receives its control signal.
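On the Arduino side, a servo's position is set by the width of the PWM pulse on its signal line. The mapping from a joint angle to a pulse width can be sketched as below; the 500–2500 µs range for 0–180° is a common hobby-servo convention assumed here (the paper does not give the MG995's exact pulse range), and the actual firmware would use the Arduino Servo library in C++:

```python
def angle_to_pulse_us(angle_deg, min_us=500, max_us=2500):
    """Map a servo angle in [0, 180] degrees to a PWM pulse width in
    microseconds, linearly between min_us and max_us (assumed range)."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("servo angle out of range")
    return min_us + (max_us - min_us) * angle_deg / 180

print(angle_to_pulse_us(0))    # 500.0 us  (one end of travel)
print(angle_to_pulse_us(90))   # 1500.0 us (center position)
print(angle_to_pulse_us(180))  # 2500.0 us (other end of travel)
```

With the Servo library this mapping is hidden behind `Servo.write(angle)`, but it clarifies why each servo needs a PWM-capable pin.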
The wiring diagram of the Arduino controller and the servos is shown in Fig. 2. The robot's skeleton in this design is produced by 3D printing.
3D printing is convenient, fast, and easy to use, and printed parts are both light and strong. First, the leg mechanism of the robot is modeled in SolidWorks, and the model is then simulated with the SolidWorks Motion plug-in integrated into SolidWorks.
If the simulation results meet the technical requirements, the parts are printed by sending the model to the printer through the computer's serial port. Finally, the skeleton components are joined with plastic threaded connectors. Given the robot's light skeleton, Huisheng MG995 servos are used to drive the robot's leg joints.
This servo has the advantages of low cost, fast response, and high torque, which is sufficient to drive the biped robot forward and through turns. The leg structure and the SolidWorks Motion simulation results are shown in Fig. 3 and Fig. 4. The simulation shows that when the robot skeleton weighs about 0.5 kg, the maximum output torque of the MG995 servo, powered from the battery, is 13 kg·cm, while the maximum load on any driven joint during the simulation is only 2.19 kg·cm, so the MG995 has ample power to drive the robot. This paper has presented a design for a vision-based robot. The focus of the design is using the Android OpenCV library, in the Android Eclipse environment, to write a fast and stable program that locates the target object.
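The torque figures quoted above imply a comfortable safety margin for the servo; the check is a single division:

```python
def torque_margin(max_output_kgcm, max_load_kgcm):
    """Safety factor of the servo: rated output torque divided by the
    worst-case joint load found in the motion simulation."""
    return max_output_kgcm / max_load_kgcm

# MG995 rated at 13 kg·cm vs. a simulated worst-case load of 2.19 kg·cm
margin = torque_margin(13.0, 2.19)
print(round(margin, 2))  # 5.94, i.e. nearly 6x torque headroom
```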