Student Projects


2015 GSDC_SKKU

Contact Information

University

SungKyunKwan University (SKKU)

Team Name

SAVE

Team Members

ChanHo Park, DoUi Hong, HwaYoung Lee, DongYeon Yu

Faculty Advisers

SungHo Hwang

Email Address

pch881125@gmail.com

Contacts

010-5044-4791

Project Information

Title: Scenario Reproduction Soft Target Development for Simulation of Unmanned Autonomous Vehicle

Description: We developed a self-driving soft target that drives autonomously along a pre-planned GPS path and scenario. The soft target recognizes its surroundings with a camera, GPS, and gyro sensor and processes the acquired data with NI myRIO and LabVIEW.

Products:

Item               | HW / SW                                                                             | Model Number / Module
Controller         | NI myRIO                                                                            | myRIO-1900
Development System | NI LabVIEW myRIO 2013, LabVIEW Real-Time Module, LabVIEW Vision Development Module | -
Gyroscope          | NI myRIO Mechatronics Kit (PmodGYRO)                                                | L3G4200D
Vehicle Chassis    | Unmanned Solution ERP (including servo motors, MCU, LCD panel)                     | ERP-42, AM-SLCD420 Serial LCD
WebCam             | Microsoft LifeCam                                                                   | HD1393
GPS                | AscenKorea GPS                                                                      | AKBU3

The Challenge: Soft targets are used in pre-planned driving tests such as ADAS (Advanced Driver Assistance System) evaluation and autonomous vehicle driving tests. However, performing a pre-planned test with manpower can be inefficient, and the pre-planned scenario may not be executed exactly. To overcome these problems, we developed a self-driving soft target that drives autonomously along a pre-planned GPS path and scenario. The soft target recognizes its surroundings with a camera, GPS, and gyro sensor and processes the acquired data with NI myRIO and LabVIEW.

The Solution: The overall soft target system is shown in the diagram below. The system is composed of three parts: recognition, decision, and control. First, at the recognition stage, the current status of the vehicle is recognized from the data collected by the sensors. Second, at the decision stage, the vehicle driving path is generated and sensor error signals are compensated based on the data obtained in the recognition stage. Third, at the control stage, the movement of the vehicle, such as steering angle and speed, is controlled based on the decision stage. The data communication between the stages is shown in the diagram.


Figure 1. Soft Target System Configuration
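The LabVIEW block diagrams themselves are not reproduced here, so the following Python sketch only illustrates how data flows between the three stages of Figure 1; every function, key name, and constant in it is a placeholder, not code from the project.

```python
def recognition_stage(raw):
    """Estimate the current vehicle status from raw sensor readings."""
    return {
        "position": raw["gps_position"],         # (lat, lon) from the GPS receiver
        "heading": raw["gps_heading"],           # degrees, corrected later using the lane
        "slope": raw["gyro_inclination"],        # road inclination from the gyroscope
        "lane_angle": raw["camera_lane_angle"],  # lane direction seen by the webcam
    }

def decision_stage(state, desired_path):
    """Compensate sensor error and pick the next point of the desired path."""
    corrected_heading = 0.7 * state["heading"] + 0.3 * state["lane_angle"]  # Figure 3, part A
    next_target = desired_path[0]                                           # part B, simplified
    return corrected_heading, next_target

def control_stage(state, corrected_heading, target):
    """Turn the decision into steering and speed commands for the platform."""
    steering = target[1] - state["position"][1]          # crude lateral error -> steering
    speed = 2.0 if abs(state["slope"]) < 5.0 else 1.0    # slow down on steep slopes
    return steering, speed

# One iteration with made-up sensor values, just to show the data flow.
raw = {"gps_position": (37.2930, 126.9740), "gps_heading": 90.0,
       "gyro_inclination": 1.5, "camera_lane_angle": 88.0}
desired_path = [(37.2931, 126.9745), (37.2932, 126.9750)]

state = recognition_stage(raw)
heading, target = decision_stage(state, desired_path)
steering, speed = control_stage(state, heading, target)
print(f"corrected heading {heading:.1f} deg, steering {steering:.5f}, speed {speed} m/s")
```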

     1. Recognition Stage

First, the soft target must accurately recognize the status of the vehicle and its surroundings in order to drive itself. The sensors used for recognition are shown in Figure 2(a). The GPS sensor provides the current position and heading of the vehicle and receives the desired driving path as input. The gyroscope is used to determine the inclination of the vehicle and the slope of the road. The web camera detects the lane around the vehicle and helps the GPS determine an accurate current location. The data communication between the hardware and the controller is shown in Figure 2(b).

Figure 2. (a) Hardware configuration (b) Platform-controller signal
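As one concrete example of a recognition-stage calculation, the vehicle heading can be estimated from two consecutive GPS fixes with the standard great-circle bearing formula. The sketch below is illustrative Python, not the project's LabVIEW code, and it assumes heading is derived from successive fixes; the coordinates are made up.

```python
import math

def gps_heading(lat1, lon1, lat2, lon2):
    """Bearing in degrees (0 = north, clockwise) from fix 1 to fix 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

# Two fixes taken a short time apart while the vehicle moves roughly east.
print(gps_heading(37.2930, 126.9740, 37.2930, 126.9745))  # ~90 degrees
```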

     2. Decision Stage

Second, at the decision stage, the software generates the vehicle driving path based on the input desired driving path and the surrounding-environment data obtained from the recognition stage.


Figure 3. Sensor Data Compensation Loop

Figure 3 shows the compensation loop for the vehicle heading and GPS location, an example of the decision-making LabVIEW code. At part A, the GPS heading data is compensated using the slope of the detected lane. Then, at part B, the generated driving path is synchronized with the input desired path.
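A rough Python analogue of the two parts of Figure 3 is sketched below; the blend weight in part A and the nearest-point rule in part B are assumptions made for illustration, since the actual logic lives in the LabVIEW block diagram.

```python
import math

def compensate_heading(gps_heading, lane_heading, weight=0.3):
    """Part A: correct the GPS heading using the direction of the detected lane."""
    return (1.0 - weight) * gps_heading + weight * lane_heading

def synchronize_path(position, desired_path):
    """Part B: index of the desired-path point closest to the current position."""
    return min(range(len(desired_path)),
               key=lambda i: math.dist(position, desired_path[i]))

desired_path = [(0.0, 0.0), (5.0, 0.5), (10.0, 1.5), (15.0, 3.0)]
heading = compensate_heading(gps_heading=92.0, lane_heading=88.0)
idx = synchronize_path((6.0, 0.4), desired_path)
print(heading, desired_path[idx])  # 90.8 (5.0, 0.5)
```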

The front panel configuration of the system is shown in Figure 4; it can be divided into a tab control container, an image/map display, and a variable list.


Figure 4. Front panel configuration

In the tab control container, the user can select the auto/manual driving mode and set the initial vehicle speed. The image/map display shows a global and a local map describing the region of interest (ROI), the input reference driving path, the detected lane, and the location of the vehicle, with the current camera input at the topmost display. Variable values obtained and calculated in real time appear in the variable list area.

     3. Vehicle Control Stage

Figure 5. Vehicle Control Loop

Lastly, the software commands the condition of the vehicle, such as the speed and steering angle, based on the reference path and the calibrated current location of the vehicle. The vehicle control LabVIEW code configuration is shown in Figure 5. The key factors in the vehicle control are the distance from the target path, the corrected current position, the lane recognition, and the slope of the road.
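The control law below is a hedged Python sketch of how these factors could be combined into steering and speed commands; the gains, the steering limit, and the slope threshold are illustrative assumptions, not values from the project's ERP-42 controller.

```python
def control_command(cross_track_error, heading_error, road_slope,
                    k_ct=0.8, k_h=0.5, base_speed=2.0):
    """Return (steering_deg, speed_mps) from the key control factors."""
    # Steering combines the distance from the target path (cross-track error)
    # and the difference between the corrected heading and the path heading.
    steering = k_ct * cross_track_error + k_h * heading_error
    steering = max(-30.0, min(30.0, steering))      # assumed steering range of the platform
    # Slow down on steep road slopes reported by the gyroscope.
    speed = base_speed * (0.5 if abs(road_slope) > 5.0 else 1.0)
    return steering, speed

print(control_command(cross_track_error=1.2, heading_error=4.0, road_slope=6.5))
# -> (2.96, 1.0): steer gently back toward the path and reduce speed on the slope
```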

Results

The experiment was performed at the SKKU Natural Science Campus with a pre-planned driving path, as shown in Figure 6.


Figure 6. Image from the driving experiment

 

Contributors