Introduction:

The Remote Sensory Vehicle (RSV) gathers various types of data about areas that would be dangerous or impossible for a human to enter. The RSV is completely wireless and carries an onboard power supply for autonomous operation. It requires no cable that could restrict its movement or cause it to become stuck, making it well suited for exploration nearly anywhere.

 

The Remote Sensory Vehicle comes with many features, including streaming video, physical collision avoidance, and velocity feedback. The vehicle can be driven by a remote user or rely on a set of preprogrammed autonomous control algorithms for movement.  The vehicle can operate up to 50 feet from the base station with no line-of-sight contact necessary.  It is controlled through a keyboard attached to the base station and outputs its data to a computer monitor via the base station's VGA port.

 

Our system is meant to be a proof-of-concept vehicle. Our main objective is to get a working platform up and running. The platform will have basic functionality and a sample set of sensors that demonstrate and enhance the abilities of the device. A full version of the vehicle would have increased range and the ability to add a custom set of sensors to the platform for various tasks.

Requirements:

The system will be split into two major components, the base station and the platform. Each of these pieces will have to meet specific requirements and handle specific situations. We will look at the requirements for the base station first.

The base station will need to receive the following inputs:

Keyboard input on the PS/2 connection – This will serve as the main user input device for the system. All user controls will come from this device. The FPGA will need to handle a range of possible keystrokes; the most important among these are the arrow keys and a few alphabetic keys that will act as switches.

Video input via the RCA jack – The XSV300 board will need to receive and process the video signal arriving on the RCA jack. This input will need to be converted to a format compatible with the VGA output to the monitor and processed in such a way that it can provide full-motion video.

Telemetry data via the RS232 port – The XSV board will need to pull in telemetry data via the RS232 port and process it so that it can be displayed on the monitor. It will need to receive this data on a port that also carries occasional control-signal traffic headed out to the platform.

The base station will also need to handle the following outputs:

VGA out to monitor – The base station will need to assemble the signal that goes to the VGA monitor. This will require decoding the inputs and placing the information on the screen in a usable format.

Special control signals out to platform – The control signals that will allow us to switch modes and modify the incoming data stream (if possible) will need to be sent over the same bus through which the telemetry data is coming.

Control signals out to platform – The XSV board will need to be able to send basic RF control signals via the existing radio link to the platform. The XSV will interface with these controls via one of its pin banks.

The main job of the base station will be processing the incoming data and sending results to the platform and monitor. The hardest part of this job will be decoding and controlling the video feed; the XSV board will need to process this incoming data at or near real time. The base station may use double buffering for the video portion of the VGA signal if speed becomes a real issue.

Now we will take a look at the requirements for the platform. The platform will take the following inputs:

Forward and Aft Sonar Array – The platform will be receiving data from both the forward and aft sonar range-finding arrays. Each sonar in each array will fire twice per second. This data will need to be collected by the platform and formatted into a digital signal that can be both sent to the base station and sent to the onboard autonomous controller.

Accelerometer – The accelerometer will output data that will need to be digitized and processed to obtain the angle of tilt. This information will then need to be sent to the base station and to the onboard autonomous controller.

Incoming Special Control Signals – The platform will need to receive and process incoming special commands (such as a change of state to user mode) in a timely manner without significant loss of telemetry data.

Basic control signals – The platform will need to receive basic control signals on the 27 MHz channel. These signals may be partially ignored if the vehicle is in autonomous mode. The vehicle will always need to recognize the control signals that will turn the turret and elevate the camera unless the platform moves out of range.

The platform will also have the following outputs:

Video Feed – The vehicle will output a full motion color video feed from a camera mounted on the chassis. This feed will need to be processed on the XSV board.

Telemetry data via the serial port – The platform will need to be able to send basic telemetry data over the serial port connection provided by the Virtual Wireless board. This telemetry data will consist of the output from the range finders and the accelerometer, about 14-20 bytes per transmission, with transmissions happening approximately twice per second.

In addition to these inputs and outputs, the platform will need to be able to determine when it has lost contact with the base station and react to this situation autonomously. The platform will also need to determine if it is near an object and stop if it comes too close, regardless of the mode that it is in.


Design:

Our design uses several ready-made modules interfaced together to provide the functionality of the Remote Sensory Vehicle.  The main focuses of our project are the video link, rangefinder data, accelerometer data, movement, autonomous control, user interface, and device interfaces.

Fig. 1 – Block diagram of RSV project

Video

To create a streaming video link we chose to use a digitally acquired image translated into an analog signal.  An off-the-shelf digital video camera from Trendmasters provides this solution.  The camera takes a digital picture, translates it into NTSC format and transmits it via a 2.4 GHz radio link to a receiver equipped with RCA jacks.  The receiver's RCA output is then plugged into the XSV board, where the signal is digitized into CCIR 601 format by the built-in Philips Semiconductor SAA7113 decoder chip.  The digitized video is then fed into the FPGA, where it will be integrated into the VGA signal to be sent to the monitor.

 

There are several reasons why we used this approach over leaving the signal in digital format.  The main reason was parts availability: we found it nearly impossible to find a transmitter that could handle the bandwidth the digital signal would require.  Uncompressed video data from a CIF (352x288) camera requires a 37.5 Mbps link to display at 30 frames per second.  This is an unreasonable amount to transmit over any existing radio frequency link designed to carry digital data.  One option we investigated was to encode and decode the video stream around the RF link.  Several algorithms are available, such as MPEG-4 and H.263.  These codecs reduce the needed bandwidth to a reasonable, though still large, amount; MPEG-4 requires a 1 Mbps link to display video at 15 frames per second.  Unfortunately, implementations of these algorithms in integrated circuits were very difficult to find and promised to be costly when available.  Also, a 1 Mbps RF link, such as Bluetooth, proved not only to be costly but also not available in standard packaging.

 

Using the NTSC format provided a viable alternative.  NTSC is the common video format used to transmit television signals in the US and other countries.  Since we knew television signals could be transmitted reliably via this format and NTSC sender/receiver pairs were readily available in consumer electronics stores, we decided to use this transmission method.  We chose the Trendmasters camera because it already integrated digital picture acquisition with an NTSC radio frequency link.  It also proved to be cheaper than a stand-alone NTSC transmitter/receiver pair.  The fact that we no longer have to focus on the video link also means that more time can be spent processing the video on the Xilinx FPGA once the signal arrives.

 

Rangefinder

The sonar rangefinders are an integral part of creating a vehicle with collision avoidance.  When the RSV is put into autonomous control mode, it must never collide with any obstacle and the rangefinders are a reliable solution for this problem.

 

We will construct an array of six rangefinders on the RSV’s platform, three fore and three aft, to cover all the directions the vehicle can move at any one time.  An Atmel AT89C55 will control the rangefinders, and the information they provide will be sent both to the XSV board, via a link provided by the virtual wireless development kit, and to an autonomous controller on the platform.  Before the information is transmitted, each rangefinder’s data will be tagged with an identifier so the controller and the XSV board will know in which direction the platform has an obstacle.

 

The rangefinders will be set up with a 4.7 kΩ pull-up resistor on pin 7 (ECHO).  Pins one, two, and eight will be connected to ground, and pin nine will be connected to 5V.  Pin four (INIT) will be driven by pin P1_5 on the Atmel AT89C55; a pull-up resistor may be necessary on P1_5.  When P1_5 is raised, INIT is raised, which causes the sonar to fire.  We will use the negated ECHO output of the sonar to drive P1_2 (T2EX).  When ECHO goes high, T2EX goes low, triggering the Timer 2 interrupt, which allows us to get an accurate reading.  We plan on firing each individual sonar twice a second, although this depends on how fast we can switch between the sonars.  We may need to use two 8051s for this – one for the three rangefinders on the front of the vehicle, and one for the rangefinders in back.

Fig. 2 – Diagram of rangefinder interface to 8051

 

If we use three rangefinders per 8051, the second and third sonars will have their inverted ECHO outputs connected to INT0 and INT1, respectively.  Pins P1_6 and P1_7 will be used to drive their INIT pins.
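As a rough illustration of this interface, the sketch below shows how the firmware for the first sonar might look, assuming the Keil C51 compiler, the standard reg52.h register names, and Timer 2 in 16-bit capture mode; the exact setup will be refined during testing, so this is a sketch rather than final code.

    /* Fire one sonar and time its echo with Timer 2 in capture mode.
     * INIT is driven by P1_5 and the inverted ECHO drives P1_2 (T2EX),
     * as described above; everything else here is illustrative. */
    #include <reg52.h>

    sbit INIT = P1^5;                    /* drives the sonar's INIT pin        */

    static volatile unsigned int  echo_ticks;   /* captured time of flight     */
    static volatile unsigned char echo_done;

    void timer2_isr(void) interrupt 5
    {
        if (EXF2) {                      /* T2EX fell: echo has returned       */
            echo_ticks = ((unsigned int)RCAP2H << 8) | RCAP2L;
            echo_done  = 1;
            EXF2 = 0;
        }
        TF2 = 0;
    }

    void fire_sonar(void)
    {
        echo_done = 0;
        TH2 = 0; TL2 = 0;                /* restart the time base              */
        T2CON = 0x0D;                    /* capture mode, EXEN2=1, TR2=1       */
        ET2 = 1; EA = 1;
        INIT = 1;                        /* raise INIT: the sonar fires        */
        while (!echo_done)               /* wait for the capture interrupt     */
            ;                            /* (a real version would time out)    */
        INIT = 0;                        /* ready the sonar for the next ping  */
    }

The second and third sonars would be handled the same way, with the INT0 and INT1 interrupts taking the place of the Timer 2 capture.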

 

We chose to use the sonar over the laser rangefinders for two reasons.  The first is area coverage.  The sonar rangefinders provide information within a conical volume with the emitter at the tip.  This is much more valuable than the single line the laser rangefinder would provide.  The other advantage to using the sonar rangefinder is availability.  The hardware lab already has many in stock whereas we would have to purchase laser rangefinders.

 

Accelerometer

By attaching an accelerometer to the RSV we can derive the tilt of the vehicle.  The accelerometer will interface to an Atmel AT89C55 and will transmit its data via the virtual wireless development kit to the XSV board where it will be interpreted and changed to an angle to be displayed on the monitor.  This is done to provide the user and the onboard controller with a sense of the vehicle’s orientation.

 

The output from the accelerometer presents information about acceleration on two separate axes. One axis will be aligned along the centerline of the vehicle with the other axis perpendicular to it. Information from both of these axes will be used to compute the tilt of the vehicle.  To provide as accurate information as possible, the accelerometer will be mounted on the chassis of the vehicle, preferably as close to the center as possible.

 

The accelerometer provides data in terms of a duty cycle on a single pin for each axis.  The cycle length is configurable through two capacitors and a resistor.  The attached 8051 then records the amount of time the accelerometer outputs high with a 16 bit counter and stores the information to transport it to the communication controller later.  By knowing the cycle time, we can measure the duty cycle of the accelerometer and thus derive the tilt of the vehicle along that axis.
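As an illustration of the arithmetic only, the sketch below converts a measured high time and cycle time into a tilt angle, assuming the nominal data-sheet scaling of a 50% duty cycle at 0 g and a 12.5% duty-cycle change per g; the real constants will be calibrated per device, and the conversion itself will ultimately be performed on the XSV board rather than in C on the 8051.

    /* Convert the accelerometer's duty cycle into a static tilt angle.
     * t_high and t_cycle are the 16-bit counts recorded by the 8051;
     * the 0.5 offset and 0.125/g scale are nominal data-sheet values. */
    #include <math.h>

    double tilt_degrees(unsigned int t_high, unsigned int t_cycle)
    {
        double duty = (double)t_high / (double)t_cycle;  /* fraction of period high */
        double g    = (duty - 0.5) / 0.125;              /* acceleration along axis */

        if (g >  1.0) g =  1.0;                          /* clamp before asin()     */
        if (g < -1.0) g = -1.0;
        return asin(g) * 180.0 / 3.14159265358979;       /* static tilt in degrees  */
    }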

Movement

The platform will have two modes of control, interactive and autonomous.  In interactive mode, the user can directly control the movement of the platform and also the orientation of the camera.  The platform uses four servos to accomplish this.  One servo for each tread provides the tank with forward and reverse motion and the ability to turn within its own length.  One servo is used to rotate the camera parallel to the ground, and the final servo is used for the camera’s vertical angle.  In this manner, the user has complete control over the camera view independent of the orientation of the vehicle.

 

The RSV is equipped with an autonomous exploration mode.  It will search the area while avoiding obstacles by using data from its rangefinders.  The user will still retain control of the camera orientation, but the actual platform movement will no longer be under user control.  The autonomous controller will be located on the vehicle, so if the RSV travels out of range of the base station controller, it will reverse direction in an attempt to regain RF communication.  Requiring two-way communication via the virtual wireless development kit will allow the vehicle to recognize when it has left the range of the base station because it will no longer receive information from the base station.

 

User Interface

The user will control the RSV through a keyboard and receive information from a VGA monitor.  The keyboard will allow the user to control both directions of all four servos and change the vehicle between interactive and autonomous modes.  The monitor will display the video feed in the upper right corner with information from the accelerometer and rangefinders in the upper left.  The lower portion of the screen will be used to display status information about the vehicle along with what mode it is currently operating in.

 

Both the keyboard and the monitor will be plugged directly into the XSV board.  We chose this approach rather than using a PC to control the devices because all members of the team have experience with interfacing the keyboard and monitor with the XSV board.  The XSV board also provides every type of port we need, including PS/2, RCA, monitor, serial, and direct interfacing to pins on the FPGA.

 

Device Interfaces

The three main device interfaces of the RSV are the integration of the video RF link into the XSV board, the integration of the RC controller into the XSV board, and the 8051 cluster on the vehicle.

RF Video Link

Fig. 3 - Video RF interface through XSV board to monitor

 

 

Our video stream arrives on the XSV board via the 2.4 GHz RF link and the RCA video jack.  This NTSC signal is then digitized by the video decoder into ITU-R 656, YUV 4:2:2 format.  This format has a resolution of 720 x 525 pixels and stores the information in terms of luminance and red and blue chrominance.  The 4:2:2 ratio means that every pixel has its luminance stored, but the red and blue chrominance are only recorded for every other pixel.  This turns out to be advantageous.  Since we do not plan to display a full-screen image, we can simply ignore every other luminance value without losing much information about the picture as a whole.  This will reduce the number of columns to 360.  In order to display an undistorted image, we are then required to drop every other row, giving us 263 rows.  The decoder provides horizontal and vertical synchronization signals and a line-locked clock that allow us to know exactly where we are in the image.

 

The one remaining issue in displaying the image is how the pixel information is stored.  The RAMDAC does not accept YUV signals but will accept RGB, so we need to convert from YUV to RGB.  This is done with the standard YCbCr-to-RGB transformation, where Y, Cb (U), and Cr (V) are 8-bit values; the transform yields 8-bit values for R, G, and B.  Since this transformation will be done on an FPGA, we will approximate the multiplications in fixed point, using 16-bit variables and truncating the extra bits.  In this manner we now have a 360 x 262 RGB image we can send to the RAMDAC and display on the monitor.
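To make the transformation concrete, the sketch below shows one common fixed-point form of the standard ITU-R BT.601 conversion (coefficients scaled by 256), written in C for readability; the exact coefficients and bit widths used in our FPGA logic may end up slightly different.

    /* YCbCr (601) to RGB in fixed point with 8 fractional bits.
     * Coefficients are the widely used 256-scaled BT.601 constants;
     * this C sketch assumes int is at least 32 bits for the intermediates,
     * while the FPGA version will size its intermediates explicitly. */
    static unsigned char clamp(int v)
    {
        if (v < 0)   return 0;
        if (v > 255) return 255;
        return (unsigned char)v;
    }

    void ycbcr_to_rgb(unsigned char y, unsigned char cb, unsigned char cr,
                      unsigned char *r, unsigned char *g, unsigned char *b)
    {
        int c = y  - 16;                  /* remove the 601 luma offset  */
        int d = cb - 128;                 /* center the chroma values    */
        int e = cr - 128;

        *r = clamp((298 * c           + 409 * e + 128) >> 8);
        *g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        *b = clamp((298 * c + 516 * d           + 128) >> 8);
    }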

 

RC Controller

The RC controller will interface with the XSV board through one of the pin banks.  This will give us direct interaction with pins on the FPGA.  The controller is a simple collection of switches: when a contact is made, a circuit is closed and a message is sent to the platform.  The nature of the message is unimportant to us because we do not plan to modify it in any way.  We will drive this controller through a network of transistors.  This will allow the XSV board to control the transistors and make or break the connections on the controller, letting us retain the functionality of the controller while actually controlling the vehicle with the keyboard.

 

The keyboard interfaces to the XSV board through the PS/2 port.  Its output is fed into a shift register and captured every 11 bits; the start, parity, and stop bits are stripped from the signal.  A key press produces the key’s make code (one or two bytes, with extended keys such as the arrows prefixed by E0h), and releasing the key produces the same code preceded by the break prefix F0h.  We will need to parse these packets to determine what the user is typing into the system.
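Although this parsing will be implemented in FPGA logic, the state machine is easiest to show in C.  The sketch below is illustrative: the E0h and F0h prefix values are standard PS/2 codes, while key_event() is a hypothetical hook standing in for the mapping described next.

    /* Parse a stream of PS/2 scancode bytes into press/release events. */
    extern void key_event(unsigned char code, unsigned char extended,
                          unsigned char pressed);   /* hypothetical hook */

    static enum { IDLE, GOT_E0, GOT_F0, GOT_E0F0 } state = IDLE;

    void handle_scancode(unsigned char code)
    {
        switch (state) {
        case IDLE:
            if (code == 0xE0)      state = GOT_E0;        /* extended key follows  */
            else if (code == 0xF0) state = GOT_F0;        /* break code follows    */
            else                   key_event(code, 0, 1); /* ordinary key pressed  */
            break;
        case GOT_E0:
            if (code == 0xF0) state = GOT_E0F0;
            else { key_event(code, 1, 1); state = IDLE; } /* extended key pressed  */
            break;
        case GOT_F0:
            key_event(code, 0, 0); state = IDLE;          /* ordinary key released */
            break;
        case GOT_E0F0:
            key_event(code, 1, 0); state = IDLE;          /* extended key released */
            break;
        }
    }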

 

The FPGA will provide the mapping from key presses into specific gate assignments on the RC controller.  This way we will be able to gain control of the vehicle from the keyboard without needing to learn the interface between the RC controller and the receiver inside the vehicle.

 

 

8051 Cluster

Fig. 4 – 8051 cluster interface on vehicle

 

A cluster of five Atmel AT89C55 microcontrollers controls the RSV.  Two control an array of three rangefinders each.  One controls the accelerometer.  One provides autonomous control.  The final 8051 creates a communication link between the other four and the base station via the virtual wireless development kit.  The communication controller acts as a master and signals the other microcontrollers to begin the data transmission process.

 

Since all three external interrupts on the rangefinder controllers are already used, we had to create a protocol that did not require any interrupts on the part of the device controllers.  The device controllers are configured to take readings twice per second.  These readings will then be collected and sent to the autonomous controller (ACCON) and the virtual wireless development kit (VWDK) via RS-232 and a line powered RS-232 transceiver.

 

The data collection process begins with the communication controller toggling its interface control signal.  Once this line is toggled, the first rangefinder is cleared to send its data.  If the rangefinder is not ready to send, the communication controller waits until it is ready.  Conversely, if the rangefinder is ready to send before the signal toggles, the rangefinder must wait for the signal.  Once it is ready and cleared to send, the rangefinder controller (RCON1) will set its bus ports to output and write the first two bytes of data to the bus.  It will also set the data clock high.  It will then pulse the ready-to-send signal for 10 cycles.  This will trigger an interrupt in the communication controller (CCON), and it will record the value on the bus.  After another 10 cycles have passed, the data clock will be set low, and a new value will be written to the bus after it has been low for 5 cycles.  The data clock will be set high 5 cycles later, and CCON will then record this value.  Since the order in which the device controllers send is fixed, CCON knows how many 16-bit values it will receive.  This sets up an effective transfer rate of 16 bits every 20 cycles.  Once RCON1 is finished sending, it will toggle its interface control signal, transferring control to RCON2.  RCON2 and the accelerometer controller (ACON) will proceed in the same fashion as RCON1, each tristating the bus after it has finished sending, thus avoiding bus contention issues.
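The sketch below shows the sender side of this handshake as it might look in C51.  The port and pin assignments (P0 and P2 as the 16-bit bus, the P1 control lines) are placeholders, and delay_cycles() stands in for the calibrated timing loops the real firmware will use.

    /* Send a block of 16-bit readings over the shared cluster bus using the
     * handshake described above.  All pin assignments are illustrative. */
    #include <reg52.h>

    sbit CLEAR_TO_SEND = P1^0;   /* toggled by CCON (or the previous sender)  */
    sbit DATA_CLK      = P1^1;   /* data-valid clock driven by this sender    */
    sbit READY_TO_SEND = P1^3;   /* pulsed for 10 cycles to interrupt CCON    */
    sbit NEXT_SENDER   = P1^4;   /* toggled to pass bus ownership along       */

    extern void delay_cycles(unsigned char n);   /* calibrated busy-wait */

    void send_readings(const unsigned int *readings, unsigned char count)
    {
        unsigned char i;
        unsigned char start = CLEAR_TO_SEND;

        while (CLEAR_TO_SEND == start)    /* wait until we are cleared to send */
            ;

        P0 = readings[0] & 0xFF;          /* first value onto the 16-bit bus   */
        P2 = readings[0] >> 8;
        DATA_CLK = 1;
        READY_TO_SEND = 1;                /* 10-cycle pulse; CCON's interrupt  */
        delay_cycles(10);                 /* handler records the bus value     */
        READY_TO_SEND = 0;

        for (i = 1; i < count; i++) {
            delay_cycles(10);
            DATA_CLK = 0;                 /* start of the low half-period      */
            delay_cycles(5);
            P0 = readings[i] & 0xFF;      /* next value goes out mid-low       */
            P2 = readings[i] >> 8;
            delay_cycles(5);
            DATA_CLK = 1;                 /* rising edge: CCON records value   */
        }
        delay_cycles(10);
        DATA_CLK = 0;

        P0 = 0xFF;                        /* writing 1s releases the quasi-    */
        P2 = 0xFF;                        /* bidirectional ports (tristate)    */
        NEXT_SENDER = !NEXT_SENDER;       /* pass control to the next sender   */
    }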

 

The ten-cycle hold time was chosen because of the asynchronous interface between the microcontrollers.  All the 8051s use a 32 MHz clock, but they will not be perfectly in phase or have exactly the same period.  Thus, in order to ensure a signal is seen, we must hold it for longer than one cycle.  We chose ten because it is sufficiently large to ensure data capture and we are dealing with a relatively low-bitrate stream (14 total readings, each 16 bits, yields 28 bytes every half second).

 

Once ACON has finished, all readings have been recorded into CCON and are ready to be transmitted.  As long as the RS-232 RxD line is not busy, CCON will send its 28-byte packet immediately.  This packet will go to the base station via the VWDK and to ACCON.  Since the base station will also be communicating with the RSV, some packet collision and loss is inevitable.  However, the two sides do not transmit at the same frequency – twice per second for the device readings and 2.86 times per second for the base station packets – so collisions should occur less than once every 7 seconds, and this amount of loss is tolerable because there will be no loss on the next transmission.  Also, the only information sent from the base station is the choice of autonomous or interactive control.  Since it will be sent again .35 seconds later, .7 seconds is the maximum amount of time the vehicle would take to switch modes.  Conversely, the device readings are sent every .5 seconds, so the maximum delay between device readings is 1 second.  This is tolerable because the sensor data transmitted to the base station is only displayed for the user’s benefit, not used for control; the onboard autonomous controller provides collision avoidance.

Parts:

Hardware

 

Qty 1, Horizon Hobbies – M1A1 Abrams Tank
    This is the radio-controlled tank on which we build the RSV.  It operates at 27.145 MHz.

Qty 1, Trendmasters – Digital Video Camera
    This module provides our video feed.  It captures digital images, converts them to NTSC (standard TV) format and broadcasts the signal to a nearby base station.  It operates in the 2.4 GHz range.

Qty 6, Acroname R14-SONAR1 – Polaroid Instrument Grade Sonar Ranging Package
    The sonar range finders provide us with the ability to detect a possible collision with a physical object.

Qty 1, Analog Devices ADXL210EB – Dual Axis Accelerometer Evaluation Board
    The accelerometer board gives us information about the platform’s movement, from which we can derive its velocity and the amount of tilt the vehicle is experiencing.

Qty 1, RF Monolithics DR-1004DK – Virtual Wireless Development Kit
    This kit gives us the ability to send digital signals to and from the platform in order to provide control and receive telemetry data.  This link is in the 900 MHz range.

Qty 1, Xess XSV300 – Virtex Prototyping Board
    This FPGA prototyping board provides us with many interfaces and is required for the integration of all the components of the RSV.

Qty 5, Atmel AT89C55-33PC – 8051 DIP Microcontroller
    The Atmel 8051 is used to control the various sensors on the RSV.

Qty 5, Fox Electronics F1100E-320 – 32.0 MHz clock oscillator
    This oscillator is used to drive the Atmel 89C55 microcontroller.

Qty 1, Dallas Semiconductor DS275 – Line-Powered RS-232 Transceiver Chip
    This chip allows us to interface the Atmel AT89C55 to the serial port on the virtual wireless development kit.

Qty 6, Fairchild Semiconductor 2N3906 – PNP General Purpose Amplifier
    We will use these transistors to control the switching on the RC controller for the tank.

Qty 1, Texas Instruments SN74LS32 – Quad 2-input OR gate
    This TTL device is used in the 8051 cluster interface.

Qty 2, Texas Instruments SN74LS00 – Quad 2-input NAND gate
    This TTL device will provide inverters for use with the rangefinder and RS-232 interfaces to the Atmel AT89C55s.

Qty 1, VGA Monitor
    The monitor will display the video output and current status of the RSV.

Qty 1, PS/2 Keyboard
    The keyboard is used to interface with the RSV.

Qty 1, Onboard power supply
    We will need a 5V power supply to provide power to the accelerometer, range finders and onboard 8051s.

 

Software

Keil C51 Compiler
    Compiles C programs into hex code for the 8051 microcontroller.

Xilinx Foundation Series
    Creates FPGA configurations from schematics and Verilog modules.


 

Analysis:

Although the system is not fully defined at this time, it is feasible. In this section we will focus on each of the major components in turn, starting with the simplest.

 

Basic motor functions – This came with the basic platform. The basic movement controls of the platform already exist via the RF link. Furthermore, the control signals that operate the motors attached to the treads are simple voltage levels. These are the signals that the autonomous controller will need to learn in order to control the vehicle without user input.

 

Full motion video – The full motion video is a nearly complete solution. The video camera and speaker that will be mounted on the vehicle can capture and encode the data that we need in NTSC format. This data is then transmitted to the base station via the included RF link, where it is processed on the XSV300 board. This processing will be the difficult part. The NTSC data will be converted to a digital format (YUV) that in turn will be translated into RGB. This will be done in one of two ways: we can build the logic ourselves or use the RAMDAC to convert it for us.

 

Virtual Wireless link – The virtual wireless board can basically be treated like a half-duplex RS232 port. Control signals and telemetry data share this link, but this does not really matter because the amount of data being transmitted in either direction is minimal. The amount of data that we expect to send is significantly less than the amount the link is capable of handling.

 

Base Station FPGA – Although this board represents the heart of the base station, there is plenty of space on it to accommodate all of the functionality that we will need. Much of the basic layout of the screen can be accomplished using sprites and/or seven-segment displays, and the user status portion of the board will simply be a set of registers recording state. Identifying keyboard strokes can be done fairly easily. Since we don't anticipate using over a dozen keys, a simple set of AND gates may be sufficient to recognize patterns. Alternatively, we could use some of the keyboard recognition logic that we developed in CSE 467 last quarter.  The biggest challenge will be controlling the video feed and processing it the way that we want. This may be possible in real time; if it isn't, we can choose to use a double-buffering system using the RAM available to the FPGA on the Xilinx board.

 

 

Rangefinder - There are several main issues that came up with the rangefinders.  The first was whether multiple rangefinders firing at the same time could cause erroneous values (from returns generated by the other rangefinders).  We decided that there could potentially be a problem here, so by design we will not allow any of the front three rangefinders to fire at the same time.  The same constraint will be applied to the rear rangefinders as well.

 

A second issue is the power consumption of the rangefinders.  The data sheet states that when you fire the sonar, it can draw a maximum of 1 A.  This would be a problem because 6 * 1 A = 6 A, and that value does not include any of the other components on the RC tank, such as the accelerometer or the 8051 microcontrollers.  However, when we tested a rangefinder firing once every 0.8-0.9 seconds, only 40 mA of current was drawn.  With six rangefinders, we get 40 mA * 6 = 240 mA, which is manageable.

 

Another issue was whether we could switch between the rangefinders fast enough.  We wanted to get data from each rangefinder every half second.  With the speed of sound equal to approximately 300 m/s and the range of the rangefinder equal to 11 m (a total round-trip distance of 22 m), we decided the maximum number of times per second we could fire a rangefinder and switch to another is 10.  Because of this, we changed our design to have only three rangefinders per 8051 (as opposed to 6).  Three rangefinders, each firing two times a second, give us a total of 6 firings per second, which is a good distance below our upper bound of 10.  This also allows us to increase the number of checks per second to three if we feel we need that degree of accuracy.  In addition, this allows us to use the three external interrupts on the Atmel 89C55 (Timer 2, INT0, and INT1).

 

The last issue we saw relates to the first problem: erroneous returns.  If there are two objects, one closer than the other, the 8051 could switch rangefinders upon reception of the first return, causing a possible false return for the next rangefinder (the echo from the farther object).  This is not as big a problem because we are firing three rangefinders twice a second, which gives us time to add a delay between switching rangefinders so we can ensure that we will not have any problems with false returns.  A simple counter will be sufficient to implement this delay.

 

Accelerometer – In order to ensure we are getting a precise enough reading to give an accurate calculation of the angle the vehicle is currently at, we will use a 16-bit counter in the controller.  The accelerometer is set to the slowest setting, 10 Hz, giving us the smallest root-mean-square noise, .0023 g.  At this setting we will only transmit once every 5 readings.  The device has a sensitivity of ±2 g for a single reading.  Thus, if we average four of the five values using shifts and adds (while paying careful attention to overflow), we can get a more accurate reading on average.  Since this device gives its error in terms of a multiple of g, we cannot be certain the angle calculated is correct, but by averaging we will be able to gain a better feel for the contour of the ground on which the vehicle is traveling.
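The averaging step itself is cheap on the 8051; a sketch of it is shown below, using a wider accumulator to avoid overflow and a shift in place of a divide.

    /* Average four 16-bit counter readings with shifts and adds. */
    unsigned int average4(const unsigned int r[4])
    {
        unsigned long sum = 0;      /* 32-bit accumulator: 4 x 16-bit cannot overflow */
        unsigned char i;

        for (i = 0; i < 4; i++)
            sum += r[i];
        return (unsigned int)(sum >> 2);   /* divide by 4 with a right shift */
    }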

 

 

Autonomous Control – Autonomous control of the vehicle brings up several issues, for example, what happens when the vehicle goes out of range of the base station.  If the vehicle is in interactive mode, it will simply stop, because it will no longer receive the signal from the controller.

 

If the vehicle is in autonomous mode, it will detect it is no longer in range because it won’t receive the periodic messages from the base station.  In this case, we will reverse direction without turning until we find the signal again, then turn in a different direction and continue exploring.  If it is unable to return the way it came, for example a new obstacle was placed behind it, it will simply stop and wait to be placed back in range of the base station.  This method is used because the vehicle could get into an area it could not get out of, such as traversing over a ledge.  If the vehicle were to continually explore, it could end up traveling far away and we would have no record of the direction in which it went.

 

 


Testing:

Testing on this product will be done in stages. The first set of tests is currently being conducted to determine the basic interfaces to the sensors that we are using. These sensors include the sonar range finders, the accelerometer and the video camera itself.

 

Rangefinder

Testing the rangefinder will also have several stages.  At first, I plan on testing one rangefinder connected to the 8051.  This will allow me to determine if my code for configuring the 8051 is correct.  I will be using Hyperterminal and the serial port to communicate with the 8051 using the transmit and receive pins.  This will allow me to retrieve any relevant data from the rangefinder, such as the distance to the target.  For example, if I get 161 as a value from the rangefinder, I would expect that if I moved my hand further away, I would get an even larger value than 161.  By physically measuring the distance and performing some calculations, I can determine whether the values increase at the rate I expect (they should increase linearly).  I will also be using Hyperterminal to test my ability to turn the sonar on and off (you do not want all three of the frontal rangefinders firing at once) by embedding a case statement in the sonar program that will allow me to switch between two states upon a given key input.  It is a trivial matter to determine if a rangefinder is on: I can measure the voltage on the INIT pin (5V if it is on), or I can just listen to the rangefinder for a “clicking” sound.  Likewise, if the rangefinder is silent and the INIT pin is low (0V), the rangefinder is off.
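A sketch of that test hook is shown below: a character received from Hyperterminal over the serial port toggles an enable flag for the sonar (the choice of the '1' and '0' keys is arbitrary).

    /* Serial receive interrupt: keys from Hyperterminal switch the sonar
     * between its two states.  The flag is checked by the firing loop. */
    #include <reg52.h>

    static volatile unsigned char sonar_enabled = 0;

    void serial_isr(void) interrupt 4
    {
        if (RI) {
            unsigned char c = SBUF;
            RI = 0;
            switch (c) {
            case '1': sonar_enabled = 1; break;   /* start firing this sonar */
            case '0': sonar_enabled = 0; break;   /* stop firing this sonar  */
            default:  break;                      /* ignore other keys       */
            }
        }
    }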

 

Once this stage of testing the rangefinder is complete, I will integrate the additional two rangefinders (to a total of three rangefinders).  Although the additional rangefinders will be using different interrupts, the setup for them is practically the same as the timer 2 external interrupt, making this part of the integration simple.  The case statement in the code will be extended to allow me to switch any one of the three rangefinders on or off (using a total of six different keys).   The same method as above will be used to determine the state of each individual rangefinder.  I will also be able to use Hyperterminal to determine if I am getting the correct values from each of the rangefinders (for distance). 

 

The third stage of testing the sonar is to automatically switch from one rangefinder to the next.  By using an oscilloscope, I will be able to monitor the response of the INIT line on all three rangefinders, and determine the order they are firing in.  This portion of testing will most likely not be as difficult as the previous two sections.

 

Lastly, I will modify the sonar code to place the data on pins in port 0 according to the specifications for the microcontroller integration listed above in the design section.  I will hook up a logic analyzer to port 0 and make sure I am getting the correct data for all three rangefinders, in the correct order.  To make sure that I am getting consistent data, I am going to place cardboard obstacles at a fixed distance from each rangefinder.  If I get different results (differing by more than a few inches), I will know that the data has been placed on the pins incorrectly, because the first two stages verified the correctness of the range.

 

Accelerometer

 

The accelerometer testing will be a bit simpler than the sonar testing, due to the precalculated values we have from the accelerometer data sheet.  We plan on mounting the accelerometer onto a breadboard and fixing the breadboard to some flat platform (a piece of plywood or thick cardboard).  By using an oscilloscope, we can measure the high time (T1) and the cycle time (T2).  Once we have those values, we can compare them with the values given on the accelerometer data sheet.

 

Once we finish that stage of testing, we will integrate the accelerometer with an 8051 and convert the T1 and T2 values into an 8-bit integer value.  We can test this by connecting a logic analyzer to the 8051 and examining the outputs.

 

Video RF

This stage has already been verified.  Since the video RF was pre-made, our testing consisted of hooking it up to a TV and seeing if we could get a picture.  Testing for video RF range will be done in conjunction with the testing of the RC tank range.  We plan on mounting the camera to the RC tank and driving it a given distance away.  This will give us an idea of how accurate the specified range of 50 feet is.

 

RF Link

We have already completed some testing of the RF link.  Currently, we have tested how the RF link will handle overflow and/or collisions.  This was tested using the DOS program that came with the Virtual Wire Development kit.  We found out that the RF link will lose data in the case of a collision, but with our system configuration, this should not be a big issue.

 

Control for RC car

The control for the RC tank is largely going to be tested through trial and error.  The first step is to connect the RC controller to the XSV and use the push buttons on the XSV board for control.  The next step will be to map the keyboard keys (up arrow moves the tank forward, left arrow turns the tank to the left, etc.).  We have pre-existing code for this section that has already been confirmed to work, although some minor changes may be necessary.  We will need to test that only one direction change at a time can be made (for example, holding the up and down arrows at the same time should not cause any damage to the tank).

 

Autonomous Control

There are three main stages in testing for autonomous control.  At first, we plan on having the tank perform a pre-planned sequence of moves.  This will allow us to test that the 8051 microcontroller can truly control the tank.  Since we know the sequence of moves we programmed onto the 8051, it will be easy to determine if it is following those directions.

 

The second stage is to test the rangefinders in conjunction with the autonomous control.  If the tank gets too close to a wall or object, it should stop and turn to a different direction before continuing.  This can easily be tested by placing the tank in a confined area such as a cubicle.

 

The last stage is total autonomous control.  The best way to test this would be to place it in a relatively small area, and watch to see if it can avoid hitting the walls.


Design Issues:

There are several design problems that we are facing with the project. We will address each in its major area of involvement.

 

Video

Although most of the solution is already contained in the ready-made camera and we do have all the connections that we need to complete this portion of the project, we still need to determine exactly how we will process the video data that is coming into the board and how we will display it. We may be able to process the video in real time, or, alternatively, set up a double-buffering system in the RAM that is available to the FPGA just off chip. Either way, there are still a number of design issues to be overcome in this area.

 

Virtual Wireless Link

This area is fairly well covered. The protocol boards allow us to treat the Virtual Wireless data link as a half-duplex RS232 serial port connection. However, we do anticipate problems with the interface on both ends, problems that have yet to be identified and resolved.

 

Base Station

Besides the video and the data link, there are a number of other inputs and outputs that we will need to work with. Although we are confident that all of these modules will fit on the board, we are unsure exactly how they will be able to interact. This is especially true of the video solution and its output to the screen. Most of the other functionality should be fairly simple.

Onboard Functionality

Although we have determined how the 8051 controllers will interact with each other, we anticipate that there may be timing issues in the actual operation of the machine that will need to be dealt with. Also, at this stage of the design we don't know if we will have to define additional protocols for sending the data, especially from the accelerometer. Although our solution looks good on paper and seems pretty solid, there may be holes that we haven't yet discovered.

Technical References:

XSV Board manual version 1.0, a local copy can be found at http://www.cs.washington.edu/education/courses/cse477/CurrentQtr/admin/Handouts/xsv-300/xsv-manual-1_0.pdf

 

XL210 evaluation board manual (for the accelerometer) http://www.analog.com/techsupt/eb/xl210eb.pdf

 

XL202 accelerometer data sheet

http://www.analog.com/pdf/ADXL202_10_b.pdf

 

Virtual Development Kit Manual, DR 1004-DK and DR 1005-DK

 

Atmel AT89 Series Hardware Description Manual

Atmel 89C55 Manual

 

Fairchild Semiconductor 2N3906 PNP General Purpose Amplifier Data Sheet

 

Dallas Semiconductor DS275 Line-Powered RS-232 Transceiver Chip Manual

 

Analog Devices ADXL202 Dual Axis Accelerometer with Digital Output Data Sheet