Aries - Missile Defence System
1. Introduction
Conventional target tracking systems that deploy either radar or a camera contain inherent drawbacks. Radars, for example, are good at tracking distant objects, give a precise distance measurement and can adopt complex algorithms to trace objects. But because a radar is an active sensor it is easily detected by foreign entities, and poor target recognition fidelity often causes it to trigger false alarms. Image sensors such as infrared and visible light CCD cameras cannot track distant targets as readily as radars, but they excel at image detection, operate covertly and provide accurate azimuth and elevation angles for the target. The proposed system fuses the two technologies, combining their advantages and eliminating their drawbacks.
1.2 Scope of the System
The system deploys an image sensor and a radar. The radar is good at detecting remote objects but not close-range objects; the camera, on the other hand, handles close-range objects well. The operations of both sensors are fused to give a far more effective tracking system. The scope of the system is as follows:
a. Detects both short range and long range objects.
b. Provides object orientation details.
c. Continuously tracks the path of the target.
d. Switches between the radar and the image sensor for more covert operation.
1.3 Applications of the System
The system could
be implemented in missile defence systems and any related security discipline.
2. Background
Radars and cameras are widely used in modern surveillance systems. Complex computations can be performed in software to make these systems yield more information than was previously possible. Yet regardless of how effective such systems become with the aid of software and other enhancements, they still contain inherent drawbacks that cannot be eliminated. For example, the radar uses an active sensor that can be detected by foreign entities. This project ventures to combine the two technologies in order to eliminate these drawbacks.
This report discusses the fusion of a radar and an image sensor for surveillance purposes. The underlying principle could differ based on the application of the system. The implementations of the radar and the image sensor are discussed separately.
3. Project Description
3.1 Solution Outline
The solution comprises:
a. Radar Subsystem
b. Image Sensor Subsystem
c. Software
3.2 Solution Description
The software, named Aries, will be used to fuse the operation of the radar and image sensor subsystems. It comprises three components: Data Measurement, Fusion Measurement and Decision Making.
The radar subsystem will be used as the primary device to track and trace distant targets. From the data obtained, the direction (azimuth angle) and distance of the object will be calculated. This information will then be sent to the Data Measurement component of the software. The Fusion Measurement component decides whether the object is within the range of the camera; if so, the Decision Making component deactivates the radar and feeds the image sensor subsystem with the distance and direction measurements. Once the object is beyond the range of the camera the radar is reactivated.
Shown below is the operation spectrum of the system.
a. At b-c the operations of the radar and camera fuse.
b. At a-b only the camera will be functional.
c. At c-d only the radar will be functional.
4. Implementation of Radar Subsystem
4.1 Function of the Radar
Radar (Radio Detection and Ranging) radiates electromagnetic energy to identify foreign objects. The electromagnetic waves are reflected if they meet an electrically conductive surface. The radar is in receive mode between transmitted pulses. The presence of a target within the propagation range is established when the reflected waves are received back at the point of origin.
4.2 Determining Distance
The actual range of the target from the radar is known as the slant range. Since the wave travels to the target and back, the slant range is obtained by multiplying half the round-trip time by the speed of light. The equation is shown below:
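With c the speed of light and t the measured round-trip time, the standard form is

R = \frac{c\,t}{2}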
4.3 Determining Direction
By measuring the direction in which the antenna is pointing when the signal is received, both the azimuth angle and the elevation of the object can be determined. The angular determination of the target depends on the directivity of the antenna, that is, the ability of the antenna to concentrate the transmitted energy in a particular direction. The diagram below shows the top view of the radar. The azimuth angle is measured with respect to north.
4.4 Determining Elevation
The radar will hold the elevation angle constant while varying the azimuth angle. The returns can then be mapped onto the horizontal plane. As shown in the diagram below, the elevation angle obtained by the radar is only a close estimate. The image sensor subsystem will therefore be used to obtain a more accurate estimate of the elevation of the object.
4.5 The Plan Position Indicator
The results will be viewed using a Plan Position Indicator (PPI), the most common type of radar display. The radar is positioned at the centre of the display, and as the radar rotates a beam sweeps the PPI. The distance of the object from the radar is indicated by concentric range rings. In the diagram shown below, the dotted line is the beam that sweeps the display and the dark spot is an identified object.
4.6 Transmitting and Receiving Operation of the Radar
The radar switches between transmit and receive mode at a predetermined rate using a device called a duplexer. If a target is within the propagation range, the transmitted wave will be reflected and received while the radar is in receive mode. The lapse between two transmitted pulses is known as the Pulse Repetition Time (PRT), during which the reflected wave must be received in order for the radar to trace the target. Its reciprocal is known as the Pulse Repetition Frequency (PRF).
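Expressed as a formula,

\mathrm{PRF} = \frac{1}{\mathrm{PRT}}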
This method imposes restrictions on the maximum and minimum range of the radar. During the transmit time the radar cannot receive, because the receiver is switched off by the duplexer, and vice versa. The minimum range is governed by the pulse width multiplied by the speed of light, so measuring a closer range requires a shorter pulse. But this affects the maximum range, since shorter pulses carry less energy, making reflected waves from distant targets untraceable. To measure a longer range, a longer pulse with a longer PRT must be used, which is at odds with the minimum range of the radar. As far as the implemented system is concerned, the primary role of the radar is to track distant targets; as the target closes in, the operation is transferred to the camera. A longer pulse with a longer PRT will therefore be used.
4.7 Minimum Range of the Radar
The minimum measuring distance Rmin is the shortest distance at which a target can be detected. For a target to be detected, the pulse must have completely left the radar and the duplexer must have switched on the receiving unit. The shortest time after which an echo can be recovered by the radar therefore decides the minimum range, as shown in the following equation:
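With c the speed of light, P_{wd} the transmitted pulse width and t_{rec} the recovery time of the duplexer (the time taken to switch the receiver back on, often small enough to neglect), the usual form is

R_{min} = \frac{c\,(P_{wd} + t_{rec})}{2}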
According to the above equation, a longer pulse width (Pwd) results in a larger minimum range.
4.8 Maximum Range of the Radar
The Pulse Repetition Frequency (PRF) determines the maximum range of the radar. Unlike the minimum range, two phenomena need to be addressed when considering the maximum range. The returning echo signal may fall into either of the following:
· the next transmit time, when the radar's receive mode is switched off, or
· the next receive time, which may lead to erroneous results.
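With P_{wd} the pulse width, the usual form is

R_{max} = \frac{c\,(\mathrm{PRT} - P_{wd})}{2} \approx \frac{c}{2\,\mathrm{PRF}}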
As shown in the equation above, the PRT is crucial because target returns received after the PRT expires are either not detected or are shown at incorrect locations on the radar screen. Such results are known as ambiguous returns. The image sensor will therefore be used to confirm the presence of a target at a given location.
5. Implementation of Image Sensor Subsystem
5.1 Function of the Image Sensor
The function of the image sensor is to perform target detection and obtain the elevation angle of the object. In addition, the image sensor obtains distance and direction measurements of the target, which are fused with the measurements obtained by the radar to give a more accurate estimate. Motion tracking software will be used to track the motion of the target and adjust the coordinates of the image sensor accordingly.
5.2 Determining Target Elevation
When the software decides to activate the image sensor, it provides the image sensor subsystem with the slant range and direction measurements obtained from the radar. Once the object is located, the image sensor subsystem adjusts the camera's focus on the object, based on the slant range measurement, according to the following formula,
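Assuming the standard thin-lens relation,

\frac{1}{f} = \frac{1}{u} + \frac{1}{v}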
where f is the focal length of the camera, u is the object distance and v is the image distance.
The diagram below shows the coordinate system of the camera: the Y axis lies in the vertical plane, the Z axis is the optical axis, and the X axis is perpendicular to the YZ plane.
Once the camera
locks on to the target an accurate measure for the elevation of the object can
be obtained.
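Under a pinhole-camera assumption, the elevation angle can be estimated from the target's vertical offset y on the image plane and the focal length f:

\varepsilon = \arctan\left(\frac{y}{f}\right)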
5.3 Target Detection
Before the camera locks on to a target it first needs to detect the target within a given frame. In places where the image background is simple, such as the sky or the sea, the peak method can be used to detect the object. As shown in the figure above, the point with the maximum gray-scale value can be identified as the object.
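A minimal sketch of the peak method, assuming the frame is available as a NumPy grayscale array (the function and variable names are illustrative):

```python
import numpy as np

def peak_detect(gray_frame):
    """Return the (row, column) of the brightest pixel; against a plain
    sky or sea background this peak is taken as the target position."""
    return np.unravel_index(np.argmax(gray_frame), gray_frame.shape)
```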
When the background is complicated, such as the ground, the object needs to be extracted from the background. For this a frame subtraction method will be deployed. In this method the image of the object is generated by first subtracting an image frame taken at an earlier time from the current image, and then subtracting an image frame taken at a different earlier time from the current image. This subtraction procedure removes the background clutter, since the backgrounds of all three images are nearly the same. The two resulting difference images are then logically ANDed in order to detect the current position of the object of interest.
The diagram above demonstrates the subtraction procedure, and the results obtained are shown in the diagram at the bottom. The procedure continues for successive frames in order to keep tracking the object of interest.
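A minimal sketch of the frame subtraction procedure, assuming NumPy grayscale frames; the threshold and the spacing between frames are illustrative and would need tuning against real imagery:

```python
import numpy as np

def detect_by_subtraction(current, prev1, prev2, threshold=30):
    """Difference the current frame against two earlier frames, threshold the
    results, and logically AND them so that only the pixels occupied by the
    target in the current frame survive the background removal."""
    diff1 = np.abs(current.astype(np.int16) - prev1.astype(np.int16)) > threshold
    diff2 = np.abs(current.astype(np.int16) - prev2.astype(np.int16)) > threshold
    return np.logical_and(diff1, diff2)  # boolean mask of the target's current position
```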
6. Implementing Software
6.1 Purpose of Software
Software is required to interpret the raw data obtained by the radar and image sensor subsystems. The software required by the system can be broken down into three main categories:
· Radar Subsystem Software
· Image Sensor Subsystem Software
· Aries Software
6.2 Radar Subsystem Software
The radar subsystem software is required to calculate the azimuth angle (direction) and the range distance of the object based on the returning pulse. A study needs to be made to determine the maximum range of the radar in accordance with the radar equations given above. Once a target is detected, its range distance and direction have to be plotted on the plan position indicator.
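As a small sketch of the plotting step, a (range, azimuth) return can be converted to x/y display coordinates with the radar at the origin, assuming azimuth is measured clockwise from north as in Section 4.3:

```python
import math

def ppi_coordinates(range_m, azimuth_deg):
    """Convert a (range, azimuth) return to x/y display coordinates,
    with the radar at the origin and azimuth measured clockwise from north."""
    az = math.radians(azimuth_deg)
    return range_m * math.sin(az), range_m * math.cos(az)
```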
6.3 Aries Software
The Data Measurement component of the Aries software obtains the direction and range distance of the target from the radar subsystem. Based on this information, the Fusion Measurement component decides whether the object is within the range of the image sensor. If so, the image sensor is activated and the radar is deactivated by the Decision Making component. Once the image sensor is in operation the software obtains its data from the image sensor subsystem. If the target moves beyond the range of the camera, the radar subsystem will be reactivated.
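A minimal sketch of that handover logic; the sensor objects, their methods and the camera range threshold are illustrative assumptions rather than part of the design above:

```python
CAMERA_MAX_RANGE_M = 2000.0  # assumed handover range; the real value needs field trials

def fusion_step(radar, camera, display):
    """One cycle of the fusion loop: read the active sensor (Data Measurement),
    check whether the target is inside the camera's range (Fusion Measurement),
    and hand control over if it has crossed the boundary (Decision Making)."""
    # Data Measurement: range and azimuth from whichever sensor owns the track
    rng, azimuth = radar.read() if radar.active else camera.read()

    # Fusion Measurement: is the target within the camera's working range?
    in_camera_range = rng <= CAMERA_MAX_RANGE_M

    # Decision Making: switch sensors when the target crosses the boundary
    if in_camera_range and radar.active:
        camera.activate(slant_range=rng, azimuth=azimuth)
        radar.deactivate()
    elif not in_camera_range and camera.active:
        radar.activate()
        camera.deactivate()

    display.plot(rng, azimuth)
```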
The Aries software switches the operation mode between the radar and the image sensor subsystem. Depending on the application of the system, however, the software could be programmed to operate both subsystems in the fusion region of the operation spectrum, giving a more accurate estimate of the position of the object.
6.4 Image Sensor Subsystem Software
Once the operation is transferred to the image sensor, the software first has to detect the target and then continue tracking it. It will also calculate the direction, distance and elevation of the object.
Motion tracker software will track the motion of the target so that the image sensor can change its coordinates according to the motion of the object.
The diagram below shows how the three software systems will work together.
7. Further Enhancements
· Predicting Object Future Path – With the necessary software the system could predict the future path of a given target.
· Target Recognition – Image recognition software could be integrated into the system to identify objects without human intervention.
8. Limitations
· This report discusses the fusion of radar and image sensor technologies for surveillance purposes. The underlying principle of how the two technologies can be fused is given, but it is not discussed in an application-specific manner. For example, if the system were implemented for missile defence purposes, additional hardware and software components would need to be integrated with it.
· Precise values for the maximum and minimum range of the radar cannot be given. Practical observations need to be made before these values can be derived.
· Only a simplified operation spectrum of the system is given in this report. Precise values for the operating ranges of the radar and camera cannot be given since that would require practical observation.
9. Summary
Radars and image sensors are widely used in modern surveillance systems. Although many developments have been made, these systems still contain inherent drawbacks. A more effective surveillance system can be produced if the two technologies are integrated, combining their advantages and eliminating their drawbacks.
References
1. Zhiqiang Hou, "Target Tracking Systems", research thesis, Institute of Automation, School of Electronics and Information Engineering, Xi'an Jiaotong University, Xi'an, China, 2003.
2. Clark F. Olson and Daniel P. Huttenlocher, "Automatic Target Recognition by Matching Oriented Edge Pixels", IEEE Transactions on Image Processing, vol. 6, 1997.
3. Nam D. Banh, "Moving Target Detection Method Using Two-Frame Subtraction", U.S. Patent 5,150,426, September 22, 1992.
4. David Tweed and Andrew Calway, "Tracking Many Objects Using Subordinated Condensation", M.S. thesis, Computer Science Department, Bristol University, U.K., 2005.
5. "Air Surveillance Radar", globalsecurity.org, 3 July 2008. [Online]. Available: http://www.globalsecurity.org/military/systems/aircraft/systems/air-surveillance-radars.htm [Accessed 2 August 2009].
6. "Radar Tutorial", radartutorial.eu, 3 May 2005. [Online]. Available: http://www.radartutorial.eu/04.history/hi04.en.html [Accessed 25 August 2009].