VISVESVARAYA TECHNOLOGICAL UNIVERSITY
JNANA SANGAMA, BELAGAVI – 590014
A Technical Seminar Report on
“ARTIFICIAL INTELLIGENCE FOR SATELLITE
COMMUNICATION”
Submitted in partial fulfilment of the requirements for the award of the degree of
BACHELOR OF ENGINEERING
IN
COMPUTER SCIENCE AND ENGINEERING
Submitted by
Manasi Shahapurkar (2AG19CS028)
Under the Guidance of
Prof. Priyanka Pujari
Assistant Professor, Dept. of CSE,
AITM, Belagavi
ANGADI INSTITUTE OF TECHNOLOGY & MANAGEMENT
BELAGAVI-590009
2022-2023
ANGADI INSTITUTE OF TECHNOLOGY & MANAGEMENT
BELAGAVI -590009
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING
Certificate
This is to certify that the Technical Seminar work entitled “Artificial Intelligence for satellite
communication” is work carried out by Manasi Shahapurkar (2AG19CS028) in partial fulfillment of
the requirements for the award of the degree of Bachelor of Engineering in Computer Science &
Engineering under Visvesvaraya Technological University, Belagavi during the year 2022-2023. It is certified
that all the corrections/suggestions indicated for internal and external assessment have been
incorporated in the report. The Final Year Technical Seminar report has been approved as it satisfies
the academic requirements in respect of the final year seminar work prescribed for the Bachelor of
Engineering degree.
Signature of the Guide: Prof. Priyanka Pujari, Assistant Professor, Dept. of CSE, AITM
Signature of the Seminar Coordinator: Prof. Vilas M Jarali, Assistant Professor, Dept. of CSE, AITM
Signature of the Head of Department: Prof. Dhanashree Kulkarni, Professor and Head, Dept. of CSE, AITM
Signature of the Principal: Dr. Anand Deshpande, Principal and Director, AITM, Belagavi
Name of the Examiner: Signature with date:
1. …………………………………........... ………………………………….
2. …………………………………………… ………………...................
DECLARATION
I, Manasi Shahapurkar (2AG19CS028), studying in the final semester of Bachelor of
Engineering in Computer Science and Engineering at Angadi Institute of Technology and
Management, Belagavi, hereby declare that this Technical Seminar work entitled “Artificial
Intelligence for satellite communication” which is being submitted by me in the partial
fulfillment for the award of the degree of Bachelor of Engineering in Computer Science and
Engineering from Visvesvaraya Technological University, Belagavi is an authentic record of
work carried out by me during the academic year 2022-2023 under the guidance of Prof. Priyanka
Pujari, Department of Computer Science and Engineering, Angadi Institute of Technology
and Management, Belagavi.
I further undertake that the matter embodied in this report has not been submitted
previously by me for the award of any degree or diploma to any other university or institution.
Place: Belagavi Manasi Shahapurkar
2AG19CS028
Date: 18/4/2023
ACKNOWLEDGEMENT
It is my proud privilege and duty to acknowledge the kind of help and guidance
received from several people in preparation of this report. It would not have been possible to
prepare this Seminar report in this form without their valuable help, cooperation and guidance.
First and foremost, I wish to record my sincere gratitude to Management of Angadi
Institute of Technology and Management, Belagavi and to our beloved Principal Dr. Anand
Deshpande, Angadi Institute of Technology and Management, Belagavi for his constant
support and encouragement in preparation of this report and for making available library and
laboratory facilities needed to prepare this report.
My sincere thanks to our HOD Prof. Dhanashree Kulkarni, Department of Computer
Science and Engineering, AITM, Belagavi for her valuable suggestions and guidance
throughout the period of this report.
I express my sincere gratitude to our guide Prof. Priyanka Pujari, Department of
Computer Science and Engineering, AITM, Belagavi for guiding us in investigations for this
seminar and in carrying out experimental work. My numerous discussions with her were
extremely helpful. I hold her in esteem for guidance, encouragement and inspiration received
from her.
The Technical Seminar report on “Artificial Intelligence for satellite
communication” was very helpful to me in giving the necessary background information and
inspiration in choosing this topic for the Seminar.
My sincere thanks to Prof. Vilas M Jarali, Seminar Coordinator for having
supported the work related to this technical Seminar. His contributions and technical support in
preparing this seminar report are greatly acknowledged.
Last but not the least, I wish to thank my parents for financing my studies in this
college as well as for constantly encouraging me to learn engineering. Their personal sacrifice
in providing this opportunity to learn engineering is gratefully acknowledged.
Manasi Shahapurkar
2AG19CS028
TABLE OF CONTENT
TITLE Page no.
Acknowledgement i
Abstract ii
Chapter 1 Introduction 1
1.1 The Early Days of AI for Satellite Communication 1
1.2 AI for Satellite Communication Today 2
Chapter 2 Literature survey 4
Chapter 3 Satellite communication 5
3.1 What is satellite communication 5
3.2 Satellite communication block diagram 5
3.3 Need for satellite communication 6
3.4 How satellite communication work 6
3.5 Satellite communication services 7
3.6 Advantages of satellite communication 8
3.7 Disadvantages of satellite communication 9
3.8 Applications of satellite communication 10
Chapter 4 Architecture of AI for satellite communication 11
Chapter 5 Applications of AI for satellite communication 12
5.1 Applications of AI for satellite communication 12
5.2 Beam hopping 13
5.3 Anti-jamming 14
5.4 Network traffic forecasting 16
5.5 Channel modeling 18
5.6 Telemetry mining 19
5.7 Ionospheric scintillation detecting 20
5.8 Interference managing 22
5.9 Remote sensing 23
5.10 Behavior modeling 24
5.11 Space-air-ground-integrating 25
5.12 Energy managing 27
5.13 Other Applications 30
Chapter 6 Various AI algorithms 31
6.1 Various AI algorithms with their respective
Satellite communication applications 32
6.2 Abbreviations and full names. 34
Opportunities for future research 35
Conclusion 36
References 37
SURESH ANGADI EDUCATION FOUNDATION’S
ANGADI INSTITUTE OF TECHNOLOGY AND MANAGEMENT
Savagaon Road, BELAGAVI – 590 009.
(Approved by AICTE, New Delhi & Affiliated to Visvesvaraya Technological University, Belagavi)
(Accredited by NAAC)
DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING
Vision of the Department
To impart quality education in computer science and engineering with emphasis on professional
skills to meet the global challenges in IT paradigm.
Mission of the Department
M1: Impart knowledge in current and emerging computing technologies by adopting various
pedagogical approaches.
M2: Develop a conducive environment to inculcate and nurture analytical and communication
skills along with societal and ethical responsibility in all professional endeavors.
M3: Enable students to be successful globally by being effective problem solvers and lifelong
learners.
Program Educational Objectives (PEOs)
Graduates will be able to:
PEO1: Utilize the strong foundation in mathematics, programming, scientific and engineering
fundamentals necessary to formulate, analyze and solve IT related engineering problems,
and endeavor themselves for higher learning.
PEO2: Demonstrate an ability to analyze the requirements and technical specifications of software
to articulate novel engineering solutions for an efficient product design.
PEO3: Adapt to emerging technologies, and work as teams on multidisciplinary projects meeting
the requirements of Indian and multinational companies.
PEO4: Pursue professional career adopting work values with a social concern to bridge the digital
divide and develop effective communication skills and leadership qualities, and
PEO5: Understand the efficacy of life-long learning, professional ethics and practices, so that they
may emerge as global leaders.
ABSTRACT
Satellite communication offers the prospect of service continuity over uncovered and
under- covered areas, service ubiquity, and service scalability. However, several challenges
must first be addressed to realize these benefits, as the resource management, network control,
network security, spectrum management, and energy usage of satellite networks are more
challenging than that of terrestrial networks. Meanwhile, artificial intelligence (AI), including
machine learning, deep learning, and reinforcement learning, has been steadily growing as a
research field and has shown successful results in diverse applications, including wireless
communication. In particular, the application of AI to a wide variety of satellite communication
aspects has demonstrated excellent potential, including beam-hopping, anti-jamming, network
traffic forecasting, channel modeling, telemetry mining, ionosphere scintillation detecting,
interference managing, remote sensing, behavior modeling, space-air-ground integrating, and
energy managing. This work thus provides a general overview of AI, its diverse sub-fields, and
its state-of-the-art algorithms. Several challenges facing diverse aspects of satellite
communication systems are then discussed, and their proposed and potential AI-based solutions
are presented. Finally, an outlook of the field is drawn, and future steps are suggested.
Artificial Intelligence for satellite communication
CHAPTER 1
INTRODUCTION
1.1 The Early Days of AI for Satellite Communication
In the early days of AI for satellite communication, scientists and engineers were exploring the
potential of using AI to improve satellite communication. They were experimenting with different
algorithms and techniques to make the communication more efficient and reliable. AI was seen as a
way to help reduce the amount of human intervention required for successful communication. The first
AI-based satellite communication system was developed in the late 1960s. This system was able to
autonomously control the satellite's position and transmit data to ground stations. This system was a
major breakthrough in the field of satellite communication, as it allowed for more reliable and efficient
communication. It was also the first step towards the development of more advanced AI- based satellite
communication systems.
1.2 AI for Satellite Communication Today
Artificial Intelligence (AI) is a powerful tool that can be used to improve satellite communication. AI
can be used to automate tasks, analyze data, and create more efficient systems. AI can help satellite
communication systems become more reliable, efficient, and secure.
AI can be used to monitor satellite systems and detect anomalies in signal transmission. AI can also be
used to optimize the transmission of data and detect any interference. AI can also be used to analyze data
and make predictions about future events. All of these features can help improve satellite
communication.
The remarkable advancement of wireless communication systems, quickly increasing demand for new
services in various fields, and rapid development of intelligent devices have led to a growing demand
for satellite communication systems to complement conventional terrestrial networks to give access
over uncovered and under-covered urban, rural, and mountainous areas, as well as the seas. There are
three major types of satellites: the geostationary earth orbit, also referred to as a
geosynchronous equatorial orbit (GEO), medium earth orbit (MEO), and low earth orbit (LEO)
Dept. of CSE, AITM, Belagavi Page 1
satellites. Classification depends on three main features, i.e., the altitude, beam footprint size, and orbit.
GEO, MEO, and LEO satellites orbit the earth at altitudes of about 35,786 km, 7,000–25,000 km,
and 300–1,500 km, respectively. The beam footprint of a GEO satellite ranges from 200 to 3,500 km,
whereas that of an MEO or LEO satellite ranges from 100 to 1,000 km. The orbital period of a GEO
satellite is equal to the Earth's rotational period, which makes it appear fixed to ground observers,
whereas LEO and MEO satellites have shorter periods, so many LEO and MEO satellites are required
to offer continuous global coverage. For example, Iridium NEXT has 66 LEO satellites and 6 spares,
Starlink by SpaceX plans to have 4,425 LEO satellites plus some spares, and O3b ("other three
billion") has 20 MEO satellites, including 3 on-orbit spares.
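The altitude figures above determine the orbital period directly through Kepler's third law. A minimal sketch (circular orbits and standard Earth constants assumed; the altitudes are the representative values quoted above):

```python
import math

# Standard gravitational parameter of Earth (m^3/s^2) and mean Earth radius (m)
MU_EARTH = 3.986004418e14
R_EARTH = 6.371e6

def orbital_period_hours(altitude_km: float) -> float:
    """Orbital period of a circular orbit at the given altitude (Kepler's third law)."""
    a = R_EARTH + altitude_km * 1e3              # semi-major axis (m)
    period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)
    return period_s / 3600.0

for name, alt in [("GEO", 35_786), ("MEO", 20_000), ("LEO", 550)]:
    print(f"{name} ({alt} km): {orbital_period_hours(alt):.2f} h")
```

For the GEO altitude this gives roughly 23.9 h, matching the Earth's rotation period, which is why a GEO satellite appears fixed to ground observers.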
Satellite communication use cases can also be split into three categories: (1) service continuity, to
provide network access over uncovered and under-covered areas; (2) service ubiquity, to ameliorate the
network availability in cases of temporary outage or destruction of a ground network due to disasters;
and (3) service scalability, to offload traffic from the ground networks. In addition, satellite
communication systems could provide coverage to various fields, such as the transportation, energy,
agriculture, business, and public safety fields.
Although satellite communication offers improved global coverage and increased communication
quality, it has several challenges. Satellites, especially LEO satellites, have limited on-board resources
and move quickly, bringing high dynamics to the network access.
Models for terrestrial networks can have a high computational complexity; as the on-board satellite
computational resources are limited, terrestrial models are not suitable for satellites. The high mobility
of the space segments, and the inherent heterogeneity between the satellite layers (GEO, MEO, LEO),
the aerial layers (unmanned aerial vehicles (UAVs), balloons, airships), and the ground layer make
network control, network security, and spectrum management challenging. The high mobility results in
frequent handoffs. Hence many researchers have thus focused on handoff management for satellite
communication. In addition, the frequent handoff makes safe routing more difficult to realize, thus
making it more exposed to jamming. In addition, achieving high energy efficiency for satellite
communication is more challenging than for terrestrial networks. Several surveys have discussed
different aspects of satellite communication systems, such as handoff schemes [3], mobile satellite
systems [4], multiple-input multiple-output (MIMO) over satellite [5], satellites for the internet of
remote things [6], inter-satellite communication systems [7], quality of service (QoS) provisioning [8],
space optical communication [9], space-air-ground integrated networks [10], and small satellites.
Meanwhile, interest in artificial intelligence (AI) has increased in recent years. AI, including machine learning (ML),
deep learning (DL), and reinforcement learning (RL), has shown successful results in diverse
applications in science and engineering fields, such as electrical engineering, software engineering,
bioengineering, and financial engineering. Several researchers have thus turned to AI techniques to
solve various challenges in their respective fields and have designed diverse successful AI-based
applications to overcome several challenges in the wireless communication field. Aware of the
potential of artificial intelligence, inspired by its successful applications in other fields, and
given the inherent difficulties of satellite communication, we believe that AI can play a
major role in optimizing several aspects of satellite communication. Some have
discussed AI and its applications to wireless communication in general [14-17]. Others have focused
on the application of AI to one aspect of wireless communication, such as wireless communications in
the IoT [18], network management [19], wireless security [20], emerging robotics communication [21],
antenna design [22], and UAV networks [23, 24]. Vazquez et al. [25] briefly discussed some
promising use cases of AI for satellite communication, whereas Kato et al. [26] discussed the use of AI
for space-air-ground integrated networks. The use of DL in space applications has also been addressed
[27]. Overall, several researchers have discussed wireless and satellite communication systems, and
some of these have discussed the use of AI for one or a few aspects of satellite communication;
however, an extensive survey of AI applications in diverse aspects of satellite communication has yet
to be performed. This work therefore aims to provide an introduction to AI, a discussion of various
challenges faced by satellite communication, and an extensive survey of potential AI-based
applications to overcome these challenges. A general overview of AI, its diverse subfields, and its
state-of-the-art algorithms is presented first. Several challenges faced by diverse aspects of
satellite communication systems and their potential AI-based solutions are then discussed and
summarized. Some of these applications are specific to satellite communication,
such as beam hopping (BH), telemetry mining, ionosphere scintillation detecting, and remote sensing
(RS). Space-air-ground integrated networks (SAGINs) are another application where satellite and non-
satellite networks are integrated using AI to offer more-flexible services.
CHAPTER 2
LITERATURE SURVEY
This literature survey will provide a comprehensive overview of the current state of artificial
intelligence (AI) for satellite communication. We will discuss the various applications of AI in this
field, as well as the challenges and potential opportunities for future research.
The survey will cover topics such as AI-based satellite image processing, AI-based satellite
communications networks, and AI-based satellite navigation systems. We will also discuss the
potential of AI for satellite communication in terms of cost savings, efficiency, and reliability.
AI can be used in a variety of ways to improve satellite communication. AI-based satellite image
processing can be used to detect and classify objects in satellite images. AI-based satellite
communications networks can be used to optimize the routing of data between satellites and ground
stations. Finally, AI-based satellite navigation systems can be used to accurately determine the location
of satellites in space.
The use of AI in satellite communication can result in cost savings, efficiency, and reliability. AI-based
systems can automate tasks that would otherwise require significant manual effort, and can often
provide more accurate results than manual approaches.
The development of AI-based systems for satellite communication faces a number of challenges. These
include the need for large amounts of training data, the complexity of the algorithms used, and the
need for robust hardware and software systems. In addition, AI-based systems are vulnerable to
malicious attacks, and must be designed with security in mind.
The development of AI-based systems for satellite communication also requires significant research
and development. This includes the development of new algorithms, the integration of existing
systems, and the development of new hardware and software systems.
CHAPTER 3
SATELLITE COMMUNICATION
3.1 WHAT IS SATELLITE COMMUNICATION?
Satellite communication is the transport of information from one place to another using a
communication satellite in orbit around the Earth. Watching the English Premier League every
weekend with your friends would be impossible without it. A communication satellite is an artificial
satellite that relays a signal via a transponder, creating a channel between a transmitter and a
receiver at different locations on Earth.
Telephone, radio, television, internet, and military applications use satellite communication. Believe it
or not, more than 2,000 artificial satellites are hurtling through space above your head.
3.2 SATELLITE COMMUNICATION BLOCK DIAGRAM
Fig 3.2.1 Block Diagram of Satellite Communication
3.3 NEED FOR SATELLITE COMMUNICATION
Terrestrial radio communication relies on ground wave propagation and sky wave propagation, each
of which works only over a limited distance. The maximum distance covered by these modes is about
1500 km, a limitation that was overcome by the introduction of satellite communication.
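The coverage gain can be made concrete with simple spherical geometry. The sketch below (idealized spherical Earth, illustrative altitudes, atmospheric effects ignored) estimates the great-circle radius of the area a single satellite can serve:

```python
import math

R_EARTH_KM = 6371.0

def coverage_radius_km(altitude_km: float, min_elevation_deg: float = 0.0) -> float:
    """Great-circle radius of the area a satellite can serve above a
    minimum ground-station elevation angle (spherical-Earth geometry)."""
    e = math.radians(min_elevation_deg)
    # Central angle from the sub-satellite point to the edge of coverage
    psi = math.acos(R_EARTH_KM / (R_EARTH_KM + altitude_km) * math.cos(e)) - e
    return R_EARTH_KM * psi

print(f"GEO (35786 km): ~{coverage_radius_km(35786):.0f} km radius")
print(f"LEO (550 km):   ~{coverage_radius_km(550):.0f} km radius")
```

Even a low-flying LEO satellite sees well beyond the ~1500 km terrestrial propagation limit, while a GEO satellite covers a radius of roughly 9,000 km.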
3.4 HOW SATELLITE COMMUNICATION WORKS
Communication satellites act like mirrors in space, bouncing signals such as radio, internet data,
and television from one side of the earth to the other. Three stages are involved in the working of
satellite communications. These are:
Uplink
Transponders
Downlink
Let’s consider the example of a television signal. In the first stage, the signal to be broadcast on
the other side of the earth is beamed up to the satellite from a ground station on the earth. This
process is known as the uplink.
The second stage involves transponders, i.e., radio receivers, amplifiers, and transmitters. These
transponders boost the incoming signal and shift its frequency so that the outgoing signal does not
interfere with the incoming one. The transponders vary depending on the incoming signal sources.
The final stage involves the downlink, in which the data is sent to the receiver on the other side of
the earth. It is important to understand that there is usually one uplink and multiple downlinks.
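The frequency shift performed by a transponder can be sketched numerically. The values below are generic textbook C-band figures (a ~6 GHz uplink translated down by a 2225 MHz local oscillator to a ~4 GHz downlink), not tied to any particular satellite:

```python
# Illustrative C-band transponder: the uplink carrier is amplified and shifted
# down by a fixed local-oscillator frequency so the downlink does not
# interfere with the incoming uplink (textbook values, hypothetical satellite).
LO_MHZ = 2225.0  # conventional C-band translation frequency

def downlink_mhz(uplink_mhz: float) -> float:
    """Frequency-translate an uplink carrier to its downlink counterpart."""
    return uplink_mhz - LO_MHZ

uplink = 6105.0
print(f"Uplink {uplink} MHz -> downlink {downlink_mhz(uplink)} MHz")  # 3880.0 MHz
```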
Fig 3.4.1 Working Diagram of Satellite Communication
3.5 SATELLITE COMMUNICATION SERVICES
There are two categories in which satellite communication services can be classified:
One-way satellite communication
Two-way satellite communication
3.5.1 ONE-WAY COMMUNICATION
In one-way satellite communication, communication takes place from one earth station to one or
more other earth stations through a satellite. The signal travels from the transmitter at the first
earth station to the receiver at the second earth station, so the transmission is unidirectional.
Some common one-way satellite communication services are:
Location services provided by radio positioning satellites
Tracking as part of space operations services
Internet and broadcasting services via broadcasting satellites
Fig 3.5.1.1 One Way Satellite Communication
3.5.2 TWO-WAY SATELLITE COMMUNICATION
In two-way satellite communication, information is exchanged between any two earth stations,
providing point-to-point connectivity. The signal is transmitted from the first earth station to the
second earth station such that there are two uplinks and two downlinks between the earth stations
and the satellite.
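One practical consequence of the two-uplink/two-downlink geometry is latency. A minimal estimate (best case: both stations directly beneath a GEO satellite, processing delays ignored):

```python
# One-way and round-trip propagation delay through a GEO satellite,
# assuming both earth stations sit directly under the satellite (best case).
C_KM_S = 299_792.458          # speed of light (km/s)
GEO_ALT_KM = 35_786.0

one_hop = 2 * GEO_ALT_KM / C_KM_S     # uplink + downlink for one direction
round_trip = 2 * one_hop              # two uplinks + two downlinks
print(f"one-way: {one_hop*1000:.0f} ms, round-trip: {round_trip*1000:.0f} ms")
```

This roughly half-second round trip is why GEO links feel sluggish for interactive traffic, and it is one motivation for LEO constellations.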
Fig 3.5.2.1 Two way satellite communication
3.6 ADVANTAGES OF SATELLITE COMMUNICATION
The following are the advantages of satellite communication:
Installation of earth station circuits is easy.
These circuits offer excellent flexibility.
With the help of satellite communication, every corner of the earth can be covered.
The user fully controls the network.
3.7 DISADVANTAGES OF SATELLITE COMMUNICATION
The following are the disadvantages of satellite communication:
The initial installation cost is high.
There are chances of blockage of frequencies.
Propagation delay and interference are significant.
3.8 APPLICATIONS OF SATELLITE COMMUNICATION
Telephone
Television
Digital cinema
Radio broadcasting
Amateur radio
Internet access
Military
Disaster Management
CHAPTER 4
4.1 ARCHITECTURE OF AI FOR SATELLITE COMMUNICATION
A network of AI-powered satellites, radiating out from a central hub, works together to transmit
data and analyze large volumes of information gathered in space.
AI for satellite communication is based on a distributed architecture that allows data to be sent and
received quickly and reliably. This architecture consists of an AI-powered satellite, an AI-powered
ground station, and a communication network that connects the two. The AI-powered satellite can
process data faster and more accurately than a traditional satellite, while the AI-powered ground
station can detect and analyze data quickly and accurately.
Fig 4.1 Architecture of AI for satellite communication
CHAPTER 5
APPLICATIONS OF AI FOR SATELLITE COMMUNICATION
Fig 5.1 Applications of AI for different satellite communication aspects.
5.2 BEAM HOPPING
5.2.1 Definition & limitations
Satellite resources are expensive and thus require efficient systems involving optimization and
time-sharing. In conventional satellite systems, the resources are fixed and uniformly distributed
across beams. As a result, conventional large multi-beam satellite systems have shown a mismatch
between the offered and requested resources: some spot beams have a higher demand than the offered
capacity, leaving demand pending (i.e., hot-spots), while others present a demand lower than the
installed capacity, leaving the offered capacity unused (i.e., cold-spots). Thus, to improve multi-beam
satellite communication, the on-board flexible allocation of satellite resources over the service
coverage area is necessary.
BH has emerged as a promising technique to achieve greater flexibility in managing non-uniform,
time-variant traffic requests throughout the day, year, and lifetime of the satellite over the coverage
area [55, 56]. BH dynamically illuminates cells with a small number of active beams, using all
available on-board satellite resources to offer service to only a subset of beams at a time. The
selection of this subset is time-variant and depends on the traffic demand, which determines the
time-space dependent BH illumination pattern. The illuminated beams are only active long enough to
fill the request for each beam. Thus, the challenging task in BH systems is to decide which beams
should be activated and for how long, i.e., the BH illumination pattern; this responsibility is left to
the resource manager, who forwards the selected pattern to the satellite via telemetry, tracking, and
command (TT&C) [57]. Of the various methods that researchers have provided to realize BH, most
have been based on classical optimization algorithms. For example, Angeletti et al. [58] demonstrated
several performance advantages of BH systems and proposed the use of a genetic algorithm (GA) to
design the BH illumination pattern; Anzalchi et al. [59] also illustrated the merits of BH and compared
the performance of BH and non-hopped systems. Alberti et al. [60] proposed a heuristic iterative
algorithm to obtain a solution to the BH illumination design. BH has also been used to decrease the
number of transponder amplifiers for Terabit/s satellites [61]. An iterative algorithm has also been
proposed to maximize the overall offered capacity under certain beam demand and power constraints
in a joint BH design and spectrum assignment [62]. Alegre et al. [63] designed two heuristics to
allocate capacity resources based on the per-beam traffic request, and then further discussed the
long- and short-term traffic variations and suggested techniques to deal with both
variations [64]. Liu et al. [65] also studied BH systems. The QoS-delay fairness equilibrium has also
been addressed in BH satellites [66]. Joint BH schemes were proposed by Shi et al. [67] and Ginesi et
al. [68] to further improve the efficiency of on-board resource allocation. To find the optimal BH
illumination design, Cocco et al. [69] used a simulated annealing algorithm.
Fig 5.2.1.1 Simplified architecture of beam hopping (BH). TT&C represents telemetry, tracking, and
command
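The heuristic optimization approaches surveyed above can be illustrated with a toy greedy scheduler. The demand figures and the one-unit-per-slot capacity model below are hypothetical; real designs optimize over far larger pattern spaces:

```python
import heapq

def greedy_bh_schedule(demand, n_active, slot_capacity, n_slots):
    """Greedy beam-hopping illumination: in each time slot, illuminate the
    n_active beams with the largest remaining demand and serve up to
    slot_capacity of traffic on each."""
    remaining = list(demand)
    pattern = []
    for _ in range(n_slots):
        hottest = heapq.nlargest(n_active, range(len(remaining)),
                                 key=lambda b: remaining[b])
        pattern.append(tuple(hottest))
        for b in hottest:
            remaining[b] = max(0.0, remaining[b] - slot_capacity)
    return pattern, remaining

demand = [5.0, 1.0, 3.0, 0.5, 4.0]   # hypothetical per-beam traffic backlog (Gbit)
pattern, leftover = greedy_bh_schedule(demand, n_active=2, slot_capacity=1.0, n_slots=4)
print(pattern)   # the hot-spot beam 0 is illuminated in every slot
```

The hot-spot beams receive most of the illumination time while cold-spot beams are skipped, which is exactly the mismatch BH is meant to correct.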
5.2.2 AI-based solutions
Seeking to overcome these limitations and enhance the performance of BH, some researchers have
proposed AI-based solutions. Some of these solutions have been fully based on the learning approach,
i.e., end-to-end learning, in which the BH algorithm is a learning algorithm. Others have tried to
improve optimization algorithms by adding a learning layer, thus combining learning and optimization.
To optimize the transmission delay and the system throughput in multibeam satellite systems, Hu et
al. [70] formulated an optimization problem and modeled it as a Markov decision process (MDP). DRL
is then used to solve the BH illumination design and optimize the long-term accumulated reward of
the modeled MDP. The proposed DRL-based BH algorithm can reduce the transmission delay by up
to 52.2% and increase the system throughput by up to 11.4% compared with previous algorithms. To
combine the advantages of end-to-end learning approaches and optimization approaches for a more
efficient BH illumination pattern design, Lei et al. [57] suggested a learning-and-optimization
algorithm for the BH illumination pattern selection, in which a learning approach based on fully
connected NNs is used to predict non-optimal BH patterns and thus address the difficulties faced
when applying an optimization algorithm to a large search space. The trained ML algorithm provides
a predicted feature vector, which is used to delete a large number of non-promising designs from the
original search space. The learning-based prediction thus reduces the search space, and the
optimization can be run on a smaller set of promising designs.
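The learning-plus-optimization idea attributed to Lei et al. [57] can be sketched in miniature: a learned scorer prunes clearly non-promising patterns so the exact objective is evaluated only on the survivors. Here a simple demand-sum heuristic stands in for the trained fully connected NN, and all figures are illustrative:

```python
import itertools

def prune_then_optimize(candidates, score_fn, objective_fn, keep_ratio=0.3):
    """Learning-assisted optimization: a learned score_fn discards
    non-promising candidates, and the exact objective is evaluated
    only on the surviving fraction of the search space."""
    ranked = sorted(candidates, key=score_fn, reverse=True)
    survivors = ranked[: max(1, int(len(ranked) * keep_ratio))]
    return max(survivors, key=objective_fn)

demand = [5.0, 1.0, 3.0, 0.5, 4.0]                      # hypothetical per-beam demand
candidates = list(itertools.combinations(range(5), 2))  # all 2-active-beam patterns

# Stand-in "learned" predictor (a real system would use a trained NN here):
score = lambda p: sum(demand[b] for b in p)
# Exact objective: traffic actually served, capped at 1.0 unit per illuminated beam
objective = lambda p: sum(min(demand[b], 1.0) for b in p)

best = prune_then_optimize(candidates, score, objective)
print(best)
```

In this toy case the pruning leaves 3 of 10 candidate patterns for exact evaluation; in the real system the same trick shrinks a combinatorially large search space.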
5.3 ANTI-JAMMING
5.3.1 Definition & limitations
Satellite communication systems are required to cover a wide area and provide high-speed,
high-capacity transmission. However, in tactical communication systems using satellites,
reliability and security are the prime concerns; therefore, an anti-jamming (AJ) capability is
essential. Jamming attacks can be launched toward key locations and crucial devices in a satellite
network to reduce or even paralyze its throughput. Several AJ methods have thus been designed to
mitigate possible attacks and guarantee secure satellite communication.
Most prior AJ techniques are not based on learning and thus cannot deal with clever jamming
techniques that continuously adjust the jamming methodology through interaction and learning.
Advances in AI algorithms offer powerful tools to implement diverse and intelligent jamming
attacks based on learning approaches, and these present a serious threat to satellite communication
reliability. In two such examples, a smart jamming formulation automatically adjusted the jamming
channel [76, 77], whereas a smart jammer maximized the jamming effect by adjusting both the
jamming power and channel [78]. In addition, attacks could be launched by multiple jammers
simultaneously implementing intelligent jamming attacks based on learning approaches. Although this
may be an unlikely scenario, it has not yet been seriously considered. Further, most researchers have
focused on defending against AJ attacks in the frequency domain, rather than space-based AJ
techniques, such as routing AJ.
5.3.2 AI-based solutions
By using a long short-term memory (LSTM) network, which is a DL RNN, to learn the temporal trend
of a signal, Lee et al. [79] demonstrated a reduction of overall synchronization time in the previously
discussed FH/FDMA scenario [75]. In mobile communication, mobile devices can achieve, using RL,
an optimal communication policy without necessarily knowing the jamming and the radio channel
model in a dynamic game framework [77] . Han et al.[80] proposed the use of a learning approach for
AJ to block smart jamming in the Internet of Satellites using a space-based AJ method, AJ routing,
summarized in Fig. 5.3.2.1. By combining game theory modeling with RL, and modeling the interactions
between smart jammers and satellite users as a Stackelberg AJ routing game, Han et al.[80]
demonstrated how to use DL to deal with the large decision space caused by the high dynamics of the
IoS and RL to deal with the interplay between the satellites and the smart jamming environment. DRL,
specifically the actor-critic algorithm, with the source node as the state, where the critic network evaluates
the expected reward for chosen actions, made it possible to solve the routing selection issue for the
heterogeneous IoS while preserving an available routing subset to simplify the decision space for the
Stackelberg AJ routing game. Based on this routing subset, a popular RL algorithm, Q-Learning, was
then used to respond rapidly to intelligent jamming and adapt AJ strategies.
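A minimal sketch of this Q-learning stage is shown below, on an invented four-node topology with one jammed link; it omits the Stackelberg game modeling and the actor-critic stage of [80], and the reward values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented 4-node topology: node 0 is the source, node 3 the destination.
links = {0: [1, 2], 1: [3], 2: [3], 3: []}
jammed = {(0, 1)}                      # link currently under smart jamming

def reward(u, v):
    # Heavy penalty on jammed links, small hop cost, bonus on delivery.
    if (u, v) in jammed:
        return -10.0
    return 10.0 if v == 3 else -1.0

Q = {(u, v): 0.0 for u in links for v in links[u]}
alpha, gamma, eps = 0.5, 0.9, 0.2      # learning rate, discount, exploration

for _ in range(500):                   # episodes of source-to-destination routing
    u = 0
    while u != 3:
        acts = links[u]
        if rng.random() < eps:         # epsilon-greedy exploration
            v = acts[rng.integers(len(acts))]
        else:
            v = max(acts, key=lambda a: Q[(u, a)])
        future = [Q[(v, w)] for w in links[v]]
        target = reward(u, v) + gamma * (max(future) if future else 0.0)
        Q[(u, v)] += alpha * (target - Q[(u, v)])
        u = v

best_next = max(links[0], key=lambda a: Q[(0, a)])
print("learned next hop from the source:", best_next)
```

After training, the greedy policy routes around the jammed link; if the jammer moved, continued learning would shift the Q-values and the route accordingly.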
Fig 5.3.2.1 Space-based anti-jamming (AJ) routing. The red line represents the detected jammed path, and the
green one represents the suggested path.
5.4 TRAFFIC FORECASTING
5.4.1 Definition & limitations
Network traffic forecasting is a proactive approach that aims to guarantee reliable and high-quality
communication, as the predictability of traffic is important in many satellite applications, such as
congestion control, dynamic routing, dynamic channel allocation, network planning, and network
security. Satellite network traffic is self-similar and demonstrates long-range dependence (LRD)[82].
To achieve accurate forecasting, it is therefore necessary to consider this self-similarity. However,
models for terrestrial networks based on self-similarity have a high computational complexity; as the
on-board satellite computational resources are limited, terrestrial models are not suitable for satellites.
An efficient traffic forecasting design for satellite networks is thus required.
Several researchers have performed traffic forecasting for both terrestrial and satellite networks; these
techniques have included Markov [83], autoregressive moving average (ARMA)[84],
autoregressive integrated moving average (ARIMA)[85], and fractional ARIMA (FARIMA)[86]
models. By using empirical mode decomposition (EMD) to decompose the network traffic and then
applying the ARMA forecasting model, Gao et al.[87] demonstrated remarkable improvement. The two
major difficulties facing satellite traffic forecasting are the LRD of satellite networks and the limited
on-board computational resources. Due to the LRD property of satellite networks, short- range
dependence (SRD) models have failed to achieve accurate forecasting. Although previous LRD models
have achieved better results than SRD models, they suffer from high complexity. To address these
issues, researchers have turned to AI techniques.
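The LRD property can be checked empirically with a Hurst-exponent estimate; traffic with H > 0.5 is long-range dependent. The sketch below uses the aggregated-variance method on synthetic data (an iid baseline and a crude long-memory surrogate built from mixed AR(1) processes); all data and parameter choices are invented for illustration.

```python
import numpy as np

def hurst_aggvar(x, scales=(1, 2, 4, 8, 16, 32)):
    """Aggregated-variance Hurst estimate: Var(X^(m)) ~ m^(2H-2)."""
    x = np.asarray(x, dtype=float)
    variances = []
    for m in scales:
        n = len(x) // m
        blocks = x[: n * m].reshape(n, m).mean(axis=1)   # aggregate by blocks
        variances.append(blocks.var())
    slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
iid = rng.normal(size=40000)           # memoryless baseline: expect H ~ 0.5

def ar1(phi, n):
    y = np.empty(n)
    e = rng.normal(size=n)
    y[0] = e[0]
    for t in range(1, n):
        y[t] = phi * y[t - 1] + e[t]
    return y

# Crude long-memory surrogate: a mixture of AR(1) time constants.
lrd_like = sum(ar1(phi, 40000) for phi in (0.5, 0.9, 0.99, 0.999))

h_iid, h_lrd = hurst_aggvar(iid), hurst_aggvar(lrd_like)
print(f"H(iid) = {h_iid:.2f}, H(lrd-like) = {h_lrd:.2f}")
```

A series with slowly decaying correlations keeps its block-mean variance high at large aggregation scales, which is exactly why SRD models underestimate satellite traffic burstiness.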
5.4.2 AI-based solutions
Katris and Daskalaki [86] combined FARIMA with NNs for internet traffic forecasting, whereas Pan et
al.[88] combined differential evolution with NNs for network traffic prediction. By applying
principal component analysis (PCA) to reduce the input dimensions and then a generalized regression
NN, Liu and Li[90] achieved higher-accuracy forecasting with less training time. Na et al.[91] used
traffic forecasting as a part of their distributed routing strategy for LEO satellite networks. An extreme
learning machine (ELM) has also been employed for traffic load forecasting of satellite nodes before
routing[92]. Bie et al.[82] used EMD to decompose the traffic of the satellite with LRD into a series
with SRD at a single frequency, decreasing the prediction complexity and increasing the speed. Their
combined EMD, fruit-fly optimization, and ELM methodology achieved more accurate forecasting at a
higher speed than prior approaches.
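The decompose-then-forecast idea can be illustrated with a much simpler stand-in for EMD: split synthetic traffic into a linear trend and a cyclic component, forecast each, and recombine. The data, period, and baseline below are invented; this is not the EMD/ELM pipeline of [82].

```python
import numpy as np

rng = np.random.default_rng(0)
period = 24                                   # invented "daily" cycle length
t = np.arange(period * 15)
traffic = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / period) \
          + 0.1 * rng.normal(size=t.size)     # synthetic traffic trace

train, test = traffic[: period * 14], traffic[period * 14:]

def decompose_forecast(x, period, horizon):
    """Split into linear trend + cyclic component, forecast each, recombine."""
    n = len(x)
    idx = np.arange(n)
    a, b = np.polyfit(idx, x, 1)                          # trend component
    cyc = (x - (a * idx + b)).reshape(-1, period).mean(axis=0)
    fut = np.arange(n, n + horizon)
    return a * fut + b + cyc[fut % period]

fc = decompose_forecast(train, period, len(test))
naive = np.full(len(test), train[-1])                     # last-value baseline
mse = lambda y, yhat: float(np.mean((y - yhat) ** 2))
print(f"decomposed MSE = {mse(test, fc):.3f}, naive MSE = {mse(test, naive):.3f}")
```

Forecasting each simple component separately and summing the forecasts is the same structural trick that makes EMD-based satellite traffic predictors cheap enough for on-board use.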
5.5 CHANNEL MODELING
5.5.1 Definition & limitations
A channel model is a mathematical representation of the effect of a communication channel through
which wireless signals are propagated; it is modeled as the impulse response of the channel in the
frequency or time domain.
A wireless channel presents a variety of challenges for reliable high-speed communication, as it is
vulnerable to noise, interference, and other channel impediments, including path loss and shadowing.
Of these, path loss is caused by the dissipation of the power emitted by the transmitter and by the
propagation channel effects, whereas shadowing is caused by obstacles between the receiver and
transmitter that absorb power [93].
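These two effects are commonly captured by the textbook log-distance model with log-normal shadowing; a minimal sketch follows, with illustrative parameter values (the exponent n and shadowing sigma are scenario dependent, not taken from the cited studies).

```python
import numpy as np

def path_loss_db(d_m, pl0_db=40.0, n=3.0, d0_m=1.0, sigma_db=6.0, rng=None):
    """PL(d) = PL(d0) + 10 n log10(d/d0) + X_sigma, X_sigma ~ N(0, sigma^2) dB.
    All default parameter values are illustrative, not measured ones."""
    d_m = np.asarray(d_m, dtype=float)
    shadowing = rng.normal(0.0, sigma_db, d_m.shape) if rng is not None else 0.0
    return pl0_db + 10.0 * n * np.log10(d_m / d0_m) + shadowing

d = np.array([10.0, 100.0, 1000.0])             # link distances in meters
print("deterministic path loss (dB):", path_loss_db(d))
print("with log-normal shadowing (dB):",
      path_loss_db(d, rng=np.random.default_rng(0)))
```

Fitting pl0_db, n, and sigma_db to measurements is exactly the estimation task that the ML-based channel models discussed below try to perform from imagery instead of drive tests.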
Precise channel models are required to assess the performance of mobile communication systems and
therefore to enhance coverage for existing deployments. Channel models may also be useful to forecast
propagation in designed deployment outlines, which could allow for assessment before deployment,
and for optimizing the coverage and capacity of actual systems. For a small number of possible
transmitter positions, extensive outdoor environment evaluation can be performed to estimate the
parameters of the channel [94, 95]. As more advanced technologies have been used in wireless
communication, more advanced channel modeling has been required; here, the use of stochastic models
is computationally efficient while providing satisfactory results [96].
Ray tracing is used for channel modeling, which requires 3D images that are generally generated using
computer vision methods, including stereo-vision-based depth estimation [97−100]. A model was
proposed for an urban environment that requires features including road widths, street orientation
angles, and building heights[101]. A simplified model was then proposed by Fernandez and Sores
[102] that requires only the proportion of building occupation between the receiver and transmitter,
which can be computed from segmented images manually or automatically[103].
Despite the satisfactory performance of some of the listed techniques, they still have many limitations.
For example, the 3D images required by ray tracing are not generally available and their generation is
not computationally efficient. Even when the images are available, ray tracing is computationally
costly and data exhaustive and therefore is not appropriate for real-time coverage area optimization.
Further, the detailed data required for the model presented by Cichon and Kumar [101] are often
unavailable.
5.5.2 AI-based solutions
Some early applications of AI for path loss forecasting have been based on classical ML algorithms such
as SVM [104, 105] , NNs[106−111] , and decision trees[112] . Interested readers are referred to a survey
of ML-based path loss prediction approaches for further details [113] . However, although previous ML
efforts have shown great results, many require 3D images. Researchers have recently thus shifted their
attention to using DL algorithms with 2D satellite/aerial images for path loss forecasting. For example,
Ates et al.[114] approximated channel parameters, including the standard deviation of shadowing and the
path loss exponent, from satellite images using a deep CNN without any added input parameters, as
shown in Fig. 5.5.2.1.
By using a DL model on satellite images and other input parameters to predict the reference signal
received power (RSRP) for specific receiver locations in a specific scenario/area, Thrane et al.[115]
demonstrated improved prediction accuracy at 811 and 2630 MHz over previous techniques,
including ray tracing. Similarly, Ahmadien et al. [116] applied DL on satellite images for path loss
prediction, although they focused only on satellite images without any supplemental features and worked
on more generalized data. Despite the practicality of this method, as it only needs satellite images to
forecast the path loss distribution, 2D images will not always be sufficient to characterize the 3D
structure. In these cases, more features (e.g., building heights) must be input into the model.
5.6 TELEMETRY MINING
5.6.1 Definition & limitations
Telemetry is the process of recording and transferring measurements for control and monitoring. In
satellite systems, on-board telemetry helps mission control centers track the platform’s status, detect
abnormal events, and control various situations.
Fig 5.5.2.1 Channel parameter prediction: 2D satellite/aerial images are used as input to a deep convolutional
neural network (CNN) to predict channel parameters. The model is trained separately for each parameter.
5.6.2 AI-based solutions
In recent years, AI techniques have been widely considered in space missions with telemetry. Satellite
health monitoring has been performed using probabilistic clustering[117], dimensionality reduction,
hidden Markov models[118], and regression trees[119], whereas others have developed anomaly
detection methods using the k-nearest neighbor (kNN), SVM, and LSTM algorithms, tested on the
telemetry of Centre National d’Etudes Spatiales spacecraft[120−122].
Further, space operations assistance has been developed for diverse space applications using
data-driven [123] and model-based [124] monitoring methods. In their study of the use of AI for fault
diagnosis in general and for space utilization, Sun et al.[125] argued that the most promising direction is
the use of DL and suggested its usage for fault diagnosis for space utilization in China.
By comparing different ML algorithms using telemetry data from the Egyptsat-1 satellite, Ibrahim et
al.[126] demonstrated the high prediction accuracy of LSTM, ARIMA, and RNN models. They
suggested simple linear regression for forecasting critical satellite features for short-lifetime satellites
(i.e., 3–5 years) and NNs for long-lifetime satellites (15–20 years).
Unlike algorithms designed to operate on the ground in the mission control center, Wan et al.[127]
proposed a self-learning classification algorithm to achieve onboard telemetry data classification with
low computational complexity and low time latency.
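As a minimal analog of such detectors (not the cited kNN/SVM/LSTM implementations), a robust deviation threshold on a single synthetic telemetry channel already illustrates the anomaly detection task; the channel model and fault are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic telemetry channel (e.g., a 28 V bus voltage) with one injected fault.
telemetry = 28.0 + 0.1 * rng.normal(size=500)
telemetry[250] = 31.0

def detect_anomalies(x, k=5.0):
    """Flag samples deviating more than k robust sigmas from the median."""
    med = np.median(x)
    mad = np.median(np.abs(x - med))       # median absolute deviation
    robust_sigma = 1.4826 * mad            # MAD-to-sigma factor for Gaussian data
    return np.flatnonzero(np.abs(x - med) > k * robust_sigma)

print("anomalous sample indices:", detect_anomalies(telemetry))
```

The median/MAD statistics are cheap to compute on-board, which is the same low-complexity constraint that motivates the self-learning classifier of Wan et al.[127].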
5.7 IONOSPHERIC SCINTILLATION DETECTING
5.7.1 Definition & limitations
Signals transmitted by satellites toward the earth can be notably impacted by their propagation
through the atmosphere, especially the ionosphere, the ionized upper layer of the atmosphere, which is
distinguished by an elevated density of free electrons (Fig. 5.7.1.1). The potential irregularities
and gradients of ionization can distort the signal phase and amplitude, in a process known as
ionospheric scintillation.
In particular, propagation through the ionosphere can cause distortion of global navigation satellite
system (GNSS) signals, leading to significant errors in GNSS-based applications. GNSSs are radio
communication satellite systems that allow a user to compute the local time, velocity, and position
anywhere on the earth by processing signals transmitted from the satellites and conducting trilateration [128].
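The trilateration step can be sketched as a linearized least-squares solve. The sketch below is heavily simplified: it ignores the receiver clock bias, atmospheric delays, and measurement noise that a real GNSS solver must estimate, and the satellite geometry is made up.

```python
import numpy as np

# Invented satellite positions (km) and an ideal, noise-free receiver fix.
sats = np.array([
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
receiver = np.array([1000.0, 2000.0, 3000.0])
ranges = np.linalg.norm(sats - receiver, axis=1)   # ideal measured ranges

def trilaterate(sats, ranges):
    """Subtract the first range equation to linearize, then least-squares.
    |x - s_i|^2 = r_i^2 minus |x - s_0|^2 = r_0^2 gives a linear system in x."""
    A = 2.0 * (sats[1:] - sats[0])
    b = (np.sum(sats[1:] ** 2, axis=1) - np.sum(sats[0] ** 2)
         - ranges[1:] ** 2 + ranges[0] ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

print("estimated receiver position (km):", trilaterate(sats, ranges))
```

With noise-free ranges the solve recovers the receiver position exactly; scintillation corrupts the measured ranges and phases, which is why its detection matters for positioning accuracy.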
GNSSs can also be used in a wide variety of applications, such as scientific observations. Because of
the low received power of GNSS waves, any errors significantly threaten the accuracy and credibility of
the positioning systems. GNSS signals propagating through the ionosphere face the possibility of both
a temporal delay and scintillation. Although delay compensation methods are applied to all GNSS
receivers [128] , scintillation is still a considerable issue, as its quasi-random nature makes it difficult
to model[129]. Ionospheric scintillation thus remains a major limitation to high-accuracy applications
of GNSSs. The accurate detection of scintillation is thus required to improve the credibility and quality of
GNSSs [130]. To observe the signals, which are a source of knowledge for interpreting and modeling the
upper layers of the atmosphere, and to raise caution and take countermeasures for GNSS-based
applications, networks of GNSS receivers have been installed at both high and low latitudes, where
scintillation is expected to occur [131, 132]. Robust receivers and proper scintillation-detecting
algorithms are thus both required [133].
Fig 5.7.1.1 Representation of ionospheric scintillation, where distortion occurs during signal propagation. The blue,
green, and red lines show the line-of-sight signal paths from the satellite to the earth antennas, the signal fluctuation,
and the signal delay, respectively.
5.7.2 AI-based solutions
Recently, studies have shown that AI can be utilized for the detection of scintillation. For example,
Rezoned et al.[138] presented a survey of data mining methods that rely on observing and integrating
GNSS receivers. A technique based on the SVM algorithm has been suggested for amplitude scintillation
detection [139, 140], and was later expanded to phase scintillation detection[141, 142]. By using
decision trees and RF to systematically detect ionospheric scintillation events impacting the amplitude of
the GNSS signals, the methodology proposed by Linty et al.[143] outperformed state-of-the-art
methodologies in terms of accuracy (99.7%) and F-score (99.4%), thus reaching the levels of manual
human-driven annotation.
More recently, Imam and Davis [144] proposed the use of decision trees, to differentiate between
ionospheric scintillation and multi-path effects in GNSS scintillation data. Their model, which annotates the
data as scintillated, multi-path affected, or clean GNSS signal, demonstrated an accuracy of 96%.
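A common input feature for such classifiers is the amplitude scintillation index S4, computed from the received signal intensity. A minimal sketch on synthetic data follows; the 0.3 decision threshold is a common rule of thumb for moderate scintillation, not a value from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)

def s4_index(intensity):
    """Amplitude scintillation index: S4 = sqrt((<I^2> - <I>^2) / <I>^2)."""
    i = np.asarray(intensity, dtype=float)
    return float(np.sqrt((np.mean(i ** 2) - np.mean(i) ** 2) / np.mean(i) ** 2))

# Synthetic received-intensity traces: a quiet one and a fluctuating one.
quiet = 1.0 + 0.02 * rng.normal(size=3000)
scint = np.clip(1.0 + 0.5 * rng.normal(size=3000), 0.05, None)

for name, sig in (("quiet", quiet), ("disturbed", scint)):
    s4 = s4_index(sig)
    print(f"{name}: S4 = {s4:.2f} ->", "scintillation" if s4 > 0.3 else "clean")
```

The ML detectors cited above effectively learn decision boundaries over features like S4 (and its phase counterpart) instead of relying on a single fixed threshold.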
5.8 INTERFERENCE MANAGING
5.8.1 Definition & limitations
Interference management is mandatory for satellite communication operators, as interference negatively
affects the communication channel, resulting in a reduced QoS, lower operational efficiency, and loss of
revenue [145]. Moreover, interference is a common event that is increasing with the growing
congestion of the satellite frequency band, as more countries are launching satellites and more
applications are expected. With the growing number of users sharing the same frequency band, the
possibility of interference increases, as does the risk of intentional interference, as discussed in Section 3.
Interference management is thus essential to preserve high-quality and reliable communication systems;
management includes the detection, classification, and suppression of interference, as well as the
application of techniques to minimize its occurrence.
Interference detection is a well-studied subject that has been addressed in the past few decades[146,
147] , especially for satellite communication[145, 148] .
However, researchers have commonly relied on the decision theory of hypothesis testing, in which
specific knowledge of the signal characteristics and the channel model is needed. Given the diversity
of contemporary wireless standards, the design of specific detectors for each signal category is a
fruitless approach.
5.8.2 AI-based solutions
To minimize interference, Liu et al.[149] suggested the use of AI for moving terminals and stations in
satellite-terrestrial networks by proposing a framework combining different AI approaches, including
SVM, unsupervised learning, and DRL, for satellite selection, antenna pointing, and tracking.
Another AI-based approach executes automatic real-time interference detection based on forecasting
the next signal spectrum to be received in the absence of anomalies, using an LSTM trained on
historical anomaly-free spectra [150]. The predicted spectra are then compared with the received
signal using a designed metric to detect anomalies.
Henarejos et al. [151] proposed the use of two AI-based approaches, DNN AEs and LSTM, for detecting
and classifying interference, respectively. In the former, the AE is trained with interference-free signals
and tested against other signals without interference to obtain practical thresholds. The difference in
reconstruction error between signals with and without interference is then exploited to detect the
presence of interference.
5.9 REMOTE SENSING
5.9.1 Definition & limitations
Remote sensing (RS) is the process of extracting information about an area, object, or phenomenon by
processing its reflected and emitted radiation at a distance, generally from a satellite or aircraft.
RS has a wide range of applications in multiple fields including land surveying, geography, geology,
ecology, meteorology, oceanography, military, and communication. As RS offers the possibility of
monitoring areas that are dangerous, difficult or impossible to access, including mountains, forests,
oceans, and glaciers, it is a popular and active research area.
5.9.2 AI-based solutions
The revolution in computer vision capabilities caused by DL has led to the increased development of
RS through the adoption of state-of-the-art DL algorithms on satellite images; image classification for
RS has become one of the most popular tasks in computer vision. For example, Kussul et al. [152] used DL to classify
land coverage and crop types using RS images from Landsat-8 and Sentinel-1A over a test site in
Ukraine. Zhang et al. [153] combined DNNs by using a gradient-boosting random CNN for scene
classification. More recently, Li et al. [154] proposed the combination of kNN and CNN to map coral
reef marine habitats worldwide with RS imaging. RS and AI have also been used in communication
theory applications, such as those discussed in Section 5.5 [114−116].
5.10 BEHAVIOR MODELING
5.10.1 Definition & limitations
Owing to the increasing numbers of active and inactive (debris) satellites of diverse orbits, shapes,
sizes, orientations, and functions, it is becoming infeasible for analysts to simultaneously monitor all
satellites. Therefore, AI, especially ML, could play a major role by helping to automate this process.
5.10.2 AI-based solutions
Mittal et al.[170] discussed the potential of ML algorithms to model satellite behavior. Supervised
models have been used to determine satellite stability [171], whereas unsupervised models have been
used to detect anomalous behavior and satellite locations [172], and an RNN has been used to predict
satellite maneuvers over time[173].
Accurate satellite pose estimation, i.e., identifying a satellite’s relative position and attitude, is critical in
several space operations, such as debris removal, inter-spacecraft communication, and docking. The
recent proposal for satellite pose estimation from a single image via combined ML and geometric
optimization by Chen et al. [174] won first place in the recent Kelvins pose estimation challenge
organized by the European Space Agency [175].
The amount of space debris has increased immensely over the last few years, posing a critical threat
to space missions due to the high velocity of the debris. It is thus essential to classify
space objects and apply collision avoidance techniques to protect active satellites. As such,
Jahirabadkar et al.[176] presented a survey of diverse AI methodologies for the classification of space
objects using light curves as a differentiating property.
Yadava et al. [177] employed NNs and RL for onboard attitude determination and control; their method
effectively provided the needed torque to stabilize a nanosatellite along three axes.
To avoid catastrophic events caused by battery failure, Ahmed et al. [178] developed an on-board
remaining-battery-life estimation system using ML and logical-analysis-of-data approaches.
5.11 SPACE-AIR-GROUND INTEGRATING
5.11.1 Definition & limitations
Recently, notable advances have been made in ground communication systems to provide users with higher-
quality internet access. Nevertheless, due to the restricted capacity and coverage area of networks, such
services are not available everywhere at all times, especially for users in rural or disaster areas. Although
terrestrial networks have the most resources and highest throughput, non-terrestrial communication
systems have a much broader coverage area. However, non-terrestrial networks have their own
limitations; e.g., satellite communication systems have a long propagation latency, and air networks
have a narrow capacity and unstable links. To supply users with better and more flexible end-to-end
services by taking advantage of the ways the networks can complement each other, researchers have
suggested the use of SAGINs[10], which include the satellites in space; the balloons, airships, and UAVs in
the air; and the ground segment, as shown in Fig. 5.11.2.1.
The multi-layered satellite communication system, which consists of GEO, MEO, and LEO satellites,
can use multicast and broadcast methods to improve the network capacity, crucially easing the
growing traffic burden [10, 26]. As SAGINs allow packet transmission to destinations via multiple
paths of diverse qualities, they can offer different packet transmission methods to meet diverse
service demands [26].
However, the design and optimization of SAGINs is more challenging than that of conventional ground
communication systems owing to their inherent self-organization, time-variability, and heterogeneity
[10] . A variety of factors that must be considered when designing optimization techniques have thus
been identified [10, 26] . For example, the diverse propagation mediums, the sharing of frequency
bands by different communication types, the high mobility of the space and air segments, and the
inherent heterogeneity between the three segments, make the network control and spectrum
management of SAGINs arduous. The high mobility results in frequent handoffs, which makes safe
routing more difficult to realize, thus making SAGINs more exposed to jamming. Further, as optimizing
energy efficiency is also more challenging than in standard terrestrial networks, energy
management algorithms are also required.
5.11.2 AI-based solutions
In their discussion of challenges facing SAGINs, Kato et al.[26] proposed the use of a CNN for
routing. To address this problem, Lee et al.[179] jointly optimized the source-satellite-UAV association
and the location of the UAV via DRL. Their suggested technique achieved up to a 5.74 times higher
average data rate than a direct communication baseline in the absence of the UAV and satellite.
For offloading computation-intensive applications, a SAGIN edge/cloud computing design has been
developed in such a way that satellites give access to the cloud and UAVs allow near-user edge
computing [180] . Here, a joint resource allocation and task scheduling approach is used to allocate the
computing resources to virtual machines and schedule the offloaded tasks for UAV edge servers,
whereas an RL-based computing offloading approach handles the multidimensional SAGIN resources
and learns the dynamic network conditions. Simulation results confirmed the efficiency and
convergence of the suggested technique.
Fig 5.11.2.1 Space-air-ground integrated networks
5.12 ENERGY MANAGING
5.12.1 Definition & limitations
Recent advances in the connection between ground, aerial, and satellite networks such as SAGINs have
increased the demand imposed on satellite communication networks. This growing attention towards
satellites has led to increased energy consumption requirements. Satellite energy management thus
represents a hot research topic for the further development of satellite communication.
Compared with a GEO satellite, an LEO satellite has restricted on-board resources and moves quickly.
Further, an LEO satellite has a limited energy capacity owing to its small size [183]; as billions of
devices need to be served around the world[184] , current satellite resource capability can no longer
satisfy demand. To address this shortage of satellite communication resources, an efficient resource
scheduling scheme that makes full use of the limited resources must be designed. As current resource
allocation schemes have mostly been designed for GEO satellites, they do not consider many
LEO-specific concerns, such as the constrained energy, movement attributes, or connection and
transmission dynamics.
5.12.2 AI-based solutions
Some researchers have thus turned to AI-based solutions for power saving. For example, Kothari et al.
[27] suggested the usage of DNN compression before data transmission to improve latency and save
power. In the absence of solar light, satellites depend on battery energy, which places a heavy load
on the satellite battery and can shorten its lifetime, leading to increased costs for satellite
communication networks. To optimize the power allocation in satellite-to-ground communication using
LEO satellites and thus extend their battery life, Tsuchida et al. [185] employed RL to share the
workload of overworked satellites with nearby satellites with lower loads. Similarly, implementing DRL
for energy-efficient channel allocation in SatIoT allowed for a 67.86% reduction in energy consumption
when compared with previous models [186]. Mobile edge computing enhanced SatIoT networks
contain diverse satellites and several satellite gateways in which the coupled user association,
offloading decisions, and computing and communication resource allocation can be jointly optimized to
minimize the latency and energy cost. In a recent example, a joint user-association and offloading
decision with optimal resource allocation methodology based on DRL proposed by Cui et al.[187]
improved the long-term latency and energy costs.
5.13 OTHER APPLICATIONS
5.13.1 Handoff optimization
Link-layer handoff occurs when the change of one or more links is needed between the communication
endpoints due to the dynamic connectivity patterns of LEO satellites. The management of handoff in
LEO satellites varies remarkably from that of terrestrial networks, since handoffs happen more
frequently due to the movement of satellites [3] . Many researchers have thus focused on handoff
management in LEO satellite networks.
In general, user equipment (UE) periodically measures the strength of reference signals of different
cells to ensure access to a strong cell, as the handoff decision depends on the signal strength or some
other parameters. Moreover, the historical RSRP contains information to avoid unnecessary handoff.
Thus, Zhang et al.[188] converted the handoff decision to a classification problem. Although the
historical RSRP is a time series, a CNN was employed rather than an RNN because the feature map of
historical RSRP has a strong local spatial correlation and the use of an RNN could lead to a series of
wrong decisions, as one decision largely impacts future decisions. The proposed AI-based method
decreased the number of handoffs by more than 25% for more than 70% of the UE, while reducing the
average RSRP by only 3% relative to the commonly used “strongest beam” method.
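For context, the classical baseline that such learned classifiers aim to improve on is a hysteresis rule: hand off when a neighbor's RSRP exceeds the serving cell's by a margin for a sustained time-to-trigger window. A sketch with illustrative values (the traces, margin, and window are invented, not parameters from [188]):

```python
def handoff_decision(serving_rsrp, neighbor_rsrp, hysteresis_db=3.0, ttt=3):
    """Return the first sample index at which handoff triggers, else None."""
    streak = 0
    for i, (s, n) in enumerate(zip(serving_rsrp, neighbor_rsrp)):
        streak = streak + 1 if n > s + hysteresis_db else 0
        if streak >= ttt:               # condition held for the whole window
            return i
    return None

serving  = [-90, -91, -93, -95, -97, -99, -101]   # dBm, invented traces
neighbor = [-95, -93, -91, -90, -90, -89, -88]
print("handoff triggered at sample:", handoff_decision(serving, neighbor))
```

The hysteresis margin and time-to-trigger window suppress ping-pong handoffs; the CNN classifier described above learns a richer decision function over the same historical RSRP input.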
5.13.2 Heat source layout design
The effective design of the used heat sources can enhance the thermal performance of the overall
system, and has thus become a crucial aspect of several engineering areas, including integrated circuit
design and satellite layout design. With the increasingly small size of components and higher power
intensity, designing the heat-source layout has become a critical problem [189]. Conventionally, the
optimal design is acquired by exploring the design space, repeatedly running thermal simulations to
compare the performance of each scheme [190−192]. To avoid the extremely large computational
burden of traditional techniques, Sun et al.[193] employed an inverse design method in which the
layout of heat sources is directly generated from a given expected thermal performance based on a DL
model called Show, Attend, and Read[194]. Their model was capable of learning the
underlying physics of the design problem and thus could efficiently forecast the design of heat sources
under a given condition without performing any simulations. Other DL algorithms have been used in
diverse design areas, such as mechanics [195] ,optics[196] , fluids[197] , and materials[198] .
5.13.3 Reflectarray analysis and design
ML algorithms have been employed in the analysis and design of antennas [22], including the
analysis [199, 200] and design [201, 202] of reflectarrays. For example, NNs were used by Shan et al.
[203] to forecast the phase shift, whereas kriging was suggested to forecast the electromagnetic
response of reflectarray components [204]. Support vector regression (SVR) has been used to
accelerate the analysis [205] and to directly optimize narrowband reflectarrays [206]. To hasten
calculations without reducing their precision, Prado et al. [207] proposed a wideband SVR-based
reflectarray design method and demonstrated its ability to obtain wideband, dual-linear polarized, and
shaped-beam reflectarrays for direct broadcast satellite applications.
5.13.4 Carrier signal detection
As each signal must be separated before classification, modulation, demodulation, decoding, and other
signal processing, localization and detection of carrier signals in the frequency domain are a crucial
problem in wireless communication.
The algorithms used for carrier signal detection have commonly been based on threshold values and
require human intervention [208−213], although several improvements have been made, including the
use of a double threshold[214, 215] . Kim et al. [216] proposed the use of a slope-tracing-based
algorithm to separate the interval of signal elements based on signal properties such as amplitude,
slope, deflection width, or distance between neighboring deflections.
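The classic threshold pipeline described above can be sketched in a few lines of NumPy: estimate the noise floor, flag bins above the floor plus a margin, and merge contiguous flagged bins into carrier intervals. The synthetic spectrum, the 8 dB margin, and the minimum-width filter below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic broadband power spectrum (dB): a flat noise floor plus two carriers.
n_bins = 1024
spectrum = rng.normal(-100.0, 1.5, n_bins)   # noise floor around -100 dB
spectrum[200:240] += 20.0                    # carrier 1
spectrum[600:630] += 15.0                    # carrier 2

# Threshold detection: noise-floor estimate plus a fixed margin (the margin
# is the tuning knob that traditionally requires human intervention to set).
noise_floor = np.median(spectrum)
mask = spectrum > noise_floor + 8.0

# Merge contiguous flagged bins into (start, stop) carrier intervals,
# discarding runs too short to be plausible carriers.
idx = np.flatnonzero(mask)
segments = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
intervals = [(int(s[0]), int(s[-1])) for s in segments if len(s) >= 5]
print(intervals)  # detected carrier intervals
```

The fixed margin is exactly what makes such detectors brittle: a margin tuned for one noise level mislabels bins when the floor drifts, which motivates the learned detectors discussed next.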
More recently, DL has been applied to carrier signal detection; for example, Morozov and Ovchinnikov
[217] applied a fully connected NN to detect carriers in FSK signals, whereas Yuan et al. [218] used
DL for the blind detection of Morse signals in wideband spectrum data.
Hangeul [219] employed a fully convolutional network (FCN) model to detect carrier signals in the
broadband power spectrum. An FCN is a DL method for semantic image segmentation; here, the
broadband power spectrum is regarded as a 1D image and each subcarrier as a target object, which
transforms the carrier detection problem on the broadband into a semantic 1D image segmentation
problem [220–222]. A 1D deep FCN model was designed to categorize each point on a broadband
power spectrum array into one of two classes (i.e., subcarrier or noise) and then locate the subcarrier
signals on the broadband power spectrum. After being trained and validated using simulated and real
satellite broadband power spectrum datasets, respectively, the proposed deep FCN successfully
detected the subcarrier signals in the broadband power spectrum and achieved a higher accuracy than
the slope-tracing method.
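To make the "spectrum as a 1D image" idea concrete, the sketch below runs the forward pass of a miniature 1D fully convolutional network in plain NumPy: two "same"-padded convolution layers map an input spectrum of any length to a two-channel logit array, giving one subcarrier-vs-noise label per frequency bin. The weights are random and untrained, and the layer sizes are invented, so this illustrates only the architecture, not the trained detector of [219].

```python
import numpy as np

def conv1d_same(x, w, b):
    """'Same'-padded 1D cross-correlation.
    x: (C_in, N), w: (C_out, C_in, k), b: (C_out,) -> (C_out, N)."""
    k = w.shape[2]
    xp = np.pad(x, ((0, 0), (k // 2, k // 2)))
    out = np.empty((w.shape[0], x.shape[1]))
    for co in range(w.shape[0]):
        acc = np.zeros(x.shape[1])
        for ci in range(x.shape[0]):
            # np.convolve flips its kernel, so flip first for cross-correlation.
            acc += np.convolve(xp[ci], w[co, ci][::-1], mode="valid")
        out[co] = acc + b[co]
    return out

rng = np.random.default_rng(1)

# Input: a power spectrum treated as a single-channel 1D image.
spectrum = rng.normal(0.0, 1.0, (1, 512))

# Two tiny convolution layers with random (untrained) weights.
w1, b1 = rng.normal(0, 0.1, (8, 1, 9)), np.zeros(8)
w2, b2 = rng.normal(0, 0.1, (2, 8, 9)), np.zeros(2)

h = np.maximum(conv1d_same(spectrum, w1, b1), 0.0)   # conv + ReLU
logits = conv1d_same(h, w2, b2)                      # per-bin class scores
labels = logits.argmax(axis=0)                       # 0 = noise, 1 = subcarrier

print(logits.shape, labels.shape)  # (2, 512) (512,)
```

Because the network is fully convolutional, the same weights apply to spectra of any length, and the per-bin labels directly delimit subcarrier intervals without a hand-tuned threshold.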
CHAPTER 6
VARIOUS AI ALGORITHMS
Table 6.1 Various AI algorithms with their respective satellite communication applications.

AI algorithm     Satellite communication application
SVM              Network traffic forecasting, channel modeling, telemetry mining,
                 ionospheric scintillation detection, interference management, and
                 remote sensing
Decision trees   Channel modeling, ionospheric scintillation detection, and remote
                 sensing
CNN              Channel modeling, remote sensing, space-air-ground integration,
                 handoff, and carrier signal detection
RNN              Anti-jamming, telemetry mining, behavior modeling, and handoff
                 optimization
AE               Interference management
RL               Beam hopping, anti-jamming, interference management, behavior
                 modeling, and space-air-ground integration
Table 6.2 Abbreviations and full names.
Abbreviation Full name
AE Autoencoder
AI Artificial intelligence
AJ Anti-jamming
ARIMA Auto regressive integrated moving average
ARMA Auto regressive moving average
BH Beam hopping
CNN Convolutional neural network
DL Deep learning
DNN Deep neural network
DRL Deep reinforcement learning
ELM Extreme learning machine
EMD Empirical mode decomposition
FARIMA Fractional auto regressive integrated moving average
FCN Fully convolutional network
FDMA Frequency division multiple access
FH Frequency hopping
GA Genetic algorithm
GANs Generative adversarial networks
GNSS Global navigation satellite system
KNN k-nearest neighbor
LRD Long-range dependence
LSTM Long short-term memory
MDP Markov decision process
ML Machine learning
MO-DRL Multi-objective deep reinforcement learning
NNs Neural networks
PCA Principal component analysis
QoS Quality of service
RL Reinforcement learning
RNNs Recurrent neural networks
RS Remote sensing
RSRP Reference signal received power
SAGINs Space-air-ground integrated networks
SRD Short-range dependence
SVM Support vector machine
SVR Support vector regression
SatIoT Satellite internet of things
UE User equipment
VAEs Variational autoencoders
OPPORTUNITIES FOR FUTURE RESEARCH
The development of AI-based systems for satellite communication presents a number of opportunities
for future research. These include new algorithms for image processing, new routing algorithms for
satellite networks, and new navigation algorithms for satellite navigation systems.
In addition, research is needed into the security of AI-based systems and into the integration of
existing systems with AI-based ones. Finally, research into the potential cost savings and efficiency
gains achievable with AI-based satellite communication systems is also needed.
CONCLUSION
This literature survey has provided an overview of the current state of AI for satellite communication,
covering AI and its different sub-fields, including ML, DL, and RL. The various applications of AI in
this field were discussed, as well as the challenges and potential opportunities for future research.
AI-based systems for satellite communication can provide cost savings, efficiency, and reliability;
however, the development of such systems faces a number of challenges and requires significant
research and development. Some limitations of satellite communication were presented, and their
proposed and potential AI-based solutions were discussed. The application of AI has shown great
results in a wide variety of satellite communication aspects, including beam hopping, AJ, network
traffic forecasting, channel modeling, telemetry mining, ionospheric scintillation detection,
interference management, remote sensing, behavior modeling, space-air-ground integration, and
energy management. Future work should aim to apply AI to achieve more efficient, secure, reliable,
and high-quality communication systems. Although ML has achieved great results in terms of
precision and accuracy in several applications, for more secure and reliable communication there is
still more work to be done on ML interpretability and adversarial ML.
REFERENCES
[1] G. Maral, M. Bousquet, and Z. L. Sun, Satellite Communications Systems: Systems,
Techniques and Technology, 6th ed. West Sussex, UK: John Wiley & Sons, 2020.
[2] F. Rinaldi, H. L. Maattanen, J. Torsner, S. Pizzi, S. Andreev, A. Iera, Y. Koucheryavy, and G.
Araniti, Non-terrestrial networks in 5G & beyond: A survey, IEEE Access, vol. 8, pp.
165178–165200, 2020.
[3] P. K. Chowdhury, M. Atiquzzaman, and W. Ivancic, Handover schemes in satellite networks:
State-of-the-art and future research directions, IEEE Commun. Surv. Tutorials, vol. 8, no. 4,
pp. 2–14, 2006.
[4] P. Chini, G. Giambene, and S. Kota, A survey on mobile satellite systems, Int. J. Satell.
Commun. Netw., vol. 28, no. 1, pp. 29–57, 2010.
[5] P. D. Arapoglou, K. Liolis, M. Bertinelli, A. Panagopoulos, P. Cottis, and R. De Gaudenzi,
MIMO over satellite: A review, IEEE Commun. Surv. Tutorials, vol. 13, no. 1, pp. 27–51,
2011.
[6] M. De Sanctis, E. Cianca, G. Araniti, I. Bisio, and R. Prasad, Satellite communications
supporting Internet of remote things, IEEE Int. Things J., vol. 3, no. 1, pp. 113–123, 2016.
[7] R. Radhakrishnan, W. W. Edmonson, F. Afghah, R. M. Rodriguez-Osorio, F. Pinto, and S. C.
Burleigh, Survey of inter-satellite communication for small satellite systems: Physical layer to
network layer view, IEEE Commun. Surv. Tutorials, vol. 18, no. 4, pp. 2442–2473, 2016.
[8] R. Radhakrishnan, W. W. Edmonson, F. Afghah, R. M. Rodriguez-Osorio, F. Pinto, and S. C.
Burleigh, Survey of inter-satellite communication for small satellite systems: Physical layer to
network layer view, IEEE Commun. Surv. Tutorials, vol. 18, no. 4, pp. 2442–2473, 2016.
[9] H. Kaushal and G. Kaddoum, Optical communication in space: Challenges and mitigation
techniques, IEEE Commun. Surv. Tutorials, vol. 19, no. 1, pp. 57–96, 2017.
[10] H. Kaushal and G. Kaddoum, Optical communication in space: Challenges and
mitigation techniques, IEEE Commun. Surv. Tutorials, vol. 19, no. 1, pp. 57–96, 2017.
[11] S. C. Burleigh, T. De Cola, S. Morosi, S. Jayousi, E. Cianca, and C. Fuchs, From
connectivity to advanced internet services: A comprehensive review of small satellites
[12] B. Li, Z. S. Fei, C. Q. Zhou, and Y. Zhang, Physical-layer security in space information
networks: A survey, IEEE Int. Things J., vol. 7, no. 1, pp. 33–52, 2020.
[13] N. Saeed, A. Elzanaty, H. Almorad, H. Dahrouj, T. Y. Al-Naffouri, and M. S. Alouini,
CubeSat communications: Recent advances and future challenges, IEEE Commun. Surv.
Tutorials, vol. 22, no. 3, pp. 1839– 1862, 2020.
[14] O. Simeone, A very brief introduction to machine learning with applications to
communication systems, IEEE Trans. Cogn. Commun. Netw., vol. 4, no. 4, pp. 648– 664, 2018.
[15] M. Z. Chen, U. Challita, W. Saad, C. C. Yin, and M. Debbah, Artificial neural
networks-based machine learning for wireless networks: A tutorial, IEEE Commun. Surv.
Tutorials, vol. 21, no. 4, pp. 3039–3071, 2019.
[16] Y. C. Qian, J. Wu, R. Wang, F. S. Zhu, and W. Zhang, Survey on reinforcement learning
applications in communication networks, J. Commun. Inform. Netw., vol. 4, no. 2, pp. 30–39,
2019.
[17] E. C. Strinati, S. Barbarossa, J. L. Gonzalez-Jimenez, D. Ktenas, N. Cassiau, L. Maret,
and C. Dehos, 6G: The next frontier: From holographic messaging to artificial intelligence
using subterahertz and visible light communication, IEEE Veh. Technol. Mag., vol. 14, no. 3,
pp. 42–50, 2019.
[18] J. Jagannath, N. Polosky, A. Jagannath, F. Restuccia, and T. Melodia, Machine learning
for wireless communications in the Internet of Things: A comprehensive survey, Ad Hoc
Networks, vol. 93, p. 101913, 2019.
[19] G. P. Kumar and P. Venkataram, Artificial intelligence approaches to network
management: Recent advances and a survey, Comput. Commun., vol. 20, no. 15, pp. 1313–
1322, 1997.
[20] Y. L. Zou, J. Zhu, X. B. Wang, and L. Hanzo, A survey on wireless security: Technical
challenges, recent advances, and future trends, Proc. IEEE, vol. 104, no. 9, pp. 1727–1765,
2016.
[21] S. H. Alsamhi, O. Ma, and M. S. Ansari, Survey on artificial intelligence based
techniques for emerging robotic communication, Telecommun. Syst., vol. 72, no. 3, pp. 483–
503, 2019.
[22] H. M. El Misilmani and T. Naous, Machine learning in antenna design: An overview
on machine learning concept and algorithms, presented at 2019 Int. Conf. High Performance
Computing & Simulation (HPCS), Dublin, Ireland, 2019, pp. 600–607.
[23] P. S. Bithas, E. T. Michailidis, N. Nomikos, D. Vouyioukas, and A. G. Kanatas, A
survey on machine-learning techniques for UAV-based communications, Sensors, vol. 19, no.
23, p. 5170, 2019.
[24] M. A. Lahmeri, M. A. Kishk, and M. S. Alouini, Artificial intelligence for UAV-
enabled wireless networks: A survey, IEEE Open J. Commun. Soc., vol. 2, pp. 1015–1040,
2021.
[25] M. Á. Vázquez, P. Henarejos, A. I. Pérez-Neira, E. Grechi, A. Voight, J. C. Gil, I.
Pappalardo, F. Di Credico, and R. M. Lancellotti, On the use of AI for satellite
communications, arXiv preprint arXiv:2007.10110, 2020.