Suresh Gyan Vihar University: Department of CEIT
Q1 What is a database management system (DBMS)?
ANS.
A database management system (DBMS) is system software for creating and managing databases. The
DBMS provides users and programmers with a systematic way to create, retrieve, update and
manage data.
A DBMS makes it possible for end users to create, read, update and delete data in a database. The DBMS
essentially serves as an interface between the database and end users or application programs, ensuring
that data is consistently organized and remains easily accessible.
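These create, read, update and delete operations can be sketched with Python's built-in sqlite3 module standing in for the DBMS (the accounts table and its columns are illustrative assumptions, not part of the original notes):

```python
import sqlite3

# An in-memory SQLite database stands in for the DBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT, balance REAL)")

# Create
conn.execute("INSERT INTO accounts (owner, balance) VALUES (?, ?)", ("Asha", 100.0))

# Read
row = conn.execute("SELECT owner, balance FROM accounts WHERE id = 1").fetchone()

# Update
conn.execute("UPDATE accounts SET balance = balance + 50 WHERE id = 1")

# Delete
conn.execute("DELETE FROM accounts WHERE id = 1")
conn.commit()
```

In each case the application only issues declarative SQL; the DBMS decides how the data is physically stored and accessed.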
The DBMS manages three important things: the data; the database engine, which allows data to be
accessed, locked and modified; and the database schema, which defines the database's logical
structure. These three foundational elements help provide concurrency, security, data integrity and
uniform administration procedures. Typical database administration tasks supported by the DBMS
include change management, performance monitoring/tuning and backup and recovery. Many database
management systems are also responsible for automated rollbacks, restarts and recovery as well as
the logging and auditing of activity.
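The automated rollback behaviour mentioned above can be demonstrated with sqlite3: an uncommitted change is undone when the transaction is rolled back after a simulated failure (the table and values are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t VALUES (1)")
conn.commit()  # the first row is durably committed

try:
    conn.execute("INSERT INTO t VALUES (2)")  # starts a new transaction
    raise RuntimeError("simulated failure")   # crash before commit
except RuntimeError:
    conn.rollback()  # the DBMS undoes the uncommitted insert

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]  # only the committed row remains
```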
Q2 Draw the DBMS architecture/structure and explain the various components included in it. How
does the control manager control the operation of the database?
ANS.
Structure of DBMS:
DBMS (Database Management System) acts as an interface between the user and the database. The
user requests the DBMS to perform various operations such as insert, delete, update and retrieval on
the database.
The components of DBMS perform these requested operations on the database and provide necessary
data to the users.
DDL Compiler:
The Data Definition Language (DDL) compiler processes the schema definitions specified in the DDL.
These definitions include metadata such as the names of the files, data items, storage details of each
file, mapping information and constraints.
The DML commands, such as insert, update, delete and retrieve, from the application program are sent
to the DML compiler for compilation into object code for database access.
The object code is then optimized by the query optimizer, which finds the best way to execute the
query, and is then sent to the data manager.
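The optimizer's choice of access path can be inspected in SQLite with EXPLAIN QUERY PLAN, a rough analogue of the query optimization step described above (the schema and index names are assumptions for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (id INTEGER PRIMARY KEY, dept TEXT)")
conn.execute("CREATE INDEX idx_dept ON emp (dept)")

# Ask the optimizer how it would execute this query; each returned row
# describes the chosen access path (e.g. a search using idx_dept).
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM emp WHERE dept = ?", ("CS",)
).fetchall()
```

Because an index exists on the filtered column, the plan reports an index search rather than a full table scan.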
Data Manager:
The Data Manager is the central software component of the DBMS, also known as the Database Control
System.
Data Dictionary:
The Data Dictionary stores metadata about the database, in particular its schema: the names of the
tables, the names and lengths of the attributes of each table, and the number of rows in each table.
It also holds detailed information on the physical database design, such as storage structures, access
paths, and file and record sizes.
The data dictionary is used to control data integrity, database operation and accuracy. It can be
regarded as an important part of the DBMS.
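A concrete analogue of the data dictionary is SQLite's built-in catalog, which can be queried like ordinary data (the student table here is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (roll INTEGER PRIMARY KEY, name TEXT)")

# sqlite_master is SQLite's catalog: it lists every table in the schema.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

# PRAGMA table_info exposes per-attribute metadata (column name and type).
cols = [(r[1], r[2]) for r in conn.execute("PRAGMA table_info(student)")]
```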
Data Files:
Data files store the database itself, i.e., the actual data on disk.
Compiled DML:
The DML compiler converts high-level queries into low-level file access commands, known as
compiled DML.
End Users:
The second class of users is the end users, who interact with the system from online workstations or
terminals.
They use the interface provided as an integral part of the database system software.
A user can request access to the database, in the form of a query, either directly in a particular
language such as SQL, or through a pre-developed application interface.
Such requests are sent to the query evaluation engine via the DML pre-compiler and DML compiler.
The query evaluation engine accepts the query and analyses it.
It finds a suitable way to execute the compiled SQL statements of the query.
Finally, the compiled SQL statements are executed to perform the specified operation.
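The pre-developed application interface path mentioned above can be sketched as a small wrapper function, so the end user never writes SQL directly (the function name, table and data are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (title TEXT, author TEXT)")
conn.execute("INSERT INTO books VALUES ('DBMS Notes', 'CEIT')")
conn.commit()

def find_books_by_author(author):
    """Pre-developed interface: the user supplies a value, not SQL."""
    cur = conn.execute("SELECT title FROM books WHERE author = ?", (author,))
    return [row[0] for row in cur]

titles = find_books_by_author("CEIT")
```

Internally the wrapper still submits SQL, which the query evaluation engine analyses and executes as described.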
Query Processor Units:
a. DDL Interpreter
b. DML Compiler
c. Embedded DML Pre-compiler
ANS.
With millions of customers accessing the bank systems daily at ATMs, branches, online, and
through multiple call centers, any downtime or service disruptions are practically unacceptable to the
bank. With a growing portion of customers relying on online and mobile banking, 24/7 service reliability
has become more critical than ever.
To address these needs, major efforts and resources have been directed towards the creation of a
robust high availability and disaster recovery infrastructure.
In this complex infrastructure comprising multiple datacenters, configuration changes are undertaken
daily by different groups in various parts of the environment. While each team was making an
effort to apply best practices in its own domain, there was no visibility into the implications and risks
introduced by such modifications for the overall stability, service availability, and DR readiness of
critical systems.
As the IT environment has grown in size and complexity, keeping production high availability and
disaster recovery systems in complete sync across IT teams and domains (e.g., server, storage,
databases and virtualization) has become an increasing challenge. Moreover, management was lacking
visibility into how well the organization was keeping up with established Service Level Agreements
(SLAs) for availability (RTO), data protection (RPO), and retention objectives.
Following management’s directive, a committee was put in place to define the requirements for a
solution: