1. Define the concept of a finite state machine (FSM):
A finite state machine has a finite number of states and
transitions.
States change based on inputs.
It models systems or computations.
Includes a start state and one or more accepting states.
2. Illustrate how a transition diagram helps in recognizing languages:
A transition diagram is a visual representation of state
transitions.
States are represented as circles, and transitions as arrows.
It shows valid paths for input strings.
Helps to trace strings to determine acceptance.
3. Define language acceptance by a finite automaton:
A string is accepted if reading it leaves the automaton in an accepting state; the accepted language is the set of all such strings.
Defined over a specific alphabet.
Only valid strings belong to the language.
Rejects strings that don't fit the defined rules.
4. Define deterministic finite automaton (DFA):
A DFA has exactly one transition from each state for each input symbol.
No ambiguity in transitions.
Recognizes regular languages.
Easier to implement than NFAs.
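As a quick sketch (not part of the original definition), a DFA can be simulated in Python with a transition table; the example machine, which accepts binary strings ending in 0, and its state names are assumptions for illustration:

# Hypothetical DFA accepting binary strings that end in 0.
# 'q0' = last symbol was 0 (accepting); 'q1' = otherwise.
delta = {
    ('q1', '0'): 'q0', ('q1', '1'): 'q1',
    ('q0', '0'): 'q0', ('q0', '1'): 'q1',
}

def dfa_accepts(s, start='q1', accepting={'q0'}):
    state = start
    for ch in s:
        state = delta[(state, ch)]  # exactly one move per (state, symbol)
    return state in accepting

print(dfa_accepts('1010'))  # True
print(dfa_accepts('101'))   # False

Note how determinism shows up in the code: the table lookup never has to choose between alternatives.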
5. Define non-deterministic finite automaton (NFA):
An NFA allows multiple transitions for the same input.
Can have epsilon (ϵ) transitions, which require no input.
Equal in power to DFAs: both recognize exactly the regular languages.
Simpler to design but harder to implement directly.
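A matching NFA sketch: simulation tracks the set of states the machine could be in. The example automaton, accepting strings that contain the substring "ab", is an assumption for illustration (epsilon transitions are omitted for brevity):

# Hypothetical NFA accepting strings over {a, b} containing "ab".
# A (state, symbol) pair may have several successor states.
delta = {
    ('s', 'a'): {'s', 'p'}, ('s', 'b'): {'s'},
    ('p', 'b'): {'f'},
    ('f', 'a'): {'f'}, ('f', 'b'): {'f'},
}

def nfa_accepts(s, start='s', accepting={'f'}):
    current = {start}
    for ch in s:
        current = set().union(*(delta.get((q, ch), set()) for q in current))
    return bool(current & accepting)

print(nfa_accepts('bab'))  # True
print(nfa_accepts('ba'))   # False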
6. Describe the role of an alphabet in formal languages:
An alphabet is a set of symbols used to define strings.
It forms the building blocks of languages.
Example: {a, b, c}.
All strings and languages are derived from it.
7. Define regular expressions and list some common identity rules:
Regular expressions represent patterns in regular languages.
Use operators such as *, | (union), and concatenation to define languages.
Example: (a|b)* denotes all strings over {a, b}, including the empty string.
Rules: (A|B)* = (A*B*)*, A(B|C) = AB | AC, Aϵ = A, A∅ = ∅.
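Python's re module can demonstrate the (a|b)* example directly; re.fullmatch requires the entire string to match:

import re

pattern = re.compile(r'(a|b)*')
print(bool(pattern.fullmatch('abba')))  # True: only a's and b's
print(bool(pattern.fullmatch('abc')))   # False: 'c' is outside the alphabet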
8. Define the pumping lemma for regular sets:
A property used to prove a language is not regular.
States that any string longer than a certain length p can be split as xyz, with the middle part y repeatable ("pumped") any number of times.
Proves non-regularity by contradiction.
Example: L = {a^n b^n | n ≥ 0} is not regular.
9. Illustrate the closure properties of regular sets:
Regular languages are closed under union, intersection, and
complement.
Also closed under concatenation and the Kleene star.
This allows combining and modifying regular languages.
Guarantees that combining regular languages with these operations yields another regular language.
10. Discuss the significance of right linear and left linear grammars:
Right linear grammars have productions of the form A → aB or A → a.
Left linear grammars have productions of the form A → Ba or A → a.
Both generate exactly the regular languages.
Right linear grammars translate directly into NFAs and DFAs; left linear grammars are less common but useful in special cases.
11. How does the pumping lemma prove the non-regularity of a language?
Assume the language is regular.
Use the pumping lemma to split strings and "pump" parts.
Show strings that cannot fit the language's rules.
Proves the language is not regular.
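A small Python sketch of this argument for L = {a^n b^n | n ≥ 0}; the pumping length p and the particular split shown are illustrative assumptions:

def in_L(s):
    # Membership test for L = { a^n b^n : n >= 0 }.
    n = len(s) // 2
    return s == 'a' * n + 'b' * n

p = 5                      # pretend pumping length from the "L is regular" assumption
s = 'a' * p + 'b' * p      # s is in L and |s| >= p
# Since |xy| <= p, y must lie entirely within the leading a's.
x, y, z = s[:2], s[2:4], s[4:]   # one allowed split: y = 'aa'
print(in_L(x + y * 2 + z))  # False: pumping adds a's without matching b's

Because every allowed split places y inside the a's, pumping always unbalances the string, contradicting the assumption that L is regular.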
12. Define the significance of the acceptance of strings in automata
theory:
Acceptance determines if a string belongs to a language.
Helps validate inputs in systems.
Ensures computations follow defined rules.
Links theoretical models to practical uses.
13. Define Moore Machine:
A finite automaton where outputs depend only on current states.
Outputs are fixed for each state.
Simpler and more predictable.
Example: Traffic light systems.
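A minimal Moore machine sketch in Python (the two-state parity example is an assumption); the output function reads the state alone:

# Hypothetical Moore machine: reports parity of 1-bits seen so far.
delta  = {('even', '0'): 'even', ('even', '1'): 'odd',
          ('odd',  '0'): 'odd',  ('odd',  '1'): 'even'}
output = {'even': 'E', 'odd': 'O'}   # output depends only on the state

def moore_run(bits, state='even'):
    outs = [output[state]]           # an output is emitted on entering each state
    for b in bits:
        state = delta[(state, b)]
        outs.append(output[state])
    return ''.join(outs)

print(moore_run('101'))  # 'EOOE'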
14. Define Mealy Machine:
A finite automaton where outputs depend on states and inputs.
Outputs are dynamic and change with transitions.
More flexible than Moore machines.
Example: Digital counters.
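The same parity example recast as a Mealy machine for contrast (again an illustrative sketch): each output is attached to a transition, so it depends on both state and input:

# Hypothetical Mealy machine: emits the running parity on each input bit.
step = {('even', '0'): ('even', 'E'), ('even', '1'): ('odd', 'O'),
        ('odd',  '0'): ('odd',  'O'), ('odd',  '1'): ('even', 'E')}

def mealy_run(bits, state='even'):
    outs = []
    for b in bits:
        state, out = step[(state, b)]  # output chosen per transition
        outs.append(out)
    return ''.join(outs)

print(mealy_run('101'))  # 'OOE': exactly one output per input symbol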
15. Define: 1) Strings 2) Alphabet:
A string is a sequence of symbols from an alphabet.
An alphabet is a finite set of symbols.
Example: {a, b} is an alphabet; "abba" is a string.
Strings combine to form languages.
16. Define the significance of context-free grammars in syntax analysis:
Context-free grammars (CFG) define rules for hierarchical
structures.
Used in programming language syntax.
Example: Arithmetic expressions.
Helps build parse trees for code analysis.
17. Define ambiguity in context-free grammars with an example:
Ambiguity occurs when a grammar allows multiple parse trees
for a string.
Example: E → E+E | E*E | id.
Makes parsing unclear and inconsistent.
Needs resolution for correct interpretation.
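For the grammar above, the string id + id * id has two distinct leftmost derivations, hence two parse trees (one groups it as id + (id * id), the other as (id + id) * id):

E ⇒ E + E ⇒ id + E ⇒ id + E * E ⇒ id + id * E ⇒ id + id * id
E ⇒ E * E ⇒ E + E * E ⇒ id + E * E ⇒ id + id * E ⇒ id + id * id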
18. Define Chomsky Normal Form (CNF):
A CFG with rules A → BC or A → a.
Simplifies parsing and computation.
Used in CYK parsing algorithms.
Efficient for language processing.
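A compact CYK sketch in Python; the CNF grammar shown, generating {a^n b^n | n ≥ 1}, is an assumption for illustration:

# Hypothetical CNF grammar: S -> A T | A B,  T -> S B,  A -> a,  B -> b
rules = {('A', 'T'): {'S'}, ('A', 'B'): {'S'}, ('S', 'B'): {'T'}}
terms = {'a': {'A'}, 'b': {'B'}}

def cyk(s, start='S'):
    n = len(s)
    # table[i][j] holds the non-terminals deriving s[i : i+j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(s):
        table[i][0] = set(terms.get(ch, set()))
    for length in range(2, n + 1):          # span length
        for i in range(n - length + 1):     # span start
            for k in range(1, length):      # split point
                for b in table[i][k - 1]:
                    for c in table[i + k][length - k - 1]:
                        table[i][length - 1] |= rules.get((b, c), set())
    return start in table[0][n - 1]

print(cyk('aabb'))  # True
print(cyk('aab'))   # False

The O(n^3) triple loop is possible only because CNF limits every rule body to two non-terminals.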
19. Define Greibach Normal Form (GNF):
A CFG where every production has the form A → aα, where a is a terminal and α is a (possibly empty) string of non-terminals.
Simplifies recursive parsing.
Each leftmost derivation step produces exactly one terminal of the string.
Useful for top-down parsers.
20. Define the concept of pushdown automata:
A computational model with a stack for memory.
Recognizes context-free languages.
Handles nested structures like parentheses.
Extends finite automata with stack operations.
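The stack discipline can be sketched in Python with a balanced-parentheses check (a deterministic special case of a PDA, shown only to illustrate the push/pop idea):

def balanced(s):
    stack = []
    for ch in s:
        if ch == '(':
            stack.append(ch)   # push on an opening symbol
        elif ch == ')':
            if not stack:
                return False   # nothing left to match
            stack.pop()        # pop on a closing symbol
    return not stack           # accept iff the stack is empty

print(balanced('(()())'))  # True
print(balanced('(()'))     # False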
21. Define the pumping lemma for context-free languages:
Gives a necessary condition that every context-free language satisfies.
States that sufficiently long strings can be split into five parts uvwxy, with v and x pumped together.
Proves a language is not context-free by contradiction.
Example: L = {a^n b^n c^n | n ≥ 1} is not context-free.
22. Define the role of a stack in a pushdown automaton:
A stack stores symbols to track context and nesting.
Operates on a last-in, first-out (LIFO) basis.
Enables recognition of context-free languages.
Handles structures like balanced parentheses.
23. Describe how context-free languages are used in parsing programming languages:
Context-free languages define hierarchical and nested structures.
Used to parse if-else blocks, loops, and function calls.
Recognized by pushdown automata.
Example: Arithmetic expressions in programming.
24. List the applications of the pumping lemma:
Proves non-regularity of languages.
Distinguishes between regular and non-regular languages.
Tests context-free vs. non-context-free languages.
Identifies the limits of regular and context-free grammars.
25. Explain how CFGs and PDAs are inter-convertible:
Every context-free grammar (CFG) has an equivalent PDA.
Every PDA has an equivalent CFG.
Ensures theoretical equivalence between grammars and automata.
Provides tools for both recognition and generation of languages.
26. Define concatenation:
Combines two strings or languages sequentially.
Example: "a" + "b" = "ab".
Builds more complex structures from simpler ones.
Essential in regular and context-free languages.
27. Define a Turing machine and its components:
A Turing machine is a theoretical model of computation.
Components: an infinite tape, a read/write head, a state register, and a transition function.
Solves any algorithmic problem that is computable.
Basis of modern computation theory.
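A minimal simulator sketch in Python; the machine below, which increments a binary number with the head starting at the rightmost bit, is an illustrative assumption:

# delta[(state, symbol)] = (new_state, symbol_to_write, head_move)
delta = {
    ('carry', '1'): ('carry', '0', -1),  # 1 plus carry gives 0, carry moves left
    ('carry', '0'): ('done',  '1',  0),  # absorb the carry
    ('carry', '_'): ('done',  '1',  0),  # number grew one digit ('_' is blank)
}

def tm_increment(bits):
    tape = dict(enumerate(bits))         # sparse tape over cell indices
    head, state = len(bits) - 1, 'carry'
    while state != 'done':
        state, write, move = delta[(state, tape.get(head, '_'))]
        tape[head] = write
        head += move
    lo, hi = min(tape), max(tape)
    return ''.join(tape.get(i, '_') for i in range(lo, hi + 1))

print(tm_increment('1011'))  # '1100'
print(tm_increment('111'))   # '1000'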
28. Illustrate the role of a tape in a Turing machine:
The tape acts as infinite memory for input, output, and intermediate computations.
Tracks data and head position.
Enables both read and write operations.
Central to the machine's functionality.
29. Define the significance of recursively enumerable languages in automata theory:
Languages accepted by a Turing machine that may never halt on strings outside the language.
Represent the most general class of languages.
Include all computable problems.
Highlight the limits of language recognition.
30. Define the relationship between context-sensitive
languages and linear bounded automata:
Context-sensitive languages are recognized by linear bounded automata (LBA).
An LBA uses tape space proportional to the input length.
Extends computational power beyond context-free languages.
Example: L = {a^n b^n c^n | n ≥ 1}.
31. Define computable functions in the context of Turing machines:
Functions solvable by a Turing machine.
Includes arithmetic, logic, and algorithmic tasks.
Defines the boundary of computation.
Central to understanding decidability.
32. Explain how Turing machines are used to simulate algorithms:
Execute algorithms step by step with defined transition rules.
Use states and the tape for data manipulation.
Model real-world computational procedures.
Act as the theoretical foundation for computers.
33. Describe how a Turing machine can be used to model real-world computations:
Simulates any process with a clear algorithm.
Models data manipulation, decision-making, and iteration.
Basis for designing programming languages and algorithms.
Explains computational limits.
34. Define the concept of decidability in the context of Turing
machines:
A problem is decidable if some Turing machine halts on every input with the correct yes/no answer.
Decidability determines the solvability of problems.
Divides problems into solvable and unsolvable categories.
Central to computation and logic.
35. State the role of Turing machines in the theory of
computation:
Defines the limits of algorithmic computation.
Provides a framework for studying languages and problems.
Models both solvable and unsolvable problems.
Basis for modern computer science.
36. Define the significance of the halting problem in Turing machines:
The halting problem asks whether a given Turing machine halts on a given input.
Proven undecidable: no algorithm can answer it for every machine and input.
Highlights the limits of algorithmic computation.
Central to understanding computation theory.
37. Define the role of a compiler in software development:
Converts high-level code into machine-readable format.
Detects and reports errors in source code.
Optimizes performance.
Bridges programming and execution.
38. Define the significance of lexical analysis in the compilation
process:
Breaks source code into tokens for parsing.
Detects basic lexical errors.
First phase of compilation.
Converts human-readable code into manageable units.
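A toy tokenizer sketch using Python's re module; the token categories and the tiny language are assumptions for illustration:

import re

TOKEN_SPEC = [
    ('NUMBER', r'\d+'),
    ('IDENT',  r'[A-Za-z_]\w*'),
    ('OP',     r'[+\-*/=]'),
    ('SKIP',   r'\s+'),
]
TOKEN_RE = re.compile('|'.join(f'(?P<{name}>{pat})' for name, pat in TOKEN_SPEC))

def tokenize(code):
    for m in TOKEN_RE.finditer(code):
        if m.lastgroup != 'SKIP':      # whitespace is dropped, not tokenized
            yield (m.lastgroup, m.group())

print(list(tokenize('x = 42 + y')))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('IDENT', 'y')]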
39. Discuss the various phases involved in the compilation process:
Lexical analysis: Tokenizes source code.
Syntax analysis: Checks grammatical structure.
Semantic analysis: Validates the meaning of code.
Code generation and optimization: Produces efficient machine code.
40. Define the front-end and back-end of a compiler:
The front-end handles syntax and semantics.
The back-end generates optimized machine code.
Front-end ensures code correctness.
Back-end ensures execution efficiency.
41. Define the concept of top-down parsing in syntax analysis:
Parsing starts from the root of the parse tree.
Example: Recursive-descent parsing.
Easier to implement and understand.
Suitable for LL grammars.
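A recursive-descent sketch in Python for the tiny grammar E → T ('+' T)*, T → NUMBER; the grammar and function names are illustrative assumptions. One function per non-terminal, and the call tree mirrors the parse tree:

def parse_expr(tokens, pos=0):
    value, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == '+':
        rhs, pos = parse_term(tokens, pos + 1)
        value += rhs
    return value, pos

def parse_term(tokens, pos):
    tok = tokens[pos]
    if not tok.isdigit():
        raise SyntaxError(f'expected a number, got {tok!r}')
    return int(tok), pos + 1

print(parse_expr(['1', '+', '2', '+', '3']))  # (6, 5): value 6, 5 tokens consumed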
42. Define the importance of parse trees in syntactic analysis:
Represents the structure of input code.
Helps validate syntax according to grammar rules.
Used for debugging and optimizing code.
Essential in compiler design.
43. Define differences between LL and LR parsing techniques:
LL parsing: Top-down, scans left-to-right, constructs leftmost derivations.
LR parsing: Bottom-up, scans left-to-right, constructs rightmost derivations in reverse.
LL is simpler but less powerful.
LR handles more complex grammars and languages.