Practice

1. Define Preorder Traversal

Preorder traversal is a method of visiting all nodes in a binary tree. The nodes are visited in the order Root → Left → Right:

- First visit the root node.
- Then traverse the left subtree.
- Finally traverse the right subtree.

It is a type of Depth First Search (DFS). Preorder traversal is useful when we want to create a copy of the tree or represent the tree structure. (A short Java sketch is given after answer 7.)

Example:

   A
  / \
 B   C

Preorder traversal: A, B, C

2. Implement Stack Using Queue (Java)

A stack follows the LIFO (Last In, First Out) principle, while a queue follows FIFO (First In, First Out). We can implement a stack using a single queue by rearranging the elements after every push operation.

import java.util.LinkedList;
import java.util.Queue;

class StackUsingQueue {
    Queue<Integer> q = new LinkedList<>();

    void push(int x) {
        q.add(x);
        // Rotate all older elements behind the new one so the
        // most recently pushed element sits at the front.
        for (int i = 0; i < q.size() - 1; i++)
            q.add(q.remove());
    }

    int pop() { return q.remove(); }          // front of the queue = top of the stack
    int top() { return q.peek(); }
    boolean isEmpty() { return q.isEmpty(); }
}

After inserting an element, we move all previous elements behind it so that the last inserted element comes to the front. This makes the queue behave like a stack.

3. Determine Worst Case Time Complexity

public static void function(int n) {
    int i = 1, s = 1;
    while (s < n) {
        s = s + i;
        i++;
    }
}

In this code, s grows as 1 + 2 + 3 + ... + k, the sum of the first k natural numbers, which is k(k+1)/2. The loop stops when this sum reaches n, so k(k+1)/2 ≈ n, which gives k ≈ √(2n). Hence, the worst-case time complexity is O(√n).

4. What is Complexity of an Algorithm?

The complexity of an algorithm tells us how much time or memory the algorithm uses as the input size increases. It helps us compare algorithms.

Given:
Algorithm A → O(2 log n)
Algorithm B → O(log₂ n)

In Big-O notation, constant factors are ignored, so O(2 log n) becomes O(log n). The base of the logarithm also does not matter, because changing the base only multiplies by a constant, so log₂ n is also O(log n). Therefore, both algorithms have the same time complexity: O(log n).

5. Divide and Conquer Approach

Divide and Conquer is a technique where we:

- Divide the problem into smaller subproblems.
- Solve each subproblem recursively.
- Combine the results to get the final solution.

This approach often reduces complexity and makes the problem easier to solve.

Example: Binary Search

- Divide the array into two halves.
- Check the middle element.
- Search only in one half.

Other examples: Merge Sort, Quick Sort. It is efficient for large problems. (A binary search sketch is given after answer 7.)

6. Fractional Knapsack Problem

In the fractional knapsack problem, we can take fractions of items. The goal is to maximize the total profit within a given weight capacity.

Steps:

- Calculate the profit/weight ratio for each item.
- Sort the items in descending order of this ratio.
- Pick items fully while the capacity allows.
- If some capacity remains, take a fraction of the next item.

It is solved with a greedy algorithm because we make the best available choice at each step. Time complexity: O(n log n), due to the sorting. (A sketch is given after answer 7.)

7. Longest Common Prefix (Java)

The longest common prefix is the longest starting string that is common to all strings in the array. If no common prefix exists, return "".

public String longestCommonPrefix(String[] strs) {
    if (strs.length == 0) return "";
    String prefix = strs[0];
    for (int i = 1; i < strs.length; i++) {
        // Shrink the prefix until strs[i] starts with it.
        while (strs[i].indexOf(prefix) != 0) {
            prefix = prefix.substring(0, prefix.length() - 1);
            if (prefix.isEmpty()) return "";
        }
    }
    return prefix;
}

Example:
Input: ["flower", "flow", "flight"]
Output: "fl"
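To make answer 1 concrete, here is a minimal Java sketch of preorder traversal on the example tree; the Node class and method names are illustrative and not part of the original answer.

class PreorderDemo {
    static class Node {
        char value;
        Node left, right;
        Node(char value) { this.value = value; }
    }

    // Root -> Left -> Right
    static void preorder(Node node) {
        if (node == null) return;
        System.out.print(node.value + " ");   // visit the root first
        preorder(node.left);                  // then the left subtree
        preorder(node.right);                 // then the right subtree
    }

    public static void main(String[] args) {
        Node root = new Node('A');
        root.left = new Node('B');
        root.right = new Node('C');
        preorder(root);   // prints: A B C
    }
}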
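The binary search steps described in answer 5 can be sketched as follows (iterative version; the class and method names are assumptions made for illustration).

class BinarySearchDemo {
    // Returns the index of target in a sorted array, or -1 if it is absent.
    static int binarySearch(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;              // check the middle element
            if (a[mid] == target) return mid;
            else if (a[mid] < target) lo = mid + 1;    // search only the right half
            else hi = mid - 1;                         // search only the left half
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {2, 5, 8, 12, 20};
        System.out.println(binarySearch(a, 12));   // prints: 3
    }
}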
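Likewise, the greedy steps from answer 6 can be sketched like this; the item encoding (profit, weight pairs) and the method name are assumptions made for illustration.

import java.util.Arrays;
import java.util.Comparator;

class FractionalKnapsackDemo {
    // items[i] = {profit, weight}; returns the maximum achievable profit.
    static double maxProfit(double[][] items, double capacity) {
        // Sort by profit/weight ratio in descending order.
        Arrays.sort(items, Comparator.comparingDouble((double[] it) -> it[0] / it[1]).reversed());
        double profit = 0;
        for (double[] it : items) {
            if (capacity <= 0) break;
            double take = Math.min(it[1], capacity);   // whole item, or the fraction that still fits
            profit += it[0] * (take / it[1]);
            capacity -= take;
        }
        return profit;
    }

    public static void main(String[] args) {
        double[][] items = {{60, 10}, {100, 20}, {120, 30}};
        System.out.println(maxProfit(items, 50));   // prints: 240.0
    }
}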
8. Implement LCS (Longest Common Subsequence)

LCS finds the longest subsequence that is common to two strings. It is solved using Dynamic Programming.

public int lcs(String s1, String s2) {
    int m = s1.length();
    int n = s2.length();
    int[][] dp = new int[m + 1][n + 1];
    for (int i = 1; i <= m; i++) {
        for (int j = 1; j <= n; j++) {
            if (s1.charAt(i - 1) == s2.charAt(j - 1))
                dp[i][j] = dp[i - 1][j - 1] + 1;                  // characters match: extend the LCS
            else
                dp[i][j] = Math.max(dp[i - 1][j], dp[i][j - 1]);  // skip a character from one string
        }
    }
    return dp[m][n];
}

Time complexity: O(m × n)

9. Hill Climbing Algorithm

Hill climbing is a local search algorithm used in optimization problems.

Working:

- Start with an initial solution.
- Check the neighboring solutions.
- Move to a better neighbor.
- Stop when no better neighbor is found.

Advantages:

- Simple to implement.
- Fast for small problems.

Disadvantages:

- Can get stuck in a local maximum.
- Does not guarantee an optimal solution.

Examples: 8 Queens Problem, Traveling Salesman Problem.

10. Graph Representation

There are two main ways to represent a graph:

Adjacency Matrix
- A 2D array.
- If an edge exists, the value is 1.
- Space complexity: O(V²).

Adjacency List
- Each vertex stores a list of its neighbors.
- Space complexity: O(V + E).
- More efficient for sparse graphs.

Program to Detect Cycle in an Undirected Graph (adj is the adjacency list of the graph):

boolean isCycle(int v, boolean[] visited, int parent) {
    visited[v] = true;
    for (int i : adj[v]) {
        if (!visited[i]) {
            if (isCycle(i, visited, v)) return true;
        } else if (i != parent) {
            return true;   // a visited neighbor that is not the parent closes a cycle
        }
    }
    return false;
}

If a visited node is reached again and it is not the parent, a cycle exists.

11. Greedy vs Dynamic Programming

Greedy:
- Makes the best choice at each step.
- Faster and simpler.
- May not always give the optimal solution.

Dynamic Programming:
- Breaks the problem into subproblems.
- Stores previous results.
- Gives the optimal solution.

Use Greedy when the greedy-choice property holds. Use DP when overlapping subproblems exist.

12. Prim's Algorithm (Example)

Prim's algorithm is used to find a Minimum Spanning Tree (MST).

Steps:

- Start from any vertex.
- Choose the smallest edge connecting a visited vertex to an unvisited vertex.
- Repeat until all vertices are included.

Example edges: A-B (1), B-C (2), C-D (3). The MST includes the smallest edges connecting all vertices.

Time complexity: O(V²) with an adjacency matrix, or O(E log V) with a priority queue. (A sketch is given after answer 13.)

13. Postorder Traversal

Postorder traversal visits nodes in the order Left → Right → Root.

Steps:

- Traverse the left subtree.
- Traverse the right subtree.
- Visit the root node.

Example:

   A
  / \
 B   C

Postorder: B, C, A

It is useful for deleting a tree or evaluating expressions. (A short Java sketch follows.)
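To make answer 13 concrete, here is a minimal Java sketch of postorder traversal; the Node class and method names are illustrative and not part of the original answer.

class PostorderDemo {
    static class Node {
        char value;
        Node left, right;
        Node(char value) { this.value = value; }
    }

    // Left -> Right -> Root
    static void postorder(Node node) {
        if (node == null) return;
        postorder(node.left);                  // traverse the left subtree
        postorder(node.right);                 // traverse the right subtree
        System.out.print(node.value + " ");    // visit the root last
    }

    public static void main(String[] args) {
        Node root = new Node('A');
        root.left = new Node('B');
        root.right = new Node('C');
        postorder(root);   // prints: B C A
    }
}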
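Finally, referring back to answer 12, this is a simple O(V²) sketch of Prim's algorithm over an adjacency matrix; the graph encoding and method name are assumptions made for illustration.

class PrimDemo {
    // graph[i][j] = edge weight, or 0 if there is no edge; returns the total MST weight.
    static int primMst(int[][] graph) {
        int v = graph.length;
        boolean[] inMst = new boolean[v];
        int[] key = new int[v];                        // cheapest known edge into the MST
        java.util.Arrays.fill(key, Integer.MAX_VALUE);
        key[0] = 0;                                    // start from vertex 0
        int total = 0;

        for (int count = 0; count < v; count++) {
            // Pick the unvisited vertex with the smallest key.
            int u = -1;
            for (int i = 0; i < v; i++)
                if (!inMst[i] && (u == -1 || key[i] < key[u])) u = i;
            inMst[u] = true;
            total += key[u];

            // Update the keys of the remaining vertices.
            for (int w = 0; w < v; w++)
                if (graph[u][w] != 0 && !inMst[w] && graph[u][w] < key[w])
                    key[w] = graph[u][w];
        }
        return total;
    }

    public static void main(String[] args) {
        // Edges from answer 12: A-B (1), B-C (2), C-D (3), with vertices A..D as 0..3.
        int[][] graph = {
            {0, 1, 0, 0},
            {1, 0, 2, 0},
            {0, 2, 0, 3},
            {0, 0, 3, 0}
        };
        System.out.println(primMst(graph));   // prints: 6
    }
}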