set-2

51. What is the process of finding the location of a given item in a collection of items called?

  1. Discovering

  2. Finding

  3. Searching

  4. Mining

Answer: 3. Searching

Explanation:

  • Searching: Searching is the process of finding the location of a specific item in a collection of items. It involves checking each element in the collection until the desired item is found.

  • Conclusion: The process of finding the location of a given item in a collection is called searching.
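The idea can be sketched in a few lines of Python (a minimal illustration; the function name `linear_search` is ours, not from the question):

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    for i, item in enumerate(items):
        if item == target:   # check each element until the item is found
            return i
    return -1

idx = linear_search([7, 3, 9, 3], 9)
```

Here `idx` is 2: the search examines each element in turn and stops at the first match.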

52. What is the time complexity of quicksort?

  1. O(n)

  2. O(log n)

  3. O(n^2)

  4. O(n log n)

Answer: 4. O(n log n)

Explanation:

  • Quicksort: Quicksort is a divide-and-conquer algorithm that selects a pivot element and partitions the array into two subarrays: one with elements less than the pivot and one with elements greater than the pivot. It then recursively sorts the subarrays.

  • Time Complexity: The average-case time complexity of quicksort is O(n log n). In the worst case (e.g., when the pivot is always the smallest or largest element), it degrades to O(n^2).

  • Conclusion: The average-case time complexity of quicksort is O(n log n).
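A minimal quicksort sketch in Python, written for clarity rather than in-place efficiency (the list comprehensions copy the array on each call):

```python
def quicksort(arr):
    """Sort by partitioning around a pivot, then recursing on each side."""
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    less    = [x for x in arr if x < pivot]
    equal   = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

For example, `quicksort([5, 2, 9, 1, 5, 6])` returns `[1, 2, 5, 5, 6, 9]`.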

53. What is quicksort also known as?

  1. Merge sort

  2. Tree sort

  3. Shell sort

  4. Partition and exchange sort

Answer: 4. Partition and exchange sort

Explanation:

  • Quicksort: Quicksort is also known as partition and exchange sort because it works by partitioning the array around a pivot element and exchanging elements to sort the array.

  • Conclusion: Quicksort is commonly referred to as partition and exchange sort.

54. What sorting method is good for alphabetizing a large list of names?

  1. Merge

  2. Heap

  3. Radix

  4. Bubble

Answer: 3. Radix

Explanation:

  • Radix Sort: Radix sort is a non-comparative sorting algorithm that works by distributing elements into buckets based on their digits or characters. It is particularly efficient for sorting strings or numbers with a fixed number of digits.

  • Alphabetizing Names: Radix sort is well-suited for alphabetizing a large list of names because it can sort strings lexicographically.

  • Conclusion: Radix sort is a good choice for alphabetizing a large list of names.
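As a rough illustration, a least-significant-digit radix sort for lowercase words padded to a fixed width (the helper and its 27-bucket scheme, a space bucket plus 'a'–'z', are our own assumptions):

```python
def radix_sort_words(words, width):
    """LSD radix sort: stable bucket passes from last character to first."""
    words = [w.ljust(width) for w in words]        # pad short names with spaces
    for pos in range(width - 1, -1, -1):
        buckets = [[] for _ in range(27)]          # bucket 0 = space, 1..26 = 'a'..'z'
        for w in words:
            c = w[pos]
            idx = 0 if c == ' ' else ord(c) - ord('a') + 1
            buckets[idx].append(w)
        words = [w for b in buckets for w in b]    # stable concatenation
    return [w.rstrip() for w in words]
```

For example, `radix_sort_words(["bob", "ann", "al", "zoe"], 3)` yields `["al", "ann", "bob", "zoe"]`.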

55. What is the total number of comparisons in a bubble sort?

  1. O(n log n)

  2. O(2n)

  3. O(n^2)

  4. O(n)

Answer: 3. O(n^2)

Explanation:

  • Bubble Sort: Bubble sort is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. This process is repeated until the list is sorted.

  • Number of Comparisons: In the worst case, bubble sort makes n(n-1)/2 comparisons, which is O(n^2).

  • Conclusion: The total number of comparisons in bubble sort is O(n^2).
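The comparison count can be checked directly with a small sketch that tallies every comparison (the function name and counter are illustrative):

```python
def bubble_sort(arr):
    """Sort a copy of arr; return (sorted list, number of comparisons made)."""
    a = list(arr)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):       # each pass bubbles one element into place
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons
```

For 5 elements the counter reaches 4 + 3 + 2 + 1 = 10, i.e. n(n-1)/2.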

56. What form of access is used to add and remove nodes from a queue?

  1. LIFO, Last In First Out

  2. FIFO, First In First Out

  3. Both A and B

  4. None of these

Answer: 2. FIFO, First In First Out

Explanation:

  • Queue: A queue is a data structure that follows the First-In-First-Out (FIFO) principle, meaning the first element added is the first one to be removed.

  • Access Method: Nodes are added to the rear of the queue and removed from the front, ensuring FIFO access.

  • Conclusion: The form of access used to add and remove nodes from a queue is FIFO.
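A quick FIFO demonstration using Python's `collections.deque` (one reasonable way to model a queue; `append` enqueues at the rear, `popleft` dequeues from the front):

```python
from collections import deque

q = deque()
q.append("first")    # enqueue at the rear
q.append("second")
q.append("third")
front = q.popleft()  # dequeue from the front: first in, first out
```

After the `popleft`, `front` is `"first"` and the queue holds `"second"` and `"third"`.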

57. Enqueue is the process of ………..

  1. Removing data from queue

  2. Adding item into a queue

  3. Searching item in queue

  4. None of the above

Answer: 2. Adding item into a queue

Explanation:

  • Enqueue: Enqueue is the process of adding an item to the rear of a queue.

  • Conclusion: Enqueue refers to adding an item into a queue.

58. What data structure do the terms "push" and "pop" relate to?

  1. Array

  2. Lists

  3. Stacks

  4. Trees

Answer: 3. Stacks

Explanation:

  • Push and Pop: These are operations associated with stacks. "Push" adds an element to the top of the stack, and "pop" removes the top element from the stack.

  • Conclusion: The terms "push" and "pop" are related to stacks.
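A short demonstration using a Python list as the stack (a common convention; `append` plays the role of push and `pop` removes the top element):

```python
stack = []
stack.append(10)   # push: add to the top
stack.append(20)   # push
top = stack.pop()  # pop: removes the most recently pushed element (LIFO)
```

Here `top` is 20, and 10 remains on the stack.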

59. What is an application of stack?

  1. Finding factorial

  2. Tower of Hanoi

  3. Infix to postfix

  4. All of the above

Answer: 4. All of the above

Explanation:

  • Stack Applications: Stacks are used in various applications, including:

    1. Finding Factorial: Recursive algorithms often use stacks to manage function calls.

    2. Tower of Hanoi: The problem can be solved using a stack-based approach.

    3. Infix to Postfix Conversion: Stacks are used to convert infix expressions to postfix notation.

  • Conclusion: All of the above are applications of stacks.
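As an illustration of the third application, a minimal stack-based infix-to-postfix converter (a simplified shunting-yard sketch assuming single-character operands, the four basic operators, and parentheses):

```python
def infix_to_postfix(expr):
    """Convert a space-free infix string to postfix using an operator stack."""
    prec = {'+': 1, '-': 1, '*': 2, '/': 2}
    out, stack = [], []
    for tok in expr:
        if tok.isalnum():                 # operand goes straight to output
            out.append(tok)
        elif tok == '(':
            stack.append(tok)
        elif tok == ')':
            while stack[-1] != '(':       # pop until the matching '('
                out.append(stack.pop())
            stack.pop()                   # discard '('
        else:                             # operator: pop higher-or-equal precedence
            while stack and stack[-1] != '(' and prec[stack[-1]] >= prec[tok]:
                out.append(stack.pop())
            stack.append(tok)
    while stack:                          # flush remaining operators
        out.append(stack.pop())
    return ''.join(out)
```

For example, `infix_to_postfix("a+b*c")` gives `"abc*+"`, and `infix_to_postfix("(a+b)*c")` gives `"ab+c*"`.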

60. What is the operation of processing each element in a list called?

  1. Sorting

  2. Merging

  3. Inserting

  4. Traversal

Answer: 4. Traversal

Explanation:

  • Traversal: Traversal is the process of visiting each element in a list or data structure exactly once.

  • Conclusion: The operation of processing each element in a list is called traversal.

61. What is the situation called when, in a linked list, START = NULL?

  1. Underflow

  2. Overflow

  3. Houseful

  4. Saturated

Answer: 1. Underflow

Explanation:

  • Underflow: In a linked list, if START=NULL, it means the list is empty. Attempting to remove an element from an empty list results in an underflow condition.

  • Conclusion: When START=NULL in a linked list, it is called underflow.

62. What are two-way lists?

  1. Grounded header list

  2. Circular header list

  3. Linked list with header and trail nodes

  4. List traversed in two directions

Answer: 4. List traversed in two directions

Explanation:

  • Two-Way Lists: A two-way list, also known as a doubly linked list, allows traversal in both forward and backward directions. Each node contains pointers to both the next and previous nodes.

  • Conclusion: Two-way lists are lists that can be traversed in two directions.

63. What is the pointer associated with the availability list?

  1. FIRST

  2. AVAIL

  3. TOP

  4. REAR

Answer: 2. AVAIL

Explanation:

  • Availability List: In memory management, the AVAIL pointer points to the list of available (free) nodes that can be allocated for use.

  • Conclusion: The pointer associated with the availability list is AVAIL.

64. What data structure cannot store non-homogeneous data elements?

  1. Arrays

  2. Records

  3. Pointers

  4. Stacks

Answer: 1. Arrays

Explanation:

  • Arrays: Arrays are homogeneous data structures, meaning they can only store elements of the same data type.

  • Non-Homogeneous Data: Structures like records or objects can store non-homogeneous data (e.g., different data types).

  • Conclusion: Arrays cannot store non-homogeneous data elements.

65. What is a non-linear data structure?

  1. Stacks

  2. List

  3. Strings

  4. Trees

Answer: 4. Trees

Explanation:

  • Non-Linear Data Structure: A non-linear data structure is one where elements are not arranged sequentially. Examples include trees and graphs.

  • Trees: Trees are hierarchical data structures where each node can have multiple children.

  • Conclusion: Trees are an example of a non-linear data structure.

66. What data structure is suitable for representing hierarchical relationships between elements?

  1. Dequeue

  2. Priority

  3. Tree

  4. Graph

Answer: 3. Tree

Explanation:

  • Hierarchical Relationships: Trees are ideal for representing hierarchical relationships because they have a root node and child nodes that form a hierarchy.

  • Conclusion: Trees are suitable for representing hierarchical relationships between elements.

67. What data structure allows deletions at both ends of the list but only allows insertion at one end?

  1. Input restricted dequeue

  2. Output restricted queue

  3. Priority queues

  4. Stack

Answer: 1. Input restricted dequeue

Explanation:

  • Input Restricted Dequeue: An input restricted dequeue allows deletions at both ends but only allows insertions at one end.

  • Conclusion: An input restricted dequeue is the data structure that fits the given description.

68. What is the order of visiting nodes in a pre-order tree traversal?

  1. Root, Left, Right

  2. Left, Root, Right

  3. Right, Root, Left

  4. Left, Right, Root

Answer: 1. Root, Left, Right

Explanation:

  • Pre-order Traversal: In pre-order traversal, the nodes are visited in the following order:

    1. Root: Visit the root node first.

    2. Left: Traverse the left subtree.

    3. Right: Traverse the right subtree.

  • Conclusion: The order of visiting nodes in a pre-order traversal is Root, Left, Right.

69. What is the order of visiting nodes in a post-order tree traversal?

  1. Root, Left, Right

  2. Left, Root, Right

  3. Right, Left, Root

  4. Left, Right, Root

Answer: 4. Left, Right, Root

Explanation:

  • Post-order Traversal: In post-order traversal, the nodes are visited in the following order:

    1. Left: Traverse the left subtree.

    2. Right: Traverse the right subtree.

    3. Root: Visit the root node last.

  • Conclusion: The order of visiting nodes in a post-order traversal is Left, Right, Root.

70. What is the order of visiting nodes in an in-order tree traversal?

  1. Root, Left, Right

  2. Left, Root, Right

  3. Right, Root, Left

  4. Left, Right, Root

Answer: 2. Left, Root, Right

Explanation:

  • In-order Traversal: In in-order traversal, the nodes are visited in the following order:

    1. Left: Traverse the left subtree.

    2. Root: Visit the root node.

    3. Right: Traverse the right subtree.

  • Conclusion: The order of visiting nodes in an in-order traversal is Left, Root, Right.
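The three traversal orders from questions 68–70 can be summarized with a small recursive sketch (the `Node` class and function names are our own):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def preorder(n):   # Root, Left, Right
    return [n.val] + preorder(n.left) + preorder(n.right) if n else []

def inorder(n):    # Left, Root, Right
    return inorder(n.left) + [n.val] + inorder(n.right) if n else []

def postorder(n):  # Left, Right, Root
    return postorder(n.left) + postorder(n.right) + [n.val] if n else []

tree = Node(1, Node(2), Node(3))   # root 1, left child 2, right child 3
```

For this tree: pre-order gives `[1, 2, 3]`, in-order gives `[2, 1, 3]`, post-order gives `[2, 3, 1]`.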

71. What is the average time complexity of searching for an element in a binary search tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 3. O(log n)

Explanation:

  • Binary Search Tree (BST): In a balanced BST, the height of the tree is log n, where n is the number of nodes. Searching for an element follows a single root-to-leaf path, which takes O(log n) time on average.

  • Conclusion: The average time complexity of searching for an element in a binary search tree is O(log n).
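A minimal iterative BST search sketch (the `Node` class is our own; each step discards one whole subtree, which is what makes the balanced-tree cost logarithmic):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def bst_search(node, key):
    """Descend from the root, going left or right by comparison."""
    while node is not None:
        if key == node.val:
            return node
        node = node.left if key < node.val else node.right
    return None   # key not present

root = Node(8, Node(3, None, Node(6)), Node(10))
```

`bst_search(root, 6)` finds the node in two steps; `bst_search(root, 7)` returns `None`.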

72. What is the worst-case time complexity of searching for an element in a binary search tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 2. O(n)

Explanation:

  • Worst-Case Scenario: In the worst case, a binary search tree degenerates into a linked list (e.g., when elements are inserted in sorted order). The height of the tree then becomes n, and searching for an element takes O(n) time.

  • Conclusion: The worst-case time complexity of searching for an element in a binary search tree is O(n).

73. What is the average time complexity of inserting an element into a binary search tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 3. O(log n)

Explanation:

  • Binary Search Tree (BST): In a balanced BST, the height of the tree is log n, where n is the number of nodes. Inserting an element involves traversing from the root to the appropriate leaf, which takes O(log n) time on average.

  • Conclusion: The average time complexity of inserting an element into a binary search tree is O(log n).

74. In a linked list, what is the time complexity of inserting an element at the beginning of the list?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 1. O(1)

Explanation:

  • Insertion at the Beginning: In a linked list, inserting an element at the beginning only requires creating a node and updating the head pointer, which takes constant time, O(1).

  • Conclusion: The time complexity of inserting an element at the beginning of a linked list is O(1).
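A small sketch of the constant-time insertion (the `ListNode` class and `push_front` name are illustrative):

```python
class ListNode:
    def __init__(self, val, nxt=None):
        self.val, self.next = val, nxt

def push_front(head, val):
    """O(1): allocate one node whose next pointer is the old head."""
    return ListNode(val, head)

head = None
for v in [3, 2, 1]:
    head = push_front(head, v)
# the list is now 1 -> 2 -> 3
```

No traversal happens: the cost is the same whether the list has three nodes or three million.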

75. What is the time complexity of deleting an element from the middle of a linked list?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 2. O(n)

Explanation:

  • Deletion from the Middle: To delete an element from the middle of a linked list, you must first traverse the list to find it, which takes O(n) time in the worst case; the unlink itself is O(1).

  • Conclusion: The time complexity of deleting an element from the middle of a linked list is O(n).

76. What is a circular linked list?

  1. A linked list where the last node points to the first node

  2. A linked list where the first node points to the last node

  3. A linked list where the last node points to the second node

  4. A linked list where the first node points to the second node

Answer: 1. A linked list where the last node points to the first node

Explanation:

  • Circular Linked List: In a circular linked list, the last node points back to the first node, creating a circular structure.

  • Conclusion: A circular linked list is one where the last node points to the first node.

77. What is a doubly linked list?

  1. A linked list where each node points to the next node

  2. A linked list where each node points to the previous node

  3. A linked list where each node points to the next and previous nodes

  4. A linked list where the first and last nodes are connected

Answer: 3. A linked list where each node points to the next and previous nodes

Explanation:

  • Doubly Linked List: In a doubly linked list, each node contains two pointers: one pointing to the next node and one pointing to the previous node.

  • Conclusion: A doubly linked list is one where each node points to both the next and previous nodes.

78. What is the advantage of using a doubly linked list over a singly linked list?

  1. A doubly linked list requires less memory

  2. A doubly linked list is faster for inserting and deleting elements

  3. A doubly linked list allows for traversal in both directions

  4. A doubly linked list has better cache performance

Answer: 3. A doubly linked list allows for traversal in both directions

Explanation:

  • Doubly Linked List: The main advantage of a doubly linked list is that it allows traversal in both forward and backward directions, making it easier to navigate the list.

  • Conclusion: The ability to traverse in both directions is the primary advantage of a doubly linked list over a singly linked list.

79. What is the time complexity of reversing a linked list?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 2. O(n)

Explanation:

  • Reversing a Linked List: To reverse a linked list, you must traverse the entire list once, updating the pointer of each node, which takes O(n) time.

  • Conclusion: The time complexity of reversing a linked list is O(n).
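The single-pass pointer reversal can be sketched as follows (class and function names are our own):

```python
class ListNode:
    def __init__(self, val, nxt=None):
        self.val, self.next = val, nxt

def reverse(head):
    """One pass: repoint each node's next pointer to the previous node."""
    prev = None
    while head is not None:
        head.next, prev, head = prev, head, head.next
    return prev   # prev is the new head

head = ListNode(1, ListNode(2, ListNode(3)))
rev = reverse(head)   # list becomes 3 -> 2 -> 1
```

Each node is visited exactly once, so the running time is linear in the list length.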

80. What is the time complexity of finding an element in a binary tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 3. O(log n)

Explanation:

  • Binary Tree Search: The intended reading is a balanced binary search tree, whose height is log n, where n is the number of nodes. A search follows a single root-to-leaf path, which takes O(log n) time on average.

  • Conclusion: The time complexity of finding an element in a balanced binary search tree is O(log n).

81. What is the time complexity of inserting an element in a binary tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 3. O(log n)

Explanation:

  • Binary Tree Insertion: Assuming a balanced tree, the height is log n, where n is the number of nodes. Inserting an element involves traversing from the root to the appropriate leaf, which takes O(log n) time on average.

  • Conclusion: The time complexity of inserting an element in a balanced binary tree is O(log n).

82. What is the time complexity of deleting an element in a binary tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 3. O(log n)

Explanation:

  • Binary Tree Deletion: Assuming a balanced tree, the height is log n, where n is the number of nodes. Deleting an element involves traversing from the root to the appropriate node, which takes O(log n) time on average.

  • Conclusion: The time complexity of deleting an element in a balanced binary tree is O(log n).

83. What is the time complexity of searching for the minimum element in a binary tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 2. O(n)

Explanation:

  • Minimum Element in a Binary Tree: In an arbitrary binary tree the elements are not ordered, so finding the minimum may require examining all n nodes. (Only in a binary search tree is the minimum guaranteed to be the leftmost node.)

  • Conclusion: The time complexity of searching for the minimum element in a binary tree is O(n).

84. What is the time complexity of searching for the maximum element in a binary tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 2. O(n)

Explanation:

  • Maximum Element in a Binary Tree: In an arbitrary binary tree the elements are not ordered, so finding the maximum may require examining all n nodes. (Only in a binary search tree is the maximum guaranteed to be the rightmost node.)

  • Conclusion: The time complexity of searching for the maximum element in a binary tree is O(n).

85. What is the time complexity of performing an in-order traversal of a binary tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 2. O(n)

Explanation:

  • In-order Traversal: In-order traversal visits each node of the binary tree exactly once. Since there are n nodes, the time complexity is O(n).

  • Conclusion: The time complexity of performing an in-order traversal of a binary tree is O(n).

86. What is the time complexity of performing a pre-order traversal of a binary tree?

  1. O(1)

  2. O(n)

  3. O(log n)

  4. O(n log n)

Answer: 2. O(n)

Explanation:

  • Pre-order Traversal: Pre-order traversal visits each node of the binary tree exactly once. Since there are n nodes, the time complexity is O(n).

  • Conclusion: The time complexity of performing a pre-order traversal of a binary tree is O(n).

87. What is the advantage of using the array data structure among the following options?

  1. The entities of mixed type of data types can be easily stored

  2. Access to elements in the array is relatively easy

  3. The index number of the first element always starts from 1

  4. None of the above

Answer: 2. Access to elements in the array is relatively easy

Explanation:

  • Array Advantages: Arrays provide constant-time access to elements using their indices, making it easy to retrieve or modify elements.

  • Conclusion: The primary advantage of using an array is that access to elements is relatively easy and efficient.

88. Which among the following is an application of the queue data structure?

  1. When a resource is shared among multiple users

  2. Load balancing

  3. Asynchronous data transfer

  4. All of the above

Answer: 4. All of the above

Explanation:

  • Queue Applications: Queues are used in various applications, including:

    1. Resource Sharing: Queues manage access to shared resources among multiple users.

    2. Load Balancing: Queues distribute tasks evenly across multiple servers or processes.

    3. Asynchronous Data Transfer: Queues are used to transfer data between processes or systems asynchronously.

  • Conclusion: All of the above are applications of the queue data structure.

89. Which type of data structure can be used to implement queues among the following options?

  1. Linked List

  2. Array

  3. Stack

  4. All of the above

Answer: 4. All of the above

Explanation:

  • Queue Implementation: Queues can be implemented using:

    1. Linked List: A linked list can efficiently implement a queue by maintaining pointers to the front and rear of the list.

    2. Array: A circular array can be used to implement a queue with fixed size.

    3. Stack: Two stacks can be used to simulate a queue.

  • Conclusion: All of the above data structures can be used to implement queues.
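The third point, simulating a queue with two stacks, can be sketched like this (a common textbook construction; the class name is ours):

```python
class QueueFromStacks:
    """FIFO queue simulated with two LIFO stacks (plain Python lists)."""
    def __init__(self):
        self._in, self._out = [], []

    def enqueue(self, x):
        self._in.append(x)              # push onto the inbox stack

    def dequeue(self):
        if not self._out:               # refill: reversing _in restores FIFO order
            while self._in:
                self._out.append(self._in.pop())
        return self._out.pop()          # top of _out is the oldest element
```

Each element is moved between stacks at most once, so the amortized cost per operation is O(1).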

90. Which type of data structure allows for insertion and deletion from both ends?

  1. String

  2. Queue

  3. Stack

  4. Dequeue

Answer: 4. Dequeue

Explanation:

  • Dequeue: A dequeue (double-ended queue) allows insertion and deletion from both the front and rear ends.

  • Conclusion: A dequeue is the data structure that allows insertion and deletion from both ends.

91. Which sorting algorithm is used to achieve the best time complexity in the worst-case scenario?

  1. Bubble sort

  2. Selection sort

  3. Quick sort

  4. Merge sort

Answer: 4. Merge sort

Explanation:

  • Merge Sort: Merge sort has a worst-case time complexity of O(n log n), which is the best among the given options.

  • Conclusion: Merge sort achieves the best time complexity in the worst-case scenario.
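A minimal merge sort sketch (written with list slicing for clarity; an in-place version would differ):

```python
def merge_sort(arr):
    """Split in half, sort each half, merge: O(n log n) even in the worst case."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]      # append whichever half remains
```

For example, `merge_sort([5, 1, 4, 2, 3])` returns `[1, 2, 3, 4, 5]`; the split is always even, so no input ordering can degrade the running time.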

92. What term is used to describe the scenario when a user tries to remove an element from an empty stack?

  1. Underflow

  2. Empty collection

  3. Overflow

  4. Garbage Collection

Answer: 1. Underflow

Explanation:

  • Underflow: Underflow occurs when an attempt is made to remove an element from an empty stack.

  • Conclusion: The term used to describe this scenario is underflow.

93. What is the maximum number of non-zero values that can exist in an adjacency matrix of a simple graph with n vertices?

  1. n(n+1)/2

  2. n(n-1)/2

  3. n(n-1)

  4. n(n+1)

Answer: 3. n(n-1)

Explanation:

  • Adjacency Matrix: In a simple graph with n vertices, the adjacency matrix is an n × n matrix. The maximum number of non-zero entries is n(n-1), since each vertex can be adjacent to every other vertex but not to itself (the diagonal stays zero).

  • Conclusion: The maximum number of non-zero values in an adjacency matrix of a simple graph with n vertices is n(n-1).

94. What is the time complexity of the selection sort algorithm in the best and worst cases?

  1. Best: O(n^2), Worst: O(n)

  2. Best: O(n log n), Worst: O(n^2)

  3. Best: O(n), Worst: O(n^2)

  4. Best: O(n^2), Worst: O(n^2)

Answer: 4. Best: O(n^2), Worst: O(n^2)

Explanation:

  • Selection Sort: Selection sort has a time complexity of O(n^2) in both the best and worst cases because it always performs n(n-1)/2 comparisons, regardless of the input.

  • Conclusion: The time complexity of selection sort is O(n^2) in both the best and worst cases.

95. What type of sorting algorithm is the bubble sort algorithm?

  1. Linear

  2. Non-linear

  3. Logarithmic

  4. Exponential

Answer: 1. Linear

Explanation:

  • Bubble Sort: Bubble sort is classified as a linear sorting algorithm because it operates on a linear list, repeatedly stepping through it and comparing adjacent elements until the list is sorted.

  • Conclusion: Bubble sort is a linear sorting algorithm.

96. What is the space complexity of the merge sort algorithm?

  1. O(n)

  2. O(1)

  3. O(n log n)

  4. O(n^2)

Answer: 1. O(n)

Explanation:

  • Merge Sort: Merge sort requires auxiliary space proportional to the input size n for the temporary arrays used during merging.

  • Conclusion: The space complexity of merge sort is O(n).

97. What is the time complexity of the quick sort algorithm in the average case?

  1. O(n)

  2. O(n^2)

  3. O(n log n)

  4. O(1)

Answer: 3. O(n log n)

Explanation:

  • Quick Sort: In the average case, quick sort has a time complexity of O(n log n) because it divides the array into two subarrays and recursively sorts them.

  • Conclusion: The average-case time complexity of quick sort is O(n log n).

98. Which sorting algorithm is used to sort linked lists?

  1. Insertion sort

  2. Quick sort

  3. Merge sort

  4. Bubble sort

Answer: 3. Merge sort

Explanation:

  • Merge Sort for Linked Lists: Merge sort is well-suited for sorting linked lists because it does not require random access to elements and can efficiently divide and merge the list.

  • Conclusion: Merge sort is commonly used to sort linked lists.

99. Which sorting algorithm is most efficient for small data sets?

  1. Quick sort

  2. Bubble sort

  3. Insertion sort

  4. Merge sort

Answer: 3. Insertion sort

Explanation:

  • Insertion Sort: Insertion sort is efficient for small data sets because it has a low overhead and performs well when the number of elements is small.

  • Conclusion: Insertion sort is the most efficient sorting algorithm for small data sets.

100. What is the time complexity of the insertion sort algorithm in the best case?

  1. O(n^2)

  2. O(n log n)

  3. O(n)

  4. O(1)

Answer: 3. O(n)

Explanation:

  • Insertion Sort Best Case: In the best case (when the input is already sorted), insertion sort makes only n-1 comparisons and no swaps, giving a time complexity of O(n).

  • Conclusion: The best-case time complexity of insertion sort is O(n).
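The early-exit behaviour can be checked with a sketch that counts comparisons (the function name and counter are illustrative):

```python
def insertion_sort(arr):
    """Sort a copy of arr; return (sorted list, number of comparisons made)."""
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1              # compare a[j-1] with a[j]
            if a[j - 1] <= a[j]:
                break                     # element already in place: stop early
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a, comparisons
```

On already-sorted input of length 5 the counter stops at 4 (= n-1); on reversed input it reaches 1 + 2 + 3 + 4 = 10.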
