Data Structures Notes

Data Structures Basics
Stacks
Queues
Linked List
Sorting
Tree
Graphs
Hashing and Searching

Q1. What are data structures and why are they required?
Ans

A data structure is a particular way of organizing data in a computer so that it can be used effectively.
For example, we can store a list of items having the same data-type using the array data structure.
Data structure can also be defined as a particular way of storing and organizing information in a computer. Different kinds of data structures are meant for different kinds of applications, and some are highly specialized to specific tasks.
Data structures are important for the following reasons:
1. Data structures are used in almost every program or software system and are an essential part of the code.
2. Specific data structures are essential ingredients of many efficient algorithms, and they make it possible to manage huge amounts of data.
3. Some programming languages emphasize data structures as the key organizing factor in software design.

Q2. Explain the basic types of abstract data types (ADTs).
Ans

Abstract Data type (ADT) is a type (or class) for objects whose behaviour is defined by a set of value and a set of operations.
The definition of ADT only mentions what operations are to be performed but not how these operations will be implemented. It does not specify how data will be organized in memory and what algorithms will be used for implementing the operations. It is called “abstract” because it gives an implementation-independent view. The process of providing only the essentials and hiding the details is known as abstraction.
Three common ADTs are the List ADT, the Stack ADT, and the Queue ADT.
List ADT

The data is generally stored in key sequence in a list that has a head structure consisting of a count, pointers, and the address of the compare function needed to compare the data in the list.
Each data node contains a pointer to a data structure and a self-referential pointer which points to the next node in the list. The following operations can be performed on the list:
get() – Returns element from the list at any given position.
insert() – Inserts element at any position of the list.
remove() – Remove the first occurrence of any element from a non-empty list.
removeAt() – Remove the element at a specified location from a non-empty list.
replace() – Replace an element at any position by another element.
size() – Return the total number of elements in the list.
isEmpty() – Return true if the list is empty, else return false.
isFull() – Return true if the list is full, else false.
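The list operations above can be sketched as a small array-backed implementation in C. This is a minimal illustration under assumptions, not a standard API: the names (List, list_insert, list_remove_at, and so on) and the fixed capacity are choices made for the example.

```c
#include <stdbool.h>

#define CAPACITY 100

/* A minimal array-backed List ADT: callers use only the operations
   below and never touch the storage directly. */
typedef struct {
    int items[CAPACITY];
    int count;
} List;

int  list_get(const List *l, int pos) { return l->items[pos]; }
int  list_size(const List *l)         { return l->count; }
bool list_is_empty(const List *l)     { return l->count == 0; }
bool list_is_full(const List *l)      { return l->count == CAPACITY; }

/* Insert value at position pos, shifting later elements right. */
bool list_insert(List *l, int pos, int value) {
    if (list_is_full(l) || pos < 0 || pos > l->count) return false;
    for (int i = l->count; i > pos; i--) l->items[i] = l->items[i - 1];
    l->items[pos] = value;
    l->count++;
    return true;
}

/* Remove the element at pos, shifting later elements left. */
bool list_remove_at(List *l, int pos) {
    if (pos < 0 || pos >= l->count) return false;
    for (int i = pos; i < l->count - 1; i++) l->items[i] = l->items[i + 1];
    l->count--;
    return true;
}
```

A real implementation would typically grow its storage dynamically; the fixed array keeps the sketch short.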

Stack

In a Stack ADT implementation, each node stores a pointer to the data rather than the data itself. Memory is allocated for the data and its address is passed to the stack ADT. The head node and the data nodes are encapsulated in the ADT; the calling function can only see the head pointer to the stack. The stack head structure also contains a pointer to the top node and a count of the number of entries currently in the stack.
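A minimal C sketch of such a head structure might look like the following; the names (Stack, StackNode, stack_push, stack_pop) are hypothetical, and error handling is kept to a bare minimum.

```c
#include <stdbool.h>
#include <stdlib.h>

/* Each node stores a pointer to the caller's data, not the data itself. */
typedef struct StackNode {
    void *data;
    struct StackNode *next;
} StackNode;

/* The head structure holds a pointer to the top node and an entry
   count; callers see only this head, never the individual nodes. */
typedef struct {
    StackNode *top;
    int count;
} Stack;

bool stack_push(Stack *s, void *data) {
    StackNode *node = malloc(sizeof *node);
    if (!node) return false;
    node->data = data;
    node->next = s->top;
    s->top = node;
    s->count++;
    return true;
}

/* Returns the stored data pointer, or NULL if the stack is empty. */
void *stack_pop(Stack *s) {
    if (!s->top) return NULL;
    StackNode *node = s->top;
    void *data = node->data;
    s->top = node->next;
    s->count--;
    free(node);
    return data;
}
```

Because the nodes hold void pointers, the same stack code works for any data type the caller allocates.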

Queue

A queue works like a queue of people: in technical terms, a Queue contains elements of the same type arranged in sequential order. Operations take place at both ends; insertion is done at the rear and deletion is done at the front. The following operations can be performed:
enqueue() – Insert an element at the rear of the queue.
dequeue() – Remove and return the first element of the queue, if the queue is not empty.
peek() – Return the first element of the queue without removing it, if the queue is not empty.
size() – Return the number of elements in the queue.
isEmpty() – Return true if the queue is empty, otherwise return false.
isFull() – Return true if the queue is full, otherwise return false.
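The queue operations above can be sketched as a fixed-capacity circular queue in C. The function names follow the operations listed, but the circular-array layout and the QCAP constant are just one possible implementation choice assumed for the example.

```c
#include <stdbool.h>

#define QCAP 8

/* A fixed-capacity circular queue: enqueue at the rear, dequeue at
   the front; front and count together locate the live elements. */
typedef struct {
    int items[QCAP];
    int front, count;
} Queue;

bool queue_is_empty(const Queue *q) { return q->count == 0; }
bool queue_is_full(const Queue *q)  { return q->count == QCAP; }
int  queue_size(const Queue *q)     { return q->count; }

bool enqueue(Queue *q, int value) {
    if (queue_is_full(q)) return false;
    q->items[(q->front + q->count) % QCAP] = value;  /* rear slot */
    q->count++;
    return true;
}

/* Removes the front element into *out; fails on an empty queue. */
bool dequeue(Queue *q, int *out) {
    if (queue_is_empty(q)) return false;
    *out = q->items[q->front];
    q->front = (q->front + 1) % QCAP;
    q->count--;
    return true;
}

/* Reads the front element without removing it. */
bool peek(const Queue *q, int *out) {
    if (queue_is_empty(q)) return false;
    *out = q->items[q->front];
    return true;
}
```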

Q3. What are asymptotic notations and what are their types?
Ans

There are three main types of asymptotic notation:
1. Big-O Notation (Ο) – describes the worst case scenario.
2. Omega Notation (Ω) – Omega(Ω) describes the best case scenario.
3. Theta Notation (θ) – represents the average complexity of an algorithm.
Big-O Notation (O)
Big O notation specifically describes the worst case scenario. It represents the upper bound on the running time complexity of an algorithm.
O(1)
Big O notation O(1) represents the complexity of an algorithm that always executes in the same time or space regardless of the input data.
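As a sketch, any fixed sequence of operations is O(1); for example, swapping two values takes the same time no matter how large the surrounding input is.

```c
/* O(1): swapping two values is a fixed number of operations,
   independent of any input size. */
void swap(int *a, int *b) {
    int tmp = *a;
    *a = *b;
    *b = tmp;
}
```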
O(n)
Big O notation O(n) represents the complexity of an algorithm whose performance grows linearly (in direct proportion) with the size of the input data.
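A minimal O(n) sketch: summing an array touches each element exactly once, so the work grows linearly with n.

```c
/* O(n): a single pass over the input, so the running time grows
   linearly with the number of elements n. */
long array_sum(const int *arr, int n) {
    long total = 0;
    for (int i = 0; i < n; i++)
        total += arr[i];  /* one constant-time step per element */
    return total;
}
```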
O(n^2)
Big O notation O(n^2) represents the complexity of an algorithm, whose performance is directly proportional to the square of the size of the input data.
Examples: the bubble sort, insertion sort and selection sort algorithms.
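Bubble sort, mentioned above, is a typical O(n^2) algorithm: for each element it makes a pass over the remaining elements, giving nested loops. A minimal C sketch:

```c
/* O(n^2): bubble sort compares adjacent pairs in nested loops, so
   the number of comparisons grows with the square of n. */
void bubble_sort(int *arr, int n) {
    for (int i = 0; i < n - 1; i++)
        for (int j = 0; j < n - 1 - i; j++)
            if (arr[j] > arr[j + 1]) {   /* swap out-of-order pair */
                int tmp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = tmp;
            }
}
```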
If we have to draw a diagram to compare the performance of algorithms denoted by these notations, then we would draw it like this:
O(1) < O(log n) < O (n) < O(n log n) < O(n^2) < O (n^3)< O(2^n) < O(n!)
Omega Notation (Ω)
Omega notation specifically describes the best case scenario. It represents the lower bound on the running time complexity of an algorithm. So if we represent the complexity of an algorithm in Omega notation, it means the algorithm cannot complete in less time than this: it will take at least the time represented by the Omega notation, or more.
Theta Notation (θ)
This notation describes both the upper bound and the lower bound of an algorithm, so we can say that it defines exact asymptotic behaviour. In a real scenario an algorithm does not always run on its best or worst case; the average running time lies between best and worst and can be represented by the theta notation.

Q4. What do you understand by constant time complexity?

Ans

An algorithm is said to be constant time (also written as O(1) time) if the value of T(n) is bounded by a value that does not depend on the size of the input. For example, accessing any single element of an array by its index takes constant time, as only one operation has to be performed to locate it.
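Reading an array element by index compiles to one address calculation and one memory read, regardless of how long the array is. A small sketch (element_at is a hypothetical helper name used only for this illustration):

```c
/* Constant time: computing arr[i] is one address calculation and one
   memory read, whatever the value of n. */
int element_at(const int *arr, int n, int i) {
    (void)n;  /* the array length never enters the computation */
    return arr[i];
}
```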

Q5. Explain with an example the concept of linear space complexity.

Ans

Linear space complexity – the algorithm takes space directly proportional to the input size. Consider the following example:

#include <stdio.h>

int main()
{
    int n, i, mul = 1;
    scanf("%d", &n);
    int arr[n];
    for (i = 0; i < n; i++)
    {
        scanf("%d", &arr[i]);
        mul *= arr[i];
    }
    printf("%d", mul);
    return 0;
}
Explanation:
In the code given above, the array holds n integer elements, so (assuming 4-byte ints) the space occupied by the array is 4 * n bytes. We also have the integer variables n, i and mul; assuming 4 bytes for each, the total space occupied by the program is 4n + 12 bytes. Since the highest-order term in 4n + 12 is n, the space complexity is O(n), i.e., linear.

Q6. What do you understand by order of growth?

Ans

The order of growth of an algorithm describes how its computation time increases as the input size increases. Order of growth provides only a crude description of the behaviour of a process.

Q7. Explain the lower bound and upper bound of an algorithm.

Ans

Providing an upper bound means you have proven that the algorithm will use no more than some limit on a resource. Proving a lower bound means you have proven that the algorithm will use no less than some limit on a resource. The lower and upper bound theory also provides a way to find the lowest-complexity algorithm that solves a problem.
Lower Bound –
Let L(n) be the running time of an algorithm A; then g(n) is a lower bound of A if there exist two constants C and N such that L(n) >= C*g(n) for all n > N. The lower bound of an algorithm is denoted by the asymptotic notation Big Omega (Ω).
Upper Bound –
Let U(n) be the running time of an algorithm A; then g(n) is an upper bound of A if there exist two constants C and N such that U(n) <= C*g(n) for all n > N. The upper bound of an algorithm is denoted by the asymptotic notation Big O.
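A concrete sketch: linear search has a best case of one comparison (when the target sits at index 0), giving a lower bound of Ω(1), and a worst case of n comparisons, giving an upper bound of O(n).

```c
/* Linear search: returns the index of target, or -1 if absent.
   Best case (target at index 0) takes one comparison, so the
   running time is Omega(1); the worst case scans all n elements,
   so it is O(n). */
int linear_search(const int *arr, int n, int target) {
    for (int i = 0; i < n; i++)
        if (arr[i] == target)
            return i;
    return -1;
}
```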

Q8. Describe the symmetric, reflexive and transitive properties of asymptotic notations.

Ans

Given asymptotic functions f(n), g(n) and h(n), the properties can be defined as follows:
Reflexivity:
f(n) = O(f(n))
Symmetry:
f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
Transitivity:
f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n))

Q9. According to you, which is more important: time complexity or space complexity? Justify.

Ans

The relative importance of time and space complexity is defined by the requirements of the problem at hand.
Space is fixed for any given set of hardware: if you don't have enough, you simply can't run the algorithm. If you're lucky, you can fake having more space with clever caching to disk, but most algorithms will not permit this, since most do not have any particular "nice" property that makes it possible. Thus space complexity plays an important role in practice.
Time, on the other hand, is more flexible: you can usually let a program run for longer. But if the time complexity is high enough, extra time does not help, so an improvement in time complexity is generally preferred, and it may eventually reduce the space required as well.
Thus, in cases where a space-time trade-off applies, you will usually want to use as much space as practical to accelerate the algorithm, as you will probably have that space anyway.
If you are building hardware to match the problem, then it becomes a much more complicated cost-benefit trade-off.

Q10. How can data structures be classified?

Ans

Data structures are broadly classified into primitive and non-primitive types. Primitive data structures (such as int, char, float and pointers) are operated on directly by machine instructions. Non-primitive data structures are further classified into linear structures (arrays, stacks, queues, linked lists), where elements are arranged in sequence, and non-linear structures (trees, graphs), where elements may be connected hierarchically or arbitrarily.