Binary search trees (BSTs) are a data structure with several advantages over arrays and linked lists. With a binary search tree you can traverse the tree quickly to find nodes, and you can insert and delete nodes much faster than in an array, because you don't have to shift every element over to fill the gap. A BST can also serve as the basis of a very efficient sorting algorithm, running in O(n log(n)) time.
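As a quick illustration of that sorting idea (usually called tree sort), here is a minimal sketch, separate from the classes defined below — the names and the `[value, left, right]` node shape are mine, chosen just to keep it short:

```python
# Minimal tree-sort sketch: build a BST, then read it back in order.
def tree_sort(values):
    root = None

    def insert(node, v):
        # Each node is a [value, left, right] list in this sketch.
        if node is None:
            return [v, None, None]
        if v <= node[0]:
            node[1] = insert(node[1], v)
        else:
            node[2] = insert(node[2], v)
        return node

    def in_order(node, out):
        # In-order traversal visits values in sorted order.
        if node is not None:
            in_order(node[1], out)
            out.append(node[0])
            in_order(node[2], out)
        return out

    for v in values:
        root = insert(root, v)
    return in_order(root, [])

print(tree_sort([50, 30, 70, 65, 55]))  # → [30, 50, 55, 65, 70]
```

Each insertion costs O(log n) on a reasonably balanced tree, and the in-order walk is O(n), which is where the O(n log(n)) total comes from.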

BSTs are binary, meaning each node has at most two children: a left child and a right child. The rule with a BST is that all values in a node's left subtree must be less than or equal to the value stored in that node, and all values in its right subtree must be greater than it.
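That ordering rule is what makes lookups fast: at every node you can discard one entire subtree. A small sketch (node shape and names are illustrative, not from the code below):

```python
# Sketch: the ordering rule lets a lookup discard half the tree at each step.
# Nodes are (value, left, right) tuples here just to keep the example short.
def contains(node, target):
    if node is None:
        return False
    value, left, right = node
    if target == value:
        return True
    elif target < value:
        return contains(left, target)   # target can only be in the left subtree
    else:
        return contains(right, target)  # target can only be in the right subtree

#        50
#       /  \
#     30    70
tree = (50, (30, None, None), (70, None, None))
print(contains(tree, 70))  # → True
print(contains(tree, 42))  # → False
```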

Now, BSTs are not ideal when they are very unbalanced, which can happen fairly easily, so there are many algorithms that keep a tree from getting too unbalanced and force its height to stay reasonable, which improves traversal efficiency.

A BST that is not rebalanced by some algorithm can potentially degenerate into a linked list, at which point lookups take linear time, which defeats the purpose.
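You can see this degeneration by inserting already-sorted keys into a plain BST — every node ends up as a right child. A sketch (names and the dict-based node are mine):

```python
# Sketch: sorted input into an unbalanced BST produces a right-leaning chain,
# so the height equals n and lookups degrade to O(n), like a linked list.
def insert(node, v):
    if node is None:
        return {"value": v, "left": None, "right": None}
    side = "left" if v <= node["value"] else "right"
    node[side] = insert(node[side], v)
    return node

def depth(node):
    if node is None:
        return 0
    return 1 + max(depth(node["left"]), depth(node["right"]))

root = None
for v in [1, 2, 3, 4, 5]:  # sorted input: the worst case
    root = insert(root, v)
print(depth(root))  # → 5: height equals the number of nodes
```

A balanced tree holding the same five values would have a height of 3.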

In [7]:

```
class Node(object):
    def __init__(self, value=None):
        self.value = value
        self.lChild = None
        self.rChild = None
```
In [34]:

```
class Tree(object):
    def __init__(self):
        self.size = 0
        self.root = None

    def insert(self, root, node):
        # Walk down from root and attach node on the correct side.
        if root is None:
            self.root = node
        elif root.value > node.value:
            if root.lChild is None:
                root.lChild = node
            else:
                self.insert(root.lChild, node)
        else:
            if root.rChild is None:
                root.rChild = node
            else:
                self.insert(root.rChild, node)

    def in_order_print(self, root):
        # In-order traversal prints the values in sorted order.
        if root is None:
            return
        self.in_order_print(root.lChild)
        print(root.value)
        self.in_order_print(root.rChild)
```
In [35]:

```
root = Node(50)
tree = Tree()
tree.insert(root, Node(93))
tree.insert(root, Node(33))
tree.insert(root, Node(88))
tree.insert(root, Node(65))
tree.insert(root, Node(3))
tree.insert(root, Node(12))
# print(root.lChild.value)
tree.in_order_print(root)
```
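The `Tree` class above has no lookup method; here is a hedged sketch of what one could look like, reusing the same `lChild`/`rChild` naming (the `find` function itself is my own addition, and a minimal `Node` is redefined so the sketch runs on its own):

```python
# Hypothetical find helper matching the Node class above.
class Node(object):
    def __init__(self, value=None):
        self.value = value
        self.lChild = None
        self.rChild = None

def find(root, value):
    # Walk down, going left or right according to the BST ordering rule.
    while root is not None:
        if value == root.value:
            return root
        root = root.lChild if value < root.value else root.rChild
    return None

root = Node(50)
root.lChild = Node(33)
root.rChild = Node(93)
print(find(root, 33) is not None)  # → True
print(find(root, 41) is not None)  # → False
```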

Here is the awesome resource I used to solidify my understanding of this topic: a video on AVL trees.

Basically, the concept behind AVL trees is that you store a height for each node in the tree. If either subtree of a node is taller than the opposite subtree by more than one, we have to move nodes around in a way that restores that condition.

There are four cases that require rotations: right-right, right-left, left-left, and left-right. In each of these cases we perform a different kind of rotation to keep the tree balanced.
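As a sketch of what one of those rotations looks like (the class and function names here are mine, not from the skeleton below), a single left rotation fixes the right-right case:

```python
# Left rotation for the right-right case: the right child becomes the new
# subtree root, and its old left subtree is reattached under the old root.
class N:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def rotate_left(root):
    pivot = root.right          # right child moves up
    root.right = pivot.left     # pivot's old left subtree is re-homed
    pivot.left = root           # old root becomes pivot's left child
    return pivot                # pivot is the new subtree root

# Right-right chain 1 -> 2 -> 3 becomes a balanced subtree rooted at 2.
chain = N(1, None, N(2, None, N(3)))
balanced = rotate_left(chain)
print(balanced.value, balanced.left.value, balanced.right.value)  # → 2 1 3
```

The right rotation for the left-left case is the mirror image, and the mixed cases (right-left, left-right) are handled by rotating the child first so the subtree reduces to one of the straight cases.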

There are two base cases to this, because this process is recursive:

- The recursive call has reached the root of the tree.
- The balance factor of the parent has been adjusted to zero. You should convince yourself that once a subtree has a balance factor of zero, then the balance of its ancestor nodes does not change.
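The balance factor mentioned above is just the height difference between a node's two subtrees. A minimal recursive sketch (names and the tuple-based node are mine):

```python
# Balance factor = height(left) - height(right); |factor| > 1 means rotate.
# Nodes are (value, left, right) tuples in this sketch.
def height(node):
    if node is None:
        return 0
    return 1 + max(height(node[1]), height(node[2]))

def balance_factor(node):
    return height(node[1]) - height(node[2])

leaning = (1, None, (2, None, (3, None, None)))  # right-heavy chain
print(balance_factor(leaning))  # → -2, so a left rotation is needed
```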

In [ ]:

```
class Node(object):
    def __init__(self, value, parent):
        self.value = value
        self.parent = parent
        self.rightChild = None
        self.leftChild = None

class AvlBst(object):
    def __init__(self):
        self.root = None
        self.size = 0

    def add(self, node: Node):
        if self.root is None:
            self.root = node
        else:
            pass  # TODO: descend to the correct leaf, attach, then rebalance

    def find(self, value) -> Node:
        pass  # TODO: standard BST search

    def remove(self, node):
        pass  # TODO: BST delete, then rebalance up to the root

    def _height(self, node) -> int:
        # Height of the subtree rooted at node; an empty subtree has height 0.
        if node is None:
            return 0
        return 1 + max(self._height(node.leftChild), self._height(node.rightChild))

    @property
    def height(self) -> int:
        return self._height(self.root)

    def balance_factor(self, node) -> int:
        # Positive means left-heavy, negative means right-heavy.
        return self._height(node.leftChild) - self._height(node.rightChild)
```