jeancarlov / Algorithms

How recursive functions work

Invoke the same function with a different input until you reach your base case.

Base case: where the recursion ends. !important!
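A minimal sketch of this idea (sumRange is a made-up example name): the function calls itself with a smaller input each time until the base case stops it.

```javascript
// Sums all integers from n down to 1.
function sumRange(n) {
  if (n === 1) return 1; // base case: recursion ends here
  return n + sumRange(n - 1); // same function, different (smaller) input
}

console.log(sumRange(4)); // → 10 (4 + 3 + 2 + 1)
```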

Second recursive function

Ask yourself these questions:

Can you spot the base case?

Do you notice the different input?

What would happen if we didn't return?

Common recursion pitfalls

Not having a base case, or having an incorrect one

Forgetting to return, or returning the wrong thing

If the return 1 is taken out, the function never stops calling itself; the return is the break
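The "return is the break" point can be seen in a factorial sketch: remove the base-case return and every call keeps recursing until the call stack overflows.

```javascript
function factorial(n) {
  if (n <= 1) return 1; // removing this return would recurse forever (stack overflow)
  return n * factorial(n - 1);
}

console.log(factorial(5)); // → 120
```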

Helper method recusion

Helper method recursion uses two functions: an outer function and an inner recursive helper.
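A sketch of the two-function pattern (collectOdds is a made-up example name): the outer function holds the result, the inner helper does the recursing.

```javascript
function collectOdds(arr) {
  const result = []; // external data lives in the outer function

  function helper(input) {
    if (input.length === 0) return; // base case
    if (input[0] % 2 !== 0) result.push(input[0]);
    helper(input.slice(1)); // recurse on a smaller input
  }

  helper(arr);
  return result;
}

console.log(collectOdds([1, 2, 3, 4, 5])); // → [1, 3, 5]
```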

Pure Recursion

Pure recursion means the function is self-contained: there is no external data structure and no nested helper function. Pure recursion is a bit more challenging than the helper method pattern but is more efficient.

Tip for Pure Recursion

For arrays, use methods like slice, the spread operator, and concat that make copies of arrays so you do not mutate them.

Remember that strings are immutable, so you will need methods like slice, substr, or substring to make copies of strings.

To make copies of objects, use Object.assign or the spread operator.
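Putting those tips together, a pure-recursion sketch of the same odd-collecting example (no helper, no shared array; slice and concat make copies instead of mutating):

```javascript
function collectOddsPure(arr) {
  let newArr = [];
  if (arr.length === 0) return newArr; // base case
  if (arr[0] % 2 !== 0) newArr.push(arr[0]);
  // slice copies the rest of the array; concat copies instead of mutating
  return newArr.concat(collectOddsPure(arr.slice(1)));
}

console.log(collectOddsPure([1, 2, 3, 4, 5])); // → [1, 3, 5]
```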

Searching Algorithms

Index search methods on arrays: indexOf, includes, find

These methods check one element at a time.

Big O of Linear Search (LINEAR IS GREAT FOR UNSORTED ARRAYS)

Best: O(1)

Average: O(n)

Worst: O(n); as n grows, so does the time
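A minimal linear search sketch (the same one-element-at-a-time idea indexOf uses under the hood):

```javascript
// Check each element in turn; works on unsorted arrays, O(n) worst case.
function linearSearch(arr, val) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === val) return i; // found: return the index
  }
  return -1; // not found
}

console.log(linearSearch([9, 4, 7, 2], 7)); // → 2
```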

Binary search is great: O(log n), which is close to O(1), but it only works on sorted arrays (in order).

• Rather than eliminating one element at a time, you can eliminate half of the remaining elements at a time.

Tip for binary => Divide and Conquer

TIME COMPLEXITY OF BINARY SEARCH

Worst and average: O(log n)

Best: O(1)
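A divide-and-conquer sketch of binary search on a sorted array: each comparison discards half of the remaining range.

```javascript
function binarySearch(arr, val) {
  let left = 0;
  let right = arr.length - 1;
  while (left <= right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === val) return mid;
    if (arr[mid] < val) left = mid + 1; // discard the left half
    else right = mid - 1; // discard the right half
  }
  return -1;
}

console.log(binarySearch([1, 3, 5, 7, 9, 11], 9)); // → 4
```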

Elementary Sort Algorithms: Bubble, Insertion, Selection

Bubble sort: bubbles large values into sorted position at the end first. Bubble sort is quadratic, so on large data it will take much longer (time complexity).
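A bubble sort sketch (returns a copy rather than sorting in place, an arbitrary choice for this example):

```javascript
// Each pass "bubbles" the largest remaining value to the end. O(n^2).
function bubbleSort(arr) {
  const a = [...arr];
  for (let i = a.length - 1; i > 0; i--) {
    for (let j = 0; j < i; j++) {
      if (a[j] > a[j + 1]) {
        [a[j], a[j + 1]] = [a[j + 1], a[j]]; // swap adjacent out-of-order pair
      }
    }
  }
  return a;
}

console.log(bubbleSort([5, 3, 8, 1])); // → [1, 3, 5, 8]
```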

Selection sort: places small values into sorted position at the start first.

Insertion sort: builds up the sort by gradually creating a larger sorted left portion; ideal when new data keeps arriving and must be placed into its ordered position.
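An insertion sort sketch showing the growing sorted left portion:

```javascript
// Insert each element into its place within the already-sorted left portion.
function insertionSort(arr) {
  const a = [...arr];
  for (let i = 1; i < a.length; i++) {
    const current = a[i];
    let j = i - 1;
    while (j >= 0 && a[j] > current) {
      a[j + 1] = a[j]; // shift larger values right
      j--;
    }
    a[j + 1] = current; // drop current into its sorted slot
  }
  return a;
}

console.log(insertionSort([4, 1, 3, 2])); // → [1, 2, 3, 4]
```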

Shortcomings: elementary algorithms do not scale well to large inputs.

Intermediate Sort Algorithms: Merge, Quick, Radix Sort

Why these algorithms? They are faster sorting algorithms that improve time complexity from O(n^2) to O(n log n).

Merge

Merge sort runs in O(n log n), which is actually the best a comparison-based sorting algorithm can guarantee.

Merge sort is good on time complexity => O(n log n)
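A merge sort sketch: split until each piece is a single element (trivially sorted), then merge sorted halves back together.

```javascript
// Merge two already-sorted arrays into one sorted array.
function merge(left, right) {
  const out = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) out.push(left[i++]);
    else out.push(right[j++]);
  }
  return out.concat(left.slice(i)).concat(right.slice(j));
}

function mergeSort(arr) {
  if (arr.length <= 1) return arr; // base case: one element is sorted
  const mid = Math.floor(arr.length / 2);
  return merge(mergeSort(arr.slice(0, mid)), mergeSort(arr.slice(mid)));
}

console.log(mergeSort([8, 3, 5, 4, 7, 6, 1, 2])); // → [1, 2, 3, 4, 5, 6, 7, 8]
```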

Quick Sort

It works on the same assumption as merge sort: we split the data until it is individually sorted. However, it works by selecting one element (the pivot), e.g. 5, finding the index where the pivot should end up in the sorted array, and then repeating the pivot process on the left and right sides.

Avoid picking the first element of the array ([0]) as the pivot; instead try the median or a random element.

Quick sort decomposes the main array into left and right subarrays around the pivot, and the recursion returns a fully ordered array as the final output.
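One possible sketch (a Lomuto-style partition; the function names pivot and quickSort are my own choice, and the first element is used as pivot here only for simplicity, despite the warning above):

```javascript
// Place the pivot at its final sorted index; return that index.
function pivot(arr, start = 0, end = arr.length - 1) {
  const pivotVal = arr[start]; // simple choice; a random pivot is safer in practice
  let swapIdx = start;
  for (let i = start + 1; i <= end; i++) {
    if (arr[i] < pivotVal) {
      swapIdx++;
      [arr[i], arr[swapIdx]] = [arr[swapIdx], arr[i]];
    }
  }
  [arr[start], arr[swapIdx]] = [arr[swapIdx], arr[start]]; // pivot into its sorted spot
  return swapIdx;
}

function quickSort(arr, left = 0, right = arr.length - 1) {
  if (left < right) {
    const pivotIdx = pivot(arr, left, right);
    quickSort(arr, left, pivotIdx - 1); // repeat on the left side
    quickSort(arr, pivotIdx + 1, right); // repeat on the right side
  }
  return arr;
}

console.log(quickSort([4, 6, 9, 1, 2, 5, 3])); // → [1, 2, 3, 4, 5, 6, 9]
```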

Comparison sort groups:

(Bubble, Insertion, Selection): lack in time complexity, O(n^2), at large scale; good for small arrays

(Merge, Quick): improve time complexity to O(n log n)

Can we do better? Are there faster algorithms? Yes, but not by making comparisons; the best-case scenario for a comparison sort is O(n log n).

Radix sort (works on lists of numbers, bucketing by digits)

It exploits the fact that information about the size of a number is encoded in its number of digits: more digits means a bigger number.
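A radix sort sketch for non-negative integers, bucketing by each decimal digit from least to most significant (the helper names getDigit and digitCount are my own):

```javascript
// Extract the digit at a given place (0 = ones, 1 = tens, ...).
function getDigit(num, place) {
  return Math.floor(Math.abs(num) / Math.pow(10, place)) % 10;
}

// How many digits a number has.
function digitCount(num) {
  if (num === 0) return 1;
  return Math.floor(Math.log10(Math.abs(num))) + 1;
}

function radixSort(nums) {
  const maxDigits = Math.max(...nums.map(digitCount));
  let arr = [...nums];
  for (let k = 0; k < maxDigits; k++) {
    const buckets = Array.from({ length: 10 }, () => []);
    for (const num of arr) buckets[getDigit(num, k)].push(num); // bucket by kth digit
    arr = [].concat(...buckets); // rebuild in bucket order
  }
  return arr;
}

console.log(radixSort([23, 345, 5467, 12, 2345, 9852])); // → [12, 23, 345, 2345, 5467, 9852]
```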

A binary heap is similar in shape to a binary search tree, but the ordering rule differs: in a heap, only the parent-child relationship matters (e.g. a max-heap parent is larger than its children), whereas a binary search tree also orders left and right children relative to each other.

Binary heaps are important because they are used to build priority queues.

Graphs connect vertices (nodes) through edges. They are used in real life in social media, Netflix, and similar applications, e.g. suggesting a movie similar to one recently watched.

Dijkstra's algorithm acts upon a weighted graph. It is used a lot at tech companies; in some cases companies build their secret algorithms on top of Dijkstra's. What is it? It finds the shortest path between two vertices on a graph.
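A naive O(n^2) sketch of Dijkstra's algorithm without a priority queue (the graph below, with its node names and weights, is made-up example data; a heap-backed priority queue would be faster):

```javascript
function dijkstra(graph, start, finish) {
  const distances = {};
  const previous = {};
  const visited = new Set();
  for (const node in graph) {
    distances[node] = node === start ? 0 : Infinity;
    previous[node] = null;
  }
  while (visited.size < Object.keys(graph).length) {
    // Pick the unvisited node with the smallest known distance (naive linear scan).
    let current = null;
    for (const node in graph) {
      if (!visited.has(node) && (current === null || distances[node] < distances[current])) {
        current = node;
      }
    }
    if (current === null || distances[current] === Infinity) break;
    visited.add(current);
    for (const { node: neighbor, weight } of graph[current]) {
      const candidate = distances[current] + weight;
      if (candidate < distances[neighbor]) {
        distances[neighbor] = candidate; // found a shorter path to neighbor
        previous[neighbor] = current;
      }
    }
  }
  // Walk back from finish to start to recover the path.
  const path = [];
  let node = finish;
  while (node) { path.push(node); node = previous[node]; }
  return path.reverse();
}

const graph = {
  A: [{ node: 'B', weight: 4 }, { node: 'C', weight: 2 }],
  B: [{ node: 'E', weight: 3 }],
  C: [{ node: 'D', weight: 2 }, { node: 'F', weight: 4 }],
  D: [{ node: 'E', weight: 3 }, { node: 'F', weight: 1 }],
  E: [],
  F: [{ node: 'E', weight: 1 }],
};

console.log(dijkstra(graph, 'A', 'E')); // → ['A', 'C', 'D', 'F', 'E']
```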

These are my review notes on algorithms from taking the JavaScript Algorithms and Data Structures course by instructor Colt Steele. The main purpose of this review is to help me learn, understand, implement, and review algorithms and data structures. This is not meant to be used for commercial purposes.

Dynamic Programming: a method for solving a complex problem by breaking it down into a collection of simpler subproblems, solving each of those just once, and storing their solutions.

It requires knowledge of recursion; the objective is to solve overlapping subproblems.

So in order to use dynamic programming on a given problem, there have to be subproblems that overlap in some way.

Overlapping subproblems means we can break one problem down into smaller pieces, and some of those pieces are reused. A problem is said to have optimal substructure if the optimal solution for the bigger problem can be constructed from the optimal solutions of its subproblems. Common techniques are memoization and tabulation: storing previous results in a "table", usually an array or object. Better space complexity can be achieved using tabulation.
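Fibonacci is the classic example of overlapping subproblems; a sketch of both techniques (fibMemo and fibTab are my own names):

```javascript
// Memoization: top-down recursion, each subproblem solved once and stored.
function fibMemo(n, memo = {}) {
  if (n <= 2) return 1;
  if (memo[n]) return memo[n]; // overlapping subproblem already solved
  memo[n] = fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
  return memo[n];
}

// Tabulation: bottom-up table, no recursive call stack.
function fibTab(n) {
  if (n <= 2) return 1;
  const table = [0, 1, 1];
  for (let i = 3; i <= n; i++) table[i] = table[i - 1] + table[i - 2];
  return table[n];
}

console.log(fibMemo(30)); // → 832040
console.log(fibTab(30)); // → 832040
```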
