Tuesday 1 April 2014

Week 11: Last but Most Efficiently...

Over the course of the last few weeks we were introduced to the concepts of run-time and efficiency in various sorting algorithms. Although there are many different sorts, we focused mainly on selection sort, insertion sort, quick sort and merge sort, each with its own approach, algorithm and running time. In selection sort, the program repeatedly scans the unsorted part of the list, finds the smallest element and places it at the beginning. Insertion sort iterates through the list and compares each element with the previously sorted elements until it finds the correct location to insert it. Quick sort is a recursive method: the program chooses a pivot, moves the elements less than the pivot to its left and the elements greater than the pivot to its right, and then applies the same procedure recursively to the two sub-lists until the base case of a list of size one is reached, at which point the list is sorted. Merge sort takes an entirely different approach: the program breaks the list into smaller and smaller sub-lists until it reaches the base case of a single-element list, which is trivially sorted, and then merges the sub-lists back together in sorted order until one list remains with the same number of elements as the original.

The topic of running time comes up frequently when analyzing and comparing these sorts, so we were introduced to Big-Oh (O) notation as a measure of algorithm performance. To be useful, this measure has to describe only the algorithm and be independent of hardware, programming language and random events. In Big-Oh analysis we look at how an algorithm's running time grows as the input size n becomes very large. There are many growth rates, such as constant (the same for all list sizes), logarithmic (very fast, since the list is repeatedly divided into smaller parts), linear (time proportional to the number of elements), quadratic, cubic and exponential. Because Big-Oh gives an upper bound, we mainly worry about the worst case and compare the various sorts on that basis, since the chance of receiving a perfectly arranged list or always selecting the "perfect" pivot is slim to none. With this knowledge of sorts and the ability to analyze their running times, we can effectively compare sorting algorithms by their worst cases and see how the size of the list, n, affects the overall running time when n is very large.
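To make the comparison concrete, here is a small sketch in Python of two of these sorts, written from the descriptions above rather than from the course's own code; the function names are my own.

    def selection_sort(lst):
        """Sort lst in place by repeatedly selecting the smallest
        remaining element and swapping it to the front. O(n^2) overall."""
        for i in range(len(lst)):
            smallest = i
            for j in range(i + 1, len(lst)):
                if lst[j] < lst[smallest]:
                    smallest = j
            lst[i], lst[smallest] = lst[smallest], lst[i]

    def merge_sort(lst):
        """Return a new sorted list. A list of size <= 1 is the base case;
        otherwise split in half, sort each half recursively, then merge.
        O(n log n) overall."""
        if len(lst) <= 1:
            return lst
        mid = len(lst) // 2
        left, right = merge_sort(lst[:mid]), merge_sort(lst[mid:])
        # merge the two sorted halves
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        return merged + left[i:] + right[j:]

The nested loops in selection sort are where the quadratic worst case comes from, while merge sort's halving at every level is what gives it the logarithmic factor.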

Sunday 23 March 2014

Week 10: Sort of an Assignment Week

With the week of regular expressions coming to a close, assignment two ignited sparks in our recursive mindsets: we were given sample code and required to complete various methods involving regular expression trees. Most of these functions we were able to solve with the toolkit of recursion and linked tree structures we have been steadily adding to our belts as programmers. During the lectures we were introduced to sorting algorithms, the basic steps each one takes, their run-times, and the perks and flaws of each. The same sorting algorithm may run at very different speeds on different computers with different amounts of memory, so this week we were introduced to a notation that describes the general run-time on any machine, expressing it relative to the number of elements n rather than the number of cores or the amount of RAM your machine has. This notation is called big O notation, and by learning it we can predict how the run-time of a sorting algorithm changes as the number of elements changes.
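The assignment's actual starter classes aren't reproduced here, but as a rough, hypothetical sketch of what a regular expression tree might look like (the class name, symbols and layout are my own assumptions):

    class RegexTree:
        """A node in a regular expression tree.

        symbol: a leaf symbol such as '0', '1' or 'e', or an operator
                such as '.', '|' or '*'.
        children: the sub-trees the operator applies to (empty for leaves).
        """

        def __init__(self, symbol, children=None):
            self.symbol = symbol
            self.children = children if children is not None else []

        def __repr__(self):
            if not self.children:
                return self.symbol
            if self.symbol == '*':
                return '{}*'.format(repr(self.children[0]))
            # binary operators are written infix with parentheses
            return '({}{}{})'.format(repr(self.children[0]),
                                     self.symbol,
                                     repr(self.children[1]))

    # e.g. the regex (1.0)* as a tree
    tree = RegexTree('*', [RegexTree('.', [RegexTree('1'), RegexTree('0')])])
    print(tree)  # (1.0)*

Because every operator node simply holds sub-trees, most of the required methods reduce to calling themselves on the children, which is exactly where the recursive mindset came in.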

Sunday 16 March 2014

Week 9: Just a Bit of Binary

This week, in both the lectures and the lab I attended, we were taught about run times as well as binary search trees. A traversal expresses a tree as a kind of list, and there are several types, such as pre-order and in-order traversals. Each traversal on its own does not tell us very much, but given both a pre-order traversal and an in-order traversal we can accurately reconstruct the binary tree that produced them. I had trouble grasping how the pre-order and in-order traversals fit together to recreate the binary tree, but following the advice on the exercise and practising on trees of various sizes and shapes, I was able to find a base case and ultimately solve the problem of recursively rebuilding the binary tree from its pre-order and in-order traversals. We also briefly covered run time in terms of big O notation: only the fastest-growing term, the one with the greatest impact on run time, is kept, so the measure can be compared across machines that run at different speeds.
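As a minimal sketch of that reconstruction (the node class and function name are my own, not the exercise's code): the first pre-order item is the root, and its position in the in-order list splits the remaining values into the left and right subtrees.

    class BTNode:
        """A simple binary tree node."""
        def __init__(self, value, left=None, right=None):
            self.value, self.left, self.right = value, left, right

    def rebuild(preorder, inorder):
        """Rebuild the binary tree with the given pre-order and in-order
        traversals.  Base case: an empty traversal means an empty tree."""
        if not preorder:
            return None
        root_value = preorder[0]
        split = inorder.index(root_value)          # size of the left subtree
        left = rebuild(preorder[1:split + 1], inorder[:split])
        right = rebuild(preorder[split + 1:], inorder[split + 1:])
        return BTNode(root_value, left, right)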

Sunday 9 March 2014

Week 8: Second Time's the Charm

This week was mostly spent on representing trees using lists and linked lists. A linked list has two components: a value, and the remaining list. Linked lists are useful because, combined with recursion, they greatly reduce the amount of code needed and let us build structures such as trees. This knowledge was helpful during our second assignment, since familiarity with linked lists and trees was essential to completing the assignment about regular expressions in Python. Despite many struggles, as I was unsure exactly what the assignment description was looking for, I arrived at an end product I was proud of and believed fit the criteria. Course resources such as the annotated slides, the Python code examples and the Piazza discussion board were very helpful, since many people besides me had difficulty grasping the assignment's requirements.
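A minimal sketch of the two-part linked list described above (the class and function names are my own, not the course's code):

    class LLNode:
        """A linked list node: a value plus the rest of the list (or None)."""
        def __init__(self, value, rest=None):
            self.value = value
            self.rest = rest

    def length(node):
        """Recursively count the nodes: an empty list (None) has length 0,
        otherwise the length is one more than the length of the rest."""
        if node is None:
            return 0
        return 1 + length(node.rest)

    # build the list 1 -> 2 -> 3 and measure it
    lst = LLNode(1, LLNode(2, LLNode(3)))
    print(length(lst))  # 3

The recursive function mirrors the definition of the structure itself, which is why linked lists and recursion go together so naturally.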

Sunday 2 March 2014

Week 7: Recursion

What exactly is recursion? That's a question I most definitely had going into this course. In high school computer science courses you graze lightly over the topic, and the idea of how it all works never really sinks in. Over the past few weeks my basic understanding of recursion has improved significantly. Recursion is a technique used in many programming applications and can greatly shorten the amount of code that is written, although shorter code does not necessarily mean the recursive method is the most efficient one. Recursion occurs when you break a large problem into sub-problems and use the results of those sub-problems to produce the result of the large problem. The part most people struggle with is finding the base case, the simplest sub-problem whose result we can calculate directly. By starting from the base case and solving progressively larger problems until the whole problem is solved, we can build a solution knowing only the recursive step and the base case. I had a great deal of trouble learning and adapting to recursion, but through the many examples and questions from our lectures and labs, I was able to thoroughly grasp it and apply it to our assignments and the test.
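As a small illustration of that pattern (my own example, not one from the lectures): summing a possibly nested list, where a plain number is the base case.

    def nested_sum(items):
        """Return the sum of all numbers in a possibly nested list.
        Base case: a number is its own sum.  Recursive step: the sum of a
        list is the sum of the sums of its items."""
        if isinstance(items, list):
            return sum(nested_sum(item) for item in items)
        return items  # base case: a plain number

    print(nested_sum([1, [2, 3], [[4], 5]]))  # 15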

Saturday 15 February 2014

Week 6: The First Assignment

This week was one of the more hectic weeks of my school career. With midterms to cram for and assignments to complete, it was without a doubt one of the most stressful times of the year. The first assignment we had to complete in this course required basic programming knowledge as well as knowledge of recursion, both of which we gained from the content covered in lectures, labs and exercises. At first glance the assignment seemed very easy, since all you had to do was write method headers with docstrings and bodies for the classes provided. The difficulty escalated quite quickly, though, as the recursion, and finding the most efficient way to move the cheeses among the various stools using the intermediate ones, was hard. Through collaboration and discussion with my partners, we split the assignment into parts and helped one another with whatever gave us trouble. As a whole, I feel this assignment contributed greatly to my group-work skills as well as my skills in recursion, and strengthened my basic foundation as a programmer.
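The assignment itself isn't reproduced here; as a sketch of the underlying idea, this is the classic three-stool version of the problem, where a stack is moved via an intermediate stool (the function and stool names are my own).

    def move_via_intermediate(n, source, intermediate, destination):
        """Print the moves needed to move n cheeses from source to
        destination, using intermediate as the spare stool.
        Base case: a single cheese moves directly."""
        if n == 1:
            print('move cheese from {} to {}'.format(source, destination))
            return
        # move the n-1 smaller cheeses out of the way, move the largest,
        # then move the n-1 cheeses back on top of it
        move_via_intermediate(n - 1, source, destination, intermediate)
        print('move cheese from {} to {}'.format(source, destination))
        move_via_intermediate(n - 1, intermediate, source, destination)

    move_via_intermediate(3, 'stool 1', 'stool 2', 'stool 3')

The hard part of the assignment was going beyond this: with an extra stool available, choosing how many cheeses to set aside at each step determines how efficient the solution is.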

Sunday 9 February 2014

Week 5: The Method in a Method

Recursion is a concept of programming that I only skimmed over during my time as a high school student, and this week gave me many insights and techniques as well as the fundamentals of using recursion in programs. I have always had difficulty understanding and tracing recursion, as I had no idea where to start or how programs actually handle such method calls. Through these lectures and the lab this week, I can say that my knowledge and understanding of recursion has increased exponentially: my TA gave me great insight into the significance of base cases, as well as the methods of tracing recursion taught in the lectures and reviewed in the lab. I feel that the knowledge I have gained over these few classes and labs still needs to be polished and refined, and that these skills will be very beneficial for my first assignment involving recursion.
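A tiny example of the kind of trace that helped me (my own example, not one from the lab): work outward from the base case and fill in each call once its smaller call is known.

    def factorial(n):
        """n! computed recursively; the base case is n == 0."""
        if n == 0:
            return 1
        return n * factorial(n - 1)

    # Tracing factorial(3), starting from the innermost (base) case:
    #   factorial(0) -> 1
    #   factorial(1) -> 1 * factorial(0) = 1
    #   factorial(2) -> 2 * factorial(1) = 2
    #   factorial(3) -> 3 * factorial(2) = 6
    print(factorial(3))  # 6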