Tuesday 1 April 2014

Sorting and Efficiency

Sorting algorithms are another broad subject. Really, a sorting algorithm is just a particular method of arranging a number of items into a particular order. For the sake of the course, we typically sort lists of integers into ascending order. 148 focuses on a few sorting algorithms in particular: insertion sort, selection sort, quicksort, merge sort, and bubble sort.

-Insertion sort works by building an already-sorted sublist at the front of the list, looping through the unsorted values and adding each one to the correct place in the sorted section. The worst case complexity of insertion sort is n^2, and the best case is n.
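A rough sketch of insertion sort in Python (my own version, not necessarily the exact code from lecture):

```python
def insertion_sort(lst):
    # Sort lst in place. Everything before index i is the sorted section.
    for i in range(1, len(lst)):
        value = lst[i]  # the next unsorted value
        j = i
        # shift sorted values right until value's correct spot is open
        while j > 0 and lst[j - 1] > value:
            lst[j] = lst[j - 1]
            j -= 1
        lst[j] = value
```

The best case of n happens when the list is already sorted, since the inner while loop never runs.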

-Merge sort works by splitting the entire list of integers into individual sublists, then merging adjacent pairs of them into sorted sublists of two items. It then repeats that process on the larger sublists, merging until there is only one list left. The best and worst case performance is n log n.
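Here's a sketch of merge sort. This is the recursive top-down version, which splits the list in half rather than starting from individual elements, but the merging step is the same idea:

```python
def merge(left, right):
    # Merge two sorted lists into one sorted list.
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # one of the two slices below is empty; the other is the leftovers
    return result + left[i:] + right[j:]

def merge_sort(lst):
    # Return a sorted copy of lst.
    if len(lst) <= 1:
        return lst[:]
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))
```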

-Quicksort works by first picking a pivot anywhere in the list. It then moves elements larger than the value of the pivot to the right of the pivot, and elements smaller than it to the left. Then it recursively sorts each side of the pivot the same way, picking a new pivot on each side, until there are no more values left to sort on either side. The worst case performance is n^2, and the best case is n log n.
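A sketch of quicksort; the pivot here is simply the first element (an assumption on my part — other pivot choices work the same way):

```python
def quicksort(lst):
    # Return a sorted copy of lst.
    if len(lst) <= 1:
        return lst[:]
    pivot = lst[0]
    # partition everything after the pivot into the two sides
    smaller = [x for x in lst[1:] if x < pivot]
    larger = [x for x in lst[1:] if x >= pivot]
    # recursively sort each side, then stitch the pieces back together
    return quicksort(smaller) + [pivot] + quicksort(larger)
```

The worst case of n^2 shows up when the pivot keeps landing at an extreme — for example, running this first-element version on an already sorted list.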

-Selection sort works by finding the minimum value of the list and moving it to the front of the list. That section is then designated sorted, and the process repeats: it finds the minimum value of everything past the sorted section and moves it to the end of the sorted section, until it reaches the end of the list. Selection sort always runs at n^2 complexity.
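A sketch of selection sort (again my own version, not necessarily the 148 one):

```python
def selection_sort(lst):
    # Sort lst in place. Everything before index i is the sorted section.
    for i in range(len(lst)):
        # find the index of the minimum of the unsorted section
        min_index = i
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[min_index]:
                min_index = j
        # swap it to the end of the sorted section
        lst[i], lst[min_index] = lst[min_index], lst[i]
```

It's always n^2 because the inner scan for the minimum happens no matter what — even a sorted list gets fully scanned every pass.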

-Bubble sort works by 'bubbling up' larger values in the list. On each pass, it compares each value to the one to the right of it and swaps them if they are out of order, so a large value keeps getting swapped rightward until an even larger value is found, which then gets bubbled up instead. Each pass pushes the largest unsorted value to the end of the list, and the passes repeat from the front. If a pass reaches the end of the list without swapping anything, the list is sorted. Bubble sort has a best case performance of n, and a worst case performance of n^2.
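A sketch of bubble sort with the early exit that gives the best case of n:

```python
def bubble_sort(lst):
    # Sort lst in place by bubbling the largest unsorted value
    # to the end on each pass.
    for end in range(len(lst) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if lst[i] > lst[i + 1]:
                lst[i], lst[i + 1] = lst[i + 1], lst[i]
                swapped = True
        if not swapped:
            # a full pass with no swaps means the list is already
            # sorted -- this early exit is the best case of n
            return
```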

That's a quick overview of all the algorithms we've made from scratch in 148. However, this ignores an algorithm that we've used, but not programmed. It's worth mentioning Python's built-in sorting algorithm, which is Timsort. Unfortunately, I don't really know how Timsort works.

Monday 3 March 2014

Week 7: Recursion

Since I wasn't expecting to have to write an assigned entry on recursion, the last post was a bit early. This post is the actual post regarding recursion, with formal descriptions of the concept and such.

In programming, recursive coding is simply the act of writing a function that solves a problem by solving smaller cases of the same problem from within. In practice, this usually takes the form of a function's return statement calling the function itself again. For example, a function that is supposed to count the number of elements in nested lists would begin by counting all the elements of the base list, and then, for every one of those elements that is also a list, the function would call itself on the sub-list. A function like this can account for any depth of nesting, since the same check for sub-lists occurs each time until no sub-lists remain.
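That idea looks something like this hypothetical function (the name and details are my own, not from lecture):

```python
def count_items(x):
    # Count all non-list items in an arbitrarily nested list.
    if not isinstance(x, list):
        return 1  # base case: a single item counts as one
    # recursive case: the function calls itself on every element,
    # so sub-lists of any depth get handled the same way
    return sum(count_items(item) for item in x)
```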

Recursion is an ideal method of solving a problem in computer science when the problem at hand is made up of similar, smaller problems. In that case it is extremely useful: in terms of lines of code it is typically far simpler than writing a series of loops, since recursion will account for any number of nested sub-problems. However, since recursion requires the sub-problems to be similar to the base problem, it doesn't necessarily apply to all programming tasks. Recursion is the ideal solution to anything that is nested; for example: nested lists, nested objects, trees, etc.

Thursday 13 February 2014

Recursion Woes

You know, I'm OK with recursion. It's a rather straightforward concept. The idea that the solution to a big problem is breaking it into a number of smaller problems solved in the same way is usually much more elegant than manually smashing your face against the wall.



Consider the formula given in Assignment 1 for calculating the minimum number of moves required to solve a 4-stool Tour of Anne Hoy problem. The recursive part of that formula calls itself, reducing the n in the equation each time. When n reaches 1, the equation stops recursing and just kicks a 1 out. So, in the end, what's really happening is that at the bottom, a 1 is added to a bunch of other numbers, and that number is then added to a bunch of numbers, and so on.

The issue with this solution for the minimum number of moves is that the given formula requires some i value between 1 and n-1, but it doesn't specify which one works best. The solution is to run a loop that recursively tests all i values between 1 and n-1. The problem with that is the runtime quickly becomes extremely high: n values past 20 can take several seconds. Not the perfect solution, but since, according to Danny, we shouldn't expect values above 20, it gets the job done accurately.
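To make that concrete, here's roughly what the looped version looks like. This is a sketch of the standard Frame-Stewart-style recurrence, so the notation in the actual handout may differ:

```python
def min_moves(n):
    # Minimum moves for n cheeses on 4 stools.
    # Base case: one cheese takes exactly one move.
    if n == 1:
        return 1
    # Try every split i between 1 and n - 1 and keep the cheapest:
    # move n - i cheeses aside using all 4 stools (2 * min_moves(n - i)),
    # plus move the remaining i with only 3 stools (2**i - 1 moves).
    return min(2 * min_moves(n - i) + 2 ** i - 1 for i in range(1, n))
```

Without memoization this recomputes the same subproblems over and over, which is exactly why it bogs down past n = 20.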

The solution for a 4 stool TOAH model is very similar to that, too. Basically, it recusively calls itself and a method for solving a 3 stool method, until it finds a single cheese to move at a time. Hard to visualize, but it's interesting to see that something I programmed seems to understand how to solve the problem better than I do.
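The shape of that solver looks roughly like this sketch. The split i here is a fixed n // 2 just for illustration — the real assignment picks the i that minimizes total moves, and my function names are made up:

```python
def solve_3(n, src, dst, spare, moves):
    # Classic 3-stool Tower of Hanoi: move n cheeses from src to dst.
    if n == 0:
        return
    solve_3(n - 1, src, spare, dst, moves)
    moves.append((src, dst))
    solve_3(n - 1, spare, dst, src, moves)

def solve_4(n, src, dst, s1, s2, moves):
    # 4-stool solver: park some cheeses on a spare using all 4 stools,
    # move the rest with only 3 stools, then unpark the pile.
    if n == 0:
        return
    if n == 1:
        moves.append((src, dst))
        return
    i = n // 2  # hypothetical fixed split, not the optimal one
    solve_4(n - i, src, s1, dst, s2, moves)  # park the top n - i cheeses
    solve_3(i, src, dst, s2, moves)          # bottom i can't touch s1,
                                             # so only 3 stools are usable
    solve_4(n - i, s1, dst, src, s2, moves)  # move the parked pile to dst
```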



And that understanding is the biggest problem with recursion: it's straightforward as a concept, but specific applications of it take work with the problem to understand. I wasn't really able to get either of the recursive operations in this assignment until after I had implemented both of them. Not for lack of effort, mind you; it is just hard to wrap my head around.

Wednesday 22 January 2014

Week 3: Object Oriented Programming

Well, 3 weeks into the course, we have begun object-oriented programming in Python. The concepts really aren't new to me, since I have a pretty strong background in Java. Most of the practice I do is just to get the hang of the differences between the languages.

To me, the most glaring difference between Python and Java is that private variables are protected by convention rather than by the language itself, which, to me, is pretty annoying. Obviously Python is designed to be simple, but I feel like that's a major oversight on a design level. Additionally, I'm curious as to how static variables work in Python, if at all. I'm assuming that they do, but with some of the funkier things about this language, I'm prepared to be surprised. After that, the other major thing I'm expecting is interfaces, but, according to friends, that isn't covered until second year anyway.

Overall, object-oriented programming in Python seems pretty straightforward, but, like other things in Python, its simplicity ends up impeding it from being very adaptable in certain circumstances. For that reason in particular, I'm looking forward to moving on to Java next year.