Monday, March 31, 2014

Week 11/12: Sorting Efficiency

It started in CSC108 with Bubble Sort. A rather simple algorithm that compares adjacent items at index i and j = i + 1. If list[i] > list[j], the two are swapped. Then list[i + 1] and list[j + 1] are compared, and so on down the list, pass after pass. Very similarly, we have Selection Sort, which instead makes only one exchange per pass: it finds the largest number in the unsorted part of the list and swaps it into its final position. Both are O(n^2).
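Just so future me remembers them, here is roughly what those two look like in Python (my own quick sketch, not the CSC108 code):

def bubble_sort(L: list) -> None:
    """Sort L in place by repeatedly swapping adjacent out-of-order items."""
    for end in range(len(L) - 1, 0, -1):
        for i in range(end):
            if L[i] > L[i + 1]:
                L[i], L[i + 1] = L[i + 1], L[i]

def selection_sort(L: list) -> None:
    """Sort L in place with one exchange per pass: swap the largest
    unsorted item into its final position."""
    for end in range(len(L) - 1, 0, -1):
        largest = 0
        for i in range(1, end + 1):
            if L[i] > L[largest]:
                largest = i
        L[largest], L[end] = L[end], L[largest]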

Insertion Sort is also O(n^2), but it keeps the sorted part at the beginning of the list. It assumes that the first number is sorted (which it is, because one number is always sorted). The next number is compared against the sorted part starting from the right; anything larger shifts one spot to the right, and the number is inserted once it is no longer less than the number it is being compared against.
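Again, a quick sketch of how I picture it (not the textbook code):

def insertion_sort(L: list) -> None:
    """Sort L in place, growing the sorted part at the front of the list."""
    for i in range(1, len(L)):
        value = L[i]
        j = i
        # Shift larger sorted numbers one spot right until value fits.
        while j > 0 and L[j - 1] > value:
            L[j] = L[j - 1]
            j -= 1
        L[j] = value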

Then we get to CSC148 and meet harder-to-read algorithms like Merge Sort and Quick Sort, which both deal with lists recursively. Merge Sort takes a list and divides it in half, then divides those halves in half, until it has lists of length 1. They are then merged back together: whenever two lists are being merged, their items are compared at index 0 and the smaller one moves to the new list. So, if one list is used up before the other, the rest of the other can just be appended to the end of the new one, because it is already sorted and everything left in it is at least as big as everything merged so far. Quick Sort picks a pivot (usually at random, to get an average runtime of O(n log n) with a worst case of O(n^2)). A leftmark and a rightmark are moved toward each other, swapping out-of-place items along the way, and once they cross, a split point is found. The list is then recursively quick sorted on the left half and the right half of that split point.
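Here is a rough sketch of both. My quick_sort below builds new lists instead of moving a leftmark and rightmark in place the way the class version does, but the idea is the same:

import random

def merge_sort(L: list) -> list:
    """Return a new sorted list by halving L and merging the sorted halves."""
    if len(L) <= 1:
        return L
    mid = len(L) // 2
    left, right = merge_sort(L[:mid]), merge_sort(L[mid:])
    merged = []
    # Compare the two halves at index 0 and take the smaller item each time.
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    # One half ran out first; the leftover half is already sorted, so tack it on.
    return merged + left + right

def quick_sort(L: list) -> list:
    """Return a new sorted list by partitioning around a random pivot."""
    if len(L) <= 1:
        return L
    pivot = L[random.randrange(len(L))]
    return (quick_sort([x for x in L if x < pivot])
            + [x for x in L if x == pivot]
            + quick_sort([x for x in L if x > pivot]))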

Overall, large lists become too much to handle for the first three sorting methods, while Quick Sort keeps up despite occasionally hitting its O(n^2) worst case.

Sunday, March 23, 2014

Week 10

Well, A2 is officially over. I actually turned it in 2 hours early.
It definitely helped me understand recursion more than A1 did. Actually, A1 didn't really help me at all, since I never got the recursive function to work properly.

In the function is_regex, you have to recursively check a string to see if it passes as a regex or not. If it has a bar or a dot, you check that is_regex of the left side and is_regex of the right side both return True. Since it's recursive, the function does all of that for you, given that you have the right base cases.
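Roughly, the shape of mine ended up looking something like the sketch below. This is simplified, and it assumes the leaves are just '0', '1' and '2' and that every bar/dot regex is wrapped in parentheses -- that's my reading of the posts, not a quote from the handout:

def is_regex(s: str) -> bool:
    """Return True iff s is a valid regex under the symbols described above."""
    # Base case: a single leaf character is a regex on its own.
    if len(s) == 1:
        return s in '012'
    # A starred regex: everything before the '*' must itself be a regex.
    if s.endswith('*'):
        return is_regex(s[:-1])
    # A bar or dot regex: '(' + left + operator + right + ')'.
    if s.startswith('(') and s.endswith(')'):
        inner = s[1:-1]
        depth = 0
        for i, c in enumerate(inner):
            if c == '(':
                depth += 1
            elif c == ')':
                depth -= 1
            elif c in '.|' and depth == 0:
                # Split only at the top-level operator, then recurse on both sides.
                return is_regex(inner[:i]) and is_regex(inner[i + 1:])
    return False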
I personally found regex_match to be the hardest. I had to read every single Piazza post pertaining to it and talk it out with a friend before I even vaguely understood what I had to do. Then it clicked, and the feeling I got when all the functions came together and worked is probably the reason why I'm still even in Computer Science.

Monday, March 17, 2014

I just wanted to say, A2 is a great assignment. No sarcasm intended.

Saturday, March 15, 2014

Week 9

A2 part 2 is. Really. Hard. I've been staring at it for about 2 days. I drew out a few examples: from string representation to tree representation, and the other way around. It's easy that way, but I can't wrap my head around the recursive aspect of finding subtrees inside the parentheses of more complex regular expressions. I first thought, "I could separate the regex at the indexes of certain symbols", but then I realized there could be more than one of the same type of symbol in a regex. Then I thought, "maybe I should take out all the parentheses", and then I realized that was an even dumber idea.

I've been trying to roll with the idea of having a helper function that checks simple regexes, like (0.1) and 2*, and then having the overall function is_regex() check the larger regexes by using the helper recursively. The problem is that I don't know how to split a complex regex apart without completely screwing up the order of the original regex.

The forums are usually pretty helpful to me, but they haven't been quite as useful this time around. I would have included more solid code/work in my SLOG this week if I actually had some code, but I don't have any. All I have is this rant about my struggles in CSC148.

Saturday, March 8, 2014

Week 8

Since A2 part 1 was extended, I had a few days to think about the code I had pretty much finished and was about to turn in, and I realized I could still improve it. One thing I noticed was that I could have my subclasses inherit __repr__ if I just initialized a name attribute in each subclass and had the base class's __repr__ use that name. Since part 1 is mostly about inheritance, I'm thankful for the extension. (I also completely forgot about docstrings, so there's that...)
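Something like this is what I mean -- the class names below are made up for illustration, not the actual A2 hierarchy:

class Tree:
    """A made-up base class to show the idea."""
    def __init__(self, name: str, children: list) -> None:
        self.name = name          # each subclass passes in its own name
        self.children = children

    def __repr__(self) -> str:
        # Inherited unchanged by every subclass; only self.name differs.
        if not self.children:
            return self.name
        return '{}({})'.format(self.name,
                               ', '.join(repr(c) for c in self.children))

class Leaf(Tree):
    def __init__(self, symbol: str) -> None:
        Tree.__init__(self, symbol, [])

class StarTree(Tree):
    def __init__(self, child: Tree) -> None:
        Tree.__init__(self, 'StarTree', [child])

So repr(StarTree(Leaf('1'))) gives 'StarTree(1)' without StarTree defining __repr__ at all.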
There was a question/answer posted in the A2 FAQ about using __repr__ to write __eq__. It said it depended on the quality of __repr__, but I'm not sure why that would be so, since __repr__ should be giving the names of the classes in addition to their children. Am I missing something?
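To pin down the idea being discussed, and continuing the made-up Tree sketch above, I assume the FAQ means something like this, which is only as trustworthy as __repr__ itself:

    def __eq__(self, other) -> bool:
        # Two trees compare equal exactly when their string forms match,
        # so this breaks if __repr__ ever maps two different trees
        # to the same string.
        return isinstance(other, Tree) and repr(self) == repr(other)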

I was a little confused about making attributes read-only after initialization. I understand its purpose and the general idea of it, but the actual implementation seemed foreign. Unless I really spaced out in class, I don't remember ever seeing property(get_x, set_x). I had to copy the code Professor Heap posted for the class and play around with it until it worked with my code. Either way, I'm feeling pretty confident about A2.
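For my own reference, here is a minimal version of the pattern I ended up with -- the class and attribute names are just examples, not the actual A2 code:

class Node:
    """A made-up class with a read-only 'symbol' attribute."""
    def __init__(self, symbol: str) -> None:
        self._symbol = symbol     # set exactly once, here

    def _get_symbol(self) -> str:
        return self._symbol

    def _set_symbol(self, value) -> None:
        raise Exception('symbol is read-only after initialization')

    symbol = property(_get_symbol, _set_symbol)

So Node('0').symbol works fine, but trying to assign to it afterwards raises an exception.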

Sunday, March 2, 2014

Week 7 - Recursion

We've had a whole assignment and a term test on recursion, and this week's SLOG topic is recursion, so here goes nothing:
Recursion is, put very simply, when a function calls itself within its own body. In more detail, a recursive function has a base case and one or more recursive cases (the else/elif branches). A base case is what the function returns when the input is already in the form it needs to be in, i.e. when it's already what should be returned. For example, if I wanted to find the sum of all the numbers in a (possibly nested) list, I would need to make sure I was only adding ints or floats, because I can't add lists to a running total.
Consider:
def sum_list(L: list):
    """Return the sum of all the numbers in L, including numbers inside nested lists."""
    return sum([sum_list(x) if isinstance(x, list) else x for x in L])
If I did sum_list([1, [2, [3, 4], 5]]), it would add 1, but [2, [3, 4], 5] is a list, so it would then do sum_list([2, [3, 4], 5]). It would then add 2, but [3, 4] is a list, so it would do sum_list([3, 4]) and add 3 and 4. 5 is not a list, so it would add 5 instead of calling itself again. The grand total: 15.

Apparently I don't understand recursion at all because I didn't do very well on the midterm. I'm not too worried, though, because I can't honestly say I studied enough. I put way more effort into the MAT136 exam I had on the same day, worth 3 times as much as the CSC148 test. Hopefully with the help of more assignments and labs, I'll finally wrap my head around more complicated recursive functions.


Saturday, February 15, 2014

Week 6 -- February 15th

After a hectic week of scrambling to finish A1 and studying for a CSC165 midterm, Danny left us pondering the tree data structure. The writer at http://gr148slog.blogspot.ca/ described it really well -- in a way that I hadn't thought about, but that I think will help me in the long run. He describes roots and children in terms of the way we drew trees when we were young: we started with a base tree and drew two branches leading out of each leaf part.

I must confess, I did not finish A1. I completed all the steps, but I just could not figure out the 4-stool move algorithm. I knew that finding "i" would be a lot harder than just integer-dividing by 2, but I did not even understand the recursive part of it. In the end, I had a function built on the 3-stool cheese function Danny wrote in class, and it moved the cheeses in fewer moves than the plain Hanoi method but more moves than the best-known TOAH method. I realize now that assignments require students to just sit and THINK, and I would not say I spent enough time thinking about this one. Still, it was pretty interesting how everything came together, and I think it showed me how useful object-oriented programming can be on a grand scale.
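Looking back, I think the general four-stool idea (at least for counting moves) goes something like the sketch below. This is my own guess at it, written around a helper for the three-stool count, not the official solution:

from functools import lru_cache

def moves_three(n: int) -> int:
    """Number of moves to shift n cheeses with three stools (classic Hanoi)."""
    return 2 ** n - 1

@lru_cache(maxsize=None)
def moves_four(n: int) -> int:
    """Fewest moves for n cheeses with four stools: for some i, move i cheeses
    aside using all four stools, move the remaining n - i with three stools,
    then move the i cheeses back on top using four stools again."""
    if n <= 1:
        return n
    # Try every split point i instead of guessing something like n // 2.
    return min(2 * moves_four(i) + moves_three(n - i) for i in range(1, n))

The real function has to actually perform the moves rather than just count them, but the split-point idea is the part I never got working.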

All in all, I think I did pretty well for having worked alone. Note to self: look for a partner sooner.

Who actually puts cheese on stools? I couldn't even find an image of one.