SICP_1.14-1.15
SICP_1.16
SICP_1.17
SICP_1.18
SICP_1.19
SICP_1.20
SICP_1.21-1.22
SICP_1.23-1.28
SICP_1.29 (+ work notes)
SICP_1.30
SICP_1.31
CS106A:
Assignment 1: Karel:
CollectNewspaperKarel.java
StoneMasonKarel.java
CheckerboardKarel.java
MidpointFindingKarel.java
Art & Sci of Java exercises:
ASJava_2.01-2.02
ASJava_2.04
I got into a bit of a working groove and kept putting off a new blog post, so now I've got quite a bit to put down here. Before we dive in, a note: I need to start jotting things down as I work. I often realize something - the feel of creating an algorithm, or how the work seems to be getting easier, or any one of a number of things - but those are the kinds of thoughts that flash for a few minutes before getting absorbed into the subconscious; small steps on the path of development, but very hard to identify explicitly until long after. Noted. So anyway...
What I've done in the last 7 days (I'm writing this on the morning of the 18th, so I haven't started working today yet):
SICP, Structure & Interpretation of Computer Programs:
Exercise 1.14 asks you to sketch the tree generated by the recursive count-change procedure. This was the first time I really got an appreciation for recursion. I also had to state its time and space complexity. Here's my sketch below:
1.15 quickly shattered the feeling of comfort I got from 1.14, and really drove home the difference between normal-order and applicative-order evaluation. I'm still not 100% on it: I turned what is a linear-recursive process into a tree-recursive one because I expanded it wrong on paper.
1.16 involved writing an iterative exponentiation procedure that works through successive squaring, using the identity (b^(n/2))^2 = (b^2)^(n/2).
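For the record, here's a minimal sketch of the shape that solution takes (iter is my own helper name; the book hints at keeping the quantity a * b^n invariant from step to step):

    (define (fast-expt b n)
      ;; invariant: a * b^n stays equal to the original b^n
      (define (iter a b n)
        (cond ((= n 0) a)
              ((even? n) (iter a (* b b) (/ n 2)))
              (else (iter (* a b) b (- n 1)))))
      (iter 1 b n))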
1.17 asked me to define a multiplication procedure, analogous to the <<fast-expt>> procedure in the book, that uses a logarithmic number of steps. Think of logarithmic growth as the inverse of exponential (but in the same direction): whereas with exponential growth additional inputs are magnified, with logarithmic growth additional inputs give diminishing returns. This is why, in orders of growth, log(n) < n < n^k. Figuring out how to write the procedure also helped me unpack some important concepts in the book. I'm not 100% on it; I still had to use BilltheLizard's blog for help (I now use that blog as a kind of minimum-speed governor: when I'm stuck after a set time of struggling, or I need to check my answers, I go there).
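To illustrate, a sketch of the doubling/halving approach (fast-mult is my own name; the book has you assume double and halve as given primitives):

    (define (double x) (+ x x))
    (define (halve x) (/ x 2))

    (define (fast-mult a b)
      ;; a*b = double(a*(b/2)) when b is even, else a + a*(b-1)
      (cond ((= b 0) 0)
            ((even? b) (double (fast-mult a (halve b))))
            (else (+ a (fast-mult a (- b 1))))))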
1.18 I liked a lot. In this exercise you have to take the Russian peasant method (русский крестьянский способ) of multiplication and turn it into code. The method is iterative, and to me says something about human development: it's an algorithm for multiplying numbers that requires very little thought, like the finger-counting games your uncle showed you when you were little. Why and how did this come about? Well, this algorithm is just a version of one invented (for all we know) in ancient Egypt. So what's up? We humans needed a way to perform multiplication... and how does any sentient animal with no prior knowledge of mathematics do accounting? For all intents and purposes, we at that stage were very similar to computers: we only knew how to do very simple operations. So the algorithm is a way to accomplish a more complex task using very rudimentary tools. Tricks - but useful tricks - and without understanding the nature underlying what you're doing. I think that makes sense of much of human history, and perhaps provides context for understanding the growth of calculating ability in a more universal frame. Anyway. If it works for ancient Egyptians and Russian peasants, it'll work for my computer.
I managed to solve this one on my own, only using Wikipedia to see the actual algorithm. That's practice I need: take a concept and an algorithm, and convert them into code. It's a great feeling when it works. The algorithm works like so: take the two numbers being multiplied, X and Y. Halve X (ignoring remainders) and double Y to form a new pair, and repeat until X = 1. The product is then the sum of the Y values in all pairs where X is odd. Example (a Scheme sketch follows it):
13 * 238:

    13    238   <- odd, keep
     6    476   <- even, skip
     3    952   <- odd, keep
     1   1904   <- odd, keep
                238 + 952 + 1904 = 3094
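And a sketch of how that translates into an iterative procedure (peasant-mult, iter, and acc are my own names; acc accumulates the kept Y values, so acc + x*y stays equal to the original product):

    (define (peasant-mult x y)
      ;; invariant: acc + x*y = original product
      (define (iter x y acc)
        (cond ((= x 0) acc)
              ((even? x) (iter (quotient x 2) (* 2 y) acc))
              (else (iter (quotient x 2) (* 2 y) (+ acc y)))))
      (iter x y 0))

    (peasant-mult 13 238) ; => 3094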
Exercise 1.19 was harder, and I had to look online to walk myself through the solution. Again, I'm always wary when I do this, but I'm counting on having the judgement to decide when to let myself struggle and when the point of diminishing returns and inefficient use of time has been reached. This one was a case of using a mathematical proof and some math-jiujitsu to turn a recursive algorithm into an iterative one with a logarithmic number of steps. It involved showing that the Fibonacci sequence can be generated by repeatedly applying a transformation T_pq, which is a special case of a more general family of transformations; your job is to show that applying T_pq twice is equivalent to a single transformation of the same form with new p and q, then turn that into a procedure. Notes in the GitHub link above.
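For reference, a sketch of the resulting procedure. The key result is that applying T_pq twice equals one transformation T_p'q' with p' = p^2 + q^2 and q' = 2pq + q^2; starting from p = 0, q = 1 gives the plain Fibonacci step (iter is my own helper name):

    (define (fib n)
      (define (iter a b p q count)
        (cond ((= count 0) b)
              ((even? count)
               (iter a b
                     (+ (* p p) (* q q))     ; p' = p^2 + q^2
                     (+ (* 2 p q) (* q q))   ; q' = 2pq + q^2
                     (/ count 2)))
              (else (iter (+ (* b q) (* a q) (* a p))
                          (+ (* b p) (* a q))
                          p q (- count 1)))))
      (iter 1 0 0 1 n))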
Exercise 1.20 stumped me one night; then I worked through it in 2 hours in the main library of Rutgers U. It asks you to write out the entire process taken by the <<gcd>> procedure when evaluated in normal and applicative order. Applicative was straightforward; normal, not so much. I took a brute-force approach to get through this one: I brought up a solution online and started walking myself through it line by line until I understood what was going on. Then I wrote out the entire evaluation process, checking my work at the end against the solution. This taught me the usefulness of indenting in Scheme, and some of the logic of normal-order evaluation.
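For reference, the procedure in question is the book's:

    (define (gcd a b)
      (if (= b 0)
          a
          (gcd b (remainder a b))))

Tracing (gcd 206 40), applicative order performs remainder 4 times; normal order, because un-evaluated remainder calls keep piling up inside the (= b 0) tests, performs it 18 times (the commonly cited count).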
1.21 and 1.22 were made more challenging because a necessary procedure, <<runtime>>, was not available on the interpreter I use. I tried setting up the SISC interpreter, which would allow the <<current-inexact-milliseconds>> workaround, but even after setting up Java so Firefox wouldn't block it, no luck. I just did my best writing the procedures (timed tests for determining and finding prime numbers) and checked against results online.
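For what it's worth, here's the shape the timed test can take on an interpreter that does have a clock; this is a sketch assuming Racket, where current-inexact-milliseconds is available (the primality helpers follow the book's smallest-divisor approach):

    ;; the book's trial-division primality test
    (define (smallest-divisor n) (find-divisor n 2))
    (define (find-divisor n test)
      (cond ((> (* test test) n) n)
            ((= (remainder n test) 0) test)
            (else (find-divisor n (+ test 1)))))
    (define (prime? n) (= n (smallest-divisor n)))

    ;; a simplified take on the book's timed-prime-test,
    ;; swapping (runtime) for Racket's clock
    (define (timed-prime-test n)
      (let ((start (current-inexact-milliseconds)))
        (when (prime? n)
          (display n)
          (display " *** ")
          (display (- (current-inexact-milliseconds) start))
          (newline))))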
Skipping ahead a bit, the rest was just work. I learned how you can write a series-summation procedure and convert it into a product procedure, and define both either iteratively or recursively. This was further reinforced in Exercise 1.31, where I had to take the Wallis formula for approximating pi [see Wolfram MathWorld, equation (3)]. It was cool to see how surprisingly easy it was to take this big complicated formula and just break it down into code, especially the series terms and how they progress.
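To give the flavor, a sketch of the recursive product procedure with the pi approximation built on top (wallis-pi and inc are my own names; the term follows the form pi/4 = (2*4)/(3*3) * (4*6)/(5*5) * (6*8)/(7*7) * ...):

    (define (product term a next b)
      (if (> a b)
          1
          (* (term a) (product term (next a) next b))))

    (define (wallis-pi n)
      ;; pi/4 = product over k of (2k)(2k+2) / (2k+1)^2
      (define (term k)
        (/ (* (* 2.0 k) (+ (* 2 k) 2))
           (expt (+ (* 2 k) 1) 2)))
      (define (inc k) (+ k 1))
      (* 4 (product term 1 inc n)))

    (wallis-pi 1000) ; => roughly 3.1408 (it converges slowly)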
The ease of 1.31 was probably made possible by the 5 hours I spent on 1.29, where I took Simpson's Rule for approximating an integral and turned it into code. My mistake (though a useful one) was spending 4 hours at the problem without looking at the help in the book. I was trying to reinvent the wheel by finding my own way to capture the terms of the series and their progression in sub-procedures, and on the syntax side I managed to confuse myself over how to define a variable in Scheme. Long story short, I learned a bit more about how to use summation for a series.
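A sketch of where those hours landed, built on the book's sum procedure (simpson, coef, y, and inc are my own names; h = (b - a)/n as in the book's statement of the rule):

    (define (sum term a next b)
      (if (> a b)
          0
          (+ (term a) (sum term (next a) next b))))

    (define (simpson f a b n)
      ;; Simpson's Rule: (h/3)[y0 + 4y1 + 2y2 + ... + 4y(n-1) + yn]
      (define h (/ (- b a) n))
      (define (y k) (f (+ a (* k h))))
      (define (coef k)
        (cond ((or (= k 0) (= k n)) 1)
              ((even? k) 2)
              (else 4)))
      (define (inc k) (+ k 1))
      (* (/ h 3)
         (sum (lambda (k) (* (coef k) (y k))) 0 inc n)))

    (simpson (lambda (x) (* x x x)) 0 1.0 100) ; => ~0.25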
CS106A: Programming Methodology
After the pain of SICP, Stanford's intro CS course is like a gentle breeze. I may be further along than I think, because I'm genuinely surprised by just how easy this class is. I'm also building better study/work habits, because a course is a totally different (and much more docile) animal when you do the readings and exercises before the lecture. This is my first time touching Java, and norms from SICP are spilling over. Having finished the Karel reader, I'm into the Art and Science of Java book and doing every exercise. I'm glad I took the time to figure out how to start a new Java project (wow, is this language convoluted... too much overhead).

I did the first assignment in the course: programming the little robot Karel to do a few tasks. Here I took what was theoretical in SICP and really put it into practice, namely breaking problems down into smaller parts. There were 4 problems in the assignment. The first two were easy: pick up a marker on a tile, and write an algorithm to repair columns. The third, an algorithm to create a checkerboard, took me 10 hours. The biggest problem I had with that one was getting disorganized: there were too many branches in the algorithm's logic, my mind couldn't hold them all at once, and it turned into a mess. Finally made it all work though.

Because of the work on that problem, the 4th, which was even more challenging, only took me 1 hour. It was simple: design an algorithm, map it out into its sub-procedures, write it. Fucker worked on the first run. That problem was designing and implementing an algorithm to find the midpoint of any line, without being able to count. I liked it a lot; it was using logic to solve a math problem. Cases like these open your eyes to the possibilities of computation and what crazy stuff may lie ahead in the future. I like it.
I made a short video demonstrating a few of the programs, and a snapshot of the code as well. Also included is a snowman from a book exercise.
I'm noticing a kind of feedback-loop in my work. What I learn in SICP - whether it be technical, habitual, or conceptual - comes back and reinforces my work in CS106A. Then what I've put together in CS106A, a much easier and forgiving environment, comes back and strengthens my ability to understand and solve problems in SICP. And this process becomes more powerful and accelerates with each completion of the loop. It's interesting and we'll see where it goes.
Still, the loop may seem great, but more than that it's necessary. Maybe another way of looking at things is to be grateful that it's actually working. Computer Science & Programming is a sideshow for me. After I get through this, it's back to Physics and then on to Engineering. By that time I should be either in school or at a company (my own or someone else's), so I'll have to make time.
As a side note, it was interesting to hear Elon Musk, at a Stanford Futurefest talk, note that the logic behind CS is that it isn't capital intensive. While it's nice to hear something I've thought for about a year spoken back to me from the ether, it also makes me wonder if I'm being too slow.
Anyway, that's the update for Day 18; now back to work.
There is no such thing as fast enough.