Ah haaa, and now we enter the densest, most bare metal forest of CS50, I AM SO PLEASED.
But now! Now it's been a wonderful ride through challenging and informative basic CS topics. This is what I came for! (I also came for data structures and HTTP, which are the next few weeks, woohoo.)
Week 2: Algorithms
This week's topic was algorithms, the varying measures of their speed, and common ones used for sorting integers:
On speeds: given that processors have different speeds, there's idiosyncratic noise to any timing measurement, and blah blah, we don't measure algorithms in seconds - instead there is something called big-O notation. Big-O describes how the number of operations the computer needs to perform to complete an (algorithmic) task grows with the size of that task.
"We use big-Θ notation to asymptotically bound the growth of a running time to within constant factors above and below. Sometimes we want to bound from only above. [...] It would be convenient to have a form of asymptotic notation that means "the running time grows at most this much, but it could grow more slowly." We use "big-O" notation for just such occasions."
"Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity."
Normally written as \(O(n)\) where \(n\) is the size of the task: e.g. the length of the list you want to sort. Big-O is also the worst case scenario: that is, how long it would take to sort a list of integers if you never got any lucky breaks (as defined by the way the algorithm sorts). It's our upper bound.
Good big-Os are things like logarithmic algorithms: stuff that, no matter how big of a list you throw at it, won't ever take longer than the log of that list's length.
Behold now eight different algorithms, eight different speeds:
There's also a notation for the best case scenario: big-omega, \(\Omega(n)\). If you need to sort a list and it's already, by chance, sorted - you need only one pass through it to be sure. One pass still means looking at all \(n\) elements, ergo in that case \(\Omega(n)\).
And finally, big-theta - \(\Theta(n)\) - used when the upper and lower bounds coincide, pinning the running time down to within constant factors.
Beyond measuring algorithms' efficiency, what about what these algorithms are actually doing? Pictures/videos are worth infinity words here, and I really enjoyed these two resources -
- Visualizing algorithms - To watch each sorting algorithm in action.
- Sorting Algorithms Animations - To watch several in action, as they race against each other.
Problem set: Music
- Expectations: This PS didn't ask students to implement any of the algorithms covered in the lecture (which I was expecting). It also didn't explicitly ask for any recursion - another lecture topic - and I didn't (and still don't) see much opportunity to try solving the puzzles with a recursive function.
- Content: This was a lot of fun: converting text to music, via magical music math and how it relates to the wavelengths of each pitch. This spoke to my idiosyncratic love of waves.
- Difficulty: Quite a bit trickier than earlier weeks. I ended up arriving at the Cognitive Wall - that place where you're just banging your head, making no progress, and some time away from the problem works miracles. Taking a break helped. Also, I found r/cs50. :)
Week 4: Memory
Memory! The stack and the heap! We actually drew it upside down - with the stack at the bottom and the heap at the top. I really enjoyed this lecture, since it started demystifying a lot of stuff - specifically, the connection between programming variables, values, scope, and functions, and how they're all connected to (i.e. stored in) memory. This was very empowering. We learned about pointers (which are just addresses/locations in memory). Basically, in C, you can:
- Initiate some variable, e.g. an integer, and call it `x`. When you do this, some space - 4 bytes, specifically - in memory is allocated to an integer called `x`.
- Set its value to `x = 5`. The memory location now holds the value `5`, in binary, written with those 4 bytes.
- Grab its address, `&x`. This returns a memory address, written in hexadecimal (i.e. base-16 numbering).
- Dereference it through a pointer, `*p` (where `int *p = &x;`): that is, access and - if desired - change the value at its memory location. Of course, we could just change `x`'s value by setting `x = 1` or whatever, but if we want to swap variables, or use a second variable in general to refer to `x`'s memory and make changes there, then that's something pointers are useful for. Here's the wiki example.
Also exciting to learn about was hexadecimal, dynamic memory allocation (grabbing a piece of memory from the memory heap using `malloc()`) and the attendant memory leaks, and the call stack (which totally demystified recursion). Wonderful!
Problem set: Forensics
I've only just started this PS, which looks daunting - but manageable. The first part - Whodunit - had an enormous amount of info on bitmap file storage, which I ended up finding super interesting, and then it kinda took the training wheels off and was like, RTFM - i.e. reading and understanding documentation! Not for the faint of heart! But, like `bash` and regular expressions and reading code, an important meta-skill.
Alrightie, these two weeks were some of the biggest value add so far. Really excited to keep going! Huzzah!
A side note: Every time anyone said `malloc()`, I would hear Allen Ginsberg's wonderful exclamations from Howl: "Moloch!" Damn Moloch! From the poem:
What sphinx of cement and aluminum bashed open their skulls and ate up their brains and imagination?
Moloch! Solitude! Filth! Ugliness! Ashcans and unobtainable dollars! Children screaming under the stairways! Boys sobbing in armies! Old men weeping in the parks!
Moloch! Moloch! Nightmare of Moloch! Moloch the loveless! Mental Moloch! Moloch the heavy judger of men!
Moloch the incomprehensible prison! Moloch the crossbone soulless jailhouse and Congress of sorrows! Moloch whose buildings are judgment! Moloch the vast stone of war! Moloch the stunned governments!
Moloch whose mind is pure machinery! Moloch whose blood is running money! Moloch whose fingers are ten armies! Moloch whose breast is a cannibal dynamo! Moloch whose ear is a smoking tomb!