My Journey


From Student to Coder

CS 100.9: Graphs

As promised, here is the follow-up to our discussion of trees: graphs. Of course, we are all familiar with what a graph is. For those who have taken discrete mathematics in some form (not me!), you probably even have a sense of the formal definition of a graph and its different kinds. For those who have taken graph theory, skip this blog post; it will bore you (just kidding, don’t do that). Nonetheless, it is important to home in on exactly what a graph is as a CS data structure, because graphs illustrate many important CS principles and can be used to solve a wide range of problems.
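Before the post proper, a quick concrete preview: one common way to represent a graph in code is an adjacency list, which maps each node to the nodes it connects to. Here is a minimal sketch in Python (the language and the tiny example graph are my own choices for illustration, not anything prescribed by the post), along with a breadth-first traversal to show the structure being put to work.

from collections import deque

# A minimal sketch of a graph stored as an adjacency list:
# a dictionary mapping each node to the list of nodes it points to.
# The nodes and edges here are made up purely for illustration.
graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": [],
}

def bfs(graph, start):
    """Visit every node reachable from start, one layer at a time."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

print(bfs(graph, "A"))  # ['A', 'B', 'C', 'D']

Traversals like this one are exactly the kind of problem graphs show up in constantly, which is part of why they are worth pinning down carefully.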


CS 100.8: Trees

Today marks the beginning of a two-part series on graphs, starting with a special type of graph, and perhaps the most common one in interview-style problems: the tree. A tree is a hierarchical data structure, meaning that data is arranged in parent-child relationships. You are, doubtless, familiar with various kinds of trees, as they are a common way to visualize relationships in the real world. A family tree, a corporation with all its subcomponents and employees, or, for a more to-the-point example, the file tree on your computer, all represent data as a tree, in which a root node points to its child nodes, which point to their own child nodes, and so on and so forth. For all my Windows users (Woot Woot! Go Windows!), you have probably at some point had to manipulate your hosts file, which lives at the path C:\WINDOWS\system32\drivers\etc\hosts. To help visualize, here is the same path written in tree form.

C:
|
\Windows
  |
  \system32
    |
    \drivers
      |
      \etc
        |
        \hosts
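To make the parent-child idea concrete in code, here is a minimal sketch in Python (my own illustration; the post itself does not prescribe a language) of a node type in which each node stores a value and a list of children, used to rebuild and print the folder chain above.

# A minimal sketch of a tree node: each node stores a value and its children.
class TreeNode:
    def __init__(self, value):
        self.value = value
        self.children = []

    def add_child(self, value):
        child = TreeNode(value)
        self.children.append(child)
        return child

# Build the chain of folders leading to the hosts file.
root = TreeNode("C:")
node = root
for name in ["Windows", "system32", "drivers", "etc", "hosts"]:
    node = node.add_child(name)

# Print the tree with indentation that mirrors the diagram above.
def print_tree(node, depth=0):
    print("  " * depth + node.value)
    for child in node.children:
        print_tree(child, depth + 1)

print_tree(root)

Here every node happens to have exactly one child, so the tree is just a chain; a real file tree would branch wherever a folder contains more than one item.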

CS 100.7: Dynamic Programming and Memoization

Today, we will briefly consider a few distinct but closely related concepts in algorithm design: dynamic programming, memoization, and greedy algorithms. Another closely related concept is caching, which I will not go into detail about, but encourage you to study up on. The following material assumes you have some familiarity with recursion, which I have not covered here, but assume you have seen at least a few times. The idea behind recursion is to break a problem into smaller subproblems of the same form and solve each of them; this style of problem-solving is often described as “divide-and-conquer” or “top-down”, and merge sort and quicksort, which I discussed previously, are both recursive algorithms. Right then, let’s consider an example to illustrate why dynamic programming is useful.
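The post works through its own example; as a preview of the general idea, here is a small Python sketch (my own illustration, using the classic Fibonacci numbers rather than whatever example the post actually uses) contrasting plain recursion with a memoized version that remembers the answers to subproblems it has already solved.

# Plain recursion: fib_naive(n) recomputes the same subproblems over and over,
# so its running time grows exponentially with n.
def fib_naive(n):
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

# Memoization: cache each subproblem's answer the first time it is solved,
# so every distinct fib(k) is computed only once.
def fib_memo(n, cache=None):
    if cache is None:
        cache = {}
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]

print(fib_memo(80))   # 23416728348467685, computed almost instantly
print(fib_naive(30))  # 832040; the naive version slows down dramatically as n grows

The memoized version does one piece of work per distinct subproblem, which is exactly the saving that dynamic programming is after.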


CS 100.6: Basic Binary

Today, I will introduce the concept of the binary numeral system on which computers rely. Please note that this is a VERY basic introduction to binary numbers, their purpose and uses, and the operations that can be performed on and with them. Until very recently, I had almost no experience working with binary, and my knowledge is still a work in progress. Bit manipulation topics such as bit vectors are complex, and as a junior engineer, you will need only a surface-level understanding of bits and the like. That said, I encourage you to study the topics I introduce today further, since understanding how computers work at the lowest level can only help you as a programmer. Okay, that aside, let’s get cracking.
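As a small taste of what working with binary looks like in practice, here is a short Python sketch (my own illustration; the specific numbers are arbitrary) showing a decimal value written out in binary and a few basic bitwise operations.

# The same value written in decimal and in binary.
n = 13                # decimal 13
print(bin(n))         # '0b1101' -> 8 + 4 + 0 + 1

# Basic bitwise operations on two small numbers.
a, b = 0b1100, 0b1010
print(bin(a & b))     # AND -> 0b1000
print(bin(a | b))     # OR  -> 0b1110
print(bin(a ^ b))     # XOR -> 0b110 (leading zero bits are not shown)
print(bin(a << 1))    # shift left by one -> 0b11000, i.e. multiply by 2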


CS 100.5: Sorting Part 2

Before we begin, I did in fact butcher the quote from yesterday. Mr Dimopolous actually said, “Jumping from failure to failure with undiminished enthusiasm is the big secret to success.” See, way more poetic. Now, onto topics of substance.