Another week of reading, lectures, and confused Googling and skimming through videos is in the bag. This week was all about NP-completeness. It has been fun to experience an “aha!” moment in each course I’ve taken so far, and this was an especially fun one, in large part because answering these sorts of questions is seemingly beyond the reach of computers (in any reasonable amount of time).

We’re accustomed to computers being incredibly fast. So accustomed that we forget just how fast they are. They’re really fast. I’m writing this on a four-year-old computer with a 2.5 GHz Intel Core i7 processor. That number is how many cycles the system clock completes in a second: 2,500,000,000 of them. The CPU running on this machine is quite powerful; it should execute something like 9 instructions per cycle on a single core, and there are 4 cores running. Multiply that out and you’re at roughly 90 billion instructions per second.

This all adds up to a lot of numbers, and those numbers represent commands that we expect the computer to execute for us so that we can… I dunno… watch videos of cute cats.

This is all to say: when you hear that a computer can’t find an answer to a problem, it’s a big deal. These are the sorts of problems that a reasonably programmed (or so we think) computer, with enough processors to require fans that sound like jet engines, just can’t solve if the domain is large enough.

The full list of these problems is here, and I suppose there are two ways to think about them. One is very abstract; the other is through a concrete example. I think it’s important to understand the core essence of these problems, so I’ll begin with the abstraction.

Imagine that you have a problem whose candidate solutions are all of the possible groupings of all the variables in the problem. Say each of the problem’s n variables can take 2 possible values; that’s 2^n possible groupings to examine. Generally, verifying each candidate takes a bit of work as well; call that m. The total time to solve the problem is then proportional to 2^n * m. If n is 5 and m is 10, then 2^5 * 10 = 320. Let’s say that represents seconds, so we’re at 5 minutes, 20 seconds. If n is 10 and m is 100, then we’re at 102,400 seconds, or about 1,707 minutes (a bit over a day). Push n to 30 with the same m and we’re past 3,000 years.
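To make that arithmetic concrete, here’s a toy JavaScript sketch (the names and numbers are my own illustration, not from the coursework) that brute-forces subset sum, a classic NP-complete problem: it walks all 2^n subsets of a list, doing a little verification work on each one.

```javascript
// Brute-force subset sum: is there a subset of `nums` adding up to `target`?
// There are 2^n candidate subsets (one per bitmask), and verifying each
// candidate costs ~n work (the "m" in the estimate above).
function subsetSumBruteForce(nums, target) {
  const n = nums.length;
  for (let mask = 0; mask < (1 << n); mask++) { // 2^n candidates
    let sum = 0;
    for (let i = 0; i < n; i++) {               // ~m work per candidate
      if (mask & (1 << i)) sum += nums[i];
    }
    if (sum === target) return true;
  }
  return false;
}

console.log(subsetSumBruteForce([3, 34, 4, 12, 5, 2], 9)); // true (4 + 5)
```

Add one element to the list and the outer loop doubles in size; that doubling is the whole story.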

So small changes in the numbers cause big shifts in our ability to solve a problem in a time that makes sense, essentially because the problem’s answer could be this or this or this or this. If we had a computer that could simply evaluate all of the thises in parallel, with as many processors as there are ors (i.e., a single processor dedicated to exploring each possible solution), then we could do this in polynomial time. But you can see just how quickly that becomes a problem. Too many processors equals fires (or really loud fans).

The other way to think of this is through a concrete problem. One of the easier ones (in my opinion) to understand is the traveling salesperson problem:

The problem is essentially a question of permutations. Given a list of cities and the distances between each pair of them, what is the shortest possible route that visits each city and returns to the origin city? Again, to get to the answer we have to look at one path or another path or another path. There is no known way to guarantee the shortest path without, in the worst case, computing the lengths of essentially all the paths. You can imagine that with more cities there are more paths, and with a large enough set the problem is so expensive that we cannot reasonably answer it.
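Here’s what the naive approach looks like in JavaScript (my own sketch, with a made-up distance matrix): fix one city as the origin, try every permutation of the rest, and keep the cheapest round trip.

```javascript
// Brute-force TSP: try every ordering of the cities and keep the
// cheapest tour. With n cities there are (n - 1)! orderings to check.
function shortestTour(dist) {
  const n = dist.length;
  const cities = [];
  for (let i = 1; i < n; i++) cities.push(i); // city 0 is the fixed origin
  let best = Infinity;

  function permute(rest, current, cost) {
    if (rest.length === 0) {
      best = Math.min(best, cost + dist[current][0]); // close the loop
      return;
    }
    for (let i = 0; i < rest.length; i++) {
      const next = rest[i];
      permute(rest.filter((_, j) => j !== i), next, cost + dist[current][next]);
    }
  }

  permute(cities, 0, 0);
  return best;
}

// dist[i][j] is the distance from city i to city j.
const dist = [
  [0, 10, 15, 20],
  [10, 0, 35, 25],
  [15, 35, 0, 30],
  [20, 25, 30, 0],
];
console.log(shortestTour(dist)); // 80 (0 -> 1 -> 3 -> 2 -> 0)
```

Four cities means 3! = 6 tours; 20 cities means 19! tours, about 1.2 × 10^17, and we’re back to fires and loud fans.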

In short, these are the problems that baffle us with or without computers. That they all share some key characteristics says something about the nature of these problems, and about problems in general. In fact, one of the key notions of NP-complete problems is that if you can solve one of them in polynomial time (i.e., efficiently, non-exponentially), then you can solve all of them in polynomial time. But perhaps some problems are just too complex for any person or machine to solve.

Having just finished my midterm in Analysis of Algorithms (yes, the class is as dry as it sounds), my brain is still sharp on a few topics, one of them being dynamic programming, which I mentioned in my last post. In that post, wherein I tried to find motivation for forcing myself to relearn calculus, I used the classic example of calculating the nth term of the Fibonacci sequence.

I thought it would be helpful to see this example running with some real code. Below, we have a JavaScript function – fibRecursive – that takes an integer as a parameter. This integer represents the term that we want from the Fibonacci sequence. For example, a call like fibRecursive(6) would return 8.
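The full listing lives in the post itself, but a minimal version of the naive recursive approach looks something like this:

```javascript
// Naive recursive Fibonacci: returns the nth term of the sequence.
// Every call spawns two more, so the running time grows exponentially;
// this is exactly the inefficiency that dynamic programming fixes.
function fibRecursive(n) {
  if (n === 0) return 0;
  if (n <= 2) return 1; // fib(1) = fib(2) = 1
  return fibRecursive(n - 1) + fibRecursive(n - 2);
}

console.log(fibRecursive(6)); // 8
```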

Continue Reading A Short Story About Dynamic Programming

This is a question that is plaguing me at the moment as I force myself to relearn calculus for Analysis of Algorithms at Oregon State University. In moments like this, where the concepts are abstract and I need to learn even more abstract concepts so I can understand the first class of abstract ideas, motivation is key.

Why does this matter? I’ve been working in the industry for over five years and never needed this knowledge before, so why now?

Well, the short answer is that you don’t need this body of knowledge to develop a wide range of applications and features. In my world, many of the concerns that common sorting, searching, and general optimization algorithms address are not real concerns, because they’ve been abstracted into parts of the language or framework. I’m able to do my job because someone else has figured out how to do the parts of it that would otherwise need to be created from scratch. So while learning merge sort and analyzing its complexity is a fun exercise, I’ll not be writing it from scratch anytime soon.
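To pick a trivial example of my own: in JavaScript, the entire business of sorting is already solved by the language, comparator and all.

```javascript
// No merge sort in sight: the runtime ships an efficient sort,
// and day-to-day work is a one-line call with a comparator.
const prices = [19.99, 4.5, 102.0, 0.99];
prices.sort((a, b) => a - b); // ascending numeric order
console.log(prices); // [ 0.99, 4.5, 19.99, 102 ]
```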

Continue Reading Why Study Algorithms?

This was one of the most eventful summers in my life, both personally and professionally. In July, Garry (LexBlog’s COO) and I had a chance to go to Chicago and spend some time talking about LexBlog’s future product line and general opportunities for integrating with our platform. It’s not often that I get an opportunity to do face-to-face meetings of this sort, and it was nice to get back in the saddle. It was also my first time visiting Chicago, and Garry seemed more than happy to drag me around.

The Chicago Riverwalk where I forced Garry to walk – he was incredibly pleased to be outdoors, walking for hours.
Cloud Gate – aka “The Bean” – which actually looks pretty cool close up.

Continue Reading A LexBlogger’s Summer in Review

I’m about halfway through Oregon State University’s (Go Beavers!) post-bacc program for computer science, but I feel like I’ve just entered the belly of the beast. On the docket for the summer is CS 271 – Computer Architecture and Assembly Language – a fine, relaxing course to take in the months before and during my wedding, right? Not so much.

The material is dense as we learn to move memory around on a computer and perform basic operations on the contents of said memory. The class is focused on IA-32, the 32-bit version of the x86 instruction set architecture found in early IBM workstations and personal computers, and later in embedded systems for phones, aerospace tech, and electronic musical instruments. I’m only a few weeks in, but it’s already painfully obvious to me that assembly is not like any other language I’ve used.

Continue Reading Oh The Places You’ll Go! …… with Assembly

Engineers make hardware and software for humans. It should go without saying, but remembering and staying true to that axiom is complicated depending on where you’re standing. With each passing year, it seems that things get more complicated, more random, more uncertain. This year was no different, especially in the realm of technology.

Facebook and Twitter are defending their platforms amidst allegations that they were used to interfere in America’s 2016 presidential election. Net neutrality seems to be going by the wayside with nary a peep from the so-called “Big N,” many of whom participated in protests in 2014 when the issue first came to the public’s attention. Uber dug itself into a hole as scandal after scandal rocked the company: first a female engineer lifted the veil on a misogynistic and Darwinian culture, then came revelations that the company had written software to evade local law enforcement in areas where Uber was prohibited from operating. Meanwhile, the threat of automation and the looming specter of artificial intelligence have every working professional worried about the future of employment in this new economy.

The list could go on and on, and it didn’t start last year either. As long as corporate greed and bad company culture are not only allowed but praised, problems of this ilk will continue. And the problem, as I see it, is most troubling in the context of computers.

Continue Reading Make Technology for Humans

This title speaks to my life for the past four months. For years, I’ve known that JavaScript is the language of the present and future on the web and for years, I’ve avoided learning it. It’s easy to chalk this up to a myriad of reasons, but ultimately, the two largest factors were intimidation and motivation.

Intimidation because my entire programming experience is on the server side, using languages built around classical object-oriented programming practices. JavaScript is the antithesis of both of those paradigms: a language that is interpreted and compiled in a completely different fashion, relying almost entirely on the client to run the code, while also seeming to laugh in the face of classical OOP, passing functions around like it was going out of style.
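A toy illustration of what I mean by passing functions around (my own example, nothing profound):

```javascript
// In JavaScript, functions are ordinary values: they can be assigned,
// passed as arguments, and returned from other functions.
const greet = name => `Hello, ${name}`;

function twice(fn, arg) {
  return [fn(arg), fn(arg)]; // the function is handed around like data
}

console.log(twice(greet, "LexBlog")); // [ 'Hello, LexBlog', 'Hello, LexBlog' ]
```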

Ultimately, I had to admit that I didn’t know JS.

Continue Reading JavaScript JavaScript JavaScript JavaScript

One of my more interesting decisions in life was to major in History (yup, with a capital “H”). Today, the only time that degree gets used is when I flip to one of the many books about the birth of the computer stored away on my Kindle.

Recently I’ve been reading The Idea Factory: Bell Labs and the Great Age of American Innovation – if you’re interested in the birth of the communications age, this is the book for you. Bell Labs is a research facility that, at the peak of its influence, helped determine the outcome of World War II, gave us the transistor, and launched the first communications satellite. The way we live today is owed in part to the people who shuffled through the various research labs owned and operated by AT&T during the company’s heyday. Today, it is but a shadow of itself, run by Nokia (who, given the resiliency of their older products, are undoubtedly looking for ways to make a phone that can survive the crushing pressure of a black hole) and operating mostly in obscurity.

Continue Reading Ma Bell and Fostering Innovation