Engineers make hardware and software for humans. It should go without saying, but remembering and staying true to that axiom is complicated depending on where you’re standing. With each passing year, it seems that things get more complicated, more random, more uncertain. This year was no different, especially in the realm of technology.

Facebook and Twitter are defending their platforms amid allegations that they were used to interfere in America’s 2016 presidential election. Net neutrality seems to be going by the wayside with nary a peep from the so-called “Big N,” many of whom participated in protests in 2014 when the issue first came to the public’s attention. Uber dug itself into a hole as scandal after scandal rocked the company: first a female engineer lifted the veil on a misogynistic and Darwinian culture, then came revelations that the company had written software to evade local law enforcement in areas where Uber was prohibited from operating. Meanwhile, the threat of automation and the looming specter of artificial intelligence have working professionals worried about the future of employment in this new economy.

The list could go on and on, and these problems didn’t begin last year. As long as corporate greed and bad company culture are not only allowed but praised, problems of this ilk will continue. And the problem, as I see it, is most troubling in the context of computers.

Computers are designed to manage repetitive tasks and data for humans. If the design of their software is such that the task they are doing and the numbers they are crunching are used for some unethical purpose, then the acceleration of the problem is, well, frightening. One human sifting through data to enact some evil plan is not nearly as scary as one computer doing the same task.

So what to do?

It’s all too easy as an engineer to say, “Not my problem.” The issues most engineers face are so complex and difficult to wrangle that adding a layer of ethics on top of that is almost too much to handle. Unfortunately, engineers are also at the point where ideas become actions in this new world.

As Jeff Atwood puts it over on Coding Horror:

Programmers don’t think of themselves as people with the power to change the world. Most programmers I know, including myself, grew up as nerds, geeks, social outcasts […] What do you do when you wake up one day and software has kind of eaten the world, and it is no longer clear if software is in fact an unambiguously good thing, like we thought, like everyone told us … like we wanted it to be?

So what do we do in this new world, where a keystroke, a function, a feature could lead to Donald Trump being the President of the United States?

Personally, I’ve found that it’s all about surrounding yourself with the right people and the right mission. If the people around you have good motives, and the thing you’re all working toward is making the world better for the people who use your product or services, then you greatly increase the odds that the next decision you make won’t be subverted and used in an unintended fashion. It’s a huge reason I work at LexBlog, and it should be a factor in every job searcher’s mind.

We have a finite amount of time in this world, and even less time to work toward making it a better place (I have to eat and sleep, after all). Endeavor to do some good with that time.

This title speaks to my life for the past four months. For years, I’ve known that JavaScript is the language of the present and future on the web and for years, I’ve avoided learning it. It’s easy to chalk this up to a myriad of reasons, but ultimately, the two largest factors were intimidation and motivation.

Intimidation, because my entire programming experience has been on the server side, using languages built around classical object-oriented programming. JavaScript is the antithesis of both of those paradigms: a language delivered as source code and relying almost entirely on the client to interpret and run it, while also seeming to laugh in the face of OOP, passing functions around like it’s going out of style.

Ultimately, I had to admit that I didn’t know JS.
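To make concrete what felt so foreign coming from classical OOP, here is a minimal sketch of JavaScript treating functions as ordinary values (the names `double`, `applyTwice`, and `makeAdder` are my own illustration, not from any particular codebase):

```javascript
// A function stored in a variable, like any other value.
const double = (n) => n * 2;

// A function that accepts another function as an argument.
function applyTwice(fn, value) {
  return fn(fn(value));
}

// A function that returns a new function (a closure over `amount`).
function makeAdder(amount) {
  return (n) => n + amount;
}

const addFive = makeAdder(5);

console.log(applyTwice(double, 3)); // 12
console.log(addFive(10));           // 15
```

No classes, no interfaces: behavior is composed by handing functions to other functions, which is exactly the style that takes getting used to when you arrive from a classical object-oriented background.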

Continue Reading JavaScript JavaScript JavaScript JavaScript

One of my more interesting decisions in life was to major in History (yup, with a capital “H”). Today, the only time that degree gets used is when I flip to one of the many books about the birth of the computer stored away on my Kindle.

Recently I’ve been reading The Idea Factory: Bell Labs and the Great Age of American Innovation – if you’re interested in the birth of the communications age, this is the book for you. Bell Labs is a research facility that, at the peak of its influence, helped determine the outcome of World War II, gave us the transistor, and launched the first communications satellite. The way we live today is owed in part to the people who shuffled through the various research labs owned and operated by AT&T during the company’s heyday. Today, it is but a shadow of its former self, run by Nokia (who, given the resiliency of their older products, are undoubtedly looking for ways to make a phone that can survive the crushing pressure of a black hole) and operating mostly in obscurity.

Continue Reading Ma Bell and Fostering Innovation