On Monday, I looked at the concept of infinity and noted some of the very real problems we have with it. But if infinity is an irrational concept (a point Aristotle made thousands of years ago), why do we still employ it so frequently?
I think the problem of infinity can be expressed
Have you ever watched a Roomba work its way around a room? It works in a random, chaotic manner - first it goes a bit this way, then a bit that way. It bounces off walls, chews up cords, and generally makes a nuisance of itself. Sometimes, it sits for a few
I've alluded previously to a particular brand of laziness that separates a good engineer from the average person. It sounds disparaging at first, but there is an aspect to engineering laziness that is absent from almost every other kind of laziness:
Engineering laziness encourages progress
How can laziness encourage progress? When we think of laziness, we
We all know (or at least we should know) that we can use #define to create macros that replace names with values during the preprocessor phase of compilation. Sometimes, it's very important for us to have the ability to change large amounts of code very quickly, because we have a new size limit for
Aristotle talked at great length about the essences of things and the accidens of things. The accidens are aspects of appearance that can change (or which could simply be different), while the essences are aspects that cannot be changed without fundamentally changing the thing. For example, the accidens of an acorn is a brown
Best thing ever: discovering an exploit that no one else has found (so far as you know).
Worst thing ever: being exploited.
Runner-up: learning about a beautiful exploit AFTER the company has implemented measures to stop it.
I was recently exposed to an excellent blog post about an exploit of the Kindle Unlimited system. For those
In the world of C/C++, we use headers extensively. The basic rule is that C/CPP files contain code that becomes binaries, and H files contain the interfaces that let other C files reference that code. Any program more complex than a calculator will likely contain multiple C files that are combined into
malloc(): memory corruption
When you look at an error message like that, what could possibly lead you to believe that, a hundred lines up, you didn't properly initialize a size variable? After all, all we know is that this malloc() operation could not complete because the memory it should be able to touch is corrupted.
The hardest interview I ever had: someone told me to go up to a whiteboard and solve a programming problem, in code, optimally, on the first try.
It's a basic fact of our field that we iterate toward the final product. Not unlike a sculptor bringing the David out of a piece of marble, we
It's amazing how often programmers forget the simplest rule of programming: 1=1.
This is the principle of logical unity (or modularity - they're largely synonymous).
Unity of Function
If we adhere to the principle of top-down modular design, all our functions should be relatively small. At the bottom of the tree, every function serves one purpose and