The Real Zack Morris

The State of the Art is Terrible

I stumbled onto this post the other day through Hacker News. Basically Ryan Dahl (the guy who wrote node.js, a JavaScript runtime that makes it easy to write web servers) called out the deplorable state of software today.

I fully agree with him.

And here’s why: I’ve seen things. In college I learned how to design a computer from the ground up at the mask level. We learned the equations that attempt to describe the properties of semiconductors and applied them to hundreds of problems, everything from amplifiers and logic gates to full adders and CPUs. One of my favorite projects was wiring together a primitive robot from logic gates that followed a line on the ground.

I wrote programs and games in assembly language on both the x86 and 68000 processors in the 90s. I even went the level below that and wrote opcodes in binary to interface with hardware. I learned hardware description languages like VHDL for controlling FPGAs. I’ve also dabbled in functional languages like Scheme and can tell you the ins and outs of probably a dozen different languages and methodologies from macro languages to query languages. I can tell you why D or javascript are better than c++ for instance (and no, it’s not just about prettier syntax or garbage collection).

I helped write a commercial game in the Mac App Store that briefly got to #6 in top paid. I’ve installed linux and a full LAMP server remotely. I’ve compiled and installed dozens of open source libraries from Ogg Vorbis to Freetype. I wrote what was effectively a TCP stack above UDP for our now defunct online environment. I underestimated the difficulty of network programming (so far the most difficult subject I’ve ever encountered, probably somewhere between 3D and AI in complexity). I know why atomicity, REST and ACID are fundamental and if you think you’ve found a way around them, you are on borrowed time.
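The "TCP stack above UDP" idea deserves a sketch, because it captures why network programming is so hard: UDP drops, duplicates, and reorders packets, so reliability means sequence numbers, acknowledgements, and retransmission on top. Here is a minimal, hypothetical in-order delivery buffer (all names are mine for illustration, not the original code, which also needed acks, retransmit timers, and flow control):

```javascript
// Minimal sketch of in-order delivery above an unreliable channel:
// tag each packet with a sequence number, hold back out-of-order
// arrivals, and release consecutive runs in order.
class ReorderBuffer {
  constructor() {
    this.expected = 0;        // next sequence number to deliver
    this.pending = new Map(); // out-of-order packets held back
    this.delivered = [];      // payloads released in order
  }
  receive(seq, payload) {
    this.pending.set(seq, payload);
    // drain any run of consecutive packets starting at `expected`
    while (this.pending.has(this.expected)) {
      this.delivered.push(this.pending.get(this.expected));
      this.pending.delete(this.expected);
      this.expected += 1;
    }
  }
}

const buf = new ReorderBuffer();
buf.receive(1, "world"); // arrives early, held back
buf.receive(0, "hello"); // fills the gap; both now deliver in order
console.log(buf.delivered.join(" ")); // → "hello world"
```

Even this toy version hints at the real difficulty: every piece of state here (the expected counter, the pending map) is something that can desynchronize between two machines that can only talk over the very channel that is failing them.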

I’ve worked around dozens of compiler bugs and have been knee deep in DLL hell. I’ve dealt with the needless myopia of tools that don’t understand spaces in names or case insensitivity. I’ve overcome poor planning on the part of software architects.

And you know what? After everything I’ve seen, after traveling to the deepest recesses of the Matrix, sometimes I just kind of look off into the distance wistfully and think about what might have been. It really, truly, is all crap. And it’s so much worse than anybody realizes.


When I read about the elegant computer science of the 50s and 60s, when great minds like Alan Turing and Peter Landin laid down the foundations of computation, I have to admit that I’m jealous. It’s so quaint to write lisp on a whiteboard. It’s adorable when programming language tutorials show how to print “Hello, world!” Does anybody really use that stuff anymore? No, of course not.

Software engineers today learn the quickest way to look up a code snippet on Stack Overflow and then load a PNG in iOS with a cryptic language like objective-c and then scratch their heads and wonder why the alpha is baked into the color channels. Then they find themselves knee deep in a conversation on chat/irc/a forum somewhere debating the relative merits of file size vs. ease of use. After reaching a stalemate with coworkers, they end up doing everything twice to satisfy both camps and carry around the baggage of added complexity for all time. The code grows and grows as it’s made to work cross-platform and eventually breaks one day when someone unfamiliar with the reason for the complexity decides to axe it in favor of the One True Way, which breaks some other app he or she isn’t working on.

This is a story about one line in one file. But there are a hundred files, each containing a thousand lines, in a million apps. It doesn’t matter if you learn how to do it better next time. There are infinitely many problems, so if you think it ever gets any easier or that you will be spending any less of your time overcoming these types of obstacles, you are fooling yourself. It gets worse. And worse. And even worse.

Software engineering is a misnomer. It’s more like software fabrication. As in fraud.

I think the last time I felt like I was dealing even remotely with any kind of computer science was when I used Matlab at HP back in 2005. Matlab’s primary focus is leverage. Let the code be a multiplier that amplifies a programmer’s abilities.

But then I went back to c++. Its primary focus is verbosity. Force the programmer to be explicit, no matter how small a problem is. In any given 12 hour day of programming, I’d say less than a single hour goes to writing new code now. I get up, in a haze I remember the list of bugs from the day before, I fix one and trigger three others, then I go on google and research a problem with our SVN server for four hours because it doesn’t want to check in the library file I just compiled. Rinse, repeat. Every day is some new horror, some deeper dimension of absurdity that slowly drives me into madness.

Most computers today, for all of their potential speed, are largely a mistake, based on the demonstrably unscalable Von Neumann architecture, controlled with one of the most shortsighted languages of all time, x86 assembly. They are almost unfathomably inefficient. Their processors have close to a billion transistors, most of which sit idle while a tiny fraction of a fraction of them perform some operation. Three quarters of a processor may be devoted to the quagmire of cache memory and its demands. All of this brute force horsepower gets stacked in an ever higher Tower of Babel in the relentless race to perform more sequential calculations per second. If people only knew what engineering was required to implement branch prediction and 20 stage deep pipelines… It’s like seeing behind the walls of a meat packing plant. You just don’t want to know.

If you knew that your computer performed two or three hundred empty cycles waiting for some piece of data to be fetched from main memory on a cache miss, or that when you see the little spinny thing, you are actually waiting for your hard drive to track down dozens of fragments of a file scattered across the disk because it got too full that one time, or that your web browser locked up on you because some novice programmer wrote some portion of it in blocking network code that is waiting for the last byte to arrive from the web server, and that the web server is sending that byte over and over again because a router is temporarily overloaded and is dropping packets like crazy so your neighbor can download a youtube clip of a cat hitting a ball into its owner’s crotch, you might throw up in your mouth a little bit. Sure, your computer can perform 10 billion floating point operations per second. But most of the time it’s not doing anything at all. Just like you.
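The arithmetic behind that idleness is worth a back-of-envelope pass. Using illustrative round numbers (a 3 GHz core, ~100 ns to main memory, a 100 ms network round trip), the cycles thrown away look like this:

```javascript
// Back-of-envelope: how many cycles does a stall throw away?
// (All figures are illustrative assumptions, not measurements.)
const clockHz = 3e9; // a 3 GHz core

const cacheMissNs = 100; // rough main-memory fetch latency
const missCycles = clockHz * cacheMissNs / 1e9; // ns → s
console.log(missCycles); // → 300 idle cycles per cache miss

const networkWaitMs = 100; // one sluggish round trip to a server
const waitCycles = clockHz * networkWaitMs / 1e3; // ms → s
console.log(waitCycles); // → 300000000 idle cycles per request
```

So a single slow network request costs the same as a million cache misses: nine orders of magnitude separate one clock tick from one trip across the internet, and blocking code burns that entire gap doing nothing.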


Once again, I find myself writing another rant about things that are as obvious to me as a child’s love for its parents. But I’ve spent over 20 years immersed in this crap. Most people are blissfully unaware that computing as it exists today acts as more of a barrier to progress than an avenue.

Let me give you an example.

You might have an idea for an iPhone app, but I can tell you with all honesty that you won’t be able to write it. Because first off you won’t want to pay the $100 to be an Apple developer. Then you won’t understand how provisioning works and for the life of you, it won’t make any sense why you have to sign your app cryptographically to make it run on your own iPad. Then after spending two weeks learning objective-c, it won’t help because you won’t be able to get the game development framework you downloaded to compile on your Mac, because you also need to install SDL or a javascript interpreter or MacPorts or some other tool you aren’t familiar with. You won’t understand the unix command they tell you to use, to configure and make the library. You won’t realize that there are two compilers on your computer, gcc and LLVM. You won’t know that even if you choose gcc, that the newest 4.2 version sucks and the library only compiles with 4.0. And you won’t know how to change the project settings in Xcode, because there are also target settings that can override them. Even if you manage to make it through all of these hurdles, and the dozen more that I haven’t even mentioned yet (I’ll give you a hint: corrupted project), there is still the matter of submitting the app to Apple, which won’t work either. When you decide to add iAds to your app, they might work a few weeks after you start. But when they do, you won’t make any money. Then you’ll learn about the ad services that serve ads from multiple agencies and the idiosyncrasies that are required to implement them. And after all of this, three months later, when you are earning $1 per day in ad revenue, you won’t know how to deal with that empty feeling you get when you hear how many millions of dollars Angry Birds is making. You won’t have an inkling of what people who have been in the business for decades are feeling.
And you won’t know that you should have been investing your time in affiliate marketing and just hired out the development to some poor shmuck who got sucked into this world before knowing what was entailed.

Because they know what you are just starting to grasp. That it shouldn’t be like this. That computers have vastly underserved their users. Conceptually, mobile and casual interaction is the future of computing. But it has no formal basis. It’s a beautiful shrine built on a foundation of tinker toys. The good tools like functional programming and provably correct algorithms are either too esoteric or too expensive for the mainstream. So far all investment has gone into racing ahead, instead of planning what type of future was being built. Today companies decide what tools you will use and the manner in which you use them, and are driven by profit, not progress.

And here’s why I find that at least a little demoralizing. Because the real secret they won’t tell you, heck, that I think is only dawning on a few people, is that today’s computing can’t take us into the future. It can’t provide true artificial intelligence or bring the kind of multiplication of effort that hackers take for granted to the masses. Computer science has utterly failed to tackle the real world problems, things like automating jobs so people don’t have to work, or working hand in hand with humans to explore solutions we have trouble seeing ourselves. We are so far from a Star Trek-style future utopia that it breaks my heart.

If you want to see the future of computing, look to every place that computing fails. The world is crying out for loose programming languages that just insert the darn semicolon instead of telling you that it’s missing, or dispense with it altogether. Sequential computing is at a standstill, and has been for close to 10 years now. We are long overdue for parallel computing. We need computers with thousands or millions of cores to better recruit the billions of transistors in chips today. We need better maths to find, describe and simulate the networks that will let these parallel computers communicate. Evolvable hardware and self modifying code need to go mainstream. Programming, at least the fundamentals, should be taught in elementary school just like any other language, but the teachers don’t yet have the required familiarity, so we should start with them. We don’t even have computers we trust enough to handle machinery, so we subject humans to a 21st century version of slavery in manual labor. I could go on and on and on about how a $700 iPad has about as much to do with revolutionizing computing as a TV hospital drama does with revolutionizing medicine. It’s all a farce, a sham. It’s giving candy to starving children.

I’m on the threshold now of rejecting this false idol, but for at least a little longer I have to cling to it to carry me through. I have a dream of starting some kind of open source movement for evolvable hardware and languages. The core philosophy would be that if your grandparents can’t use it out of the box to do something real (like do their taxes or call 911 when they fall down) then it fails. You should literally be able to tell it what you want it to do and it would do its darnedest to do a good job for you. Computers today are the opposite of that. They own you and bend you to their will. And I don’t think people fully realize how trapped we are within this aging infrastructure.

Think of the most programmable application out there, maybe something like Microsoft Excel. Do you know anybody who can actually get the macros to work? Probably not. And most programs are far worse than that, or don’t support programming at all. Visual Basic and Applescript are the laughing stocks of the programming world. For the most part, a user is either a programmer with full control (a tiny minority) or a novice who doesn’t even know that it’s possible to alter a program. In other words, the majority of users are algorithmically illiterate. That’s a travesty in 2011. Heck, my Mac Plus in 1987 with HyperCard was more approachable than anything today.

I guess I’m writing this to speak out against the status quo. I’ve heard a lot of regular folks criticize geeks for creating this mess and idolizing technology with a kind of zealotry, but what they don’t know is that many of us are just as appalled as they are. I don’t know if I can take hearing another self-aggrandizing nincompoop get up on stage and tout their amazing software. They talk so much about how the secret to their success was some tool or observation or trick. No. The truth was a combination of being in the right place at the right time, surviving all the B.S. that life threw at them when they were down and out, and frankly luck. The actual part about writing a new tool is such a tiny part of the equation that you could practically leave it out altogether and still find successful companies.

I guess I am just full of criticism and not offering a lot of solutions. So be it. I don’t expect any sort of revolutionary transition to truly innovative computing any time soon. What’s more likely to happen is that a few really smart people will write tools like node.js that encapsulate the nonsense so we don’t have to deal with it.

That’s a reason why one of my favorite languages is php. It just works. Screw up your variable typing? Who cares. Forget a symbol somewhere? Heck, you still get some output and can see your error. All that matters is that for a given input, you get a certain output. In a way, php is acting more functionally than real functional languages. That’s why it’s a hacker’s language. It doesn’t get between you and what you are trying to do. I need something with the flexibility and ease of use of php, but the formalism of lisp. Matlab is about the closest thing I’ve found.
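JavaScript, the other hacker language named above, shows the same forgiving behavior php is being praised for here: loose typing coerces rather than refusing to run (the exact rules differ; php’s "5" + 2 is 7, for instance, because its + is always numeric). A few illustrative coercions:

```javascript
// Loose typing in action: the interpreter coerces rather than
// refusing to run. For a given input, you still get an output.
console.log("5" * 2);        // → 10   ("5" coerced to a number)
console.log("5" + 2);        // → "52" (2 coerced to a string)
console.log(Number("") + 1); // → 1    (empty string coerces to 0)

// The price of forgiveness: silent surprises instead of loud errors.
console.log([] + {});        // → "[object Object]"
```

That last line is the trade-off in miniature: the program always keeps running and always gives you something to look at, and the cost is that nonsense flows quietly downstream instead of stopping you at the point of the mistake.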

But I would take it further. I’ve often thought about writing a human language based syntax like Hypertalk that has the power of Scheme. I don’t care so much about speed or efficiency. What I really need is expressivity. The language should adapt to my needs instead of throwing its hands in the air because I asked for a list’s size when it only understands length or count.
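That size/length/count complaint is concrete enough to sketch. Here is a toy design (my hypothetical illustration, not an existing language) where the list simply treats the three words as synonyms instead of rejecting two of them:

```javascript
// Toy sketch: a list that answers size, length, and count alike,
// instead of rejecting two of the three as unknown words.
function forgivingList(items) {
  const synonyms = { size: "length", count: "length" };
  return new Proxy([...items], {
    get(target, prop) {
      // route synonymous property names to the one the host knows
      return target[synonyms[prop] ?? prop];
    },
  });
}

const xs = forgivingList(["a", "b", "c"]);
console.log(xs.length); // → 3
console.log(xs.size);   // → 3 (mapped to length)
console.log(xs.count);  // → 3 (mapped to length)
```

The point isn’t the Proxy trick itself; it’s that accommodating the human costs the machine a one-line lookup table, and almost no mainstream language bothers.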

Rather than going on at length about how I would do things differently, I will instead ask you to be open to the possibility that some of what I say is true. That we ain’t seen nothin’ yet. That the powers that be prefer to keep things the way they are and make millions of dollars a year instead of liberating us from restrictive technology. One might look at someone like Richard Stallman and think that he resembles a crazy person. He has perhaps seen too much. But his adamant stance on free software is to be commended. If we don’t speak out against authority, the transnational oligopolies of the world will eventually force us into a sandboxed existence where we rent our technology from them for a recurring fee and they’ve patented every method of computing so we can’t use it. It’s scary how many aspects of our lives already work this way, and how many millions of people are completely oblivious to it. I still think this fight for the future is one of the great battles of our time. And like any other offensive, capturing hearts and minds is how it will be won.

Which is why I wrote this, another rant in a long list of many!

Continued here with a possible solution
