Wednesday, October 19, 2011

Tribute to Dennis Ritchie from Small Dog Electronics

RIP Dennis Ritchie
By Liam Flynn
The outpouring of praise and tributes for Steve Jobs this past week was really something to see, and it was well deserved. He was a true innovator and bold entrepreneur who had a huge influence on the development of our modern digital world and how we interact with it. However, all the focus on Steve Jobs made it easy to miss the news that another incredible genius passed away this week, one who arguably had an even bigger influence on the world than Jobs did.
In the early 1970s a man named Dennis Ritchie created a programming language he called C, and it is arguably the most important thing ever invented in the computing world besides the computer itself. C was first developed for the Unix operating system (which Ritchie helped create), and C and Unix now form the basis of almost everything we do with computers. The Internet largely runs on Web servers and routers running Unix-like systems written in C, or in languages such as C++ and Java that descend directly from it.
Mac OS X is a Unix-based operating system, a direct descendant of the operating system and programming language Dennis Ritchie brought to life, as is iOS, the mobile Apple OS that runs the iPad, iPod touch and iPhone. Even Windows was written in C at one point. Steve Jobs and countless other developers stand on the shoulders of this quiet man who worked behind the scenes yet was one of the most influential people in computing, if not the most influential of all. It is no exaggeration to say the world as we know it today would not exist without Dennis Ritchie’s work. He passed away this week at the age of 70 after a long illness.
Ritchie worked at Bell Telephone Labs, as his father had before him. Bell Labs in the 20th century was one of the most creative technological research and development institutions the world has ever seen, and its inventions and people have shaped the world we live in. Among a long list of Bell Labs’ accomplishments in both pure research and practical applications is the transistor, developed by three Bell researchers and probably the most important invention ever to come out of the Labs.
The transistor changed the world. It let us move away from vacuum tubes as components in electronic devices and start making things smaller, cooler, more portable and more powerful. It made modern electronics possible. One of the researchers who developed the transistor left Bell and in 1956 formed his own company in a small town in the region that would become known as Silicon Valley. Employees who split off from his company went on to found some of the most successful tech companies of all time, including Fairchild Semiconductor, Intel and AMD. The phone company may not seem very exciting, but Bell cultivated a culture of research and innovation that let brilliant minds have their way, and it produced brilliant results.
It was in this culture of innovation, inventiveness and practicality that Ritchie worked on operating systems in the computer sciences division of Bell Labs. He and several colleagues developed Unix in the late ’60s. Like most operating systems of the time, it was written in assembly language. Assembly language is a low-level programming language: a human-readable notation for machine code, the raw instructions a processor executes, in which each line corresponds more or less directly to a single machine operation on registers and memory. Higher-level languages sit above that; a single statement in a high-level language can stand in for many machine instructions, and so on up through levels of abstraction. The higher the level, the more powerful and expressive the commands can be, and the further removed they are from the machine code underneath.
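To make that jump in abstraction concrete (this small example is mine, not from the original article), here are a few lines of C doing something that, at the machine level, would unfold into many individual load, add, compare and branch instructions:

#include <stdio.h>

/* Sum an array of integers. Each C statement below stands in for
   several machine-level steps: loading a value from memory into a
   register, adding, storing the result, comparing and branching. */
int sum(const int *values, int count)
{
    int total = 0;
    for (int i = 0; i < count; i++) {
        total += values[i];   /* load values[i], add it to total, store */
    }
    return total;
}

int main(void)
{
    int data[] = { 3, 1, 4, 1, 5 };
    printf("%d\n", sum(data, 5));   /* prints 14 */
    return 0;
}

Written in assembly for a particular processor, that same loop would be a dozen or more instructions tied to that machine’s registers and addressing modes; written in C, producing those instructions is the compiler’s job.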
Of course, higher-level programming is the standard today; if it weren’t, there wouldn’t, for instance, be people designing apps for the App Store. Creating even the simplest apps we use today would involve monumental amounts of design and labor if each were written at the assembly-language level. In the era in which Ritchie worked, though, most engineers felt that low-level programming was the way to go. It gave them extreme control over the machines of the time, and higher-level languages tended to use scarce machine resources inefficiently, which was the dominant concern of the day. To put this in context, remember what that era was like. No iMacs or PCs in people’s homes, because they didn’t exist. No Internet. No i-anything. No slick user interfaces. Computers were mostly huge machines that guys in white coats with advanced degrees worked with.
After working for a couple of years with Unix, Ritchie found he needed a better way to program different builds of it. One of the drawbacks of low-level programming is that it becomes more platform-specific the closer you get to machine language. At the lowest level, the code is written for the specific processor and memory architecture of one machine. The result is that if you want to move an OS written that way to a different machine and be able to program it there, the OS essentially has to be rewritten to match that machine’s hardware, a complex and time-consuming task to say the least. Ritchie set out to develop a higher-level language that would overcome this hurdle and make Unix a portable operating system. Over the next few years he developed C, and the rest is history. Unix could now be ported to almost any hardware with relative ease; C was the language that made that possible, and it then let people really start programming.
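A small sketch of what that portability means in practice (the example and the build command are mine, not the article’s): the very same C source file can be compiled, unchanged, for completely different processors, because the compiler rather than the programmer does the translation into each machine’s instructions.

#include <stdio.h>

/* A portable C program: the identical source can be built for a
   PDP-11, an x86 PC or an ARM phone; only the compiler's output
   changes, not the program itself. */
int main(void)
{
    printf("hello, world\n");
    return 0;
}

With a typical toolchain this might be built with something like cc hello.c -o hello on each system. The assembly-language equivalent would have to be rewritten from scratch for every processor family.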
The other part of the equation that made C spread so quickly was that AT&T, under antitrust agreements at the time, was not allowed to sell computer products, including software, and was required to license anything it developed in the field to anyone who asked. So in a very short time this powerful, portable language was in the hands of thousands of people at universities and companies, all exploring its capabilities and developing software with it. Here we are forty years later, and C or one of its variants powers our digital world, running everything from the smallest handheld devices and system hardware like routers right up to supercomputers. Steve Jobs gave us elegant, visionary ways to interact with the digital world. Dennis Ritchie’s C built that world in the first place.
