The “Bibles” in the Information Technology Industry

The Computer Corner

By Charles Miller

Many professional disciplines have their own “bible” enumerating the industry minimums required to maintain safety standards. Ask any fireman about the fire code and be prepared to witness a solemn mood descend as he explains that the code is “written in blood.” Every page, even every word, in the code is there because some unfortunate soul once lost his life in a fire. I mention that because the standards “bibles” in the information technology industry also have a lot to do with death and dying, albeit of a different kind.

In the 1960s the engineers who laid out standards for what became the Internet decided on a “best effort” approach to routing communications. With amazing foresight they anticipated situations in which a router could get overloaded, so they wrote into the protocols permission for routers to simply drop information packets they could not route further. When overloaded with traffic, a router does not have time to send back failure notices, so the Internet protocol simply allows a busy router to ignore anything it cannot handle. Telephone engineers who looked at this idea were horrified; at the phone company they were accustomed to the idea that two people had to be connected to each other before a conversation could take place. The very idea of chopping up communications into thousands of individual packets, then dumping those into a network where each individual router along the way had permission to kill any packets it was too busy to handle, struck the telephone engineers as ludicrous. Preposterous or not, the concept of packet switching has worked out pretty well; in fact, engineers now agree that it is the only way the Internet could function and handle all the different types of data it carries.
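The drop-when-busy behavior described above can be sketched in a few lines of Python. This is a hypothetical illustration, not real router software: a router with a fixed-size queue that silently discards any packet arriving while the queue is full, sending no failure notice back to the sender.

```python
from collections import deque

class BestEffortRouter:
    """Toy model of a 'best effort' router: when its queue is full,
    new packets are silently dropped and nobody is notified."""

    def __init__(self, capacity):
        self.queue = deque()
        self.capacity = capacity
        self.dropped = 0

    def receive(self, packet):
        if len(self.queue) >= self.capacity:
            self.dropped += 1          # too busy: discard silently
        else:
            self.queue.append(packet)

    def forward(self):
        # Forward the oldest queued packet, if any remain.
        return self.queue.popleft() if self.queue else None

router = BestEffortRouter(capacity=3)
for n in range(5):
    router.receive(f"packet-{n}")
print(router.dropped)  # prints 2: the last two packets were discarded
```

Note that the router keeps no record of what it threw away; recovering the lost packets is entirely the sender's problem, which is the point of the next paragraph.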

When you communicate via the Internet, your iPad, computer, or other device breaks your information into small packets and sends them off through a network of routers; the individual packets may not even all take the same route to their destination. Any router anywhere along the line might simply delete some of those packets if it cannot handle forwarding them to the next destination. The routers are designed so that one that deletes traffic it cannot handle simply takes it on faith that somebody will notice that some packets did not arrive and do something about it. In other words, the router is counting on you and your computer to notice that not all your packets got through and to send the missing ones again. Without your knowing about it, your computer is busy doing this constantly.
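The retransmission loop your computer performs can be sketched as follows. This is a simplified, hypothetical model (real protocols such as TCP are far more elaborate): the sender chops a message into numbered packets, a lossy network randomly drops some of them, and the sender keeps resending whatever the receiver reports missing until the whole message has arrived.

```python
import random

def send_with_retransmission(message, chunk_size=4, loss_rate=0.3, seed=42):
    """Simulate sending a message over a lossy network, resending
    lost packets until the receiver has every piece."""
    rng = random.Random(seed)
    # Chop the message into numbered packets.
    packets = {i: message[i:i + chunk_size]
               for i in range(0, len(message), chunk_size)}
    received = {}
    outstanding = set(packets)
    rounds = 0
    while outstanding:
        rounds += 1
        for seq in sorted(outstanding):
            if rng.random() > loss_rate:   # this packet survived the trip
                received[seq] = packets[seq]
        # The receiver notices which numbered packets never arrived
        # and asks the sender to transmit them again.
        outstanding = set(packets) - set(received)
    reassembled = "".join(received[i] for i in sorted(received))
    return reassembled, rounds

text, rounds = send_with_retransmission("Packets may be dropped in transit.")
assert text == "Packets may be dropped in transit."
```

Because the loop repeats until nothing is outstanding, the reassembled message always matches the original, however many packets were lost along the way.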

As I mentioned earlier, this “best effort” approach to handling communications was something telephone engineers had a hard time accepting. They were unaccustomed to the idea that a system could allow so much of the traffic it handled to die in transit. Likewise, it is unimaginable that any fire marshal could sign off on any fire code that allowed for so much death and dying. In coming weeks I am going to elaborate further on how the “best effort” approach contrasts with the kind of building code the fire marshal follows, and how that affects every Internet user.

Charles Miller is a freelance computer consultant, a frequent visitor to San Miguel since 1981, and now practically a full-time resident. He may be contacted at 415 101-8528 or by email at FAQ8 (at)


