Locked in Tour Europe


Author: John Day (Follow on Twitter: @JeanJour)

How many people can you get in a Citroën Deux-Chevaux?

I’ll tell you: Seven.

Empirical evidence for this comes from the first OSI (Open Systems Interconnection) Working Group meetings in October 1978, which took place in the AFNOR Tour Europe building in La Défense, Paris.

We had been meeting all week, and it had come time to produce the first real draft of the OSI Reference Model, TC97/SC16/N117, the document with the first drawing of the hour-glass model, only then it was a martini glass. This was my first standards meeting. It was going to be quite an education, but I got a huge surprise that first morning when I walked in to find an old friend, Kenji Naemura, one of the Japanese delegates, sitting across the table from me. Kenji had gotten his PhD working on Illiac IV!

John Aschenbrenner’s first draft
of the Hour-Glass Model as it
appeared in N117.

I was staying out in Versailles with one of the CYCLADES guys, Michel Gien. We were putting in long hours all week, meeting well past 6, with lots of homework in the evenings. That Thursday, most of us were working on producing that final draft. Although the meetings were in the basement in Océanie (over the years I grew to have a real love-hate relationship with that room!), Michel and I were upstairs in the AFNOR offices, editing sections of the document on computers at IRIA so it could be reproduced for the Plenary in the morning. This was quite a problem for me, as I spoke little French. I was using an editor that I knew had all the usual commands, just not in English! And the keyboard wasn’t quite the same either. Some time after midnight, Michel and I realized that nothing more we edited could be reproduced and collated in time for the meeting in the morning. So we bagged it and went home to get some sleep.

A devoted cadre of seven people stayed: Charlie Bachman (US; they were his 7 layers), Hubert Zimmermann (France, WG1 Chair), Kenji Naemura (Japan), Don Shepherd (Canada), Tilly Bayard (AFNOR, WG1 Secretariat), and John Aschenbrenner and Jerry Foley (both US). Poor Tilly had been hired just two weeks before; this was her introduction to standards.

In those days, copiers just copied: they would run off the needed number of copies of each page, the pages would be laid out on a table, and then each person would go around the table picking up a page from each pile to assemble a complete copy. They finished at 4:30 AM, only to find themselves locked in Tour Europe. Zim found a way out by climbing through a transom and opening the door from the other side.

They then piled everyone into Zim’s green Deux-Chevaux. How they did it, I will never know. Charlie and Don were not small guys! I heard Kenji was sitting on Charlie’s lap. Zim dropped everyone at their hotels and headed to his place for very little sleep. We reconvened at 8:30 that morning, and we did not look good! But the document was ready for review.

For the next meeting, in June 1979 in London, I re-organized the document into more or less the structure it ended up with and file-transferred it to UCL over the ARPANET so we would have it for the meeting. So far so good!

The Necessity of Theory in Science, or Big Data is Anti-Science (Part 2)

Author: John Day (Follow on Twitter: @JeanJour)

Part 2 of 2. Read part 1 here.

I had been asked to write a review of a book for Imago Mundi, the premier history-of-cartography journal. Over the 2014 holiday break, I decided to knock it out. The book was on Jesuit mapmaking in early 18th-century China. (I have published a bit on this period.)
The book is primarily about the first major scientific mapping effort anywhere, instigated by the Kangxi Emperor, and the resulting atlas. But the book also discussed one of two well-known incidents in the late 17th century in which the Jesuits were pitted against the Court astronomers to see who could most accurately predict three astronomical events: a lunar eclipse, the length of the shadow cast by a gnomon at a given time of day, and the relative and absolute positions of the stars and planets on a given day.
The Jesuits produced more accurate results than the Chinese Court astronomers, and as a result were put in charge of the Court observatory in Beijing.

Why were the Jesuits’ calculations more precise? It certainly wasn’t because the Chinese couldn’t do the math to the required precision. After all, the Chinese had been using the decimal system for centuries. (When discussing surds, Needham [2] notes that the Chinese had adopted the decimal system so early that it isn’t clear they noticed that there were irrational numbers.)

Then why?

Because the Jesuits were using techniques developed with, and backed by, theory. They didn’t develop the techniques or the theory; others in Europe had done that. But the theory behind the techniques had forced the Europeans to be more precise to back up what they knew, to look more critically at their work, to think more deeply about it, and to improve their arguments, hence creating more precise techniques.

The Chinese, on the other hand, had a procedure to follow. They didn’t understand why it was correct, other than that it had always worked “well enough,” so why look further? (Hmmm, where have I heard that before!) They had been trained that this was the way to do it; they just knew it worked. And the procedure didn’t really indicate directions that would lead to improving it. (Needless to say, respect for authority and ancestor worship didn’t help in this regard.)

We are seeing the same thing on the systems side of computer science today, and especially in networking, where it has been a badge of pride for 30 years that they do not do theory. In 2001, the US National Research Council led a study of the stagnation in networking research; one quote from their report sums up the problem:

“A reviewer of an early draft of this report observed that this proposed framework – measure, develop theory, prototype new ideas – looks a lot like Research 101. . . . From the perspective of the outsiders, the insiders had not shown that they had managed to exercise the usual elements of a successful research program, so a back-to-basics message was fitting.” [1]

It must have been pretty sobering for researchers to be told they don’t know how to do research. Similarly, the recent attempt to find a new Internet architecture has come up dry after 15 years of work. The effort started with grand promises of bold new ideas, new concepts, fresh thinking, clean slates, etc., and has deteriorated through ‘we should look outside networking for ideas’ (a sure sign they don’t have any ideas, when in fact the answers were inside, as they always are); to ‘the Internet is best when it evolves’ (they have given up on new ideas); to ‘we should build on our success’ (it is hard to get out of that box).
When I asked my advanced networking class to read recent papers from the six NSF-funded Future Internet efforts, their first question after reading some of them was, “These were written by students, right?” Embarrassingly, I had to reply that they had been written by the most senior and well-respected professors in the field.

This is a classic case of confusing economic success with scientific success. They were focused on what to build, not asking the much harder and more dangerous question: what didn’t they understand? They didn’t question their basic assumptions, even though fundamental flaws were introduced as early as 1980, made irreversible by 1986, and compounded in the early 90s.

On the other hand, our efforts, which have questioned fundamentals and forced us (me) to change long-held views, have yielded one new and often surprising result after another: that a global address space is unnecessary; that router table size can be reduced by 70% or more; that a layer is a securable container, which greatly simplifies and improves security; that decoupling port allocation from synchronization yields a protocol that is not only more robust but also more secure; etc.

Of course, they have also shown that connectionless was maximal shared state, not minimal; that of the four protocols we could have chosen in the 1970s, TCP/IP was the worst; that of the two things IP does (addressing and fragmentation), both are wrong; that the 7-layer model was really only 3 (well, by 1983 we knew it was only 5); and that much of what has been built over the past 30 years is questionable. At nine major decision points in the Internet’s development, they consistently chose the wrong option, even though the right one was well known at the time.

There are many examples from networking where not doing theory has meant missing key insights. A few should suffice:

  • It is generally believed, and taught in all the textbooks, that establishing a connection requires a 3-way handshake of messages. However, this is not the case. In 1978, Richard Watson proved that the necessary and sufficient condition for synchronization for reliable data transfer is to bound three timers: maximum packet lifetime, maximum time to send an ack, and maximum time to exhaust retries. The three messages are irrelevant; they have nothing to do with why synchronization is achieved. Yes, three messages are exchanged, but there are always three messages; they aren’t the cause. Watson then demonstrated the theorem in the elegant delta-t protocol (a minimal sketch of the timer idea follows this list). By not doing theory, they missed the deeper reason that it worked, and missed that the resulting protocol is both more robust and more secure.
  • Many people will tell you that network addresses name the host, and that naming the host or device is important. (Several of the projects noted above are among them.) As it turns out, naming the host may be useful for network management, but not for communications; in fact, it is irrelevant for communications. If you construct an abstract model and look carefully at what has to happen, you see that what the address names is the “process,” the locus of processing, that strips off the header of the packet carrying the address. The host is merely a container. Well, you might say, there are places where there is only one “process” stripping off that header, so it and the host are synonymous. Yes, that case exists, and in large numbers. But it is not required to exist in all cases, and it doesn’t in some very significant ones. By not doing the theory, they missed this insight, which made dealing with VMs very messy.
  • In 1972, we first realized that in peer networks the “terminals,” now computers, could have multiple connections to the network. In all previous networks the “terminals” were very simple, and having only one connection was all that was possible. The advantage of a computer having more than one link to the network is obvious: if one fails, it still has connectivity. However, addresses in the ARPANET, like those in all previous networks, named the wire to the “terminal,” i.e. the interface. If one interface went down, the network had no way to know that the other address went to the same place; to the network, it appeared to be two different hosts. One could send on both interfaces, but not receive on both (not without re-establishing a connection to use the other address). Addressing the interface made addresses route-dependent, when addresses need to be location-dependent but route-independent. The solution is apparent if there is a theory, which we had in operating systems: application names designate what program and are location-independent; logical addresses provide location dependence but route independence (independent of where in physical memory); and physical addresses are route-dependent (dependent on how the memory is accessed). Naming the node, not the interface, solved this problem (a toy illustration follows this list). Not only did it not cost anything, it is significantly less expensive, because it requires between 60% and 90% fewer addresses, and router table size is commensurately smaller. All the other network architectures developed in the 1970s and 80s got this right; only the Internet, which doesn’t do theory, got it wrong.
  • But we still thought that addresses could be constructed by concatenating an (N-1)-address with an (N)-identifier. This seemed natural enough; after all, files were named by concatenating directory names down through the tree, with the file name as the leaf. That was until 1982, when we started to look at a detailed theoretical model of what would happen if we did that. It quickly became apparent that such an address defined a path up through the stack: concatenating the addresses made them route-dependent, precisely what we were trying to avoid. The address spaces at each layer have to be independent. Of course, it was obvious once you remembered what Multics called a filename: a pathname! But it was doing the theory that led to recognizing the problem. So why does IPv6 embed MAC addresses in the IPv6 address? Because they don’t do theory.
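To make Watson’s result a little more concrete, here is a minimal sketch in Python of the timer-based idea. This is not the delta-t protocol itself; the names and timer values are invented for illustration, and the quiet-time bound shown is one common formulation of it.

```python
import time

# Illustrative bounds, in seconds; in a real network these come from
# measured or enforced properties of the network, not constants.
MPL = 60.0   # bound on Maximum Packet Lifetime
A = 1.0      # bound on the time a receiver may wait before acking
R = 5.0      # bound on the time a sender spends exhausting retries

# Once a flow has been quiet this long, no packet belonging to it can
# still be in flight, so its state can simply be discarded; no
# handshake or explicit teardown is required for correctness.
QUIET_TIME = 2 * MPL + R + A


class Association:
    """Per-flow state whose lifetime is governed purely by timers."""

    def __init__(self) -> None:
        self.last_activity = time.monotonic()

    def touch(self) -> None:
        """Record that a packet for this flow was sent or received."""
        self.last_activity = time.monotonic()

    def can_discard(self) -> bool:
        """True once the quiet period guarantees nothing is in flight."""
        return time.monotonic() - self.last_activity > QUIET_TIME
```

The point of the sketch is that synchronization and state lifetime depend only on the three bounds; whatever messages happen to be exchanged are just data and acks.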
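The multi-homing point is just as easy to see in miniature. Below is a toy Python illustration (all names invented; this is not real routing code) of why a table keyed by interface addresses cannot fail over, while one keyed by node addresses can:

```python
# Interface addressing: a multi-homed host shows up as two unrelated
# destinations, so nothing tells the network they reach the same place.
routes_by_interface = {
    "host1-if0": ["link-a"],
    "host1-if1": ["link-b"],   # looks like a different host entirely
}

# Node addressing: one destination, several paths, and fewer entries.
routes_by_node = {
    "host1": ["link-a", "link-b"],
}


def usable_links(dest, table, down):
    """Return the links toward dest that have not failed."""
    return [link for link in table.get(dest, []) if link not in down]


# If link-a fails, interface addressing strands the flow...
assert usable_links("host1-if0", routes_by_interface, {"link-a"}) == []
# ...while node addressing simply shifts traffic to the other link.
assert usable_links("host1", routes_by_node, {"link-a"}) == ["link-b"]
```

The same structure also shows the table-size saving: one entry per node instead of one per interface.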

There are many more examples, all cases where not doing theory led to missing major insights and improvements. But notice that today we are doing the same thing the Court Astronomers were doing. Our textbooks recount how things work today, which students take as the best way to do things. We teach the tradition, not the science. We don’t even teach how to do the science; we don’t teach what needs to be named and why. Watson’s seminal result is not mentioned in any textbook. (One young professor asked me why he should teach delta-t if no one is using it. (!) I almost asked him to turn in his PhD! We aren’t teaching the fundamental theory; we are teaching the tradition.)

For a talk on this problem a few years ago, I paraphrased a famous quote by Arthur C. Clarke to read: “Any sufficiently advanced craft is indistinguishable from science.” (Clarke said, “Any sufficiently advanced technology is indistinguishable from magic.”) We are so dazzled by what we can do that we don’t realize we are doing craft, not science.

Big Data is the same thing, only worse. Big Data is accelerating the move to craft, and it is sufficiently sophisticated to appear to be science to the naive. Correlation is not causality. We create algorithms to yield results, but do we have proofs? Big Data supposedly tells us what to do without telling us why, and without contributing to a framework of theory that could lead to deeper, more accurate results and likely even deeper insights.

Even Wired Magazine called Big Data the end of science, although, as usual, they didn’t realize that what they were advocating was stagnation. Of course, every field goes through a period of collecting a lot of data before it becomes clear what is important and what the theory is. That has happened before. What hasn’t happened before is advocating that we don’t need theory. It is putting us in the same position as the Court Astronomers in 17th-century China, and the rate of change and adoption is far faster now than it was then.

There are those who claim it is a *new* science, when it is actually the greatest threat to science since the Catholic Church found Galileo guilty of proving a heathen (Aristotle) wrong. (I never have understood that one!) The Scopes trial was more circus than threat, though that may have changed in the backward US.

We are taking on the same characteristics seen in Chinese science in the 17th century. It isn’t pretty, and it isn’t just networking. Read the last five chapters of Lee Smolin’s The Trouble with Physics [3]. He sees it there! And others have told him they are seeing it in their fields as well.

Big Data has us on the path to stagnation, if we are not careful. Actually, we are a long way down that path…

  1. Looking over the Fence at Networking, Committee on Research Horizons in Networking, National Research Council, 2001.
  2. Needham, Joseph. Science and Civilisation in China, Cambridge University Press (Vol. I to Vol. VII, Book 1), 1965–1998.
  3. Smolin, Lee. The Trouble with Physics: The Rise of String Theory, the Fall of a Science, and What Comes Next, Houghton Mifflin, 2006.

Crypto-fascist lackeys of the military-industrial complex – The Illiac IV adventures, part II

Author: John Day (Follow on Twitter: @JeanJour)

(Continued from this post)

Let’s go back to the Illiac IV (the world’s first supercomputer) that I was telling you about in my previous blog. One time, the four of us leave Champaign for Philadelphia to do some work on the machine. The whole tone of the trip was set as we were getting off the Ozark flight to Chicago. The flight attendant said to me, “Are you guys in a band!?” I replied, “No, we are crypto-fascist lackeys of the military-industrial complex,” a stock phrase the radical fringe leveled at us.

Once in Chicago, we make our way to the connecting flight to Philadelphia. This trip was about a week after that first major hijacking, where they blew up those planes in the Syrian desert. TWA had started putting metal detectors at their gates. All this was way before there were metal detectors at the entrance to the concourses.
The four of us walk into the gate area and everyone is staring at us, giving us the hairy eyeball. We are told about the metal detectors. The problem is that we are carrying magnetic tape reels with our software in our backpacks. Hmmm, are these active or passive detectors, we ask ourselves. I called the office. No one knew. Okay.

So we hold back so as not to create a scene. As the last people are boarding, I walk up to the gate agent and say, “We can’t go through your metal detectors.” He replies, “You want to get on the plane, you gotta go through the metal detector.”
I say, “Look, we are carrying thousands of dollars of software on tape. I don’t want to get to Philadelphia and find out I have nothing.” He thinks about it and says, “Okay, put your bags down and we will search them. You guys go through individually, and we will unplug the detector so you can get on.” I say, “Great!” and so we do. However, Kravitz, who is a bit of a burly guy, with a frizzy beard down to the middle of his chest and frizzy long hair that comes out pretty much to the edge of his shoulders, takes off his army field jacket and hands it to the gate agent, saying, “You better search this; it will set off your metal detector.”

The gate agent gives Kravitz a weird look but, dutifully, starts going through the pockets, pulling out a spark plug, a crescent wrench, another spark plug, a pair of pliers… On the 3rd or 4th spark plug, he looks at Kravitz. Kravitz says, “I ride a motorcycle, and I like to think I can get myself started.”

You can tell this completes the gate agent’s image of us (and it isn’t flattering). He rolls his eyes, hands Kravitz his jacket, and we bolt for the plane. I end up sitting next to a black activist on his way to a demonstration, and we talk politics for the whole flight. Kravitz ends up sitting next to two grandmothers and spends the flight trying to explain to them that, no, he isn’t a student at the University of Illinois, he is a member of the staff!

I think we were asked four or five times on that trip if we were in a band, and every time I answered, “No, we are crypto-fascist lackeys of the military-industrial complex”!

Long haired bearded freaks – The Illiac IV adventures, part I

Author: John Day (Follow on Twitter: @JeanJour)

The first job I had was as a grad student at the University of Illinois, working on Illiac IV in 1970. For those of you unfamiliar with US geography, the University of Illinois has a student population of 35,000 and is located in the twin cities of Champaign-Urbana (population 200,000), about 300 kilometres south of Chicago on the Euclidean plains (not a typo). It is flat. Outside of the city, you can stand and see the horizon in all directions. On a 15’ topographic map, there is less than 2 metres of variation in elevation over most of the map, and the land is surveyed into 1-mile squares called sections. It sits in the midst of some of the richest farmland in the world; the topsoil is black, sticky prairie soil, 3 metres thick!

The Illiac IV
Now that the scene is set, let’s get back to Illiac IV. This was the first supercomputer ever: a parallel processor containing 64 processing elements and a control unit. The project was funded by ARPA (Advanced Research Projects Agency), and while the machine was designed at Illinois, the hardware was built by the Burroughs Corporation’s Defense, Space and Special Systems Division near Philadelphia.

Our group at Illinois was developing the software for it: the operating system, languages and compilers, and applications. Illiac IV was designed to be a “peripheral processor” to a B6700, which was itself still being completed; in fact, we were using Serial #2 of the hardware.
For those of you who never saw a B5500 or B6700, you missed a life-changing experience. This machine was elegance personified. It taught us that a system did not have to be a jumble of hacks, but could be an elegant, complete whole that just worked. Perhaps a bit more on that later.

Hippies, flower power, SD&RR!
I should note for our younger readers that this was a rather unusual time in the US and in Europe, especially at universities. Hippies, flower power, sex, drugs and rock n’ roll! Civil rights and the Vietnam War were hot issues, and student demonstrations were common. In those days, you could tell software guys from hardware guys by looking at them: hardware guys had crew cuts, pocket protectors, white shirts and ties, etc.; software guys were long-haired, usually bearded freaks in jeans, denim work shirts and, for us, Dingo boots.

Also, being the largest military contractor on campus, we were the target of demonstrations by various radical factions. There was an attempted firebombing of our office, but the bomb didn’t go off. That culminated later in a rally called against the project, which was a little odd, since our politics were for the most part in line with those of the student radicals. If anything, we had a better grasp of what was going on than they did, but that is another story.

2001: A Space Odyssey
One night we couldn’t work on the machine (hardware problems), so we decided to go to the movies. The only reasonable movie playing in the area was 2001: A Space Odyssey. Movies played at theaters a lot longer back then. We had all seen it, some probably more than once, but what the heck. If you remember, the last part of the movie has this extended “light show,” and it was not unusual for some people to go see 2001 stoned. It was the time of Haight-Ashbury, Jefferson Airplane and White Rabbit.

If you remember, HAL was built in Urbana, Illinois. So when that part came up in the movie, we all applauded. When the movie was over, the house lights came up. As we were leaving, we came across a row of freaks still sitting in their seats, who clearly *were* stoned. So we stopped to talk to them. The conversation went like this:

“You guys from around here?”
“No, we are from Illinois.”
“Oh, you are the guys who applauded.”
“Yeah, that’s right.”
“What are you doing out here?”
“Building the world’s largest computer.”
“You mean like HAL??!!” (said with a real tone of “Wow!”)
“Kind of.”

They were pretty out of it, so we said “so long” and left.

I figure that the next day they either thought they had hallucinated the encounter or could never convince their friends that they *really HAD* met the guys who were building HAL!! at the movie the night before. You can just hear their friends say: “Yeah, right. You guys were so stoned, you could have met the Easter Bunny!”

To be continued… Part two will be published in two weeks!

Read more about our #TNC15 keynote speakers

The TNC15 conference programme is available on the TNC15 website, and the keynote speakers are confirmed.

Six high-profile speakers will address hot topics in the realm of research and education networking:

    • John Day (Boston University, USA) will challenge our thinking about the engineering and design choices that were made in the early days of the Internet in the light of today’s Internet. In the opening plenary session he will issue a call to arms to adopt a new paradigm for Internet communications.
    • Manfred Laubichler (Arizona State University, USA) will show how hotspots of scientific activity can be detected by analysing large networks of collaboration over time. Effectively pursuing this kind of interdisciplinary research requires developing a novel type of research system for data-driven computational approaches at the intersection of science, the history of science and science policy.
    • Sarah Kenderdine (University of New South Wales, Australia) will elaborate on the challenges of using digital media in the research, preservation, management, interpretation, and representation of cultural heritage. She will examine the relationship between material and digital objects; the implications of digital technology for knowledge creation, documentation, and the concept of authority; and the possibilities for ‘virtual cultural heritage’ – the preservation and interpretation of cultural and natural heritage through real-time, immersive, and interactive techniques.
    • Timo Lüge (Social Media For Good) will discuss ‘Disaster response in a connected world’. The increased availability of mobile Internet access around the world is changing lives and relationships, and has a significant impact on how information flows in the case of natural disasters. His keynote speech addresses how this changes the way we have to think about disaster response and how disaster responders interact with the affected people.
    • Avis Yates Rivers (Technology Concepts Group Intl.) will talk about how unconscious bias works, how to spot it, and what to do about it. She will give lots of reasons why we want and need a diverse workforce.
    • João Paulo Cunha (University of Porto) will address the topic ‘smart cities’, taking the TNC15 host city Porto as an example where the local authority and city institutions have collaborated to create a Future Internet (FI) living lab at a city scale. He will present the concept, the current status and the future of this living lab, discussing new trends and FI services envisioned by the multidisciplinary multi-institutional project team.

‘Connected Communities’

The conference theme ‘Connected Communities’ reflects the complex situation NRENs (national research and education network organisations) find themselves in today. They are at an exciting crossroads in society. Not only do they provide the basic network infrastructure on which big science and big data build, but they also address the requirements of ever-increasing and more diverse communities of Internet end users in the wider research and education areas. Many questions remain unanswered in a time when privacy and security are not a given.

TNC15 captures the current debate with the conference theme ‘Connected Communities’. The conference provides a full programme in which a wide variety of other hot topics in research and education networking will be discussed, including future education models, advanced networking infrastructures, and federation and middleware.

Sponsors, exhibitors and demonstrations

If you would like to become a sponsoring partner / exhibitor of TNC15, or would like to demonstrate at the event, please contact the conference organiser, Gyöngyi Horváth, at horvath@terena.org.

Community Awards

During TNC15, we’ll reward people who have made a significant contribution to the research and education networking community with our yearly Community Awards. You can still submit your nominations! Visit the news page for more information.

Further information

TNC15 is organised by the GÉANT Association and hosted by FCT-FCCN, Portugal’s unit for the operation of the National Research and Education Network within the country’s funding agency for science and research, from 15 to 18 June 2015 in Porto. All information, including the full conference schedule, can be found on the TNC15 website.