
Author’s Note: This essay appears on pages 15-18 of the March 2009 COOK Report on Internet Protocol.

A Tipping Point for the Internet?

Catching the precise moment of a tectonic shift in a global system as large and important as the Internet may seem an exercise in the improbable. Nevertheless, in this summary I argue that we are in the midst of just such a shift. The largest portion of this March 2009 COOK Report issue is approximately 20,000 words of discussion of the ramifications of the exhaustion of the remaining pool of routable IPv4 address blocks (pp. 30-63). This article serves as an introduction to that detailed discussion.

The RIPE policy announcement of December 16, 2008 sets conditions by which RIPE members who hold IPv4 address block assignments can reassign some or all of them to other members. Although the action may at first glance seem trivial, it is happening only because the pool of assignable IPv4 numbers, a limited resource on which Internet growth depends, is running out. As the discussion in this issue points out, this change in policy has ramifications of which almost all of those who depend on the Internet are unaware. This shift in the permissible use of the Internet’s most basic economic resource will have profound consequences. It is tantamount to the adjustment of stresses deep within the San Andreas fault. The new policies will begin to send tremors through the global system in ways that bode ill for the open and competitive Internet we have known so far.

The Context

IPv4 numbers are the fundamental building blocks of the global Internet. While people can “participate” in the direct provision of Internet content and web hosting based on transitory assignments of IPv4 numbers, the assignment of IP numbers on a permanent basis from a Regional Internet Registry (RIR) is the only way a business can enable itself to route its customers’ traffic via an autonomous system number (ASN). These capabilities establish such a business as one of about 30,000 independent providers of Internet service in the world.
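For readers unfamiliar with the mechanics: an IPv4 number is a 32-bit integer, which caps the entire address space at roughly 4.3 billion addresses, and the familiar dotted-quad notation is merely a human-readable encoding of that integer. A minimal Python sketch, using an address from the RFC 5737 documentation range, illustrates the point:

```python
import ipaddress

# The entire IPv4 address space is 32 bits wide: about 4.3 billion
# addresses in total, a hard ceiling on permanent assignments.
TOTAL_IPV4 = 2 ** 32

# A dotted-quad address is just a rendering of a 32-bit integer.
# 192.0.2.1 is from the RFC 5737 documentation range.
addr = ipaddress.IPv4Address("192.0.2.1")
addr_as_int = int(addr)

print(TOTAL_IPV4)                          # 4294967296
print(addr_as_int)                         # 3221225985
print(ipaddress.IPv4Address(addr_as_int))  # 192.0.2.1
```

The fixed 32-bit width is the whole story of scarcity: unlike domain names, the supply of unique IPv4 numbers cannot be grown.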

With the beginning of the Internet in the 1980s, a handful of universities and large corporations were able to participate in the deployment of TCP/IP as a transparent overlay on carrier networks. The protocol was hardware independent in a way that other networking protocols were not. Because of this independence, the university community and its technology partners were able to construct an inter-network of networks on a large and rapidly growing scale. US government policy under the leadership of the NSF enabled the NSFnet backbone to interconnect with foreign networks in the late 80s and early 90s. At the same time, with a loosening of the acceptable use policy, the NSF enabled small-scale dial-up commercial providers to connect to university endpoints. The university-based Internet began to morph into the commercial Internet.

With the advent of the commercial Internet, as marked by the decommissioning of the NSFNet backbone on April 1, 1995, the number of service providers, by then in the hundreds, quickly increased to thousands of independent ISP businesses. To facilitate the process of IP number assignments (the street addresses for the delivery of ‘packets’), Regional Internet Registries (RIRs) were established. In turn these registries were used by their members to administer the policies that they, the members, established for the gradual assignment of the IPv4 blocks of numbers needed by newly formed ISP businesses. These were businesses that wanted to become independent economic participants in the rapidly growing protocol overlay being constructed with a diverse mix of technology inputs that included campus and corporate LANs, new green-field point-to-point facilities, and fractional infrastructure products bought or leased from incumbent carriers.

The IPv4 numbers were simply an indispensable part of the TCP/IP protocol. To set up a service using TCP/IP, the service providers needed to be able to assign their customers unique IPv4 numbers from which and to which packets could be sent. The use of IPv4 address blocks of specific sizes was justified on the basis that the ISP needed that many IP addresses — no more, no less — in order to connect useful things to the Internet, and that particular allocation remained justified for as long as that need remained. If the business was disbanded, the IPv4 blocks had to be returned to the registry to be reassigned. The blocks were not owned; they were not property. In economic terms, they were in effect “inalienable” — not subject to being sold or transferred by the original allocation recipient to a third party, or to being purchased or acquired in other ways by a third party.
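The “needs-based” sizing described above maps directly onto CIDR prefix lengths: every bit removed from the prefix doubles the size of the block. A small illustrative sketch (the 10.0.0.0 prefixes below are private-range examples chosen for illustration, not real RIR allocations):

```python
import ipaddress

# Block size doubles with each bit shaved off the prefix length.
# Under needs-based allocation, an ISP demonstrating a need for
# roughly 1,000 customer addresses would justify a /22 (1,024
# addresses), not a /16. The 10.0.0.0 prefixes here are from the
# private range, purely for illustration.
for prefix in (8, 16, 22, 24):
    net = ipaddress.ip_network(f"10.0.0.0/{prefix}")
    print(f"/{prefix}: {net.num_addresses:,} addresses")
```

This granularity is what made it possible for registries to match allocations to demonstrated need rather than handing out entire /8s, as had been done in the earliest days of the network.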

The blocks were like spectrum frequency assignments before frequencies were auctioned. While spectrum was there to be used according to the rules of the regulator, the IPv4 blocks were there to be used according to the rules set by the ISP members of the RIR. In the case of radio, you broadcast on the frequency; before auctions you could not claim to own it. In the case of the ISP, IPv4 numbers are obtained directly from the reserve address resource pool administered on behalf of the ISP community by the Regional Internet Registry, or alternatively from an RIR member ISP. This hierarchical arrangement, with neutral RIRs at the top, competing ISPs at the next level, and individual users below that, parallels the organization of the banking sector, with a central bank (or occasionally a “banker’s club”) at the top, competing lending institutions at the next level, and aspiring borrowers below, according to Tom Vest.

Vest suggests that this symmetry is no coincidence, but rather a product of the fact that IP addresses perform the same kind of “medium of exchange” function that money plays in the conventional economy — and that the uses of IP addresses are subject to the same kind of systemic risks that can render money useless in certain circumstances in the wider world, for example in times of extreme inflation or deflation.

Consequently, IPv4 number assignments had economic utility by virtue of the fact that they were an indispensable part of the TCP/IP protocol, which could be used as a transparent “overlay” technology across carrier networks. The overlay was transparent to the underlying networks, which did not distinguish data network protocols from voice. Inter-networks could be built atop various (telco) inputs, generally without requesting or securing explicit permission about how the inputs would be used. Why? Because TCP/IP was just one of many data protocols to which the telcos were obligated to provide common carriage.

Throughout the 1980s all this worked well. Nevertheless, as long as IP addresses were essential but scarce, the prospect of a single entity having control of IP addresses created an inescapable conflict of interest for competing service providers. Recognizing this fact, beginning in 1993 ISP communities started establishing quasi-independent, central-bank-like institutions — “Regional Internet Registries,” or RIRs — to administer the distribution of these critical resources for the purpose of connecting useful things to the Internet.

The Tectonic Shift

Until December 2008, IP addresses distributed via the RIR system could not be bought or sold; i.e., they lacked the quality of “alienability,” which economists regard as an essential feature of private property. But this lack was itself a feature, not a bug. IPv4 addresses could be used only by an entity that agreed to create value by actively contributing to the Internet system. IPv4’s lack of any other kind of use value, and the conscious collective decision to prevent it from acquiring exchange value, were the primary causes and rationale for creation of the RIRs and the “needs-based” allocation regime in the first place.

However, IP addresses can also have an “exchange value” if they have become scarce; that is, if those that actually need IPv4 now to attach useful things to the Internet can only obtain addresses from someone else that needs them less, and thus may be persuaded to part with them for some consideration. This is the strategy that was, in effect, chosen by the RIPE community when an IPv4 resource transfer policy was approved on December 16, 2008.

It was not the only conceivable strategy. For example, if the successor IPv6 addressing format had been transparently adopted by most or all community members, the scarcity and intrinsic exchange value of IPv4 addressing would have disappeared. But that did not happen.

Moreover, the fact that IPv6 has been rejected has even more far-reaching implications. Making IP number resources “alienable” strips them of one of the critical features that previously made them (1) irrelevant and transparent to countries and national jurisdictions, and (2) effectively manageable through voluntary, “self-governance” mechanisms.

If and when IP numbers become alienable, economic substance doctrine will eventually dictate that they ARE property for all practical (and legal, and regulatory, and taxation) purposes, regardless of whether some people might want to claim otherwise. The reason for this conclusion is that, given the economic impact of these markets, there will be litigation. When litigation occurs, the courts will ask whether the transaction (the address block transfer) had a meaningful economic purpose in order to be legitimate or sustainable in a court of law. An affirmative answer means that the IP block transfer is one that involves property (something of economic value to the possessor). Going down this road invites government involvement, because when IPv4 assignments become property, the only entity that can tell you what to do or not do with your private property is the property rights guarantor, i.e., the government.

Given the new opportunity to “own” IP addresses, how are incumbent service providers — and especially incumbent facilities-based carriers — likely to respond? The outcome of another recent and relevant privatization initiative — U.S. spectrum auctions — suggests a likely, if chilling, scenario. Given the benefits of securing scarce resources for their own customers, but even more importantly of blocking any possibility of competitive bypass, one may assume that incumbent territorial facilities owners will do everything they can to acquire all available IPv4 addresses. Success would empower them to become the unilateral arbiters of all uses of the TCP/IP overlay — they would be able to demand whatever share they want of any IP-based service that they permit, and absolutely preclude any services that they dislike.

Even if this scenario seems too pessimistic, it is hard to avoid the conclusion that, after all the usable IPv4 blocks have been distributed (circa 2011), the inheritors of RIR-era IPv4 will literally possess the keys to the kingdom. They will stand to achieve and enjoy permanent market power simply by doing nothing. That could mean the “open Internet” is finished forever — or at least that the next moment of openness may come only after some new technology is invented that makes it possible to bypass TCP/IP in the same way that the latter made it possible to bypass the arbitrary restrictions imposed by telco facility owners.

The prospects currently look grim. On 24 January 2009 the ARIN Advisory Council (AC), acting under the provisions of the ARIN Policy Development Process, recommended that the ARIN Board of Trustees adopt Draft Policy 2008-6: Emergency Transfer Policy for IPv4 Addresses.

This policy document will put in play a process even less transparent than that created by the members of RIPE. The only apparent way out might be for the registries and IANA to step back from the brink and establish procedures by which new entrants could be very gradually allocated the remaining numbers.

Where Do We Go Now?

No, the Internet will not disappear; however, it will very likely become much more expensive to use. It will also likely fragment and lose much of its ability to stimulate growth and innovation. Once a property right is recognized, history suggests that its beneficiaries rarely give it up willingly. If IPv4 addresses become property, then there will be no IPv6 transition, nor any other successor addressing format or technology that can succeed without the active support of future IPv4 owners; at least none that might undermine the advantages and rewards of IPv4 ownership. To expect otherwise would be to assume a level of altruism that has been scoffed at as unrealistic by the current generation of IPv6 refuseniks. If a transition is contrary to their individual private interests today, it will be doubly so if and when the only globally interoperable IP addressing resources are their exclusive property.

In this new, uncharted territory, anticipating the most likely course of Internet development is incredibly daunting. And yet, with the stakes as high as they are, it would be reckless to simply trust to fate that we will get to the other side with all of the Internet’s critical features intact, if not improved. A hasty, incautious reading of the new RIPE and ARIN policy developments might lead one to assume that nothing much of significance has changed. However, once one understands the dependencies between IP address allocation, registration, and address uniqueness (without which the Internet stops working) — and the links between neutral, eligibility-based allocation practices and overall industry openness — this seemingly modest little change takes on the greatest, broadest possible significance.

Of course, the new “resource transfer” policies won’t necessarily lead to industry closure, or to the inevitable erosion of the presumed uniqueness of individual “public” IP addresses — but it’s not clear that the implications of these possibly existential risks have been given all due consideration. If managed properly, perhaps a better, more open — possibly IPv6-based — Internet will result. If not, the scarcity of globally routable IP addresses could easily become the greatest bottleneck to continued growth and evolution in the Internet’s brief history.

The choices that will determine which course we all will follow are being made right now. Everyone who has any interest in the Internet, today or in the future, should take note, and if necessary speak up — now, while the future still remains malleable.

Acknowledgment: Thanks to Tom Vest who gave valuable assistance in getting all this in focus, and clarified many historical and technical details. Editorial comments or interpretations, especially regarding specific RIRs or RIR policies, are my own.

Editor’s note:

Thanks to Paul Budde for this essay.

Paul called it an “Australian case study.” I have changed the emphasis slightly because I intend to follow it tomorrow with a piece on the emerging private market for IPv4 addresses, and the following day with a piece on RFC 1744. I fully agree with Paul’s main thrust towards structural separation and open access. Telco facilities should serve the public interest. Would it not be a wretched irony to see, in three or four years’ time, or even sooner, the emergence of a new set of telco-owned facilities — namely, all the IPv4 address blocks in each region?

Paul writes:

Most developed nations are now revisiting their telecoms policies with a view to using telecoms infrastructure as a tool to revive the economy.

And when exploring this it quickly becomes clear that open networks are necessary if we are to achieve the economic benefits that the digital economy has to offer.

The multiplier effect of open infrastructure is obvious. It stimulates developments in healthcare, education, energy, media and Internet – this in stark contrast to the closed (vertically-integrated) networks that are currently operated by most incumbent telcos around the world.

There are several ways to achieve open networks, depending on local circumstances. Some countries have been able to use existing regulations to move in that direction, while others have introduced structural and functional separation. More positive approaches are also possible, depending on the participation of the incumbents in the process. Empowering local communities to develop their own networks would be one of the preferred options.

And, of course, there are combinations of all of the above.

However, in most situations some sort of regulation is required to get the market moving towards open networks – particularly in countries where there are strong vertically-integrated incumbents. In those cases I have not seen any solution other than separation (regulated or voluntary). This certainly is not the end game but it would be quite an achievement to be able to separate the operation of the infrastructure and the services, especially with such powerful players. Separation would certainly eliminate monopolies or duopolies (telco/cable), as these would no longer make sense. Instead new business models would evolve around functionalities (infrastructure, network management facilities, services, content, and distribution).

Lessons learned from Australia

The Australian experience shows how daunting it is to implement any policy activity that will affect the incumbents — even decisions that are less significant than separation. Remember Telstra threatening the Australian government with: “You will get the mother of all legal battles.”

Having had to deal with the consequences of the wrath of Telstra’s CEO, Sol Trujillo, for over three years I can imagine what kind of battle will lie ahead if separation were to be implemented, for example, in the USA.

Under Trujillo’s leadership Telstra has sued a Federal Minister, and has called the national regulatory body untrustworthy, a rogue, and “maggots” (eating its profits away). Its competitors (its major wholesale customers) have been called parasites.

The incumbent has refused all invitations from Ministers (of two different governments) to sit down and discuss telecoms from a national-interest point of view. It has refused to build a national FttH network in which others would be involved. It wants an ROI on this investment (in its own words, ‘north of 18%’), and it has also indicated that this would mean a wholesale access price of around $65 for a very basic (1 or 2 Mbps) broadband service.

Digital Economy Industry Group

I am presently leading industry workgroups on three continents, looking at government policies that will lead to open networks.

In Australia this group represents 140 companies. Supported by the Federal Minister for Broadband (Australia has a Minister for Broadband and the Digital Economy), the group has provided input in the broader debate as well as in specific inquiries. It also looked at plans for a combined industry effort to bypass the incumbent — to ignore it. After the companies involved had spent several million dollars investigating this, the conclusion was that it would be totally impossible to ignore the incumbent.

The government has put $5bn on the table, but even with that the industry can’t get a viable business plan together without strong regulations to ensure that the incumbent won’t spoil the investment.

Telstra has an ADSL2+/VDSL network rolled out, but it is making this available sparingly (only where it faces competition). Instead it has placed this network on hold in case another consortium gets the federal money to build an alternative network.

Without controls, Telstra has the market power to undermine any activity.

Fortunately the government has not given in to the incumbent’s bullying and blackmailing – some of Telstra’s threats have included: ‘we are the only ones that can build this network’; ‘national security would be at risk if any foreign company was involved in an alternative network’, and ‘who will maintain such a network – only Telstra can do this and you, the government, had better listen to us.’

Our industry group has set up several workgroups to support the government in its battle with Telstra. Focusing on the social and economic benefits of a national open network we are talking to the Federal Ministers for Healthcare, Education, Energy and Finance. We also have the moral support of the Prime Minister, who made the building of a National Broadband Network (NBN) one of his key election promises.

We fully understand that this has to be a Cabinet decision, not simply a decision made by one Minister, and so we are also engaging the other Ministers. In general, we discuss with them how they could develop policies within their Departments that will remove the impediments to realising the economic and social benefits of an NBN.

For instance, under current healthcare regulations a video consult is not covered under health insurance. And in the energy sector investments in intelligent networks by electricity utilities are being hampered because that is not seen as core to the energy regulator.

So, as well as the telecoms regulatory approach, we are simultaneously operating a very positive campaign promoting the social and economic benefits of the government’s $5bn investment in improved (open access) infrastructure.

Next month the government will make its decision, following an Experts’ Report assessing the tender proposals (Telstra didn’t put in a serious proposal – that’s how confident they are that they will retain their ‘monopoly’).

There are very strong indications (I hope) that the government will support the functional separation of Telstra. I believe this could lead to the departure of Sol Trujillo — it looks like he is already canvassing for a position in the USA, as he has written to President Obama selling the virtues of his work in Australia. Under new management I believe Telstra will return to the negotiation table and grudgingly accept functional separation, and a lower ROI on its infrastructure. And it may start looking at the many opportunities that this new environment will have to offer.

We realise that it will be years before we see the positive effects (competition, innovation). And we also know that more far-reaching structural changes will then have to be made. But we do believe that forcing Telstra to change and become part of the solution rather than the problem would be beneficial to the country. Positive proof of this is evident in countries like the UK, the Netherlands, Singapore, New Zealand and even conservative Switzerland.

And so it is against this background that others struggling with similar issues in other countries can learn from the lessons learned in Australia. I fully realise that Australia is not America, Canada, France or Singapore, but as telecoms is one of the most globalised industries there are enough similarities for our experiences to merit attention.

Past events, not just in Australia and New Zealand but in Europe and Asia also, have made me very wary of ‘positive’ action — based on relying on the goodwill of the incumbent either to cooperate or not to undermine the intent of government policies — unless it is accompanied by a very strong stand against the market powers of incumbents.

Together with a group of American and European experts, we are discussing many of the policies involved in current developments, and the Australian and New Zealand (where the government legislated functional separation and is making $1.5 billion available for fibre infrastructure) case studies are being analysed.

Paul Budde

Jaap van Till is another inventive Dutchman. See his article in the January 2009 COOK Report. On the Economics of IP Networks list he has come out with a most interesting new proposal.

“Last week I introduced in Holland the idea not of a new device/gadget, but of a cluster of gadgets which I call a “Hyperkamer” in Dutch, or “HyperRoom”. My intention is that students and teachers get a serious number of tiled flatscreens on a wall in their room at home, a Volkswagen version of the OptIPuter screens and networks with which scientists are experimenting. These are all controlled by apps on a smartphone, by pointing at parts of screens for TV, virtual models, games, or documents, or by manipulating images on the screen wall as a whole. The smartphone can also be used in-house to control all other gadgets and boxes in the room, and all electric functions in the smart home as well. In essence, it is the ultimate single remote device for the complete house. And everyone in the house has got one, to boot.”

“The name comes from the fact that the present Net Generation (age 13-30) is already HyperConnected, intensively using more than four online communication gadgets. The essence of my idea is that students in their HyperRoom can interoperate these gadgets and can learn and cooperate with lecturers and fellow team members in their room and at multiple locations simultaneously. By manipulating the info on the screens together, they can create economic value, and synchronize and synthesize their different contributions and visions for projects and mashups to design assignments, products and new solutions. I have asked the Netherlands government to fund a project (het HAN Hyperhuis Project) to let my smart, multi-talented students at the HAN University in Arnhem define and design such a multi-user networked virtual creative class environment for energy-efficient use in their own rooms at home. I can imagine that Apple and Google might help fund this project too. Maybe this is the metaphorical Car of the Future? It is not hard to imagine that the Oval Office will be a cool and well-connected HyperRoom soon, too. A room with a view indeed. Why does this professor do this dreaming? It is the least we can do to help Steve Jobs stay connected while recuperating and… I want one myself, don’t you?!”

Cook’s Edge: this makes sense. It syncs with Harvey Newman’s comment from the same January issue that “Even in the days when walls of your home are live displays (the walls themselves, as extensions of current OLED developments, not just screens), it will be the knowledge behind the images, and the ways they are used to inform and educate, as well as entertain, that will matter most.”

John Waclawski has been eloquent about the need for communications interoperability among all manner of networks in the home… and scathing in his comments on standards groups as too often bastions of non-interconnectivity. I suspect Jaap is quite correct that hyperconnected kids would relish opportunities to unlock their digital gadgets and make them interconnect. The world needs to be more democratically productive, and in an age of commodity hardware, open source software, and open interconnectivity, the next step is to broaden the open source nature of the OptIPuter by showing kids ways to interconnect their own communications devices in their own homes and schools. The desire is there, as this picture set from Japan shows.

On arch-econ Jaap explained further: “the “wall” is only one component in the room-network I propose. In the case of the HyperRoom, just ask any young intelligent person what he/she has in his/her room at home now: laptop, TV, beamer, game console, cellphone, video recorder, DVD player, CD player, MP3 player, books, coffee machine, loudspeakers, iPod, webcam, photo camera, etc.

Do they interwork? No. Can he/she, for instance, get the TV images on the laptop, or Internet YouTube or delayed-TV images on the TV set? No.

Obstacles: content owners, formats, proprietary technology. Island design. Just like wide area networks in the pre-internet era.

Students together can design around these obstacles and build first-class work/study/cooperation environments (micro-internets) for themselves if they band together and make a fist, using technological solutions which are available and under construction by the big-science guys (as depicted in the February COOK Report issue). All I suggest is to take that technology and put it in the hands of the young, starting with the college and university students. My slogans give focus and a banner text to such a movement. And, at least in the Netherlands, we have the broadband network infrastructure in place to make this move. So we can stay ahead.”

Cook’s Edge: So says Jaap. Smart man.

So where is ICTRegie? And where is an American equivalent program linking Ed Seidel, Harvey Newman, and Irving Wladawsky-Berger?

Anyone who thinks this is just “cute” should look at Tom DeFanti’s December 14, 2008 presentation on the state of OptIPortals and green computing. And then finish off with the consumer-grade tiled displays already being sold.

The Photoshop Lightroom Adventure: Mastering Adobe’s Next-generation Tool for Digital Photographers is my absolute favorite of all the O’Reilly photo guides — although the Rocky Nook Press books are themselves outstanding. Probably a year and a half ago I requested a copy of the first Lightroom Adventure book. It was so extraordinarily well done and such a visual treat that I shelved my Aperture software and bought a copy of Lightroom. I love it and find it utterly indispensable. However, without a manual, the software is pretty much unusable. The O’Reilly folk sent the Adobe photographers to Iceland and produced a volume of both great beauty and utilitarian value.

Last summer I paid for the Lightroom upgrade, and as soon as I saw Photoshop Lightroom 2 Adventure, I pleaded for my review copy. It is a gorgeous and absolutely indispensable book for all Lightroom users. This time the Adobe crew was sent to Tasmania. It is a source of pleasurable frustration that, as I try to do my newsletter work and digitize some 15,000 color negatives from Russia and the Himalaya, the time spent inside the book has not been nearly as long as I would like.

I have used it to decipher some of the post-processing tools — and there are more of them in this version. One that I look forward to is the graduated filter described on page 110. I remember this from the 1990s, when the physical filter was called a graduated neutral density filter. I bought one for my old film camera and tried to use it, but with uncertain results. The example in the Lightroom text is of a Tasmanian beach and sky, where the contrast of light in the sky and shade on the beach means that either the sky is washed out or the beach is too dark. The filter, however, produced such stunning results that the photograph is used on the cover of the book.

A few two-page spreads are used to advantage to show what the filters can do: the left-hand page without the filter, the right-hand page with it. See the negative clarity example on pages 220-221 and the very subtle Tone Curve adjustment on pages 228-229.

The book is a visual and artistic delight from cover to cover. Mandatory for the Lightroom user and so well done as to make anyone who thumbs through it want to have and use the software.

Being a considerable fan of O’Reilly photography books, when I saw the blurb for the new Canon EOS Digital Rebel Companion by Ben Long I requested a review copy. Having purchased a Sigma SD14 almost two years ago, and being less than happy with its operation, I decided to get another, somewhat less expensive digital SLR as a spare camera for a two-week trip to Greece last October.

With the O’Reilly book in hand I bought myself the Canon EOS Digital Rebel. What I would really like is an O’Reilly book for the Sigma, which I now understand I must update to the latest level of firmware. Once I do, I am told, I should expect much better behavior — earlier problems have been short battery life and a shutter that refuses to fire after the first three or four pictures.

But on to the book, which for the most part is quite good. One of my problems is that I am still self-employed for things other than photography, so I tried to steal some time on the way to the trip: at the Newark airport in October, waiting for my flight to Athens, I pored over the book and the new camera. I quickly had a considerable disappointment. The book was unable to tell me how to do the first and major thing that I wanted to do with the new camera.

Something short and simple - namely, set the camera to take pictures in RAW format. The information simply wasn’t there. The topic is discussed on page 138: the camera menu combines RAW and JPEG with three different file sizes. The text on page 138 guides you to the quality menu and then informs you that “from this menu you can choose from three different image sizes: L, M and S. Second, for each size you can choose from two different levels of compression. You can also choose to shoot RAW.” Informative, but only in the most general way. I tried and tried without success to get the camera adjusted to RAW and the largest size. In frustration I got out the manual that Canon provides and, using it, found out how to make those settings quite promptly.

The purpose of the book seems to be to guide the camera owner - whom the author assumes is probably new to photography, and certainly new to digital photography - in how to use the camera, and how to think about the aesthetics of what he or she is doing as well. Nothing wrong with that, but for my purposes not ideal.

The book is well designed, well written, and attractively laid out. It covers more advanced areas such as white balance and gray cards, as well as the understanding and use of the camera’s histogram. I suppose, however, that what I would be most happy with would be a book that is both a definitive guide to the workings of the camera and a tutorial about how to use it and why.

I am pleased with the camera itself. It survived a fall from shoulder level with no ill effect. The battery seems to last forever without needing to be recharged. The image stabilization gave me two or three good pictures in low-light conditions inside a monastery. The 18-55 mm kit lens that comes with the camera was acceptable, but I really missed the telephoto capability of the 18-200 mm zoom on the Sigma.

Today on my list Vincent Dekker wrote:

For those of you who love searching, and love nice-looking results even more, take a look at this new Google competitor that launched in beta yesterday…
For example, by searching for this… - Vincent

I tried it and like the result. It links at a high order into Wikipedia; names of historical people produce great results. A general term, “liberal economics,” produced useful results.

Melzoo.com has a press release that describes what they do reasonably well:

“They have developed a search engine that provides the results of the search on the left side of a window and the actual web site is framed on the right. The correct web site is displayed immediately as the mouse hovers over each individual search result.”

The rest is puffery that reminds me why I never have and never will pay attention to press releases.

But the engine itself is well worth a look, if only as a reminder that the Google model is not the last word. Of course I wouldn’t go looking there for my corner drugstore. But the execution and concept are very, very good.


A National Research and Innovation Network
What Can the US Learn from Dutch Experience?

February 2009 COOK Report

By means of an examination of research networks in Holland, this issue presents some ideas for ways in which an American National Research, Education and Innovation Network could be structured.

Now that, for the first time in more than a generation, the model of unregulated speculative financial capital has shown its bankruptcy, the ability of government to encourage the coordinated use of society’s resources in the public interest should become a focal point of our political life. By encourage I don’t mean dictate, but rather act on behalf of agreed-upon basic principles that wherever possible are carried out by decentralized groups. What can be done privately should be. But government must exert oversight and insist on transparency. We can only hope that the new Administration will begin to explore these and many other new ideas.

In the area of networks as an integral part of national social, economic, and research infrastructure, The Netherlands is now a world leader. The Dutch are building a national, largely open, fiber infrastructure. As we demonstrated in our January 2009 issue, The Netherlands has a pragmatic way of finding out “what works” and then just doing it. In this issue we examine in detail their current research and innovation network infrastructure.

In response to the Washington Post article “Genachowski May Be in Line for FCC Post,” I asked my Economics of IP Networks list what they thought. Would it be a good choice?

Erik Cecil [See also] responded:

Great guy; very smart; fabulous experience - both in gov’t and business. I have no information, however, on Genachowski’s policy views, but am enough of a pragmatist to appreciate how hard the job of running any regulatory agency, much less the FCC, is for any individual. In that regard, he’d be a brilliant pick, but so too would every serious candidate. The field of potential candidates is impressive. All have much to commend them to the job.

My bigger concerns are policy. Underneath those concerns is uncertainty as to whether one pick or another represents a direction, philosophy, or approach that harmonizes with policy directions most here see as fundamental.

Accordingly, as heartened as I am to see many of the bright minds from the ‘96 band being brought back together, I think this group is agreed that we cannot change present circumstances via application of past solutions.

This became extremely apparent to me during the panel I moderated a couple of weeks ago in Denver. Valeria Alberola, a senior partner with investment banker Q Advisors, floored the audience with a detailed analysis of where money is and where it is going (uh, towers (i.e. infrastructure) & applications top the EBITDA multiples list by about 3x everything else).

Tim Brown, head of CU’s Interdisciplinary Telecommunications Program, demonstrated how deeply flawed our wireless policies are, pointing to detailed analyses showing that 75% of our spectrum in any given U.S. market goes unused for more than a month at a time.

Chris Savage, as this list knows, demonstrated the death of the Chicago School of Economics and all that goes along with it. All three pointed to the fact that present approaches to regulation (and to be clear, I define “deregulation” as an approach to regulation) weigh like a giant anchor on money, innovation, policy, and practice.

(I followed up on the last point as a panelist on “Resolving and Litigating Interconnection Disputes,” where I shared a ground-level view of how to get things done and a slightly less ground-level view of what happens in regulatory litigation, including a sub-part entitled “The Singularity Physics of Litigation.”) Needless to say, I was in agreement with my earlier panel.

Long story short, one huge thing is clear, leading to two minor conclusions:

HUGE: “The dogmas of the quiet past are inadequate to the stormy present…As our case is new, so we must think anew and act anew.” Abraham Lincoln

MINOR CONCLUSIONS:

1. The FCC problem is too complex to solve in 30,000 lifetimes if we continue to try to repair this thing from inside the bubble. Don’t believe me? Two words: intercarrier compensation. Two more: universal service. I’ll stop there.

2. The policy goal is ridiculously simple: fix infrastructure. Dirt or spectrum, it’s ours. Enable today’s people and today’s functionality, not 100-year-old business & technological models, to define the edge. In other words, this has nothing to do with the “industry”, “telecommunications” or any of its relatively artificial regulatory definitions, all of which lead to incredible confusion, including the WSJ-sponsored madness we just saw re: Google.

To do otherwise is to condemn us to the same tugs of war over the same walled gardens, for the same reasons, between variously self-defined interest groups whose alliances and interests shift as fast as, or often faster than (e.g., positions change but the facts don’t), perceptions of value, market control, money, or jurisdiction. Even among the “new” groups I represented, I didn’t have much patience for that in 1996, but was hemmed in by statutes & agencies simultaneously effecting change while preserving the status quo.

So the job - and it was an expensive and time-consuming one - became one of jujitsu: exploiting unintended consequences at every turn. While this kind of Kung Fu fighting on the mean streets of DC’s power corridors (“K Street Fight Club”) was a hell of a lot of fun for a young lawyer (and I’ll be honest - even after many such fights - sometimes it still is), it was pretty clear even then that the end game would look a whole lot like the beginning, albeit with fewer left standing.

Perhaps in 2009 we no longer have the luxury of moving forward and backward at the same time (“K Street Fight Club II”). But if what the crowd wants is a sequel, I’ll bet they’ll have no trouble lining up the cast.

I have seen the future, and it can work – if we can gather the vision and do the necessary integration. I was science editor at the John von Neumann Supercomputer Center from 1987 to 1990. Twenty years later, in mid-November 2008, I attended Supercomputing 2008 in Austin, Texas. Nothing can compare to firsthand immersion. It was an Alice-like journey - popping down the rabbit hole and through the looking glass into a vision of a possible stunning future.

I met Harvey Newman of Caltech. Harvey is the architect and one of the principal builders of the global optical network that will collect the data for the Large Hadron Collider. We talked for close to two hours, and Harvey agreed to join my Economics of IP Networks forum. On November 22 he wrote there:

“The focus on video as the motivation for true broadband [must be] temporary.”

“Network applications involving access to, and sharing of large volumes of binary data as the basis of information, and ultimately as a basis of knowledge, are highly developed, but are not so visible in the world of entertainment and social networking, as they are in the realm of research. But soon corporations will learn to follow in the footsteps of the research community to handle and benefit from the knowledge implicit in such datasets, whether for healthcare or for other business processes, or for new forms of education, that complement web-page and video (more traditional) ‘content’.”

“Even in the days when walls of your home are live displays (the walls themselves, as extensions of current OLED developments, not just screens), it will be the knowledge behind the images, and the ways they are used to inform and educate, as well as entertain, that will matter most.”

The possibilities are profound. I was able to renew an acquaintance with Kees Neggers and to meet Cees de Laat for the first time. I met ever so briefly Ed Seidel, who is the Director of Cyberinfrastructure at NSF and who understands the significance of what Cees de Laat and his colleagues are doing. I have about three hours of recorded interviews with them and Harvey Newman. These will be the focus of the next two to three issues, where we will talk about hybrid optical networks that will send light paths across heterogeneous network boundaries. There is a lot more work to be done – work that will take another four to five years. But when it is finished, fiber-connected end users will be able to use a graphical interface to build light-path networks that operate as part of their applications.

And there is no reason why, if the issues of authorization and authentication are solved, these hybrid optical networks could not be available almost universally. TCP/IP would be used much less, less electrical energy would be needed, and – for the first time – the infrastructure would exist on which Google could truly deliver the world’s knowledge. This is happening in the Netherlands for sure. I am told it is happening in Japan. Will it happen in the US? Only if people like Kevin Werbach, Susan Crawford, Ed Seidel, and many others are able to convince the new administration that it must put the laying of open-access dark fiber into the emerging public works program and, as Japan did with NTT, achieve the unbundling of the incumbents’ networks.

In the United States this will involve some serious integration and education. But given access to Susan Crawford, Kevin Werbach, and other key folk – and the fact that for the first time since 1980 it should be possible to speak of the national interest without being laughed at – it should be possible here as well.

Lessons in Why National Fiber Infrastructure and
Carrier Unbundling Must Become Top Priority

On November 24, 2008 Bob Herbert wrote in the NY Times: “The idea that the nation had all but stopped investing in its infrastructure, and that officials in Washington have ignored the crucial role of job creation as the cornerstone of a thriving economy is beyond mind-boggling. It’s impossible to understand. Impossible, that is, until you realize that bandits don’t waste time repairing a building that they’re looting.”

Now that the looters have been removed from office we have an immense opportunity in telecom infrastructure if the right connections can be made.

CalTech physicist Harvey Newman summed up the situation to “arch-econ” with exquisite power when he said: “The focus on video as the motivation for true broadband [must be] temporary.”
