By Ismael Peña-López (@ictlogist), 14 May 2007
Main categories: Connectivity, Cyberlaw, governance, rights, Digital Divide, Hardware
Robert Guerra kindly handed me three books back when we met in Sevilla. One of them was Internet Governance: Issues, Actors and Divides, by Jovan Kurbalija and Eduardo Gelbstein. As the authors themselves state, there are at least five dimensions to Internet issues: Infrastructure, Legal, Economic, Development, and Socio-cultural. Each one is discussed in the chapters that follow.
Personally, I find the book really interesting and, honestly, much broader in scope and depth than what a first acquaintance with governance might bring to mind. Actually, I’d compare it to Chris Nicol’s ICT Policy: A Beginner’s Handbook, as the two share much the same aim, with Internet Governance: Issues, Actors and Divides being more recent and, indeed, going some steps further than Nicol’s. Both books make, in my opinion, a perfect pair for a good overview — and then some — of what the Internet is and how it affects me. In other words: they both make perfect e-awareness raisers.
Taking the book as a toolkit, and quoting again:
The main purpose of such an Internet Governance Toolkit would be to:
• organise the tools currently used in the Internet Governance debate;
• create additional cognitive tools;
• facilitate the inclusive nature of the Internet Governance process by providing interested parties with the tools to understand the issues, positions, and developments.
The Internet Governance Toolkit consists of:
• patterns and approaches;
• guiding principles;
• analogies.
Perspectives associated with Internet Governance
[image taken from the book]
Concerning the part on development and the digital divide, this table is of special interest to me:
ICT does NOT facilitate development:
• “Network externalities” help first-comers establish a dominant position. This favours the American giants, so that local firms in emerging economies would be effectively frozen out of e-commerce.
• The shift in power from seller to buyer (the Internet inevitably gives rise to an “an alternative supplier is never more than a mouse-click away” scenario) will harm poorer countries, mainly commodity producers from developing countries.
• Higher interest in high-tech shares in rich economies will reduce investor interest in developing countries.
ICT facilitates development:
• ICT lowers labour costs; it is cheaper to invest in developing countries.
• Very fast diffusion of ICT across borders occurs, compared to earlier technologies. Previous technologies (railways and electricity) took decades to spread to developing countries, but ICT is advancing in leaps and bounds.
• The opportunity to leapfrog old technologies by skipping intermediate stages such as copper wires and analogue telephones encourages development.
• ICT’s propensity to reduce the optimal size of a firm in most industries fits the needs of developing countries much more closely.
While the list is not complete, it gives some good hints about what is happening in the developing-countries arena.
By Ismael Peña-López (@ictlogist), 07 May 2007
Main categories: Cyberlaw, governance, rights, Hardware, Meetings
Other tags: idp, idp2007
The Congress on Internet, Law and Politics has the aim of continuing the task of reflecting on, analyzing and discussing the main changes taking place in law and politics in the information society. This third congress focuses on the questions that currently represent the most important challenges and new developments in the fields of copyright, data protection, Internet security, problems of responsibility, electronic voting, and the new regulation of e-Administration, as well as dedicating a specific area to the current state of the use of new technologies by law professionals.
One of the biggest struggles in History has been literacy. Now that it seemed [at least in developing countries] that we had achieved some “good” level of literacy, along comes the Internet and everyone has to get into digital literacy. Indeed, last year [2006] more transistors were produced than grains of rice.
Data ownership goes back to the citizen. Data is not the Government’s property, but the citizens’. The big challenge for e-Administration is not putting all procedures online, but making them disappear: an event (e.g. your husband’s death) should automatically activate all kinds of procedures (e.g. widow’s subsidies) without the need to apply for them.
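As a minimal sketch of that idea (all names and events here are hypothetical, not any real e-Administration system or API), procedures could subscribe to life events recorded once by the registry:

```python
# Hypothetical event-driven e-Administration sketch: the civil registry
# publishes a life event once, and every subscribed procedure starts on
# its own, with no application needed from the citizen.
from collections import defaultdict

subscriptions = defaultdict(list)  # event type -> procedures to trigger

def on(event_type):
    """Decorator subscribing a procedure to a type of life event."""
    def register(procedure):
        subscriptions[event_type].append(procedure)
        return procedure
    return register

def publish(event_type, **data):
    """Record an event once; trigger everything subscribed to it."""
    for procedure in subscriptions[event_type]:
        procedure(**data)

@on("death")
def open_widow_subsidy(citizen_id, spouse_id):
    print(f"Subsidy procedure opened for {spouse_id}")

@on("death")
def close_tax_account(citizen_id, **_):
    print(f"Tax account of {citizen_id} closed")

# The registry records the event once; everything else follows.
publish("death", citizen_id="ES-123", spouse_id="ES-456")
```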
Universal access should cover telephony (both fixed and mobile lines), broadband, audiovisual services and content. This access means information, training and, actually, more democratic exercise, transparency, etc.
Keynote speech: The Future of the Internet and How to Stop It
Jonathan Zittrain, Professor of Internet Governance and Regulation, Oxford Internet Institute
One model of computer: in the hands of experts, you rent a solution, which comes from designing software and putting it into hardware, but you never approach either the machine or the design of the solution. This was IBM’s model when it was created, back in the times of Herman Hollerith, who calculated the US 1890 census. IBM (then the Tabulating Machine Company) had “general purpose” machines that could be reprogrammed for anything.
The Flexowriter represented a second model: one solution, unchangeable, non-reprogrammable, but you could have it at home.
Bill Gates bet on a general-purpose personal computer: it could be reprogrammed by the end user, who had the machine at home.
And the story with networks is similar. Prodigy and CompuServe would show the user closed content depending on what he wanted (e.g. weather) and who he was (e.g. a professional).
The birth of the Internet, the network its founders built, was totally open: it had an “hourglass” architecture where any use and any platform were joined by a common “bypass”: IP, which just moved bits from one place to another.
And collaboration goes on and on. More and more software is built upon a “procrastination principle”: we build it up to this point, and someone else will take it further. But this kind of collaboration can also be subverted to break the barriers others set up to secure their content, etc.
And now the problem is that you cannot keep malicious code, malware, viruses, etc. from entering or operating on your own system without being disconnected from the Network… or without leaving the generative PC paradigm, where everyone could program and contribute their own code and content. If we are going back to the “main menu”, where all options are predetermined, is this the end of the generative Internet? Actually, is this the end of the Internet itself? And the same applies to who connects to what. It looks like our need to “save” our systems is creating more and more barriers similar to those of censorship (something dealt with at the OpenNet Initiative).
StopBadware is about writing software that helps you know how some other piece of software works and whether it is making your computer (and its user) “happier”.
The generative pattern:
- Origins in a backwater
- Ambitious but not fully planned: procrastination principle
- Contribution welcomed from all corners
- Success beyond expectation
- Influx of usage
- Success is cut short: movement towards enclosure
The solution? For it to be true to the generative pattern, it must originate in a backwater. The Internet is an educational system: it requires people to give feedback and to build along with others.
See also:
3rd Internet, Law and Politics Congress (2007)
By Ismael Peña-López (@ictlogist), 12 April 2007
Main categories: e-Government, e-Administration, Politics, Hardware
Here comes the bibliography I’m using to teach my course Technological Grounds of e-Administration, part of the Master in e-Administration at the Universitat Oberta de Catalunya.
Bibliography
Fabra, P., Batlle, A., Cerrillo, A., Galiano, A., Peña-López, I. & Colombo, C. (2006). e-Justicia: La Justicia en la Sociedad del Conocimiento. Retos para los países Iberoamericanos. Santo Domingo: ejusticia.org. Retrieved October 7, 2006 from http://www.ejusticia.org/component/option,com_docman/task,doc_download/gid,89/lang,es/
Further information
This is not an evolving selection, though it might see slight changes. The up-to-date version of this list can always be consulted here: Fundamentos Técnicos de la Administración Electrónica. Feel free to write back to me with proposals for inclusion in the list and/or corrections for any errors you find.
By Ismael Peña-López (@ictlogist), 13 March 2007
Main categories: Connectivity, Hardware
Working with Information Society / digital divide indicators is a tricky thing to do, as definitions (along with technology) change over short periods of time. Some months ago, Tim Kelly asked me what I considered “broadband”, as it was one of the hottest issues that researchers in general, and the ITU especially, had to deal with. Let’s see an example.
Broadband is defined by the International Telecommunication Union (ITU) Telecommunication Standardization Sector (ITU-T), in its recommendation I.113, as transmission capacity that is faster than primary rate Integrated Services Digital Network (ISDN), that is, 1.5 or 2.0 Mbit/s. The OECD, on the other hand, gives its own definition, setting the threshold for downstream access at 256 kbit/s. The fact is that, as the OECD itself admits, network operators widely advertise DSL and cable modem services starting at 256 kbit/s as being ‘broadband’. Actually, the Core ICT Indicators, promoted by the Partnership on Measuring ICT for Development — partnered by the ITU — also define broadband as technologies providing speeds of at least 256 kbit/s, where this speed is the combined upstream and downstream capacity.
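The differences between these thresholds are easy to miss, so here is a minimal sketch; the figures come from the definitions quoted above, while the helper names and the sample 512/128 kbit/s DSL line are hypothetical:

```python
# Hypothetical helpers contrasting the three definitions quoted above.
# Speeds are in kbit/s; the thresholds come from the cited documents,
# everything else (names, the sample line) is made up for illustration.

PRIMARY_RATE_ISDN_KBPS = 1500  # the 1.5 Mbit/s variant cited by ITU-T I.113

def itu_t_i113(down_kbps: float) -> bool:
    """Broadband = faster than primary rate ISDN."""
    return down_kbps > PRIMARY_RATE_ISDN_KBPS

def oecd(down_kbps: float) -> bool:
    """Broadband = downstream access of at least 256 kbit/s."""
    return down_kbps >= 256

def core_ict_indicators(down_kbps: float, up_kbps: float) -> bool:
    """Broadband = at least 256 kbit/s of combined up + down capacity."""
    return down_kbps + up_kbps >= 256

# A hypothetical 512/128 kbit/s DSL line is broadband for the OECD and
# the Core ICT Indicators, but not for ITU-T I.113.
print(itu_t_i113(512), oecd(512), core_ict_indicators(512, 128))
# -> False True True
```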
Summing up, all of these are technical definitions, based on transmitting more than one data stream over the same wire by using different frequencies or channels. But for the non-technical user, broadband is strictly tied to “effective” speed or, in other words, “subjective” speed: if your 1 Mbit/s is the slowest in town, it is broadband no more. This was Tim Kelly’s point the last time we met.
That said — and leaving technical issues behind to focus on this “subjective” broadband perception — my proposal is to build a basket of tasks, the way economists calculate changes in inflation based on a basket of products. Of course, this basket of tasks is also likely to evolve over time, but what is crystal clear is that the technical definition of broadband (the one about channels) is no longer useful, and the decision to state that, say, 256 kbit/s is broadband should lean on an objective basis rather than on network operators’ advertisements.
Proposal of a basket of tasks for a broadband definition
- Work in online, synchronous collaborative environments with rich media: VoIP, videoconference, screencasting, presentations/drawings…
- Work intensively/exclusively with online, asynchronous desktop/office applications: word processors, spreadsheets, math/scientific calculators…
- Usually access online applications with the richest graphical content: GIS and mapping tools, 2D and 3D simulators and environments
- Have online environments as primary communication and information channel: e-mail, instant messaging, browser and desktop widgets. It includes software downloads and updates.
- Manage a website: upload files, install applications, change configuration/setup. It does not include writing on a weblog/wiki and other low-tech “webmastering”
- Work with remote computers or in grid computing, including intensive use of P2P networks
This basket of tasks, and the minimum speed required to perform its tasks correctly and comfortably, should help set the threshold of what we could call broadband. As those tasks evolve over time, so will the broadband threshold. As an example, some years ago you needed then-so-called broadband to check the Perry-Castañeda Library Map Collection when looking for geographical information, as most maps weigh some hundreds of KB, the heaviest up to a few MB. Nowadays you would browse Google Maps, for which today-so-called broadband is required, maybe more than the “official” 256 kbit/s to browse at ease.
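A minimal sketch of how such a basket could yield a threshold, assuming purely hypothetical per-task speeds (they are placeholders for illustration, not measured requirements):

```python
# A minimal sketch of the "basket of tasks" proposal. Each task carries
# a minimum speed (kbit/s) at which it can be performed comfortably;
# the broadband threshold is whatever speed covers the whole basket.
# All figures below are hypothetical placeholders, not measurements.

BASKET_KBPS = {
    "synchronous rich-media collaboration (VoIP, videoconference)": 512,
    "online office applications": 256,
    "graphics-heavy applications (GIS, mapping, 3D)": 1024,
    "primary communication channel (e-mail, IM, updates)": 256,
    "website management (uploads, installs, configuration)": 512,
    "remote computers, grid computing, P2P": 2048,
}

def broadband_threshold(basket: dict) -> int:
    """Speed needed to perform every task in the basket comfortably."""
    return max(basket.values())

def is_broadband(speed_kbps: int, basket: dict = BASKET_KBPS) -> bool:
    return speed_kbps >= broadband_threshold(basket)

print(broadband_threshold(BASKET_KBPS))  # 2048 with the figures above
print(is_broadband(256))                 # False: fails most of the basket
```

Re-pricing the basket as tasks get heavier (static map files gave way to Google Maps) automatically raises the threshold, which is exactly the dynamic behaviour argued for above.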
Proposals, corrections, comments gratefully welcome.
By Ismael Peña-López (@ictlogist), 27 February 2007
Main categories: Hardware, ICT4D
We have talked here about mobile phones for development several times. Positively, as a proven and effective tool to let poor people access the Information Society when other, more costly infrastructures are nonexistent and/or cannot be provided — because of cost or because of technical difficulties (say, cost again, as almost each and every difficulty can be overcome with money). Negatively, because mobile telephony in less developed countries usually relies on GSM networks, that is, low-bandwidth networks which, while providing access, provide a lower-quality access than the broadband networks — fixed and mobile — of developed countries, thus widening the digital divide.
Nicholas P. Sullivan now provides us with another example of how mobile telephones can enhance both local development and the fostering of the Information Society, by explaining the GrameenPhone experience in Bangladesh, something we already reviewed when we talked about the Village Phone Replication Manual.
Sullivan’s book, You Can Hear Me Now: How Microloans and Cell Phones are Connecting the World’s Poor to the Global Economy, offers a compelling account of what Sullivan calls the external combustion engine — a combination of forces that is sparking economic growth and lifting people out of poverty in countries long dominated by aid-dependent governments. The “engine” comprises three forces: information technology, imported by native entrepreneurs trained in the West, backed by foreign investors.
The book has two parts: the first, The GrameenPhone Story, about why and how the project took place; the second, Transformation Through Technology, seemingly devoted to reflection and analysis.
By Ismael Peña-López (@ictlogist), 12 December 2006
Main categories: Connectivity, Hardware, ICT4D
Michael Trucano and Marco Zennaro each sent me a resource concerning ICT infrastructures, both of them published by the World Bank.
The first one is the update of the Quick guide to low-cost computing devices and initiatives for the developing world, a short inventory of known projects related to ‘low cost ICT devices for the developing world’, authored by Michael Trucano himself. The list looks quite complete to me, and the accuracy of a previous work of his, Knowledge Maps: ICTs in Education, leads me to sincerely believe that it is.
The second one is World Bank Working Paper no. 27, Telecommunications Challenges in Developing Countries: Asymmetric Interconnection Charges for Rural Areas, by Andrew Dymond. Going against what tradition has dictated, Dymond states that the solution to the last-mile problem should not be subsidies — as it usually is — but asymmetric pricing (i.e. not the same for everyone), adjusting what end users pay to the cost of providing them with connectivity. Though the approach is quite unheard of and truly challenging, he provides examples of how this new scenario could be possible… and even desirable for the end users themselves, let alone the companies.
For those new to Dymond, he co-authored — with Sonja Oestmann — the handbook Rural ICT Toolkit for Africa, and — with Juan Navas-Sabater and Niina Juntunen — the well-known book Telecommunications and Information Services for the Poor: Toward a Strategy for Universal Access.
By the way, infoDev’s ICT Regulation Toolkit has also been updated and should be almost complete by January 2007.