At the 2014 IGF in Istanbul, Turkey, in a workshop on policies to promote broadband access in developing countries organised by Rui ZHONG of ISOC China, we realized that while technological solutions are advancing rapidly, policy and regulation remain a significant barrier to affordable internet, especially in the developing world. According to a report by the Alliance for Affordable Internet (A4AI), the key to affordability is the policy and regulatory environment that shapes the different actors in the market.
There are many predictions that the next big wave in telecoms is M2M (machine-to-machine communication) and that this will be the next growth market for the telecoms industry. There is no doubt that M2M is a revolutionary development, but we need to separate the hype from the reality. To do this, it is best to divide the major developments into two main areas, although others will no doubt emerge over time. One area is the sensors being installed in electricity grids, environmental monitoring networks, roads, and other infrastructure.
This never-ending story is used by opportunistic telcos and their lobbyists to confuse the issue and gain regulatory or political advantage. The debate is now raging again in the USA. In an attempt to play down their monopolistic position in the market, the three telcos, and this time Comcast in particular, are claiming that real competition does in fact exist in the American broadband market, citing competition from mobile 4G LTE services as an example.
Last year, I gave a short talk at the Commonwealth Telecommunications Organisation (CTO) Forum in Abuja, Nigeria, on this topic, and I thought I'd offer more details here. Let's say an African government has realised that IPv4 address exhaustion imposes limits on its country's ICT development. Initial investigations indicate that deploying IPv6 is the only sustainable solution. Governments, however, are not very skilled in the bottom-up approach that is popular in the Internet world.
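As a concrete starting point for such an investigation, one quick check is whether a given host already has working IPv6 by attempting an IPv6-only connection to a dual-stacked service. The following is a minimal sketch, assuming an arbitrary example hostname; a failed probe only means this particular machine lacks a working IPv6 path, not that the whole network does.

```python
import socket

def has_ipv6_connectivity(host="www.google.com", port=443, timeout=5):
    """Crude IPv6 readiness probe: can this machine open a TCP connection
    to `host` over IPv6? It says nothing about the rest of the network."""
    try:
        # Ask the resolver for IPv6 (AAAA) results only.
        infos = socket.getaddrinfo(host, port, socket.AF_INET6,
                                   socket.SOCK_STREAM)
    except socket.gaierror:
        return False  # no AAAA record, or the resolver path failed
    for family, socktype, proto, _canonname, sockaddr in infos:
        try:
            with socket.socket(family, socktype, proto) as s:
                s.settimeout(timeout)
                s.connect(sockaddr)
                return True
        except OSError:
            continue  # try the next address, if any
    return False

if __name__ == "__main__":
    print("IPv6 connectivity:", has_ipv6_connectivity())
```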
I can't help but think that the situation in this rather old joke applies very precisely to the current Australian efforts to compel network operators, through some contemplated regulatory instrument, to record and retain network-collected data about their customers' online activities. What I'd like to examine here is the emerging picture that while networks, and network operators, make convenient targets for such surveillance efforts, the reality of today's IP networks is far more complex, and Internet networks are increasingly ignorant of what their customers do.
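One reason for that growing ignorance is ubiquitous transport encryption. The sketch below, a minimal illustration assuming an arbitrary example hostname, opens an HTTPS session and prints the little that an on-path operator can actually observe: the endpoints and the handshake parameters, with everything else travelling as ciphertext.

```python
import socket
import ssl

hostname = "example.com"  # arbitrary example target
ctx = ssl.create_default_context()

with socket.create_connection((hostname, 443), timeout=5) as raw:
    # Visible to any on-path observer: the TCP endpoints...
    print("TCP endpoints:", raw.getsockname(), "->", raw.getpeername())
    with ctx.wrap_socket(raw, server_hostname=hostname) as tls:
        # ...plus the handshake parameters (and the server name, via SNI).
        print("Negotiated:", tls.version(), tls.cipher()[0])
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n"
                    b"Connection: close\r\n\r\n")
        # We can read the plaintext because we hold the session keys;
        # an operator in the middle sees only encrypted TLS records.
        print("First bytes of response:", tls.recv(64))
```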
August 2014 is proving yet again to be an amusing month in the Australian political scene, and this time the source of the amusement is watching a number of Australian politicians fumble around the topic of digital surveillance and proposed legislation relating to data retention measures. The politicians assured us that the proposed data retention measures were nothing untoward, and that all that was being called for was the retention of "metadata" by Australian ISPs for a period of two years.
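What would such "metadata" actually look like? As a purely hypothetical illustration (the field names and values here are invented, and real retention schemes vary), a per-connection record might capture who talked to whom, when, over what protocol, and how much, but not what was said:

```python
from dataclasses import dataclass, asdict

@dataclass
class ConnectionRecord:
    """A hypothetical per-connection "metadata" record: endpoints,
    timing, protocol, and volume, with no payload at all."""
    start_utc: str
    end_utc: str
    src_ip: str
    src_port: int
    dst_ip: str
    dst_port: int
    protocol: str
    bytes_up: int
    bytes_down: int

# Invented values, using documentation-range addresses (RFC 5737).
rec = ConnectionRecord(
    start_utc="2014-08-10T09:15:00Z", end_utc="2014-08-10T09:17:02Z",
    src_ip="203.0.113.7", src_port=51514,
    dst_ip="198.51.100.23", dst_port=443,
    protocol="TCP", bytes_up=12400, bytes_down=842000,
)
print(asdict(rec))  # note the absence of any content field
```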
At APNIC Labs we've been working on a new approach to navigating through some of our data sets that describe aspects of IPv6 deployment, the use of DNSSEC, and some measurements relating to the current state of BGP. The intent of this particular set of data collections is to place the data into a relative context, displaying comparisons of the individual measurements at the level of geographic regions, individual countries, and individual networks.
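To illustrate the kind of roll-up involved, here is a small sketch of how per-network measurements can be aggregated into country-level and region-level views. The ASNs come from the documentation range (RFC 5398) and the capability fractions are made-up placeholders, not APNIC data; only the aggregation mechanics are the point.

```python
from collections import defaultdict

# Hypothetical per-network rows: (region, country, asn, samples, ipv6_frac).
rows = [
    ("Oceania", "AU", 64496, 1000, 0.28),
    ("Oceania", "AU", 64497,  400, 0.05),
    ("Oceania", "NZ", 64498,  600, 0.12),
]

def rollup(rows, key_index):
    """Sample-weighted mean of the measurement, grouped by one key column."""
    totals = defaultdict(lambda: [0.0, 0])  # key -> [weighted sum, weight]
    for row in rows:
        key, weight, value = row[key_index], row[3], row[4]
        totals[key][0] += weight * value
        totals[key][1] += weight
    return {k: round(s / w, 3) for k, (s, w) in totals.items()}

print("per network:", rollup(rows, 2))  # each AS stands alone
print("per country:", rollup(rows, 1))  # the AU networks blend to ~0.214
print("per region: ", rollup(rows, 0))  # the whole sample: 0.186
```

The same measurement can then be read in relative context: a network against its country, a country against its region.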
The Internet Society (ISOC) has been working with the African Union (AU) to facilitate the African Internet Exchange System (AXIS). The AXIS project, funded by the EU-Africa Infrastructure Trust Fund and the Government of Luxembourg, will help keep African Internet traffic internal to the continent, avoiding both the expensive international transit costs and the latency incurred by routing Internet traffic through other continents.
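The latency argument is easy to quantify with a back-of-envelope calculation. Taking signal propagation in fibre at roughly 200 km per millisecond (about two-thirds of the speed of light in vacuum) and rough illustrative distances, the sketch below compares exchanging traffic at a local IXP with "tromboning" it through Europe:

```python
# Rough propagation speed in optical fibre: ~200 km per millisecond.
KM_PER_MS = 200.0

def propagation_rtt_ms(one_way_km):
    """Propagation-only round-trip time; ignores queuing and serialisation."""
    return 2 * one_way_km / KM_PER_MS

# Two ISPs in the same city peering at a local IXP: ~50 km of fibre.
print(f"via local IXP: {propagation_rtt_ms(50):.1f} ms")     # ~0.5 ms

# The same traffic exchanged in Europe instead: roughly 6,000 km to
# Europe and 6,000 km back, so ~12,000 km one way.
print(f"via Europe:    {propagation_rtt_ms(12000):.1f} ms")  # ~120 ms
```

Even before queuing and serialisation delays, the detour adds two orders of magnitude to the round-trip time.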
I'm sure we've all heard about "the open Internet." The expression builds upon the rich pedigree of the term "open" in various contexts. For example, "open government" is the governing doctrine which holds that citizens have the right to access the documents and proceedings of the government to allow for effective public oversight, a concept that appears to trace its antecedents back to the Age of Enlightenment in 17th-century Europe.
The Internet Engineering Task Force (IETF) is the standards body for the Internet. It is the organization that publishes and maintains the standards describing the Internet Protocol (IP, versions 4 and 6) and all directly related and supporting protocols, such as TCP, UDP, DNS (and DNSSEC), BGP, DHCP, and NDP; the list goes on, and on. But how does it do that? How does the IETF produce documents and ensure that they are high quality, relevant, and influential?