In the not-too-distant future, internet access will be dominated by wireless networks. With that, next-generation networks that pair a wireless edge with an optical core will become as ubiquitous as traditional telephone networks are today. This means that telecom engineers, chip designers, and engineering students must prepare to meet the challenges and opportunities that the development and deployment of these technologies will bring.
Bringing together cutting-edge coverage of wireless and optical networks in a single volume, Internet Networks: Wired, Wireless, and Optical Technologies provides a concise yet complete introduction to these dynamic technologies. Filled with case studies, illustrations, and practical examples from industry, the text explains how wireless, wireline, and optical networks work together. It also:
Details optical networks involving long-haul and metropolitan networks, optical fiber, photonic devices, and VLSI chips
Provides clear instruction on the application of wireless and optical networks
Taking into account recent advances in storage, processing, sensors, displays, statistical data analyses, and autonomic systems, this reference provides forward-thinking engineers and students with a realistic vision of how the continued evolution of the technologies that touch wireless communication will soon reshape markets and business models around the world.
This introductory chapter focuses on new opportunities provided by the unrelenting technology evolution to further develop the telecommunications business well into the next decade. It takes into account the evolution of storage, processing, sensors, displays, statistical data analyses, and autonomic systems and discusses how such an evolution is going to reshape markets and business models into a new era where business ecosystems supplement value chains.
1.1 HAVE WE REACHED THE END OF THE ROAD?
The evolution we have witnessed over the last 50 years in electronics, optics, smart materials, biotech, and every field using these technologies has been relentless. Although there is currently no sign of a plateau, we know that a physical limit to progress lies somewhere. The fact that in many fields, like electronics, this ceiling has seemed to be approaching and engineers have found ways to circumvent it does not change the fact that a physical limitation exists.
In economics we have seen what happens when we reach a ceiling, such as when we run out of liquidity: The downward spiral of stock markets in the second half of 2008 and into 2009 is a clear statement of the havoc that can happen when progress is suddenly stopped.
The technology evolution has progressed with such regularity that it no longer surprises us; we have gotten used to it and have actually built a world that relies on it. If the evolution of technology were to stop next year, we would need to reinvent the way we do business, and that would cause tremendous problems.
Looking at the physical barriers, like the speed of light, the quantum of energy, and the smallest dimension that exists, we can determine where the ultimate limit lies [1]. The good news is that such a limit is very far from where we are today: At the present pace of evolution we won't reach it for the next few centuries. This does not mean, however, that such limits will ever be reached. Actually, I feel that we will run into unsolvable issues long before getting to those physical barriers.
The investment required for chip production plants is growing exponentially, and payback requires huge revenues. As we will see, this pushes toward huge volumes, with individual products costing less and less so that they remain sustainable for the widest possible market. The economics is already slowing the creation of new plants, but new production processes could circumvent what we see as an upper boundary today.
The continuous progress of technology has steadily increased the amount of energy being consumed (see Figure 1.1). It is estimated that the power consumed by residential households in Europe to access broadband networks will reach 50 TWh in 2015. To put this number in perspective, 10 years ago there was no power consumption associated with accessing the network; all that was needed was provided by the network itself. Between 1980 and 2000, networks doubled their power consumption, and in 2008 the overall power consumption of networks in Western Europe reached 20 TWh. In other words, a consumption level that was nonexistent 10 years ago is expected to grow to more than double the present consumption of all European telecommunications networks.
Energy is becoming a bottleneck to evolution: In 2008, China consumed as much energy in total as the United States, but its per-capita consumption is a fraction of that in the United States. Between 8 and 10 billion people are expected to populate the Earth in the next decade. In energy terms, this is not the 25% to 80% increase suggested by the population figures, but an 800% increase, because that population will on average consume what an average U.S. citizen consumes now, owing to a widespread increase in quality of life.
We simply do not have the energy available to meet those projected requirements. This means that either global wealth, in terms of energy consumption, will not be at the same level that a U.S. citizen enjoyed in 2008 or we will have found ways to dramatically decrease power consumption and increase energy production (both are indeed required to come anywhere close to meeting those requirements).
The energy issue is going to influence overall evolution in the next decade, in terms of both availability and cost. The shift toward a "greener" world, although important, is going to increase the impact of energy on evolution.
The bright side is that the energy "crunch" will force investment in alternative energy sources and in decreased consumption. Rather than slowing evolution, this is likely to shift its direction: accelerating the deployment of optical networks, which are far more energy efficient than copper networks, and moving toward radio coverage built from smaller cells, since the energy required to cover a given area decreases (approximately) with the square of the number of cells being used.
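A minimal back-of-the-envelope sketch in Python, assuming a simple power-law path-loss model (the exponent alpha and the coverage area are illustrative assumptions, not figures from the text), shows why splitting coverage into more, smaller cells cuts the radiated energy roughly as described:

```python
# Back-of-the-envelope model: total transmit power needed to cover a fixed
# area with N cells, assuming each cell must reach its own edge and that
# path loss grows as distance**alpha (alpha is an assumed path-loss exponent).

import math

def total_tx_power(n_cells: int, area_km2: float = 100.0,
                   alpha: float = 4.0, k: float = 1.0) -> float:
    """Relative total transmit power for n_cells covering area_km2.

    Each cell reaches its edge at r ~ sqrt(area / (pi * n)), so per-cell
    power ~ k * r**alpha and the total scales as n ** (1 - alpha / 2).
    """
    r = math.sqrt(area_km2 / (math.pi * n_cells))  # cell radius in km
    return n_cells * k * r ** alpha

if __name__ == "__main__":
    base = total_tx_power(1)
    for n in (1, 4, 16, 64):
        print(f"{n:3d} cells -> relative power {total_tx_power(n) / base:.4f}")
    # With alpha = 4, total power falls as 1/N: quadrupling the number of
    # cells cuts the radiated energy for the same coverage by roughly 4x.
```

With a free-space exponent (alpha = 2) the total would stay flat, while the steeper exponents typical of dense urban propagation give faster savings, so the "square of the number of cells" in the text should be read as an approximation that depends on the propagation environment.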
For the time being, and for the horizon we can reasonably consider today (not beyond 2050), we can be confident that technology evolution will continue at different paces in different sectors as it has over the last 50 years, but overall at a similar rate to what we have been experiencing in these last decades.
FIGURE 1.1 The growth of energy consumption.
Now, saying that the pace will be substantially unchanged does not mean that it is going to be "business as usual," for two reasons. First, as Moore's law implies, doubling today's performance in the next 18 months means that in those 18 months we will cover a distance that took us the last 40 years to walk. This is quite a change! Second, a performance increase has a linear effect on the ecosystem until a certain threshold is reached: Beyond that point it is no longer seen as a performance increase, but rather as a change of rules.
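To make the "next 18 months equals the last 40 years" remark concrete, the following sketch (assuming an idealized doubling every 18 months, which is the text's premise) sums the cumulative progress of the past doublings and compares it with the single next step:

```python
# Idealized Moore's-law arithmetic: if capability doubles every 18 months,
# the gain in the *next* period roughly equals the sum of all gains so far,
# because 2**(n+1) - 2**n == 2**n, which exceeds (2**n - 1), i.e. everything
# accumulated up to now.

PERIOD_MONTHS = 18
YEARS = 40
periods = int(YEARS * 12 / PERIOD_MONTHS)   # ~26 doublings in 40 years

capability_now = 2 ** periods               # relative to 1.0 at the start
gain_so_far = capability_now - 1            # everything gained in 40 years
gain_next_period = 2 ** (periods + 1) - capability_now  # the next 18 months

print(f"Doublings in {YEARS} years   : {periods}")
print(f"Total gain over {YEARS} years: {gain_so_far:.3e}")
print(f"Gain in the next 18 months : {gain_next_period:.3e}")
# The next single period adds slightly more than the previous 40 years combined.
```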
Think about electronic watches: There was a time, in the 1970s, when owning an electronic watch was very expensive. As prices went down, more and more people could afford one. At a certain point the cost dropped, basically, to zero (it fell below the threshold of cost perception) and there was no longer a market for electronic watches as such: The industry had to reposition itself. Long gone are ads claiming better precision for a particular watch. The marketing value of a Swiss-certified chronograph dropped to zero.
A similar thing will happen with the deployment of broadband networks and their enabling optical infrastructures: Once you reach a bandwidth of 1 Gbps, and possibly before that, it will be impossible to market increased bandwidth at a premium. Bandwidth value will drop to zero and marketers will need to find new slogans.
Notice how the threshold links technology with market value. When these two factors change the business model and the rules of the game, a disruption occurs. As this happens, consolidated industries need to reinvent themselves and new ones find leverage to displace the incumbents. In discussing technology evolution, we will come back to this point over and over. The question to consider, therefore, is not whether a technology is reaching its evolution limit, but rather whether that technology is approaching a disruption threshold.
1.2 GLOCAL INNOVATION
Innovation used to be easier to predict because only a few companies and countries were leading the way. The evolution of infrastructures was so easy to predict that the International Telecommunication Union (an international standardization body based in Geneva, Switzerland) published an annual table with the status of telecommunications infrastructures and telecommunications service penetration. Some maps showed the progress made, and others showed what penetration would be reached in 10 or 20 years. As an example, it would take 19 years for a country to move from 1% to 10% telecommunications penetration, but only 12 years to move from 10% to 20%, 8 years to go from 20% to 30%, and so on.
The advent of wireless changed the world. Countries like India, which used to have less than 3% telecommunications density (and that is still the share held by fixed lines), moved within 10 years from virtually no wireless density to 20% and are likely to reach a 50% penetration rate within the next decade. China moved from 0% to over 30% (in wireless) within 10 years.
However, the effect is even stronger in the service domain than in the infrastructure domain. In infrastructure, globalization acts through lower prices, which in turn enable a progressively broader market and thus indirectly affect the local situation; the service domain, by contrast, has a distribution cost that is basically "zero." Once an application is developed, it can be made available over the network with no distribution cost hampering its marketing. The network also plays another trick: An application that makes sense for the U.S. market can be developed and "marketed" from India.
Innovation is no longer confined to a few rich companies or countries by huge investment barriers. The real barrier has moved from the availability of financial capital to the availability of educational capital. It is the growing output of engineers in India and China that is placing these two countries at the forefront of innovation, not their huge (potential) markets.
The market of innovation is no longer local but global. This globalization of innovation is going to continue in the next decade and it is not by chance that President Barack Obama has set the goal of increasing scientific education and the number of engineers as the way to remain at the forefront of progress, both scientific and economic.
Optical and wireless networks will further shrink the world. Distance is already irrelevant for information flow; it is also becoming irrelevant for the delivery cost of a growing number of products and services, and it is certainly becoming irrelevant for innovation. Offshoring will become more and more common for big industries, whereas smaller companies will in-shore innovation from anywhere and will make a business out of their ability to localize a global world. The latter is here to stay, but the former will be viable only as long as a labor cost differential "plagues" the world; eventually, it will disappear. Politics, regulations, and cultures will be the determining factors in this evolution. From a technological point of view, the Earth will be no bigger than a small village.
The Web 2.0 paradigm, a network of services and applications created by a plethora of (small) enterprises and made available thanks to the huge investments of a few (big) enterprises, will evolve into a Web 3.0 in which services and applications interact with one another to serve the user's context. Questions like "When is the next train?" will become answerable because somewhere in the Web there is an understanding of my context, and from that understanding it will be obvious that I am looking for the train to Milan. It is not a small step.
Again, it is a matter of glocalization. We are moving from syntax, from infrastructures providing physical connectivity, to semantics, to an appreciation of who I am, and this includes an understanding of who I was (the set of experiences shaping my context) and a forecast of who I will be (my motivations and drives to act). This might seem scary and Orwellian, and definitely not the way to go.
However, the evolution, if it is to continue, has to be beneficial; otherwise it will not become entrenched. Because of this, we can expect that the balance between what is technologically possible and what the market is buying will depend on us, as individuals and as a community. Contextualization can raise many issues, from privacy to ownership, from democracy to the establishment of new communities continuously reshaping themselves.
Contextualization is not likely to result from an "intelligent, Orwellian" network, but rather from the increased intelligence of my terminal, and that is under my control. I will make decisions, most of the time unconsciously, about what to share of my context; the network will be there to enable it. My terminal, and the "my" is the crucial part, will act as an autonomous system, absorbing information from the environment, both local and, thanks to the network, global. It will let me communicate with my context, my information, my experiences, my environment, and, of course, my friends and acquaintances in the same seamless way as today I walk into a room and act according to my aims, expectations, and environment.
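The terminal-centric model described here can be sketched as a simple filter that decides, under user-set policies, which pieces of local context accompany a query such as "When is the next train?" All names below (ContextItem, the sensitivity labels, the sample context values) are hypothetical illustrations under that assumption, not an API or data taken from the text:

```python
# Hypothetical sketch of a terminal-side context filter: the device, not the
# network, decides which context items may be shared with a service before
# forwarding a natural-language query like "When is the next train?".

from dataclasses import dataclass

@dataclass
class ContextItem:
    name: str         # e.g. "location", "next_appointment"
    value: str
    sensitivity: str  # "public", "personal", or "private" (user-set policy)

# Context items live on the terminal, under the user's control.
LOCAL_CONTEXT = [
    ContextItem("location", "Torino Porta Susa", "personal"),
    ContextItem("next_appointment", "Meeting in Milano at 15:00", "personal"),
    ContextItem("health_record", "...", "private"),
]

def shareable(context, max_sensitivity="personal"):
    """Return only the items the user's policy allows to leave the device."""
    order = {"public": 0, "personal": 1, "private": 2}
    limit = order[max_sensitivity]
    return [c for c in context if order[c.sensitivity] <= limit]

def enrich_query(query: str, context) -> dict:
    """Attach the allowed context so a service can disambiguate the query."""
    return {"query": query, "context": {c.name: c.value for c in context}}

if __name__ == "__main__":
    request = enrich_query("When is the next train?", shareable(LOCAL_CONTEXT))
    print(request)
    # The service now knows the user is at Torino Porta Susa with a meeting in
    # Milano, so "the next train" can be resolved to the next train to Milan,
    # while private items never leave the terminal.
```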
Therefore, telepresence, one of the holy grails of communications, will also be glocal. I will communicate locally and remotely as if both the remote and the local were present at the same time.
This will be possible because of technology's evolution. Although the list of technologies to consider would be very long, we can examine a few of them in terms of the functionalities their evolution makes available, rather than in terms of the evolution of single technologies.
1.3 DIGITAL STORAGE
Digital storage capacity has increased by leaps and bounds over the last 50 years. The original digital storage solutions (magnetic cores, drums, tapes, etc.) have basically disappeared, making room for new technologies like magnetic disks, solid-state memory, and, on the near-term horizon, polymer memories.
As of 2008, hard drives, or devices using magnetic disks for storage, reached 2 TB capacity in the consumer market, and 37.5 TB disks are expected to appear in 2010 (from Seagate). Storage capacity of 100 TB will become commonplace by the end of the next decade. The new leap in magnetic storage density is achieved through heat-assisted magnetic recording (HAMR).
Solid-state memory has advanced significantly, and compact flash cards are cheap and ubiquitous these days. Invented in 1994, they moved from a capacity of 4 MB to 64 GB as of 2008. A capacity of 128 GB became available in flash drives in June 2009 (solid-state disks [SSDs] based on flash technology appeared in 2007). The announcement at the end of 2008 of new etching processes able to reach the 22–15 nm level (down from the current 60–40 nm standard) clearly shows that more progress in capacity is ahead.
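A quick arithmetic check on the figures just quoted (4 MB in 1994, 64 GB in 2008) shows how steep that curve is; this is only an illustration of the text's own numbers, not additional data:

```python
# Doubling time implied by the flash-capacity figures quoted above.
import math

start_mb, end_mb = 4, 64 * 1024          # 4 MB in 1994 -> 64 GB in 2008
years = 2008 - 1994

growth_factor = end_mb / start_mb        # 16384x, i.e. 2**14
doublings = math.log2(growth_factor)
print(f"Growth factor: {growth_factor:.0f}x over {years} years")
print(f"Doublings    : {doublings:.1f} (capacity doubled roughly every "
      f"{years / doublings:.1f} years)")
```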
This increase in capacity is placing flash memory on a collision course with magnetic disks in certain application areas, like MP3 players and portable computers. Flash devices consume only 5% of the energy required by a magnetic disk and are shock resistant up to 2,000 Gs (corresponding to a 10-foot drop).
The bit transfer rate has already increased significantly, and there are plans to move the flash interface to the Serial Advanced Technology Attachment (SATA) standard, the one already used by magnetic disks, thus raising the transfer speed to 3 Gbps. By comparison, the current Parallel Advanced Technology Attachment (PATA) interface tops out at about 1 Gbps.
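As a rough illustration of what the interface change means in practice, one can compare how long a full-drive copy would take over PATA at about 1 Gbps versus SATA at 3 Gbps. The 128 GB drive size echoes the flash capacity mentioned above, while the 0.8 efficiency factor (protocol overhead, line coding) is an assumption made for the example:

```python
# Rough comparison of full-drive copy times over the two interfaces.
# EFFICIENCY is an assumed fraction of the line rate usable as payload.

DRIVE_GB = 128
EFFICIENCY = 0.8

def copy_time_minutes(line_rate_gbps: float) -> float:
    payload_gbps = line_rate_gbps * EFFICIENCY
    seconds = DRIVE_GB * 8 / payload_gbps   # GB -> gigabits, then divide by rate
    return seconds / 60

for name, rate in (("PATA ~1 Gbps", 1.0), ("SATA 3 Gbps", 3.0)):
    print(f"{name}: ~{copy_time_minutes(rate):.1f} min to copy {DRIVE_GB} GB")
# Tripling the line rate cuts the transfer time by the same factor, assuming
# the flash media itself can sustain the rate.
```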
Polymer memory has seen an increased effort by several companies to bring the technology to the market. Commercial availability is likely in 2010. Polymer memory is made by printing circuit components on plastic, a precursor to fully printed electronics. Its big advantage over other types of memory is in its extremely low cost and potential capacity. In an area the size of a credit card, one could store several terabytes (TB) of data (see Figure 1.2).
FIGURE 1.2 The evolution of digital storage.
Data will be stored both at the edges and within the network. Ericsson predicts that a 1-TB cell phone will be available in 2012; home media centers will be able to store the entire life production of a family in their multi-TB storage; and exabytes (EB; a billion billion bytes) will become commonplace at the data warehouses of data-based companies like Google™, Snapfish™, Flickr™, Facebook™, and those to come in the future. Institutions and governments will harvest the digital shadow of their constituencies daily to offer better services. Raw data generated by sensors will acquire economic value through statistical data analyses.
Storage is becoming one of the most important enablers for business in the next decade. What is the consequence of this continuous increase in storage capacity? Clearly, we can store more a...
Table of contents
Cover
Half Title
Title Page
Copyright Page
Table of Contents
About the Editor
Contributors
Chapter 1 An Exciting Future
Chapter 2 802.11n: High Throughput Enhancement to Wireless LANs
Chapter 3 Mobile WiMAX and Its Evolutions
Chapter 4 Service Layer for Next-Generation Digital Media Services