The Quantum Age of IT
eBook - ePub

The Quantum Age of IT

Why everything you know about IT is about to change

Charles Araujo

  1. 300 pages
  2. English
  3. ePUB (mobile friendly)

About This Book

In The Quantum Age of IT, Charles Araujo examines what has led us to this point and what it means for the future of IT organisations. With a broad perspective on the fundamental changes affecting the industry, he offers the practical guidance that every IT professional needs to compete in this new era of IT. Whether you are an IT executive or just beginning your career, this book offers the key insights you need to understand what is happening and what is coming. To illuminate that future, Araujo blends a wide range of research and case studies that reveal the skills you must develop to succeed and thrive in The Quantum Age of IT.

PART I

IT IS DEAD

CHAPTER 1: THE HISTORY OF OUR DEATH
(WHY THE MODERN IT STRUCTURE HAS FAILED US)

Jeff Winston was on the phone with his wife when he died.
First line from Replay by Ken Grimwood
The book Replay by Ken Grimwood is one of my favorite books of all time. It was a bit of a sci-fi cult classic when it was published in 1986. It was at once entertaining and profound. It tells the story of a man who dies suddenly at the age of 43 – only to wake up back in his freshman year of college. He learns that he has been given a great gift. A chance to live his adult life again. A ‘do-over.’ He decides that he will not make the same mistakes twice and lives his life differently. Until he reaches age 43 – and he dies again.
As this cycle repeats, he comes to a realization. He learns that changing his past is not the road to changing his future. He finds that his past experiences were a part of who he was and that spending his life looking backwards was only squandering the one thing of value that he really had – his future.
“The IT organization was in the middle of its next reorganization when it died.”
Perhaps that should have been the opening line of this book. Much like Jeff Winston, we are at a similar point in the life of the modern IT organization. (As a happy coincidence, the modern IT organization is about 45 years old!) Our organizations have grown and evolved – in many cases, without much conscious thought. There was always too much work to be done to be contemplative. Sure, some strategic planning took place and there have always been the pundits and the prognosticators, but for the most part IT leaders were far too busy getting things done to waste time imagining their future. And, for the most part, it worked out just fine.
Then we died.
We just didn’t know it.
But as Jeff Winston realized, this death is an amazing gift. It is an opportunity to give a fresh, new life to the organization. The lesson that Jeff Winston learned is the same one that we must now take to heart. There is nothing to be gained by complaining about our past or living in a world of ‘what-ifs.’ Our future lies in front of us, not behind us. But there are lessons for us in our past. There is clarity for our future to be found in the things that led us here. By understanding our past, we can better accept our today and then guide our tomorrow with an eye toward the future that we want to create. In order to envision our future, we must begin with the past.

The history of our death – part 1

How the function of IT came to be and the evolution of our organizational structure

The first computers were not computers at all.
The term ‘computer’ dates to the mid-18th century and literally referred to mathematicians whose job it was to perform long and arduous calculations by hand. They were typically hired by scientists to speed what would otherwise be a laborious process. Over time, the ‘computers’ realized the benefits of dividing their tasks and creating specialization. Eventually they created large books of ‘premade’ tables of already completed calculations so that greater calculations could be built from them. The first electronic computers were essentially created to replicate and replace the manual process that ‘human computers’ were performing. That fundamental process has continued to be the foundational drive behind all computing. To take what humans do slowly and imperfectly and enable it to be done rapidly and accurately.
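The 'premade' tables worked because expensive operations can be reduced to cheaper ones through precomputed values: printed logarithm tables, for instance, let human computers turn multiplication into simple addition. A minimal modern sketch of that idea (the table size and function name here are illustrative, not from the book):

```python
import math

# Precompute a 'book' of logarithms, mimicking the printed tables that
# human computers worked from: multiplication becomes two lookups and
# one addition, since log(a*b) = log(a) + log(b).
log_table = {n: math.log10(n) for n in range(1, 10_000)}

def multiply_via_table(a: int, b: int) -> int:
    # Look up both logs, add them, then invert the logarithm
    # (a human computer would use an antilog table for this step).
    return round(10 ** (log_table[a] + log_table[b]))

print(multiply_via_table(123, 45))  # → 5535
```

The same precomputation strategy survives today as lookup tables and memoization; only the 'book' has moved from paper into memory.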
By the time the modern mainframe computer was created in 1951, this simple vision had spawned an entire industry, its own scientific discipline and, most importantly for our purposes, the beginnings of a new profession. By 1964, it was clear that there was a huge market for computers. But the complexity and cost of the technology made it difficult for most organizations to make the leap. It was into this market that IBM introduced the computer that would largely define the industry going forward, the IBM System/360 Series. It was a compatible series of computers that were all capable of running the same software. Based on this common architecture, it opened up vast possibilities for customers. The fact that it fit into IBM’s existing infrastructure, combined with IBM’s legendary sales force, suddenly made it practical and affordable for companies to begin purchasing the IBM System/360 Series and utilizing them for a wide range of purposes. As company executives began using their new technological marvels, however, they soon realized that they needed to employ a staff of people who could program and operate them. And the function of IT was born.

Technical foundations

From the very beginning, computers were set apart from ‘normal’ life. They were born in one of the greatest eras of technological advancement the world had ever seen. During the fifty years preceding the dawn of the commercial mainframe, we had been introduced to mass-produced cars, commercial air travel, and vast levels of ‘automation’ on both industrial and consumer levels. Everything from the automated assembly line to dishwashers, washing machines, and, of course, television had come on the scene in the short fifty-year period before the introduction of the commercial computer.
The world was in awe of technology. In 1955, Walt Disney inspired imaginations around the world with his new Tomorrowland area of Disneyland. In the 1950s and early 1960s there were over 150 movies released that dealt with the wonder of the modern era and imagined wild futures of flying cars and robots. It was into this world that the computer began its journey into the mainstream. It is no wonder that computers and the folks that operated them were viewed as something separate from the rest of the company.
The very first computers required highly technical people to design and implement them. They were advanced mathematicians and technicians who built and managed the entire platform. While the great innovation of the modern mainframe computer was that it was ‘programmable,’ it still required a very technical skill set to write the binary code necessary to make it work. The work of writing this code was often long, arduous, and fraught with error. It was easy to make a simple mistake in the sometimes millions of lines of binary.
Companies, however, began to see the promise. They began imagining more diverse and more complex tasks that computers could handle. What started as a machine to do ‘computations’ was suddenly being used for a wide variety of purposes. With each new use imagined, the challenge of programming it became more acute.
Because of this complexity, two things happened. First, it became clear to organizations that the people that they needed to program and operate these new computers were going to be a special breed of people. This was not going to be something that just anyone could do. They would need to hire or train people with this specific skill set.
Second, the computer companies realized that they needed to do something. It was becoming apparent that, in some cases, it was taking longer to write the program to automate a task than it would have taken simply to do the task manually. So, they began developing ‘programming languages’ that made the job of programming a computer much easier. Languages such as FORTRAN and COBOL were introduced and represented the first major shift in how computing was done.

Specialization and separation

The creation of the first programming languages created a fundamental shift in how computers were used and operated. They opened up a world of possibilities for organizations by making it easier to do more complex and specialized tasks. This created an explosion in their use and was a boon for the growing computer industry. Suddenly, there was intense demand for programmers who could harness the power of these new investments.
The programming languages ushered another new aspect into the world of computing – specialization. Up to this point, computing was essentially a unidimensional discipline. The advent of programming languages and the large number of new computer companies that arose during this era brought with them a large number of subdisciplines specializing in specific platforms, programming languages, or industries. It was no longer enough just to be a programmer. Companies were looking for a “COBOL programmer on the DEC platform with experience programming financial systems.”
This first level of specialization began creating divides. While at the beginning it was common for people to learn both FORTRAN and COBOL, over time people began to self-segregate. Scientifically oriented organizations were most interested in using their computers for complex calculations. Their programmers were, therefore, predominantly focused on FORTRAN because of its more advanced calculation capabilities. Business-oriented organizations concentrated on automating workflows and less complex calculations, so they focused on COBOL, which had been built specifically to meet this need. It became clear that there was not a great deal of crossover, and so programmers began ‘picking sides.’
This was not adversarial. Overall, IT people have always been collegial. It was more like the Tower of Babel. In the beginning we all spoke the same language. We could communicate, share stories, and trade roles. But over time, we began to forget. As programmers picked their sides and became specialized, they had little to no need for the other languages. So, we ended up working in different domains, speaking our different languages and working on different problems. Even within the same company this happened. If a company had a need for both technical computation and business-oriented computing, the two programming teams would self-segregate, each working on their own problems.
Soon, specialization became separation. Entirely separate camps of programming disciplines developed. They often involved different approaches, methodologies and documentation standards. The separation continued with the proliferation of additional computer makers, with each introducing its own separate set of parameters. What had begun as a singular approach to programming had evolved into a wide range of programming disciplines, each demanding different skills – and often different perspectives on how things should be done.
It was the first of many cultural divides to come.

The first silos

While programming skills were being internalized and stratified to meet the specific and increasingly unique needs of organizations, a separate discipline was developing elsewhere in the world of computing: the computer operator.
Originally, computers were operated in much the same way that the card tabulators had been operated before them. ‘Programs’ came in the form of punch cards and simply needed to be loaded in order to run the computer. The job was simple and was typically done by the same people that had put the old punch cards into the mechanical tabulators.
As computers became more intricate, however, this broke down. As they moved from punch cards to tape and from binary to programming languages, it became clear that the old way of operating the computer was not going to work. Companies realized that they had to hire people who could function in this new and specialized world and keep everything running. The role of the computer operator was born.
The division of labor was pretty simple and straightforward. The programmers wrote the programs and the operators operated the machines. They were fairly distinct disciplines and, while it may seem strange to us today, they had little dependency on one another. There was really nothing that operators could do to impact the programmers. As long as they ran the job, switched the tapes, or performed whatever other operational task was required, whether or not the program ran correctly was solely in the hands of the programmer.
The same was largely true from the other direction. The operators did not much care what was in the program. There was little that the programmers could do that would have a major impact on the operators’ ability to do the job. As long as they had sufficient instructions on the operational tasks required, the worst thing that could happen from their perspective was that the job would not run.
The discipline of computer operations was task-oriented. It was about executing a series of tasks consistently, reliably, and predictably. It was important work, but it was not work that required a high degree of creativity or ingenuity. A different type of person was attracted to the role of a computer operator. They were more mechanical. They were more task-oriented and enjoyed the idea that they were the guardians of the kingdom, making sure that everything worked as it was designed.
They also saw themselves as separate from the programmers. The divide continued.

The kingdom grows

Over the ensuing decades, this fundamental structure grew and expanded. New languages were added, new technologies were deployed, but everything ended up in one of these two ‘silos.’ In part, this was because the fundamental technology structure had not changed much. Improvements were made to the capabilities of specific components of the technology, making things faster or more efficient, but the basic architecture did not change.
Eventually, things like Project Management Offices (PMOs), security teams, and other ancillary functions were added to the IT operation. Sometimes this resulted in creating separate subfunctions within IT, but for the most part the two primary silos persisted.
After a time, these silos became fully entrenched. As IT grew from a technical discipline into a career, IT managers began to have a vested interest in this structure. Inertia set in. People were happy to keep things the way they were. After all, they seemed to work, right?
At no point did anyone really stop and ask if this was the right approach. IT and ‘computing’ were never seen as a strategic core competence during the early years. Viewed as largely technical functions, they did not warrant much strategic energy. They simply evolved.
The challenge was that IT evolved based on a set of basic assumptions regarding the relationship between Applications and Infrastructure, and between IT and its customers – customers who would themselves evolve over time, eventually leading to a fundamental break in these relationships.

The history of our death – part 2

The rise of the IT corporate culture and why we work the way we work

You can’t win. You know that, don’t you? It doesn’t matter if you whip us, you’ll still be where you were before, at the bottom. And we’ll still be the lucky ones at the top with all the breaks. It doesn’t matter. Greasers will still be Greasers and Socs will still be Socs. It doesn’t matter.
From The Outsiders, by S.E. Hinton
The book and movie The Outsiders tells the story of two rival groups of teenagers locked in an age-old battle of cultural warfare. They come from different sides of the railroad tracks that physically and metaphorically divide those who have from those who have not.
The Socs (pronounced ‘so-shus’) come from educated and prosperous families. They are the ‘thinkers.’ They do not have to do manual labor. They will go off to college and, one day, they will be the boss. They see themselves as better than the rest.
The Greasers are working-class. They get their hands dirty. They are the ones that have to do the real wo...