Computer Science
Petabyte
A petabyte is a unit of digital information storage equal to 1,000 terabytes, or one quadrillion (10¹⁵) bytes. It is commonly used to describe the storage capacity of large computer systems, data centers, and cloud storage services. A petabyte can hold vast amounts of data, including text, images, videos, and other types of digital content.
Written by Perlego with AI assistance
3 Key excerpts on "Petabyte"
Digital Transformation
Survive and Thrive in an Era of Mass Extinction
- Thomas M. Siebel (Author)
- 2020 (Publication Date)
- Rodin Books (Publisher)
Using base-2 arithmetic, we can represent any number. The ASCII encoding system, developed from telegraph code in the 1960s, enables the representation of any character or word as a sequence of zeros and ones. As information theory developed and we began to amass increasingly large data sets, a language was developed to describe this phenomenon. The essential unit of information is a bit. A string of eight bits in a sequence is a byte. We measure computer storage capacity as multiples of bytes as follows:

- One byte is 8 bits.
- One thousand (1,000) bytes is a kilobyte.
- One million (1000²) bytes is a megabyte.
- One billion (1000³) bytes is a gigabyte.
- One trillion (1000⁴) bytes is a terabyte.
- One quadrillion (1000⁵) bytes is a petabyte.
- One quintillion (1000⁶) bytes is an exabyte.
- One sextillion (1000⁷) bytes is a zettabyte.
- One septillion (1000⁸) bytes is a yottabyte.

To put this in perspective, all the information contained in the U.S. Library of Congress is on the order of 15 terabytes.[1] It is not uncommon for large corporations today to house scores of petabytes of data. Google, Facebook, Amazon, and Microsoft collectively house on the order of an exabyte of data.[2] As we think about big data in today's computer world, we are commonly addressing petabyte- and exabyte-scale problems.

There are three essential constraints on computing capacity and the resulting complexity of the problem a computer can address. These relate to (1) the amount of available storage, (2) the size of the binary number the central processing unit (CPU) can add, and (3) the rate at which the CPU can execute addition. Over the past 70 years, the capacity of each has increased dramatically.

As storage technology advanced from punch cards, in common use as recently as the 1970s, to today's solid-state drive (SSD) non-volatile memory storage devices, the cost of storage has plummeted, and the capacity has expanded exponentially. A computer punch card can store 960 bits of information. A modern SSD array can access exabytes of data.
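The decimal ladder above translates directly into a few lines of code. Below is a minimal sketch in Python (the function name and unit list are our own illustrative choices, not from the excerpt) that walks a raw byte count up the ladder one 1000× step at a time:

```python
# Minimal sketch: render a raw byte count in the largest convenient
# decimal (SI) unit, following the 1000-bytes-per-step ladder above.
# Names here are illustrative, not taken from any particular library.

UNITS = ["bytes", "kilobytes", "megabytes", "gigabytes", "terabytes",
         "petabytes", "exabytes", "zettabytes", "yottabytes"]

def human_readable(num_bytes: int) -> str:
    """Format num_bytes using the largest unit that keeps the value under 1000."""
    value = float(num_bytes)
    for unit in UNITS[:-1]:
        if value < 1000:
            return f"{value:,.2f} {unit}"
        value /= 1000  # step up one rung of the ladder
    return f"{value:,.2f} {UNITS[-1]}"  # anything left is yottabytes

print(human_readable(15 * 1000**4))  # "15.00 terabytes" - the Library of Congress figure
print(human_readable(1000**5))       # "1.00 petabytes"
```

Note that this follows the decimal (SI) convention the excerpt describes; operating systems sometimes report sizes in the binary, 1024-based units (kibibytes, mebibytes, and so on) instead.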
Digital Knowledge
A Philosophical Investigation
- J. Adam Carter (Author)
- 2024 (Publication Date)
- Routledge (Publisher)
6 A digital epistemology of Big Data
DOI: 10.4324/9781003098966-6
6.1 The Petabyte Age: an introduction
When Steve Jobs launched the original Apple Macintosh computer in 1984, it was thought to be a revolutionary computer. (Remember the famous Ridley Scott 1984 video?) But this revolutionary computer contained just 128 kilobytes of memory storage. That is enough space to hold about 1/15th of the book you are reading now, which is around 2 MB. Later in 1984, the next version of the Macintosh – the Macintosh 512K – was released, and it quadrupled the storage space of the original model. Although still not enough to hold this book's 2 MB, that might have still seemed impressive at the time! But just imagine if we'd told folks then that just 17 years later, in 2001, the first iPod would ship with 5,000,000 kilobytes (i.e., 5 gigabytes) of memory storage, the equivalent of about 40,000 original Macintoshes that you could put in your pocket!

We are now 20+ years past the first iPod, and not only does 5 gigabytes of storage not sound impressive anymore, but just as we once made the transition from conceptualising information in kilobytes, and then in megabytes and gigabytes, we are now becoming accustomed to thinking (as far as computer storage goes) in terabytes, which are themselves just a small fraction of the (increasingly relevant) petabytes, exabytes and zettabytes. Figure 6.1 offers a perspective on these unit sizes.

[Figure 6.1 – Source: My NASA Data, mynasadata.larc.nasa.gov]

The International Data Corporation (IDC) estimates, in their Data Age 2025 report, that 175 zettabytes of data will exist worldwide by 2025.
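The iPod-to-Macintosh comparison is plain unit arithmetic, and easy to verify. A quick sketch (Python; the variable names are ours, and the figures are the excerpt's decimal values):

```python
# Back-of-the-envelope check of the excerpt's comparison.
# All figures use the decimal convention (1 KB = 1,000 bytes).
mac_128k_bytes = 128 * 1000        # original 1984 Macintosh: 128 KB
ipod_bytes = 5 * 1000**3           # first iPod (2001): 5 GB
book_bytes = 2 * 1000**2           # this book: ~2 MB

print(ipod_bytes / mac_128k_bytes) # 39062.5 -> "about 40,000" Macintoshes
print(book_bytes / mac_128k_bytes) # 15.625  -> one Mac holds ~1/15th of the book
```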
Data Insights
New Ways to Visualize and Make Sense of Data
- Hunter Whitney (Author)
- 2012 (Publication Date)
- Morgan Kaufmann (Publisher)
A single bit has two possible values, commonly notated as “1” and “0.” Think of a single bit like a light switch that can be either “off” or “on.” One bit isn’t a lot of storage, but it can be used to store things like the value of a “yes/no” statement (see Figure 1.19).

- 8 bits = 1 byte
- 1,000 bytes = 1 kilobyte
- 1,000,000 bytes = 1 megabyte
- 1,000,000,000 bytes = 1 gigabyte
- 1,000,000,000,000 bytes = 1 terabyte
- 1,000,000,000,000,000 bytes = 1 petabyte
- 1,000,000,000,000,000,000 bytes = 1 exabyte

[FIGURE 1.19 Kerplunk!]

On their own, these numbers seem fairly meaningless. Although it’s possible to picture in your mind’s eye having five of something (five jelly beans easily fit in the palm of your hand) or even 100 of something (the number of tiles in a Scrabble® game), we can’t picture a trillion of something.

The flow of data

The process by which data is collected, transmitted, and stored varies widely between types of data and storage methods used, but the basic process is always the same. Digital data is collected through electronic sensors or through human input. This data is then transmitted across some distance that can be as short as the few centimeters between the keyboard on a laptop and its hard drive or as far away as the Voyager 1 spacecraft, more than 11 billion miles from Earth. Once the data reaches its destination, it is stored on magnetic, optical, or solid-state media.

“Big Data” collections are stored in data centers or data warehouses. A data center is a special room or building with rows and rows of computers stacked on top of each other in racks. The computers used in data centers, called servers, are generally much more powerful than standard desktop or laptop computers, with significantly more storage space, processing power, and random-access memory (RAM). The servers talk to each other on a high-performance network that allows data to be transferred at very high speeds.
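The light-switch picture of a bit has a direct counterpart in code: several independent yes/no values can be packed into the bits of a single byte. A small illustrative sketch (Python; the flag names are invented for the example):

```python
# Eight independent yes/no "switches" can live in the bits of one byte.
# Flag names are invented for illustration.
LIGHT_ON  = 0b001  # bit 0
DOOR_OPEN = 0b010  # bit 1
FAN_ON    = 0b100  # bit 2

state = 0                      # all switches start "off"
state |= LIGHT_ON | FAN_ON     # flip two switches on

print(bool(state & LIGHT_ON))  # True: that bit is set
print(bool(state & DOOR_OPEN)) # False: that bit is clear

state &= ~FAN_ON               # flip the fan switch back off
print(bin(state))              # 0b1: only the light bit remains set
```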
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), with each page adding context and meaning to a key research topic.


