Cloud-Based Music Production

Sampling, Synthesis, and Hip-Hop

Matthew T. Shelvock

168 pages · English · ePUB
About This Book

Cloud-Based Music Production: Sampling, Synthesis, and Hip-Hop presents a discussion of cloud-based music-making procedures and the musical competencies required to make hip-hop beats.

By investigating how hip-hop producers make music using cloud-based music production libraries, this book reveals how those services impact music production en masse. Cloud-Based Music Production takes the reader through the creation of hip-hop beats from start to finish – from selecting samples and synthesizer presets to foundational mixing practices – and includes analysis and discussion of how various samples and synthesizers work together within an arrangement. Through case studies and online audio examples, Shelvock explains how music producers directly modify the sonic characteristics of hip-hop sounds to suit their tastes and elucidates the psychoacoustic and perceptual impact of these aesthetically nuanced music production tasks.

Cloud-Based Music Production will be of interest to musicians, producers, mixers and engineers and also provides essential supplementary reading for music technology courses.


Information

Publisher: Focal Press
Year: 2020
ISBN: 9781351137089
Edition: 1

1 Understanding Samples, Synthesis, Editing, and Mixing

The previous chapter introduces some core cloud-based music production (CBMP) concepts. In this chapter, I cover a number of beat-making activities which are directly supported by CBMP services such as Splice, Loopcloud, and Noiiz. In particular, these services help producers streamline production activities such as (i) sampling, (ii) synthesis, (iii) triggering sounds, (iv) arranging, and (v) mixing. When these activities are discussed in the literature, they tend to be treated as though they are completely separate from one another, and in certain music production traditions this may be true.
However, in most forms of digital music making, such as the creation of hip-hop beats, producers freely switch between composing, arranging, designing and recording sounds, and mixing. These production methods, and how they relate to one another, are the main focus of this chapter, rather than the classic DJ-centered techniques commonly associated with hip-hop in the 1980s and early 1990s. This is because most of today’s beat producers prefer to work within DAWs, which make it quite easy to switch between beat-making activities such as sampling, synthesis, triggering sounds, arranging, and mixing (Shelvock 2017a: 172).
In some cases, such as Fytch’s Spring 2018 hip-hop beat-making challenge on Splice, producers may also combine CBMP resources with live instruments. Yet, as Fytch demonstrates, when producers incorporate live instrumentalists within their work, they often treat these recordings like any other sample downloaded via a CBMP service. That is to say, through the use of editing technologies, they mutate the rhythmic and timbral properties of these performances in the same way as they do with pre-recorded samples provided by Splice, Loopcloud, or Noiiz.
Since the sounds producers create using digital production methods exert such a clear influence on even hip-hop's key instrumentalists, in this chapter I focus on these computer-based music-making techniques. A very small portion of hip-hop beat makers use live instruments exclusively (or at least predominantly) within their work, but this remains a relatively rare approach. Groups such as The Roots or Bad Bad Not Good feature drummers, bassists, guitarists, keyboard players, and others who try to emulate the sound of hip-hop sampling when they play. To accomplish this, they draw inspiration from the type of rhythmic phrasing predominantly heard on quintessential sample-based hip-hop records. Indeed, sampling is such a foundational technique within this genre that even instrumentalists try to perform as though they have already been sampled and edited by a producer.
In the sections of this chapter, I provide a brief overview of the most crucial hip-hop production techniques so that readers can see how CBMP supports practices related to (i) sampling, (ii) synthesis, (iii) triggering sounds, (iv) arranging, and (v) mixing, and how each of these (i–v) practices coalesces during the beat-making process. While additional approaches and techniques exist, I have chosen the methods which are most commonly discussed (or used) by producers in popular video series offered by Mass Appeal, Waves, Maschine, FL Studio, Ableton, HotNewHipHop, and others. In addition, I have chosen the techniques I encounter most often as a producer, manager, publisher, and label representative. However, I do not claim that any specific production technique I mention constitutes some form of secret knowledge, or that these tools are unknown to practitioners. In fact, I argue just the opposite: these production methods are used on a routine basis by beat makers, yet no peer-reviewed text provides an in-depth analysis of how these techniques work together to create hip-hop beats per se (although many authors do, of course, discuss topics surrounding beat production through the lens of cultural theory, such as Schloss 2014). It is my hope that what follows will spark increased discussion on the topic of record production, as an area worth analyzing in and of itself, so that many of the misunderstandings which currently pervade popular music texts can finally be laid to rest.

1.1 Virtual Performance: Performing Samples, Performing Synthesis

In order to make music using audio samples and synthesizers, producers record themselves while triggering these sounds using a variety of tools. Today, this almost always occurs through the use of a DAW, such as Ableton or Maschine. This section describes how producers incorporate various performance technologies, such as MIDI keyboards and trigger pads, in order to create hip-hop beats.

1.1.1 Performance Inputs: Sample-Triggering, Synthesis, and MIDI

Particularly important tools for hip-hop beat makers are sample-triggering devices, such as the classic instruments offered by Roland or Akai. In addition, many producers today have adopted newer tools offered by companies such as Novation, M-Audio, or Native Instruments in order to accomplish the same tasks as these well-known machines. One shared feature of many of these hardware interfaces is a series of square button pads arranged in a grid format. These pads can trigger samples, but they can also control software synthesizers in order to produce melodies because they transmit data which software synthesizers can transform into discrete pitches. Some controllers, such as the Novation Launchkey, include both a series of button pads and a piano keyboard, and users are free to switch back and forth between them in order to control or trigger sounds (Figures 1.1 and 1.2).
Most of these devices make use of a form of musical software data known as MIDI, which stands for Musical Instrument Digital Interface. MIDI data is stored by computer software, such as the samplers discussed above or DAWs, and describes a number of standard musical parameters; because it is stored in software, it can be recalled at any time. Some of the musical parameters which MIDI encodes include pitch, velocity (i.e., dynamics), note length, and rhythm.
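To illustrate the kind of data MIDI carries, a note event can be sketched as a few raw bytes, following the MIDI 1.0 specification. This is a minimal sketch for illustration only; it is not tied to any particular controller or DAW:

```python
# A minimal sketch of raw MIDI note messages (per the MIDI 1.0 spec).
# A note-on message is three bytes: status (0x90 | channel), pitch, velocity.

def note_on(pitch, velocity, channel=0):
    """Build a MIDI note-on message (pitch/velocity 0-127, channel 0-15)."""
    return bytes([0x90 | channel, pitch & 0x7F, velocity & 0x7F])

def note_off(pitch, channel=0):
    """Build a MIDI note-off message (release velocity 0 for simplicity)."""
    return bytes([0x80 | channel, pitch & 0x7F, 0])

# Middle C (MIDI note 60) played at a moderate dynamic (velocity 90):
msg = note_on(60, 90)
print(msg.hex())  # -> "903c5a"
```

A pad or key press on a controller transmits messages of exactly this shape, which the DAW or software sampler then maps to a sample or a synthesizer pitch.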
MIDI allows users to store and modify this musical information by either (i) using controllers to record virtual performances with sampler devices and synthesizers or (ii) modifying MIDI data outside of real time.1 In the first case, producers simply record the performance data which enters the computer via a controller such as NI's Maschine or Novation's Launchpad. Alternatively, producers can modify or create MIDI data outside of a real-time performance, editing pitches, rhythms, dynamics, and other sonic parameters without performing them. To do so, they rely on a user interface known as a matrix editor, which allows them to input exact note lengths, timings, pitches, and velocities (i.e., dynamics). When producers edit or create MIDI data within a matrix editor, the process resembles the act of score creation (Figure 1.3).
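Editing MIDI outside of real time can be pictured as operating on a list of note events. The sketch below uses hypothetical names rather than any real DAW API; it quantizes recorded note start times to a sixteenth-note grid, much as a matrix editor lets producers snap events into place:

```python
# A hypothetical sketch of non-real-time MIDI editing: notes represented as
# (pitch, start_in_beats, length_in_beats, velocity) tuples.

def quantize(notes, grid=0.25):
    """Snap each note's start time to the nearest grid position
    (0.25 beats = a sixteenth note in 4/4)."""
    return [(pitch, round(start / grid) * grid, length, velocity)
            for (pitch, start, length, velocity) in notes]

# A slightly loose closed hi-hat pattern (MIDI note 42), played in live:
performance = [(42, 0.02, 0.2, 100), (42, 0.27, 0.2, 80), (42, 0.51, 0.2, 100)]
print(quantize(performance))
# -> [(42, 0.0, 0.2, 100), (42, 0.25, 0.2, 80), (42, 0.5, 0.2, 100)]
```

Real matrix editors expose the same idea graphically: each tuple becomes a block on a pitch/time grid that can be dragged, stretched, or re-pitched.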
Figure 1.1 A Korg ES1 sampler has a tube preamp which can add additional warmth to digital samples triggered on this unit. This device can also act as a simple MIDI trigger device.
Figure 1.2 A Novation Launchkey provides users with 16 square-shaped trigger pads which correspond to various MIDI notes. This device also has a two-octave MIDI keyboard. In addition, the 8 dials on the top can be assigned to various DAW controls using its CCN function.
Figure 1.3 Ableton's matrix editor is used for altering MIDI events.
MIDI also allows users to store a second type of controller data, known as continuous controller number (CCN) data, which can store and play back pitch-wheel modifications on a keyboard, for example. An even more pertinent feature for those who use these controllers is the ability to assign CCN values to various software parameters, allowing users to control effects such as filters, standard EQs, and other processors from hardware knobs.
The CCN functions on MIDI controllers are fully assignable. When CCN data is used in tandem with an interface for inputting notes and rhythms, such as a keyboard or a matrix of square-shaped triggers, producers can control almost any conceivable musical (i.e., notes, rhythms, pitches, velocity, and note duration) or sonic (i.e., frequency balance, dynamic contour, ambient profile, stereo configuration) property in real time. Typically, these hardware interfaces have a series of rotary knobs, sliders, and buttons which users can program to configure various effects such as EQs, compressors, filters, ambient effects, and modulation-based processors. For example, a user can assign the Q-value of one filter on a parametric EQ to an empty CCN channel. This CCN channel can then be assigned to a dial on a MIDI controller. From this point, users may freely adjust the Q-value of this filter by simply turning the associated dial.
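The mapping described above can be sketched in a few lines: a MIDI control-change message carries a controller number and a 0–127 value, which software then scales into the parameter's own range. The controller number and the Q range below are illustrative assumptions, not values taken from any specific EQ plugin:

```python
# Sketch of CCN-to-parameter mapping. Controller number 21 and the
# 0.1-10.0 Q range are illustrative, not from a real plugin's spec.

def control_change(cc_number, value, channel=0):
    """Build a raw MIDI control-change message: status byte 0xB0 | channel."""
    return bytes([0xB0 | channel, cc_number & 0x7F, value & 0x7F])

def cc_to_q(value, q_min=0.1, q_max=10.0):
    """Linearly scale a 0-127 CC value to a filter Q range."""
    return q_min + (value / 127.0) * (q_max - q_min)

# Turning an assigned dial (say, CC 21) roughly halfway:
msg = control_change(21, 64)
print(round(cc_to_q(64), 2))  # -> 5.09
```

In practice the DAW performs this scaling internally once the dial is "MIDI-learned", but the principle is the same: a 7-bit controller value is translated into whatever units the destination parameter uses.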
Typically, producers use CCN data for two purposes: (i) controlling sampler or synthesizer parameters and (ii) specifying signal processing parameters. In both cases, CCN data facilitates a more tactile approach to altering software instrument parameters and effects in real time than keyboard and mouse input. Another important advantage of this approach is that it allows users to record a signal's sonic parameters as they change over time using a process known as automation (Figure 1.4). A producer could, for example, cause the wet/dry setting on a phaser effect to increase and decrease at specific times over the course of a project. Of course, many users also prefer to program these types of modifications using keyboards and mice, which practitioners know as the point-and-click method. However, th...
