Cloud-Based Music Production

Sampling, Synthesis, and Hip-Hop

Matthew T. Shelvock

  1. 168 pages
  2. English
  3. ePUB (mobile-friendly)
  4. Available on iOS & Android

About This Book

Cloud-Based Music Production: Sampling, Synthesis, and Hip-Hop presents a discussion of cloud-based music-making procedures and the musical competencies required to make hip-hop beats.

By investigating how hip-hop producers make music using cloud-based music production libraries, this book reveals how those services impact music production en masse. Cloud-Based Music Production takes the reader through the creation of hip-hop beats from start to finish – from selecting samples and synthesizer presets to foundational mixing practices – and includes analysis and discussion of how various samples and synthesizers work together within an arrangement. Through case studies and online audio examples, Shelvock explains how music producers directly modify the sonic characteristics of hip-hop sounds to suit their tastes and elucidates the psychoacoustic and perceptual impact of these aesthetically nuanced music production tasks.

Cloud-Based Music Production will be of interest to musicians, producers, mixers and engineers and also provides essential supplementary reading for music technology courses.

Information

Year: 2020
ISBN: 9781351137089

1 Understanding Samples, Synthesis, Editing, and Mixing

The previous chapter introduces some core cloud-based music production (CBMP) concepts. In this chapter, I cover a number of beat-making activities which are directly supported by CBMP services such as Splice, Loopcloud, and Noiiz. In particular, these services help producers streamline production activities such as (i) sampling, (ii) synthesis, (iii) triggering sounds, (iv) arranging, and (v) mixing. When these activities are discussed in the literature, they tend to be treated as though they are completely separate from one another, and in certain music production traditions this may be true.
However, in most forms of digital music making, such as the creation of hip-hop beats, producers freely switch between composing, arranging, designing and recording sounds, and mixing. These production methods, and how they relate to one another, are the main focus of this chapter, rather than the classic DJ-centered techniques commonly associated with hip-hop in the 1980s and early 1990s. This is because most of today’s beat producers prefer to work within DAWs, which make it quite easy to switch between beat-making activities such as sampling, synthesis, triggering sounds, arranging, and mixing (Shelvock 2017a: 172).
In some cases, such as Fytch’s Spring 2018 hip-hop beat-making challenge on Splice, producers may also combine CBMP resources with live instruments. Yet, as Fytch demonstrates, when producers incorporate live instrumentalists within their work, they often treat these recordings like any other sample downloaded via a CBMP service. That is to say, through the use of editing technologies, they mutate the rhythmic and timbral properties of these performances in the same way as they do with pre-recorded samples provided by Splice, Loopcloud, or Noiiz.
Since the sounds producers create using digital production methods exert such a clear influence even on hip-hop's most influential instrumentalists, in this chapter I focus on these computer-based music-making techniques. A small minority of hip-hop beat makers do use live instruments exclusively (or at least predominantly) in their work, but this remains a relatively rare approach. Groups such as The Roots or BadBadNotGood feature drummers, bassists, guitarists, keyboard players, and others who try to emulate the sound of hip-hop sampling when they play. To accomplish this, they draw inspiration from the type of rhythmic phrasing predominantly heard on quintessential sample-based hip-hop records. Indeed, sampling is such a foundational technique within this genre that even instrumentalists try to perform as though they have already been sampled and edited by a producer.
In the sections of this chapter, I provide a brief overview of the most crucial hip-hop production techniques so that readers can see how CBMP supports practices related to (i) sampling, (ii) synthesis, (iii) triggering sounds, (iv) arranging, and (v) mixing, and how these practices (i-v) coalesce during the beat-making process. While additional approaches and techniques exist, I have chosen the methods most commonly discussed (or used) by producers in popular video series offered by Mass Appeal, Waves, Maschine, FL Studio, Ableton, HotNewHipHop, and others. In addition, I have chosen the techniques I encounter most often as a producer, manager, publisher, and label representative. However, I do not claim that any specific production technique I mention constitutes some form of secret knowledge, or that these tools are unknown to practitioners. In fact, I argue just the opposite: these production methods are used on a routine basis by beat makers, yet no peer-reviewed text provides an in-depth analysis of how these techniques work together to create hip-hop beats per se (although many authors do, of course, discuss topics surrounding beat production through the lens of cultural theory, such as Schloss 2014). It is my hope that what follows will spark increased discussion of record production as an area worth analyzing in and of itself, so that many of the misunderstandings which currently pervade popular music texts can finally be laid to rest.

1.1 Virtual Performance: Performing Samples, Performing Synthesis

In order to make music using audio samples and synthesizers, producers record themselves while triggering these sounds using a variety of tools. Today, this almost always occurs through the use of a DAW, such as Ableton or Maschine. This section describes how producers incorporate various performance technologies, such as MIDI keyboards and trigger pads, in order to create hip-hop beats.

1.1.1 Performance Inputs: Sample-Triggering, Synthesis, and MIDI

Among the most important tools for hip-hop beat makers are sample-triggering devices, such as the classic instruments offered by Roland or Akai. Today, many producers have also adopted newer tools from companies such as Novation, M-Audio, or Native Instruments to accomplish the same tasks as these well-known machines. One shared feature of many of these hardware interfaces is a series of square button pads arranged in a grid. These pads can trigger samples, but because they transmit data which software synthesizers can transform into discrete pitches, they can also be used to play melodies. Some controllers, such as the Novation Launchkey, include both a series of button pads and a piano keyboard, and users are free to switch back and forth between them in order to control or trigger sounds (Figures 1.1 and 1.2).
Most of these devices make use of a form of musical data known as MIDI, which stands for Musical Instrument Digital Interface. MIDI data is stored by computer software, such as the samplers discussed above or DAWs, and describes a number of standard musical parameters; because it is stored as data rather than audio, it can be recalled and edited at any time. The musical parameters which MIDI encodes include pitch, velocity (i.e., dynamics), note length, and rhythm.
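The book does not tie MIDI to any particular software environment, but a minimal sketch in Python (using the third-party mido library, an assumption on my part rather than anything the author specifies) shows how the parameters listed above map onto actual MIDI messages: a note is simply a note_on/note_off pair carrying pitch, velocity, and timing data.

    # Minimal sketch, assuming Python's third-party "mido" library is installed.
    # A single note is stored as a note_on/note_off pair carrying pitch,
    # velocity (dynamics), and timing (note length/rhythm) information.
    import mido

    note_on = mido.Message('note_on', note=60, velocity=100, time=0)    # middle C, played fairly loud
    note_off = mido.Message('note_off', note=60, velocity=0, time=480)  # released 480 ticks later

    print(note_on)   # -> note_on channel=0 note=60 velocity=100 time=0
    print(note_off)  # -> note_off channel=0 note=60 velocity=0 time=480

A controller performance arrives at the computer as a stream of such messages, which is exactly what the two workflows described next either capture in real time or construct by hand.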
MIDI allows users to store and modify this musical information by either (i) using controllers to record virtual performances with sampler devices and synthesizers or (ii) modifying MIDI data outside of real time.1 In the first case, producers simply record the performance data which enters the computer via a controller such as NI's Maschine or Novation's Launchpad. Alternatively, producers can create or modify MIDI data outside of a real-time performance: rather than playing the material, they prescribe pitches, rhythms, dynamics, and other parameters directly. To do so, they rely on a user interface known as a matrix editor, which allows them to input exact note lengths, timings, pitches, and velocities (i.e., dynamics). When producers edit or create MIDI data within a matrix editor, the process resembles score creation (Figure 1.3).
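As a rough analogue to clicking notes into a matrix editor, the following sketch (again assuming mido; the four-on-the-floor kick pattern and the note number are hypothetical choices for illustration) prescribes exact pitches, velocities, and note lengths outside of any real-time performance and saves the result as a clip a DAW can open.

    # Sketch of "offline" MIDI editing: building a one-bar kick pattern by
    # prescribing exact pitches, velocities, and durations, much as one would
    # enter notes in a matrix editor. Assumes Python's "mido" library.
    import mido

    TICKS_PER_BEAT = 480
    mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
    track = mido.MidiTrack()
    mid.tracks.append(track)

    KICK = 36  # note number commonly mapped to a kick sample (hypothetical mapping)
    for beat in range(4):
        # 'time' is the delay in ticks since the previous message
        track.append(mido.Message('note_on', note=KICK, velocity=110, time=0))
        track.append(mido.Message('note_off', note=KICK, velocity=0, time=TICKS_PER_BEAT))

    mid.save('kick_pattern.mid')  # the resulting clip can be dragged into a DAW

Every value written here can later be nudged, re-pitched, or re-timed, which is what makes this way of working resemble score creation rather than performance.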
Figure 1.1 A Korg ES1 sampler has a tube preamp which can add additional warmth to digital samples triggered on this unit. This device can also act as a simple MIDI trigger device.
Figure 1.2 A Novation Launchkey provides users with 16 square-shaped trigger pads which correspond to various MIDI notes. This device also has a two-octave MIDI keyboard. In addition, the 8 dials on the top can be assigned to various DAW controls using its CCN function.
Figure 1.3 Ableton's matrix editor is used for altering MIDI events.
MIDI also allows users to store a second type of controller data, known as continuous controller number (CCN) data, which can record and play back pitch-wheel movements on a keyboard, for example. An even more pertinent feature for those who use these controllers is the ability to assign CCN values to software parameters, allowing CCN data to control filters, standard EQs, and other effects.
The CCN functions on MIDI controllers are fully assignable. When CCN data is used in tandem with an interface for inputting notes and rhythms, such as a keyboard or a matrix of square-shaped triggers, producers can control almost any conceivable musical (i.e., notes, rhythms, pitches, velocity, and note duration) or sonic (i.e., frequency balance, dynamic contour, ambient profile, stereo configuration) property in real time. Typically, these hardware interfaces have a series of rotary knobs, sliders, and buttons which users can program to configure effects such as EQs, compressors, filters, ambient effects, and modulation-based processors. For example, a user can assign the Q-value of one filter on a parametric EQ to an empty CCN channel, and that CCN channel can then be assigned to a dial on a MIDI controller. From this point, users may freely adjust the Q-value of the filter simply by turning the associated dial.
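As a rough illustration of this kind of mapping (the CC number and Q range below are hypothetical, and mido is again an assumed library choice rather than anything the author specifies), a small script can listen for a single CCN dial and scale its 0-127 values onto a filter's Q parameter.

    # Hedged sketch: map one CCN (control change) dial to a parametric EQ
    # filter's Q value. Assumes Python's "mido" library and a connected MIDI
    # controller; the CC number and Q range are illustrative, not standard.
    import mido

    CC_NUMBER = 21            # hypothetical dial assigned to the filter's Q control
    Q_MIN, Q_MAX = 0.3, 12.0  # hypothetical usable Q range for the EQ filter

    def cc_to_q(cc_value: int) -> float:
        """Scale a 7-bit CC value (0-127) linearly onto the filter's Q range."""
        return Q_MIN + (cc_value / 127.0) * (Q_MAX - Q_MIN)

    with mido.open_input() as port:   # opens the default MIDI input port
        for msg in port:              # blocks, yielding incoming messages
            if msg.type == 'control_change' and msg.control == CC_NUMBER:
                print(f'Set EQ filter Q to {cc_to_q(msg.value):.2f}')  # a DAW would apply this to the plug-in

Turning the physical dial now continuously updates the Q value, and because these CC messages are themselves MIDI data, the same movements can be recorded and replayed, which is the basis of the automation workflow described next.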
Typically, producers use CCN data for two purposes: (i) controlling sampler or synthesizer parameters and (ii) specifying signal-processing parameters. In both cases, CCN data offers a more tactile way to alter software instrument parameters and effects in real time than keyboard and mouse input. Another important advantage of this approach is that it allows users to record these parameter changes as they unfold over time, using a process known as automation (Figure 1.4). A producer could, for example, cause the wet/dry setting on a phaser effect to increase and decrease at specific times over the course of a project. Of course, many users also prefer to program these types of modifications using keyboards and mice, a workflow practitioners know as the point-and-click method. However, th...
