Mastering YC: Splits and Layers

The YC61 one-to-one interface makes it easy to set up, store and recall Splits and Layers. It’s all right there in front of you and with the LIVE SET you can store them for instant recall. This article will both show you how it works and give you a few tips and ideas on creative ways to set them up.
For reference, let’s take a look at the YC front panel:
[Image: The YC61 front panel]

I’ll be moving through all these sections over the course of the article, but most of the time I’ll be working with the Organ and Keys Sections. Notice that the area in dark black is the same in each Section. I’ve highlighted the [SPLIT] and [OCTAVE +/-] buttons in blue and the [VOLUME] and [TONE] knobs in orange; these are important controls when creating splits and layers:
[Image: The YC61 front panel with the [SPLIT] and [OCTAVE +/-] buttons highlighted in blue and the [VOLUME] and [TONE] knobs highlighted in orange]

Let’s start with splits.

SPLITS

There are several ways to use splits on the YC61, and I’ll cover three kinds in this article:

• Organ Split
• Organ/Bass Split
• KEYS A/KEYS B Split

Organ Split

Traditionally, most drawbar organs have two manuals, or keyboards. Each manual has an independent set of drawbars that determine the tonal character of the sound that manual plays. Players typically use the Upper manual for a lead or soloing role and the Lower manual for comping chords, but there’s really no set rule. In general, just keep in mind that you have two manuals with unique controls and sounds for your musical applications. The YC61 is capable of the same task.
Pictured below is the Organ section of the YC61:
[Image: The Organ section of the YC61]

This is probably the easiest split to show on the YC61, and at the same time probably the most inconspicuous. Every organ setup in the YC61 has TWO manual drawbar settings, an Upper and a Lower.

For this example, I’ll select LIVE SET 1-1, Jazz Lead.

Pushing the Lower/Upper switch lets you toggle between playing either the Upper or the Lower manual drawbar setting on the YC61 keyboard. By default, the drawbar LEDs are set to different colors so that you can tell the two apart.

If you want to play both manual drawbar settings (Upper and Lower) from the YC61 keyboard, the keyboard can be split to do this. The SPLIT button in the Organ section lets you choose whether the Organ will play in the Upper range [right LED], the Lower range [left LED], or BOTH, by simply pushing the switch so that both LEDs are lit.
[Image: The Organ section SPLIT button with both LEDs lit]
The YC61 display will also indicate that both manuals are active and will even indicate the drawbar positions on both the Upper and Lower manuals.
To assign the Split Point that determines where the Upper and Lower manuals divide, simply press the SPLIT POINT button below the display:
[Image: The SPLIT POINT button below the display]
Once the Split Point parameter is on the display, a split point can be selected by either touching the key on the keyboard at the desired split point location OR by using the Data Dial to choose the split point. Then, push the Data Dial to enter the value.

Organ/Bass Split

This is probably the most common way to split the keyboard with a bass sound on the left and a piano or keyboard sound on the right. In this example we will use the Organ section as the keyboard sound for the right-hand, and a bass sound from the KEYS A section.

To start this setup, I’ll select LIVE SET 1-1, “Jazz Lead”. This LIVE SET will already have the ORGAN section turned [ON], so I will turn the KEYS A section [ON] to add my bass sound and leave the KEYS B section [OFF].

Since I already have an organ sound selected, after I turn [ON] the KEYS A section, I’ll need to select which sound I want there. For this example, I will select the OTHERS category and then use the RED toggle switch to select 42 for Acoustic Bass.

At this point I have two sounds layered across the keyboard: Organ and Acoustic Bass. Yes, that’s how easy it is to layer; we’ll get to that later. Now that I have these two sounds combined, I need to change how they are mapped across the keyboard: one sound on the right (Organ) and one on the left (Acoustic Bass). This is done by pushing the SPLIT button in the Organ section so that the LED indicates [R] for the right side, and then pushing the SPLIT button in the KEYS A section so that the LED indicates [L] for the left side of the keyboard:
[Image: SPLIT button settings with the Organ section set to [R] and the KEYS A section set to [L]]

The two sounds are now split across the keyboard.

For the final tweak, I need to define where the split point is between the two sounds. To do this, simply press the SPLIT POINT button below the display. Once the Split Point parameter is on the display, the split point can be selected by either touching the key on the keyboard at the desired split point location OR by using the Data Dial to choose the split point. Then push the Data Dial to enter the value.

KEYS A/KEYS B Split

This process is very similar to the Organ/Bass split in the previous example. I won’t be using the Organ section in the YC61 for this example, so I’ll be turning the ORGAN section [OFF]. For this example, I’ll use sounds from KEYS A and KEYS B, so I will need to turn them [ON].

For this Basic Bass Split I’ll use the S700 piano on the right-hand side of the keyboard (using KEYS A) and the Upright Bass on the left (using KEYS B). I will also adjust the effects so my piano will have reverb, but my bass will be dry. This is a common way to set up a bass split because reverb can make the bass sound less pronounced and “muddy”. To start, select LIVE SET 1-6 “Natural CFX”. Then select the Piano category in the KEYS A section by pushing the [KEYS A/KEYS B] switch so that the LED indicates KEYS A, and then use the RED toggle switch to select S700.

Next, select the KEYS B section by pushing the [KEYS A/KEYS B] toggle switch so that the LED indicates KEYS B. Select the OTHERS category in the KEYS B section and then use the RED toggle switch to select Upright Bass. To make sure that the sounds are in the areas of the keyboard that I want, I’ll press the SPLIT button in the KEYS A section so only “R” is illuminated. Next, I’ll select KEYS B using the [KEYS A/KEYS B] button and press the SPLIT button so only “L” is illuminated.

Now I have my two Sections split, with the S700 piano on the right and the Upright Bass on the left. The next things I need to do are set the split point, adjust the Reverb send for each Section and save the setup to a LIVE SET for instant recall:
[Image: The [SPLIT POINT] button highlighted in red, the Master Reverb highlighted in blue and “Store” highlighted in yellow]
1. Split Point: This button is located in the LIVE SET area. Press the [SPLIT POINT] button (highlighted in red above), touch a key on the keyboard to set the split (I chose G2, the G below middle C) and press the [EXIT] button to confirm!

2. Reverb: The Master Reverb is located on the right side of the keyboard (highlighted in blue above). To set the send for each Section, press the indicated button to toggle through the Sections; the LED will illuminate next to the selected Section (if all the LEDs are on, you are adjusting ALL Sections). In this setup I have a little Reverb on the S700 piano and none on the Upright Bass.

3. LIVE SET: Press [SETTINGS] and use the knob (referred to as the “Encoder dial/[ENTER] button” in the manual) next to the LED to navigate to “Name”, then press down on the dial to select. Use the LIVE SET buttons to navigate and name the LIVE SET…I have named this “S700-Bass Split”. To complete storing the LIVE SET, push the Encoder dial/[ENTER] button, select “Store” (highlighted in yellow above), choose a location in the LIVE SET memory, and then push [ENTER] to store.

Building on what we’ve learned creating a bass and piano split, let’s create a left-hand accompaniment using the KEYS A section and a right-hand lead split, using KEYS B for the lead sound. For this example, I’ll shift the left-hand comp section up an octave, select a right-hand synth lead from the KEYS B section and engage a few more effects.

THE LEFT-HAND COMP/RIGHT-HAND LEAD SPLIT

This is a great setup for playing chords in your left hand and a lead part in the right. For example, you could play a pad sound transposed up an octave in your left along with a synth lead in the right. To make this easy, I’ll start with an electric piano sound that I already like, LIVE SET 2-5, “Trem EP”.

With this setup, I’ll set KEYS A to “Trem EP” as the left-hand comp shifted up one octave and push the [SPLIT] button so only “L” is illuminated.
For this example, I’d like to use a Phase Shifter instead of the assigned Tremolo effect. To change this, I’ll push the black toggle switch for Effect 1 down once to select P4, Dual Phaser.

For the next step, I’ll switch KEYS B [ON] and choose “43 Calliope Lead”, turn on EFFECT 1, and change the effect for Section Effect 1 to Digital Delay. Adjust the Depth and Rate to your taste.

Next, we need to adjust the split point, just like in the previous examples. Press the [SPLIT POINT] button, touch a key on the keyboard to set the split point (I chose C3, middle C) and press the [EXIT] button to confirm.

For one final touch, I’ll adjust the amount of Reverb on both sounds: very little on the electric piano and a large amount on the Calliope Lead.

LAYERS

Let’s take a look at Layers. Layers are easy to set up and manage.

A Layer combines two or more sounds with no split point. Layered sounds span the entire keyboard area.
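If it helps to picture the difference between a split and a layer, here is a tiny conceptual sketch in Python (purely illustrative; this is not how the YC61 works internally). Each Section gets a zone, and a note sounds in every Section whose zone covers it:

```python
# Conceptual model: which Sections sound for a given MIDI note.
# A split gives each Section a zone bounded by the split point;
# a layer gives every Section the whole keyboard.

SPLIT_POINT = 60  # illustrative value; MIDI note 60 = middle C

def active_sections(note, sections):
    """sections: dict of Section name -> zone ('L', 'R', or 'BOTH')."""
    out = []
    for name, zone in sections.items():
        if zone == "BOTH":                       # layered: full range
            out.append(name)
        elif zone == "L" and note < SPLIT_POINT:  # left of the split point
            out.append(name)
        elif zone == "R" and note >= SPLIT_POINT: # right of the split point
            out.append(name)
    return out

# Organ/Bass split: bass on the left, organ on the right
split = {"KEYS A (Acoustic Bass)": "L", "Organ": "R"}
print(active_sections(48, split))  # a low note plays only the bass
print(active_sections(72, split))  # a high note plays only the organ

# Piano/strings layer: both Sections everywhere
layer = {"KEYS A (CFX)": "BOTH", "KEYS B (Lite Strings)": "BOTH"}
print(active_sections(72, layer))  # both Sections sound
```

With a split, the split point decides which single Section responds to a note; with a layer, every active Section responds across the whole keyboard.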

Two-Section Piano/Strings Layer

For this layer, let’s start from LIVE SET 1-6 “Natural CFX”.

This setting uses the CFX concert grand piano. I have the Section Volume and Tone set high; the high Tone setting will help the piano cut through once we add the string layer.

Next, I switched the KEYS B section “ON” and selected the Synth category and then selected “Lite Strings” (16) as my layer. This is a nice analog synth string sound with a Harmonic Enhancer applied in the Effect 1 section.

I set the Volume of KEYS B to about 75%. As you play this sound combination, you’ll notice that the synth strings overtake the piano sound a little as you hold down the notes. The synth string sound already has a rather slow release, so it doesn’t require as much reverb. I set the Reverb settings as pictured right.

For the final step, we need to save to the LIVE SET memory.

Press [SETTINGS] and use the knob (referred to as the “Encoder dial/[ENTER] button” in the manual) next to the LED to navigate to “Name”, then press down on the dial to select. Use the LIVE SET buttons to navigate and name the LIVE SET, giving this layer its own name (such as “CFX-Strings Layer”). To complete storing the LIVE SET, push the Encoder dial/[ENTER] button, select “Store”, choose a location in the LIVE SET memory, and then push [ENTER] to store.

External Keyboard Splits

The YC61’s sounds can be split and layered in several ways; however, there may be situations where you want more keys. If you want more playing space, the YC61 has a feature that allows you to connect an external MIDI keyboard, with various ways to configure which sounds are played on the YC61 keyboard and which are played on the external keyboard.

For more information on this, check out the upcoming article “Mastering YC-Using an External Keyboard”.

Want to share your thoughts/comments? Join the conversation on the Forum here.

FM 101, Part One: Discovering Digital FM…John Chowning Remembers

How a simple “ear discovery” came to forever influence the world of synthesis.


There have been only a few major turning points in the history of synthesis. After the seminal work of Bob Moog and his analog modular designs in the 1960s, it can be argued that instrument development travelled a mostly incremental path for more than two decades. New features were continually being added, but for the most part the synthesizers of the era continued to utilize subtractive analog technology. All that changed with the release of the Yamaha DX7 in 1983 — truly a watershed moment.

Almost everything about the DX7 was new. It sported green, orange and pink membrane switches and provided a tiny LCD screen for editing, along with what looked like a series of hieroglyphics along the top of its front panel — things labeled “algorithms.” Its 61-key keyboard was both velocity- and touch-sensitive (radical for the times) and offered 16-note polyphony (even more radical for the times). But most importantly, it sounded like no other synthesizer anyone had heard before.

That’s because it used a completely new type of synthesis technology, called digital FM (short for “Frequency Modulation”) — courtesy of an adventurous professor at Stanford University in California and a team of forward-thinking engineers at Yamaha Corporation in Hamamatsu, Japan.

[Image: The Yamaha DX7]

The story behind the development of digital FM is fascinating, and it starts with a most unlikely source: an experimental music composer who was neither an engineer, a mathematician nor a computer programmer. Instead, he was an artist chasing his muse when he stumbled across a sonic phenomenon that forever changed synthesis.

His name? Dr. John Chowning.

A Convergence of Interests

Chowning is a percussionist who graduated from Ohio’s Wittenberg University in 1959, and then went to Paris for two years to study with the famed composer Nadia Boulanger. It was during his time in Paris that he was first exposed to experimental music, including early works using electronics to create what was called “music for loudspeakers.” Inspired by those influences, he returned to the U.S. and received his Doctorate in 1966 from Stanford University.

Stanford University has long been a fertile place of research in all of the sciences, but it was Chowning who initiated their forays into the fledgling use of computers to make music, although his primary interest at the time was in sound spatialization: the ability to move a sound source in a three-dimensional field, and the way the human ear distinguishes those movements.

But after reading a seminal article written by computer synthesis pioneer Max Mathews, Chowning took a course in computer programming and then made a trip to visit Mathews at Bell Labs in New Jersey. “Max made a statement in his article that really grabbed my attention,” Chowning recalls. “He wrote: ‘There are no theoretical limitations to the performance of the computer as a source of musical sounds, in contrast to the performance of ordinary instruments. At present, the range of computer music is limited principally by cost and by our knowledge of psychoacoustics. These limits are rapidly receding.’”

Chowning returned to California with a box of punch cards Mathews had given him, containing instructions for a synthesis computer program called MUSIC IV. He found a place to play them at Stanford’s newly established Artificial Intelligence laboratory, where researchers gathered to see what they could get computers to do.

A Discovery of the Ear

The convergence of Chowning’s interest in spatialization and Mathews’ search for new sounds led to a focus on one particular sonic aspect: vibrato.

“I was searching for sounds that had some internal dynamism,” Chowning explains, “because for localization one has to have sounds that are dynamic in order to perceive their distance. The direct signal and the reverberant signal have to have some phase differences in order for us to perceive that there are in fact two different signals. Vibrato is one of the ways that one can do that.”

[Image: John Chowning working with the external programmer for the GS-1 synthesizer.]
One evening in the autumn of 1967, Chowning was using the mainframe computer in the AI laboratory to model the sound of two sine wave oscillators in a simple modulation configuration — one altering the pitch of the other to produce vibrato. Curious as to what would occur if he increased the rate and/or depth beyond what was possible with the human touch on an acoustic instrument, he issued instructions to the computer to try some basic multiples, doubling and tripling some of the numbers. And that’s when a curious thing happened: At the point when the rate of the vibrato increased to where it could no longer be perceived as a cyclical change, the sound changed from simple pitch fluctuation into a timbral change — a change in tonality. What’s more, as the rate and depth increased further he heard more and more timbral complexity. This was indeed the birth of digital FM.
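Chowning’s two-oscillator configuration is easy to sketch numerically. The Python fragment below is an illustration of the principle, not his original MUSIC IV code: a modulating sine wave varies the phase of a carrier, so a slow modulator yields vibrato while an audio-rate modulator yields a new timbre.

```python
import math

SAMPLE_RATE = 44100

def fm_tone(carrier_hz, modulator_hz, index, seconds=0.1):
    """Two sine oscillators: the modulator varies the carrier's phase.
    'index' sets the modulation depth (0 gives a plain sine wave)."""
    n = int(SAMPLE_RATE * seconds)
    return [
        math.sin(2 * math.pi * carrier_hz * (t / SAMPLE_RATE)
                 + index * math.sin(2 * math.pi * modulator_hz * (t / SAMPLE_RATE)))
        for t in range(n)
    ]

vibrato = fm_tone(440, 6, 0.5)     # 6 Hz modulator: ordinary vibrato
timbre  = fm_tone(440, 440, 2.0)   # audio-rate modulator: a new timbre
```

At a 6 Hz modulator rate the result is a 440 Hz tone whose pitch wavers; at a 440 Hz modulator rate the same structure produces a harmonically rich tone, with partials spaced at multiples of the modulator frequency around the carrier.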

As Chowning is fond of pointing out these days, this was a discovery of the ear, not the result of testing math equations or applying scientific principles. It was only after hearing this phenomenon that he took his experiments to an undergraduate math student to try to better understand what was going on. They researched the existing FM science as it related to radio transmission, where the rates are in megahertz (millions of cycles per second), and saw that the equations proved out what he had discovered using rates in the audible range of hundreds or thousands of cycles per second. This was pivotal, because it proved that Chowning’s “ear discovery” was not just subjective; it was supported by objective science.
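The mathematics they confirmed is the classical FM expansion, which Chowning later published in his 1973 paper: a carrier sine of frequency f_c, phase-modulated by a sine of frequency f_m with modulation index I, is equivalent to a sum of sidebands spaced at multiples of f_m and weighted by Bessel functions J_n:

```latex
y(t) = A\,\sin\bigl(2\pi f_c t + I \sin(2\pi f_m t)\bigr)
     = A \sum_{n=-\infty}^{\infty} J_n(I)\,\sin\bigl(2\pi (f_c + n f_m)\, t\bigr)
```

Raising the index I pushes energy into higher-order sidebands, which is exactly the growing timbral complexity Chowning heard as he increased the modulation depth.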

Over the next few years, Chowning continued to explore this technique, codifying how various frequency relationships and depths of modulation between two oscillators would result in specific timbral characteristics. Using the FM technique, he modeled brass tones, woodwinds, percussive objects and much more, developing a massive library of information.

The Search for a Development Partner Begins

Stanford University, to whom Chowning had assigned his patents, began looking for companies to license this fledgling technology in the early 1970s. At the time, organ manufacturers seemed the most likely partners, but since none of the U.S.-based ones were skilled in digital technology — at least not yet — they all passed on it.

Widening their search, it was brought to their attention that while Yamaha was best-known in America for their pianos, the company had a long heritage as a builder of organs, and were, in fact, the largest manufacturer of musical instruments in the world. So Stanford reached out to Yamaha, who sent a young engineer named Kazukiyo Ishimura to meet with Chowning. “I gave [him] some examples and showed some code — with a brief explanation — and in ten or so minutes he understood exactly what I was doing,” recalls Chowning.

The decision by Yamaha to license Chowning’s invention was not an easy one, given that the company was going to have to invest huge amounts of money in order to develop the large-scale chips needed to move the technology from a mainframe computer to a portable keyboard. But then-president Genichi Kawakami was firmly behind the idea, saying, “If we can make the best musical instruments in the world, then no matter how difficult it is, no matter how much money it costs, we’ll do it.”

And so the future of synthesis was forever altered, from the subtractive analog systems then widely in use to the all-digital ones employed by today’s synths.

Commercial Development Begins

For more than ten years, a large team of Yamaha engineers further researched and developed the technology. Chowning would visit with them many times, sharing his accumulated knowledge and helping to debug systems and develop sounds. He was, however, limited to working with breadboard designs hooked up to a computer, and was not involved in the specific design or interface of any given model. He often tells the story of how the first time he saw a DX7 was at a restaurant when the keyboard player called him over to see a new instrument he had acquired, sitting on top of his piano. Chowning didn’t know the model, but knew the sound he heard coming from it: his FM discovery.

Yamaha had actually developed a number of prototype instruments as they refined FM technology. The first one to be commercially released (in 1981, seven years after they obtained the license from Stanford) was an Electone organ called the F-70. This was followed quickly by the GS-1, a large performance keyboard using dual 4-operator FM. The sounds of the GS-1 were preset, but the user could load new sounds in via magnetic foil strips (!). A slightly scaled-down model, the GS-2, was also introduced in an effort to provide better portability. However, these early models were quite expensive, and so they were adopted mostly by top professional players and recording studios. The following year they released the smaller and more affordable CE-20, still a preset synth. Surprisingly for the times, all these keyboards were velocity-sensitive, and the CE-20 added some simple controls for sound manipulation, although those edits could not be saved.

[Image: The introductory ad for the GS-1 and GS-2 (and FM synthesis!)]


But the stage was set for what would prove to be the most popular synthesizer of all time: The Yamaha DX7.

A New Standard Arrives

Despite its cryptic name and oddly colored membrane switches, despite its tiny editing screen and the hieroglyphics that adorned its front panel — indeed, despite the fact that it used terminology and concepts completely unfamiliar to even the most sophisticated synth gurus — the DX7 was an instant success. More importantly, its distinctive complex transients and clangorous tonalities would directly influence the sound of the music of the 1980s and beyond — all the way to the present day, in fact.

But there were a lot of things that had to be perfected to get to that point. As Chowning is quick to point out, many of those developments were the work of the Yamaha engineers. “The DX7 used my underlying principles and research, and I certainly worked with their engineers over the years,” he says, “but they put in over seven years, and the work of nearly a hundred engineers to produce the instrument.”

One of the major innovations that Yamaha contributed was the concept of feedback, where the output of an oscillator (called an “operator” in the jargon of digital FM) is routed back into its input (or the input of another operator modulating it) to produce additional, and different, sideband frequencies. Feedback created more harmonic complexity, without having to add more operators. “It was a simple but very effective way to get what [the Yamaha engineers] felt was missing from FM, which was an ‘edge’ to cut through,” says Chowning. As with analog synthesizers, subtle detuning of oscillator frequency could be used to produce warmth and give the sound extra body.
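The feedback idea can be sketched the same way (again, an illustration of the concept rather than the actual DX7 hardware algorithm): a fraction of the operator’s previous output sample is added back into its own phase input, which adds upper harmonics without a second operator.

```python
import math

SAMPLE_RATE = 44100

def feedback_operator(freq_hz, feedback, seconds=0.1):
    """Single sine operator whose previous output sample is fed
    back into its own phase input (feedback = 0 gives a plain sine)."""
    n = int(SAMPLE_RATE * seconds)
    out, prev = [], 0.0
    for t in range(n):
        phase = 2 * math.pi * freq_hz * t / SAMPLE_RATE
        sample = math.sin(phase + feedback * prev)  # feedback enters the phase
        out.append(sample)
        prev = sample
    return out

pure = feedback_operator(220, 0.0)   # plain sine wave
edgy = feedback_operator(220, 1.2)   # feedback adds upper harmonics ("edge")
```

With feedback at 0 the operator is just a sine oscillator; as the feedback amount rises, the waveform distorts toward a brighter, sawtooth-like spectrum.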

The design of the original 32 algorithms, which are configurations of operators in various modulator / carrier relationships (“modulator” operators, as their name implies, alter the signal being generated by “carrier” operators), was also very much Yamaha’s work, extending the simpler structures Chowning had experimented with. Ditto for the addition of velocity control over the sound being produced — one of the most important expressive aspects of the DX7 that Chowning felt made the product successful.

Perhaps most importantly, Yamaha decided to give the user the ability to program their own digital FM sounds (as opposed to simply providing presets), even though it was a huge undertaking, and much-contested decision within the organization. Because of the advent of MIDI at around the same time as the release of the DX7, the instrument was also one of the first to provide documentation called MIDI System Exclusive messages, which allowed the development and use of computer-based sound editors. This was huge, because it allowed musicians and sound designers to more easily create and sell alternate sounds for the DX7 — something that had never been possible before with other synthesizers.

Digital FM Today

The success of the DX7 spawned many subsequent digital FM synthesizers. Some of them incorporated only incremental changes (such as rack-mounting or availability on a computer card), while others represented large leaps forward, such as the implementation of advanced 8-operator FM-X technology in today’s Yamaha MONTAGE and MODX synthesizers.

[Image: The Yamaha MONTAGE]

Nearly forty years after its first commercial introduction, digital FM is thriving, and Dr. Chowning looks on with pride and admiration for the work that Yamaha has done to bring his discovery to fruition and advance it. “What they have done with FM-X in the MONTAGE is just astonishing to me,” he says. “It’s moved so far beyond my original work for sure.” Every synthesist in the world — indeed, every musician who’s ever incorporated synthesizers into their music — owes John Chowning a debt of gratitude. His life’s work has quite literally changed the world.

[Image: John Chowning today.]

Check out these related articles on yamahasynth.com:

Behind the Synth: John Chowning Conversation

Holly Herndon Interviews Yamaha Legend Dr. John Chowning

Winter NAMM 2019: Interview with Don Lewis and Dr. John Chowning

Exploring the DX7

Yamaha DX7 – The Synthesizer that Defined the 80s

Dave Bristow on the First DX7 Presets

MONTAGE Article Roundup

Manny’s FM-Xplorations (Programming FM-X on the MONTAGE)

Click here to read the NAMM oral interview with John Chowning.

Click here for more information about Yamaha MONTAGE synthesizers.

Click here for more information about Yamaha MODX synthesizers.

Want to share your thoughts/comments? Join the conversation on the Forum here.

© 2024 Yamaha Corporation of America and Yamaha Corporation. All rights reserved.