Hello,
I have been reading the two articles, but there is no mention of latency... I am concerned that when playing back MIDI files from a DAW and recording them as AUDIO files at the same time in the DAW, those audio files will be delayed by some milliseconds... (as happened with a previous flagship synth I owned). So the DAW needs to be adjusted manually for this extra latency.
In my past experience I had to compensate for the latency manually, with the DAW pulling the resulting audio files (or the MIDI clips) back by a few milliseconds, so that the resulting audio files were in sync with the rest of the virtual instruments and audio files.
Am I going to encounter the same problem, or is this something that has been addressed?
Thanks
Hi Robert,
The issue here is really about computers, software, and audio handling. What you want from your DAW software is to accurately measure both the input and the output latency of audio handling in your computer, and to automatically compensate for this latency when placing audio.
What makes Steinberg’s Cubase one of the top-selling and most widely used professional audio workstation/music production programs is its awesome audio engine and its delay compensation handling.
In discussing latency there is often much confusion about what it is and how it impacts external hardware, and this can be further muddled when the external hardware is also the synthesizer keyboard. So in such a setup, let’s identify the latent signal and then discuss how, with external hardware, you can avoid dealing with it.
Background
The MODX, in addition to being a synthesizer, has its own USB audio and MIDI interface built-in.
The role of the audio interface is to bring together audio signals and output a Digital audio signal to the DAW, and to take Digital audio in from the DAW and output Analog audio to your speakers. The audio brought in is from the MODX synth engine and the A/D Input on the back panel. (Additionally, audio returning Digitally (via USB) can be routed through the synth engine.)
When you set up your MODX in your home studio, you connect the Analog Outputs directly to your Studio Monitor speakers. This is called the “Direct Monitor” signal path, and is easily understood because it works with or without the computer connected.
Next you connect the Digital audio path by connecting the MODX to your computer via a USB cable: “To Host” on the MODX to an available port on your computer. This establishes a bi-directional audio path for the Digital signal to travel to and back from the computer. This is called the “Latent” signal path. It is necessarily delayed, because the computer must receive it, document it, process it and send it back out.
With these two audio pathways set up, you can choose to “monitor” one or the other. The “Direct Monitor” path is the zero latency (speed of light) pathway. You would opt to listen to this when playing the keyboard or recording audio via the A/D Input (say, with a microphone).
Analog World Analogy:
This is exactly the same as in a real-world, old-school, multi-track recording studio: the engineer in the Control Room and the musicians on headphones in the Studio are monitoring the “Direct” signal path. The signal arrives at the studio console and (speed of light) is distributed to the Monitor and headphone systems. The multi-track tape machine records from a separate feed off the console... in that system the latency is the time it takes the recorded data to travel between the Record head and the Playback head on the tape machine. This is the equivalent of the time it takes today’s computers to record and return the audio. In that system you opt to listen to the latency-free, direct signal path. (Besides, had you monitored off tape, it would have been disconcerting to watch the musicians through the glass, as they would have appeared out of sync with what you were hearing.)
Fortunately, today’s computer can time-stamp every event and then place that event exactly where it needs to be, based on a precise analysis of the input latency and the output latency of your system. The Steinberg algorithm handling this delay compensation is what makes Cubase one of the premier music software packages on the planet.
What you want to do when recording the MODX is Monitor Direct. You do so by Muting the Audio Track in Cubase until playback. In the old tape-based recording studio, after the recording (which everyone was monitoring direct), the musicians would assemble in the Control Room for the playback... it was only then that the recording was proven... until that time you were counting on the red lights and meters to tell you the signal was arriving on tape... not until playback did you hear and verify the results.
Today, likewise, not until playback are you listening to the recorded results. When you record, you are going to opt to listen to the direct (latency-free) signal. When overdubbing, realize that the signal playing back from the computer is subject to the output latency... you play along with it. You may wonder: well, why isn’t your data then placed late?
That’s the purpose of Delay Compensation! The data you play is time-stamped; the computer “knows” the timing of the signal it sent you, and it “knows” the time stamp of the data you are sending in... the Delay Compensation function of Cubase places the new data precisely where it belongs, so that it is in time with the existing data. This is what you are ‘paying for’ from your DAW.
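To make that concrete, here is a toy sketch of the idea behind recording latency compensation (an illustration only, not Cubase’s actual code; the function and parameter names are made up): the driver reports an input latency and an output latency, and the DAW places the newly recorded clip that much earlier on the timeline so it lines up with the material you were hearing while you played.

def compensated_start(record_start_samples, input_latency, output_latency):
    # Toy sketch of recording latency compensation (all values in samples).
    # The player heard the playback `output_latency` samples late, and the
    # performance took another `input_latency` samples to reach the DAW,
    # so the new clip is placed that much earlier than where the driver
    # delivered it.
    return record_start_samples - (input_latency + output_latency)

# Example: a clip delivered at sample 441,000 with 256 samples of latency
# on each side of the round trip lands 512 samples earlier on the timeline.
print(compensated_start(441_000, input_latency=256, output_latency=256))  # 440488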
Extra Credit
_Remember latency is the time it takes the computer to receive-document-process-retransmit the audio signal.
_The Direct Monitor path is zero latency; that is not a marketing term, it’s a fact. It is not “low” latency, it is literally zero latency, or what is referred to as speed of light. Meaning: in this plane of reality, it don’t get no better!
_Remember you will normally choose NOT to listen to the latent signal when recording.
_You would opt to listen to the latent signal when you want to hear the sound ‘post’ (after) processing it on the computer... example: you set up a synth sound that you want to record through a plugin Effect. If that Effect will affect *how* you play (like a distortion would change how you approach playing), then naturally you’ll want to monitor ‘post’ the effect processor. In this case you would turn the Direct Monitor switch OFF in the MODX... and you’d monitor the latent signal.
Because the MODX is external hardware to the computer, with its own CPU and its own Effects, you can (more often than not) record without having to opt to listen to the latent signal... if the processing you plan to do to the audio track can be applied later (during mixdown), you can avoid being affected by latency at all. (That is one advantage of hardware synthesizers... soft synths *are* affected by latency!)
To hear and discover this hardware advantage, you can set your BUFFER SIZE purposefully large... laughably large. This way there will be no guessing which is the Direct signal and which is the Latent signal. You can still play and record by Monitoring the Direct signal path. It will be impossible to even play along if you Monitor the latent signal alone.
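If you want to put rough numbers on that experiment: the theoretical delay contributed by the driver buffer is simply the buffer size divided by the sample rate. A minimal sketch, assuming a 44.1 kHz project and ignoring the extra converter/driver overhead that real interfaces typically report on top of this:

SAMPLE_RATE = 44_100  # assumed project sample rate in Hz

def buffer_latency_ms(buffer_size_samples, sample_rate=SAMPLE_RATE):
    # Theoretical one-way delay of a single driver buffer, in milliseconds.
    return buffer_size_samples / sample_rate * 1000.0

for size in (64, 128, 256, 1024, 2048):
    one_way = buffer_latency_ms(size)
    print(f"{size:>4} samples = {one_way:4.1f} ms one way, {2 * one_way:4.1f} ms round trip")

At 2048 samples the round trip is getting close to a tenth of a second, which is why monitoring the latent signal alone becomes unplayable while the Direct path is unaffected. (And because real drivers add converter and safety buffers on top of the math above, two interfaces set to the same buffer size can report somewhat different figures.)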
Issues with latency get worse the more you ask the computer to do... so if it is running four or five plugin synths, and you have several bands of EQ open here and there, and you have plugin effects all processing on the same computer, the burden on the computer’s CPU is great and your latency is affected (increased). You can work around such issues by rendering temporary audio files of your soft synths (‘Freeze’), thus lessening the real-time burden on the computer’s CPU.
But the MODX, being external hardware, has its own CPU and does its own processing... you have a 3-Band EQ Pre the Dual Insertion Effects, and a 2-Band EQ Post the Insertion Effects, for *each* of the 16 MODX Parts... at all times, latency free!
MIDI delay (while it exists) is not the latency issue you need to concern yourself with in a computer DAW setup. MIDI is not sound. And even in a setup where Local Control is Off, so the MIDI traverses the computer and comes back to the Tone Generator, the signal simply passes through the DAW. (I’m sure there is a way to measure it... in microseconds... but it is of no concern here, as the MIDI precedes any audio being created, and it is not affected by your Buffer Size.)
ASIO Latency Compensation is active by default on all MIDI recordings, ensuring events are properly placed. For more details, see the Steinberg.net website.
Here is a great video that explains Latency:
@2:15 it explains ASIO, VST and Latency
Thanks very much for the detailed explanation!!
Allow me to ask some extra questions based on the "Bad Mister" response.
The first one is: I read about latency, and if I am not mistaken, each buffer size has a corresponding latency, and that happens with all audio interfaces, so a 256 buffer size should always give 6.x ms. That is what I read, but my MODX gives me 9.x ms at that buffer size (my Roland does indeed give me the 6.x).
My main question is about direct monitoring. If I understood correctly, within my DAW (any DAW, I guess) I can set up a MIDI track so that when I hit a key on my MODX (in the "MIDI Rec on DAW" mode), the MIDI signal goes to the computer via USB, the computer processes it, and then it goes back to my MODX and triggers the sound.
That will always have latency, right? It has to go back and forth to the computer, which has to process it. I tested it with the DAWs I own: I set something like a 2048 buffer size and I can notice the latency. In fact, if I switch to "standalone" mode it triggers the note twice, like an echo (one direct monitored and one latent).
Within a DAW (Ableton if possible) I am unable to set up, say, 4 tracks, each pointing to a different MIDI channel, and use direct monitoring for the output; I always notice the latency (I could set a 128 buffer size and live with it, but I am trying to learn). This is always using "MIDI Rec on DAW".
If I switch to standalone, I get no latency (and I can still record MIDI notes in the DAW), but all 4 tracks will trigger the same sound.
There are many options and configurations, and I am a novice with this stuff, so I get very confused; sadly, the manuals are quite bad (sorry!).
What I understood from here is that I can have as many tracks as I want, all sending MIDI to different channels, and the sound will play without any latency at all.
Thank you