Wellbassd,
I appreciate your quest for ultimate precision, and I realize there may well be sound design needs that demand timing ‘that tight’. If the workaround doesn’t cut it for some users, then by all means move to a DAW that can do it, get supplemental plugins and utilities that fill the missing feature gap, or ‘request and then wait’ for Steinberg to get around to adding it…
For ‘me’, a ‘proper workaround’ would be to use Bidule. In my case, Bidule was an investment I made long ago to supplement, in a more universal way, any DAW I might need to work with. It’s also a big player in any live keyboard sessions I set up for myself. It fills ‘missing feature’ gaps in pretty much every audio host/app I’ve ever used. I don’t have to learn the eccentricities of four different DAWs, and my complex effect chains and hybrid synth sounds are easily ‘portable’ among any sequencer or DAW I’m asked to use for a given project.
For situations like the one described in the post above (for someone who does not have Bidule, has no interest in getting it, and no desire to switch DAWs), I’d typically recommend generating the LFO pattern on a MIDI track first, as close as you can get it (using the ASIO/MIDI latency configuration options in Cubase’s settings), and then converting that into a true VST automation lane. From there, I’d disable/hide/delete the MIDI version of the generated LFO sequence. Solo problem solved. MIDI delay problem solved. Yes, it’s still a workaround with a rather alternate workflow, but it should get the job done in quite a few cases…without ‘waiting’ for Steinberg to add their own VST LFO generator, or switching DAWs.
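Just to make the ‘generate the LFO pattern on a MIDI track’ idea concrete, here’s a rough sketch in plain Python of what that generated CC data amounts to. Nothing here is Cubase-specific (the function name, defaults, and 480 PPQ resolution are my own assumptions for illustration): a sine LFO gets sampled into tempo-locked ticks and clamped to the 7-bit CC range.

```python
import math

PPQ = 480          # ticks per quarter note (a common default MIDI resolution)
CC_STEPS = 128     # 7-bit MIDI CC resolution (values 0..127)

def lfo_cc_events(cycles=4, cycle_beats=1.0, events_per_cycle=32,
                  depth=127, center=None):
    """Sample a sine LFO into (tick, cc_value) pairs.

    Tempo-locked: one LFO cycle spans `cycle_beats` quarter notes,
    sampled `events_per_cycle` times per cycle. Illustrative only --
    the DAW would do the equivalent when you draw/generate the pattern.
    """
    if center is None:
        center = (CC_STEPS - 1) / 2          # oscillate around mid-scale
    events = []
    for i in range(cycles * events_per_cycle):
        phase = 2 * math.pi * (i % events_per_cycle) / events_per_cycle
        value = round(center + (depth / 2) * math.sin(phase))
        value = max(0, min(CC_STEPS - 1, value))   # clamp to 7-bit range
        tick = round(i * cycle_beats * PPQ / events_per_cycle)
        events.append((tick, value))
    return events
```

Note how the timing grid is in musical ticks rather than samples: that’s exactly why this approach is tempo-locked to the transport rather than sample-accurate, as discussed below.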
From there, if it’s still not as ‘tight’ as you’d like (a few ms of delay), it can be nudged exactly where you want it, and it’ll be well locked to the transport/sample clock. No, you’re not going to get a resolution of 48 kHz (or whatever project sample rate you use) in the initial MIDI LFO pattern (instead it’ll be tempo-locked to the transport), and again you’re limited to a parameter resolution of 128 steps unless you jury-rig some fancy NRPN stuff using multiple CCs and even more third-party hacks, but it should be close enough for more than 90% of the cases out there where someone would want to oscillate a given parameter in a given VST plugin.
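For anyone curious what that ‘fancy NRPN stuff using multiple CCs’ boils down to: the standard MIDI NRPN mechanism splits a 14-bit value (0..16383, instead of 0..127) across two 7-bit data-entry controllers. A minimal sketch of the arithmetic (this helper is purely illustrative; whether a given plugin actually listens for NRPNs is another matter):

```python
def nrpn_messages(param_msb, param_lsb, value14, channel=0):
    """Encode a 14-bit value as the four Control Change messages of an
    NRPN write: CC 99/98 select the parameter, CC 6/38 carry the data."""
    assert 0 <= value14 <= 16383, "NRPN data is 14-bit"
    status = 0xB0 | (channel & 0x0F)   # Control Change status byte
    return [
        (status, 99, param_msb),        # NRPN parameter number MSB
        (status, 98, param_lsb),        # NRPN parameter number LSB
        (status, 6,  value14 >> 7),     # Data Entry MSB (upper 7 bits)
        (status, 38, value14 & 0x7F),   # Data Entry LSB (lower 7 bits)
    ]
```

So every single parameter move costs four CC messages instead of one, which is part of why I call it a hack for this use case.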
How often do you really need that precise an LFO pattern with really high controller resolution, though? The examples given above were simply to oscillate a knob in a VST a bit with a given pattern/amplitude over a given time envelope. My ears aren’t good enough to detect inaccuracies of 1 ms or less, and 128 steps is usually plenty of resolution for any effect my monitors are capable of translating and my ears are capable of detecting.
Obviously YOU have situations where you DO need that kind of precision and resolution…so, for that kind of precision requirement, I personally would not be using a VST effect plugin directly in a Cubase mixer slot anyway. My first inclination would be to design the sound in a virtual synth instead (HALion 6 in my case) using its built-in LFO and effect features. If I really needed a hybrid mix of plugins to get the job done…with Cubase in mind, I’d wire it up inline in a Bidule instance (or some other similar ‘host in a host’ tool).
Hey, I’ve got an old school workflow. I don’t try to force the DAW to shape my sounds when all that can be done in the ‘instruments’ themselves. I’ve been doing it that way for so long that I never really missed, or cared, that Cubase could not do it. Having said that, I do wish Cubase had more native automation lanes for more things…such as arming/disarming tracks, transport controls, etc.
Personally, I’ve never needed to oscillate a parameter living somewhere in a Cubase mixer effect slot via LFO on a long ‘audio track’. I can see the creative potential in doing so, but I’ve got plugins in my toolbox that can usually do that sort of phasing/pumping/side-chaining in their own right (no need to pump a control via automation). I grab the right plugin that generates the effect I have in mind, instead of trying to force some other plugin to do something tricky. I tend to think of a mix from the subtleties of my instruments up, instead of from the host down, and I know I usually have a plugin at hand that can internally generate the kind of oscillating effect I might have in mind. Maybe it’s not the most efficient way in today’s cutting-edge studios…but oh well…old habits die hard.
Using Bidule is a bit of a different topic, of course…as that’s a third-party app that isn’t free, nor ‘cheap’, and has a pretty steep learning curve. In short, with Bidule I’d build a chain that includes a sample-accurate LFO generator plus the effect plugin(s) I want to automate with it, and set up a real VST automation lane to drive it from Cubase if automated user tweaks are required for some reason.
Bidule is a staple sound design and routing tool for me (my favorite Swiss Army knife) with any DAW I use, simply because it fills so many ‘feature gaps’ that can be missing from any given DAW. It’s a powerhouse for intercepting the data streams (both MIDI and audio) at almost any point in a system configuration, analyzing them, and manipulating them. It also fills a lot of gaps in popular synth/sampler engines, and allows me to merge and meld any number of plugins into one massive ‘seamless instrument’. I can snoop on and alter pretty much any audio or MIDI event in real time with extreme precision, mix and match any assortment of plugin formats I like (VST, VSTi, AU, etc.), and automate any of it via VST, MIDI, OSC, etc. With a bit of practice, Bidule isn’t a bad little synth/sampler engine in its own right.
MIDI delay in a loopback that is detectable by the human ear? I believe you, but it’s not something I’ve had an issue with when running remote maps via a MIDI track in this manner. It’s pretty doggone well in sync with the transport, and there are settings tweaks to match up live controller manipulation with ASIO driver latency. I’ve no need to ‘push’ such a setup to sample-accurate precision, since I have instruments like Bidule and HALion 6. For the types of projects I do in Cubase, it’s not often I’d want an LFO on a VST effect control/parameter that lives in an effect slot on a Cubase mixer anyway, and why it would need to be ‘sample accurate’ to oscillate some virtual knob in a VST effect is a new one to me. For something that really needs that kind of sample-by-sample precision…I personally would design the sound in HALion from the top down (or with Bidule if I needed a hybrid mix of synths/samplers/effects) anyway…as you can pile on all the LFOs you like (plus crossfade among them, and so much more)…you could even drive them with side-chain audio signals, etc.
Having droned on with all this…
I agree…Steinberg should add a simple LFO generator for writing VST events on an automation lane. It should be simple enough to do, right? Judging by this thread, a significant number of users out there ‘want to do this’ for some reason.