I'm going to file this under "examples of Yamaha doing the right thing" (Steinberg is owned by Yamaha)
previous examples:
* Yamaha saved Korg by buying it when it was in financial trouble and giving it a cash injection, only to then sell it back to its previous owners once they had enough cash[1].
* Yamaha acquired Sequential in the '80s (for those not familiar: Sequential Circuits is one of the most admired synthesizer makers). Many years later, Sequential's founder Dave Smith established a new company under a different name, and in 2015 Yamaha decided to return the rights to use the Sequential brand to Smith as a gesture of goodwill on Sequential's 40th anniversary (this was also thanks to Roland's founder Ikutaro Kakehashi, who convinced Yamaha it would be the right thing to do) [1][2][3]
[1] https://www.soundonsound.com/music-business/history-korg-par...
[2] https://www.gearnews.com/american-giants-the-history-of-sequ...
[3] https://ra.co/news/42428
Yamaha is an old company founded on a very different ethos compared to others. Their history is interesting, too: https://www.youtube.com/watch?v=y6t5F3cb810
It's worth a watch.
On another note, it's very telling that companies that protect their "hey! we do this interesting thing, gonna buy?" character survive much longer than companies which say "we can earn a ton of money if we do this".
The companies in the second lot do a lot of harm to their ecosystems just to keep existing.
For an unrelated (musical) data point: I've had some impressive customer service from Yamaha concerning decades-old saxophones that have zero prospect of generating future revenue for them.
Japanese companies don't give you good service because of future revenue prospects.
They give good service because they respect the thing they built and the person who cares for it. They want their products to live on and bring joy to the people who guard and care for them.
This is a completely different and much deeper philosophy.
This is not limited to Japanese companies though. A Lamy representative told me that the factory in Germany restored an out of production fountain pen to mint condition, for free.
Want a bit for your early 80s Mercedes? Your local dealer probably has it, or knows where to get it.
Want a bit for your early 50s Mercedes? Your local dealer knows a guy in Stuttgart who will send it over.
Want a bit for your early 30s Mercedes? Your local dealer knows a guy in Stuttgart who knows a guy who will *just go and make you one*, albeit priced accordingly.
Yamaha still document their products properly and provide very long driver support. I currently have a Yamaha product with USB from 1999 and there is still a maintained driver for it 26 years later, for Windows 11 and modern macOS versions.
I wouldn't depend on them to do this for all their products, especially software. I can think of two off the top of my head for which they dropped support.
customer: hello, I want to buy a piano please
yamaha: sure, here you go
customer: great, thanks! lol, I also need a motorcycle. Do you know where I can buy a good one?
yamaha: you're not gonna believe this...
Completely separate companies, both called Yamaha. One was spun off from the other, but I don't think there was ever a time when the same company sold both. (Basically, the musical instrument company was redirected to making war materiel during WWII. After the war, they didn't want to just throw away all of their new industrial capacity so they spun off a company to make use of all their new equipment and expertise and then went back to making instruments.)
Although the Yamaha that makes music and audio products is the same Yamaha that makes golf clubs (https://global.golf.yamaha.com/en/) and industrial equipment (https://www.yamahafinetech.co.jp/en/).
The OG Yamaha produced a motorcycle in 1954, the YA-1. That success then led to the spin off.
(fun fact: the motorcycle Triumph and the undergarment Triumph are two entirely different companies that just happen to share the same name)
A motorcycle named Norton Commander also exists, and Nokia* sold winter bicycle tires with studs on them so they would grip better on ice and snow.
Not sure about the meaning of your asterisk, but the Nokian Tyres corporation is not related to Nokia the telecoms co, other than being founded in the same town.
Nokia did manufacture rubber boots though, before they spun off the footwear division in 1990 and went all in on electronics.
Nokian studded tires for bicycles are (were?) the best! Rode many, many kilometers in winter with them!
Triumph is also a garment brand? Never heard of it.
I had no idea you've never heard of it. Thanks for keeping us informed.
It’s also a Wonder Dog, a Canadian power trio not featuring Neil Peart, and a moment when we shouldn’t evacuate the Death Star.
> Wonder Dog
I think you meant Insult Comic Dog.
I guess we were too good at Triumphing…
Same company
They may have “different legal entities” but it’s the same.
https://en.wikipedia.org/wiki/Yamaha_Corporation
I didn't realize when I was a kid that the Yamaha music company came first.
I remember being confused when looking at high end saxophones that one was made by an old French company (that made sense, France makes many fine luxury goods including instruments) and the other was (in my mind) made by a motorcycle company. How could a motorcycle company possibly have compiled the expertise to make high end musical instruments when most musical instrument companies were chasing the low end of the market at the time?
But Yamaha music (1887) was started only 2 years after Selmer (1885). They got their start making reed organs. Reed organs (1) are technical, (2) make sound with reeds, and (3) are luxury items. So their expertise in sax (a reed instrument) and synthesizers (technical keyboard instruments) makes a ton of sense.
I love Yamaha Motor using the tuning forks as their logo. It's a proper beautiful old-timey logo (well, from 1967, apparently, but anyway) and it's just so weird seeing them on a motorcycle.
https://www.yamaha.com/en/about/history/logo/
Just pretend it is the thing that holds the front wheel.
Apparently they also make network switches. I guess it is in support of their computerized audio equipment, but I was a bit surprised when they showed up in a search result.
https://usa.yamaha.com/products/proaudio/network_switches/in...
They have the technical capability to design one, but on the surface it is far enough outside their core product line that I wonder if it is an OEM rebadge.
They claim they've been developing routers since 1995, and they're widely used domestically in Japan (SME and SOHO).
Looks like switches came in 2011, and there's some secret sauce which makes them autoconfigure each other to reduce networking setup.
It might not be standard OEM stuff.
Citing the page:
> Yamaha entered the router business in 1995, and has grown to hold a significant share of Japan’s small to medium enterprise and SOHO network market. Yamaha gigabit L2 switches that could be linked to Yamaha routers/firewalls were introduced in 2011, with features that significantly reduced the network setup, maintenance, and management workload.
They make Dante-licensed products.
Huh, thanks for sharing. Funny to see the switches in the same color scheme as some of their receivers.
See also Hitachi ;-)
Customer: "So, I need some huge IGBTs for an electric train motor, I need a 44-tonne excavator to lift the train, I need a new stereo to listen to while I fix it, and I need an, uhm, 'personal massager' to relax afterwards"
Sales guy: "Here's our catalogue, page 40, page 32, page 108, and page 7. Let me know what colours you want."
I don't think Yamaha Motor produces any large trucks. They do a lot of things, but mostly motorcycles, ATVs, boat engines, even car engines, but not whole cars.
Also, you should note that Yamaha Corporation, the musical instrument maker, and Yamaha Motor are now two distinct, independent companies, even if they were originally part of the same group.
They are independent, yes, but originally the motor company was an affiliate spin-off. They do have an agreement and share the same logo, and Yamaha Corporation still has some shares in the Motor one, though.
a former keiretsu?
Former, current, reformed keiretsu?
Had no idea it was bought by Yamaha.
Still, they're a separate legal entity, and their HQ, development, support, etc. are still located in Hamburg, as they have been since the early-to-mid 1980s when they released their MIDI sequencing software for the Atari ST (Steinberg Pro 24, I believe it was called?). I guess you could do worse than being bought by Yamaha, but I think this decision isn't related to it.
Interesting stuff. I love Yamaha for audio gear for sure; didn't know they owned Steinberg though.
Their speakers, I think, are lovely examples of their engineering quality. Great and honest sound, some of the best out there, and they are not super overpriced. Also, they are super repairable. I had some really bad experiences with other brands which were more expensive for a more biased sound, had 'black gunk' over the PCBs as some kind of anti-repair mechanism (it overheats the boards too! ew!), and other crappy issues.
Cool to hear there's such a story behind the quality. Makes sense!
Yet, strangely, Yamaha has never released a software version of any of their synths. (There's S-YXG50 and the Montage one, but I wouldn't really count those.)
What really kills me about companies (and maybe Yamaha is a little different, or rather drastically so) is that any time the CEO changes, or the original founding CEO is swapped out, the company culture changes too drastically. There are companies whose original culture I admired, and then the CEO shifts and it's just meh, or worse.
Great recap!
The direct result of the newer, open CLAP format being objectively better in every way. Steinberg has gone to great lengths to force adoption of the trash that is VST3 and retain its stranglehold on the audio world, including but not limited to takedowns of distributors, takedowns of VST2.4 SDKs, constant threats of legal action against independent VST2.4 developers forcing them to remove purchases from customers, and funding particular plugin frameworks & DAW developers to slow CLAP adoption.
Congrats to them for making it happen, but it feels like they were pushed into it, since CLAP was brought forward quite successfully [1]
[1] https://u-he.com/community/clap/
I do a fair bit of music and have never seen a CLAP plugin in the wild
Very useful for all the existing plugins though, especially if any want to become open source.
How has CLAP adoption been? Do the popular plugins out there generally provide a CLAP version nowadays?
All the commercial ones I've bought in the past year or so do, and ever since I think JUCE 7 there have been good libraries for open source projects that want to add the format.
I think there's still a lot of bad feeling about the fact that there are many VST2 plugins that are open source but nonetheless illegal (or at least tortious) to build.
Hopefully this provides a path for those VST2 plugins.
No. VST2 has nothing in common with VST3 despite the similar name.
Why? In JUCE, isn't that just a matter of choosing multiple build targets?
If you're using JUCE and not using any of the VST2 features removed in VST3.
If you have the source code and use JUCE then yes, you can convert the plugin to VST3. But if you don't use a framework then you need to port the code manually.
I see more and more brands not only adopting CLAP but also offering Linux versions of their plugins. The adoption is slow but that's expected with a relatively new format but it certainly grows.
There is a list of software with support here https://clapdb.tech/
Maybe Steinberg is getting ready to add CLAP to their software?
Wow, didn't realize u-he grew so big. I remember them from Zebra days.
They aren't big - but they are bigger than when it was just Urs and a couple of guys turning out plugins.
They became popular on the back of Diva, and Hans Zimmer using Zebra (he's very fulsome in his praise whenever he mentions u-he in interviews).
They make a lot of great virtual instruments nowadays. Diva being a big one.
clap is way better
CLAP doesn't allow describing a plugin in a manifest (like VST3 and LV2 do). A manifest allows hosts to scan for plugins faster.
Also, CLAP uses 3 or 4 methods to represent MIDI data (MIDI1, MIDI1 + MPE, MIDI2, CLAP events). This requires writing several converters when implementing a host.
But CLAP is much simpler and doesn't use a COM-like system (VST3 resembles a Windows COM library with endless interfaces and GUIDs).
Also, the VST3 interfaces in the SDK are described as C++ classes with virtual functions (example: [1]), and I wonder how they achieve portability, since the layout of such classes (vtables) is not standardized and may differ across compilers.
[1] https://github.com/steinbergmedia/vst3_pluginterfaces/blob/3...
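For readers who haven't opened those headers, the pattern being described looks roughly like the following. This is a simplified sketch of the COM-like style (invented names, not the actual SDK declarations): every interface is a class of pure virtual methods identified by a GUID, and binary compatibility across compilers depends on them all producing the same vtable layout for it.

    // Simplified imitation of the COM-like pattern (not the real VST3 headers).
    #include <cstdint>

    typedef int32_t tresult;
    struct TUID { uint8_t data[16]; };   // 16-byte interface GUID

    // An IUnknown-like base with exactly three predefined virtual calls.
    // Plugins and hosts built by different compilers can only interoperate
    // if both lay out this vtable identically.
    class FUnknownLike {
    public:
        virtual tresult  queryInterface(const TUID& iid, void** obj) = 0;
        virtual uint32_t addRef() = 0;
        virtual uint32_t release() = 0;
    };

    // A concrete interface adds its own virtual methods after the base ones.
    class IAudioProcessorLike : public FUnknownLike {
    public:
        static const TUID iid;                        // GUID used with queryInterface
        virtual tresult setupProcessing(double sampleRate, int32_t maxBlockSize) = 0;
        virtual tresult process(float** buffers, int32_t numSamples) = 0;
    };

In practice (as discussed below) this works because the platform ABIs in use today, MSVC's on Windows and the Itanium C++ ABI elsewhere, happen to lay such vtables out compatibly, not because the C++ standard guarantees it.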
> Clap doesn't allow describing plugin in a manifest (like VST3 and LV2 do). This allows to scan for plugins faster.
VST3 only recently gained the `moduleinfo.json` functionality and support is still materialising. Besides, hosts generally do a much better job of only scanning new plugins or ones that have changed, and hosts like Bitwig even do the scanning in the background. The manifest approach is cool, but in the end, plugin DLLs just shouldn't be doing any heavy lifting until they actually need to create an instance anyway.
> Also, CLAP uses 3 or 4 methods to represent MIDI data (MIDI1, MIDI1 + MPE, MIDI2, CLAP events). This requires to write several converters when implementing a host.
I've not done the host-side work, but the plugin-side work isn't too difficult. It's the same data, just represented differently. Disclaimer: I don't support MIDI2 yet, but I support the other 3.
On the other side, VST3 has some very strange design decisions that have led me to a lot of frustration.
Having separate parameter queues for sample-accurate automation requires plugins to treat their parameters in a very specific way (basically, you need audio-rate buffers for your parameter values that are as long as the maximum host block) in order to be written efficiently. Otherwise, plugins basically have to "flatten" those queues into a single queue and handle them like MIDI events, or alternatively just not handle intra-block parameter values at all. JUCE still doesn't handle these events at all, which leads to situations where a VST2 build of a JUCE plugin will actually handle automation better than the VST3 build (assuming the host is splitting blocks for better automation resolution, which all modern hosts do).
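To make that "flattening" concrete, here is a rough sketch of the second approach (hypothetical types standing in for the host-provided queues, not the actual SDK structures): collapse every per-parameter queue into one time-ordered list so the points can be interleaved with note events while rendering the block.

    #include <algorithm>
    #include <cstdint>
    #include <utility>
    #include <vector>

    struct ParamPoint { int32_t sampleOffset; uint32_t paramId; double value; };

    // One queue per automated parameter, standing in for what the host hands over.
    struct ParamQueue {
        uint32_t paramId;
        std::vector<std::pair<int32_t, double>> points;   // (sample offset, normalized value)
    };

    // Merge all queues into a single list sorted by sample offset, so the
    // plugin can walk it like a MIDI event stream during processing.
    std::vector<ParamPoint> flatten(const std::vector<ParamQueue>& queues) {
        std::vector<ParamPoint> events;
        for (const auto& q : queues)
            for (const auto& [offset, value] : q.points)
                events.push_back({offset, q.paramId, value});
        std::sort(events.begin(), events.end(),
                  [](const ParamPoint& a, const ParamPoint& b) { return a.sampleOffset < b.sampleOffset; });
        return events;
    }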
duped's comment about needing to create "dummy" parameters which get mapped to MIDI CCs is spot-on as well. JUCE does this. 2048 additional parameters (128 controllers * 16 channels) just to receive CCs. At least JUCE handles those parameters sample-accurately!
There's other issues too but I've lost track. At one point I sent a PR to Steinberg fixing a bug where their VST3 validator (!!!) was performing invalid (according to their own documentation) state transitions on plugins under test. It took me weeks to get the VST3 implementation in my plugin framework to a shippable state, and I still find more API and host bugs than I ever hit in VST2. VST3 is an absolute sprawl of API "design" and there are footguns in more places than there should be.
On the contrary, CLAP support took me around 2 days, 3 if we're being pedantic. The CLAP API isn't without its share of warts (the UI extension in particular should be clearer about when and how a plugin is supposed to actually open a window), but it's succinct and well-documented, the rough edges are surmountable, and anecdotally I have only had to report one (maybe two) host bugs so far.
Again, disclaimer: I was involved in the early CLAP design efforts (largely the parameter extension) and am therefore biased, but if CLAP sucked I wouldn't shy away from saying it.
>Also, CLAP uses 3 or 4 methods to represent MIDI data (MIDI1, MIDI1 + MPE, MIDI2, CLAP events)
Contrast with VST3, which doesn't support MIDI at all, unless you count creating thousands of dummy parameters hardcoded to MIDI controller numbers as "support."
VST3 uses proprietary events for things like note on/off and note expressions. As for MIDI controllers, the host is supposed to convert them to parameter changes.
This makes sense if you want to map a controller to a plugin parameter in a DAW. However, if you want to write a "MIDI effect", which transforms incoming MIDI data for controllers, it would be difficult.
Also, it is interesting that VST3 has an event for note expression and a separate event for polyphonic pressure, although the latter could be considered a note expression.
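To illustrate the dummy-parameter workaround mentioned upthread (my own sketch, not JUCE's or the SDK's code), the mapping boils down to an index computation over 16 channels × 128 controllers, which is where the 2048 extra parameters come from:

    #include <cstdint>

    // 16 MIDI channels * 128 controllers = 2048 proxy parameters exposed
    // solely so the host can forward incoming CCs as parameter changes.
    constexpr uint32_t kNumChannels     = 16;
    constexpr uint32_t kNumControllers  = 128;
    constexpr uint32_t kFirstCCParamId  = 10000;   // arbitrary base id for this sketch

    constexpr uint32_t ccParamId(uint32_t channel, uint32_t controller) {
        return kFirstCCParamId + channel * kNumControllers + controller;
    }

    // The plugin's parameter-change handler reverses the mapping to recover
    // the original (channel, controller) pair and re-synthesize the CC.
    struct CC { uint32_t channel, controller; };

    constexpr CC paramIdToCC(uint32_t paramId) {
        const uint32_t index = paramId - kFirstCCParamId;
        return { index / kNumControllers, index % kNumControllers };
    }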
> As for MIDI controllers, the host is supposed to convert them to parameter changes
And nearly everyone except Steinberg considers this to be a mistake. MIDI messages (CCs, pitch bend, and so on) are _not_ parameters.
They are COM classes. The vtable layout for them is specified.
I don't think GCC has a special case for handling COM classes. However, I found that GCC uses the "Itanium C++ ABI" on Linux, which specifies a vtable layout that happens to match the layout of COM classes. However, it is not guaranteed (for example, by the C++ standard) that other compilers use the same layout.
The ABI is stable everywhere VST3s are used. It has to be or nothing would work.
Everything would work except for VST3, if written according to standards.
Not really; VST3's COM-like API just uses virtual methods. They don't guarantee layout to the same degree actual COM does with compiler support; they simply rely on the platform ABI being standardized enough.
You would have thought they'd learned from their mistakes implementing VST2, but they doubled down, going even further by basing VST3 on the Windows Component Object Model. I guess it was a decision to avoid reinventing the wheel, but you quickly realize it is a very bad model for real-time audio plugins and audio host support. The API just exploded in complexity, and testing was a nightmare. In contrast, you can tell the u-he developers have all the experience from the trenches.
> it is a very bad model for real time audio plugins and audio host support
COM is just 3 predefined calls in the virtual table. CLAP gives you a bunch of pointers to functions, which is similar.
> COM is just 3 predefined calls in the virtual table.
COM can be as simple as that on the implementation side, at least if your platform's vtable ABI matches COM's perfectly, but it also allows far more complicated implementations where every implemented interface queried will allocate a new distinct object, etc.
I.e., even if you know for sure that the object is implemented in C++, and your platform's vtable ABI matches COM's perfectly, and you know exactly what interfaces the object you have implements, you cannot legally use dynamic_cast, as there is no requirement that one class inherits from both interfaces. The conceptual "COM object" could instead be implemented as one class per interface, each likely containing a pointer to some shared data class.
This is also why you need to do the ref counting with respect to each distinct interface, since while it is legal from an implementation side to just share one ref count for it all, that is in no way required.
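A tiny made-up example of what that allows (invented interfaces, nothing from the actual SDK): the conceptual "object" below is two C++ objects, one per interface, sharing state, so a dynamic_cast from one interface to the other could never succeed and only queryInterface can bridge them. A shared_ptr stands in for the per-interface addRef/release bookkeeping here.

    #include <cstring>
    #include <memory>

    struct IFoo { virtual void* queryInterface(const char* iid) = 0; virtual void foo() = 0; virtual ~IFoo() = default; };
    struct IBar { virtual void* queryInterface(const char* iid) = 0; virtual void bar() = 0; virtual ~IBar() = default; };

    struct SharedState { int value = 0; };

    struct BarImpl : IBar {
        std::shared_ptr<SharedState> state;
        void* queryInterface(const char*) override { return nullptr; }   // symmetric lookup omitted for brevity
        void bar() override { state->value -= 1; }
    };

    struct FooImpl : IFoo {
        std::shared_ptr<SharedState> state;
        std::shared_ptr<BarImpl> sibling;   // the other half of the conceptual COM object
        void* queryInterface(const char* iid) override {
            if (std::strcmp(iid, "IFoo") == 0) return static_cast<IFoo*>(this);
            if (std::strcmp(iid, "IBar") == 0) return static_cast<IBar*>(sibling.get());
            return nullptr;                 // interface not implemented
        }
        void foo() override { state->value += 1; }
    };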
The shift away from proprietary formats is long overdue.
As a composer and arranger working with different studios, I need multiple DAWs installed for compatibility. Every time I open my DAW or Gig Performer after a few days, it rescans all plugins. With around 800 installed, that happens across AU, VST, and VST3.
I hope Apple and Avid are holding meetings after this decision to help simplify the life of library/plugin makers. As an example, AAX requires a complete mess to compile and test plugins, and several AU plugins are just wrappers around VST that add another layer.
I really hope the next five years bring real standardization and smoother workflows.
This is technical people at their finest! There couldn't be any news more important than this—or more anticipated by the community. For so many years people wished for this, and they announce it this low-key in a forum! This is so awesome. Thanks to Steinberg & Yamaha; I expect a lot of good will come out of it.
There is a lot of good news in open source audio these days. Also see this video presenting the work done and planned for the future version 4 of Audacity: https://www.youtube.com/watch?v=QYM3TWf_G38
Funnily enough, that video talks about the pain of implementing a VST3 host at around the 25 minute mark. "If you're planning on doing it, set aside a lot of time."
I have my own VST3 host. It's not really that difficult. The real problem is that there are a lot of plugins that do some random thing that won't work because it's not standard.
IIRC that's also what is said in the video.
Try implementing the same without VST3 technology.
If you're planning to do that, set aside a lot of time...
I just hope they don't try to sneak in Google Analytics again.
I’m keeping a very close eye on Audacity 4, mostly to make sure they don’t try something again lol
I still get customers requesting that I distribute VST(2) builds. Some old DAWs and apps still can’t load VST3. Thus far it hasn’t been possible due to licensing restrictions with commercial plug-ins, imposed by Steinberg.
I wonder if that will change as well..
As a complete audio outsider, my observations are:
1. Great news! VSTs seem to fill an important role in the audio-processing software world, and having them more open must be a good thing.
2. From the things they mention, the SDK seems way larger than I had imagined, but that is normal for (software) things, I guess. "This API also enables the scheduling of tasks on the main thread from any other thread." was not easy to unpack nor see the use of in what was (to me) an audio-generation-centered API.
3. The actual post seems to be somewhat mangled, I see both proper inline links and what looks like naked Markdown links, and also bolded words that also have double asterisks around them. Much confusing.
> the SDK seems way larger than I had imagined, but that is normal for (software) things, I guess. "This API also enables the scheduling of tasks on the main thread from any other thread." was not easy to unpack nor see the use of in what was (to me) an audio-generation-centered API
VST plugins almost all have a GUI, thus the VST SDK has to support an entire cross-platform UI framework... This threading functionality is mostly about shipping input events/rendering updates back and forth to the main (UI) thread
There is no single UI framework in VST. The plugin API only has interfaces for creating/destroying/resizing a GUI window. You are not required to use VSTGUI.
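The shape of that surface is roughly the following (hypothetical names, not the real IPlugView declaration): the host hands the plugin a native parent handle (an HWND, NSView*, or X11 window id) and the plugin draws whatever it likes inside it, with whatever toolkit it likes.

    // Hypothetical sketch of a plug-in view interface.
    class PlugViewLike {
    public:
        virtual bool attach(void* nativeParentWindow) = 0;    // create the UI inside the host-provided window
        virtual void detach() = 0;                            // tear the UI down
        virtual void onResize(int width, int height) = 0;     // host resized the window
        virtual ~PlugViewLike() = default;
    };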
For context, the variation in UI between VSTs is pretty large and tends to be very creative, much like UI in games.
Correct, it just hands you native handles. What you do with them is up to you.
JUCE is a popular UI framework (at least it was 10 years ago). But I've seen people somehow put Electron apps into a VST.
Oh man, this is really starting to look like a plague.
And yet here we are discussing the value of using C++ vs other languages for real time audio processing.
Audio is often processed on a separate thread than the UI. If memory serves (been a while) there's the UI portion and the audio engine portion of most VSTs, which can be booted together or independently. So threading is very important.
Yeah, I realized that once I finished writing my comment, that it might be about communicating with the UI since UI toolkits are usually not thread-safe enough. Thanks.
Some background is needed for the thread API
The basic threading model for plugins is the "main" and "audio" threads. The APIs specify which methods are allowed to be called concurrently from which thread.
There is also a state machine for the audio processing bits (for example, you can guarantee that processing won't happen until after the plugin has been "activated", and it won't go from a deactivated state to processing until a specific method is called - I'm simplifying the VST3 state machine significantly).
The "main" thread is the literal main/UI thread of the application typically, or a sandboxed plugin host running in a separate process. You do your UI on this thread as well as handle most host events.
Plugins often want to do things on background threads, like stream audio from disk or do heavy work like preparing visualization without blocking the main UI thread (which also handles rendering and UI events - think like the JS event loop, it's bad to block it).
The threading model and state machine make it difficult to know where it's safe to spawn and join threads. You can do it in a number of places, but you also have to be careful about the lifetimes of those threads; most plugins spawn them as early as possible and then shut them down as late as possible.
The host also has to do a lot of stuff on background threads and usually has its own thread pool. CLAP introduced an extension to hook into the host's thread pool so plugins don't have to spawn threads and no longer really have to care about the lifetimes. VST3 is copying that feature.
When you see annotations on methods in these APIs about "main" vs "any" thread and "active" etc they're notes to developers on where it is safe to call the methods and any synchronization required (on both sides of the API).
If it sounds complicated that's because it is, but most of this is accidental complexity created by VST3.
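A rough sketch of what those annotations end up looking like on the plugin side (invented names, not any particular SDK): the comments carry the threading contract, and an activation flag guards the transition into processing.

    #include <atomic>

    class PluginLike {
    public:
        // [main thread] State save/load, editor, parameter edits from the host.
        void setState(const void* data, int size) { (void)data; (void)size; }
        void openEditor(void* nativeParentWindow) { (void)nativeParentWindow; }

        // [main thread] Called before processing may start; the safe place to
        // allocate buffers or spawn background/worker threads.
        void activate()   { active.store(true, std::memory_order_release); }
        void deactivate() { active.store(false, std::memory_order_release); }

        // [audio thread] Real-time: no locks, no allocation, no I/O.
        void process(float** buffers, int numSamples) {
            if (!active.load(std::memory_order_acquire))
                return;                        // the state machine says this shouldn't happen; be defensive anyway
            (void)buffers; (void)numSamples;   // ... render audio ...
        }

    private:
        std::atomic<bool> active{false};
    };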
Can anyone fluent in all this tell me whether this will make using Arturia's [zillions of] plugins trivial on Linux distros?
That's very interesting news. Definitely brought on by CLAP as others have mentioned, but it's interesting to see how this evolves. VST is a pretty complicated standard to support whereas CLAP is much simpler, although the former is much more widely used.
Like 1 in 200 plugins supports CLAP, whereas 100% support VST, so if they can do it more easily and with less licensing burden, and even get some community contribution, that would be big.
It will be a while, if ever, before most plugins get the CLAP (pun intended).
Most plugins don't use these APIs directly, they rely on wrappers like JUCE. Once JUCE supports CLAP the plugins will follow. That should happen in the next year.
The bigger problem is hosts. While Apple and Avid will probably never support CLAP, everyone but Ableton does. They move slower than the rest of the industry (taking a decade or so to implement VST3). Which is odd, because CLAP is significantly easier to use from both the host and plugin side.
That said, you can wrap a CLAP plugin as a VST3 or AU today. It's probably the lowest-friction way to do it, to be honest.
CLAP might be similar to AU in plugin support which is pretty common too.
CLAP is nowhere near close to AU in adoption.
Almost all VST plugins have an AU version (like 80%-90% or so, and 99% of the major ones).
Almost no VST plugins have a CLAP version (like 1%-5%, and that's charitable).
I know it's nowhere near that in adoption NOW. I meant that it shows there is room for another format, and AU is a good example of how another format can make inroads.
I’d really like to see more plugins available in the LV2 format for my Ardour RT DAW. Also, a quick recommendation: LSP (Linux Studio Plugins), an excellent collection of open source plugins supporting CLAP, AU, LV2, VST2, VST3, LADSPA, and standalone JACK versions https://lsp-plug.in/
Since I never agreed to the VST3 SDK terms, which required you to give up your license to VST2, does this mean I can finally make VST3 plugins without losing the ability to publish VST2?
You'd have to use this version (not sure if they back-licensed old versions) but MIT would mean you wouldn't have to agree to that draconian licensing.
Most probably a response to CLAP gaining popularity. But they buried the lede with Wayland support. This puts VST3 ahead of CLAP in that regard.
Well done Steinberg/Yamaha.
At the same time, Steinberg also open-sourced their ASIO audio hardware interface standard, but under GPL3. GPL2 would have made more sense to me, to align with the Linux kernel's GPL2-only licensing. So why GPL3? Other commenters here have mentioned OBS, and OBS is "GPLv2 or later", so sure, that works for them. Not being GPL2 and missing out on the Linux kernel just surprises me.
I have been using the nice cwASIO (https://github.com/s13n/cwASIO) re-implementation of the ASIO SDK; it's MIT licensed. It's nice just to see something more up to date than the ancient ASIO SDK documentation. I would love to see the Steinberg ASIO SDK updated and improved. If you are listening, Steinberg folks: nobody cares about the history of ASIO on Macs or Silicon Graphics workstations; just dive in and get deep into the weeds of ASIO on Windows, and include lots more sample code, especially covering the ASIO device enumeration mess on Windows.
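For anyone curious what that enumeration involves: as I understand it (a sketch only; double-check against the cwASIO sources), ASIO drivers register themselves as COM servers under HKEY_LOCAL_MACHINE\SOFTWARE\ASIO, one subkey per driver with a CLSID value pointing at the driver DLL, so listing devices is essentially a registry walk plus instantiating each CLSID.

    #include <windows.h>
    #include <iostream>

    int main() {
        HKEY asioKey = nullptr;
        // Assumed registration location for installed ASIO drivers.
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, "SOFTWARE\\ASIO", 0, KEY_READ, &asioKey) != ERROR_SUCCESS)
            return 1;

        char name[256];
        for (DWORD i = 0;; ++i) {
            DWORD nameLen = sizeof(name);
            if (RegEnumKeyExA(asioKey, i, name, &nameLen, nullptr, nullptr, nullptr, nullptr) != ERROR_SUCCESS)
                break;                                    // no more driver subkeys
            std::cout << "ASIO driver: " << name << "\n"; // real code would also read the CLSID value
        }
        RegCloseKey(asioKey);
        return 0;
    }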
Before this, the VST3 code was under a GPL3 license, and GPL2 software (like LMMS) couldn't use it.
The license change could also have been prompted by competition from CLAP, which is very openly licensed.
Wow, after all these years. This is a very Good Thing. You could get access to it before but you had to sign a very long agreement and it was always a PITA.
Steinberg is only going to benefit from this, I think.
This is fantastic.
Audio programming is still low level and difficult, but I'm looking forward to vibe coding some experiments with this.
I like to work privately on my open source projects, and then push them to a public repo after deleting my git history (for the public repo, anyway).
Oh wow, finally! They should have done this 20 years ago but this is awesome news.
No more takedown notices for including the legacy VST SDK in your GitHub project? Wait, mine was 2.4, so I guess Steinberg would still chase me down if I hadn't complied already.
Having only used VSTs but never even looked into how they're actually built - what does this now mean in simple terms? Did you need a specific closed source framework to build them or something like that? What has changed now?
You had to accept some license terms before you could download the VST SDK. When Linux audio started to get "serious" 20 years ago, it was a commonly discussed pain point.
Concretely, it made distributing OSS VST plugins a pain, especially for Linux distros, which generally want to build their own packages.
Note that this was the VST2 era. VST3 was commercial license or GPL3, which was an improvement, but only slightly, because it excluded open-source software released under GPL2, and MIT/BSD/whatever-licensed software couldn't use it either (without effectively turning the whole thing into GPL-licensed software).
This simplifies a ton of things.
CLAP supports polyphonic modulation
Wow!! What a huge and wonderful change. Hats off, Steinberg.
Why are we still centralizing open source on Microsoft's GitHub? Haven't we learned the risks of giving one corporation, especially one with a such a shady history, exclusive control over the world's open source activity?
Because they don't have exclusive control, unlike social media where you can't take your data and move it to another provider, you can just take your repo to whichever provider or self-hosted GitOps option you want.
Not only can you take it with you, every developer already has a local copy of the entire repository.
The real value IMHO of github is the issue tracker and the visual diff/display of PR changes.
dupe: https://news.ycombinator.com/item?id=45665752