Time travel? Or, just clever technology

(syncdna.com)

62 points | by yabones 3 days ago

27 comments

  • k__ 9 hours ago

    Wouldn't this only work for DAGs?

    For example, you've got a drummer doing their thing.

    The bass can react to the drummer.

    The guitar and vocals can react to drummer and bass.

    Each one could get a finished version, but with so much delay that they can't meaningfully react to anyone coming after them or at the same level.
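
    A rough sketch of the constraint in Python (all numbers and names are mine, purely illustrative):

      # each performer only hears performers strictly upstream in the DAG,
      # so their delay is the longest upstream path plus one network hop
      hops = {"drums": [], "bass": ["drums"],
              "guitar": ["drums", "bass"], "vocals": ["drums", "bass"]}
      delay_ms = {"drums": 0}  # the drummer hears no one, plays to a click

      def total_delay(part, hop_ms=150):  # assume ~150ms per hop
          if part not in delay_ms:
              delay_ms[part] = max(total_delay(p, hop_ms) for p in hops[part]) + hop_ms
          return delay_ms[part]

      print(total_delay("guitar"), total_delay("vocals"))  # 300 300: same depth, can't hear each other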

  • AshamedCaptain 11 hours ago

    Latency compensation. This is why most people do not perceive any latency or lip-sync issues when browsing YouTube on their phones with expensive Bluetooth headsets that have ridiculous amounts of latency (>250ms).

    • jama211 4 hours ago

      Err, you know they match the video to the audio delay in modern smartphone apps, right? The only time you actually experience latency is when you pause/play the media, or when you're trying to do something in real time (like recording with people around you). If you use AirPods and an iPhone or something, the video will not be out of sync with the audio.

      You can literally even see the video lag when you hit play as it ensures it syncs with the audio.
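
      A minimal sketch of that compensation, assuming the OS reports the audio sink's latency (the constant is just an example):

        # lip-sync compensation: hold each video frame back by the audio
        # path's reported latency so picture and sound line up at the senses
        AUDIO_SINK_LATENCY_S = 0.25  # e.g. what a Bluetooth headset might report

        def presentation_time(frame_pts_s: float) -> float:
            # schedule the frame for when its matching audio actually plays
            return frame_pts_s + AUDIO_SINK_LATENCY_S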

    • zenmac 9 hours ago

      Caching in the headset, yeah, like the sibling comments say. People who make music and sound stuff hate Bluetooth! And hate the companies that took the mini jack out of the phone.

      • reactordev 7 hours ago

        Oh man, any guitarist who comes by with those wireless Bluetooth connectors is always an eighth beat behind… no matter how we tweak it, they just can't play on time when using them.

        Plug them in directly, no problem.
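
        If "an eighth beat" means an eighth note at a typical tempo, that lines up neatly with the ~250ms Bluetooth figure mentioned upthread:

          bpm = 120
          eighth_note_s = 60 / bpm / 2  # a quarter note is 0.5s, an eighth is 0.25s
          print(eighth_note_s)          # 0.25 -- right in Bluetooth-latency territory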

    • kgwxd 9 hours ago

      It destroys game sounds and music-making apps, though. Guess these things are designed for pure consumption, no fun or creativity allowed.

      • ricardobeat 9 hours ago

        > these things are designed for pure consumption

        This is an old trope that needs to die. I used to play synths on a 1st-gen iPad back in 2010; it had amazing <5ms latency at a time when you'd struggle to hit that on a PC using external hardware.

        Wired headphones have always been around; even now, all it takes is a $5 adapter. Bluetooth "aptX Low Latency" has also been around for years, though adoption has been a bit slow. It is quite standard on Android phones. On the Apple side, AirPods have had decent ~100ms latency for a few years (enough for casual gaming), more recently a custom low-latency lossless wireless connection (Apple Vision only atm), and <20ms latency on the new iPhone 17 using Bluetooth 6.

        It really is a wireless technology problem. Bluetooth LE Audio only came around in 2020 and is barely adopted. Bluetooth 6 was announced late last year and is just starting to show up in devices now.

        • AshamedCaptain 7 hours ago

          > On the Apple side, AirPods have had decent ~100ms latency for a few years (enough for casual gaming)

          This needs a big citation. It has always been claimed that AirPods have no discernible latency, yet every time it's tested, the result is actually pretty subpar (>150ms).

          > Bluetooth "aptX Low Latency" has also been around for years, though adoption has been a bit slow. It is quite standard on Android phones

          Almost no Android phones support it. Anything Samsung, for example, is excluded, even when they use Snapdragon Sound chips.

          It is not really a technology problem, since I was doing 50ms latency with the plain old HFP profile on BT 1.x back in 2002, with a Nokia and the cheapest headset. Latency has been going up even though nothing really changed in the underlying technology (Classic Bluetooth Audio HFP/A2DP is practically unchanged since Bluetooth 2.x times, while LE Audio, introduced in BT 5.x, is used by almost no one, and its codec selection can be considered sabotage).

          The problems are (from an enthusiast & armchair analyst PoV):

          - Consumers don't care (YouTube works well, after all) and can't even measure it correctly. Manufacturers don't report latency in their specs.

          - Everyone has an incentive to make it subpar so that they can promote their proprietary solutions with vendor lock-in. Qualcomm/CSR is _especially_ guilty of this, and they dominate the BT headset industry. But literally everyone is doing it these days (Samsung, Sony, Apple, etc.). And even then, most of the time these techs provide negligible improvements in latency or quality (since, per #1, customers can't measure).

          - The Bluetooth SIG no longer has any remaining teeth (it never had many to begin with). They just rubber-stamp, and things barely interoperate with each other these days (e.g. the latest Sony headsets "support" LE Audio as per the logo, but on release could not talk with any of the existing LE Audio stacks).

          • CharlesW 6 hours ago

            > This needs a big citation.

            Here's one from 2022 (so AirPods Pro 2 and iOS 15 or 16): https://stephencoyle.net/airpods-pro-2

            "As you can see, the second-generation AirPods Pro perform about 40ms better than their predecessors, with an average latency of 126ms vs the original’s 167ms.

            "Perhaps a more interesting point to note is that the second-generation AirPods Pro perform only 43ms worse than the built-in speakers (at 83ms). That suggests that up to two-thirds of the time between touching the screen and hearing a noise occurs before Bluetooth data leaves the device. I think there’s still too much latency for audio feedback to feel snappy and responsive over AirPods Pro 2, but maybe at this point there are easier gains to be made by working to reduce the device-side latency."

            • ricardobeat 3 hours ago

              On the last point, most of that latency is from the touchscreen response; the audio system is capable of single-digit latency.

              Things may have gotten much worse recently, as I distinctly remember the iPhone around the 4s-6s era having <30ms latency, which was a huge advantage over Android.

              The Apple Pencil can also go under 10ms latency when drawing; surely there's a way of taking advantage of that for music apps?

        • vunderba 5 hours ago

          You can definitely do music recording on an iPad with WIRED headphones - I have an iPad Pro 10.5 that I use in conjunction with a 37-key MIDI keyboard, and it's great for travel.

          But there is NO world where you are doing realtime playback/recording even with very loose quantization with an iPad and BT headphones. Maybe some day, but that day is not now.

          Latency:

          - under 30ms = acceptable

          - 30-50ms = irritating, but you can work around it

          - 50-100ms = easily detectable when you press a key, and almost unusable

          - above 100ms = patently absurd

          And these are the latency figures for laying down melodies. Latency needs to be even tighter when laying down drums.
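
          Those rules of thumb as a quick sketch (the halved budget for drums is my own guess at "even tighter"):

            def usability(latency_ms, drums=False):
                budget = 0.5 if drums else 1.0  # assumption: drums get half the budget
                for limit, verdict in [(30, "acceptable"),
                                       (50, "irritating but workable"),
                                       (100, "easily detectable, almost unusable")]:
                    if latency_ms < limit * budget:
                        return verdict
                return "patently absurd"

            print(usability(40))              # irritating but workable
            print(usability(40, drums=True))  # easily detectable, almost unusable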

          The only decent wireless headphones I've ever used in a recording setting were the AIAIAI TMA-2 Wireless+ [1], which use a dedicated radio transmitter.

          [1] https://aiaiai.audio/stories/products/deep-dive-w-link

        • forrestthewoods 8 hours ago

          End-to-end audio latency is sooooo bad in games. On a desktop with wired headphones, a lot of AAA games will have 150ms of audio latency. Effectively no one measures this.

          https://youtu.be/JTuZvRF-OgE

          The Android audio stack is notoriously easy to make very, very bad. Too many layers of software abstraction. Layer upon layer of buffering. It's all so bad. :(

      • toast0 9 hours ago

        Large delays in audio/video aren't conducive to interactive applications. But chat with someone who plays a pipe organ: they often play in ensembles and have to deal with serious latency (though they probably just worry about hitting their notes in time, and the ensemble has to match up with them).

        • vunderba 5 hours ago

          Great point. I'd love to see a chart of the respective "attack" values for classical instruments. My experience with organs is limited to smaller electric organs like the Hammond B-3, so it's not really as much of a factor there.

  • vivzkestrel 7 hours ago

    I request you to edit the article and add the point that "30 mins is only from the perspective of an observer on Earth or Mars." For a person actually making the voyage at the speed of light (impossible if you have mass, honestly), it is instantaneous.
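
    For reference, the one-way light time between Earth and Mars (approximate distances):

      C_KM_S = 299_792               # speed of light in km/s
      for d_km in (54.6e6, 401e6):   # closest approach vs. farthest separation
          print(d_km / C_KM_S / 60)  # ~3 minutes to ~22 minutes one way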

  • curtisblaine 10 hours ago

    This works for a listener down the chain, but obviously can't work for performers playing together. The article mentions a producer listening to remotely located performers as they were playing together, but fails to mention how these remotely located performers can sync to each other in the first place.

    • FrancoisBosun 10 hours ago

      Explained in the article:

      > A producer sends a backing track to a performer - SyncDNA adds a slight delay to the outbound feed

      The "backing track" is probably the beat or something similar.

  • alganet 10 hours ago

    I don't know about podcasts and stuff, but for music, you can already OBS this puzzle out real quick and use the natural song bars as synchronization steps.

    1. One musician plays simple bars, repetitive stuff, and streams it.

    2. Second musician receives the audio from musician 1 and records a multi-track video with himself alone on one track and musician 1's output on another.

    3. Stack indefinitely.

    You play to what you hear, in real time. All tracks are recorded separately on separate computers and can be edited together quite easily.

    Plus, this is already how most jams work in real life.

    > "now" isn't a single universal instant, it's a moving target

    Rhythm is already a moving target, a delay cycle. Musicians just need to know beat 1 of each bar (which they should already know; it's their job).
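
    To make "bars as synchronization steps" concrete: round the stream delay up to a whole bar, and the downstream musician is simply playing N bars behind, which the DAW undoes by shifting their track left (a sketch, my own numbers):

      import math

      def bars_behind(delay_s, bpm=100, beats_per_bar=4):
          bar_s = beats_per_bar * 60 / bpm  # one 4/4 bar at 100 BPM = 2.4s
          return math.ceil(delay_s / bar_s)

      print(bars_behind(3.0))  # 2: musician 2 hears bar N while musician 1 plays bar N+2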

    • curtisblaine 10 hours ago

      Yes, but musician 1 can't possibly react to musician 2's output meaningfully, because it arrives after musician 2 has listened to musician 1 and played their part. That's not how jams with musicians physically in the same room work.

      • alganet 10 hours ago

        Fair enough, you couldn't have something like the vocalist cueing the bassist, and the drummer picking it up out of thin air and doing an improvised roll, like what happens here:

        https://youtu.be/eg_rgm9VDAw?t=1597

        The drummer takes a little more than a second to react to it (longer than a lot of stream delays, by the way, but I can see how the stacking could mess it up).

        That is, however, a bunch of experienced jazz musicians improvising at a high level. In most jams, these conversations happen very often on the next bar (1 and 2 and 3 and 4 and you react on the next 1).

        You can see a drummer using the end of a bar to cue the flourish used on the organ in this part, for example:

        https://youtu.be/jhicDUgXyNg?t=587

        It takes multiple seconds for the organist to recognize the cue, which is actually for the next bar, and then he joins in. This sort of stuff is actually doable with just video chat and OBS.

        Please also note that the product's example workflow is actually worse, in that "reaction jamminess" regard, than what I proposed:

        > The performer receives the track early, and waits the rest of the delay period to play it

        This is designed for recording. It sounds more like a studio arrangement in which you have to record your part than a jam session.

        > The fidelity of the live stream isn't high enough to record

        Seems like an incomplete product. OBS can already record multiple tracks and monitor them, which you can leverage to produce high-quality recordings. I used to sync them manually in a DAW, but with all those auto-timers, it's a surprise it doesn't do it automatically.
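
        That manual sync step could be automated by cross-correlating the shared reference against each recording; a sketch with numpy, assuming mono float arrays at the same sample rate:

          import numpy as np

          def offset_seconds(reference, recording, sample_rate=48_000):
              # lag of `recording` relative to `reference`; shift the
              # recording left by this much in the DAW to line the takes up
              corr = np.correlate(recording, reference, mode="full")
              lag = int(np.argmax(corr)) - (len(reference) - 1)
              return lag / sample_rate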

        • ghusbands 5 hours ago

          > This sort of stuff is actually doable just with just video chat and OBS.

          If what each person is hearing is 100-400ms delayed from what each person is producing, how can they possibly mutually react or even get their music in time? If B plays in time with what they hear from C, C hears what B did 200-800ms later - that's far too much and will sound terrible.

          Jamming would seem to require incredibly low latency audio just for the rhythm to work between two performers.

          • alganet 4 hours ago

            I just showed you, with examples. Musicians react to musical structure, which can be very loose compared to how engineers think about latency. A 12-bar blues can give lots of free time to improvise without feedback.

            Also, the stacked delay is part of their product. My solution just does it for free, but it's the same idea.

  • belter 10 hours ago

    > Our Universe has one very inconvenient problem: it has an unbreakable speed limit. While it may seem instant, light takes quite a while to get around, traveling at just under 300,000 KM per second.

    Google and Azure Availability Zones seem to break that limit daily ... ;-)

    • belter 10 hours ago

      This was a test and you failed...

  • Xmd5a 10 hours ago

    I think this would be easier to explain with strips of paper. Start by placing the strips horizontally and drawing a vertical line across all of them with a red pen. Then shift the strips to the right relative to each other to represent latency. Next, having folded a section of each strip onto itself beforehand to shorten it, unfold the strips to show an extended length representing delay. Finally, fold them back again to represent waiting out the remaining delay on the transmit track.
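
    Or the same picture drawn by a few lines of Python, since the folding is hard to narrate (delays invented; '.' is latency, '+' is added delay, '|' is the shared red line):

      feeds = {"producer": 2, "vocalist": 12, "drummer": 8}  # latency in ticks
      total = max(feeds.values()) + 3                        # common target

      for name, lat in feeds.items():
          print(f"{name:>8}: " + "." * lat + "+" * (total - lat) + "|")
      # producer: ..+++++++++++++|
      # vocalist: ............+++|
      #  drummer: ........+++++++|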

    • Nurbek-F 9 hours ago

      No, it's not easier, mate. I don't know what I just read...

      • jama211 4 hours ago

        To be fair, you read a description of their method, rather than them using the method in front of you.