<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
    <title>Orbiter - Blog</title>
    <subtitle>Generative ambient machine with physically modelled gong, handpan, and singing bowl instruments</subtitle>
    <link rel="self" type="application/atom+xml" href="https://orbiter.audio/blog/atom.xml"/>
    <link rel="alternate" type="text/html" href="https://orbiter.audio/blog/"/>
    <generator uri="https://www.getzola.org/">Zola</generator>
    <updated>2026-04-07T00:00:00+00:00</updated>
    <id>https://orbiter.audio/blog/atom.xml</id>
    <entry xml:lang="en">
        <title>Cross-Platform Engineering in Rust</title>
        <published>2026-04-07T00:00:00+00:00</published>
        <updated>2026-04-07T00:00:00+00:00</updated>
        
        <author>
          <name>Unknown</name>
        </author>
        
        <link rel="alternate" type="text/html" href="https://orbiter.audio/blog/cross-platform-engineering/"/>
        <id>https://orbiter.audio/blog/cross-platform-engineering/</id>
        
        <content type="html" xml:base="https://orbiter.audio/blog/cross-platform-engineering/">&lt;p&gt;&lt;em&gt;Part 2 of the Orbiter technical series. See also: &lt;a href=&quot;&#x2F;blog&#x2F;introducing-orbiter&#x2F;&quot;&gt;Introducing Orbiter&lt;&#x2F;a&gt;.&lt;&#x2F;em&gt;&lt;&#x2F;p&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;cross-platform-diagram.svg&quot; alt=&quot;Cross-platform architecture: shared Rust core crates, platform bridges for desktop, web, iOS, Android, and plugins, packaged products&quot; style=&quot;width:100%; margin: 2em 0;&quot;&gt;
&lt;p&gt;Orbiter ships as a browser app, a standalone desktop app (macOS, Linux, Windows), an iOS app, an Android app, and six DAW plugins (VST3, CLAP, AU3) — all from one Rust codebase. This post is about how and why.&lt;&#x2F;p&gt;
&lt;h2 id=&quot;what-made-this-practical&quot;&gt;What Made This Practical&lt;&#x2F;h2&gt;
&lt;p&gt;A few things converged to make this level of cross-platform reach feasible for a solo developer:&lt;&#x2F;p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Rust&#x27;s compilation model&lt;&#x2F;strong&gt; — the same code compiles to native binaries, static libraries for C FFI, dynamic libraries for JNI, and WASM, with no runtime or garbage collector to adapt. The C ABI means interop with Swift, Kotlin, and JavaScript is straightforward.&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;wgpu&lt;&#x2F;strong&gt; — a single GPU abstraction over Metal, Vulkan, DX12, WebGPU, and WebGL2. WebGPU is now shipping in major browsers, making this a practical choice. One rendering path for every platform.&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;iced&lt;&#x2F;strong&gt; — a GPU-rendered GUI framework with no platform dependencies beyond wgpu. The same widget tree runs everywhere.&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;Agentic development tools&lt;&#x2F;strong&gt; — maintaining five platform bridges, each with its own FFI contract and build system, would be impractical to do entirely by hand. AI-assisted development made it feasible to write and maintain the bridge code, build scripts, and CI workflows at a pace one person could sustain.&lt;&#x2F;li&gt;
&lt;&#x2F;ul&gt;
&lt;p&gt;None of these individually is sufficient. The combination is what makes it work.&lt;&#x2F;p&gt;
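To make the first point concrete, here is a minimal sketch of the kind of C-ABI boundary that makes the Swift/Kotlin/JS interop straightforward. The names and types are hypothetical, not Orbiter's actual API: the point is that one unmangled `extern "C"` function compiles unchanged into a static library for Swift, a dynamic library for JNI, a native binary, or a WASM module.

```rust
/// A parameter change arriving from a platform shell (illustrative).
#[repr(C)]
pub struct ParamChange {
    pub param_id: u32,
    pub value: f32,
}

/// `#[no_mangle] extern "C"` gives the symbol a stable C ABI, so Swift,
/// Kotlin (via JNI), and C callers can all link against the same function.
#[no_mangle]
pub extern "C" fn engine_apply_param(change: ParamChange) -> bool {
    // A real engine would forward this to the DSP thread; here we only
    // validate that the normalised value is in range.
    change.value.is_finite() && (0.0..=1.0).contains(&change.value)
}

fn main() {
    let accepted = engine_apply_param(ParamChange { param_id: 7, value: 0.5 });
    println!("accepted: {accepted}");
}
```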
&lt;h2 id=&quot;the-landscape-of-alternatives&quot;&gt;The Landscape of Alternatives&lt;&#x2F;h2&gt;
&lt;p&gt;If you&#x27;re building cross-platform audio software, there are several established approaches. Each makes different trade-offs around licensing, language, platform reach, and web support.&lt;&#x2F;p&gt;
&lt;h3 id=&quot;c-frameworks&quot;&gt;C++ Frameworks&lt;&#x2F;h3&gt;
&lt;p&gt;&lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;juce.com&#x2F;get-juce&#x2F;&quot;&gt;JUCE&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; is the industry standard. It&#x27;s mature, well-documented, and has a huge community. But its &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;juce.com&#x2F;get-juce&#x2F;&quot;&gt;licensing model&lt;&#x2F;a&gt; is layered — free under GPL (your project must also be GPL), or paid tiers with revenue thresholds ($50K for Personal, $200K for Indie at $40&#x2F;month, unlimited at Pro for $130&#x2F;month). After the &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;juce.com&#x2F;blog&#x2F;juce-acquires-roli&#x2F;&quot;&gt;PACE acquisition&lt;&#x2F;a&gt; in 2020, there&#x27;s been recurring community concern about licensing stability. On the web, JUCE has &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;forum.juce.com&#x2F;t&#x2F;webview-gui-tutorial&#x2F;57498&quot;&gt;experimental Emscripten support&lt;&#x2F;a&gt; but it&#x27;s not a first-class target — binaries are large, browser API integration is manual, and plugin formats don&#x27;t translate to the web.&lt;&#x2F;p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;iplug2.github.io&#x2F;&quot;&gt;iPlug2&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; has an excellent permissive licence (WDL&#x2F;zlib) and notably good web support via Web Audio Modules (WAM). It covers VST2&#x2F;3, AU, AUv3, AAX, and standalone. The GUI framework (IGraphics) is less feature-rich than JUCE&#x27;s, and the community is smaller, but the licensing and web story are genuinely strong. Still C++ though, with the same WASM-via-Emscripten path.&lt;&#x2F;p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;github.com&#x2F;DISTRHO&#x2F;DPF&quot;&gt;DPF (DISTRHO Plugin Framework)&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; is lightweight and ISC-licensed (essentially MIT). Good Linux community, supports LADSPA, LV2, VST2&#x2F;3, CLAP. No mobile, no web. A solid choice for desktop plugins where JUCE is overkill.&lt;&#x2F;p&gt;
&lt;h3 id=&quot;dsl-and-code-generation&quot;&gt;DSL and Code Generation&lt;&#x2F;h3&gt;
&lt;p&gt;&lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;faust.grame.fr&#x2F;&quot;&gt;FAUST&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; takes a different approach: a functional DSP language that generates C++, Rust, WASM, or LLVM from the same source. Web support is excellent (generates AudioWorklet code directly). The GPL compiler produces code you can use under any licence. But it&#x27;s a DSP authoring tool, not a full framework — you still need something else for the GUI and platform integration.&lt;&#x2F;p&gt;
&lt;h3 id=&quot;web-first-approaches&quot;&gt;Web-First Approaches&lt;&#x2F;h3&gt;
&lt;p&gt;&lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;tauri.app&#x2F;&quot;&gt;Tauri&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; pairs a Rust backend with a web-tech UI (HTML&#x2F;CSS&#x2F;JS). Much lighter than Electron (~10 MB vs 100+ MB), and v2 adds iOS&#x2F;Android. But audio must be handled via native code, and there&#x27;s no plugin format support. &lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;www.electronjs.org&#x2F;&quot;&gt;Electron&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; has the same architecture with higher memory overhead and GC pauses that can interfere with real-time audio.&lt;&#x2F;p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;flutter.dev&#x2F;&quot;&gt;Flutter&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; with dart:ffi can call into native Rust DSP code. Excellent mobile UI story, but Dart&#x27;s garbage collector can cause audio glitches, and there&#x27;s no built-in real-time audio — you&#x27;re bridging to CoreAudio&#x2F;AAudio yourself. No plugin format support.&lt;&#x2F;p&gt;
&lt;h3 id=&quot;rust-native&quot;&gt;Rust-Native&lt;&#x2F;h3&gt;
&lt;p&gt;&lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;github.com&#x2F;robbert-vdh&#x2F;nih-plug&quot;&gt;nih-plug&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; is the leading Rust plugin framework — ISC-licensed, excellent ergonomics, supports VST3 and CLAP with pluggable GUI (iced, Vizia, egui). Actively maintained. It doesn&#x27;t support AU&#x2F;AUv3, which is why Orbiter uses a separate au-bridge for Apple platforms.&lt;&#x2F;p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;github.com&#x2F;vizia&#x2F;vizia&quot;&gt;Vizia&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; is a declarative, CSS-styled Rust GUI framework designed for audio plugins, with a nih-plug integration. Pre-1.0 but maturing. &lt;strong&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;github.com&#x2F;RustAudio&#x2F;baseview&quot;&gt;baseview&lt;&#x2F;a&gt;&lt;&#x2F;strong&gt; provides low-level windowing for plugin UIs — the foundation that nih-plug and others build on.&lt;&#x2F;p&gt;
&lt;h3 id=&quot;where-rust-fits&quot;&gt;Where Rust Fits&lt;&#x2F;h3&gt;
&lt;p&gt;The key gap in the C++ ecosystem is the web. JUCE and iPlug2 both reach it via Emscripten, but it&#x27;s a second-class experience. Rust compiles to WASM natively, and the tooling (wasm-bindgen, web-sys, Trunk) is designed for it. The key gap in the web-first ecosystem (Tauri, Electron, Flutter) is real-time audio performance and plugin formats. Rust bridges both worlds — native performance with first-class WASM support and C ABI for FFI to every platform.&lt;&#x2F;p&gt;
&lt;p&gt;The trade-off is maturity. JUCE has decades of audio UI widgets, comprehensive documentation, and a proven track record. The Rust audio ecosystem is younger and requires more custom work — especially for mobile, where iced has no built-in support. That&#x27;s where the original bridging work for Orbiter came in.&lt;&#x2F;p&gt;
&lt;h2 id=&quot;gpu-rendering-with-wgpu&quot;&gt;GPU Rendering with wgpu&lt;&#x2F;h2&gt;
&lt;p&gt;The entire UI is rendered on the GPU via wgpu, a Rust implementation of the WebGPU standard. The iced GUI framework drives the rendering, and wgpu maps to the native graphics API on each platform:&lt;&#x2F;p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;macOS &#x2F; iOS&lt;&#x2F;strong&gt; — Metal&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;Linux&lt;&#x2F;strong&gt; — Vulkan&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;Windows&lt;&#x2F;strong&gt; — DX12 (Vulkan fallback)&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;Web&lt;&#x2F;strong&gt; — WebGPU where available, WebGL2 as fallback&lt;&#x2F;li&gt;
&lt;&#x2F;ul&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;wgpu-pipeline.svg&quot; alt=&quot;wgpu rendering pipeline: one iced widget tree targeting Metal, Vulkan, DX12, WebGPU, and WebGL2&quot; style=&quot;width:100%; margin: 2em 0;&quot;&gt;
&lt;p&gt;The same iced widget tree and the same rendering code runs everywhere. Two WASM builds are produced (WebGPU and WebGL2) and the correct one is loaded at runtime based on browser support. On native platforms, wgpu selects the high-performance GPU adapter by default with a low-power fallback.&lt;&#x2F;p&gt;
&lt;h2 id=&quot;mobile-custom-iced-integration-for-ios-and-android&quot;&gt;Mobile: Custom iced Integration for iOS and Android&lt;&#x2F;h2&gt;
&lt;p&gt;On desktop, iced runs on top of winit for windowing and event handling. On iOS and Android that&#x27;s not an option — the native platform owns the window, the render loop, and the touch event system. &lt;strong&gt;iced has no built-in mobile support.&lt;&#x2F;strong&gt; Getting it running on iOS and Android required building custom integration layers from scratch — original work done for Orbiter (to be open sourced) that doesn&#x27;t exist elsewhere in the iced ecosystem.&lt;&#x2F;p&gt;
&lt;p&gt;On &lt;strong&gt;iOS&lt;&#x2F;strong&gt;, Swift creates a CAMetalLayer and passes it to Rust via C FFI. Rust creates a wgpu Metal surface directly from the layer pointer, then drives iced&#x27;s &lt;code&gt;UserInterface&lt;&#x2F;code&gt; manually — building, updating, and drawing each frame without winit. CADisplayLink controls the render loop from the Swift side, calling into Rust each frame and receiving a redraw flag back. When iced has nothing to update, the display link pauses. Touch events are batched by Swift into C structs (phase, finger ID, coordinates) and forwarded across FFI, where they&#x27;re translated into iced &lt;code&gt;touch::Event&lt;&#x2F;code&gt; values.&lt;&#x2F;p&gt;
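A minimal sketch of what that per-frame FFI contract can look like. The struct layout and function names here are hypothetical, not Orbiter's actual bridge: Swift batches the frame's touches into C structs, calls one exported function per display-link tick, and reads a redraw flag back.

```rust
/// Touch phases mirrored from the platform's touch phase enum (illustrative).
#[repr(C)]
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub enum TouchPhase {
    Began = 0,
    Moved = 1,
    Ended = 2,
    Cancelled = 3,
}

/// One touch as it crosses the FFI boundary: phase, finger id, position.
#[repr(C)]
#[derive(Clone, Copy, Debug)]
pub struct FfiTouch {
    pub phase: TouchPhase,
    pub finger_id: u64,
    pub x: f32,
    pub y: f32,
}

/// Called by Swift once per CADisplayLink tick. Returns true while iced
/// still has work to do, so Swift can pause the display link on false.
#[no_mangle]
pub extern "C" fn app_frame(touches: *const FfiTouch, len: usize) -> bool {
    let touches: &[FfiTouch] = if touches.is_null() || len == 0 {
        &[]
    } else {
        unsafe { std::slice::from_raw_parts(touches, len) }
    };
    // A real bridge would translate each FfiTouch into an iced
    // `touch::Event` and run the UserInterface build/update/draw here.
    !touches.is_empty()
}

fn main() {
    let batch = [FfiTouch { phase: TouchPhase::Began, finger_id: 1, x: 10.0, y: 20.0 }];
    println!("needs redraw: {}", app_frame(batch.as_ptr(), batch.len()));
}
```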
&lt;p&gt;On &lt;strong&gt;Android&lt;&#x2F;strong&gt;, the pattern is similar but via JNI instead of C FFI. A Kotlin &lt;code&gt;SurfaceView&lt;&#x2F;code&gt; provides the native window, which Rust converts to a wgpu Vulkan surface. The render loop is driven by &lt;code&gt;Choreographer&lt;&#x2F;code&gt; frame callbacks. Touch events arrive individually through JNI calls. The crate is compiled as a &lt;code&gt;cdylib&lt;&#x2F;code&gt; (shared library loaded by the Android runtime), unlike iOS where it&#x27;s a &lt;code&gt;staticlib&lt;&#x2F;code&gt; linked into the app binary.&lt;&#x2F;p&gt;
&lt;p&gt;Both platforms share the same adaptive rendering strategy: a custom iced &lt;code&gt;Notifier&lt;&#x2F;code&gt; implementation that signals the native side when a redraw is needed, letting the platform pause the render loop when idle.&lt;&#x2F;p&gt;
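The notifier itself can be as small as a shared atomic flag. This is a simplified stand-in for the real implementation, but it captures the contract: iced sets the flag when something changed, the native render loop polls-and-clears it each frame, and pauses itself once it reads false.

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;

/// Simplified redraw notifier: the iced side sets the flag on state
/// changes; the native loop (CADisplayLink on iOS, Choreographer on
/// Android) consumes it once per frame.
#[derive(Clone, Default)]
pub struct RedrawNotifier(Arc<AtomicBool>);

impl RedrawNotifier {
    /// Called from the Rust/iced side whenever the UI needs redrawing.
    pub fn notify(&self) {
        self.0.store(true, Ordering::Release);
    }

    /// Called from the native render loop: "should this frame render?"
    /// Clears the flag, so an idle UI eventually pauses the loop.
    pub fn take(&self) -> bool {
        self.0.swap(false, Ordering::AcqRel)
    }
}

fn main() {
    let notifier = RedrawNotifier::default();
    notifier.notify();
    println!("frame 1 renders: {}", notifier.take());
    println!("frame 2 renders: {}", notifier.take());
}
```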
&lt;h3 id=&quot;toward-shared-crates&quot;&gt;Toward Shared Crates&lt;&#x2F;h3&gt;
&lt;p&gt;The iOS and Android integration code is currently Orbiter-specific, but the core of it — driving iced&#x27;s &lt;code&gt;UserInterface&lt;&#x2F;code&gt; from a native render loop, translating touch events across FFI, managing wgpu surfaces from platform-provided GPU layers — is generic. I&#x27;m planning to extract this into standalone crates (&lt;code&gt;iced-ios-bridge&lt;&#x2F;code&gt;, &lt;code&gt;iced-android-bridge&lt;&#x2F;code&gt; or similar) that any iced project could use to target mobile. The FFI contracts are already well-defined and the architecture is deliberately minimal — the native side provides a GPU layer and touch events, Rust handles everything else.&lt;&#x2F;p&gt;
&lt;h2 id=&quot;web-four-threads-in-the-browser&quot;&gt;Web: Four Threads in the Browser&lt;&#x2F;h2&gt;
&lt;p&gt;The web build has the most complex threading model because browsers enforce strict separation between contexts. Four independent WASM modules run on separate threads — audio runs in an AudioWorklet for low-latency playback, while ML inference and collaboration stay off the main thread to minimise CPU contention.&lt;&#x2F;p&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;web-threads.svg&quot; alt=&quot;Web threading model: main thread (GUI), AudioWorklet (DSP), collaboration worker, ML sequencer worker&quot; style=&quot;width:100%; margin: 2em 0;&quot;&gt;
&lt;p&gt;The &lt;strong&gt;main thread&lt;&#x2F;strong&gt; runs the iced GUI via wgpu compiled to WASM. The &lt;strong&gt;AudioWorklet&lt;&#x2F;strong&gt; runs the DSP engine in a dedicated audio processing thread — a separate Rust crate compiled to WASM independently. A &lt;strong&gt;collaboration worker&lt;&#x2F;strong&gt; handles Loro CRDT sync over WebSocket. An &lt;strong&gt;ML sequencer worker&lt;&#x2F;strong&gt; runs the generative model inference.&lt;&#x2F;p&gt;
&lt;p&gt;Trunk builds the WASM app, wasm-pack builds the workers, and Zola builds the static site that hosts it all. Two variants are produced — WebGPU and WebGL2 — and the correct one is loaded at runtime based on browser capabilities.&lt;&#x2F;p&gt;
&lt;h2 id=&quot;desktop&quot;&gt;Desktop&lt;&#x2F;h2&gt;
&lt;p&gt;Desktop is the most straightforward target: Winit for windowing, wgpu for rendering, cpal for audio I&#x2F;O. A real-time priority thread runs the DSP, with rtrb lock-free ring buffers for UI-to-audio communication. This is the standard iced architecture with no custom embedding needed.&lt;&#x2F;p&gt;
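To illustrate why a lock-free ring buffer matters here: the audio callback can never block on a mutex or allocate, so UI-to-audio messages go through a preallocated single-producer/single-consumer queue. The sketch below is in the spirit of rtrb, not its actual implementation.

```rust
use std::cell::UnsafeCell;
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;

/// Minimal fixed-capacity SPSC queue: the UI thread pushes, the audio
/// thread pops, and neither side locks or allocates after construction.
pub struct Spsc<T, const N: usize> {
    slots: [UnsafeCell<Option<T>>; N],
    head: AtomicUsize, // consumer's next read position
    tail: AtomicUsize, // producer's next write position
}

// Safe because each slot is touched by at most one side at a time, with
// the atomic indices ordering the hand-off between threads.
unsafe impl<T: Send, const N: usize> Sync for Spsc<T, N> {}

impl<T, const N: usize> Spsc<T, N> {
    pub fn new() -> Arc<Self> {
        Arc::new(Self {
            slots: std::array::from_fn(|_| UnsafeCell::new(None)),
            head: AtomicUsize::new(0),
            tail: AtomicUsize::new(0),
        })
    }

    /// Producer side (UI thread). Fails without blocking when full.
    pub fn push(&self, value: T) -> Result<(), T> {
        let tail = self.tail.load(Ordering::Relaxed);
        if tail.wrapping_sub(self.head.load(Ordering::Acquire)) == N {
            return Err(value);
        }
        unsafe { *self.slots[tail % N].get() = Some(value) };
        self.tail.store(tail.wrapping_add(1), Ordering::Release);
        Ok(())
    }

    /// Consumer side (audio thread). Returns None without blocking.
    pub fn pop(&self) -> Option<T> {
        let head = self.head.load(Ordering::Relaxed);
        if head == self.tail.load(Ordering::Acquire) {
            return None;
        }
        let value = unsafe { (*self.slots[head % N].get()).take() };
        self.head.store(head.wrapping_add(1), Ordering::Release);
        value
    }
}

fn main() {
    let queue = Spsc::<(u32, f32), 64>::new();
    queue.push((7, 0.5)).unwrap(); // UI thread: "set param 7 to 0.5"
    println!("audio thread received: {:?}", queue.pop());
}
```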
&lt;h2 id=&quot;daw-plugins&quot;&gt;DAW Plugins&lt;&#x2F;h2&gt;
&lt;p&gt;For producers and sound designers, the same engine runs as a plugin inside a DAW. Two plugin frameworks handle the wrapping:&lt;&#x2F;p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;nih-plug&lt;&#x2F;strong&gt; produces VST3 and CLAP binaries. The GUI is embedded via nih_plug_iced, which hosts iced inside the plugin window provided by the DAW.&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;au-bridge&lt;&#x2F;strong&gt; produces AU3 (Audio Unit v3) extensions for macOS and iOS. Each instrument is compiled as a separate Rust static library, and a Swift wrapper builds the AUAudioUnit parameter tree from metadata exported by Rust via FFI.&lt;&#x2F;li&gt;
&lt;&#x2F;ul&gt;
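A hypothetical sketch of the metadata side of that au-bridge pattern: Rust exports C-compatible parameter descriptors, and the Swift wrapper loops over them to build one AUParameter per entry. The parameter names, fields, and function names here are illustrative, not Orbiter's actual contract.

```rust
/// C-compatible parameter descriptor. `name` points at a NUL-terminated
/// static string owned by the Rust side.
#[repr(C)]
pub struct ParamInfo {
    pub id: u32,
    pub min: f32,
    pub max: f32,
    pub default_value: f32,
    pub name: *const u8,
}

/// Static parameter table for one instrument (illustrative values).
static PARAMS: &[(&str, u32, f32, f32, f32)] = &[
    ("Decay\0", 0, 0.0, 1.0, 0.5),
    ("Brightness\0", 1, 0.0, 1.0, 0.3),
];

#[no_mangle]
pub extern "C" fn param_count() -> usize {
    PARAMS.len()
}

#[no_mangle]
pub extern "C" fn param_info(index: usize) -> ParamInfo {
    let (name, id, min, max, default_value) = PARAMS[index];
    ParamInfo { id, min, max, default_value, name: name.as_ptr() }
}

fn main() {
    // The Swift wrapper would iterate 0..param_count() to build the tree.
    for i in 0..param_count() {
        let p = param_info(i);
        println!("param {}: range [{}, {}], default {}", p.id, p.min, p.max, p.default_value);
    }
}
```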
&lt;p&gt;Six plugins total: three instruments, each available as both a synthesizer and an audio effect.&lt;&#x2F;p&gt;
&lt;h2 id=&quot;midi-sync&quot;&gt;MIDI &amp;amp; Sync&lt;&#x2F;h2&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;midi-sync-diagram.svg&quot; alt=&quot;MIDI and sync connections: controllers and DAWs exchange notes with Orbiter, with MIDI Clock or Ableton Link for tempo sync&quot; style=&quot;width:100%; margin: 2em 0;&quot;&gt;
&lt;p&gt;MIDI uses midir across all platforms — CoreMIDI on macOS&#x2F;iOS, ALSA on Linux, WinMM on Windows, Web MIDI API in the browser. All available ports are connected simultaneously with hot-plug detection. MPE is fully supported: per-note pitch bend, pressure, and slide (CC74) control individual voice parameters through per-voice modulation.&lt;&#x2F;p&gt;
&lt;p&gt;Tempo sync supports two alternative transports: MIDI Clock (all platforms, leader or follower) and Ableton Link (native apps only, network tempo and phase sync over LAN).&lt;&#x2F;p&gt;
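The MIDI Clock side of this reduces to simple arithmetic: the transport runs at 24 pulses per quarter note (PPQN), so tempo and tick spacing convert directly into each other. A real follower would smooth the measured interval over many ticks and handle jitter; this sketch shows only the core relationship.

```rust
/// MIDI Clock runs at 24 pulses per quarter note.
const MIDI_CLOCK_PPQN: f64 = 24.0;

/// Follower: estimate BPM from the measured time between clock ticks.
fn bpm_from_tick_interval(seconds_per_tick: f64) -> f64 {
    60.0 / (MIDI_CLOCK_PPQN * seconds_per_tick)
}

/// Leader: how far apart to send clock ticks at a given BPM.
fn tick_interval_from_bpm(bpm: f64) -> f64 {
    60.0 / (MIDI_CLOCK_PPQN * bpm)
}

fn main() {
    // At 120 BPM a quarter note lasts 0.5 s, so ticks are 0.5 / 24 apart.
    println!("tick interval at 120 BPM: {:.6} s", tick_interval_from_bpm(120.0));
}
```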
&lt;p&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;orbiter.audio&quot;&gt;Try Orbiter in your browser →&lt;&#x2F;a&gt;&lt;&#x2F;p&gt;
</content>
        
    </entry>
    <entry xml:lang="en">
        <title>Introducing Orbiter</title>
        <published>2026-04-05T00:00:00+00:00</published>
        <updated>2026-04-05T00:00:00+00:00</updated>
        
        <author>
          <name>Unknown</name>
        </author>
        
        <link rel="alternate" type="text/html" href="https://orbiter.audio/blog/introducing-orbiter/"/>
        <id>https://orbiter.audio/blog/introducing-orbiter/</id>
        
        <content type="html" xml:base="https://orbiter.audio/blog/introducing-orbiter/">&lt;p&gt;In 2026, building software is as much an engineering discipline as it has ever been, but it also feels more like an art form to me than it ever has: suddenly, if I can imagine it, I can most likely also build it. I am sure I am not alone in finding that when punishing implementation difficulty is no longer the limiting factor, it becomes natural to get deeply curious, imagining and experimenting with things that feel fascinating but borderline impossible to implement. Orbiter is just such a thing of engineered mild madness for me.&lt;&#x2F;p&gt;
&lt;hr &#x2F;&gt;
&lt;p&gt;&lt;b&gt;Orbiter is a generative ambient music system.&lt;&#x2F;b&gt; Its built-in sequencer composes meditative music in real time, or you compose together with it — driving physically modelled instruments. The suite currently has three instruments: handpan, gong, and singing bowl, with more in the works.&lt;&#x2F;p&gt;
&lt;p&gt;This app is my best take on a few specific physical modelling synthesis methods as someone who is still learning, and an expression of a claim that an indie developer &lt;em&gt;can&lt;&#x2F;em&gt; now build and deploy an ambitious cross-platform suite of software, powered by something as hard as real-time audio digital signal processing (DSP) and on-device machine learning (ML) inference, across web, desktop and mobile platforms, with native technologies instead of web view slop.&lt;&#x2F;p&gt;
&lt;p&gt;Here it is, running live in the browser (click Play to set it off):&lt;&#x2F;p&gt;
&lt;div style=&quot;position: relative; width: 100%; height: 70vh; min-height: 400px; max-height: 700px; margin: 2em 0; border-radius: 8px; overflow: hidden; border: 1px solid #242129;&quot;&gt;
&lt;iframe src=&quot;&#x2F;embed&#x2F;?s=1654622616&quot; style=&quot;position: absolute; top: 0; left: 0; width: 100%; height: 100%; border: none;&quot; allow=&quot;autoplay; web-share&quot; loading=&quot;lazy&quot; scrolling=&quot;no&quot;&gt;&lt;&#x2F;iframe&gt;
&lt;&#x2F;div&gt;
&lt;p&gt;The thing is, Orbiter &lt;em&gt;also&lt;&#x2F;em&gt; runs on macOS, Windows, Linux, iOS, and Android — with &lt;a href=&quot;https:&#x2F;&#x2F;orbiter.audio&#x2F;blog&#x2F;introducing-orbiter&#x2F;#available-everywhere&quot;&gt;VST3, AU3, and CLAP plugins&lt;&#x2F;a&gt; for DAW integration. A &lt;a href=&quot;&#x2F;blog&#x2F;cross-platform-engineering&#x2F;&quot;&gt;separate post&lt;&#x2F;a&gt; covers how I approached going cross-platform to this degree (in a few words, Rust and &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;wgpu.rs&#x2F;&quot;&gt;wgpu&lt;&#x2F;a&gt;).&lt;&#x2F;p&gt;
&lt;div class=&quot;platform-gallery&quot;&gt;
&lt;figure&gt;&lt;img src=&quot;&#x2F;blog&#x2F;desktop-macos-orbital.png&quot; alt=&quot;Orbiter on macOS — Orbit view&quot; data-caption=&quot;macOS — Orbit view&quot;&gt;&lt;figcaption&gt;macOS — Orbit&lt;&#x2F;figcaption&gt;&lt;&#x2F;figure&gt;
&lt;figure&gt;&lt;img src=&quot;&#x2F;blog&#x2F;desktop-windows-edit.png&quot; alt=&quot;Orbiter on Windows — Edit view&quot; data-caption=&quot;Windows — Edit view&quot;&gt;&lt;figcaption&gt;Windows — Edit&lt;&#x2F;figcaption&gt;&lt;&#x2F;figure&gt;
&lt;figure&gt;&lt;img src=&quot;&#x2F;blog&#x2F;desktop-linux-edit.png&quot; alt=&quot;Orbiter on Linux — Instrument Parameters&quot; data-caption=&quot;Linux — Instrument Parameters&quot;&gt;&lt;figcaption&gt;Linux — Instrument Parameters&lt;&#x2F;figcaption&gt;&lt;&#x2F;figure&gt;
&lt;figure class=&quot;mobile&quot;&gt;&lt;img src=&quot;&#x2F;blog&#x2F;ios-edit.png&quot; alt=&quot;Orbiter on iOS (iPhone 17 Pro) — Edit view&quot; data-caption=&quot;iOS (iPhone 17 Pro) — Edit view&quot;&gt;&lt;figcaption&gt;iOS — Edit&lt;&#x2F;figcaption&gt;&lt;&#x2F;figure&gt;
&lt;figure class=&quot;mobile&quot;&gt;&lt;img src=&quot;&#x2F;blog&#x2F;android-orbital.png&quot; alt=&quot;Orbiter on Android — Orbit view&quot; data-caption=&quot;Android — Orbit view&quot;&gt;&lt;figcaption&gt;Android — Orbit&lt;&#x2F;figcaption&gt;&lt;&#x2F;figure&gt;
&lt;&#x2F;div&gt;
&lt;p&gt;Orbiter really began as a series of related experiments, focused on three challenges:&lt;&#x2F;p&gt;
&lt;ol&gt;
&lt;li&gt;Realising a software instrument that feels, sounds and looks &lt;strong&gt;immediate, beautiful, and strange&lt;&#x2F;strong&gt; in the right kind of way, on all the supported platforms.&lt;&#x2F;li&gt;
&lt;li&gt;Showcasing physical modelling realtime DSP and &lt;strong&gt;on-device ML inference&lt;&#x2F;strong&gt; across web, desktop, and mobile.&lt;&#x2F;li&gt;
&lt;li&gt;Shipping a &lt;strong&gt;single codebase&lt;&#x2F;strong&gt; as a web app, and as a standalone desktop and mobile app, and a set of DAW plugins simultaneously — without relying on Electron, Tauri, or other webview-based approaches.&lt;&#x2F;li&gt;
&lt;&#x2F;ol&gt;
&lt;h2 id=&quot;physical-modelling&quot;&gt;Physical Modelling&lt;&#x2F;h2&gt;
&lt;p&gt;The flavour of immediate and strange synthesis I was drawn to is physical modelling, in particular modal synthesis. Perhaps because I play the cello (and lately the &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;soundstone.co&#x2F;products&#x2F;the-string-armonica-mkii&quot;&gt;String Armonica mkII&lt;&#x2F;a&gt;) and have enjoyed tinkering with electroacoustics (&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;efundies.com&#x2F;midi-controlled-solenoids-with-arduino-and-ableton-live-part-1&#x2F;&quot;&gt;MIDI controlled solenoids&lt;&#x2F;a&gt; and the &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;www.leaf-audio.com&#x2F;microphonic-playground&quot;&gt;Microphonic Playground&lt;&#x2F;a&gt; for the win), it feels interesting to build an instrument that takes inspiration from something that really exists in physical space, yet is impossible, unreal in some way (for example, its physical properties can be modulated &#x2F; automated over time).&lt;&#x2F;p&gt;
&lt;figure style=&quot;margin: 2em 0;&quot;&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;modal-synthesis-diagram.svg&quot; alt=&quot;Modal synthesis: an excitation drives a bank of resonators, each modelling a vibrational mode of the instrument&quot; style=&quot;width:100%;&quot;&gt;
&lt;figcaption style=&quot;color: #806B5C; font-size: 0.9em; margin-top: 0.5em;&quot;&gt;Modal synthesis at its simplest: a strike or bow excites a bank of resonators, each tuned to a vibrational mode of the physical instrument. The modes ring and decay independently, summing to produce the output sound.&lt;&#x2F;figcaption&gt;
&lt;&#x2F;figure&gt;
&lt;p&gt;Handpan seemed like an interesting starting point for exploring this: &lt;span style=&quot;color: #9A8778;&quot;&gt;(a)&lt;&#x2F;span&gt; I love the sound, &lt;span style=&quot;color: #9A8778;&quot;&gt;(b)&lt;&#x2F;span&gt; a model that captures a bit of its character is relatively simple and low in computational complexity, and &lt;span style=&quot;color: #9A8778;&quot;&gt;(c)&lt;&#x2F;span&gt; there are lovely, oddball scales involved (Hijaz, Kurd, Pygmy, to name a few).&lt;&#x2F;p&gt;
&lt;p&gt;I built the first prototype of these ideas as a DAW plugin using the Rust-based &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;github.com&#x2F;robbert-vdh&#x2F;nih-plug&quot;&gt;nih-plug&lt;&#x2F;a&gt; (with &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;nih-plug.robbertvanderhelm.nl&#x2F;nih_plug_iced&#x2F;index.html&quot;&gt;iced&lt;&#x2F;a&gt;), and tested it in Ableton Live and Bitwig. nih-plug&#x27;s standalone distribution mechanism also provided a handy way to prototype the feel of Orbiter as a standalone app.&lt;&#x2F;p&gt;
&lt;figure style=&quot;margin: 2em 0; text-align: center;&quot;&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;orbiter-first-look.jpg&quot; alt=&quot;The first screenshot of the app — a standalone handpan plugin called Orbiter&quot; style=&quot;max-width: 480px; width: 100%; border-radius: 8px;&quot;&gt;
&lt;figcaption style=&quot;color: #806B5C; font-size: 0.9em; margin-top: 0.5em;&quot;&gt;The first screenshot of the app (as a DAW plugin), then called &quot;Orbiter&quot;.&lt;&#x2F;figcaption&gt;
&lt;&#x2F;figure&gt;
&lt;div style=&quot;text-align: center;&quot;&gt;
&lt;audio controls style=&quot;max-width: 480px; width: 100%; margin: 1em 0; color-scheme: dark;&quot;&gt;
  &lt;source src=&quot;&#x2F;blog&#x2F;demo_d_hijaz.m4a&quot; type=&quot;audio&#x2F;mp4&quot;&gt;
&lt;&#x2F;audio&gt;
&lt;p style=&quot;color: #806B5C; font-size: 0.9em; margin-top: 0.3em;&quot;&gt;An early demo — handpan in D Hijaz.&lt;&#x2F;p&gt;
&lt;&#x2F;div&gt;
&lt;p&gt;On hearing it I was hooked and wanted to expand to additional, more complex physical models based on the research literature: a gong, and a bowed singing bowl. Each instrument reflects a different physically inspired principle:&lt;&#x2F;p&gt;
&lt;div style=&quot;display: flex; gap: 1em; margin: 1.5em 0; flex-wrap: wrap;&quot;&gt;
&lt;div style=&quot;flex: 1; min-width: 200px; text-align: center;&quot;&gt;&lt;p style=&quot;font-weight: 600; font-size: 0.9em; margin-bottom: 0.3em;&quot;&gt;Handpan&lt;&#x2F;p&gt;&lt;img src=&quot;&#x2F;blog&#x2F;handpan-model.svg&quot; alt=&quot;Handpan: detuned mode pairs and Helmholtz cavity resonance&quot; style=&quot;width: 100%;&quot;&gt;&lt;&#x2F;div&gt;
&lt;div style=&quot;flex: 1; min-width: 200px; text-align: center;&quot;&gt;&lt;p style=&quot;font-weight: 600; font-size: 0.9em; margin-bottom: 0.3em;&quot;&gt;Gong&lt;&#x2F;p&gt;&lt;img src=&quot;&#x2F;blog&#x2F;gong-model.svg&quot; alt=&quot;Gong: nonlinear energy cascade between vibrational modes&quot; style=&quot;width: 100%;&quot;&gt;&lt;&#x2F;div&gt;
&lt;div style=&quot;flex: 1; min-width: 200px; text-align: center;&quot;&gt;&lt;p style=&quot;font-weight: 600; font-size: 0.9em; margin-bottom: 0.3em;&quot;&gt;Singing Bowl&lt;&#x2F;p&gt;&lt;img src=&quot;&#x2F;blog&#x2F;bowl-model.svg&quot; alt=&quot;Singing bowl: elasto-plastic friction bowing with inharmonic modes&quot; style=&quot;width: 100%;&quot;&gt;&lt;&#x2F;div&gt;
&lt;&#x2F;div&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Handpan&lt;&#x2F;strong&gt; — uses &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Modal_synthesis&quot;&gt;modal synthesis&lt;&#x2F;a&gt; with biquad resonators modelling individual vibrational modes. I found &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;etheses.whiterose.ac.uk&#x2F;11390&#x2F;&quot;&gt;Alon (2015)&lt;&#x2F;a&gt; &quot;Analysis and Synthesis of the Handpan Sound&quot; particularly useful — it provided mode frequency ratios and decay times, the detuned mode pair structure that gives handpans their shimmering sustain, Helmholtz cavity coupling for warmth, and an excitation model. There is still a lot more to explore in that thesis alone.&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;Gong&lt;&#x2F;strong&gt; — also modal, but the modes interact with each other. The challenge is capturing the pitch-shifting swells and rising shimmer as energy cascades from lower to higher modes after a strike. This is modelled using the &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;F%C3%B6ppl%E2%80%93von_K%C3%A1rm%C3%A1n_equations&quot;&gt;Föppl–von Kármán equations&lt;&#x2F;a&gt; for large-deflection plate vibration, where the coupling between modes grows with amplitude (hence &quot;nonlinear&quot; — hit harder and the modes interact more). The approach follows &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;www.dafx.de&#x2F;paper-archive&#x2F;2023&#x2F;DAFx23_paper_7.pdf&quot;&gt;Bilbao et al. (2023)&lt;&#x2F;a&gt; &quot;Real-Time Gong Synthesis&quot; (DAFx-23).&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;Singing bowl&lt;&#x2F;strong&gt; — combines modal resonance with an elasto-plastic friction model (&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;doi.org&#x2F;10.1109&#x2F;TAC.2002.1000274&quot;&gt;Dupont et al., 2002&lt;&#x2F;a&gt;) for bowing. The challenge is the sustained, slowly wavering tone produced by stick-slip friction when a rim is rubbed — the puja mallet alternates between sticking and slipping, driving inharmonic modes into self-sustained oscillation.&lt;&#x2F;li&gt;
&lt;&#x2F;ul&gt;
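The modal picture underlying all three instruments can be sketched as a sum of independently ringing, exponentially decaying sinusoids. The real engine uses recursive biquad resonators per mode (far cheaper per sample), but the underlying model is the same; the mode values below are illustrative, not measured handpan data.

```rust
use std::f32::consts::TAU;

/// One vibrational mode: frequency in Hz, relative amplitude, and T60
/// (time in seconds for the mode to decay by 60 dB).
struct Mode {
    freq: f32,
    amp: f32,
    t60: f32,
}

/// Render a strike as a sum of independently ringing, decaying modes.
fn render_modes(modes: &[Mode], sample_rate: f32, samples: usize) -> Vec<f32> {
    (0..samples)
        .map(|i| {
            let t = i as f32 / sample_rate;
            modes
                .iter()
                .map(|m| {
                    // -60 dB after t60 seconds => gain of 10^(-3 t / t60).
                    let envelope = 10f32.powf(-3.0 * t / m.t60);
                    m.amp * envelope * (TAU * m.freq * t).sin()
                })
                .sum::<f32>()
        })
        .collect()
}

fn main() {
    // Illustrative tone field: fundamental plus roughly 1:2:3 partials.
    let modes = [
        Mode { freq: 146.8, amp: 1.0, t60: 8.0 },
        Mode { freq: 293.7, amp: 0.6, t60: 5.0 },
        Mode { freq: 440.5, amp: 0.4, t60: 3.5 },
    ];
    let out = render_modes(&modes, 48_000.0, 48_000);
    println!("peak: {:.3}", out.iter().fold(0.0f32, |a, &x| a.max(x.abs())));
}
```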
&lt;p&gt;Once I got the app running both standalone on the web and as a DAW plugin, ideas began to flow. How about…&lt;&#x2F;p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;b&gt;Sharing a sequence state by URL&lt;&#x2F;b&gt;: especially since I can ship this thing for real on the web, what if I could send the sequence I&#x27;m hearing to someone else in that same exact state and we could just sit there and enjoy it together? Or explore the sound together, &lt;b&gt;collaboratively&lt;&#x2F;b&gt;, changing the scale, or instrument and FX parameter knobs together in real time?&lt;&#x2F;li&gt;
&lt;li&gt;What if each parameter had &lt;b&gt;slightly chaotic, out-of-sync low-frequency modulation&lt;&#x2F;b&gt;, so the sound slowly breathes and evolves on its own? Who doesn&#x27;t love a good slow-evolving, complex modulation where you discover new nuances as you concentrate (omgomgomg, one day I want to reach &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;www.nonlinearcircuits.com&#x2F;modules&#x2F;p&#x2F;triple-sloth&quot;&gt;Triple Sloth&lt;&#x2F;a&gt; Eurorack module grade dream state with the Orbiter modulation system too).&lt;&#x2F;li&gt;
&lt;li&gt;&lt;b&gt;Patch randomisation&lt;&#x2F;b&gt;: what if each random sequence produced by the app (identified by its random seed) also had its own unique set of instrument, scale type and root note parameters? Give also a fast way to randomise individual instrument parameters.&lt;&#x2F;li&gt;
&lt;li&gt;Since the rule-based sequencer is so much fun, wouldn&#x27;t it be nice to drive a DAW like Ableton or Bitwig with MIDI output from the app, with &lt;b&gt;MIDI clock or Ableton Link based sync&lt;&#x2F;b&gt;?&lt;&#x2F;li&gt;
&lt;li&gt;&lt;b&gt;A playback timer&lt;&#x2F;b&gt;, so you can relax and fall asleep to it.&lt;&#x2F;li&gt;
&lt;li&gt;… and so on.&lt;&#x2F;li&gt;
&lt;&#x2F;ul&gt;
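The patch-randomisation idea above boils down to deriving every parameter deterministically from the sequence seed, so a shared URL reproduces the whole patch. A minimal sketch of that mapping (the parameter names and value sets here are hypothetical, not Orbiter's actual ones):

```python
import random

SCALE_TYPES = ["major pentatonic", "minor pentatonic", "dorian", "hirajoshi"]
ROOTS = ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"]

def patch_from_seed(seed):
    # one RNG instance per seed: the same seed always yields the same
    # patch, so a URL carrying the seed reproduces scale, root note,
    # and every knob position exactly
    rng = random.Random(seed)
    return {
        "scale_type": rng.choice(SCALE_TYPES),
        "root_note": rng.choice(ROOTS),
        "gong_brightness": rng.random(),   # hypothetical knobs in 0..1
        "bowl_pressure": rng.random(),
        "reverb_mix": rng.random(),
    }
```

Randomising an individual instrument parameter is then just drawing one fresh value while leaving the rest of the patch untouched.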
&lt;p&gt;Slowly (well, not that slowly), the exploration revealed the outline of a musical tool with a pretty coherent shape, and with a lot more to it than I thought I would be able to put together. The biggest surprise, though, was still to come: something I am calling a &lt;em&gt;composition assistant&lt;&#x2F;em&gt;.&lt;&#x2F;p&gt;
&lt;h2 id=&quot;composition-assistant&quot;&gt;Composition Assistant&lt;&#x2F;h2&gt;
&lt;p&gt;Algorithmic composition has a long history — from &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Musikalisches_W%C3%BCrfelspiel&quot;&gt;Mozart&#x27;s dice games&lt;&#x2F;a&gt; through &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Stochastic#Music&quot;&gt;Xenakis&#x27;s stochastic music&lt;&#x2F;a&gt; and &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Generative_music#Brian_Eno&quot;&gt;Brian Eno&#x27;s generative systems&lt;&#x2F;a&gt; to more recent neural approaches like &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;arxiv.org&#x2F;abs&#x2F;1809.04281&quot;&gt;Music Transformer&lt;&#x2F;a&gt;. Baking a small symbolic-music language model into a tiny web-based (or entirely native) app bundle, with battery-friendly, GPU-accelerated inference, felt like quite an interesting challenge.&lt;&#x2F;p&gt;
&lt;p&gt;I trained a novel, autoregressive transformer model (a &quot;small language model&quot;, or perhaps an &lt;em&gt;agentic&lt;&#x2F;em&gt; composition assistant, to be 2026-compatible) for Orbiter, and it is indeed bundled in the app: it generates notes with velocities and durations, either autonomously or in response to notes you play. The model architecture, training procedure, and data augmentation methods I developed for Orbiter will get a post of their own once I feel I have explored them in full.&lt;&#x2F;p&gt;
&lt;figure style=&quot;margin: 2em 0;&quot;&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;orbiter-system-diagram.svg&quot; alt=&quot;Orbiter system overview: MIDI and generative sequencer input, physical modelling synthesis engine, audio and MIDI output&quot; style=&quot;width:100%;&quot;&gt;
&lt;figcaption style=&quot;color: #806B5C; font-size: 0.9em; margin-top: 0.5em;&quot;&gt;System overview: MIDI input and the composition assistant feed notes into physically modelled instruments, with audio and MIDI output across all platforms.&lt;&#x2F;figcaption&gt;
&lt;&#x2F;figure&gt;
&lt;p&gt;There are three different generative sequencer modes bundled in the app:&lt;&#x2F;p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Rule-based&lt;&#x2F;strong&gt; — deterministic patterns where the same URL and system clock always produce the same music, making every session shareable by link. This is what you hear by default.&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;Autonomous&lt;&#x2F;strong&gt; — the model generates notes continuously on its own, streaming an evolving composition based on the current scale and tempo.&lt;&#x2F;li&gt;
&lt;li&gt;&lt;strong&gt;Call &amp;amp; Response&lt;&#x2F;strong&gt; — play notes to prompt the model, which responds with complementary musical phrases.&lt;&#x2F;li&gt;
&lt;&#x2F;ul&gt;
&lt;figure style=&quot;margin: 2em 0;&quot;&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;sliding-context.svg&quot; alt=&quot;Sliding context window piano roll&quot; style=&quot;width:100%;&quot;&gt;
&lt;figcaption style=&quot;color: #806B5C; margin-top: 0.5em;&quot;&gt;In autonomous mode, the model generates continuously with a sliding context window, keeping the output evolving.&lt;&#x2F;figcaption&gt;
&lt;&#x2F;figure&gt;
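The sliding window in autonomous mode amounts to a bounded queue around an autoregressive sampling loop: sample the next event conditioned on recent history, and let the oldest events fall off once the context is full. A sketch of that loop (the `sample_next_event` callable and `CONTEXT_LEN` are stand-ins, not Orbiter's actual API or context size):

```python
from collections import deque

CONTEXT_LEN = 256  # hypothetical context size in events

def generate_stream(sample_next_event, prompt, total_events):
    # keep at most CONTEXT_LEN recent events as conditioning; deque's
    # maxlen drops the oldest event automatically on each append
    context = deque(prompt, maxlen=CONTEXT_LEN)
    out = []
    for _ in range(total_events):
        event = sample_next_event(list(context))  # model inference stand-in
        context.append(event)
        out.append(event)
    return out
```

Because the context is bounded, memory and inference cost stay flat no matter how long the stream runs, while the output keeps drifting as old material scrolls out of view.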
&lt;p&gt;Once I had built the call &amp;amp; response mode, suddenly the MIDI in &amp;amp; out capabilities in the app came to full bloom, opening a whole new set of possibilities: I could use this thing to help generate ideas in Ableton, with all the power of its built-in instruments and effects and my plugin arsenal!&lt;&#x2F;p&gt;
&lt;img src=&quot;&#x2F;blog&#x2F;midi-sync-diagram.svg&quot; alt=&quot;MIDI and sync connections: controllers and DAWs exchange notes with Orbiter, with MIDI Clock or Ableton Link for tempo sync&quot; style=&quot;width:100%; margin: 2em 0;&quot;&gt;
&lt;h2 id=&quot;collaboration&quot;&gt;Collaboration&lt;&#x2F;h2&gt;
&lt;p&gt;Sharing a link is all it takes for two or more people to experiment together in real time — in the browser with no install, or from the desktop and mobile apps. One person can show the others how the instrument works, change scales, and twist knobs, and everyone else hears and sees every change instantly. It is a zero-friction way to explore sound together.&lt;&#x2F;p&gt;
&lt;p&gt;Under the hood, the entire application state — every knob, the scale, root note, sequencer seed — is synchronised between users via a &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Conflict-free_replicated_data_type&quot;&gt;CRDT&lt;&#x2F;a&gt; (conflict-free replicated data type), so concurrent edits merge automatically. Modulators still run locally, so the sound breathes independently for each listener. I intend to open source the collaboration server I developed for this purpose separately.&lt;&#x2F;p&gt;
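The post does not say which CRDT Orbiter uses, and its collaboration server is not yet published, so the following is purely illustrative: a last-writer-wins register per parameter is one simple CRDT that matches the description, merging concurrent edits by keeping the entry with the highest (timestamp, peer id) pair:

```python
def merge_lww(local, remote):
    # each entry is (logical_timestamp, peer_id, value); keeping the
    # entry with the highest (timestamp, peer_id) pair makes the merge
    # commutative and idempotent, so every peer converges to the same
    # base value regardless of message order
    merged = {}
    for state in (local, remote):
        for param, entry in state.items():
            current = merged.get(param)
            if current is None or entry[:2] > current[:2]:
                merged[param] = entry
    return merged
```

Modulator offsets would then be applied locally on top of each merged base value, which is why the sound can breathe differently for each listener while the knob positions stay in sync.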
&lt;img src=&quot;&#x2F;blog&#x2F;collab-param-diagram.svg&quot; alt=&quot;Collaboration parameter flow: local and remote users set base values via CRDT, modulators apply offsets, final value goes to DSP engine&quot; style=&quot;width:100%; margin: 2em 0;&quot;&gt;
&lt;h2 id=&quot;available-everywhere&quot;&gt;Available Everywhere&lt;&#x2F;h2&gt;
&lt;p&gt;Orbiter runs on iOS, macOS, Windows, Linux, and Android, as well as in the browser. For producers and sound designers, it&#x27;s also available as a plugin: VST3, Audio Unit v3, and CLAP are all supported, so you can bring the generative engine directly into your DAW workflow.&lt;&#x2F;p&gt;
&lt;div class=&quot;platform-gallery&quot;&gt;
&lt;figure&gt;&lt;img src=&quot;&#x2F;blog&#x2F;ableton-plugins.png&quot; alt=&quot;Orbiter plugins in Ableton Live&quot; data-caption=&quot;Ableton Live — VST3 plugins&quot;&gt;&lt;figcaption&gt;Ableton Live&lt;&#x2F;figcaption&gt;&lt;&#x2F;figure&gt;
&lt;figure&gt;&lt;img src=&quot;&#x2F;blog&#x2F;bitwig-plugins.png&quot; alt=&quot;Orbiter plugins in Bitwig Studio&quot; data-caption=&quot;Bitwig Studio — VST3 plugins&quot;&gt;&lt;figcaption&gt;Bitwig Studio&lt;&#x2F;figcaption&gt;&lt;&#x2F;figure&gt;
&lt;&#x2F;div&gt;
&lt;div style=&quot;text-align: center;&quot;&gt;
&lt;audio controls style=&quot;max-width: 480px; width: 100%; margin: 1em 0; color-scheme: dark;&quot;&gt;
  &lt;source src=&quot;&#x2F;blog&#x2F;handpan-gong-resonator.mp3&quot; type=&quot;audio&#x2F;mpeg&quot;&gt;
&lt;&#x2F;audio&gt;
&lt;p style=&quot;color: #806B5C; font-size: 0.9em; margin-top: 0.3em;&quot;&gt;Handpan melody, through the gong resonator.&lt;&#x2F;p&gt;
&lt;&#x2F;div&gt;
&lt;p&gt;Technically, this meant solving low-latency audio and GPU-accelerated rendering with native GPU APIs (Metal, Vulkan, DX12) on desktop and mobile, and in the browser via WebGPU with a WebGL2 fallback; ML inference likewise runs WebGPU-accelerated, with a CPU fallback, on every platform. Rust and its WASM-targeting ecosystem turned out to be crucial: native performance across every platform, with first-class compilation to WebAssembly for the browser.&lt;&#x2F;p&gt;
&lt;p&gt;The top lesson I can draw from building this way is that creativity really blossoms when you are equipped with a well-thought-through &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;openai.com&#x2F;index&#x2F;harness-engineering&#x2F;&quot;&gt;agent harness&lt;&#x2F;a&gt;. As long as I am pretty damned meticulous about that harness and the upfront degree of automated testing, I am freed from having to be quite so sober about implementation risk and the limits of my capacity to realise something complex. For the first time in my life with a realtime audio project, I felt the &lt;q&gt;It will be easy, probably just a quick weekend project&lt;&#x2F;q&gt; moment. As much as that was a fallacy (it always is), it is also a sign of having hit a fun problem when you realise it is now too late to give up on it (I love a good moment of &lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;paulgraham.com&#x2F;schlep.html&quot;&gt;&quot;Schlep blindness&quot;&lt;&#x2F;a&gt;).&lt;&#x2F;p&gt;
&lt;p&gt;In any case, this is the first in a series of posts about Orbiter. Next: &lt;a href=&quot;&#x2F;blog&#x2F;cross-platform-engineering&#x2F;&quot;&gt;Cross-Platform Engineering in Rust&lt;&#x2F;a&gt; — how one codebase runs across browser, desktop, mobile, and DAW plugins.&lt;&#x2F;p&gt;
&lt;p&gt;&lt;a rel=&quot;external&quot; href=&quot;https:&#x2F;&#x2F;orbiter.audio&quot;&gt;Try Orbiter in your browser →&lt;&#x2F;a&gt;&lt;&#x2F;p&gt;
</content>
        
    </entry>
</feed>
