Creating DAW – Day 4

Back to GUI programming: a timeline UI is a must for a DAW. There are some existing implementations using ImGui, such as ImGuizmo’s ImSequencer, its improved version ImNeoSequencer, and Sequentity.

To create our own, we need something like a ruler and a cursor to indicate the playback position.

The tricky part is understanding ImGui’s coordinate system, its clipping region, and how window scrolling affects them.

First, set the padding to zero with ImGui::PushStyleVar(ImGuiStyleVar_WindowPadding, ...) before creating the child window with ImGui::BeginChild(..., ImGuiWindowFlags_HorizontalScrollbar).

Within the child window block, ImGui::GetWindowDrawList() returns a draw list whose clipping region is set to the current child window. The cursor position starts at the top-left corner, i.e. {0, 0} in local coordinates.

Here is the tricky part about the coordinate system within the child window block:

  • GetWindowPos() -> the child window’s position in screen coordinates.
  • GetWindowSize() -> the size of the visible child window area on screen.
  • GetCursorPos() / SetCursorPos() -> the current draw cursor in the child window’s local coordinates.
  • GetCursorScreenPos() / SetCursorScreenPos() -> the current draw cursor in screen coordinates.

Keep in mind that drawing through the draw list requires screen coordinates. Since clipping is already enabled, we may draw the ruler markers starting from the leftmost X position of the content, even if part of them is clipped away. An easy way to get the scrolled screen position of that left edge is to translate with SetCursorPos({0, ...}) and then read GetCursorScreenPos().x.

Iterate while drawing, moving the cursor in constant steps until reaching the right side. The right side is technically the full content length, but stopping at the end of the visible area is enough. To get the right edge of the visible area in screen coordinates, add GetWindowPos().x and GetWindowSize().x.
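The tick-iteration logic above can be separated from the ImGui calls. Here is a minimal sketch (names like VisibleTickRange and tick_spacing are mine, not from the project) that computes which ruler tick indices fall inside the visible, scrolled area:

```cpp
#include <algorithm>
#include <cmath>
#include <utility>

// Given the horizontal scroll offset and the visible width of the child
// window (both in pixels), return the inclusive range of ruler tick
// indices that need drawing. tick_spacing is the constant pixel distance
// between ticks.
std::pair<int, int> VisibleTickRange(float scroll_x, float visible_w,
                                     float tick_spacing) {
    int first = (int)std::floor(scroll_x / tick_spacing);
    int last  = (int)std::ceil((scroll_x + visible_w) / tick_spacing);
    return { std::max(first, 0), last };
}
```

Inside the child window block, the drawing loop would then run from `first` to `last`, placing each tick at `origin_x + i * tick_spacing` in screen coordinates, where `origin_x` is the value obtained with the SetCursorPos/GetCursorScreenPos trick above.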

Draw the playback cursor using AddConvexPolyFilled. To make the cursor interactive, e.g. able to receive mouse dragging, we can use InvisibleButton. Set the drawing cursor position with SetCursorPos to the top-left corner of the InvisibleButton with size {w, h}, then get its rectangle with GetItemRectMin()/GetItemRectMax(). Both functions return screen coordinates, giving the area in which to draw the playback cursor.

// Pentagon-shaped playhead marker, listed in counter-clockwise order.
// rcmin/rcmax come from GetItemRectMin()/GetItemRectMax(), and
// cursor_wd is the marker width.
ImVec2 points[] = {
  { rcmin.x,                   rcmin.y },                   // top-left
  { rcmin.x,                   rcmax.y - (cursor_wd / 2) }, // left edge
  { rcmin.x + (cursor_wd / 2), rcmax.y },                   // bottom tip
  { rcmax.x,                   rcmax.y - (cursor_wd / 2) }, // right edge
  { rcmax.x,                   rcmin.y },                   // top-right
};

The order of the polygon’s points should be counter-clockwise. Draw the polygon with the ImGuiCol_Button or ImGuiCol_SliderGrab color to follow the theme style.
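For the dragging part, the only non-ImGui logic is mapping the mouse’s screen x back to a playback position. A sketch, with hypothetical names (samples_per_pixel, timeline_origin_x) standing in for whatever zoom and origin the timeline actually uses:

```cpp
#include <algorithm>

// Map the mouse's x position (screen coordinates) to a playhead position
// in samples. timeline_origin_x is the screen x of sample 0, e.g. the
// value read via SetCursorPos({0, ...}) + GetCursorScreenPos() inside
// the child window. The result is clamped to the song length.
double MouseToPlayheadSample(float mouse_x, float timeline_origin_x,
                             double samples_per_pixel,
                             double total_samples) {
    double s = (double)(mouse_x - timeline_origin_x) * samples_per_pixel;
    return std::clamp(s, 0.0, total_samples);
}
```

While the InvisibleButton is active (IsItemActive()), feed it GetIO().MousePos.x each frame to update the playhead.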

Creating DAW – Day 3

The default example3 from TinySoundFont plays a MIDI file through its TinyMidiLoader (TML) inside SDL Audio’s callback. To make it compatible with the others, I need to create a streamed version similar to the BASSMIDI API. With a streamed version we could decouple TML, or send real-time MIDI events whenever there are keyboard strokes.

Then we need to implement these:

  • when a client or application sends a MIDI stream, it writes MIDI events to a queue.
  • when a server or virtual device wants to render the audio, it reads MIDI events from the queue.

This is a Producer-Consumer relation, where the Producer is the application sending MIDI events and the Consumer is the virtual device rendering them. Since the application uses the virtual device exclusively, we can call it a Single-Producer Single-Consumer (SPSC) FIFO queue, and we can even implement it lock-free and wait-free. For processing streaming events in a running audio stream, a circular buffer is a good choice for the queue. Hence, I found this good article: Lock-Free Single-Producer – Single Consumer Circular Queue (old version). The old version does not use std::atomic.
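A minimal sketch of such an SPSC circular queue, written here with std::atomic (so an illustrative modern variant, not the article’s exact code):

```cpp
#include <atomic>
#include <cstddef>

// Lock-free single-producer single-consumer circular queue.
// One slot is kept empty to distinguish "full" from "empty", so it
// holds at most Capacity - 1 items.
template <typename T, size_t Capacity>
class SpscQueue {
public:
    bool push(const T& item) {               // producer thread only
        size_t head = head_.load(std::memory_order_relaxed);
        size_t next = (head + 1) % Capacity;
        if (next == tail_.load(std::memory_order_acquire))
            return false;                    // queue full
        buf_[head] = item;
        head_.store(next, std::memory_order_release);
        return true;
    }

    bool pop(T& item) {                      // consumer thread only
        size_t tail = tail_.load(std::memory_order_relaxed);
        if (tail == head_.load(std::memory_order_acquire))
            return false;                    // queue empty
        item = buf_[tail];
        tail_.store((tail + 1) % Capacity, std::memory_order_release);
        return true;
    }

private:
    T buf_[Capacity];
    std::atomic<size_t> head_{0};  // next write slot, owned by producer
    std::atomic<size_t> tail_{0};  // next read slot, owned by consumer
};
```

The sequencer thread calls push() and the audio callback calls pop(); neither side ever blocks, which is exactly what a real-time audio callback needs.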

Here is the 1st implementation. The first row shows when a MIDI event is pushed. The second row shows when the audio callback is called, which actually renders the buffer. The third row shows when the buffer is played and heard.

In the picture the buffer length is 40 samples; in the real case it is 4096 samples. I thought that would be small enough to process all pending MIDI events in the queue, but that was wrong: 4096 samples equals about 92 ms at 44.1 kHz. While the sequencer pushes events at a 10 ms interval, they end up aligned, or quantized, to 92 ms boundaries, which makes the sound horrible.

Here is the fix, the 2nd implementation. Each pushed event is timestamped, and the timestamp is used to sync the event’s start time with the buffer being rendered. During rendering, the MIDI events are quantized again, but per 64 samples, or about 1.5 ms.
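The quantization step in the 2nd implementation boils down to snapping an event’s sample offset within the buffer to a 64-sample block boundary. A sketch (function name is illustrative):

```cpp
// Block size for in-buffer quantization: 64 samples is roughly 1.5 ms
// at 44.1 kHz, far below the 10 ms push interval of the sequencer.
constexpr int kBlockSamples = 64;

// Snap an event's offset (in samples, relative to the start of the
// buffer being rendered) down to the nearest block boundary. The
// renderer then processes the buffer block by block, applying every
// event whose quantized offset matches the current block.
int QuantizeToBlock(int sample_offset) {
    return (sample_offset / kBlockSamples) * kBlockSamples;
}
```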

Regarding latency, we can see from the picture above that a MIDI event will be heard after two buffer lengths. The latency depends on the SDL audio driver being used; the implementation may differ between dsound, wasapi, coreaudio, etc. For example, if the driver does not use additional pre-render buffers, it is about 2 × 4096 samples, or about 185 ms.
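The arithmetic behind those figures, assuming a 44.1 kHz sample rate:

```cpp
// Convert a length in samples to milliseconds at a given sample rate.
// One 4096-sample buffer at 44.1 kHz is ~92.9 ms, so two buffers give
// ~185.8 ms of worst-case latency.
double SamplesToMs(int samples, double sample_rate) {
    return samples * 1000.0 / sample_rate;
}
```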


Create DAW – Day 2

After tinkering with the GUI (on Day 1), the next step is to find a way to read and write MIDI files and send them to MIDI out devices. Handling MIDI files is possible with the jdksmidi library, which additionally allows connecting to MIDI devices. On its experimental3 branch, the author refactored the library to use RtMidi for communicating with MIDI out devices.

The next step is to simulate the player, or MIDI sequencer. Fortunately jdksmidi provides an example of creating a sequencer. I added a new option to test_sequencer to use RtMidi. On Windows it works flawlessly with Microsoft GS Wavetable and OmniMIDI as virtual MIDI out devices.

The challenging part of creating a sequencer is sending MIDI events in real time from the PC to the MIDI out devices. There are two options for using a timer to get “near” real-time: first, using the audio stream as a timer, and second, using a high-precision or multimedia timer.

  • With an audio stream timer, the interval depends on the audio buffer length: we send the current MIDI events whenever the audio driver calls our callback to fill its buffer.
  • With a high-precision timer, we can wait or sleep for a “reasonably” small interval, e.g. 10 ms, and each time send the MIDI events from the past interval up to now. A cross-platform C++ implementation can use std::this_thread::sleep_for.
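The core of the second option is selecting, on each wakeup, the events that became due during the last interval. A sketch, where Event and DueEvents are hypothetical placeholders for the real track data:

```cpp
#include <vector>

// A timestamped event; a real one would carry the MIDI message bytes.
struct Event { double time_ms; };

// Return the events from `track` whose timestamp falls in the window
// [prev_ms, now_ms), i.e. the events that became due during the last
// sleep interval. Half-open so no event is sent twice.
std::vector<Event> DueEvents(const std::vector<Event>& track,
                             double prev_ms, double now_ms) {
    std::vector<Event> due;
    for (const Event& e : track)
        if (e.time_ms >= prev_ms && e.time_ms < now_ms)
            due.push_back(e);
    return due;
}
```

The player loop would then be roughly: sleep_for(10ms), advance `now`, send every event in DueEvents(track, prev, now) to the MIDI out device, and set prev = now.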

For platforms that don’t have a virtual MIDI device, such as macOS or Linux, there are other options such as FluidSynth, BASSMIDI, or TinySoundFont (tsf).

  • I’m starting with BASSMIDI since it is production-ready, good quality, simple, and has a mature API. The limitations are that it is closed-source and its license terms matter if the project goes commercial, but at this early stage it is very helpful as long as it is free.
  • FluidSynth is worth trying, but for this project it is overkill and depends on several Linux libraries. It is open source and could be a source of inspiration and a benchmark.
  • At the end of the day, I met TinySoundFont: a small, header-only, open-source software wavetable synthesizer.

TinySoundFont’s rendered output is acceptable, with some small issues:

  • Some presets/instruments play at very low volume.
  • It needs additional sampling or interpolation algorithms to improve render quality.
  • Optionally: modulation, reverb, and chorus DSP effects.

But don’t worry, be happy. Since it’s open source, we can experiment and tackle these issues as we like.

Create DAW – Day 1

DAW stands for Digital Audio Workstation, computer software for creating a song. Currently, I create MIDI using Cakewalk Pro Audio 9, the legendary software from Twelve Tone released in 1999 (25 years ago as of 2024).

Around 1999, while sequencing a MIDI song I used a cassette tape to play the original song. The work involved a lot of rewind-play cycles, since old devices didn’t support looping over selected parts. Now I use Audacity for that.

Creating a DAW has been one of my dreams: combining a MIDI sequencer program and an audio player, with additional features such as syncing to the measures, time-stretching the audio (like Rubber Band), and being cross-platform.

After looking into other languages and libraries, I will take the native approach and use C/C++ for this DAW. The GUI will use ImGui. So on the 1st day, I tried to learn how to create a simple song project view.