
ChatGPT v Orba 1

Part 1

Around page 22 of the "Orba hacking knowledge base", a year or so ago, @Subskybox and I were dissecting the eventData string the Orba 1 uses to represent sequences. @Subsky did some clever mathematical analysis while I did the donkey work of setting up experiments and recording the results.


Some of the experiments were based on a song called "DPC" which played the first seven notes of a minor scale. I've attached the song file, console output, and a spreadsheet @Subsky put together after analysing the data.

The eventData string is a mix of note and performance data, but this "DPC" test simplifies things to only include note data. This is organised as a series of "note blocks":

Note Blocks 1-7:

Block | PlayNote | startTicksLSB | startTicksMSB | Note # | Vel On | Vel Off | DurTicksLSB | DurTicksMSB
  1   |    16    |       7       |       0       |   62   |  120   |   90    |    -11      |     1
  2   |    16    |      89       |       7       |   64   |  127   |   92    |    -17      |     1
  3   |    16    |    -105       |       7       |   65   |  113   |   92    |    -46      |     3
  4   |    16    |    -122       |       7       |   67   |  121   |   80    |    -31      |     3
  5   |    16    |     108       |       7       |   69   |  118   |   58    |    -91      |     1
  6   |    16    |    -100       |       7       |   70   |  127   |   91    |    -20      |     1
  7   |    16    |     113       |       7       |   72   |   87   |   55    |    116      |     1

If you take this series of values and encode it as a Base64 string, you get the corresponding eventData string from the .song file:

"EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCcB0Z/W+wBEHEHSFc3dAE="

This appears in the .song XML as follows:

<LoopData writeIndex="56" recordStartTime="0" recordStopTime="11882" lastEventTime="4809"
          nBars="7" eventData="EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCcB0Z/W+wBEHEHSFc3dAE="
          eventDataCrc="1ff6d4c4"/>

The problem we found is that the timing data is relative: the timing of each note, i.e. when it plays, is affected by the one before. That makes real-time quantisation a bit of a nightmare. It might be possible to implement "offline" quantisation, processing a .song file to quantise the data, or to create new sequences based on MIDI data, but it's a hassle and we pretty much abandoned the investigation at that point.
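For what it's worth, an offline pass might look something like this sketch. It assumes the note blocks have already been decoded into 8-value tuples (as in the raw listings later in the thread) and that start ticks are stored relative to the previous note as LSB + 256 * MSB; quantise_blocks and the grid size are my own hypothetical names and choices, not anything from the Orba firmware.

```python
def quantise_blocks(raw_blocks, grid=120):
    """Offline quantisation sketch for 8-value Orba note blocks.

    Because start times are stored relative to the previous note, we
    accumulate them to absolute ticks, snap each to the grid, then
    re-derive the deltas and re-split them into LSB/MSB.
    """
    # relative -> absolute start ticks
    abs_ticks, t = [], 0
    for b in raw_blocks:
        t += b[1] + 256 * b[2]
        abs_ticks.append(t)
    # snap each absolute tick to the nearest grid line
    snapped = [round(t / grid) * grid for t in abs_ticks]
    # absolute -> relative again, keeping the other six values as-is
    out, prev = [], 0
    for b, t in zip(raw_blocks, snapped):
        delta = t - prev
        prev = t
        out.append((b[0], delta & 0xFF, delta >> 8) + tuple(b[3:]))
    return out

# Two blocks at ticks 10 and 906 snap to 0 and 960 on a 120-tick grid
quantised = quantise_blocks([(16, 10, 0, 74, 101, 41, 169, 0),
                             (16, 128, 3, 74, 77, 48, 167, 0)])
```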
 
A few months later, ChatGPT arrived on the scene...

[Attachments: song (31.2 KB), txt (1.28 KB), xlsx]


Replacing the MIDI notes in an existing sequence in this way is quite straightforward, and you can use the same approach to replace other data, timing, duration, velocity, etc. I'm currently trying to figure out how to generate an eventData string like this from scratch along with other relevant XML data, rather than by manipulating a 'dummy' string.

Here's a new test.

Song "Seq" has eight notes.

LoopData is:

<LoopData writeIndex="64" recordStartTime="0" recordStopTime="6857" lastEventTime="5902"
          nBars="4" eventData="EAoASmUpqQAQgANKTTCnABDjA0UzJLoAEJIDRTwnxQAQ4gNBbSzXABCyA0FwLK0AEL8DPlc2vAAQxAM+djqzAA=="
          eventDataCrc="e91384a5"/>

stringtoraw.py:

16,10,0,74,101,41,169,0

16,128,3,74,77,48,167,0

16,227,3,69,51,36,186,0

16,146,3,69,60,39,197,0

16,226,3,65,109,44,215,0

16,178,3,65,112,44,173,0

16,191,3,62,87,54,188,0

16,196,3,62,118,58,179,0

decodeblocks.py:

MemIdx = 0 - MIDI Note at tick 10, channel 1, note 74, duration 169, von 101, voff 41

MemIdx = 8 - MIDI Note at tick 906, channel 1, note 74, duration 167, von 77, voff 48

MemIdx = 16 - MIDI Note at tick 1901, channel 1, note 69, duration 186, von 51, voff 36

MemIdx = 24 - MIDI Note at tick 2815, channel 1, note 69, duration 197, von 60, voff 39

MemIdx = 32 - MIDI Note at tick 3809, channel 1, note 65, duration 215, von 109, voff 44

MemIdx = 40 - MIDI Note at tick 4755, channel 1, note 65, duration 173, von 112, voff 44

MemIdx = 48 - MIDI Note at tick 5714, channel 1, note 62, duration 188, von 87, voff 54

MemIdx = 56 - MIDI Note at tick 6678, channel 1, note 62, duration 179, von 118, voff 58
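As a sanity check, the decode itself is simple enough to sketch. This assumes the layout worked out in the thread: 8-byte blocks, start ticks relative to the previous note, plain base-256 LSB/MSB pairs. decode_note_blocks is a hypothetical name, not one of the actual scripts.

```python
import base64

def decode_note_blocks(event_data):
    """Decode an Orba 1 eventData string into note blocks.

    Each 8-byte block is (PlayNote, startLSB, startMSB, note, velOn,
    velOff, durLSB, durMSB); start ticks accumulate to absolute time.
    """
    raw = base64.b64decode(event_data)
    blocks, tick = [], 0
    for i in range(0, len(raw), 8):
        cmd, lsb, msb, note, von, voff, dlsb, dmsb = raw[i:i + 8]
        tick += lsb + 256 * msb          # relative start -> absolute tick
        blocks.append({
            "cmd": cmd,
            "tick": tick,
            "note": note,
            "vel_on": von,
            "vel_off": voff,
            "duration": dlsb + 256 * dmsb,
        })
    return blocks

# The eventData string from the "Seq" test above
blocks = decode_note_blocks(
    "EAoASmUpqQAQgANKTTCnABDjA0UzJLoAEJIDRTwnxQAQ4gNBbSzXABCyA0FwLK0AEL8DPlc2vAAQxAM+djqzAA=="
)
```

The ticks, notes and durations it reports line up with the decodeblocks.py listing above.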

This attempts to simulate the "console data" produced by the Orba's built-in debug utility.

The real "console data" is:

         Using 64 of 4096 bytes of looper memory available (1 %)


         Looper Configuration:

                loopBarEndTolerance: 120

                beatLengthTicks: 480

                notesPerBar: 4

                quantMode: 0

                quantStartSnapTicks: 120

                quantBarEndSnapTicks: 240

                allowRecordingSilenceStart: 1

                allowRecordingSilenceEnd: 0

                thinnerMode: 1

                cc_error_limit: 1000

                pbend_error_limit: 25000

                ccA: 1

                ccB: 74

                ccC: 75

                thickenerMode: 1

                thickenerEmitPrd: 20

                thickenerMaxDt: 5000

                noteStartWindow: 240


MemIdx = 0 - MIDI Note at tick 10, channel 1, note 74, duration 169, von 101, voff 41

MemIdx = 8 - MIDI Note at tick 906, channel 1, note 74, duration 167, von 77, voff 48

MemIdx = 16 - MIDI Note at tick 1901, channel 1, note 69, duration 186, von 51, voff 36

MemIdx = 24 - MIDI Note at tick 2815, channel 1, note 69, duration 197, von 60, voff 39

MemIdx = 32 - MIDI Note at tick 3809, channel 1, note 65, duration 215, von 109, voff 44

MemIdx = 40 - MIDI Note at tick 4755, channel 1, note 65, duration 173, von 112, voff 44

MemIdx = 48 - MIDI Note at tick 5714, channel 1, note 62, duration 188, von 87, voff 54

MemIdx = 56 - MIDI Note at tick 6678, channel 1, note 62, duration 179, von 118, voff 58

 
...so that matches OK. 

Considering the rest of the XML:

LoopData writeIndex="64"
recordStartTime="0"
recordStopTime="6857"
lastEventTime="5902"

nBars="4"

Where do those come from, I wonder? E.g. that figure of 5902 for lastEventTime. The last note is at 6678.

[Attachments: py, song (31.5 KB)]

5902 = 5714+188. That's actually the time plus duration of the penultimate note. Curious.

...and ChatGPT just pointed out that the "recordStopTime" is similarly the time plus duration of the final note. 6678+179=6857.
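In other words, both attributes fall straight out of the decoded listing:

```python
# Absolute start ticks and durations from the decodeblocks.py listing above
ticks = [10, 906, 1901, 2815, 3809, 4755, 5714, 6678]
durs = [169, 167, 186, 197, 215, 173, 188, 179]

record_stop_time = ticks[-1] + durs[-1]  # final note: 6678 + 179
last_event_time = ticks[-2] + durs[-2]   # penultimate note: 5714 + 188
```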


OK, well, I don't know why the XML references those two sums, but it's handy to know how they're generated. That's a useful step towards creating a viable loopData entry from scratch, which is the next target...

The latest routines just enabled me to convert a polyphonic MIDI file of Für Elise into Orba XML and it played first time, no dummy data required - pleased with that. As well as writing the code, ChatGPT has been making helpful observations, such as identifying how the essential "writeIndex" value is calculated in the XML - it's simply the number of values in the note blocks file.

https://youtu.be/G-4zJExTIN0

The workflow is a bit of a mess, but I thought I'd document it for posterity FWIW while I still remember how it works.


MIDI files have a succession of note-on/note-off messages. I've been preparing them in Sonar, now Cakewalk by Bandlab, and was surprised when I found that my version of the DAW changed note-off to note-on with velocity 0 when exporting. Seems a bit odd to me, but apparently they're interchangeable in theory. The routines I've been getting ChatGPT to write expect that format.

There's a handy website called MIDI-Dump which is useful for analysing MIDI files.

https://github.com/g200kg/midi-dump

1) First I run "midi1.py furelise.mid". The program takes a MIDI file as a parameter and creates three files, "header", "notes" and "footer". Consider the first four lines of the output for "Fur Elise" in "notes":

76,71,4,4

76,0,240,244

75,38,0,244

75,0,240,484


Four values per line:

First value - MIDI note
Second value - Velocity
Third value -  Relative timing (ticks)
Fourth value - Running total or 'absolute' timing (ticks).

This is note-on, note-off (vel 0), note-on, note-off (vel 0).
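A sketch of that first step, if it helps anyone. This is not the actual midi1.py, just a hypothetical dump_notes working on pre-parsed (type, note, velocity, delta) event tuples, and it assumes the note-on-velocity-0 convention described above.

```python
def dump_notes(events):
    """Turn a stream of MIDI events into the four-value lines above:
    (note, velocity, relative ticks, absolute ticks)."""
    rows, t = [], 0
    for msg_type, note, vel, delta in events:
        t += delta                       # accumulate deltas -> running total
        if msg_type == 'note_on':        # note-offs arrive as vel-0 note-ons
            rows.append((note, vel, delta, t))
    return rows

# The first four events of Fur Elise, as in the listing above
rows = dump_notes([
    ('note_on', 76, 71, 4),
    ('note_on', 76, 0, 240),
    ('note_on', 75, 38, 0),
    ('note_on', 75, 0, 240),
])
```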

2) Next routine is "durations.py". This reads notes.txt and creates durations.txt. First four lines:

76,71,4,4,240

76,0,240,244,0

75,38,0,244,240

75,0,240,484,0

It simply appends an extra value to each line: 0 for a note-off (vel 0), or the duration for a note-on, calculated as the difference between the note-on event and the corresponding note-off (a vel-0 event for the same note value one or more lines later). Here we see that the duration for note 76 is 240: the time between the start of the note at 4 and the start of the ensuing note-off at 244.
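That duration-matching step might be sketched like so (add_durations is a hypothetical stand-in for durations.py, working on four-value tuples rather than the text files):

```python
def add_durations(rows):
    """Append a fifth value to each (note, vel, delta, abs) line:
    0 for note-offs, or the gap to the matching note-off for note-ons."""
    out = []
    for i, (note, vel, delta, abs_t) in enumerate(rows):
        dur = 0
        if vel > 0:
            # find the next vel-0 event for the same note number
            for n2, v2, _, t2 in rows[i + 1:]:
                if n2 == note and v2 == 0:
                    dur = t2 - abs_t
                    break
        out.append((note, vel, delta, abs_t, dur))
    return out

with_durations = add_durations([
    (76, 71, 4, 4), (76, 0, 240, 244), (75, 38, 0, 244), (75, 0, 240, 484),
])
```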

3) Next routine is "noteons.py". This reads durations.txt and creates noteons.txt. First four lines:

76,71,4,4,240

75,38,240,244,240

76,57,240,484,240

75,70,240,724,240

This removes the note-off lines, while updating the relative timing of the remaining note-ons to take this into account. (MIDI files have relative timing data like the Orba, but with MIDI, these relative times reflect the note-ons with +ve velocity as well as the note-ons with 0 velocity which actually represent note-offs. Orba sequence data only uses the note-ons.)

So these four lines now capture the first four notes of Für Elise (76, 75, 76, 75), together with velocity and relative time. At this point we can run...
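Sketched, that filtering step might look like this (keep_note_ons is a hypothetical stand-in for noteons.py):

```python
def keep_note_ons(rows):
    """Drop the velocity-0 lines and recompute each surviving line's
    delta as the gap from the previous note-on."""
    out, prev_t = [], 0
    for note, vel, _, abs_t, dur in rows:
        if vel > 0:
            out.append((note, vel, abs_t - prev_t, abs_t, dur))
            prev_t = abs_t
    return out

note_ons = keep_note_ons([
    (76, 71, 4, 4, 240), (76, 0, 240, 244, 0),
    (75, 38, 0, 244, 240), (75, 0, 240, 484, 0),
])
```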

4) dur2orb.py. This reads noteons.txt and crunches it into Orba note-block form as "orba_notes.txt":

16,4,0,76,71,100,240,0

16,240,0,75,38,100,240,0

16,240,0,76,57,100,240,0

16,240,0,75,70,100,240,0

We now see the same four notes in the familiar (to me now) Orba note block format. 

16 announces the start of a note in Orba-speak

Values 2 and 3 represent the absolute timing of the note in LSB/MSB form

Value 4 is the note

Values 5/6 are vel-on, vel-off. I haven't bothered reading the vel-off values from the MIDI file; I just use a standard value of 100. Vel-off is pretty niche; who cares.

Values 7/8 are the duration in LSB/MSB format

(The LSB/MSB calculations are beyond me. I simply copied them out of the spreadsheet @Subskybox put together (linked earlier) and fed them into ChatGPT. It's all discussed in detail around page 20 or so of the Orba hacking thread.)
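For the curious, the split appears to be plain base-256, judging by the decoded examples earlier in the thread. A hypothetical stand-in for dur2orb.py:

```python
def to_note_blocks(note_ons, vel_off=100):
    """Crunch (note, vel, delta, abs, dur) lines into 8-value Orba
    note blocks, splitting ticks into LSB/MSB as value & 0xFF and
    value >> 8. vel_off is a fixed stand-in, as in the original script."""
    blocks = []
    for note, vel, delta, _, dur in note_ons:
        blocks.append((16,                          # 16 announces a note
                       delta & 0xFF, delta >> 8,    # relative start ticks
                       note, vel, vel_off,
                       dur & 0xFF, dur >> 8))       # duration ticks
    return blocks

orba_blocks = to_note_blocks([(76, 71, 4, 4, 240), (75, 38, 240, 244, 240)])
```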

Finally, we can use rawtostring to convert this into an eventData string. It expects the input file to be called "note_blocks" rather than "orba_notes", so the file needs to be renamed.

We also need certain other values for the loopData XML for the furelise song file:

    <LoopData writeIndex="1536" recordStartTime="0" recordStopTime="41044" lastEventTime="41284"

              nBars="50" eventData="EAQA etc etc"

              eventDataCrc="e91384a5"/>

recordStopTime and lastEventTime were discussed previously; they're the absolute time plus duration of the final and penultimate events respectively. (Perhaps lastEventTime is really about when the last note finishes - which might not necessarily be when the last note in the sequence finishes, depending on duration. Not sure.)

writeIndex, as mentioned earlier, is the number of values in the note_blocks file.

I haven't really thought much about nBars yet; I just set it high. It only matters for loops, I guess.
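Putting the earlier observations together, the attributes might be derived like this (loop_data_xml is my own hypothetical helper, not one of the thread's scripts; eventDataCrc is left out, since the thread reuses an existing value there):

```python
import base64

def loop_data_xml(blocks, n_bars):
    """Derive the LoopData attributes discussed above from a list of
    8-value note blocks: writeIndex is the number of values,
    recordStopTime / lastEventTime are start + duration of the final /
    penultimate notes, and eventData is the blocks re-encoded as Base64."""
    times, t = [], 0
    for b in blocks:
        t += b[1] + 256 * b[2]                # relative start -> absolute
        times.append((t, b[6] + 256 * b[7]))  # (absolute start, duration)
    event_data = base64.b64encode(bytes(v for b in blocks for v in b)).decode()
    return {
        "writeIndex": 8 * len(blocks),
        "recordStartTime": 0,
        "recordStopTime": times[-1][0] + times[-1][1],
        "lastEventTime": times[-2][0] + times[-2][1],
        "nBars": n_bars,
        "eventData": event_data,
    }

# The eight note blocks from the "Seq" test earlier in the thread
seq_blocks = [
    (16, 10, 0, 74, 101, 41, 169, 0), (16, 128, 3, 74, 77, 48, 167, 0),
    (16, 227, 3, 69, 51, 36, 186, 0), (16, 146, 3, 69, 60, 39, 197, 0),
    (16, 226, 3, 65, 109, 44, 215, 0), (16, 178, 3, 65, 112, 44, 173, 0),
    (16, 191, 3, 62, 87, 54, 188, 0), (16, 196, 3, 62, 118, 58, 179, 0),
]
attrs = loop_data_xml(seq_blocks, n_bars=4)
```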

And that's it, so far.

********************************************

So...why such a ragged, complicated procedure...? Well, I found it difficult to get ChatGPT to write some of this stuff. It certainly couldn't do it all in one go. So I broke it down into simpler steps. I was planning to then get ChatGPT to string them all together and streamline it, but I don't know if I'll bother. Certainly not yet.

[Attachments: song (33.5 KB), py (807 Bytes), py (1.31 KB), py (1.11 KB), py (1.19 KB), mid (1.45 KB), py (1.38 KB)]

"Values 2 and 3 represent the absolute timing of the note in LSB/MSB form"

...correction...the RELATIVE timing.

tldr:

py midi1.py <midifile.mid>
py durations.py
py noteons.py
py dur2orb.py
rename orba_notes -> note_blocks
py rawtostring.py

Copy a song file and replace eventData along with lastEventTime, recordStopTime, writeIndex, nBars, calculated as above. (After creating the note_blocks.txt file, I run decodeblocks.py to report the "console output" summary with all the absolute times, notes, durations, etc., as a reality check to see if it looks sensible, and to get the numbers for calculating those additional XML values.)
  

[Attachments: py, txt (4.83 KB)]

(The main stumbling block here for anyone else who's remotely interested will be that stupid MIDI file format which replaces note-off with note-on vel 0. Unfortunately I didn't realise that Sonar was doing that until I was fairly well into proceedings. midi1.py can't handle the conventional format, so I'll need to revisit that at some point.)

OK...I think this might get things back on track.

Using "midi2.py" instead of "midi1.py" will hopefully handle conventional MIDI files with note-on/note-off sequences. I just downloaded a MIDI of the Maple Leaf Rag, followed the steps above with this new version:

py midi2.py <midifile.mid>
py durations.py
py noteons.py
py dur2orb.py
rename orba_notes -> note_blocks
py rawtostring.py

...etc., and it plays OK. (I only used the first section of the "note_blocks" file as it was pretty long and I'm still not sure what the note limit is.)

I'm interested in taking a look at the MIDI data for the other parts next, see what I can do with drums etc.

[Attachments: mid (20.6 KB), py (1.41 KB)]

(Song file for the start of Maple Leaf Rag attached...needs a slow tempo and fewer bars.)


[Attachment: song]

Just one more to prove it wasn't a fluke...;-)


Scotland The Brave. Tempo should be set to about 70.

Because I haven't nailed nBars yet, it might go quiet for a while after you first upload and play it.


[Attachments: song (36.4 KB), mid (4.04 KB)]


I managed to find the notes I had taken on what I had decoded from song eventData:


eventData:


# PlayNote

Command (16 is PlayNote, but there are likely others for CC values etc.)

Relative Start Tick [LSB] since last (event or note unknown)

Relative Start Tick [MSB]

MIDI Note #

Velocity ON (1-127 follows 7 bit MIDI standard)

Velocity OFF (1-127 follows 7 bit MIDI standard)

Duration in Ticks [LSB]

Duration in Ticks [MSB]


# CC messages require 4 bytes: 

20|23|39?|36?

startTickLSB

startTickMSB

Value


# PB messages appear to be 6 bytes: 

21

startTickLSB

startTickMSB

unused

valueLSB

valueMSB


# ??

37

byte1

byte2



# 32

8 bytes?? Similar to #16
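Taken together, these notes suggest a variable-length walk over the byte stream. A very tentative sketch; the command numbers and sizes below are taken straight from the notes above, question marks included, so treat them as guesses rather than confirmed format details.

```python
# Tentative event sizes: note commands (16/32) are 8 bytes, CC 4 bytes,
# pitch bend 6 bytes, command 37 apparently 3 bytes. Unconfirmed guesses.
EVENT_SIZES = {16: 8, 32: 8, 20: 4, 23: 4, 36: 4, 39: 4, 21: 6, 37: 3}

def split_events(raw):
    """Split raw eventData bytes into per-event chunks by command byte."""
    events, i = [], 0
    while i < len(raw):
        size = EVENT_SIZES.get(raw[i])
        if size is None:
            raise ValueError("unknown command %d at offset %d" % (raw[i], i))
        events.append(raw[i:i + size])
        i += size
    return events

# One note block, one pitch-bend message, one CC message
events = split_events(bytes([16, 10, 0, 74, 101, 41, 169, 0,
                             21, 0, 1, 0, 2, 3,
                             20, 5, 0, 64]))
```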



# 32 8 bytes?? Similar to #16

I seem to remember finding that multiples of 16 (0, 32) act in the same way as 16. I'm not sure what the difference is. I found that 16s started turning into 32s in a sequence where I was playing rapid notes, but I'll need to investigate further.

Just ordered another Orba 2, must be crazy but I want to see how sequence data works on that. ;-) 

These devices have so much potential, but I've lost interest in making software for them. I hope they can make these things stable by the end of the year. It's nice to see your progress :)

