
Orba hacking knowledge base

This thread is intended to gather the feedback of Orba tinkerers.



Try playing the Orba on a flat surface: it will emit less CC data and mostly note data, which may also help you decode the values. If you can get me the serial data log and the corresponding eventData, I might be able to figure it out. I've got a good understanding of MIDI messages and of hex and binary data.



It makes sense that there is header data which seems to correspond with this:

                loopBarEndTolerance: 120
                beatLengthTicks: 480
                notesPerBar: 4
                quantMode: 1
                quantStartSnapTicks: 120
                quantBarEndSnapTicks: 240
                allowRecordingSilenceStart: 1
                allowRecordingSilenceEnd: 0
                thinnerMode: 1
                cc_error_limit: 1000
                pbend_error_limit: 25000
                ccA: 1
                ccB: 74
                ccC: 75
                thickenerMode: 1
                thickenerEmitPrd: 20
                thickenerMaxDt: 5000
                noteStartWindow: 240

You said that indexes 0-17 appear to be the header, and there are 18 values here. Base64 decodes to an array of 8-bit numbers (0-255), but some of these values need less space (booleans such as thickenerMode, 0/1, only require 1 bit) and some are larger, requiring two bytes. I'm not sure how they're packed, but I'm really hoping they match the headers above. I've been quite busy the last 3-4 days and haven't had much of a chance to dig in. Fortunately, I rebuilt my PC and have my Windows machine back up (with better hacking tools than my Mac).
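As a quick sanity check on the packing question, here's a sketch (field names and values copied from the header listing above) of which of the 18 fields can't fit in a single byte, and so must take at least two bytes if the header is byte-packed:

```python
# Header fields and values as listed in the preset file above.
header_fields = {
    "loopBarEndTolerance": 120, "beatLengthTicks": 480, "notesPerBar": 4,
    "quantMode": 1, "quantStartSnapTicks": 120, "quantBarEndSnapTicks": 240,
    "allowRecordingSilenceStart": 1, "allowRecordingSilenceEnd": 0,
    "thinnerMode": 1, "cc_error_limit": 1000, "pbend_error_limit": 25000,
    "ccA": 1, "ccB": 74, "ccC": 75, "thickenerMode": 1,
    "thickenerEmitPrd": 20, "thickenerMaxDt": 5000, "noteStartWindow": 240,
}

# Any value over 255 needs at least two bytes in a byte-packed header.
wide = [name for name, value in header_fields.items() if value > 255]
print(wide)
# → ['beatLengthTicks', 'cc_error_limit', 'pbend_error_limit', 'thickenerMaxDt']
```

So at least four fields would need 16-bit slots, which means a naive "one byte per value" reading of indexes 0-17 can't hold all 18 fields as-is.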

I'm finding it more difficult than I thought to repeat this experiment with "tap only" presets.


1) I couldn't see how to create a tap-only copy of a song file because they don't contain Modifier lists.

2) I couldn't seem to create a tap-only Bass preset by only keeping the first Modifier/Seeker pair. 

3) Something seemed to go wrong with the song-saving system when I started trying to work with tap-only Lead presets, and the eventData looked different.


Next step will be to try starting over with a full, original Lead preset and try and pick out the header + starting note again.


...that data file I uploaded above was the wrong one, ignore that...just some rough notes. Here's the correct one...

xlsx

...this is going to be a lot easier if I start again with a set of songs for one gesture at a time, starting with tap only...

...so far so good. I've been able to record a tune on Bass, then fiddle the Song file to change the first note. It'll get a bit more complicated after that because there might be an arbitrary number of CC messages until the next one.


>"I suspect its the exact same data from the MemIdx above"


...yep, I guess eventData is basically a list of MIDI event codes followed by MIDI event data. Time to hit the loop logs again.

I just tried recording a few different bass loops and looking at LoopData. 


For the first one, I started with "All Out", let it count four metronome clicks in, played the lowest note, then pressed the central button to stop recording on the third click after that. I then repeated with the next two notes up.


Comparing the strings:


F3wHdxQAAFEVAAAAACAQAAA5dSRoAA==

FxcAaxQAAFUVAAAAACAQAAA7eUpVABRIADA=

F2MHYhQAAGYVAAAAACAQAAA9cTlyAA==


I've found that other similar loops have the same string from characters 12-23:


VAAAAACAQAAA


This is followed by 5, 7 and 9 in the three strings, which decode to 57, 59 and 61 at index 17: the MIDI note numbers the Orba sends. So it looks like the MIDI data sequence starts there.


(For general reference, note that the Fiddler data index starts at 0, so this is the 18th value in the sequence.)
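The index-17 observation is easy to verify by decoding the three strings above:

```python
import base64

# The three Bass loop strings quoted above.
loops = [
    "F3wHdxQAAFEVAAAAACAQAAA5dSRoAA==",
    "FxcAaxQAAFUVAAAAACAQAAA7eUpVABRIADA=",
    "F2MHYhQAAGYVAAAAACAQAAA9cTlyAA==",
]

# Byte at 0-based index 17 of each decoded string.
notes = [base64.b64decode(s)[17] for s in loops]
print(notes)  # → [57, 59, 61]
```

These are exactly the MIDI note numbers from the three recordings, so the "MIDI data starts at index 17" reading holds for all three loops.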


I've attached a CSV with some of this info. The set of values under the eventData are taken from other parameters in the song file's LoopData entry.

txt
(3.39 KB)

LoopData might contain some of the header info before we get to the MIDI data. I tried decoding the first 19 characters of the LoopData and comparing the values. A couple of the numbers might correlate, eg:


Index 2 and Index 10 = 74 = ccB

Index 16 = 75 = ccC


Trial and error I guess; not sure what ccA/ccB/ccC even are yet...
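For the trial and error, a small helper can scan a decoded LoopData string for the known CC numbers. This is only a sketch: it assumes an 18-byte header with one byte per value, which is unconfirmed, and the values 1/74/75 are just ccA/ccB/ccC from the preset header.

```python
import base64

def find_cc_candidates(loopdata_b64, cc_values=(1, 74, 75)):
    """Return the 0-based header positions where each known CC number appears.

    Assumes the first 18 decoded bytes are the header, one byte per value
    (an unverified guess based on the 18 header fields in the preset).
    """
    header = base64.b64decode(loopdata_b64)[:18]
    return {cc: [i for i, b in enumerate(header) if b == cc] for cc in cc_values}
```

Running this over several loops and keeping only the positions that stay constant should narrow down which header slots (if any) really hold ccA/ccB/ccC.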

 

csv
(740 Bytes)

When I was looking at it yesterday, I think it was showing a log like this while playing at one point.


image


I wasn't able to get into that mode just now, but comparing it with the "loop info" file from this other loop might be helpful while trying to figure out the LoopData string. 

Here's a set of files from a simple Bass loop:


1) MP3 recording 

2) MIDI recording (from Orba USB to a DAW) 

3) Song file

4) Bass LoopData from the song file

5) Full loop info from the console with header

6) MIDI notes only from the loop info


Looking at (6), the last note in the data is the first note in the tune. Maybe that's just the timing of the playing. There's been no attempt to quantise it. It would be interesting to try and change one of the notes.


I found this a bit of a fiddle, but hopefully it's right. 

 

zip

Figuring out the calculation of ticks to notes will be important. I found this to help.
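Assuming beatLengthTicks: 480 in the header means 480 ticks per beat (the usual pulses-per-quarter-note convention; an assumption here, not confirmed), ticks convert to wall-clock time like this:

```python
BEAT_LENGTH_TICKS = 480  # beatLengthTicks from the preset header above

def ticks_to_seconds(ticks, bpm):
    # ticks / ticks-per-beat gives beats; each beat lasts 60/bpm seconds.
    return ticks / BEAT_LENGTH_TICKS * 60.0 / bpm

print(ticks_to_seconds(480, 120))  # one beat at 120 BPM → 0.5
```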

This log data will be invaluable for breaking down the eventData in a song's LoopData nodes. I suspect it's the exact same data as the MemIdx entries above. Eventually I'll build the Generic Fiddler, but for now we can just paste the eventData from a song like the one below into the experimental fiddler, though it might be too long. Quantizing would just be rounding the tick values to the nearest tick.

    <LoopData writeIndex="504" recordStartTime="0" recordStopTime="0" lastEventTime="3484"
              nBars="4" eventData="INgAKkswRwAggQAqUi5ZABBoAC1ZKpoAIO4AKkMnNwAglgAqTDNPABBsACRKKKcAIBUARQsAWgAQ7wAqSi1CABCMACpLLlQAEFkALWMrpwAQ+wAqSCw/ABCMACpPLVAAEGoAJFAlkQAgHABFBABaABDMACpTLkMAEI4AKkssVgAQYQAtWCeBABD+ACpUMkYAEIwAKlo3UQAQWwAkSyiOACAbAEUFAFoAEOkAKlYxPQAgIgBFBABaABBoACpOMEoAEFwALUUwawAQ8gAqXjVFACAeAEUEAFoAEGYAKlI1SgAQcwAkVSl4ACAbAEUHAFkAEMEAKkkwRAAQogAqTTJEABBLAC1PHnsAEPkAKlk4RgAgJABFBQBZABBvACpJNkIAEFkAJFotnAAgGgBFBQBZABDjACphNzsAICAARQMAWgAQaAAqTjFDABBqAC1TI2wAEPAAKlw2OAAgHwBFBABZABBxACpVNEUAEFQAJF0upgAgGQBFCQBZABDSACpaOjsAIBwARQUAWgAQagAqSzFGABBkAC1eNHwAEAQBKlw0NgAQigAqUjVIACBIACRnPXoAEBsARQIAWQAQ6gAqVy49ACAhAEUFAFkAEGQAKlAySwAQXQAtUiZvABD9ACpoOD8AICAARQcAWgAQUwAqUjk0ABBeACRXLKUA"
              eventDataCrc="fe32e7bf"/>
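The rounding idea can be sketched in a couple of lines, assuming quantStartSnapTicks: 120 from the header is the snap grid (unconfirmed; it might equally be quantBarEndSnapTicks or something else):

```python
QUANT_SNAP_TICKS = 120  # quantStartSnapTicks from the header; assumed to be the grid

def quantize_tick(tick, snap=QUANT_SNAP_TICKS):
    # Round to the nearest multiple of the snap interval.
    return round(tick / snap) * snap

print(quantize_tick(2759))  # → 2760
```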

 

In the meantime, I guess it should be possible to reconstruct it from the log.


Here's a simple tune with Bass. I've been messing around with quantise, and I think it may have done something to it, because I don't think I played it that badly.


I've attached the data from the console. All rows except "MIDI Note" can be discarded. The note data looks plausible. To reconstruct a MIDI sequence from the timing data, I'd be thinking Max/MSP, if I knew how to use it; or maybe one of those music programming languages like Sonic Pi.
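As a stepping stone before Max/MSP or Sonic Pi, the note rows can be flattened into an absolute-time on/off event list in plain Python. Field names follow the console log; the two rows here are sample values taken from it:

```python
# Note rows as they appear in the console log (tick, note, duration, von, voff).
notes = [
    {"tick": 2759, "note": 52, "duration": 114, "von": 107, "voff": 51},
    {"tick": 2759, "note": 59, "duration": 114, "von": 107, "voff": 51},
]

events = []
for n in notes:
    events.append((n["tick"], "note_on", n["note"], n["von"]))
    events.append((n["tick"] + n["duration"], "note_off", n["note"], n["voff"]))
events.sort(key=lambda e: e[0])  # chronological order, ready for playback
```

From here it's a short hop to any sequencing tool: each tuple is an absolute tick plus a note-on or note-off with its velocity.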


https://soundcloud.com/qchord/bass-tune/s-CEGeZa4mnx5?utm_source=clipboard&utm_medium=text&utm_campaign=social_sharing


csv
(11.8 KB)

The "MemIdx" (memory index?) entries might be useful...


MemIdx = 148 - MIDI Note at tick 2759, channel 1, note 52, duration 114, von 107, voff 51

MemIdx = 156 - MIDI Note at tick 2759, channel 1, note 59, duration 114, von 107, voff 51

MemIdx = 164 - MIDI Note at tick 2759, channel 1, note 64, duration 114, von 107, voff 51

MemIdx = 172 - MIDI Note at tick 2759, channel 1, note 68, duration 114, von 107, voff 51

MemIdx = 180 - MIDI CC at tick: 2822, channel: 0, CC Num: 1, CC Val: 0

MemIdx = 184 - MIDI CC at tick: 2858, channel: 1, CC Num: 255, CC Val: 44

MemIdx = 188 - MIDI CC at tick: 3462, channel: 0, CC Num: 1, CC Val: 7

MemIdx = 192 - MIDI CC at tick: 3839, channel: 1, CC Num: 255, CC Val: 83

MemIdx = 196 - MIDI CC at tick: 3839, channel: 1, CC Num: 74, CC Val: 93


It could potentially show the structure of a block of note data if we could find a way to copy it off.
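One thing the MemIdx deltas already suggest: the note entries are 8 apart (148, 156, 164, 172) and the CC entries 4 apart (180, 184, 188, 192), which hints at 8-byte note records and 4-byte CC records in memory. Meanwhile, the log itself is easy to parse into structured events; this sketch matches the exact line format quoted above:

```python
import re

# Two sample lines copied from the console log above.
LOG = """\
MemIdx = 148 - MIDI Note at tick 2759, channel 1, note 52, duration 114, von 107, voff 51
MemIdx = 180 - MIDI CC at tick: 2822, channel: 0, CC Num: 1, CC Val: 0
"""

note_re = re.compile(r"MemIdx = (\d+) - MIDI Note at tick (\d+), channel (\d+), "
                     r"note (\d+), duration (\d+), von (\d+), voff (\d+)")
cc_re = re.compile(r"MemIdx = (\d+) - MIDI CC at tick: (\d+), channel: (\d+), "
                   r"CC Num: (\d+), CC Val: (\d+)")

events = []
for line in LOG.splitlines():
    m = note_re.match(line)
    if m:
        events.append(("note",) + tuple(map(int, m.groups())))
        continue
    m = cc_re.match(line)
    if m:
        events.append(("cc",) + tuple(map(int, m.groups())))

print(events[0])  # → ('note', 148, 2759, 1, 52, 114, 107, 51)
```

With the full log pasted into LOG, the parsed tuples could then be compared byte-for-byte against the decoded eventData.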

...as well as the quantizationMode parameter in the "Mode" tags...


<Mode name="Lead" quantizationMode="0" volume="200">


...the Song presets have entries like this which are probably tied up with loops and quantisation.


<LoopData writeIndex="0" recordStartTime="0" recordStopTime="0" lastEventTime="0"

nBars="0" eventData="" eventDataCrc="ffffffff"/>
