
The perils of LARPing

A project log for FATCAT: Altoids Tin Mod Tracker

A drum machine, base synth and arpeggiator that fits in your pocket. User interface inspired by classic mod tracker software.

Dejan Ristic • 10/10/2018 at 10:51 • 0 Comments

And naturally, by LARP I'm referring to the (L)ong ARP...

Oh... You don't know what that is? 

Sheesh, well I guess I'll start off by explaining that then: 

Primer on ARPs, LARPs and FLARPs and their use within chiptune music

So first of all, the arpeggio (the arp) is an indispensable tool in the classic chiptune's (and its sample-based mod cousin's) bag of tricks. It provides a way of playing a full chord's worth of notes while demanding little more in system resources than plonking out a single note. 

Furthermore, to my mind at least, there exist in essence two distinct styles of the chiptune arpeggio: 

One is the short arp. It's basically what I just said: a cheap way of "faking" a chord on a system lacking in resources. The individual constituent notes of the chord play in rapid sequence, usually repeating several times within each row. Each consecutive chord row basically plays out the same as the previous one. 

Then there's the looong arp! That's the arp elevated into an arpform (sorry, sorry, that should read artform). The long arp usually consists of more notes, and the note sequence usually plays at a slower pace. (But not necessarily: it can get real fast too.) But the long arp sequence in its entirety always ranges over the duration of several pattern rows. By definition, it wouldn't be a long arp if it didn't. 

If you haven't ever heard a long arp, or are unsure, I strongly encourage you to seek out the title track of the NES game "Solstice", composed by Tim Follin, which features perhaps the world's most glorious example of the FLARP (i.e. the Fast Long ARP). 

The song data model

At this point, I should probably state clearly what the actual purpose of this log is: it will explain how the song data model works, and also how it has changed during the development of FATCAT's arp feature. I'll begin with how proto-FATCAT works in that respect, and then continue on to describe what changes were made in the current project. Don't worry though: the reason for having the previous section will become apparent eventually. I wasn't just trying to waste your time.

Proto-FATCAT data model

As with all things proto-FATCAT, designing a song data model was pretty straightforward. And I'll get into that, but first I'll just mention a few relevant facts: there are 512 bytes of RAM in total, a song consists of 16 patterns of 16 rows each, and there are two tracks to store per row, Drum and Base.

Now let's have a look at a code snippet:

#define NR_PATTERNS 16
#define NR_ROWS     16
typedef struct {
    uint8_t track[NR_PATTERNS][NR_ROWS];   /* one byte of note data per row           */
    uint16_t base_row_on[NR_PATTERNS];     /* per-row on/off flags for the Base track  */
    uint16_t base_row_alt[NR_PATTERNS];    /* per-row legato/portamento flags          */
} SongType;

These arrays hold the song data, and they're what gets loaded/saved between RAM and EEPROM. There's no concept of loading/saving individual patterns: it's all or nothing. The track array holds all the data about what the song's notes are and how they're positioned in the patterns. Each individual byte in track holds the note data for one specific row of the song. I'll get back to the two other arrays a bit later. 
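Just to make the all-or-nothing part concrete, here's roughly what a save/load could look like with avr-libc's EEPROM routines (my own sketch; the actual FATCAT source may well do it differently):

#include <avr/eeprom.h>

SongType song;              /* the working copy in RAM         */
SongType ee_song EEMEM;     /* space reserved for it in EEPROM */

/* Save/load the whole song in one go; no per-pattern granularity. */
void song_save(void) { eeprom_update_block(&song, &ee_song, sizeof(SongType)); }
void song_load(void) { eeprom_read_block(&song, &ee_song, sizeof(SongType)); }

The whole proto-FATCAT SongType weighs in at 320 bytes (256 + 32 + 32), so one blanket write covers everything.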

A better name for track would've probably been "tracks", since it contains the note data for both the Drum and the Base:

Bit nr:      7    6    5    4    3    2    1    0
track data:  D:1  D:0  B:5  B:4  B:3  B:2  B:1  B:0

The three octaves' worth of Base notes fit comfortably within the six LSBs, and the two remaining bits are just enough to fit the three Drum notes. Having separate arrays for each track would've been too wasteful: 16 patterns would then have required exactly 512 bytes, which is all the RAM there is.
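In code, picking a row's byte apart would look something like this (a hypothetical snippet following the layout above; the names are mine, not from the FATCAT source):

uint8_t b    = song.track[pattern][row];
uint8_t drum = (b >> 6) & 0x03;   /* D:1..D:0 -> one of the three Drum notes */
uint8_t base = b & 0x3F;          /* B:5..B:0 -> three octaves of Base notes */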

The two other SongType arrays act as bit-flags that augment the note data for the Base track. The base_row_alt bit-flags are used for activating the legato/portamento effect on Base note rows. 

Then there's base_row_on, which requires a bit of explanation:

In this log I showed a little UI usage example that demonstrated how an active note needs to be turned off before it can be turned on again and changed. (That's a compromise resulting from only using three buttons for the UI.) And if the user turns a note off just in order to change it, they probably don't want its current value to get lost in the process. For that reason, I wanted previously "on" notes to retain their value even after being turned off. 

So the bit-flags in base_row_on are used to hold the current on/off state of Base notes independently of their note values. I could've just as easily added the same functionality for the Drum track, but I was a bit overzealous about not wasting space at that point, and this feature didn't seem super necessary for just the three Drum notes. So instead, a Drum note is set to "off" simply by setting its value to "0".
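For illustration, checking or flipping one of those flags could look like this (again a hypothetical sketch rather than actual FATCAT code):

/* Each uint16_t in base_row_on holds one flag bit per row of a pattern. */
uint8_t base_note_is_on(const SongType *s, uint8_t pat, uint8_t row)
{
    return (s->base_row_on[pat] >> row) & 1u;
}

void base_note_toggle(SongType *s, uint8_t pat, uint8_t row)
{
    s->base_row_on[pat] ^= (uint16_t)(1u << row);
}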

FATCAT: Early data model

Proto-FATCAT could easily fit a song of 16 patterns or more in RAM. But things weren't going to be as easy from now on. 

First, let's have a look at the new data structure (the NR_PATTERNS value is unknown for now).

#define NR_PATTERNS ?   /* value to be determined */
#define NR_ROWS     16
typedef struct {
    uint8_t x_track[NR_PATTERNS][NR_ROWS];   /* (roughly) the Base track           */
    uint8_t y_track[NR_PATTERNS][NR_ROWS];   /* (roughly) the Arp track            */
    uint8_t z_track[NR_PATTERNS][NR_ROWS];   /* Drums, plus assorted leftover bits */
} SongType;

Here the bit-flag arrays are gone and there are just three nondescript track arrays. These roughly correspond to each of the (now three) tracks: Base, Arp and Drums. However, the individual track data is a bit scattered among them in order to fill up all available bits, so I used nondescript names so as not to confuse myself while coding. 

At this point I'd decided it would be a good idea for track editing to work basically the same way for all three tracks. That meant each track would have an "alt" function (portamento for Base and something else for the other tracks), and each track would retain values for inactivated notes. So all three tracks would need their own "alt" and "on" flags.

Here's the nitty bitty:

Bit nr:        7             6              5    4    3    2    1    0
x_track data:  B: "on" flag  B: "alt" flag  B:5  B:4  B:3  B:2  B:1  B:0
y_track data:  A: "on" flag  A: "alt" flag  A:5  A:4  A:3  A:2  A:1  A:0
z_track data:  D: "on" flag  D: "alt" flag  C:3  C:2  C:1  C:0  D:1  D:0

The x_track and y_track are organized exactly the same and target the Base and Arp tracks respectively. The z_track contains all the odds and ends. First of all, it has the Drum track bits in there. Then there are the C bits, which stand for "Chord". The A:5 through A:0 bits in y_track are only used for encoding the base note of the arpeggio (base as in origin, not to be confused with the Base track notes). The C bits tell you which one of 16 hard-coded chords should be arpeggiated starting on that base note.
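Decoded, a z_track byte would break down roughly like this (hypothetical names, same caveat as before):

uint8_t z        = song.z_track[pattern][row];
uint8_t drum_on  = (z >> 7) & 1u;     /* D "on" flag                           */
uint8_t drum_alt = (z >> 6) & 1u;     /* D "alt" flag                          */
uint8_t chord    = (z >> 2) & 0x0F;   /* C:3..C:0, one of 16 hard-coded chords */
uint8_t drum     = z & 0x03;          /* D:1..D:0                              */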

How many patterns?

Proto-FATCAT's data model allowed for using way more than 16 patterns, but I capped it at that number since the display only goes up to "F". But how many patterns would fit in RAM with this new data model?

Thing is, I've actually never bothered to learn how to determine the exact amount of RAM that's consumed by a program at different points during its execution. I guess by using a debugger? Or that text file you can have GCC generate? You're supposed to be able to tell something from that? 

Look, people: I'm laying my ignorance out bare for the world to see. Maybe that way I'll shame myself into finally making the effort to learn that stuff.

Until then, my method for figuring out how much I have to work with is to semi-methodically add stuff until something breaks. By that method I gathered that the program needs to keep something like 60 bytes for its variables and function stack in order to not instantly crash and burn. That meant I had about 450 bytes of RAM to put song data in, or let's say 440 to be on the safe side.
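For the record, the back-of-envelope arithmetic goes something like this (my own reconstruction, sketched as defines):

/* Each pattern costs three track arrays of 16 rows = 48 bytes of RAM. */
#define RAM_BUDGET   440
#define PATTERN_COST (3 * NR_ROWS)                  /* x_track + y_track + z_track */
#define MAX_PATTERNS (RAM_BUDGET / PATTERN_COST)    /* 440 / 48 = 9 patterns       */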

Long story short: 

The reduction in song size, from 16 patterns down to nine, was unfortunate, but it was definitely worth it given the greater musical versatility provided by the addition of an Arp track. And the number nine seemed so appropriate for a device named FATCAT. I mean, 9 lives. How cool of a coincidence is that?

Could FATCAT become a LARPer?

So now I'd implemented the arp instrument as one of FATCAT's major new features. However, at this point it was only capable of performing a short arp. Also, being ever wary of overextending myself with respect to system resources, I'd made the arp chords hard-coded presets only. That decision was also made in part because I didn't want the UI to become overly complicated to operate.

I debated with myself whether I should aim for the stars (relatively speaking) and implement user-editable arp chords, up to 16 notes long. That way I would add the holy grail, the long arp, to FATCAT's capabilities.

During development I'd realized that the "on" flag method I'd used for retaining note values was kind of dumb. You only really want to retain the note value of the current row, and that would just require a single variable in RAM. So it would make sense to ditch all the "on" flags and just have note value "0" mean "off".
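Something like the following is all it would take (a hypothetical sketch of the idea, not actual FATCAT code):

static uint8_t stashed_note;      /* remembers the value of the row being edited */

void row_note_off(uint8_t *row_byte)
{
    stashed_note = *row_byte;     /* keep the old value around...               */
    *row_byte = 0;                /* ...since 0 now means "off"                 */
}

void row_note_on(uint8_t *row_byte)
{
    *row_byte = stashed_note;     /* restore it when the note is turned back on */
}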

I was in the process of figuring out a way of shoehorning in a chord editor function which I felt wouldn't totally mess up the UI structure. And since I felt I should fix the "on" flag situation anyway, I decided to do a final redesign of the data model, which would finally make user-editable arp chords a reality.

Current data model

So this is what the data model looks like after the final redesign (well, it looks slightly different in the actual v0.8 source but it's essentially the same thing):

#define NR_PATTERNS       ?   /* value to be determined */
#define NR_ROWS           16
#define NR_CHORD_PATTERNS 8
#define NR_CHORD_ROWS     16
typedef struct {
    uint8_t db_track[NR_PATTERNS][NR_ROWS];    /* Drum + Base note data per row   */
    uint8_t arp_track[NR_PATTERNS][NR_ROWS];   /* arp base note + chord index     */
    uint16_t drum_alt[NR_PATTERNS];            /* per-row "alt" flags, Drum track */
    uint16_t base_alt[NR_PATTERNS];            /* per-row "alt" flags, Base track */
    uint16_t arp_alt[NR_PATTERNS];             /* per-row "alt" flags, Arp track  */
    uint8_t chord_pattern[NR_CHORD_PATTERNS][NR_CHORD_ROWS];   /* user-defined chords (offsets from the arp base note) */
    uint8_t chord_len[NR_CHORD_PATTERNS];      /* notes actually used per chord   */
} SongType;

As you can see, the "alt" bit-flag arrays are back in style again. Also, the Drum and Base tracks have once again merged into the db_track array, and the arp_track array now contains both the base note and the chord parts of the arpeggio. The chord_pattern array contains the user-defined chords of up to 16 notes, where each note is actually an offset from the arp base note (remember, "base" means "origin" in the arp context). Finally, the chord_len array determines how many notes are actually used in each chord. 

Here are the details:

Bit nr:          7    6    5    4    3    2    1    0
db_track data:   D:2  D:1  D:0  B:4  B:3  B:2  B:1  B:0
arp_track data:  C:2  C:1  C:0  A:4  A:3  A:2  A:1  A:0

Since "0" now means note "off" I had to add another D bit to keep having a drumkit with four drum sounds. The max number of unique arp chords has been reduced from 16 to 8, requiring three C bits. That leaves only five bits each for the Base track note and the Arp track base note!

In this digital age, it's an unfortunate fact that we happen to use a 12-tone scale for our music, since it doesn't divide evenly into powers of 2. Five bits just isn't enough to fit the full three octaves anymore. So the third octave for the Base and Arp instruments now ends abruptly at F#. But sacrifices had to be made, and I convinced myself that "hey, those high notes sounded a bit shrill anyway".
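Putting the pieces together, playing back one arp row might look roughly like this (a hypothetical sketch that assumes the chord_pattern entries simply get added to the arp base note; arp_tick stands in for whatever counter drives the arp):

uint8_t a         = song.arp_track[pattern][row];
uint8_t chord_idx = (a >> 5) & 0x07;     /* C:2..C:0 -> one of 8 user chords */
uint8_t base_note = a & 0x1F;            /* A:4..A:0, 0 means "off"          */
uint8_t len       = song.chord_len[chord_idx];

if (base_note != 0 && len != 0) {
    uint8_t step = arp_tick % len;       /* walk through the chord, wrapping */
    uint8_t note = base_note + song.chord_pattern[chord_idx][step];
    /* ...hand 'note' over to the tone generator... */
}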

Now to the verdict. How many patterns would fit in RAM with the new data model?
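If you want to check the arithmetic yourself, my own reconstruction goes roughly like this:

/* Per pattern: db_track (16) + arp_track (16) + three uint16_t "alt" words (6) = 38 bytes.
   The chord data is a fixed cost on top of that: 8*16 + 8 = 136 bytes.                     */
#define RAM_BUDGET   440
#define PATTERN_COST (2 * NR_ROWS + 3 * sizeof(uint16_t))                      /* 38 bytes  */
#define CHORD_COST   (NR_CHORD_PATTERNS * NR_CHORD_ROWS + NR_CHORD_PATTERNS)   /* 136 bytes */
#define MAX_PATTERNS ((RAM_BUDGET - CHORD_COST) / PATTERN_COST)                /* 304/38 = 8 */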

The perils of LARPing

That's the price you get to pay for LARPing: FATCAT's gonna lose one of its lives, dropping from nine patterns to eight. But then again, you might argue it lost that life for being so fat, so it actually sort of works that way too when you think about it.

But apart from the aesthetic or gimmicky aspects of the FATCAT project (which I was obviously becoming increasingly obsessed with), the harm actually isn't all that great. As I argued in the previous post, the size restriction on song data is one of FATCAT's weaknesses, but it's also one of its strengths, which from a bookkeeping standpoint would make the net result of having one less pattern nil. At least that's one way of looking at it. 

Sidenote: Yeah, no kidding about the obsession! Look at all these tiresome puns in recent posts. What am I turning into?
