How Font Rendering Works in Seeed_GFX
01/22/2026 at 19:39

Glyphs and Bitmaps
In this log entry, we'll explore in depth a key aspect of the Seeed_GFX library: the inner workings of FreeFonts.
If you open the header file for any FreeFont, such as FreeMono12pt7b.h, you'll see it has two clearly separated parts: first, a long array of bytes called FreeMono12pt7bBitmaps, and right after it, a shorter array named FreeMono12pt7bGlyphs.
This second array contains metadata: information about how each character (numbers, letters, symbols) is drawn for that font. Each of these drawings is called a "glyph".
For example, the info needed to draw the "A" glyph is as follows:
{ 408, 14, 14, 14, 0, -13 } // 'A'
Each value has a specific meaning:
- 408: Offset (position) within the FreeMono12pt7bBitmaps array where the glyph’s definition starts.
- 14: Glyph width in pixels.
- 14: Glyph height in pixels.
- 14: Cursor advance after drawing the glyph (equal to the width in this case).
- 0: Horizontal offset from the cursor position to the left edge of the glyph.
- -13: Vertical offset from the baseline to the top row of the glyph (−13 means the glyph starts 13 pixels above the baseline, so its bottom row lands exactly on the baseline).
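For reference, these six values correspond to the per-glyph descriptor struct used by Adafruit-GFX-style FreeFonts, which TFT_eSPI (and therefore Seeed_GFX) inherits. The exact integer widths can vary between library versions, so treat this as a sketch rather than the library's literal declaration:

```cpp
#include <stdint.h>

// Sketch of the per-glyph descriptor in Adafruit-GFX-style fonts
// (field names follow the usual gfxfont.h; types may differ per library).
typedef struct {
    uint16_t bitmapOffset; // start of this glyph's bits in the Bitmaps array (408 for 'A')
    uint8_t  width;        // glyph width in pixels (14)
    uint8_t  height;       // glyph height in pixels (14)
    uint8_t  xAdvance;     // how far the cursor moves after drawing (14)
    int8_t   xOffset;      // horizontal offset from the cursor to the glyph's left edge (0)
    int8_t   yOffset;      // vertical offset from the baseline to the glyph's top row (-13)
} GFXglyph;
```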
Each pixel in the glyph equals one bit, so the glyph is 14 x 14 = 196 bits, or 196 / 8 = 24.5 bytes (rounded up to 25 bytes).
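In code, that rounding up is usually done with integer arithmetic. Here is the one-liner version of the calculation (plain C++, not taken from the library):

```cpp
// 14 * 14 = 196 bits; adding 7 before the integer division rounds up to whole bytes
int width = 14, height = 14;
int glyphBytes = (width * height + 7) / 8;   // (196 + 7) / 8 = 25
```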
If you read 25 bytes from the FreeMono12pt7bBitmaps array starting at offset 408, you'll get the following values:
0x3F, 0x00, 0x0C, 0x00, 0x48, 0x01, 0x20, 0x04, 0x40, 0x21, 0x00, 0x84, 0x04, 0x08, 0x1F, 0xE0, 0x40, 0x82, 0x01, 0x08, 0x04, 0x20, 0x13, 0xE1, 0xF0
Not very enlightening, right?
But what if we view those in binary instead of hexadecimal?
0x3F 0x00 → 0011111100000000
0x0C 0x00 → 0000110000000000
0x48 0x01 → 0100100000000001
0x20 0x04 → 0010000000000100
0x40 0x21 → 0100000000100001
0x00 0x84 → 0000000010000100
0x04 0x08 → 0000010000001000
0x1F 0xE0 → 0001111111100000
0x40 0x82 → 0100000010000010
0x01 0x08 → 0000000100001000
0x04 0x20 → 0000010000100000
0x13 0xE1 → 0001001111100001
0xF0 → 11110000
It still looks like a bunch of random numbers. But wait! The glyph is 14 pixels wide—what happens if we group the bits in sets of 14 instead of 16?
01: 00111111000000 ..######......
02: 00000011000000 ......##......
03: 00000100100000 .....#..#.....
04: 00000100100000 .....#..#.....
05: 00000100010000 .....#...#....
06: 00001000010000 ....#....#....
07: 00001000010000 ....#....#....
08: 00010000001000 ...#......#...
09: 00011111111000 ...########...
10: 00010000001000 ...#......#...
11: 00100000000100 ..#........#..
12: 00100000000100 ..#........#..
13: 00100000000100 ..#........#..
14: 11111000011111 #####....#####
Now it makes sense—out of that apparent chaos of bits, the pattern for the letter "A" in this font appears.
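Here is a small stand-alone program that performs the same decoding: it walks the 25 bytes shown above bit by bit (most-significant bit first, with no padding between rows) and prints the 14 × 14 pattern. It is plain C++ you can compile on a PC, not code taken from the library:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // The 25 bytes for 'A' quoted above, as found in FreeMono12pt7bBitmaps at offset 408
    const uint8_t bitmap[] = {
        0x3F, 0x00, 0x0C, 0x00, 0x48, 0x01, 0x20, 0x04, 0x40, 0x21, 0x00, 0x84,
        0x04, 0x08, 0x1F, 0xE0, 0x40, 0x82, 0x01, 0x08, 0x04, 0x20, 0x13, 0xE1, 0xF0
    };
    const int width = 14, height = 14;   // from the glyph entry

    int bit = 0;                         // running bit index into the packed stream
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++, bit++) {
            // Bits are packed MSB-first, rows follow each other with no padding
            bool on = bitmap[bit >> 3] & (0x80 >> (bit & 7));
            putchar(on ? '#' : '.');
        }
        putchar('\n');
    }
    return 0;
}
```

Running it reproduces the same "A" pattern shown above.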
Every time you display text on the screen, the library goes through this whole process for each letter: first, it looks up the glyph’s info in the font metadata; then it dives into the bitmap array and “draws” the letter pattern pixel by pixel in the memory buffer. Finally, as we saw before, when you call update(), that buffer gets sent to the panel so the text appears on the screen.
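As a rough illustration of that per-letter step, here is a simplified glyph-drawing routine. It is not Seeed_GFX's actual implementation: it assumes the GFXglyph struct sketched earlier and some drawPixel(x, y, color) primitive that writes one pixel into the buffer, and it leaves out background fill, clipping, and text scaling:

```cpp
#include <stdint.h>

// Assumed primitive that writes one pixel into the memory buffer (hypothetical).
extern void drawPixel(int16_t x, int16_t y, uint16_t color);

// Hypothetical helper, not the library's code: draw one glyph at the current cursor position.
// 'bitmaps' is the font's Bitmaps array; 'g' is the GFXglyph entry shown earlier.
void drawGlyph(const GFXglyph &g, const uint8_t *bitmaps,
               int16_t cursorX, int16_t cursorY, uint16_t color) {
    const uint8_t *bits = bitmaps + g.bitmapOffset; // jump to this glyph's bit pattern
    uint32_t bit = 0;
    for (uint8_t y = 0; y < g.height; y++) {
        for (uint8_t x = 0; x < g.width; x++, bit++) {
            if (bits[bit >> 3] & (0x80 >> (bit & 7))) {
                // xOffset/yOffset place the glyph relative to the cursor and the baseline
                drawPixel(cursorX + g.xOffset + x, cursorY + g.yOffset + y, color);
            }
        }
    }
    // After this, the text cursor would advance horizontally by g.xAdvance pixels.
}
```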

