I load the following table into flash:
#if FONT_NORMAL_8_EN
const INT8U font_Normal8Data[] PROGMEM =
{
' ', //character
2, //pixel width
0x00, 0x00, // [ ]
0x00, 0x00, // [ ]
0x00, 0x00, // [ ]
0x00, 0x00, // [ ]
0x00, 0x00, // [ ]
0x00, 0x00, // [ ]
0x00, 0x00, // [ ]
0x00, 0x00, // [ ]
'*', //character
6, //pixel width
0x50, 0x00, // [ * * ]
0x20, 0x00, // [ * ]
0xF8, 0x00, // [***** ]
0x20, 0x00, // [ * ]
0x50, 0x00, // [ * * ]
0x00, 0x00, // [ ]
0x00, 0x00, // [ ]
0x00, 0x00, // [      ]
// ... remaining glyph records omitted ...
};
#endif // FONT_NORMAL_8_EN
The 2 after the ' ' and the 6 after the '*' are the pixel widths of those characters. However, the table ends up in flash as follows (taken from the .lss file):
000005bd
5bd: 20 05 00 00 00 00 00 00 00 00 00 00 00 00 00 00
5cd: 00 00 21 04 60 00 60 00 60 00 60 00 60 00 00 00
5dd: 60 00 00 00 22 06 48 00 48 00 00 00 00 00 00 00
5ed: 00 00 00 00 00 00 23 07 28 00 7c 00 7c 00 28 00
5fd: 7c 00 7c 00 28 00 00 00 24 07 3c 00 68 00 68 00
60d: 38 00 2c 00 2c 00 78 00 28 00 25 08 64 00 6c 00
61d: 08 00 18 00 10 00 36 00 26 00 00 00 26 09 3c 00
62d: 66 00 3c 00 7d 00 67 00 67 00 3d 00 00 00 27 03
63d: 40 00 40 00 00 00 00 00 00 00 00 00 00 00 00 00
64d: 28 05 10 00 20 00 60 00 60 00 60 00 60 00 20 00
65d: 10 00 29 05 40 00 20 00 30 00 30 00 30 00 30 00
66d: 20 00 40 00 2a 07 10 00 54 00 7c 00 38 00 7c 00
67d: 54 00 10 00 00 00 2b 07 00 00 10 00 10 00 7c 00
68d: 10 00 10 00 00 00 00 00 2c 04 00 00 00 00 00 00
69d: 00 00 00 00 20 00 60 00 40 00 2d 07 00 00 00 00
6ad: 00 00 7c 00 00 00 00 00 00 00 00 00 2e 04 00 00
As can be seen in row 5bd, the byte after 0x20 (' ') is 5 when it should have been 2, and in row 66d the byte after 0x2a ('*') is 7 when it should have been 6.
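(Each record in the listing is 18 bytes: the character byte, the width byte, then 8 rows of 2 bytes; 0x20 sits at 0x5bd and 0x21 at 0x5cf, 18 bytes later.) To rule out a stale .lss file, here is a minimal sketch that copies the first few records straight out of program memory into RAM so they can be inspected in the debugger/simulator and compared against the listing. It assumes font_Normal8Data is visible through the font header; GLYPH_RECORD_SIZE, ram_copy and dump_font_start are names used only for illustration:

#include <avr/pgmspace.h>
#include <stdint.h>

#define GLYPH_RECORD_SIZE 18  /* 1 character byte + 1 width byte + 8 rows * 2 bytes */

/* first three glyph records (' ', '!', '"'), copied to RAM for inspection */
static uint8_t ram_copy[3 * GLYPH_RECORD_SIZE];

void dump_font_start(void)
{
    uint8_t i;

    /* read straight out of program memory, so the bytes seen here are
       exactly what was programmed into flash, independent of the .lss */
    for (i = 0; i < sizeof(ram_copy); i++)
        ram_copy[i] = pgm_read_byte(font_Normal8Data + i);
}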
I use this code to read the width, and it does indeed return the wrong value:
//read individual char's width from flash
pixel_width = pgm_read_byte(font_GlyphP + 1);
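For context, font_GlyphP is supposed to point at the first byte of a glyph record (the character byte), so the width sits at offset 1. A simplified sketch of how the pointer is effectively computed, assuming 18-byte records and consecutive characters starting at ' ' as the listing shows (select_glyph and FONT_FIRST_CHAR are illustrative names, not the exact code):

#include <avr/pgmspace.h>

#define GLYPH_RECORD_SIZE 18   /* character + width + 8 rows * 2 bytes */
#define FONT_FIRST_CHAR   ' '  /* first character stored in the table */

static const INT8U *font_GlyphP;
static INT8U pixel_width;

void select_glyph(char c)
{
    /* point at the record for c; byte 0 is the character, byte 1 the width */
    font_GlyphP = font_Normal8Data
                + (INT8U)(c - FONT_FIRST_CHAR) * GLYPH_RECORD_SIZE;

    //read individual char's width from flash
    pixel_width = pgm_read_byte(font_GlyphP + 1);
}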
So why would GCC put a completely different value into flash? I tried changing the optimisation level from -Os to -O0, but it made no difference. I am using WinAVR with avr-gcc 3.4.1.