[Solved] In Studio5, is char signed or unsigned?


Edit: Solved
I'm using an external makefile, a modified version of the one from the LUFA VirtualSerial demo.
It uses the compiler option "CFLAGS += -funsigned-char", which tells the compiler to use unsigned char by default.

------------------------------------
I'm using Studio5 version 5.1.208.
I thought that char was a signed 8-bit type by default, but when using itoa() it seems to be unsigned unless explicitly declared signed.

Here's the code:

void TestBits(void)
{
	ShowBits(-128);
	ShowBits(-1);
	ShowBits(0);
	ShowBits(1);
	ShowBits(127);	
}



void ShowBits(signed char Num)
{
	char	i;
	char	Buffer[9] = "        ";
	char	NumStr[9] = "        ";

	/* Decimal representation of Num; the printed value depends
	   on whether Num's type is signed or unsigned. */
	itoa((int)Num, NumStr, 10);

	/* Binary representation, MSB first. */
	for (i = 0; i < 8; i++) {
		if (Num & (1 << i))
			Buffer[7-i] = '1';
		else
			Buffer[7-i] = '0';
	}

	fputs("\n\r", &USBSerialStream);
	fputs(NumStr, &USBSerialStream);
	fputs(" = ", &USBSerialStream);
	fputs(Buffer, &USBSerialStream);
	fputs("\n\r", &USBSerialStream);
}

And here are the results:

Using: void ShowBits(char Num)

128 = 10000000
255 = 11111111
0 = 00000000
1 = 00000001
127 = 01111111

Using: void ShowBits(signed char Num)

-128 = 10000000
-1 = 11111111
0 = 00000000
1 = 00000001
127 = 01111111

The only difference between the two runs was changing char to signed char.

Did I get it backwards all these years?

Last Edited: Wed. Apr 18, 2012 - 10:37 AM

No. You just didn't bother to read the flags used in the avr-gcc compile stage.

Traditionally, most compilers used signed char, but it has always been implementation-defined. Like the width of an int, or any number of other things.

It is best to be unambiguous. So if you want a signed char, use int8_t, and if you want an unsigned char, use uint8_t. Then there is no confusion.

It only becomes important when it gets promoted (as in your example). In which case, you cast accordingly.

David.


david.prentice wrote:
No. You just didn't bother to read the flags used in the avr-gcc compile stage

Thanks for the response.
I was updating the original post when you posted.
It was due to a compiler option declared in the makefile.


Every Makefile and IDE that I have ever seen specifies -funsigned-char.

At least you know what to expect.

I believe that avr-gcc will default to unsigned-char even if you don't use the switch. (Untested)

David.


I'm slowly getting used to using makefiles with Studio5.

david.prentice wrote:
I believe that avr-gcc will default to unsigned-char even if you don't use the switch. (Untested)

I didn't see a "-fsigned-char", just one for unsigned.
So I'm guessing that signed char is the default.

Edit:
I commented out the "-funsigned-char" compiler option and ran the program again using
"void ShowBits(char Num)" and the compiler treated Num as a signed char.

So the default is signed char, which can be changed to unsigned char with the -funsigned-char compiler option.


I never assume char is signed or unsigned, and make a habit of using stdint.h. I never use plain "char" for numeric work: I use uint8_t and so on, uint16_t for an unsigned int since sizeof(int) varies, and all those proven conventions.


stevech wrote:
I never assume char's signed or unsigned ...

Yep, I've seen those used on some of the code posted here: int8_t, uint8_t, int16_t, uint16_t, etc.

Guess it's time for me to break a bad habit.