16bit registers again :-(


I've read the datasheet and the app note about accessing 16-bit registers in assembler (byte order and so on), but I use avr-gcc, so the compiler should know how to deal with it, right?

I've also checked that my C code compiles to the correct read and write byte order, and it does.

For reads: low byte first, then high byte
For writes: high byte first, then low byte

The problem:

I'm trying to set TCNT1, which is a 16-bit register, to an arbitrary starting value, but the high byte always reads back as ZERO. I've written some code that lights up 8 LEDs to show the contents of the high/low byte: TCNT1 is read into a uint16_t variable, which drives the LED display. If I replace the read of TCNT1 with, e.g., 'variable = 0xFFFF;', the LED code displays the value correctly.

Here's the code:

#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/delay.h>
#include <stdint.h>


int main (void);

int
main (void)
{

  DDRD |= ((1 << PD5));		// PD5 output
  PORTD |= ((1 << PD5));	// RED anodes HIGH. All 8 anodes go to this pin
  DDRB = 0xFF;			// all outputs
  PORTB = 0xFF;			// all 8 cathodes HIGH --> OFF

  unsigned char sreg;
  volatile unsigned int timer1;
  unsigned char ctr;

  while (1)
    {
      sreg = SREG;		// store IRQ flags
      cli ();			// all IRQs off
      TCCR1B &= ~((1 << CS12) | (1 << CS11) | (1 << CS10));	// stop timer1
      TCNT1 = 0xFFFF;		// set arbitrary value, should stay the same as timer1 is stopped
      timer1 = TCNT1;		// read it back. the compiler should know how to do a 16bit read
      //timer1 = 0xFFFF;            // use this instead of reading TCNT1 and it works as expected
      SREG = sreg;		// restore IRQ flags

      for (ctr = 0; ctr <= 7; ctr++)
	{			// show LOW byte of timer1
	  if ((timer1 >> ctr) & 0x0001)
	    {
	      PORTB &= ~(1 << ctr);	// turn the LED on
	    }
	  else
	    {
	      PORTB |= (1 << ctr);	// turn the LED off
	    }
	}
      _delay_ms (300);
      PORTB = 0xFF;		// all off
      _delay_ms (300);

      for (ctr = 0; ctr <= 7; ctr++)
	{			// show HIGH byte of timer1
	  if ((timer1 >> (ctr + 8)) & 0x0001)
	    {
	      PORTB &= ~(1 << ctr);	// turn the LED on
	    }
	  else
	    {
	      PORTB |= (1 << ctr);	// turn the LED off
	    }
	}
      _delay_ms (300);
      PORTB = 0xFF;		// all off
    }
}

Reading from TCNT1 is compiled to:

      timer1 = TCNT1;          
  e6:   60 91 84 00     lds     r22, 0x0084
  ea:   70 91 85 00     lds     r23, 0x0085
  ee:   7a 83           std     Y+2, r23        ; 0x02
  f0:   69 83           std     Y+1, r22        ; 0x01

Writing to TCNT1 is compiled to:

      TCNT1 = 0xFFFF;           
  de:   10 93 85 00     sts     0x0085, r17
  e2:   00 93 84 00     sts     0x0084, r16

The byte order seems to be according to specs.

I've also tried accessing TCNT1 via TCNT1H / TCNT1L, manually enforcing the correct byte order, but that doesn't work either: the high byte is always zero. According to an avr-gcc bug report, the read/write byte order for the special 16-bit registers has been fixed since about 2005.

CPU: ATmega168 (old, no P or PA)
avr-libc: 1.6.7-4.7
avr-gcc: 4.4.2

My last test was compiling the code on a virtual Windows XP machine using the Arduino IDE, to get access to a different version of avr-gcc (4.3), but the problem is still there.

I'm at my wit's end here.

Help.


Gentlemen, I should get more sleep.

This code actually works; at least it did 5 minutes ago. Nevertheless, many will find it useful to make sure that timer1 really runs in 16-bit mode by keeping the WGM10 bit of TCCR1A cleared to 0, which is the default value according to the datasheet.
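For reference, a minimal sketch of that precaution (assuming <avr/io.h> on an ATmega168; the helper name is mine, and clearing all four WGM bits selects normal mode per the datasheet):

```c
#include <avr/io.h>

/* Sketch: force timer1 into normal (16-bit) mode before loading TCNT1.
 * All WGM bits cleared is the datasheet default (normal mode). */
static void timer1_normal_mode (void)
{
  TCCR1A &= ~((1 << WGM11) | (1 << WGM10));
  TCCR1B &= ~((1 << WGM13) | (1 << WGM12));
}
```

This is only a safeguard; if the registers are still at their reset values, the bits are already 0.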