Moving from C to C++, weird issue

#1

 

After the last 8 years of writing in plain C for my Xmega projects I took it upon myself to learn some C++ and make a C++ version of one of my telescope controllers.

 

Making the migration to objects went pretty easily, but one thing was a "silent killer" that bit me and took me a few hours to find. I use the delay_ms and delay_us functions once in a while. For example, there is an I2C joystick I use, and you have to do the following (rough sketch below):

1) Do an I2C write, saying "I want to read the joystick".
2) Wait for some time (let the joystick prepare the data).
3) Do an I2C read and get the results.
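
In rough form the sequence is just this -- not my actual driver; i2c_write/i2c_read, the address, the command byte, and the 200 us are all stand-ins for whatever your own I2C layer looks like:

#include <stdint.h>
#include <util/delay.h>          /* needs F_CPU defined for correct timing */

/* Hypothetical I2C driver calls -- placeholders for the real ones. */
extern void i2c_write(uint8_t addr, const uint8_t *buf, uint8_t len);
extern void i2c_read(uint8_t addr, uint8_t *buf, uint8_t len);

#define JOY_ADDR 0x20            /* made-up 7-bit address, illustration only */

void joystick_poll(uint8_t *result, uint8_t len)
{
    const uint8_t cmd = 0x01;         /* "I want to read the joystick"      */
    i2c_write(JOY_ADDR, &cmd, 1);     /* step 1: ask for a reading          */
    _delay_us(200);                   /* step 2: let it prepare the data    */
    i2c_read(JOY_ADDR, result, len);  /* step 3: read the result back       */
}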

That delay at step #2 uses delay_us, and it turns out that in C++ it was running about three times faster than it had when compiled as C. Even looking at the signals on the scope I didn't catch it right away -- all the I2C operations I saw looked correct; it was the time duration *between* the I2C operations that got me.

 

Why would this delay change so dramatically?

 

Thanks!

Mike in Alaska

#2

Could you post your code, at least for step #2?

#3


deathcow wrote:
Why would this delay change so dramatically?
Because you are defining F_CPU as a -D under "Symbols" and (curiously) AS7 has two different Symbols settings, one for C and one for C++.
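
_delay_us() and _delay_ms() turn the requested time into a busy-wait loop count computed from F_CPU at compile time, so if the C++ build sees a smaller F_CPU than the chip really runs at, every delay comes out proportionally short. A cheap way to catch the mismatch is a compile-time check next to the delay.h include -- just a sketch, and the 32 MHz here is only an assumption, so swap in whatever your Xmega is actually clocked at:

#ifndef F_CPU
#  error "F_CPU not defined for this translation unit (check the C++ Symbols page too)"
#endif

#if F_CPU != 32000000UL          /* assumed 32 MHz -- use your real clock */
#  error "F_CPU does not match the expected clock"
#endif

#include <util/delay.h>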

 

#4

Look at the assembly. I use "avr-objdump -d main.elf > main.asm" to generate asm from my binaries. Another approach is to replace "-c" with "-S" on your avr-gcc or avr-g++ commands, if you use those.
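
For example -- file names, device, and F_CPU value below are only placeholders for whatever the project actually uses -- dumping both builds and diffing them should show the different delay-loop constants if F_CPU differs between the C and C++ settings:

avr-objdump -d main.elf > main.asm

avr-gcc  -S -Os -mmcu=atxmega256a3u -DF_CPU=32000000UL main.c   -o main_c.s
avr-g++  -S -Os -mmcu=atxmega256a3u -DF_CPU=32000000UL main.cpp -o main_cpp.s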

#5

Thanks for the tips. I had actually tried setting F_CPU in both of those Symbols blocks. In fact, I had to put weird values into them to make delay_us or delay_ms seem normal -- they were terribly wrong otherwise.

I used the scope to measure the RS232 bit rate at 9600 baud and also the I2C bit rates, and they looked correct. It's just delay_us and delay_ms that seem whacked. I'll use the scope this time to see the relationship between F_CPU and the delay times it produces.
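
Something like this should make that relationship easy to see on the scope -- pin and delay value are arbitrary, any spare pin works:

#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    PORTC.DIRSET = PIN0_bm;            /* probe on PC0 -- arbitrary pin choice   */

    for (;;) {
        PORTC.OUTTGL = PIN0_bm;        /* each edge brackets one delay           */
        _delay_ms(10);                 /* should measure 10 ms if F_CPU is right */
    }
}

If the toggles come out roughly three times too fast under C++ but correct under C, that pins it on F_CPU rather than anything in the I2C driver.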

#6

Perhaps you're just misusing them, and the abuse manifests differently in the two cases?

 

As ever, you'll need to post a minimum complete project for each case which demonstrates the issue.

Top Tips:

  1. How to properly post source code - see: https://www.avrfreaks.net/comment... - also how to properly include images/pictures
  2. "Garbage" characters on a serial terminal are (almost?) invariably due to wrong baud rate - see: https://learn.sparkfun.com/tutorials/serial-communication
  3. Wrong baud rate is usually due to not running at the speed you thought; check by blinking a LED to see if you get the speed you expected
  4. Difference between a crystal and a crystal oscillator: https://www.avrfreaks.net/comment...
  5. When your question is resolved, mark the solution: https://www.avrfreaks.net/comment...
  6. Beginner's "Getting Started" tips: https://www.avrfreaks.net/comment...