AVR ATxMEGA32A4 Time Question

14 posts / 0 new

#1

Hello,

 

I come to this forum to ask for assistance in getting the Unix/epoch time (in seconds) on an ATxMEGA32A4. I will be using the USART to send the time out to a serial monitor for logging.

#2

Where will your source of timing come from? How accurate do you want it?

#3

With the MCU there is an RTC I can use, according to the datasheet. It is a 16-bit RTC with a separate oscillator, and the setup should be as accurate as possible.

#4

Simply having an RTC is not sufficient. YOU have to provide the means for setting the time. That information could come from several sources:

 

1. Manual entry by an operator

2. GPS

3. Internet connection that supports NTP, Network Time Protocol

4. Maybe from the cellular network

 

Personally, I would use a 32.768 kHz crystal with that separate oscillator. If that is not stable enough over temperature, I would use one of the temperature-compensated RTC chips/modules (Maxim, etc.).

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

#5

How do you plan to write software for this?

 

If it is avr-gcc (the C compiler that comes with AS7), then note that recent versions now have:

 

http://www.nongnu.org/avr-libc/u...

 

In particular notice system_tick().

 

However note that as Jim says, while you can set this up and "tick" it there has to be some mechanism to set the time in the first place. For that you would use set_system_time():

 

http://www.nongnu.org/avr-libc/u...

 

As you are presumably going to let the user set the date/time in terms of yy-mm-dd hh:mm:ss, you would likely populate a struct tm and then call mk_gmtime() or mktime():

 

http://www.nongnu.org/avr-libc/u...

 

That value could then be passed to set_system_time(), and then you call system_tick() at 1 Hz.

#6

There is now a time.h file collection in avr-libc, as Cliff notes. I used it in one logger application with a Mega328 (which has an async oscillator and counter chain similar to your Xmega's) where the user has to enter the time and date via a terminal. It's cumbersome, but it works and it's cheaper than other solutions. A newer logger application simply reports elapsed seconds (actually eighth-seconds) from the start of logging. That is MUCH easier, but it does require externally recording the start time for future reference.

 

My point in mentioning all this is that not all situations that need "time" require the full time-and-date magic. Fully implemented, the time.h library occupies significant processor resources. In my M328 program, it took over 10% of the flash and a significant part of SRAM (due to copious use of 64-bit integers, which really are needed, and multiple copies of the time struct, which would be difficult to do without). And I did not implement time zones and several other similar features.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Fri. Jun 16, 2017 - 05:08 PM
#7

ka7ehk wrote:

I was unaware that I would need more than the RTC to implement current time. Of those options, I believe manually entering the current time would be best for my device.

A logger application that could display the Unix time after every line would fit my request too.

 

 

clawson wrote:

I'm currently using Atmel Studio 7 to code the firmware for my device. I'm thankful that the time.h library is available for use.

 

Unfortunately, I'm unsure how that last step would be coded. Will the system tick change the value of the system time? How would I be able to get the time as Unix time from there? Then there's the challenge of sending that value out via the serial port.

 

Just to note, the device features an LED screen (16x3 in terms of viewable space) with joystick controls to scroll and edit. I would probably need multiple entries for the date and for the time.

 

 

 

Thanks for the replies!

#8

Will the system tick change the value of the system time? How would I be able to get the time in unix from there? Then, there's the challenge of sending that value out via serial port.

 

Unix time is simply a 32-bit number of seconds since 1970-01-01 00:00:00 UTC, the Unix epoch. Configure the XMEGA RTC to generate an interrupt every second and have that interrupt service routine increment the time. Use the functions in time.h to convert to/from seconds and various text formats.

Greg Muth

Portland, OR, US

Xplained Boards mostly

Atmel Studio 7.0 on Windows 10 VM hosted by Ubuntu 17

 

#9

Hi All,

 

I have a similar question; I'm not trying to take over the thread, so let me know if I should start a new post. I have three XMEGAs connected to a controller running Linux. The goal is for the XMEGAs to provide timestamps to the controller when they send data back. I had already looked into time.h and having the controller send the time to the XMEGAs. My only question is: would this give only one-second resolution on timestamps, since system_tick() is called every second? I would need timestamps at microsecond resolution. Note that the timestamps between the XMEGAs need to be somewhat synchronized, so just running their own timers for a timestamp wouldn't be very useful.

 

Thank you all.

#10

Greetings Tabarious -

 

For that sort of resolution, you are talking "serious stuff". The only way I can imagine that working over a time span longer than a few seconds is for the master to periodically send time-sync information to the slaves.

 

That said, even on an Xmega running as fast as it can, you would still be using up a LOT of resources just for timekeeping. Then you would also have the problem of maintaining the time in the presence of interrupting processes that could take longer than the 1-microsecond tick time. It will take very careful design and, even then, the outcome is uncertain.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

#11

Unix time is, and presumably (until 2038) always will be, a 32-bit signed number representing the number of seconds since 1 Jan 1970. It does not contain higher resolution: not even milliseconds, let alone microseconds.

 

What a lot of people do in various applications is decimal-shift the value 3 or 6 places to the left and then add milliseconds or microseconds in the 3/6 digits that are thereby opened up.

So if the time now is 1497891635, then imagine 1497891635000 or 1497891635000000 (either held in a uint64_t, probably), where something like 1497891635726544 means Mon, 19 Jun 2017 17:00:35.726544, etc.

 

You will likely have to strip the last 3/6 digits before you can create things like "Mon, 19 Jun 2017 17:00:35" from the remaining 10 digits. Then, as I just did, add a '.' and then the last 6 as the fractional part.

#12

Hi Jim,

 

Thanks for the reply. Sorry, to clarify: it doesn't need to be 1 microsecond resolution; around 100 microseconds would probably suffice. So you would recommend the master periodically sending its own timestamp to correct for error in the slaves, and then just setting up a timer on the slaves to 'tick' every 100 microseconds or so?

#13

That is what I would do.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

#14

Hello again,

 

I have implemented user-entered time and was able to pass it to mk_gmtime() successfully. In turn, I was able to feed the result of mk_gmtime() into set_system_time(). The trouble I am still having is implementing the ISR in my program to increment the system time every second. There is also my persistent confusion about which function in time.h would let me get the system time as Unix time. Is it to use mk_gmtime() again, but with NULL as its parameter, after the system time has been set up correctly with the ISR?

I am referencing the 'AN_8047 Source Code' example found on Microchip: https://www.microchip.com/wwwpro.... Am I going down the wrong path and/or over-complicating things?