STK600+UC3C+ATMEL ICE DEBUG

#1

Hello:

I couldn't find anything on this issue (or maybe I just don't have the right words for it), and I'm new to Atmel products.

Setup:
I have an STK600 board with a UC3C2256C target chip, using the STK600-generated 12 MHz clock. An Atmel-ICE debugger is attached to the JTAG connector. I programmed the target with both the STK600 and the Atmel-ICE, and the blinking LED works.

TL;DR setup: STK600 + UC3C2256C + AS7 + C++ project + ASF + Atmel-ICE debugger

 

Problem:

So I have this setup with a blinker project, and it works when downloaded to the target, but when debugging with the Atmel-ICE it just gets stuck inside the "__always_inline static unsigned long cpu_is_timeout(t_cpu_time *cpu_time)" function in the "cycle_counter.h" file, and stays there until the debug finishes suddenly. Any idea why the program works but the debug won't?

 

Extra info: I configured the JTAG clock to 2 MHz.

 

I'll expand my details on this, pasting the code for my project:

 

 conf_board.h

 

#ifndef CONF_BOARD_H
#define CONF_BOARD_H

#define LED AVR32_PIN_PA00
#define LED1 AVR32_PIN_PA01

#endif // CONF_BOARD_H

init.c

 

#include <asf.h>
#include <board.h>
#include <conf_board.h>
#include <uc3c2256c.h>
void board_init(void)
{
	/* This function is meant to contain board-specific initialization code
	 * for, e.g., the I/O pins. The initialization can rely on application-
	 * specific board configuration, found in conf_board.h.
	 */
	
	ioport_init();                                       // call before using the IOPORT service
	ioport_set_pin_dir(LED, IOPORT_DIR_OUTPUT);          // LED pin set as output
	ioport_set_pin_level(LED, IOPORT_PIN_LEVEL_LOW);     // switch LED on (STK600 LEDs are active-low)
	
	ioport_set_pin_dir(LED1, IOPORT_DIR_OUTPUT);         // LED1 pin set as output
	ioport_set_pin_level(LED1, IOPORT_PIN_LEVEL_HIGH);   // switch LED1 off
}

conf_clock.h

 

#ifndef CONF_CLOCK_H_INCLUDED
#define CONF_CLOCK_H_INCLUDED




#define CONFIG_SYSCLK_SOURCE        SYSCLK_SRC_OSC0


/* Fbus = Fsys / (2 ^ BUS_div) */
#define CONFIG_SYSCLK_CPU_DIV         0
#define CONFIG_SYSCLK_PBA_DIV         0
#define CONFIG_SYSCLK_PBB_DIV         0
#define CONFIG_SYSCLK_PBC_DIV         0

//#define CONFIG_USBCLK_SOURCE        USBCLK_SRC_OSC0
//#define CONFIG_USBCLK_SOURCE        USBCLK_SRC_OSC1
//#define   CONFIG_USBCLK_SOURCE        USBCLK_SRC_PLL0
//#define CONFIG_USBCLK_SOURCE        USBCLK_SRC_PLL1

/* Fusb = Fsys / USB_div */
#define CONFIG_USBCLK_DIV             1

#define CONFIG_PLL0_SOURCE            PLL_SRC_OSC0
//#define CONFIG_PLL0_SOURCE          PLL_SRC_OSC1
//#define CONFIG_PLL0_SOURCE          PLL_SRC_RC8M

/* Fpll0 = (Fclk * PLL_mul) / PLL_div */
#define CONFIG_PLL0_MUL               (48000000UL / BOARD_OSC0_HZ)
#define CONFIG_PLL0_DIV               1

#define CONFIG_PLL1_SOURCE          PLL_SRC_OSC0
//#define CONFIG_PLL1_SOURCE          PLL_SRC_OSC1
//#define CONFIG_PLL1_SOURCE          PLL_SRC_RC8M

/* Fpll1 = (Fclk * PLL_mul) / PLL_div */
#define CONFIG_PLL1_MUL               (48000000UL / BOARD_OSC0_HZ)
#define CONFIG_PLL1_DIV               1

#endif /* CONF_CLOCK_H_INCLUDED */
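
(For reference on the formulas above: with the 12 MHz crystal defined in user_board.h below, CONFIG_PLL0_MUL works out to 48000000 / 12000000 = 4, so Fpll0 = 12 MHz * 4 / 1 = 48 MHz, and likewise for PLL1. The PLLs aren't actually driving the system clock here, though, since CONFIG_SYSCLK_SOURCE is SYSCLK_SRC_OSC0 and all the bus dividers are 0, so the CPU should be running straight off the 12 MHz oscillator.)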

user_board.h

 

#ifndef USER_BOARD_H
#define USER_BOARD_H

#include <conf_board.h>


#define BOARD_OSC0_HZ             12000000
#define BOARD_OSC0_IS_XTAL        true
#define BOARD_OSC0_STARTUP_US     4096


#endif // USER_BOARD_H

main.c

 

#include <asf.h>                // pulls in the ASF services used below (sysclk, ioport, delay)

int main (void)
{
    /* Insert system clock initialization code here (sysclk_init()). */
    sysclk_init();
    board_init();

    /* Insert application code here, after the board has been initialized. */
    while (1) {
        ioport_toggle_pin_level(LED);
        delay_ms(1000);
        if (OSC0_MODE_VALUE == OSC_MODE_XTAL)
        {
            ioport_set_pin_level(LED1, IOPORT_PIN_LEVEL_LOW);
        }
    }
}

 

 

 

Last Edited: Wed. Jul 26, 2017 - 01:51 PM

#2

Show code.


#3

Alefachini wrote:
I'm new to Atmel products

So any particular reason for starting with UC3?

 

http://www.avrfreaks.net/comment...

#4

Alefachini wrote:
debug finishes suddenly

Watchdog?

 

#5

'Blinker project' suggests to me that you are using the ASF cpu_delay_ms() routine, and that >99.9% of the time your code will be in the cpu_is_timeout() routine waiting for the time delay to expire.
cpu_is_timeout() uses the internal counter COUNT, which is incremented by the CPU clock.
The debugger stops the CPU clock when you reach a breakpoint or after a single step, so it can take a very long time for the value in COUNT to reach the pre-calculated time-delay value.
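
Roughly, the mechanism looks like this. This is a simplified sketch of the busy-wait, not the actual cycle_counter.h source; the struct field, the helper names and the __builtin_mfsr offset used to read COUNT are illustrative assumptions:

/* Simplified sketch of the ASF cycle-counter delay described above.
 * Names and the COUNT read are illustrative, not the real cycle_counter.h code. */

typedef struct {
	unsigned long delay_end_cycle;    /* COUNT value at which the delay expires */
} sketch_cpu_time_t;

/* Read the AVR32 COUNT system register (assumed offset 0x108, via the avr32-gcc builtin). */
static inline unsigned long sketch_read_count(void)
{
	return __builtin_mfsr(0x108);
}

static inline void sketch_set_timeout(unsigned long delay_cycles, sketch_cpu_time_t *t)
{
	t->delay_end_cycle = sketch_read_count() + delay_cycles;
}

static inline int sketch_is_timeout(const sketch_cpu_time_t *t)
{
	/* While the debugger has the core halted, COUNT stops advancing,
	 * so this condition is never met while single-stepping. */
	return (long)(sketch_read_count() - t->delay_end_cycle) >= 0;
}

static void sketch_delay_ms(unsigned long ms, unsigned long fcpu_hz)
{
	sketch_cpu_time_t t;
	sketch_set_timeout((fcpu_hz / 1000) * ms, &t);
	while (!sketch_is_timeout(&t)) {
		/* busy-wait: this is where a blinker spends >99.9% of its time */
	}
}

With delay_ms(1000) at 12 MHz the end value is 12 million cycles away; with the core halted at a breakpoint, COUNT never gets there, which is why the debugger appears to be stuck inside cpu_is_timeout().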

#6

I understand these are kind of hardcore uCs, but I have to migrate one big project to this uC, and it's also the task that my boss gave me.

#7

Nope, not using it. I mean, if it's enabled by default it could be, but without debugging my program works perfectly.

#8

"Hardcore" is not the point - there are plenty of other "hardcore" alternatives.

 

"Minority" and "Legacy" are the point - why not choose a part with widespread support, and a bright future?

 

 

#9

mikech wrote:

'Blinker project' suggests to me that you are using the ASF cpu_delay_ms() routine, and that >99.9% of the time your code will be in the cpu_is_timeout() routine waiting for the time delay to expire.
cpu_is_timeout() uses the internal counter COUNT, which is incremented by the CPU clock.
The debugger stops the CPU clock when you reach a breakpoint or after a single step, so it can take a very long time for the value in COUNT to reach the pre-calculated time-delay value.

 

What I can see is that after running the board_init function, the step pointer goes into the "__always_inline static void cpu_set_timeout(..." function, and after that, if I press RUN, the system keeps running until it stops.

What should I do then?

 

And if I set a breakpoint in the while loop inside main, I get the warning "this breakpoint will never be reached".

Last Edited: Wed. Jul 26, 2017 - 02:05 PM

#10

Just because this is the chip the project has in it, and I must use this one or wait for my boss to decide to migrate the device. For now I'm stuck with it.