Slow startup issues with ATtiny10


Hello again, everyone!

 

I have a customer that, for one reason or another, has a really slow startup on a sensor of ours (they say 1 s), and it's causing some issues with a PWM output on an ATtiny10 device.  I have an output coming from another ASIC that is fed into the ADC of the ATtiny10, and we output a PWM signal.  Unfortunately, it appears that the slow startup is causing an output issue with the PWM.  If we do a normal (fast) startup, an input of 0.6 V sits at around 1.7 kHz.  When we run a slow startup (about 5 s; they say the startup is only 1 s, but it works just fine for us at 1 s, so I'm going with the extreme case of 5 s), the PWM output is about 700 Hz.  Not sure if there's some sort of timing calibration done when the chip starts up, but somehow the timing is all wrong despite having the same input.

 

Below are some scope shots of the problem:

[scope captures omitted]

CH1 (Yellow): Vsupply

CH2 (Blue): ADC Input V

CH3 (Pink): PWM Freq Out

 

As you can see above, the slow startup does cause some sizeable jumps on our ADC input, but I'm not 100% sure that's the issue (unless there's some sort of ADC calibration being done internally on the chip at startup).

 

I tried to resolve this with the VLMCSR register, to delay the code from starting until the Vsupply is at a reasonable level, but it's possible the code is still starting before some other process is ready, or maybe I'm just not setting it up correctly.  Below is the code I used:

 

VLMCSR = 0x04;                  // Set to VLM3 (3.2 to 4.5V)
while((VLMCSR & 0x80) == 0x80)  // Wait until voltage rises above trigger level
{

}

It's not complicated code by any means, but it's what I have at the beginning of my code, before the inits:

 

int main(void)
{
	VLMCSR = 0x04;
	while((VLMCSR & 0x80) == 0x80)
	{

	}

	HardwareInit();
	TimerInit();
	ADCInit();

	while(1)
	{
	    // ADC to PWM algorithm here
	}
}

Unfortunately, it doesn't appear to be having any effect, and I'm still getting the same 600 mV input and 700 Hz output.

[scope captures omitted]

I've started the program up with a 1 s delay instead, and that seemed to resolve my issue (I get the expected 1.7 kHz output).

[scope captures omitted]

While it might still resolve the problem the customer is having (I'm still skeptical that's the root of the problem, but I'll sort that out when their fixture arrives), I want to do this the right way and not band-aid it.  It's possible that I AM implementing it correctly and the voltage monitor just won't resolve the issue (which is fine), but if I'm approaching this wrong or coding it incorrectly, another pair of eyes would be very helpful!

 

Can anyone help me see if I'm missing something important or implementing this incorrectly?  Any help you can provide would be greatly appreciated!  Thanks!

 

Edit:  Apologies for all the edits, I was adding some additional screenshots.

Last Edited: Mon. Aug 27, 2018 - 09:28 PM

Sorry, meant to upload this in the initial post to save everyone some time (pg 51 of the datasheet, located at http://ww1.microchip.com/downloads/en/DeviceDoc/Atmel-8127-AVR-8-bit-Microcontroller-ATtiny4-ATtiny5-ATtiny9-ATtiny10_Datasheet.pdf):

[datasheet excerpt: VLMCSR register description]

you are drifting a bit & not so clear:

 

Unfortunately, it appears that the slow startup is causing an output issue

Is it the sensor that is starting slow, the tiny10, or both?

 

has a really slow startup on a sensor of ours (they say 1 s), and it's causing some issues with a PWM output on an ATtiny10 device

So are these two separate things?  Or is the sensor the tiny10? 

 

When things work properly, are you intending to vary the freq or instead the duty cycle (the PWM)?  Typically, the freq is at a fixed setting & duty varies (such as dim to bright).

If your code is not changing the freq (timer) settings, then you have some other issue...maybe the clock source (or clock settings).  Freq won't simply change by itself without some register setting changing or the master clock freq changing.

 


I'm not familiar with the brain-dead AVRs; is the VLMCSR used instead of a proper BOD?

Does it have BOD?  If so, that should hold the CPU in reset until the power supply is high enough to run the CPU.

BTW, what is the power source?  Why does it power up so slowly, solar cell perhaps?

 

Jim

 


Thanks for the replies, everyone! 

 

avrcandies wrote:

you are drifting a bit & not so clear:

 

Unfortunately, it appears that the slow startup is causing an output issue

Is it the sensor that is starting slow, the tiny10, or both?

 

 

Both.  Maybe I didn't explain it very well.  The power supply is coming up slowly, so it's causing the first ASIC to come up slowly as well.  As you can see, the input voltage from the first ASIC goes up and down at the beginning because it's affected by the startup too.  Eventually, that voltage steadies out and sits at 0.6 V.  The ATtiny10 doesn't seem to produce the same output with a slow Vsupply rise as it does with a quick one, so I'm assuming there's some sort of calibration being done at some point (I'm assuming it's being done in some sort of ADC initialization but I'm not really sure at this point).

 

avrcandies wrote:

has a really slow startup on a sensor of ours (they say 1 s), and it's causing some issues with a PWM output on an ATtiny10 device

So are these two separate things?  Or is the sensor the tiny10? 

 

When things work properly, are you intending to vary the freq or instead the duty cycle (the PWM)?  Typically, the freq is at a fixed setting & duty varies (such as dim to bright).

If your code is not changing the freq (timer) settings, then you have some other issue...maybe the clock source (or clock settings).  Freq won't simply change by itself without some register setting changing or the master clock freq changing.

 

 

Once again, my wording wasn't great.  The Vsupply affects both devices, but the first ASIC, after going up and down (the blue signal in the graphs above), eventually evens out at 0.6 V or so.  Even though it's initially affected by the slow rise time on the Vsupply (which is what I mean by slow startup), it eventually gets where it should.  The ATtiny10, however, gives a different frequency output for the same input voltage depending on the speed of the startup (either 700 Hz for a slow startup or 1.7 kHz for a fast one).  It appears to be doing something at initialization, likely in the ADC.

 

It did appear to work with a delay, so I'm guessing the delay allows the internal voltages on the ADC (likely whatever internal voltage it uses for initial calibration) to stabilize before that calibration runs; calibrating too early would explain the wrong output frequency.  Maybe I need to find out where the ADC does its initial calibration and run the register check before that?  It's hard for me to see what it's doing at this point.

 

ki0bk wrote:

I'm not familiar with the brain-dead AVRs; is the VLMCSR used instead of a proper BOD?

Does it have BOD?  If so, that should hold the CPU in reset until the power supply is high enough to run the CPU.

BTW, what is the power source?  Why does it power up so slowly, solar cell perhaps?

 

Jim

 

 

To be honest with you, I have NO idea why our customer has such a slow startup time.  I'm going to try to convince them to change their hardware, but if their systems are already in the field, that's probably not going to be an option for us.

 

The chip doesn't appear to have brownout detection, though; I've searched the datasheet for "brown" and "BOD" and it only finds one line, "7. BOD Disabled" on pg 160.  So it appears that might be an ATmega feature that isn't present on this ATtiny.

 

It's possible there is no solution for us apart from setting up a delay (assuming our Vsupply rise rate is known and consistent).  I was just hoping to learn the right way to approach this rather than just band-aid it.

 

Thanks again for the help, guys!

 


I assume you know what the ADC reading would be for a 700 Hz PWM output (say e.g. 200), and for a 1.7 kHz PWM output (say e.g. 400), right?  Can you put like 2 lines of code in a test unit to light a debug LED depending on the ADC output?  If ADC < 300 (insert your correct value here), light LED.  That will let you know if it is really an ADC problem.  Or, if you don't have and cannot add an LED, have the test code do something very obvious with the PWM output that you can detect on the scope.
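
Something along these lines is what I have in mind - a minimal, untested sketch where ADC_THRESHOLD is a placeholder you'd set between the two readings, and F_CPU is assumed to be defined for util/delay.h:

#include <avr/io.h>
#include <util/delay.h>

#define ADC_THRESHOLD 150           /* placeholder: pick a value between the two cases */

static uint8_t adc_read(void)
{
    ADCSRA |= (1 << ADSC);          /* start a single conversion */
    while (ADCSRA & (1 << ADSC)) {} /* wait for it to finish */
    return ADCL;                    /* tiny10 ADC result is 8 bits, in ADCL */
}

static void debug_flag_low_adc(void)
{
    if (adc_read() < ADC_THRESHOLD) {
        TCCR0A = 0;                 /* disconnect the timer from the pin first */
        for (;;) {
            PINB = (1 << PINB0);    /* writing 1 to PINx toggles the pin */
            _delay_ms(500);         /* ~1 Hz toggle: unmistakable on a scope */
        }
    }
}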


The datasheet has a figure, SRON, which tells you how slowly the power is allowed to rise for correct operation.

 

For example, on a mega328 the slowest rate allowed is 0.01 V/ms which, for a 5 V rail, works out to a 500 ms maximum rise time on VCC.


kk6gm wrote:

I assume you know what the ADC reading would be for a 700 Hz PWM output (say e.g. 200), and for a 1.7 kHz PWM output (say e.g. 400), right?  Can you put like 2 lines of code in a test unit to light a debug LED depending on the ADC output?  If ADC < 300 (insert your correct value here), light LED.  That will let you know if it is really an ADC problem.  Or, if you don't have and cannot add an LED, have the test code do something very obvious with the PWM output that you can detect on the scope.

 

That's actually a pretty good idea!  Unfortunately, we have no I/O left because it's an ATtiny chip with only 6 usable pins . . . heh.  But I like your idea of determining the ADC value; I'll have to try that!

 

Brian Fairchild wrote:

The datasheet has a figure, SRON, which tells you how slowly the power is allowed to rise for correct operation.

 

For example, on a mega328 the slowest rate allowed is 0.01 V/ms which, for a 5 V rail, works out to a 500 ms maximum rise time on VCC.

 

Hmm.  That might be helpful to know, but I'm not sure it would solve anything.  I can't really change the hardware, and I suspect we're definitely powering up slower than we should (in fact, I'm sure we are; a 1+ second startup time is ridiculously slow!).

 

I just looked it up, though, and the ATtiny10 datasheet specs the same maximum rise rate.  Well, at least I know!  We'll have to be sure to put that in our specs going forward.  Unfortunately, I wasn't brought in for the specification of this part (in fact, it's meant to be a drop-in replacement for a previous part, so I don't have a lot of say in the matter!  Heh...)

 

Thanks again for the help!


I just had another thought, if the ADC uses a bandgap voltage for calibration and it's not high enough when I call the initialization commands, that could explain the ADC values being off, right?  Not sure when the ADC would do any sort of calibration though . . . maybe when it's enabled in the ADCSRA register?

 

Could it also have an effect on the PWM when selecting a desired clock (I'm not sure if it does any internal frequency calibration that might rely on reference voltages that aren't yet where they should be)?

 

Thanks again!


I'm assuming it's being done in some sort of ADC initialization

No, none in this ADC...it's a dirt-simple, low-end converter--there is no autocalibration (other than whatever you might choose to implement).

 

 

What do you intend to output on your so-called "PWM" ...a variable duty cycle (99.5% most common) or changing frequency (0.5%, rare) ?  You never mention anywhere what you expect.


Have you tried delaying 5 seconds on startup in software, before any initializing and running the regular code?


avrcandies wrote:

I'm assuming it's being done in some sort of ADC initialization

No, none in this ADC...it's a dirt-simple, low-end converter--there is no autocalibration (other than whatever you might choose to implement).

 

 

What do you intend to output on your so-called "PWM" ...a variable duty cycle (99.5% most common) or changing frequency (0.5%, rare) ?  You never mention anywhere what you expect.

 

Apologies, I jumped right over that question, I think.  I actually do need a changing frequency for this part.  I'm keeping the duty cycle at 50% (with some buffer band; I forget the actual number, maybe 40-60%?).
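
For reference, the shape of what I'm generating is roughly this - a simplified sketch rather than my production code, with the 8 MHz clock and /8 prescaler as illustrative assumptions.  Timer0 in CTC mode, toggling OC0A on compare match, gives a 50% duty square wave whose frequency is set by OCR0A:

#include <avr/io.h>

static void set_freq_hz(uint16_t f)
{
    /* f_out = F_CPU / (2 * N * (1 + OCR0A))  =>  OCR0A = F_CPU / (2 * N * f) - 1 */
    OCR0A = (uint16_t)(8000000UL / (2UL * 8UL * f) - 1UL);
}

static void freq_out_init(void)
{
    DDRB  |= (1 << DDB0);                /* OC0A pin (PB0) as output */
    TCCR0A = (1 << COM0A0);              /* toggle OC0A on compare match */
    TCCR0B = (1 << WGM02) | (1 << CS01); /* CTC (TOP = OCR0A), clk/8 */
    set_freq_hz(1700);                   /* e.g. the expected 1.7 kHz */
}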

 

Any idea what could be affected by a slow startup if not something initializing incorrectly?  Otherwise, I'd assume that with enough time, we'd get the correct frequency output.  

 

kk6gm wrote:

Have you tried delaying 5 seconds on startup in software, before any initializing and running the regular code?

 

I actually did add a delay and it did resolve the issue (see the third set of scope pics in my initial post).  It appears that everything came up correctly when the delay was placed prior to the init statements, which is why I assume something initializing before Vsupply is ready is the cause of the issue.  I assume it's some sort of hardware calibration but I haven't determined the root cause just yet.  Still poking at it!

 

The weird thing is that adding the register check caused the code to start up later, but it still started up wrong, whereas the simple delay worked.  Can't seem to understand what happened there.  Really odd to me . . .

Last Edited: Tue. Aug 28, 2018 - 05:40 PM

cradleinflames wrote:

kk6gm wrote:

Have you tried delaying 5 seconds on startup in software, before any initializing and running the regular code?

 

I actually did add a delay and it did resolve the issue (see the third set of scope pics in my initial post).  It appears that everything came up correctly when the delay was placed prior to the init statements, which is why I assume something initializing before Vsupply is ready is the cause of the issue.  I assume it's some sort of hardware calibration but I haven't determined the root cause just yet.  Still poking at it!

 

The weird thing is that adding the register check caused the code to start up later, but it still started up wrong, whereas the simple delay worked.  Can't seem to understand what happened there.  Really odd to me . . .

Yes, I see that (later startup with the voltage level check than with the simple delay) in the scope pics.  That is utterly crazy!

 

What happens if you turn off the ADC after say 5 seconds and then reinitialize it?
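
Roughly this - a minimal sketch assuming your existing ADCInit() sets ADEN and the prescaler:

#include <avr/io.h>

extern void ADCInit(void);

static void adc_restart(void)
{
    ADCSRA &= ~(1 << ADEN);   /* switch the ADC off completely */
    ADCInit();                /* re-run the normal init from scratch */
    ADCSRA |= (1 << ADSC);    /* kick off a fresh conversion */
}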


Heh, I actually ran probably the dumbest test I've run in a while to determine where the problem is . . . Below are my results:

 

/* Example works - 1.6 kHz output */
_delay_ms(1000);

HardwareInit();
TimerInit();
ADCInit();

/* Example doesn't work - 700 Hz output */
HardwareInit();
_delay_ms(1000);
TimerInit();
ADCInit();

So the issue seems to be something in hardware init.  I started moving the delay around and found this:

 

/* Example works - 1.6 kHz output */
void HardwareInit()
{
	CLKMSR = (0 << CLKMS1) | (0 << CLKMS0); // Set to calibrated internal 8 MHz oscillator
	CCP = 0xD8; // Write to protection register to access CLKPSR
	CLKPSR = (0 << CLKPS3) | (0 << CLKPS2) | (0 << CLKPS1) | (0 << CLKPS0); // Set clock prescaler to 1 (no division)

	_delay_ms(1000);

	DDRB = 0b00000001; // Set PORTB pin 0 as output, rest as input
	PORTB = 0b11111110; // Activate pull-ups on inputs
}

/* Example doesn't work - 700 Hz output */
void HardwareInit()
{
	CLKMSR = (0 << CLKMS1) | (0 << CLKMS0); // Set to calibrated internal 8 MHz oscillator
	CCP = 0xD8; // Write to protection register to access CLKPSR
	CLKPSR = (0 << CLKPS3) | (0 << CLKPS2) | (0 << CLKPS1) | (0 << CLKPS0); // Set clock prescaler to 1 (no division)

	DDRB = 0b00000001; // Set PORTB pin 0 as output, rest as input

	_delay_ms(1000);

	PORTB = 0b11111110; // Activate pull-ups on inputs
}

 

...If I'm honest, I have no idea why this is... it's really odd.  I guess I know where the problem lies, but I can't explain at all why it would cause an issue.  My money was on it breaking when the delay was placed after the clock setup, not after the port declarations.


What connects to PORTB pin 0? 

 

Jim

 


ki0bk wrote:

What connects to PORTB pin 0? 

 

Jim

 

 

The frequency output is on PB0 and the ADC input is on PB2.  Still trying to process what that means.  Maybe it's something obvious, but I didn't get much sleep last night, so I might be dumb today . . . heh.


The ATtiny10 datasheet says you have to use CCP = 0xD8; when changing CLKMSR too,

and recommends disabling interrupts beforehand.

 

--Mike

 


avr-mike wrote:
The ATtiny10 datasheet says you have to use CCP = 0xD8; when changing CLKMSR too, and recommends disabling interrupts beforehand.

I was going to point that out, but then realized he was setting it to the default value anyway, so no harm done, and it's early in the init, so interrupts have not been enabled yet.

 

 

Jim

 


and recommends disabling interrupts beforehand.

I don't see that anywhere.

 

What I do see:

[datasheet excerpt omitted]

"Experience is what enables you to recognise a mistake the second time you make it."

"Good judgement comes from experience.  Experience comes from bad judgement."

"Wisdom is always wont to arrive late, and to be a little approximate on first possession."

"When you hear hoofbeats, think horses, not unicorns."

"Fast.  Cheap.  Good.  Pick two."

"We see a lot of arses on handlebars around here." - [J Ekdahl]

 

  • 1
  • 2
  • 3
  • 4
  • 5
Total votes: 0

avr-mike wrote:

The ATtiny10 datasheet says you have to use CCP = 0xD8; when changing CLKMSR too,

and recommends disabling interrupts beforehand.

 

--Mike

 

 

So to be clear, you're recommending changing from:

 

void HardwareInit()
{
	CLKMSR = (0 << CLKMS1) | (0 << CLKMS0); // Set to calibrated internal 8 MHz oscillator
	CCP = 0xD8; // Write to protection register to access CLKPSR
	CLKPSR = (0 << CLKPS3) | (0 << CLKPS2) | (0 << CLKPS1) | (0 << CLKPS0); // Set clock prescaler to 1 (no division)

	DDRB = 0b00000001; // Set PORTB pin 0 as output, rest as input
	PORTB = 0b11111110; // Activate pull-ups on inputs
}

to

 

void HardwareInit()
{
	cli();      // Disable interrupts
	CCP = 0xD8; // Write to protection register to access CLKMSR
	CLKMSR = (0 << CLKMS1) | (0 << CLKMS0); // Set to calibrated internal 8 MHz oscillator
	CLKPSR = (0 << CLKPS3) | (0 << CLKPS2) | (0 << CLKPS1) | (0 << CLKPS0); // Set clock prescaler to 1 (no division)
	sei();      // Enable interrupts

	DDRB = 0b00000001; // Set PORTB pin 0 as output, rest as input
	PORTB = 0b11111110; // Activate pull-ups on inputs
}

 

I believe that's what you're recommending, correct?

Last Edited: Tue. Aug 28, 2018 - 09:17 PM

Actually, I just checked and I don't enable the interrupts until after the ADC Init:

 

	HardwareInit();
	TimerInit();
	ADCInit();

	sei(); 
	ADCSRA |= 1<<ADSC;		// Start Conversion

I can add a cli() beforehand if it seems necessary.  I can move the CCP = 0xD8 before the CLKMSR, though, since it sounds like that's recommended.

 


Looks like that broke things.  I need to do the 0xD8 write right before both statements, so I have the following:

 

void HardwareInit()
{
	cli();
	CCP = 0xD8; // Write to protection register to access CLKMSR
	CLKMSR = (0 << CLKMS1) | (0 << CLKMS0); // Set to calibrated internal 8 MHz oscillator
	CCP = 0xD8; // Write to protection register to access CLKPSR
	CLKPSR = (0 << CLKPS3) | (0 << CLKPS2) | (0 << CLKPS1) | (0 << CLKPS0); // Set clock prescaler to 1 (no division)

	DDRB = 0b00000001; // Set PORTB pin 0 as output, rest as input
	PORTB = 0b11111110; // Activate pull-ups on inputs
}

 

Does that look like what you are recommending?


cradleinflames wrote:

Heh, I actually ran probably the dumbest test I've run in a while to determine where the problem is . . . Below are my results:

 

/* Example works - 1.6 kHz output */
_delay_ms(1000);

HardwareInit();
TimerInit();
ADCInit();

/* Example doesn't work - 700 Hz output */
HardwareInit();
_delay_ms(1000);
TimerInit();
ADCInit();

So the issue seems to be something in hardware init.  I started moving the delay around and found this:

 

Not a dumb test, as yes, that makes sense.

 

With very slow rise, you are outside spec (given above), and basically can have areas where the SysCLK is too fast for the Vcc.

That's why a delay before HWinit is better than one later.  Rule is: do not try to init the part when Vcc is too low.

 

The other trick you can try for marginal systems, is to pull the init code inside some repeat loop, where practical.

i.e. instead of firing once and hoping it sticks, you repeat it every 10 ms or whatever.
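
A sketch of the shape I mean, assuming your three init routines are safe to re-run (lines that restart hardware, e.g. anything that zeroes a running timer, may need to be left out - the names are yours, the interval is illustrative):

#include <avr/io.h>

#define REINIT_TICKS 100        /* placeholder: tune to roughly 10 ms of loop passes */

extern void HardwareInit(void);
extern void TimerInit(void);
extern void ADCInit(void);

int main(void)
{
    uint8_t reinit_tick = 0;

    HardwareInit();
    TimerInit();
    ADCInit();

    for (;;) {
        /* ... ADC-to-frequency algorithm ... */

        if (++reinit_tick >= REINIT_TICKS) {
            reinit_tick = 0;
            HardwareInit();     /* re-assert the configuration registers */
            TimerInit();
            ADCInit();
        }
    }
}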

 

If you have slow rise, what about slow fall and dips ?

The other fishhook waiting for you in the Vcc domain is a brownout that fails to go low enough.  Can that ever occur here?

It's common for MCUs to spec a Vcc min before proper POR occurs, as well as the v/ms slew.

 


joeymorin wrote:

and recommends disabling interrupts beforehand.

I don't see that anywhere.

 

The last paragraph describing CLKPSR says this:

 

At start-up, CLKPS bits are reset to 0b0011 to select the clock division factor of 8. If the selected clock source has a frequency higher than the maximum allowed, the application software must make sure a sufficient division factor is used. To make sure the write procedure is not interrupted, interrupts must be disabled when changing prescaler settings.

 

Though since CCP disables them you probably don't need an explicit cli instruction.

 

--Mike

 


Who-me wrote:

Not a dumb test, as yes, that makes sense.

 

With very slow rise, you are outside spec (given above), and basically can have areas where the SysCLK is too fast for the Vcc.

That's why a delay before HWinit is better than one later.  Rule is: do not try to init the part when Vcc is too low.

 

The other trick you can try for marginal systems, is to pull the init code inside some repeat loop, where practical.

i.e. instead of firing once and hoping it sticks, you repeat it every 10 ms or whatever.

 

If you have slow rise, what about slow fall and dips ?

The other fishhook waiting for you in the Vcc domain is a brownout that fails to go low enough.  Can that ever occur here?

It's common for MCUs to spec a Vcc min before proper POR occurs, as well as the v/ms slew.

 

 

Heh, it had better not . . . if it's taking multiple seconds to get the power supply to a high enough level, I'm assuming it shouldn't be possible for them to have a power issue long enough to cause a brownout!

 

That said, it's worth investigating (just in case).  It's a little disappointing that the VLMCSR isn't able to catch this (I assume it was intended for a certain slew rate on the Vsupply, and our rise is too slow for it to be useful?).  I'm not super excited about solving this with a delay, but it appears I don't have much of an option.  All the same, it sounds like I have a solution that will potentially work.  I'll have a better idea when the power supply/fixture is in hand.  For now, I'm just setting the slew rate with a function generator and an op amp.  We shall see!

 

I did try placing my 3 init functions in my while loop, but that just stopped it from working altogether (it basically outputs what appears to be a garbage signal at 150 kHz).  Not sure why this would be, but at this point I'm having trouble being surprised by these things . . . heh.

 

Anyone else have any suggestions that I might be able to try or does it sound like this is my only path forward?


avr-mike wrote:
The last paragraph describing CLKPSR says this: At start-up, CLKPS bits are reset to 0b0011 to select the clock division factor of 8. If the selected clock source has a frequency higher than the maximum allowed the application software must make sure a sufficient division factor is used. To make sure the write procedure is not interrupted, interrupts must be disabled when changing prescaler settings. Though since CCP disables them you probably don't need an explicit cli instruction.
I'd suggest that's a datasheet copy/paste error i.e. text from the datasheet of an earlier device.

"Experience is what enables you to recognise a mistake the second time you make it."

"Good judgement comes from experience.  Experience comes from bad judgement."

"Wisdom is always wont to arrive late, and to be a little approximate on first possession."

"When you hear hoofbeats, think horses, not unicorns."

"Fast.  Cheap.  Good.  Pick two."

"We see a lot of arses on handlebars around here." - [J Ekdahl]

 

  • 1
  • 2
  • 3
  • 4
  • 5
Total votes: 0

cradleinflames wrote:

  It's a little disappointing that the VLMCSR isn't able to catch this...

 

The data above shows a reset value there of 000, which disables the VLM, so it's not doing anything at all until the MCU starts and enables it.

Sounds like you really need a better MCU, but if you are stuck with this one, maybe you can measure the Vcc ramp to make the delays a bit more adaptive?

 

cradleinflames wrote:

I did try placing my 3 init functions in my while loop, but that just stopped it from working altogether (it basically outputs what appears to be a garbage signal at 150 kHz).

You may not be able to repeat every single line, but most should be OK to repeat, as they merely replace what is already in the register.

 


Who-me wrote:

With very slow rise, you are outside spec (given above), and basically can have areas where the SysCLK is too fast for the Vcc.

That's why a delay before HWinit is better than one later.  Rule is: do not try to init the part when Vcc is too low.

I think you may have hit on something.  Vcc high enough that the chip can run, but not yet high enough for the faster clock.  Definitely something that needs to be accounted for in such a slow-rise situation.  And yet, it still doesn't account for the fact that everything works with a simple delay that lets the initialization proceed at a -lower- Vcc.


I think I would try this:

 

- Wait for the VLM to indicate minimum valid Vcc

- Do all hardware init at the startup (slow) clock speed (optional, can be done at the end of this sequence too)

- Use the ADC to monitor Vcc

- Only when Vcc is at a value legal for the higher clock speed, increase the clock speed to desired value

- Start your main code loop
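
In rough code, one hedged reading of that sequence (the VLM level code is taken from the earlier VLMCSR comment; the settle delay is an assumption, and the VLM stands in for the ADC Vcc check, since the tiny10's Vcc-referenced ADC can't see its own supply without an external divider):

#include <avr/io.h>
#define F_CPU 1000000UL             /* boot clock: internal 8 MHz / 8 */
#include <util/delay.h>

extern void HardwareInit(void);     /* raises the clock to 8 MHz */
extern void TimerInit(void);
extern void ADCInit(void);

static void vlm_wait_above(uint8_t level)
{
    VLMCSR = level;                 /* enable the VLM at the chosen trigger */
    _delay_ms(1);                   /* assumed settling time after enabling */
    while (VLMCSR & (1 << VLMF)) {} /* VLMF set means Vcc is below trigger */
    VLMCSR = 0;                     /* VLM off again */
}

int main(void)
{
    vlm_wait_above(0x04);           /* VLM3, ~3.2 V per the earlier comment */

    HardwareInit();                 /* only now raise the clock... */
    TimerInit();                    /* ...and bring up the peripherals */
    ADCInit();

    for (;;) {
        /* main code loop */
    }
}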


Why don't you simply hold the reset line low until a sufficient voltage has been reached?  There are a million reset monitor & release chips out there.  Or perhaps add a cap to your reset pin & possibly a diode for a fast pulldown discharge.

 

Also, in the present situation you must assume the startup has many resets along the way due to the slow Vcc climb, so ALL variables should be explicitly cleared by YOUR code.  Don't simply assume any particular setting of a port config or variable.


Unfortunately, the main answer to most of these questions is that the sensor is already designed, the layout is done, and the design was accepted.  We're now seeing limitations in applications that weren't tested initially.  I do agree that there are lots of things I would like to do.  I had the same idea of a voltage divider into an ADC pin to monitor the input, but there are no free ADC inputs and it would require a new layout.

 

Sadly, my only choice is to do what I can under the current conditions, with the current chip.  If it were my choice, this wouldn't have been the device I chose initially either, but it's the cheapest option available and it's a replacement part for another chip that is no longer being manufactured, and we still need to meet the price point.  The situation isn't great all around, and I don't have the options I wish I did.  At this point, I'm just trying to do what I can with a firmware change (which obviously is less than ideal).

 

I should have more information when the fixtures come in.  I'm assuming the delay is going to have to be the solution, though, unless there's some other form of Vsupply feedback like the VLMCSR register (which I assume probably won't work anyway with the out-of-spec startup time).  Would another option be to leave the default clock speed (I assume the default is 1 MHz, even though I'm trying to run it at 8 MHz)?  Not sure if the math would work correctly (the space is almost all used up due to the way I had to implement the algorithm; I was wrong earlier, it's an ATtiny5, which has even less space) and the update rate would likely be significantly slower, which could be an issue.

 

In theory, though, if the chip were run with a 1 MHz clock, is it likely the issue would no longer exist, since we wouldn't be trying to raise the clock frequency before VCC has risen high enough?  I doubt I'll end up going this route, as it would cause other issues, but it would be good to understand whether it could solve the slow-startup issue.

 

Thanks for all the replies!


Unfortunately, the main answer to most of these questions is that the sensor is already designed, the layout is done, and the design was accepted.

Ultimately that tends to become a non-reason; eventually someone says let's just fix it properly, and they wonder why they didn't start making real changes a month earlier (back when they were saying we need this fixed in the next 2 days, max!!).  Been there tooooo many times.

 

Anyhow

the space is almost all used up due to the way I had to implement the algorithm

Sounds suspicious, since you can easily read the ADC, filter the values, and set the PWM in perhaps 50 lines (100 bytes) of asm code, if even that.  What else are you trying to pack in there?

 

 

Why do you keep calling it a PWM output?  It appears you are creating a variable-frequency output rather than specifically modulating the pulse width.  Isn't your app a voltage-to-frequency converter?
Last Edited: Wed. Aug 29, 2018 - 08:55 PM

cradleinflames wrote:

If it were my choice, this wouldn't have been the device I chose initially either, but it's the cheapest option available and it's a replacement part for another chip that is no longer being manufactured, and we still need to meet the price point.

Hehe, you mean it has the illusion of being the cheapest... your time must be worth zero, or are they making 7 digits of these?

 

cradleinflames wrote:
Would another option be to leave the default clock speed (I assume the default is 1 MHz, even though I'm trying to run it at 8 MHz)?  Not sure if the math would work correctly (the space is almost all used up due to the way I had to implement the algorithm; I was wrong earlier, it's an ATtiny5, which has even less space) and the update rate would likely be significantly slower, which could be an issue.

The math is unaffected by MHz, merely the time it takes to complete.

The part specs 4 MHz at 1.8 V, so you could try middle-ground values between 1 MHz and 8 MHz?

 

It does not define a maximum clock below 1.8 V, but note the POR releases at typically 1.4 V, giving a significant 'no man's land', and your tests suggest 1 MHz is needed in that zone.

 

Digikey shows the tiny10 as the same price as the tiny5.

If you do need to jump to a better MCU there are others on Digikey, like EFM8BB1, STM8S003, N76E003 that come in 3x3 SMD packages.

 

cradleinflames wrote:
I'm assuming the delay is going to have to be the solution, though, unless there's some other form of Vsupply feedback like the VLMCSR register (which I assume probably won't work anyway with the out-of-spec startup time).
 

You could use both a delay and polling of VLMCSR, with a counter - e.g. check VLMCSR in a loop, incrementing a PassCtr on each pass and clearing it on a fail - which buys time and acts as a noise filter too.
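
In code, something like this sketch (PASSES_REQ and the 10 ms poll interval are placeholders to tune; F_CPU is assumed to be set correctly for util/delay.h at this point):

#include <avr/io.h>
#include <util/delay.h>

#define VLM_LEVEL  0x04    /* VLM3, per the earlier VLMCSR comment */
#define PASSES_REQ 20      /* ~200 ms of consecutively good readings */

static void wait_for_stable_vcc(void)
{
    uint8_t passes = 0;

    VLMCSR = VLM_LEVEL;             /* enable the VLM at the trigger level */
    while (passes < PASSES_REQ) {
        _delay_ms(10);
        if (VLMCSR & (1 << VLMF))
            passes = 0;             /* Vcc dipped below trigger: start over */
        else
            passes++;               /* one more good reading */
    }
    VLMCSR = 0;                     /* disable the VLM again */
}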

 

 


First, I noticed this little item in the data sheet (for the ADC):

 

  • Supply Voltage Range: 2.5 V – 5.5 V

 

So be warned.

 

Second, is it allowable that your uC doesn't produce any output for 5 seconds after the power begins ramping (say, 5 seconds after it reaches 1.8V)?  If so then it would seem easy enough to do a 5 second delay at the initial 1 MHz clock, then init all the hardware and bump up the clock speed to 8 MHz.  Just make sure you calculate the 5 second delay based on a 1 MHz clock, not an 8 MHz clock.
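
A sketch of that order of operations; the one gotcha is that util/delay.h computes cycle counts from the compile-time F_CPU while the part is still running at the 1 MHz boot clock (internal 8 MHz divided by 8), hence the division by 8 below:

#include <avr/io.h>
#define F_CPU 8000000UL             /* the speed after the prescaler change */
#include <util/delay.h>

extern void HardwareInit(void);     /* switches CLKPSR to /1 -> 8 MHz */
extern void TimerInit(void);
extern void ADCInit(void);

int main(void)
{
    /* Still at the 1 MHz boot clock, every delay runs 8x long,
     * so ask for 5000/8 ms to get a real 5 s. */
    _delay_ms(5000 / 8);

    HardwareInit();
    TimerInit();
    ADCInit();

    for (;;) {
        /* ADC-to-frequency algorithm */
    }
}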


avrcandies wrote:

Unfortunately, the main answer to most of these questions is that the sensor is already designed, the layout is done, and the design was accepted.

Ultimately that tends to become a non-reason; eventually someone says let's just fix it properly, and they wonder why they didn't start making real changes a month earlier (back when they were saying we need this fixed in the next 2 days, max!!).  Been there tooooo many times.

 

Heh, man, I wish that were an option.  I literally had this same discussion with the people who set this up initially.  Unfortunately, I've been told it's not possible (largely because we have a lot of units either in the field or in production that we're hoping to reprogram rather than scrap).

 

avrcandies wrote:

the space is almost all used up due to the way I had to implement the algorithm

Sounds suspicious, since you can easily read the ADC, filter the values, and set the PWM in perhaps 50 lines (100 bytes) of asm code, if even that.  What else are you trying to pack in there?

Well, initially a lot of it had to do with the conversion from an ADC reading/desired frequency to an OCR value.  Because I can't use floating-point math, I have to get pretty creative with the scaling, and it still takes up a lot of space.  Adding filtering for noise concerns doesn't help.  I only have 512 bytes of space, so I'm not exactly drowning in resources, unfortunately!  Heh . . . That said, I didn't write it in assembly, so I guess that could be another potential solution (though I HATE writing assembly, so I'm going to try to avoid that like the plague . . .).
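
For what it's worth, the flavor of scaling I mean is roughly this (illustrative constants, not my actual numbers; even the 32-bit divide pulls in a libgcc helper that eats a noticeable chunk of a 512-byte part, which is why the real code has to get creative):

#include <avr/io.h>

#define F_MIN  500UL     /* Hz at ADC = 0 (placeholder) */
#define F_SPAN 2000UL    /* Hz across the full ADC range (placeholder) */
#define N      8UL       /* timer prescaler */

static uint16_t adc_to_ocr(uint8_t adc)
{
    /* f = F_MIN + adc * F_SPAN / 255, rounded to nearest */
    uint32_t f = F_MIN + ((uint32_t)adc * F_SPAN + 127UL) / 255UL;

    /* toggle-on-compare CTC: f_out = F_CPU / (2 * N * (OCR0A + 1)) */
    return (uint16_t)(8000000UL / (2UL * N * f) - 1UL);
}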

 

avrcandies wrote:

Why do you keep calling it a PWM output?  It appears you are creating a variable-frequency output rather than specifically modulating the pulse width.  Isn't your app a voltage-to-frequency converter?

Honestly, bad habit on my part.  It's on a PWM-capable pin, so I keep referring to it as PWM rather than a frequency output.  Sorry for the confusing lingo!

 

Who-me wrote:

cradleinflames wrote:

If it were my choice, this wouldn't have been the device I chose initially either, but it's the cheapest option available and it's a replacement part for another chip that is no longer being manufactured, and we still need to meet the price point.

Hehe, you mean it has the illusion of being the cheapest... your time must be worth zero, or are they making 7 digits of these?

 

Heh, no, my time is free, remember?  It's already paid for in salary!  Isn't that how all companies think, or is it just mine?

 

 

Who-me wrote:

cradleinflames wrote:
Would another option be to leave the default clock speed (I assume the default is 1 MHz, even though I'm trying to run it at 8 MHz)?  Not sure if the math would work correctly (the space is almost all used up due to the way I had to implement the algorithm; I was wrong earlier, it's an ATtiny5, which has even less space) and the update rate would likely be significantly slower, which could be an issue.

The math is unaffected by MHz, merely the time it takes to complete.

The part specs 4 MHz at 1.8 V, so you could try middle-ground values between 1 MHz and 8 MHz?

 

It does not define a maximum clock below 1.8 V, but note the POR releases at typically 1.4 V, giving a significant 'no man's land', and your tests suggest 1 MHz is needed in that zone.

 

Digikey shows the tiny10 as the same price as the tiny5.

If you do need to jump to a better MCU there are others on Digikey, like EFM8BB1, STM8S003, N76E003 that come in 3x3 SMD packages.

 

I think the conversion from desired frequency to OCR value changes with the clock frequency, so I believe the resolution would change (though I'd have to check in which direction; that might be something I can still look into).  If the clock reduction means the power-up voltage threshold is lower, that might be helpful . . . or it could cause additional problems if other components power up at different voltages.  Unless it all comes down to the power-up voltage vs. the system clock?  If that's the case and the system clock is the source of the problem, it seems to make sense to focus on that as a solution rather than on the startup delay.  That said, it would reduce the update rate on the frequency output, so I'm not sure if that's acceptable . . .

 

As far as the difference in price between the ATtiny5 and the ATtiny10, I might have to look into it, but I'm guessing that in volume there is a cost difference.  We buy in bulk directly from Atmel (now Microchip) at a negotiated rate, so I'm assuming the price would change if we chose a larger-capacity chip (though they are drop-in replacements, so that's good).

 

Who-me wrote:

cradleinflames wrote:
I'm assuming the delay is going to have to be the solution, though, unless there's some other form of Vsupply feedback like the VLMCSR register (which I assume probably won't work anyway with the out-of-spec startup time).
 

You could use both a delay and polling of VLMCSR, with a counter - e.g. check VLMCSR in a loop, incrementing a PassCtr on each pass and clearing it on a fail - which buys time and acts as a noise filter too.

 

 

 

I actually like that idea of adding redundancy.  I really need to add some sort of voltage check in the main loop as well, just to play it safe . . . good suggestion, though!

 

kk6gm wrote:

First, I noticed this little item in the data sheet (for the ADC):

 

  • Supply Voltage Range: 2.5 V – 5.5 V

 

So be warned.

 

Second, is it allowable that your uC doesn't produce any output for 5 seconds after the power begins ramping (say, 5 seconds after it reaches 1.8V)?  If so then it would seem easy enough to do a 5 second delay at the initial 1 MHz clock, then init all the hardware and bump up the clock speed to 8 MHz.  Just make sure you calculate the 5 second delay based on a 1 MHz clock, not an 8 MHz clock.

 

Hmm, that could definitely be another potential issue I run into, even if I sort out the system-clock problem.  If the system clock comes up at maybe 1.8 V and the ADC needs 2.5 V, then even if I get it to start up correctly by staying at 1 MHz instead of moving to 8 MHz, I'm likely still going to have a problem initializing the ADC if we start before Vcc reaches 2.5 V . . .

 

Good find!

 

A lot of great information in here, guys!  Thanks a lot!  Looks like I have a few different things I can look into!