The "Complexity Trap"


Greetings, folks -

 

For the last several years, I have been working on variants of a single product that uses an M328P. I find that my mental capability is just able to maintain a catalog of the basic features and peripherals, just enough to warn me when I need to look up details about something or other. By and large, things at the hardware level have been working fairly smoothly.

 

Now, I am seriously considering "upgrading" to a Mega3208 or Mega4808 (or maybe xx09). I grant you that there is a major learning curve involved. And, I grant you that many of the changes are in quantity (e.g. several USARTs). But, there are also some significant changes in detail (e.g. the clock system). It strikes me that, as far as my own personal mental capability is concerned, this class of device may be pretty close to the limit of what I can catalog in my mind. It certainly is at the edge of the territory where reliance on configuration wizards becomes more prevalent.

 

This leads me to wonder how people navigate even an ARM M0. There are so many details, from sleep modes to interrupt management to fault handling, that I really do wonder whether or not a wizard is sufficient. But, since I am not going there, I am not going to trouble myself over it all.

 

The real point of this little exploration is how mere mortals handle complexity. There must be a point (which probably varies a lot from individual to individual) where the human brain is really challenged to keep track of things. That seems to be where we start relying on machine expertise (e.g. wizards) and where we start relying on group intelligence (designers who cross-check each other and who have varying areas of expertise).

 

I also fret that this increasing complexity is a driver away from "lone-wolf programming". But, each time I contemplate this, as I am doing now, I need to remind myself that this really has been the case for a long time. It is absurd to think of a single individual, for example, doing the software for an engine control module in a modern passenger vehicle or a cellphone. But, we still have the class of less complex applications that has not gone away. And, that is the area where individuals CAN shine, and where an AVR Tiny can do more than enough. All I need to do is look at our microwave oven and realize - there is an important tool that gets along fine with pretty simple smarts. And there are many others in a similar class.

 

I guess that part of the skill of an engineer or programmer is understanding just where those limits are, then using the appropriate tools and/or organizational structures to get the job done. Now, if I could just learn how to use that tool!

 

Best to all...

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Fri. Nov 15, 2019 - 07:08 PM

ka7ehk wrote:
It strikes me that, as far as my own personal mental capability is concerned, this class of device may be pretty close to the limit of what I can catalog in my mind.

 

Same here. I played a bit with ARM chips, but found myself needing to print huge datasheets, plus the general documents from ARM, which the device datasheets often defer to.

I needed to print the documents because sometimes I wanted to cross-reference 2 or 3 documents. There are only so many monitors you can have.

 

With AVR, I just need to quickly open a pdf, go to some peripheral chapter, and find the info I need with relative ease.


For something more complex I'd even consider something with an operating system and GPIOs.

 

For example, I'm currently making a simple standalone dual ISP programmer for loading the 16u2 firmware and the bootloader+firmware into custom Arduino Mega boards. Just some scripts that run avrdude.

 

However, on ARM it helps to do one thing at a time; that makes it much easier to handle so much information.

Computers don't make errors - What they do they do on purpose.


For me, with XMEGA and the little bit of stuff I am doing with ARM, I have found Atmel START to be a very useful tool to get things going... but with the knowledge that START does not generate the most efficient code... and sometimes it did not generate the correct code either. (Search for my threads on that. I think it had to do with the Tiny817 or something, and the XMEGA.)

 

With ARM I have been figuring out the 'bare metal' approach, but it is a steep learning curve.

 

ka7ehk wrote:
Now, if I could just learn how to use that tool!

 

You mean AVRFreaks, and the ARM community attached to it!? :-)

EASY!  Type in a question....and give it time to percolate.

 

East Coast Jim

 

I would rather attempt something great and fail, than attempt nothing and succeed - Fortune Cookie

 

"The critical shortage here is not stuff, but time." - Johan Ekdahl

 

"Step N is required before you can do step N+1!" - ka7ehk

 

"If you want a career with a known path - become an undertaker. Dead people don't sue!" - Kartman

"Why is there a "Highway to Hell" and only a "Stairway to Heaven"? A prediction of the expected traffic load?"  - Lee "theusch"

 

Speak sweetly. It makes your words easier to digest when at a later date you have to eat them ;-)  - Source Unknown

Please Read: Code-of-Conduct

Atmel Studio6.2/AS7, DipTrace, Quartus, MPLAB, RSLogix user


As a part-timer in the field, and primarily a hobbyist, I am in a very different category than those who work in the field full time.

 

There are perhaps 1/2 a dozen AVRs that I typically work with, and they have different clocks, pre-scalers, T/C modules, etc.

 

The approach that works the best for me is to spend time on the high level design, sometimes in my head, but usually scratched out on paper, of the various sections of the project and how they will interact with each other.

I have a box for the different sections, and then start filling in which box gets Timer #1, which box gets USART #2, etc.

This ties in to HW pin assignments.

 

At the end of the day I might well have written several test programs to test various parts of what will eventually be tied together.

So any one test program is easier to keep within my head than juggling all of the program at once.

e.g. write my ISR-driven, ring-buffered USART input routine.

Then the same for the output.

Easily tested, easily inserted into the bigger project.
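
For what it's worth, a minimal sketch of that kind of ISR-driven, ring-buffered USART receive routine, written here for a classic mega328-style USART. The buffer size, baud value and function names are my own choices for the example, not JC's actual code:

#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

#define RX_BUF_SIZE 64                  /* must be a power of two */
#define RX_BUF_MASK (RX_BUF_SIZE - 1)

static volatile uint8_t rx_buf[RX_BUF_SIZE];
static volatile uint8_t rx_head;        /* written by the ISR    */
static volatile uint8_t rx_tail;        /* read by the main loop */

ISR(USART_RX_vect)
{
    uint8_t next = (rx_head + 1) & RX_BUF_MASK;
    uint8_t data = UDR0;                /* always read UDR0 to clear the interrupt   */
    if (next != rx_tail) {              /* if the buffer is full, the byte is dropped */
        rx_buf[rx_head] = data;
        rx_head = next;
    }
}

void uart_init(void)
{
    UBRR0  = 103;                       /* 9600 baud @ 16 MHz */
    UCSR0B = (1 << RXEN0) | (1 << TXEN0) | (1 << RXCIE0);
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00);   /* 8N1 */
    sei();
}

/* Returns -1 when no byte is waiting, otherwise the next received byte. */
int16_t uart_getc(void)
{
    if (rx_head == rx_tail)
        return -1;
    uint8_t data = rx_buf[rx_tail];
    rx_tail = (rx_tail + 1) & RX_BUF_MASK;
    return data;
}

The transmit side is the mirror image, driven from the data-register-empty interrupt, and each piece can be bench-tested on its own before it goes into the bigger project.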

 

When I'm actually writing code I have a three monitor setup.

So it is easy to have my Programming IDE open, one or two (Xmega) datasheets open, the schematic, my project notes, and the test program or other prior programs that used the same timer/counter in ISR CTC mode, or whatever.

 

You mentioned the new micro's clock above.

When I first started with the Xmega, it had a new clock system.

I spent an afternoon with my testbed, and the data sheets, and my scope, and wrote a bunch of routines to set the clock source and the PLL.

It took a little time and reading, but when that was my only focus, and I began to understand how it worked, it was then easy to write 1/2 a dozen routines for various combinations.

Now, moving forward, it's just a matter of copy & paste!
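
For reference, one of those clock routines can be surprisingly short. This is a generic sketch of switching an XMEGA to its 32 MHz internal oscillator, not JC's actual code; the PLL variants follow the same enable / wait-for-ready / protected-switch pattern:

#include <avr/io.h>

/* Switch an XMEGA from the 2 MHz default to the 32 MHz internal RC oscillator. */
void clock_init_32mhz(void)
{
    OSC.CTRL |= OSC_RC32MEN_bm;              /* enable the 32 MHz oscillator      */
    while (!(OSC.STATUS & OSC_RC32MRDY_bm))  /* wait until it reports ready       */
        ;
    CCP = CCP_IOREG_gc;                      /* unlock protected I/O for 4 cycles */
    CLK.CTRL = CLK_SCLKSEL_RC32M_gc;         /* select it as the system clock     */
}

A PLL version just adds setting OSC.PLLCTRL and waiting for OSC_PLLRDY_bm before making the switch.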

 

So, for me, I tackle the project in small steps, and with multiple monitors for multiple files to be open simultaneously.

I agree that although I can know everything about a sub-section of the code, I have a hard time keeping that level of information within my brain for the entire project, for the more involved ones.

I also have a Word file open for "Notes" on every project.

As I make design decisions, some of which come up after the original planning..., or figure out something that took some work, I write a few notes about it in the project's Notes document.

When I come back to a project months, or even years later, it is incredibly helpful to read my prior notes, and not have to re-live the learning curves.

 

JC

 

Edit: Typo

 

Last Edited: Fri. Nov 15, 2019 - 10:18 PM

Have you guys heard of a thing called 'Arduino'? It now supports a significant number of different processors. Whilst I'm not suggesting you use the Arduino IDE (as we know, it sucks), the core libraries give you most of what you need to get started. So much so that you really don't need to know too much about the actual silicon you're running on. It takes care of the clock configuration etc. so you can concentrate on what you need to do. With the likes of PlatformIO you can be up and running in a few clicks. Worried about 'efficiency'? Especially with the digitalRead/Write functions? On a recent board, I was measuring 150ns between port writes - note this was on a 400MHz 64bit dual core cpu.

 

If you don't like the code the core library gives you - write your own. It's all open source (at least in most cases).

 

Maybe I'm just being lazy, but is my time best spent reading the minutiae of the clock setup, or just going with the default the Arduino core gives me (which is usually what you want anyway) and getting on with the job? Similarly with usart, spi, adc etc. On a recent project I used an XMC1100, which is a little 32MHz Cortex M0 with 12 bit adc, 64k flash, 8k ram, 2 usarts and some timers. It has an internal clock generator which is good for 1% across the temperature range. And it is 5V. The peripherals are very flexible, which translates to lots of settings/registers and a swag of documentation to wade through.

 

I just want a standard uart with interrupt driven queues - at least to get going. What does the Arduino core give me? If I want RS485, then I'm going to have to do a bit of reading. Nevertheless, the core gives me the basics I can modify to my own needs.

 

The ADC has a number of features like auto scanning etc - of course the Arduino core doesn't support this - just the usual read the value of an analog pin. Which is usually what we want.

 

Then if we want to use more advanced peripherals like TFT lcds, sdcards, USB, cameras, wifi, and so on, do we really want to write the low level code for these? With the Arduino ecosystem these have already been done and tested extensively. Would I be able to do any better myself?

 

I had a job interview recently and I was being asked about my recent projects. When I mentioned the XMC1100, I was asked what tools I used. When I said 'Arduino' the two interviewers looked shocked. I was asked 'Why not use DAVE?' (DAVE is the code generator and Eclipse-based IDE for the Infineon product). My response was:

1. The board was supported by Arduino/ PlatformIO that runs natively on the Mac.

2. My requirements were pretty simple: some adc, pwm, serial and gpio. Nothing time critical or special

3. I'd have to load up Gigabytes of Dave under Windows (I use Mac) and then come up to speed with Dave.

4. The Arduino core for XMC uses Infineon's SDK, so not much difference code-wise. I can do the very same SDK calls if I so desire, or even talk directly to the hardware - which is what I do for the RAM parity checking.

 

There were no more questions along that line.

 


I think ... I do a lot of "complexity reduction by abstraction" - I figure out how to do "the usual" things that I need (which tends not to be that complicated, in the end), and then ignore the other complex options until a time comes up where I need something more.

It's one of the things I like about Arduino (though their subset is REALLY minimal), and really disliked about ASF (and most of the other vendors' similar tools): I don't want to know all about the DFLL, FDPLL, and other clock options scattered across PM, SYSCTRL, GCLK, and Clock System sections of the manual.  I just want to (maybe) stick a crystal on there and run everything at the max possible speed...

 

(Compare the ASF "config_clocks.h" file with "What I want", which is something like "clock_init(crystal=4e6, f_cpu=48e6);")

 


I would agree with Kartman and westfw.  

 

Arduino is simple and does most of what you want.

You can get a prototype up and running easily.    And even move it from one target to another.  e.g. AVR to ARM, ESP, ...

 

There might be individual sections that you want to alter e.g. peripheral with a specific hardware feature.

Study that particular chapter of the data sheet.   Not too daunting.

 

Or if you are really brave,  you could try to decipher the Start documentation or code.

Other Wizards are available from other manufacturers.   These probably make your head hurt less than Start.

 

MBED is more sophisticated than Arduino but restricts you to a (wide range of) ARM targets.

 

David.


The complexity trap is what drove people to invent Object Oriented programming languages.  I call it modular programming.  I'm not even sure what "Object Oriented" means.  All I know is that C++ with its classes (modules) is much better than C.  Stroustrup calls it divide and conquer.
 


There's only so much you have to do to get any chip to the point where it flashes an LED. All the other complex stuff like interrupt controllers, DMA engines or whatever can wait until you are happy and familiar with the basics. I guess the "trick" is being able to spot what's important and what's superfluous (things like Start can help with this)


Start is fine if you just want to run a ready-made example for a specific dev board.

 

If it is a different chip or a different board you get lost in the mire.

 

I had no problem with the legacy App Notes from Atmel.    I could port the code to a different AVR.   I could port the code to a different Compiler.

Yes,  if you can understand how to use Start it will be good experience for you.    It will enable you to use Start for your own projects.

Start makes my head hurt.   I suppose I should "try harder" but suspect that this will only make it hurt more.

 

Arduino limits you to (a fairly wide range of) supported boards.   It is likely that you own the hardware already.   There is proven code.   It is easier to follow.

 

David.

Last Edited: Sat. Nov 16, 2019 - 03:33 PM

david.prentice wrote:

Start is fine if you just want to run a ready-made example for a specific dev board.

 

If it is a different chip or a different board you get lost in the mire.

 

+1 Amen

Jim

I would rather attempt something great and fail, than attempt nothing and succeed - Fortune Cookie

 

"The critical shortage here is not stuff, but time." - Johan Ekdahl

 

"Step N is required before you can do step N+1!" - ka7ehk

 

"If you want a career with a known path - become an undertaker. Dead people don't sue!" - Kartman

"Why is there a "Highway to Hell" and only a "Stairway to Heaven"? A prediction of the expected traffic load?"  - Lee "theusch"

 

Speak sweetly. It makes your words easier to digest when at a later date you have to eat them ;-)  - Source Unknown

Please Read: Code-of-Conduct

Atmel Studio6.2/AS7, DipTrace, Quartus, MPLAB, RSLogix user

Last Edited: Sat. Nov 16, 2019 - 03:41 PM

 

Perfect timing

 

I just got a few of these nice boards

https://www.aliexpress.com/item/...

STM32F411CEU6 core board 128KB RAM 512KB ROM

 

Took me 10 min to get blinky running @48MHz, using the project I had for an STM32F030.

And another 30 min to get my desired 96MHz (had to make my own PLL table), and to learn about GCC (g++) not being forgiving when using named struct initializers.

 

Arm opencm3 blinky code

//
// STM32F411CEU6 core board 128KB RAM 512KB ROM - LED on PortC 13
//

#include <errno.h>

#include <libopencm3/stm32/flash.h>
#include <libopencm3/stm32/rcc.h>
#include <libopencm3/stm32/gpio.h>
#include <libopencm3/cm3/systick.h>


const struct rcc_clock_scale rcc_hse_25mhz_3v3_local[] = {
	{ /* 96MHz */
		.pllm = 25,
		.plln = 192,
		.pllp = 2,
		.pllq = 4,
		.pllr = 0,
// .flash_config MUST come after .pllr - g++ won't accept struct members out of order
// See : hello-stm32-F4blinky/libopencm3/include/libopencm3/stm32/f4/rcc.h
		.flash_config = FLASH_ACR_DCEN | FLASH_ACR_ICEN |
				FLASH_ACR_LATENCY_2WS,
		.hpre = RCC_CFGR_HPRE_DIV_NONE,
		.ppre1 = RCC_CFGR_PPRE_DIV_2,
		.ppre2 = RCC_CFGR_PPRE_DIV_NONE,
		.voltage_scale = PWR_SCALE1,
		.ahb_frequency  = 96000000,
		.apb1_frequency = 48000000,
		.apb2_frequency = 96000000,
	},
#if 0
	{ /* 120MHz */
		.pllm = 25,
		.plln = 240,
		.pllp = 2,
		.pllq = 5,
		.pllr = 0,
		.hpre = RCC_CFGR_HPRE_DIV_NONE,
		.ppre1 = RCC_CFGR_PPRE_DIV_4,
		.ppre2 = RCC_CFGR_PPRE_DIV_2,
		.voltage_scale = PWR_SCALE1,
		.flash_config = FLASH_ACR_DCEN | FLASH_ACR_ICEN |
				FLASH_ACR_LATENCY_3WS,
		.ahb_frequency  = 120000000,
		.apb1_frequency = 30000000,
		.apb2_frequency = 60000000,
	},
	{ /* 168MHz */
		.pllm = 25,
		.plln = 336,
		.pllp = 2,
		.pllq = 7,
		.pllr = 0,
		.hpre = RCC_CFGR_HPRE_DIV_NONE,
		.ppre1 = RCC_CFGR_PPRE_DIV_4,
		.ppre2 = RCC_CFGR_PPRE_DIV_2,
		.voltage_scale = PWR_SCALE1,
		.flash_config = FLASH_ACR_DCEN | FLASH_ACR_ICEN |
				FLASH_ACR_LATENCY_5WS,
		.ahb_frequency  = 168000000,
		.apb1_frequency = 42000000,
		.apb2_frequency = 84000000,
	},
#endif
};


extern "C" {
    void sys_tick_handler(void);
}

static void clock_setup() {
    // First, let's ensure that our clock is running off the high-speed external 25MHz Xtal + pll -> 96MHz & 48MHz

    // Use our local 25MHz Xtal to 96MHz PLL settings , opencm3 doesn't support 96MHz
    rcc_clock_setup_hse_3v3(&rcc_hse_25mhz_3v3_local[0]);  // See hello-stm32-F4blinky/libopencm3/lib/stm32/f4/rcc.c

    // Our LED is on GPIO bank C (pin 13), so we need that bank's peripheral
    // clock enabled in order to use it; banks A and B are enabled here too.
    rcc_periph_clock_enable(RCC_GPIOA);
    rcc_periph_clock_enable(RCC_GPIOB);
    rcc_periph_clock_enable(RCC_GPIOC);

    // In order to use our UART, we must enable the clock to it as well.
    rcc_periph_clock_enable(RCC_USART1);
}

static void systick_setup() {
    // Set the systick clock source to our main clock
    systick_set_clocksource(STK_CSR_CLKSOURCE_AHB);
    // Clear the Current Value Register so that we start at 0
    STK_CVR = 0;
    // In order to trigger an interrupt every millisecond, we can set the reload
    // value to be the speed of the processor / 1000 -1
    systick_set_reload(rcc_ahb_frequency / 1000 - 1);
    // Enable interrupts from the system tick clock
    systick_interrupt_enable();
    // Enable the system tick counter
    systick_counter_enable();
}

// Storage for our monotonic system clock.
// Note that it needs to be volatile since we're modifying it from an interrupt.
static volatile uint64_t _millis = 0;

uint64_t millis() {
    return _millis;
}

// This is our interrupt handler for the systick reload interrupt.
// The full list of interrupt services routines that can be implemented is
// listed in libopencm3/include/libopencm3/stm32/f4/nvic.h
void sys_tick_handler(void) {
    // Increment our monotonic clock
    _millis++;
}

/**
 * Delay for a real number of milliseconds
 */
void delay(uint64_t duration) {
    const uint64_t until = millis() + duration;
    while (millis() < until);
}

static void gpio_setup() {
    // Our test LED is connected to Port C pin 13, so let's set it as output
    gpio_mode_setup(GPIOC, GPIO_MODE_OUTPUT, GPIO_PUPD_NONE, GPIO13);
}

int main() {
    clock_setup();
    systick_setup();
    gpio_setup();

    // Toggle the LED on and off forever
    while (1) {
        gpio_set(GPIOC, GPIO13);
        delay(1000);
        gpio_clear(GPIOC, GPIO13);
        delay(1000);
    }

    return 0;
}

 

 

/Bingo



The AVR-0 series has lots of peripherals, interrupts, timers, a CPU and some interconnect capability, but IMO what is missing are chip-level architectures.   By that I mean a design paradigm that provides policies or constraints -- "constraints that deconstrain" -- on the way these components are used together.   I don't think that will be solved by run-time software (e.g., Arduino or FreeRTOS) but with good design tools.  Maybe MPLABx and Studio help with this.   For me, the first step is to read the datasheets and experiment.  I have the luxury of doing this as a hobby.

 

edit: architecure => architectures

Last Edited: Sat. Nov 16, 2019 - 04:23 PM

I'm prob going to do libopencm3 & bare-metal programming, as the ST libs are awful.

And some say it takes as long to learn the libs as the bare-metal bits.

 

I think someone made a "Beta" Arduino port for it, saw something on Ada

 

/Bingo


 For me, the first step is to read the datasheets and experiment. 

You are in the winner's circle... somebody has to know what's going on... Then, building on the code & libs & classes you create, everyone else can tag along and create new apps and act like they've got things under control... but they owe it all to you.

So when we use these things, coding has become a virtual team effort, even if some of the team is unknown and, perhaps, unappreciated. 

When in the dark remember-the future looks brighter than ever.   I look forward to being able to predict the future!


Ahhh, now we are getting to the nub of my rant.

 

For me, it's all about those chips that are between plain 8/16 bit MCUs and the ones that are big enough for a full OS. Yes, we have RTOSes up the wazoo. Yes, we have START. Yes, we have Arduino.

 

Arduino, I "understand" pretty well. START leaves me totally bewildered; I have no concept of what it is supposed to do other than fill up your MCU flash. RTOS seems to trade one complexity for another. 

 

But, I simply cannot  comprehend the path from BlinkyLED to The Next Great Gizmo. The problem, for me, is that, in order to design something, you need to understand. Understand the capabilities and the limits. Understand where it shines and where it is weak. You can't do that based just on trust.

 

It's nice to talk about "learn as you go" and "learn what you need to learn" but that involves TRUST that everything else is working and is doing what you have been told to expect. At this point, I don't have that trust. And the complexity seems incredibly daunting.

 

Now, a 4808/9 does seem to be just in range. Ultimately, I am pretty sure that I can get it to do what I want and need (trust level around 63.7%). It is on the near side of the complexity chasm for me. What I need to do, at this point, is quit worrying about that NEXT step, which I will probably never make and focus on the task in front of me. A bit like learning the ins and outs of a Lotus Super7 without worrying about that Ferrari that I will never even sit in. Though I may try to "sit in" an RPi. 

 

My thought process runs something like this:

 

A. I want to design A Great Gizmo

B. To make A Great Gizmo, I think I need to do A, and B, and C, and X, and Y, and Z

C. Those tasks all seem reasonable but... X requires a reasonably stable RealTimeClock. Can I do that with chip 1? Dunno.

D. And, to do B and C, I think that I need two SPI busses. Can I do that with chip 2? Who knows? Will that unused USART function adequately for SPI? Can't tell!

E. Y requires pretty low power consumption. How do I evaluate the bewildering list of peripheral power requirements when I don't exactly know which peripheral I need?

F. If I can't tell whether those requirements can be met, how do I evaluate the device for suitability?

 

So, I am supposed to invest gobs of time and energy to "learn as you go", only to find that the device I was trying to learn can't achieve the required power consumption with the peripherals that seem to be needed for the target project? That seems to me to be the real complexity crux.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Sat. Nov 16, 2019 - 05:54 PM

Bingo600 wrote:
saw something on Ada
Run-time wrt Microchip :

  • SAMV71 is most recent
  • FPGA (arm Cortex-M1, RISC-V)

 

GitHub - AdaCore/bb-runtimes: Source repository for the GNAT Bare Metal BSPs (branch : community-2019)

Ada_Drivers_Library/boards at master · AdaCore/Ada_Drivers_Library · GitHub (no Microchip boards)

 

edit : strikethru and

IGLOO | Microsemi

SmartFusion2 SoC FPGAs | Microsemi

Mi-V RISC-V Ecosystem | Microsemi

 

"Dare to be naïve." - Buckminster Fuller

Last Edited: Sat. Nov 16, 2019 - 06:11 PM

One issue with open source is that you may get what you pay for.  If you buy a $5000 LCD graphics development package from some software house, it should be thoroughly tested, stable, 99.99% bug free.  If you download some package from somewhere for  running an LCD, you may feel it simplifies getting the LCD running in record time.  But after a bit, you notice it acts strangely with certain colors, and certain icons seem to make it crash.   You have to make sure you grab packages that have been top-ranked (let others find the flaws).  Development becomes easier, yet "Trust, but Verify" is critical.   You don't want your robotic welder to develop an attitude.

When in the dark remember-the future looks brighter than ever.   I look forward to being able to predict the future!


ka7ehk wrote:
the path from BlinkyLED to The Next Great Gizmo. The problem, for me, is that, in order to design something, you need to understand.
so how is this really any different from when you first learned tiny/mega AVR8 (and I don't mean AVR0/1)? I'm guessing you learned GPIO, then UART, then ADC, then SPI, then I2C, etc?

 

So what makes it different for any other make or model of chip? If you do switch to AVR0/1 (which, let's face it are really Xmega) then it's just a relearning exercise a step at a time. In my 35+ years of professional life I'm lucky that for various projects I've had the opportunity to work with a huge variety of CPUs from a huge variety of sources and learning the nuances and foibles of one to the next is actually fun. But at the end of the day we all know what a UART or an SPI or an LCD or whatever does. With a new chip it's just a case of finding out how it does it specifically (where do you set the baud rate, where do you set 8N1, etc etc).

 

Sure the fancier chips throw in more bells and whistles and eye candy but most can largely be ignored until you start to think about a 16 byte UART FIFO or whatever then realise that in this silicon it's already a feature of the chip!


gchapman wrote:

Bingo600 wrote:
saw something on Ada
Run-time wrt Microchip :

 

Oops - should have written Adafruit in full ... Sorry

 

But I'll stay out of Jim's thread :-)

 

/Bingo


I have only been playing with AVR toys for 9 months, but the difference I note between the 328 and the 4809 is the interconnect: event system, CCL, pin-mapping etc.  It seems with the former the interconnect was through software registers, while with the latter the interconnect can be direct, and there is more flexibility with routing to the h/w pins.  So the question to me becomes: how does one get their hands around that?  For now, I just read the datasheets to try to get enough insight to be able to come up with good abstractions.


When I "got started" with AVRs, I already had some experience with 8051. The first AVR application was VERY simple, only used because of the relatively low interrupt latency (an AT90????) as a data decoder for stuff embedded on NTSC video. 

 

So, yes, I understood what a UART was, and I2C and SPI and the difference between RAM and ROM. But even comparing AVR-0 to 328P is pretty daunting. One of the big project goals is longer battery life than the present unit. I simply do not see how to figure out how it would be with a 4808, other than taking 6-9 months to build and code one (essentially a 95% complete design). If it does not work, what next?

 

And, that is just one unknown out of a half-dozen. For example, I had big problems with FatFS and uSD cards - what will that be like on a 4808? Will FatFS even work on a 4808? Have yet to see any reports one way or the other.

 

A big issue, here, is age. I am 78. No problems right now. But, I see a relatively few years remaining in which I remain somewhat competent to do original designs. Wasting a year just to "learn" something that turns out unworkable? Now, THAT scares me!

 

Jim 

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


Why would FatFs be any different on 4808? Both have SPI.

 

As for sleep modes and power consumption, presumably the datasheets give modes and expected performance? While it's a guess, I'd assume newer chips (perhaps fabbed on a smaller geometry?) are likely to outperform predecessors (the general trend in technology is that it gets better over time).


I reckon you are doing very well for 78.  I am much younger than you.  I am already losing the plot.

 

I agree with Cliff.   FatFS just needs SPI.   Replace the 328P primitives with 4809 primitives for spi_init() and spi()
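
To illustrate, a rough sketch of what those two primitives might look like on a 4809-class part; the default SPI0 pins (PA4/PA5/PA6/PA7) and the /16 prescaler are assumptions for a typical SD card hookup, not tested code:

#include <avr/io.h>

/* SPI0 master on its default pins: PA4 = MOSI, PA5 = MISO, PA6 = SCK, PA7 = SS/CS. */
void spi_init(void)
{
    PORTA.DIRSET = PIN4_bm | PIN6_bm | PIN7_bm;  /* MOSI, SCK and CS as outputs       */
    SPI0.CTRLB   = SPI_SSD_bm;                   /* disable the multi-master SS logic */
    SPI0.CTRLA   = SPI_MASTER_bm | SPI_PRESC_DIV16_gc | SPI_ENABLE_bm;
}

/* Exchange one byte - the primitive that FatFs's disk I/O layer ends up calling. */
uint8_t spi(uint8_t out)
{
    SPI0.DATA = out;
    while (!(SPI0.INTFLAGS & SPI_IF_bm))         /* wait for the transfer to complete */
        ;
    return SPI0.DATA;
}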

Power consumption just depends on going to sleep whenever you can.   Same as 328P.

 

The 4809 has been on the market for about 2 years.    FatFS and sleep have been done already.

 

Admittedly,  the Arduino 4809 boards are still going through teething stage.    The Core is stable.   Not all third party libraries support 4809 yet.

 

David.


I THOUGHT that I remembered reading some posts, a few months back, about problems with FatFS on AVR-0 chips. I cannot find those posts, now, so maybe my memory is incorrect. Would not be the first time.

 

I have the same expectation on power consumption. Spot checks generally confirm this. But, my expectations have been proven wrong, before. If I cannot add it up on a piece of paper (or spread sheet), it makes me really uneasy.

 

It's not like I feel that I am "losing it". It is that I happened to look back and see how long my first product took from first prototype (about 3yr). Then, I look at people around me, and those in their mid-80s tend to be having major problems health-wise, mobility-wise, and mentally. So, another 3yr on a new product is a huge chunk out of that possible "remaining useful time". THAT is what scares me. So, the prospect of spending 1/2 to 3/4 of a year on a dead-end "learning exercise" is one that I do not relish. Sure it would be fun, but there is a bit more than fun at stake.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Sat. Nov 16, 2019 - 09:25 PM

"Dare to be naïve." - Buckminster Fuller


ka7ehk wrote:
I have the same expectation on power consumption.

 

It's important to make sure the UPDI pin is always HIGH during normal operation, or else the UPDI unit might wake up and consume power. It does have an internal pull-up which my tests (on tiny 1614) show to be about 40kohm, so it's not very strong.


That is useful to know. It's probably well buried in the documentation, somewhere. Do you have a recommendation for a pull-up? Adding an external 39K would make a net 20K, more or less.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


ka7ehk wrote:
Wasting a year just to "learn" something that turns out unworkable?

Once you've chosen a suitable new micro-controller, do read the errata notes. Modern silicon is chock full of bugs and knowing what they are can prevent you designing something that's unworkable.

 

Oh - and I'm sure it won't take a year to learn the 4808/4809. Whilst it's conceivable that you could hold the entire peripheral set of the mega328 in your head, and to a reasonable level of detail, when you turn up the complexity to XMEGA or PIC24 or PIC32 or ARM you have to realise you cannot hold all that information in your head, unless of course you have superpowers.

 

As a PIC32 programmer now, I found I had to concentrate on just one peripheral at a time (although sometimes 2 or 3 peripherals have to work together). I wasn't ashamed about spending a day or two just reading the datasheet over & over and making notes on paper. I didn't read about peripherals I wasn't going to need.

 

In summary, I found the increase in complexity (in PIC24 & PIC32) wasn't a trap but was instead a boon, because I could do more with the better peripherals than I could ever imagine doing with megaAVR.

 


Thanks for that suggestion and encouragement. And the tip about the Errata Notes. 

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


I used one of the PIC32 MX or MZ high end parts... a few days in, I was being asked "you are still working on reading the signal on the adc?!!?!!"... it was much more "complicated" than the old megaxxx ADC, in that there seemed to be a billion options, 99.2% of which I had no interest in.  A major culprit (besides my unfamiliarity) was all of the errata... there were posts all over from developers just trying to get the ADC to work at all... finally success... Take it one struggle at a time.

 

Remember chip complexity is a different beast than application complexity. you can have a complex app & simple (8051) chip, or complex chip & simple app , etc.   What appears complex often seems much simpler once you've done it once.

When in the dark remember-the future looks brighter than ever.   I look forward to being able to predict the future!


I dropped cold onto Arm M0+ three years ago. It's not as frightening as you might think (and I would have to assume the same for other processors mentioned in this thread which I haven't played with) for a couple of reasons.

 

First, it's so much less hassle to use a higher level language. On the Arm, most C constructs come down to one or two instructions, so with the full optimisation there isn't much of an issue there. I would not even think of considering assembly for Arm without a very good reason; I need to inspect or single step it only on very rare occasions to see if the optimiser has done something unexpected.

 

Second, STM provide an excellent configuration tool (again, I'd assume other manufacturers provide similar but I haven't had a need to find out). The code it produces is admittedly often slow and sometimes clunky, but this is because it is full of special cases that allows the same interface to work for all the range of processors. The point is that the code works, and you can use that as a basis to see how it works and how you might like to implement something for yourself. In particular, the clock tree management is a doddle - a graphical interface so you can set exactly the clocks you need at the frequencies you need. Assigning i/o - direction, pullup, speed, interrupts, serial/spi etc is a doddle for the same reason.

 

Certainly the manuals are large - I tend to have two I use, the generic M0+ and the specific part manual, both running to a thousand pages or so - but that's not an issue. They cover a lot of ground and are generally well laid out (though there are of course exceptions) but once something is set up, that tends to be it. And then you're down to plain old C.

 

I agree with AVRCandies: the chip complexity is not the issue *once* you have passed that initial hurdle of getting blinky working - and that's what the automated frameworks are for. And system complexity can be significantly reduced if you don't have to really worry about getting the chip up; get the thing working and leave the optimisation until you know if you need it.

 

Neil


the "trick" is being able to spot what's important and what's superfluous (things like Start can help with this)

I dunno.  I find "libraries from other people" (ASF, Start, and Arduino included) often obscure ... "simplicities."  Whether you use a "simple" clock configuration or a complex one, you always get the 1K of clock initialization code, so you can't TELL whether there's a simpler way...  I mean, you WANT libraries to hide complexities, but they don't highlight the simplifying features of the new chips as well.

Sort of like the way the Arduino code for the 4809 doesn't use the VPORTs and doesn't have a pin_toggle function...
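
For what it's worth, the sort of thing being given up looks like this on a mega4809: OUTTGL gives a one-line hardware toggle, and because the VPORT aliases sit in the low I/O space the compiler can reach them with single-cycle sbi/cbi instructions. Pin choice here is arbitrary:

#include <avr/io.h>

int main(void)
{
    PORTA.DIRSET = PIN7_bm;             /* PA7 as output */

    for (;;) {
        PORTA.OUTTGL = PIN7_bm;         /* hardware toggle, no read-modify-write needed */
        VPORTA.OUT  |=  PIN7_bm;        /* VPORT access can compile to a single SBI...  */
        VPORTA.OUT  &= ~PIN7_bm;        /* ...and CBI                                   */
    }
}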

 


ka7ehk wrote:

Its not like I feel that I am "loosing it". It is that I happened to look back and see how long my first product took from first prototype (about 3yr). Then, I look at people around me and those in their mid-80s tend to be having major  problems, health, mobility-wise, and mentally. So, another 3yr on a new product is a huge chunk out of that possible "remaining useful time". THAT is what scares me. So, the prospect of spending 1/2 to 3/4 of a year on a dead-end "learning exercise" is one that I do not relish. Sure it would be fun, but there is a bit more than fun at stake.

I'm almost half your age, but there is one cliché that I've learnt has held true so far: you don't regret what you did, only those things you didn't do.

With regard to health I believe that to keep your health as much as possible you need to exercise. That goes for mentally as well as bodily. One way to exercise your head is to learn new things and figure out problems. If you stop using your brain it will deteriorate.

To me it sounds like you are reluctant to take on a new project because you fear you might not finish it. You're going to stop doing what you like because it MIGHT be a dead end? How will that get you anywhere but turning into a grumpy old man withering away in a chair, complaining about all the great ideas you never got to do? With that way of thinking it would not surprise me if your mind was not able to work on these products in three years' time. But if you continue to learn new things, figure out problems, exercise your brain, who knows how long you will keep it up?

Now, I understand that there might be other things that you would like to spend your time on, like family and friends. But then we're down to priority. What do you most want to do? Can your wants be combined? Or can you share your time between the wants?

And what if you're not able to finish your product? What then? Will it have been all for nothing? I don't know. What I do know is that you will have spent your time doing something you like to do. And maybe somebody will pick it up and finish it, hopefully praising your initial effort that made it possible for him/her to make this wonderful product that might make the world a better place for who-knows how many people. That will for sure not happen if you stop doing what you like to do.

 

Jim, from one freak to another, stop worrying. Prioritize your wants, and start doing what you like to do. :-)


Hi Jim.

 

When those youngsters approach a similar stage in life they may realise that it is only natural to consider the remaining time and capacity for their to-do-list. I am already there myself.

 

Perhaps outsourcing your specific unknowns, power consumption for example, might lessen the list of project risks. Any bright "things" at your Uni that might be able/willing to invest some of their more agile neurons on the task? And that is not to suggest that your neurons are not up to the task. Just consider it a form of multitasking/multicore/time saving.

 

Or maybe someone here wants the challenge.

Ross McKenzie ValuSoft Melbourne Australia


valusoft wrote:
When those youngsters approach a similar stage in life they may realise that it is only natural to consider the remaining time and capacity for their to-do-list.

 

I think this should be pretty obvious even for a younger person, it just takes a little thinking to figure out why. If your average life expectancy is, say, 80 years, then when you are 30, a year means you spent 1/50 of your remaining life. But when you are 70, it represents 1/10. So a year for a 70 year old person is as valuable as 5 years for someone who is 30.


But if your time is running short, wouldn't you rather spend it on something fun?

For me the fun is making things work. Trying, failing, trying again, come up with a clever solution. But when it finally works, then I just find something else to work on.

 

One should always consider one's to-do-list once in a while, and maybe re-prioritize. But I see no point in worrying about the length of the list and the expected time left. For me it would be more worrying if I managed to tick off all the items on the list. Then what?

Most of us don't know how much time we have left. Yes, there is a life expectancy, but who says that you're average?


Some great comments here, all well thought and wise. 

 

Yes, much of this angst is brought on by now seeing the "end" closer at hand. It's less of an abstract thing when I see friends and companions showing aging problems. Or more. It's that real sense that the remaining time is finite, unknowable but finite. I want to make the most of it, and the thought of starting in on something that does not get "finished" just because I took an unproductive side trip grates on my sensibilities, a bit.

 

All of that aside, your  words are well put and help even out the perspective. I do appreciate those words!

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


I'd go for it. Learning new (and apparently very clever!) technology has got to be a really fun way to spend your time.

 

I'd just follow the usual learning pattern we always advocate here. So flash an LED with delays, then do it again with a timer. Then get UART sorted so you can actually talk to the thing and drive other tests. Then take the peripherals in the order you enjoy them and perhaps with eventual project use in mind like ADC, PWM, SPI, I2C etc.

 

The only thing that may make the LED flashing thing more complex than usual is going to be clocks. For traditional AVR we just set some fuses and then the clock is fixed at some rate (well, apart from things like CLKPR), but these new chips are much more "ARM like" in that they start up with some default clock which is then runtime switchable. That adds complexity, but you could just ignore it. I think they are 20MHz/6 by default, so 3.33MHz, so just go with that. I think UART is still "do-able" because unlike traditional AVR, which only gave a binary divider to set baud, these have fractional baud rate setting so you should have more control.
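
As an illustration of how little is needed at that default clock, a minimal mega4809-style UART setup might look like the sketch below. The 64x fractional BAUD formula is straight from the datasheet; the choice of USART0 on its default PA0 pin, 9600 baud and the F_CPU value are just assumptions for the example:

#include <avr/io.h>
#include <stdint.h>

#ifndef F_CPU
#define F_CPU 3333333UL   /* 20 MHz / 6, the prescaled default out of reset */
#endif

/* BAUD register value for normal-speed async mode: 64 * f_CLK_PER / (16 * baud), rounded. */
#define USART0_BAUD_VALUE(baud) \
    ((uint16_t)((64UL * F_CPU + 8UL * (baud)) / (16UL * (baud))))

void uart0_init(void)
{
    PORTA.DIRSET = PIN0_bm;                        /* PA0 = TXD on the default pin routing */
    USART0.BAUD  = USART0_BAUD_VALUE(9600);
    USART0.CTRLB = USART_TXEN_bm | USART_RXEN_bm;  /* CTRLC already resets to 8N1          */
}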

 

Alternatively, this is the kind of thing where Start actually proves to be useful. In a simple GUI interface on the website you dial in the clock speed you want and the baud rate, push a button, and out pops an AS7 project that you can later pull apart to see "how on earth did they do that?". But if nothing else it does all the timer sums to work out the magic numbers you need for fractional baud rate generation and so on.

 

I've not explored AVR0/1 myself because I'm a luddite but I do read posts about them and there's a lot of local knowledge here that should be able to help with anything  where you get stuck (and in time you'd become one of the local AVR0/1 experts too! ;-)


Just get yourself an inexpensive mega4809 curiosity nano board. Blink an led. Then move forward to whatever is next. Eventually the datasheet will become familiar, and you will be left with mostly just translating hardware code you currently have. Unlike the 32bit world, there is really nothing overwhelming about the avr0/1. Anyone currently using the original avr should have no trouble making the transition. Any knowledge gained will apply to the whole avr0/1 series, which has a nice range of parts. Maybe in the end it doesn't make sense to switch from what you have, but I don't think it would take a big investment of time to figure out one way or the other once you have an actual part to work with.

 

You also do not have to depend on any code generators like 'start' if you do not want to. There is nothing complicated going on in these mcu's. The clock is easy - you always get a valid clock at reset (one of two values set by fuse), and from there you just set a prescale value as wanted, if wanted - it's like two lines of code because of register protection, hardly complicated. The biggest change would probably be the pinmux, but that is not difficult either.
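
Literally a couple of lines, give or take. A sketch of the protected write needed to turn the default /6 prescaler off so CLK_PER runs at the full 20 MHz (or 16 MHz, per fuse); this assumes avr-libc's _PROTECTED_WRITE macro, otherwise the raw sequence is writing CCP_IOREG_gc to the CCP register followed immediately by the register write:

#include <avr/io.h>
#include <avr/xmega.h>    /* _PROTECTED_WRITE() */

void clock_full_speed(void)
{
    /* MCLKCTRLB is guarded by the Configuration Change Protection scheme,
       so the write has to follow the CCP unlock within four instructions. */
    _PROTECTED_WRITE(CLKCTRL.MCLKCTRLB, 0);   /* PEN = 0: main clock prescaler disabled */
}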

 

 

 


But the OP wants to get his hands around the power, and that is a chip/system-level thing.   Do any of the simulators provide status on expected current?  IIRC the 4809 nano board does provide a jumper for measuring current.


It seems a major concern is the system power requirements, and the ability to compare that to the current hardware version.

 

That begs the question: in the present version of the project (micro, sensors, SD card, etc.), what % of battery capacity is spent on the micro vs. all of the other stuff?

 

I would think that you have a good idea as to what modules / subsystems within the micro will be used for the updated version of the project.

(Ignoring any project / software improvements by using new capabilities, (DMA, Event System, Programmable logic, etc.).)

SPI for SD, I2C for accelerometers, PB switches, setup config LEDs, etc.

 

So a simple test program that exercises all of the expected modules, albeit doing nothing meaningful, gives you an upper bound on the expected current consumption by the micro.

Having a measured upper bound might make you feel more comfortable, (set your mind at ease), as you move forward with the project.

 

JC   

Last Edited: Sun. Nov 17, 2019 - 07:27 PM

Indeed, the current product appears to be dominated by the SD card. It is going to be hard to get that power consumption down and it will take some careful experimenting to do it. I do approach that with more than a bit of trepidation as I have, several times, thought I had FatFs and uSD card "under control", only to find that I had overlooked an important use case that blew everything out of the water, so to speak. That is more of a challenge associated with lone-wolf development as opposed to a team than age or technical expertise (in some vaguely related area) or such. 

 

On the positive side, I hope that I remember to heed those previous experiences and be more thorough, rather than just trusting an "acclaimed library". Plus, I now have more analytical tools available so there should be no excuse to skip the important testing. Except rush to market, of course. 

 

In the end, there is really no  difference with all the development work I have done almost all my life, except that I am now both boss and grunt, and have nobody else that I can blame :)

 

Cheers

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


 

Maybe you could store your readings in FRAM (very low current & no delays) and, when full, power up the SD card & bulk transfer.  Then empty the FRAM for another logging fill-up.  Looks like there is a truckload of FRAMs for about $2 @ 100 in perhaps the 16kbit or 32kbit range

https://www.cypress.com/file/41676/download
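
The idea in rough C, assuming hypothetical fram_write()/fram_read()/sd_power() helpers and FatFs on the card side (FA_OPEN_APPEND needs a reasonably recent FatFs); record size and buffer capacity are made-up numbers for illustration:

#include <stdint.h>
#include "ff.h"                          /* FatFs */

#define RECORD_SIZE   16                 /* bytes per logged sample          */
#define FRAM_CAPACITY 4096               /* bytes of FRAM used as the buffer */

/* Hypothetical board-support helpers for this sketch. */
extern void fram_write(uint16_t addr, const void *data, uint16_t len);
extern void fram_read(uint16_t addr, void *data, uint16_t len);
extern void sd_power(int on);

static uint16_t fram_used;

void log_sample(const uint8_t record[RECORD_SIZE])
{
    fram_write(fram_used, record, RECORD_SIZE);
    fram_used += RECORD_SIZE;

    if (fram_used + RECORD_SIZE > FRAM_CAPACITY) {   /* buffer full: flush to SD */
        FATFS fs;
        FIL file;
        UINT written;
        uint8_t chunk[RECORD_SIZE];

        sd_power(1);                                 /* card powered only while flushing */
        f_mount(&fs, "", 1);
        f_open(&file, "log.bin", FA_OPEN_APPEND | FA_WRITE);
        for (uint16_t addr = 0; addr < fram_used; addr += RECORD_SIZE) {
            fram_read(addr, chunk, RECORD_SIZE);
            f_write(&file, chunk, RECORD_SIZE, &written);
        }
        f_close(&file);
        f_mount(0, "", 0);                           /* unmount before power-off */
        sd_power(0);
        fram_used = 0;
    }
}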

 

 

Here is a bunch of seemingly good test info on lowering SD card consumption...might bear fruit...this guy did a lot of experiments with powering down the SD card while idle

https://thecavepearlproject.org/category/reducing-power-consumption/

  interesting:

     It’s all just a reminder that SD memory is actually more complex than the Arduino since the card itself may contain a 32 bit arm core.

 

Also:

A unique feature of the SanDisk SD Card is automatic entrance and exit from sleep mode. Upon completion of an operation, the card enters the sleep mode to conserve power if no further commands are received in less than five milliseconds (ms). The host does not have to take any action for this to occur.  The io lines must not float!

When in the dark remember-the future looks brighter than ever.   I look forward to being able to predict the future!


Thank you, avrcandies! I thought  I was the only one who had SD card power problems. Really!

 

That is a real treasure-trove. Thank you, thank you!

 

I've thought about adding a serial buffer, but that always drives me toward a memory co-processor to keep all that bus activity away from the host MCU. I'm really debating whether or not that would ultimately help. Have to read that blog carefully. There is a LOT of detail there.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Sun. Nov 17, 2019 - 09:26 PM

Why do engineers believe they need as much device flexibility as possible, to the point where there are no fewer than 1000 datasheet pages? Of course, sophisticated high-level languages, library collections, abstraction layers and different tool environments try to make this chip complexity manageable. But in my opinion, often one complexity is merely exchanged for a new, possibly much larger one. Would it not be much better to keep the chips easier to manage, and limit the possibilities to the things really needed? Perhaps even make them smarter, so that not so much has to be set by hand?

Last Edited: Wed. Nov 20, 2019 - 11:35 PM

 

Why do engineers believe they need as much device flexibility as possible, to the point where there are no fewer than 1000 datasheet pages?

Whether it's AVRs, hairdryers, or just about anything else, complexity seems to always creep upward (try the new voice activated dryer)... At some tipping point, someone will come in and say "avoid those big, complex AVR chips, use our new 8051xyz chip.  Get away from mainframe style programming & try our little chips".  Apple computers used to be simple to work with, understand, and program... then got complex... so along comes Arduino.   Arduinos will probably get more & more complex and eventually be replaced by some bare-metal device.

 

who's been in this boat?

 

https://www.chargify.com/blog/feature-creep/

 

 

When in the dark remember-the future looks brighter than ever.   I look forward to being able to predict the future!




"Experience is what enables you to recognise a mistake the second time you make it."

"Good judgement comes from experience.  Experience comes from bad judgement."

"Wisdom is always wont to arrive late, and to be a little approximate on first possession."

"When you hear hoofbeats, think horses, not unicorns."

"Fast.  Cheap.  Good.  Pick two."

"We see a lot of arses on handlebars around here." - [J Ekdahl]

 


But I admit I do get excited about the 8-pin ATtiny402 ($0.42) that has 4 kB flash, 256 B ram, and can power off and wake up when it sees its address on the I2C bus.

 
