Wanted - BASIC interpreter for AVR (commercial usage)


I'm posting this on the off-chance that someone might know where I could obtain a BASIC interpreter for the AVR on which I could pay royalties based on sales.

I've developed a character TV display based on an AVR (nothing new, plenty of such things knocking around on the web), and I would like to look at putting a BASIC interpreter into it so that it could be used as a completely stand-alone unit for education and projects. I'm thinking of having a PS/2 or USB keyboard interface, so the user could enter their program directly into the unit without needing a PC / laptop or any other software or hardware. The program would be stored in EEPROM and could also have a backup EEPROM, but that's beside the point of this post really.

So, does anyone know of anywhere / anyone who might be interested in letting me incorporate their AVR BASIC software into a one-chip system? This is like a "Butterfly" type of module, but with an 8-colour TV display instead of an LCD. The display I have at present shows 50+ columns by 20 lines, although I can increase this. It's a lot easier to read programs and so on than on an LCD display.

Any comments / suggestions welcome.

Thanks,

Zadok.

PS Datasheet for the TV chip is here.
http://www.eximiaprojects.co.uk/data/PAL1000V3.pdf


Bitlash is close enough to your requirements to merit a look. It would fit well in your application, I think, unless pure BASIC is a requirement.

Bitlash runs on the AVR and interprets a tiny language with elements borrowed from BASIC, C, and Python. You write macros - sequences of commands - that Bitlash stores and interprets from EEPROM.

Have a look at http://bitlash.net for doc and downloads -- the Arduino / ATMega168 version is documented there, but the interpreter is easily ported to other AVRs, as you will see in the code.

I'm the author, and I'm happy to field questions here.

Best,

-br


This may be another option https://www.avrfreaks.net/index.p...

John Samperi

Ampertronics Pty. Ltd.

www.ampertronics.com.au

* Electronic Design * Custom Products * Contract Assembly


Re: Bitlash

That's really interesting. I'm not fixated on BASIC as the language; the consideration was that it would have to be an interpreted language so that the EEPROM would have room to store more than just a tiny amount of code, since un-tokenised source code would use up masses of room, and I thought the only likely candidate for this would be a rudimentary BASIC.

At this point I know very little about the Arduino, I will read up some more.

To start with I wanted to produce something for a contact of mine who works for a company which supplies to schools and colleges. As such, the product would be sold via commercial channels. In this case, if there turns out to be a market (I mean reasonable sales figures), I would expect to pay you a royalty of some sort for your software. Apart from the fact that it would be unfair to just use GPL software in a commercial product, I'm not sure what the rules in the GPL are regarding this.

I've got a few questions if you don't mind:

Is it possible to use your software WITHOUT the Arduino code?

What complications / limitations would this create?

Roughly how much ROM would the software use without it being tied to Arduino?

My existing TV code uses more or less all the registers, so I'd have to push and pull quite a bit on entry / exit to the TV interrupts, which run every 64 µs and consume a lot of the available CPU power. This could add, say, 33 x 2 cycles = 66 cycles at 50 ns = 3.3 µs at each end of the interrupt (where necessary - not all lines), making 6.6 µs in total. That leaves enough time for other things to go on, but with the TV code itself it might get a bit tight. Most of the available free CPU time would be in the frame sync and blank lines. At a very rough guess I would think that maybe 25% of CPU time might be left over, out of 20 MHz, meaning your code would run at the equivalent of around 4-5 MHz. Does that sound feasible? Of course the display could be blanked or semi-blanked, displaying something like "Please wait", while anything specifically time-demanding was going on.

It may be that trying to fit all this in just won't work, but it would be a natty little thing if we could get it going.

Not sure what part of the world you are in, but my design at present uses a SCART connector (used in Europe and Australia) to feed to the TV, not that this matters from a software point of view of course. In addition to that all the user would need is a PS2 mini-DIN socket for the keyboard input, and it would make a tiny little development system with a huge screen area by comparison to most LCD screens used for this kind of thing.

Just read a post about another similar sounding project (by joergwolfram), so I will find out more about that as well.

As I say, at this moment I know very little about Arduino. I see your "getting started" page says something about "the standard Arduino digitalWrite call" so not sure what level of integration there is between your software and the Arduino.

Enough rabbiting from me, I will start reading some more about the Arduino.

Thanks for your interest,

Zadok.


Sounds like you are re-creating the Sinclair ZX80 (see the ZX80 Wikipedia page). I think a couple of the forum members have extensive knowledge / history with this device.

I don't know if the source code was ever placed in the public domain. (Z80 not AVR based).

JC


It's easy to use Bitlash without the Arduino code. You'll find a makefile in the distribution set up to build and flash with avrdude using a usbtiny ISP. There is a small C module that provides replacements for the relevant portions of the arduino stuff. Depending on which AVR you are using, this will require some tweaking, probably to call routines you already have.

(Which AVR are you using, by the way? Sorry if I missed it in your .pdf...)

In addition to the '168, I have Bitlash running on the AT90USB162 (with LUFA), and on the Tiny85 (with V-USB). There's also working code for the ATMega328 and '644.

As for size: the full-up interpreter for the '168 weighs in at 14000+ bytes of flash - not much room to spare, though of course the situation is happier on the '328. That version has 32-bit integer arithmetic and the full function suite documented on the web site.

If that's not enough free space for the rest of your application (or if you are on a smaller AVR like the Tiny85) you can throw some switches and you end up with a 16-bit calculation engine and 16-bit variables; this saves a bunch of size in the runtime.

You can get the size down quite a bit further if you don't mind adjusting (tossing) features. In extremis, the Tiny85 version coexists happily with the V-USB stack in 8k. So integration is possible, even with something as demanding as a USB stack.

Regarding the licensing: when I publish stuff under an open source license, the intention is to get as many people using and improving the code as possible. The license I chose for Bitlash lets you use it for free, but in turn it requires you, when you distribute code containing Bitlash, to publish any modifications you make to the core Bitlash code under the same licensing terms, so that others can make use of your investment in improvements.

Not all projects and businesses can live with this requirement to share their intellectual property in this way, so of course alternate terms can be arranged. But there is a path to use and ship Bitlash in your work without charge.

Hope that helps, happy to take any followup questions.

-br


> Sounds like you are re-creating the Sinclair ZX80

Interesting similarities, particularly the tokenized BASIC and the CPU doing the display. The peripherals on the AVR make it a lot more useful for experiments and interfacing, e.g. the ADC inputs can be read in and displayed as a graph on the screen (it's possible to do this to 10-bit accuracy). As for saving programs to cassette and reading them back again - yuck! (shudder) EEPROM would have been a big boon in the days of the ZX80/81.

The display is a lot clearer than the ZX80/81's, and in colour, more like the Spectrum which followed.

This is similar to a "butterfly" type BASIC stamp or whatever, but with a TV display instead to allow easier programming and display of results and graphs.


Cool project.

Brad

I Like to Build Stuff : http://www.AtomicZombie.com


Zadok,

I don't know how committed you are to your current uC, but as you calculate overhead and CPU cycles for the user program, recall that the XMega has a 32 MHz clock, the ability to use external memory (which I haven't yet done), and some DACs, as well as the usual peripheral modules.

JC


Have a look at ZBasic: EEPROM-resident compiled pseudo-code, interpreted by a virtual machine on the AVR. The BASIC is very much like Visual Basic 6, plus a cooperative multitasking scheduler.

Also a compiled native mode (not VM) version. Excellent quality.


Re: Bitlash

> There's also working code for the ATMega328

I think this would be a sensible choice. Thanks for the info on code size etc. In your opinion, would Bitlash run OK with only around 4-5 MHz of CPU speed left over?

I'm still reading up on your software and how to use it.

Z.


code AVR in primitives the of 40 about code to need only you - Forth implement always could You.

;-)


> code AVR in primitives the of 40 about code to need only you
> Forth implement always could You.

I did have a laugh at that. Some people *hate* Forth, although I know it has its devotees. I might think about producing a variant with Forth in it, but only if I can get a version which is more suitable for students to start with.


I was simply thinking of the "Jupiter Ace". http://en.wikipedia.org/wiki/Jup...

Richard Altwasser, who worked for Sinclair and designed the circuits for the ZX80, ZX81 and Spectrum, left the company to set up Jupiter Cantab, making a ZX81-alike that ran Forth rather than BASIC (they were electronic engineers, so I suppose you have to forgive them!). Having done Forth on both the PDP-11 and 6809 myself, I know that it's actually very easy to get working. The source is "open", and for a particular architecture you just have to provide the core and the most basic primitives; the rest of the language is built upon those primitives, themselves written in Forth.

Of course it's completely unusable by the end user which is why the Jupiter Ace was a flop and Richard Altwasser later joined our company as engineering director.

Cliff


Clawson right is. Usual as.

Re: Bitlash in 4-5 MHz of processor bandwidth: Bitlash will run just fine in whatever processor time is available outside your interrupt chain. Whether or not this is "fast enough" rather depends on the application. (The references to the ZX80 in this thread are apt.)

Here is a benchmark: I've got Bitlash running happily here on a 1 MHz Tiny85. (One might call this a severely constrained environment for an interpreter.) This loop generates a square wave on PB5 with a period of around 7 ms:

> while 1: d5=!d5

With a little under 1 MHz it takes 3.5 ms to parse and execute "d5=!d5". With 4-5 MIPS on a '328 you'd be running correspondingly faster, perhaps in the range of 1000 statements per second.

-br


UBASIC is pretty easy to use and extend:
http://www.sics.se/~adam/ubasic/


http://www.zbasic.net/
http://www.zbasic.net/forum/

Quote:

Have a look at zbasic. EEprom-resident compiled pseudo-code, interpreted by a virtual machine on the AVR. Basic is very much like Visual Basic 6. Plus a cooperative multitasking scheduler.

Also a compiled native mode (not VM) version. Excellent quality.


kris10an wrote:
UBASIC is pretty easy to use and extend:
http://www.sics.se/~adam/ubasic/

Quote:
The (non-interactive) uBASIC interpreter supports only the most basic BASIC functionality: if/then/else, for/next, let, goto, gosub, print, and mathematical expressions. There is only support for integer variables and the variables can only have single character names.
ouch


For the speed: even at a 4 MHz clock, the AVR is about 5 times faster than the 3.5(?) MHz Z80 in the ZX80 or the 1 MHz 6502 in the Apple II or Commodore PET. The slow thing is getting data from the EEPROM, so for speed reasons the code may need to go into SRAM.


Compiling Bitlash so it will run on an ATMega328 without Arduino

Bill, I think you would prefer me to discuss Bitlash on this forum so that others can also read up on it. If I've got this right then I will post on here, if you would prefer to use direct e-mail please just let me know.

To explain where I am starting from, I am a die-hard old-time assembler programmer. I like using the assembler; nothing against C, but I get my kicks from actual registers. That's so sad, isn't it?

Anyway, from my limited knowledge of using C on microcontrollers, I assume I would use the avr-gcc toolchain to create a hex file from your source code for download to the ATM328, and I can do this with AVRDude and my USBtiny programmer.

I may be able to dig the answers out by reading more of the documentation and source code, but if you have a minute could you enlighten me on a few points I'm puzzled by at the moment:

I see in the section on Macros, that the macro definition itself is stored in EEPROM as an ASCII string, which makes sense, but I assume that other functions are stored in EEPROM as byte or word tokens?

Is there a way to "save" or even run a Bitlash program in spare Flash ROM space? This would be a handy way to provide a chip with several sample programs in it which you could save and load into the EEPROM for editing and so on, and then put them back in Flash when completed. I may have totally misunderstood how your software works, this kind of question may reveal my ignorance and your reply may help me understand it better.

Interrupt vectors. Whereabouts in your source code do particular interrupts point to? Or does Bitlash not itself handle any interrupts? I did a search through all the source and the interrupts I found related to serial and USB handling, so maybe chars from here just feed into your command line part.

Presumably after reset the Bitlash software jumps to a start location, or is Bitlash called from other code within the IC?

I presume I could put my assembler code into the avr-gcc assembler so that I can link it and it will then be downloaded as one big hex file with the Bitlash code, but it might be better for me to have a separate Flash memory location for both different blocks of code. Probably the Bitlash code will stay more or less constant and I will be erasing and re-programming my TV code and interface software until I get it right. Any comments on that?

As you can see I am still at the rudimentary stage, where running two bits of software on one IC, and getting them to communicate, is a new concept. Maybe open discussion of this will help others to use Bitlash as well though.

I think what I need to do to start with, is get Bitlash up and running with just a basic serial port communication on an ATM328. From this I can start playing around and work out how to run my TV code fired off interrupts, while Bitlash runs as the main program loop.

Do you have any ready compiled hex I can just upload to a blank ATM328 via AVRDude, to let me have a play with Bitlash via the serial port?

Many thanks,

Zadok.

PS Sorry if there are too many questions - just trying to get my head round it...


Hello again, Zadok:

Thanks for your note. You ask a lot of good questions. Let's continue the conversation here, as you say, so that others can benefit from the trail.

"Die-hard assembler guy" is a valuable skill, and it would be a better world if we could all just be proud of our geekdom without fearing repercussions from the muggles. Anyway I admire your focus.

Regarding your toolchain question: We'll need advice from the 'freaks on how best to integrate your assembler blob with my C blob. Freaks, your advice would be welcomed.

I think the first thing we need to figure out is where the touch points are.

It may help if I start by explaining that Bitlash gets CPU time only when you poll it by calling runBitlash() from your main loop. This example won't compile but it gives you the idea:

#include "bitlash.h"
int main() {
	initBitlash();
	while (1) runBitlash();
}

So it is necessary to identify a place in your code where "idle happens" and calls to runBitlash can happen Pretty Frequently. That takes care of giving the interpreter CPU time.

Second, Bitlash will need a few things from you: you'll need to change the character-level serial I/O in bitlash-serial.c to use your screen and keyboard instead of the serial port. Not hard, except for the C-ASM linkage.

Then you'll probably also want to add some functions in bitlash-functions.c to control the text and background color and handle any graphics effects your hardware can do. This is not hard. (This is where you customize the language to your application by changing the function set.)

So we need to add some defined handful of call points between the two blobs. As for the mechanics of getting this done, perhaps you know, or the forum can offer some guidance on the best path. It seems like a common thing to do, but I am not familiar with the 'how'.

Moving on to the questions on macros.

First of all, let me break the news. There are no tokens. There is no parse tree. Bitlash is an old-school recursive descent expression evaluator with a hand-built interpreter front end. A command line is parsed and executed in one go every time we see it.

So your program is macros all the way down. A Bitlash application is a collection of such macros, running in the background, triggered off the 'startup' macro at boot time, or from the command line.

So you 'only' get 512 bytes or 1k of program space (on your $2 computer!). But fear not, Bitlash is a very concentrated language and you can get a lot done in very little text.

This notion of "spare flash ROM space" is interesting; I never seem to have any of it around here. Anyway, it's possible to modify Bitlash to execute from flash rom, but it would be a little work. I call this feature "built-in macros". Merits further discussion.

Regarding interrupt vectors: Bitlash doesn't use any, directly, but of course it exposes interrupt-driven services through its functions. For example, spb() sends a byte to the serial output device, and millis() needs someone to increment a counter in the background at 1 kHz.

I think you are right on track with the idea of getting a '328 version up and working as a testbed. The easy path is to get a '328 arduino, but that isn't very Freak, is it? Alternatively, one might hack up a 328 with crystal and ICSP, use the Arduino 0016 IDE to burn the bootloader via USBTiny, and then upload Bitlash using that. Then onward to avr-gcc and the Makefile once you're doing the real integration with your TV code.

I hope that is helpful, and look forward to hearing your thoughts, as well as advice from the forum on integrating the two bodies of work.

Best,

-br


Just to say that as GCC is being used it will be possible to integrate C and asm, but only if you use GCC's own avr-as, not the Atmel Assembler format. It recognises the same mnemonics, but the directives are quite different. Even the "main" that dispatches the calls to the Bitlash interpreter can be done in asm:

.global main
main:
  call initBitlash
mloop:
  call runBitlash
  rjmp mloop

Cliff