Xmega Data Transfer to Desktop

#1

I'm wondering what quick ways there are to capture data on an Xmega and transfer that data in real-time to a desktop computer.

 

Say, about 500ksps of stuff; it can be anything.

 

I'm fresh to USB, but is this a job for Atmel's USB ASF or LUFA?

 

Thanks,

 

I'm using an Atxmega256a3bu.

#2

I don't use C, so I've not used ASF or Dean's LUFA.

 

For the megaAVRs, many threads suggest LUFA is easier to use than ASF.

 

I believe there has been a thread or two where Dean's LUFA was ported to the Xmega.

You may have to search a bit to find those.

 

If you are only building one or a few, and you have trouble getting the above two options to work, then obviously you could just throw an FTDI USB-to-serial (USART) bridge chip on the board and be done with it.

That is a very easy solution, but perhaps expensive for a large production project.

It is also technically unappealing to some to use a USB chip when the micro has USB capabilities built in!

 

ASF is sometimes reported to be challenging to use (not just for USB, but overall).

 

JC 

#3

 FTDI USB to Serial can certainly do 920k baud

#4

Don't overlook protocol overhead. When you are up against a speed limit like this, the protocol can kill you: packet framing characters, the escaping needed so data bytes don't collide with framing characters, and so on. This is especially a problem when the packets are small (say, a few data bytes); then the protocol characters can become half of the packet.
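To make that concrete, here is a minimal SLIP-style byte-stuffing sketch (framing constants from RFC 1055; not code from this thread). In the worst case every payload byte collides with a framing byte and the wire size more than doubles:

#include <stddef.h>
#include <stdint.h>

#define END     0xC0  /* frame delimiter   */
#define ESC     0xDB  /* escape introducer */
#define ESC_END 0xDC  /* escaped END       */
#define ESC_ESC 0xDD  /* escaped ESC       */

/* Encode src into dst (dst needs room for 2*len + 1 bytes).
   Returns the number of wire bytes actually produced. */
static size_t slip_encode(const uint8_t *src, size_t len, uint8_t *dst)
{
    size_t n = 0;
    for (size_t i = 0; i < len; i++) {
        if (src[i] == END)      { dst[n++] = ESC; dst[n++] = ESC_END; }
        else if (src[i] == ESC) { dst[n++] = ESC; dst[n++] = ESC_ESC; }
        else                    { dst[n++] = src[i]; }
    }
    dst[n++] = END;  /* terminate the frame */
    return n;        /* worst case: a 2-byte sample costs 5 wire bytes */
}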

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

#5

Tell us more about the 500ksps. How big are the samples? Are they 10+ bit ADC readings? If so, it's really 16 × 500k bits per second, i.e. 8Mbps, and that's without any protocol overhead (such as start/stop bits for UART). Full-speed USB has a theoretical 12Mbps signalling rate, but the real data bandwidth will be lower because of protocol overhead.
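As a back-of-envelope check (plain C, nothing AVR-specific), the same arithmetic with 8N1 UART framing added, where every byte costs 10 bits on the wire:

#include <stdio.h>

int main(void)
{
    const double sps  = 500e3;   /* samples per second        */
    const double bits = 16;      /* bits per sample (2 bytes) */
    double payload = sps * bits;             /* 8.0 Mbps of raw data            */
    double uart    = sps * (bits / 8) * 10;  /* 10.0 Mbaud with start/stop bits */
    printf("payload %.1f Mbps, UART wire rate %.1f Mbaud\n",
           payload / 1e6, uart / 1e6);
    return 0;
}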

#6

awneil wrote:

 FTDI USB to Serial can certainly do 920k baud

 

They can do up to 3M baud, but you have to use the D2XX drivers.

 

I just discovered, while double-checking the manual before posting, that it might be possible to go this fast while still treating the FTDI chip as a virtual COM port. The D2XX Programmer's Guide references the application note "Setting baud rates for the FT8U232AM" for details of calculating non-standard baud rates, and that document appears to describe a way of aliasing standard baud rates to non-standard ones. I take it to mean, for example, that in your terminal software (or whatever) you select 9600, which is aliased to 3M baud, and the driver takes care of the rest. I just discovered this and have never tried it - not yet!
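For anyone taking the D2XX route instead of the aliasing trick, the documented calls are few. A sketch (FT_Open and FT_SetBaudRate are the real API; everything else here is illustrative):

#include "ftd2xx.h"

int main(void)
{
    FT_HANDLE h;
    if (FT_Open(0, &h) != FT_OK)   /* first FTDI device found */
        return 1;
    /* D2XX takes non-standard rates directly - no VCP aliasing needed */
    FT_SetBaudRate(h, 3000000);
    char buf[] = "hello";
    DWORD written;
    FT_Write(h, buf, sizeof(buf) - 1, &written);
    FT_Close(h);
    return 0;
}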

 

Not to hijack or derail, I just found this potentially incredibly useful and thought I would share.

 

 

#7

If this is an Xmega with USB built in why would you employ any form of FTDI? Wouldn't it be like paying for the same silicon twice?

#8

SolarFreak wrote:
They can do up to 3M baud, but you have to use the D2XX drivers.
Exar's USB UARTs do up to about 9Mbps, apparently with standard OS drivers (Linux, macOS, Windows, Android):

http://www.avrfreaks.net/forum/ch340g-ic-vendor#comment-2115341

https://www.exar.com/products/interface/uarts/usb-uarts

 

"Dare to be naïve." - Buckminster Fuller

#9

USB's frame period is 1ms, whereas a UART start-of-frame is immediate, with near-immediate completion for USART or SPI; an AVR port write is immediate.

 

A USB XMEGA could have standard input, standard output, and maybe standard error on USB.

 

SPI and/or UART could go via DGI through an Atmel-ICE or Power Debugger (the Power Debugger also has a UART that appears as a Windows virtual COM port), or via a logic analyzer.

There's printf redirection to the EDBG USART in Atmel START.

That's good for somewhat high-speed tracing, though not as integrated as MPLAB X tracing (SPI or a port).

 

FTDI has byte-wide FIFO to USB bridges that could be attached to an AVR port or to XMEGA A1U EBI.

 


http://start.atmel.com/#examples/printf 


Using MPLAB® REAL ICE™ In-Circuit Emulator for MPLAB X IDE (Poster)

(lower right)

Optional Trace Connections

via http://www.microchip.com/mplab/mplab-x-ide

http://www.ftdichip.com/FTProducts.htm (search for FIFO)

 

Edit : USB and UART SOF

 

"Dare to be naïve." - Buckminster Fuller

#10

Under ideal conditions I can move about 8 megabits per second.

#11

Wow these are great responses.

 

The application is collecting 16-bit data (24-bit with tag bits), at a sample rate of hopefully 100ksps or more before protocol overhead, so an FTDI that can support several Mbit/s would be perfect.

 

Has anyone tried these USB UARTs? Is it as straightforward as sending UART data to the port once the other GPIOs are set up?
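Something like this is what I imagine on the XMEGA side (USARTC0, a 32 MHz peripheral clock, and 2 Mbaud chosen purely for illustration; BSEL = 0, BSCALE = 0 gives fPER/16 per the datasheet formula):

#include <avr/io.h>

static void uart_init(void)
{
    PORTC.DIRSET = PIN3_bm;                  /* TXD0 is an output     */
    USARTC0.BAUDCTRLA = 0;                   /* BSEL = 0              */
    USARTC0.BAUDCTRLB = 0;                   /* BSCALE = 0 -> 2 Mbaud */
    USARTC0.CTRLC = USART_CHSIZE_8BIT_gc;    /* 8N1                   */
    USARTC0.CTRLB = USART_TXEN_bm;           /* transmit only         */
}

static void uart_put(uint8_t b)
{
    while (!(USARTC0.STATUS & USART_DREIF_bm)) {}   /* wait for room */
    USARTC0.DATA = b;
}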

Quote:

If this is an Xmega with USB built in why would you employ any form of FTDI? Wouldn't it be like paying for the same silicon twice?

USB being new to me, I'm thinking that spending more on parts could be worth it to save time. I would like to try USB, though.

On that note, is it worthwhile to try and integrate Atmel's USB ASF?

 

Thanks very much

#12

Didn't we have this conversation some time ago? Wasn't there mention of USB serial having a problem with sustained high-speed data?

#13

daisy148 wrote:

I'm fresh to USB, but is this a job for Atmel's USB ASF or LUFA?

I'm using an Atxmega256a3bu.

 

#14

You can use my USB code.  It's written in C++ but it could be interfaced with C code.

#15

I'm definitely interested in giving it a try, thanks! Do you have a Github account where I can find it?

#16

No GitHub. 

 

I guess you want USB CDC.  I guess you are not familiar with C++.

 

All my code is integrated with my micro OS.  It uses C++ classes and inheritance.  You probably don't want that.  I should be able to come up with a few functions you can call to handle reading and writing.

 

I'll post it here.

 

One other thing: USB requires a fairly accurate 48 MHz clock. I normally calibrate against the 32kHz oscillator to get the accuracy that works for me. If you can't handle that, I could give you my clock-setting code. If need be, I could use the USB SOF to set the clock accurately.
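For reference, the core of that clock setup looks something like this (register and bit names per the XMEGA AU manual and the avr-libc headers; ASF additionally loads a factory USBRCOSC calibration byte from the production row, omitted here, so verify against your part):

#include <avr/io.h>

static void clock_init_48mhz_usb(void)
{
    /* Enable the 32.768kHz and 32MHz RC oscillators */
    OSC.CTRL |= OSC_RC32KEN_bm | OSC_RC32MEN_bm;
    while (!(OSC.STATUS & OSC_RC32KRDY_bm)) {}
    while (!(OSC.STATUS & OSC_RC32MRDY_bm)) {}

    /* Retune the 32MHz RC to 48MHz: DFLL compare = 48MHz/1024Hz = 46875 */
    DFLLRC32M.COMP1 = 0x1B;                  /* 46875 = 0xB71B, low byte     */
    DFLLRC32M.COMP2 = 0xB7;                  /* high byte                    */
    OSC.DFLLCTRL = OSC_RC32MCREF_RC32K_gc;   /* or _USBSOF_gc to lock to SOF */
    DFLLRC32M.CTRL = DFLL_ENABLE_bm;

    /* Feed the USB module from the 48MHz RC32M */
    CLK.USBCTRL = CLK_USBSRC_RC32M_gc | CLK_USBSEN_bm;
}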

#17

Ok thank you.

I have some experience changing the clock to 48MHz, but that would be very helpful. Please let me know how it goes

#18

Using a modified (fixed) version of the nonolith minimal USB stack and a bulk endpoint, I was only able to get about 7.8Mb/sec: http://www.avrfreaks.net/forum/u...

 

Assuming you are using the internal 12-bit ADC, you need about 6Mb/sec to send 500 ksps. I doubt that the ASF is efficient enough, and LUFA has issues on XMEGA.

#19

@ mojo

 

Is the CPU/PER clock speed critical?  I do the easy thing when using 48 MHz for USB.  I just divide the system clock by 2 and get 24 MHz for the PER/CPU clock.
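That divide-by-2 is a one-liner on the XMEGA; CLK.PSCTRL is CCP-protected, so unlock it first (assumes interrupts are disabled at this point):

#include <avr/io.h>

static void clk_prescale_div2(void)
{
    CCP = CCP_IOREG_gc;                                 /* unlock protected I/O   */
    CLK.PSCTRL = CLK_PSADIV_2_gc | CLK_PSBCDIV_1_1_gc;  /* 48MHz -> 24MHz CPU/PER */
}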

 

 

#20

daisy148 wrote:

 Please let me know how it goes

I should have something in a day or so.

#21

I found something interesting about having C and C++ code in the same project. If I create the project as a C project, it won't compile .cpp files. I had to recreate the project as C++.

 

There may be a way to modify the C project, but I couldn't find it. Under Tools > Options > Toolchain, I found I could set (CPP language), but it didn't seem to do anything. It's no big deal to me, but for someone who wants to add some C++ classes to an existing C project it could be a pain in the behind.

 

#22

steve17 wrote:

I should have something in a day or so.

Thanks Steve, and it's good to know that there are some options for integrating C++ into a C project; I'll look into that.

mojo-chan wrote:

I doubt that the ASF is efficient enough, and LUFA has issues on XMEGA.

I'm aiming for about 650,000 bytes per second, including protocol overhead. I reached about 530,000 with ASF if the data visualizer modules are accurate. 

Not to derail this thread too much, but after importing the CDC ASF code, my timer functions and LCD functions don't work. Is this one of those board_init problems where the ASF saves power on peripherals?

#23

steve17 wrote:

@ mojo

 

Is the CPU/PER clock speed critical?  I do the easy thing when using 48 MHz for USB.  I just divide the system clock by 2 and get 24 MHz for the PER/CPU clock.

 

 

 

The faster the better, but if you are just using DMA to transfer ADC readings into a buffer and then USB (which is also DMA-driven) to send them out, 24MHz is probably fine.
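A sketch of that ADC-to-buffer DMA leg (names per the XMEGA AU headers; the channel, trigger, and buffer size are illustrative, and double-buffering is left out):

#include <avr/io.h>
#include <stdint.h>

#define NSAMPLES 256
static volatile uint16_t adc_buf[NSAMPLES];

static void dma_adc_init(void)
{
    DMA.CTRL = DMA_ENABLE_bm;

    /* Source: ADCA channel 0 result, 2-byte burst, reload after each burst */
    DMA.CH0.SRCADDR0  = (uint8_t)((uint16_t)&ADCA.CH0.RES >> 0);
    DMA.CH0.SRCADDR1  = (uint8_t)((uint16_t)&ADCA.CH0.RES >> 8);
    DMA.CH0.SRCADDR2  = 0;

    /* Destination: RAM buffer, reload after each full transaction */
    DMA.CH0.DESTADDR0 = (uint8_t)((uint16_t)adc_buf >> 0);
    DMA.CH0.DESTADDR1 = (uint8_t)((uint16_t)adc_buf >> 8);
    DMA.CH0.DESTADDR2 = 0;

    DMA.CH0.ADDRCTRL = DMA_CH_SRCRELOAD_BURST_gc | DMA_CH_SRCDIR_INC_gc |
                       DMA_CH_DESTRELOAD_TRANSACTION_gc | DMA_CH_DESTDIR_INC_gc;
    DMA.CH0.TRIGSRC  = DMA_CH_TRIGSRC_ADCA_CH0_gc;   /* fire per conversion   */
    DMA.CH0.TRFCNT   = sizeof(adc_buf);              /* bytes per transaction */
    DMA.CH0.CTRLA    = DMA_CH_ENABLE_bm | DMA_CH_REPEAT_bm |
                       DMA_CH_SINGLE_bm | DMA_CH_BURSTLEN_2BYTE_gc;
}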

#24

daisy148 wrote:

I'm aiming for about 650,000 bytes per second, including protocol overhead. I reached about 530,000 with ASF if the data visualizer modules are accurate. 

Not to derail this thread too much, but after importing the CDC ASF code, my timer functions and LCD functions don't work. Is this one of those board_init problems where the ASF saves power on peripherals?

 

I'd suggest not using CDC because of the overhead, and just using a raw USB bulk endpoint. They are easy enough to talk to with libusb, and you can easily control the buffer size for maximum performance.
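Host side, the libusb-1.0 calls involved are few (real API; the VID/PID and the 0x81 IN endpoint below are invented for illustration):

#include <libusb-1.0/libusb.h>
#include <stdio.h>

int main(void)
{
    libusb_context *ctx;
    libusb_init(&ctx);
    libusb_device_handle *h = libusb_open_device_with_vid_pid(ctx, 0x1234, 0x5678);
    if (!h) { fprintf(stderr, "device not found\n"); return 1; }
    libusb_claim_interface(h, 0);

    unsigned char buf[4096];   /* big reads = fewer transfers = more throughput */
    int got;
    while (libusb_bulk_transfer(h, 0x81, buf, sizeof(buf), &got, 1000) == 0)
        fwrite(buf, 1, got, stdout);   /* EP1 IN straight to stdout */

    libusb_release_interface(h, 0);
    libusb_close(h);
    libusb_exit(ctx);
    return 0;
}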

#25

 

mojo-chan wrote:

I'd suggest not using CDC because of the overhead, and just using a raw USB bulk endpoint. They are easy enough to talk to with libusb, and you can easily control the buffer size for maximum performance.

Thanks for the info.  I should look into libusb.  I use winusb but I haven't figured out an easy way to allow others to use it.  

 

I think I could easily program the AVR for libusb. Pete Batard explains winusb and libusb on GitHub; the two are similar. That's where I learned how to program my AVR. Programming the PC was more difficult. Is there a site that explains how to implement libusb on the PC?

 

On another subject, do you use multipacket? I love it, but it may not be for everyone, at least with Microsoft's stupid USB CDC, because that gives us no way to send a ZLP. If you are just sending a continuous stream it may not be helpful, I suppose. I, and avrdude, send commands that can span multiple 64-byte packets. For this, multipacket is very good.

 

I think all of us have found that to get high throughput, we need to send data in big chunks.  At least 512 bytes.  Does your device accumulate multiple data items until you get a big chunk?

 

 

#26

The libusb manual is reasonably good. As long as you understand USB, with that and the sample code that comes with it you should have no problem with libusb. I prefer it to WinUSB because it is cross platform and not tied to the rather awkward way WinUSB likes to find devices.

 

Multipacket is essential to get good performance. I'm not sure what you mean about Microsoft's CDC, it definitely supports multi-packet and fairly large buffers.

 

My demo code (https://github.com/kuro68k/xrng) got the best performance with 4k buffers as I recall.

#27

mojo-chan wrote:
Multipacket is essential to get good performance. I'm not sure what you mean about Microsoft's CDC, it definitely supports multi-packet and fairly large buffers.

Thanks for the info. 

 

Microsoft's CDC does not allow the PC program to send a ZLP, at least I don't think so.  If you know how to do that, please let me know.  I believe you send mainly from the device to the PC so it's not a problem.

 

I found a way around it by calling SetCommBreak().  When my driver sees that setup packet come in, it terminates the read on the data endpoint.
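In Win32 terms the workaround looks something like this (SetCommBreak and ClearCommBreak are real Win32 calls; the handle and message are placeholders):

#include <windows.h>

static BOOL send_msg(HANDLE hCom, const void *msg, DWORD len)
{
    DWORD written;
    if (!WriteFile(hCom, msg, len, &written, NULL))
        return FALSE;
    Sleep(2);                /* let the bulk data drain first             */
    SetCommBreak(hCom);      /* device treats the break as end-of-message */
    return ClearCommBreak(hCom);
}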

 

Where do I find the libusb manual?  My way to find winusb devices is to use SetupDi.. to find the guid my devices send to the PC during initialization.  I'm thinking there may be other ways.

 

One problem I have in coming up with a simple thing others can use is I don't know what kind of programs other people write for the PC.  Are they simple console programs or are they some kind of a GUI?   

 

I rarely write "console" programs because I invented a very easy to use "text mode" GUI 20 years ago. Now I'm wondering if anyone would be interested in a simple, easy-to-use GUI. Maybe there is another GUI out there that's as easy. As an example of easy, here's the way to create a window.

 

new Windo("put windo name here");

 

Adding elements to the windo is almost as easy.

 

Here's what I see when I have 2 winusb devices plugged in.  I can select either one with a mouse click. 

 

 

 

#28

Daisy, here is a sample Xmega program that sets up the clock and uses my USB CDC driver.  There is a simple loop in main that checks to see if any data has come in.  If so, it echoes it back.  You can use a terminal emulator.  If you hit a key, you get something back.

 

 

Attachment(s): 
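The attachment itself isn't reproduced here; as a sketch only, the echo loop described looks roughly like this, with hypothetical function names (cdc_init, cdc_rx_count, cdc_read, cdc_write) standing in for the driver's real API:

#include <stdint.h>

/* Hypothetical driver API, standing in for the attached CDC code */
extern void    cdc_init(void);                       /* clocks + enumeration */
extern uint8_t cdc_rx_count(void);                   /* bytes waiting        */
extern void    cdc_read(uint8_t *buf, uint8_t n);
extern void    cdc_write(const uint8_t *buf, uint8_t n);

int main(void)
{
    cdc_init();
    uint8_t buf[64];
    for (;;) {
        uint8_t n = cdc_rx_count();          /* anything come in? */
        if (n > sizeof(buf)) n = sizeof(buf);
        if (n) {
            cdc_read(buf, n);                /* drain it...         */
            cdc_write(buf, n);               /* ...and echo it back */
        }
    }
}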

#29

steve17 wrote:

Microsoft's CDC does not allow the PC program to send a ZLP, at least I don't think so.  If you know how to do that, please let me know.  I believe you send mainly from the device to the PC so it's not a problem.

 

I don't understand. The ZLP is at a layer below the CDC stuff, the whole point of having CDC is to avoid getting into the detail of the underlying USB protocol. Are you saying that the CDC driver doesn't send large ZLP terminated packets?

 

Quote:
I found a way around it by calling SetCommBreak().  When my driver sees that setup packet come in, it terminates the read on the data endpoint.

 

Be careful with that kind of out-of-band signalling on CDC. The ASF code doesn't actually support all of it; I had to add some of the DTR/CTS/RTS stuff manually, from memory. Anyway, even if you do use it, the problem is that it's not synchronous with the in-band signalling.

 

Quote:
Where do I find the libusb manual?  My way to find winusb devices is to use SetupDi.. to find the guid my devices send to the PC during initialization.  I'm thinking there may be other ways.

 

http://libusb.info/ look under documentation.

 

You don't need the GUID with WinUSB; you can actually just search by VID/PID.

 

Quote:
I rarely write "console" programs because I invented a very easy to use "text mode" GUI 20 years ago.  Now I'm wondering if anyone would be interested in a simple easy to use gui.   Maybe there is another gui

out there that's as easy.  As an example of easy, here's the way to create a window.

 

I usually start with a console app because

 

a) It's a minimal, easy to understand example

b) It can be wrapped by other (GUI) apps or integrated into workflows

#30

mojo-chan wrote:

I don't understand. The ZLP is at a layer below the CDC stuff, the whole point of having CDC is to avoid getting into the detail of the underlying USB protocol. Are you saying that the CDC driver doesn't send large ZLP terminated packets?

CDC doesn't send a ZLP, period.

 

If I send a message from the PC using CDC whose size is exactly 64 bytes, or a multiple thereof, the receiver, if using multipacket, will not see end-of-message and will wait forever for a short packet. The sender needs to send a ZLP (a data packet with 0 bytes of data) to terminate the message. CDC doesn't do this. One solution is to shut off multipacket, read 64 bytes at a time, and reassemble them. Disgusting, to say the least. I might as well use RS232.

 

I find that SendCommBreak works well. It is part of the CDC protocol. I understand it is sent on another channel and could possibly arrive before the message. I've never seen that happen, but if it did, I could wait a few milliseconds after sending the message before sending the break command.

 

Microsoft is clueless about communications. This is an excerpt from Microsoft's WinUSB Functions for Pipe Policy Modification page.

 

#31

I could add, CDC is just another USB protocol like libusb and winusb.  It's the sender's responsibility to send the ZLP.  Only the sender knows where he wants the message to end.  USB makes no assumptions. 

 

To be more precise, it's the sender's responsibility to ensure all packets but the last one are full. The last one must be less than full. If, as luck would have it, the last one is also full, an empty one (a ZLP) must be sent.

 

In the case of our XMEGA USB hardware, there is an auto-ZLP flag we can set in the outgoing endpoint registers that will cause the USB hardware to send it for us if needed. In this case, the hardware assumes the write contains the whole message. The auto-ZLP is the equivalent of Microsoft's SHORT_PACKET_TERMINATE flag.
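For reference, roughly what that looks like at the register level (per the AU manual, AZLP is bit 15 of the endpoint's CNT field in the endpoint configuration table; the mask name below is invented, so check your device header for the official one):

#include <avr/io.h>

#define EP_AZLP_bm 0x8000   /* auto zero-length packet, CNT bit 15 */

/* Arm an IN endpoint for a multipacket transfer. With AZLP set, the
   hardware appends the ZLP itself if len lands on an exact multiple
   of the 64-byte endpoint size. */
static void ep_arm_in(USB_EP_t *ep, uint16_t len)
{
    ep->CNT = len | EP_AZLP_bm;  /* total bytes to send, plus the flag          */
    ep->AUXDATA = 0;             /* multipacket bookkeeping: bytes sent so far  */
    /* ...then clear BUSNACK0 in STATUS to hand the bank to the hardware */
}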

 

 

#32

steve17 wrote:

CDC doesn't send a ZLP, period.

 

If I send a message from the PC using CDC whose size is exactly 64 bytes, or a multiple thereof, the receiver, if using multipacket, will not see end-of-message and will wait forever for a short packet. The sender needs to send a ZLP (a data packet with 0 bytes of data) to terminate the message. CDC doesn't do this. One solution is to shut off multipacket, read 64 bytes at a time, and reassemble them. Disgusting, to say the least. I might as well use RS232.

 

I find that SendCommBreak works well. It is part of the CDC protocol. I understand it is sent on another channel and could possibly arrive before the message. I've never seen that happen, but if it did, I could wait a few milliseconds after sending the message before sending the break command.

 

Microsoft is clueless about communications. This is an excerpt from Microsoft's WinUSB Functions for Pipe Policy Modification page.

 

 

Okay, hang on... The image you posted is for WinUSB, right? But you are talking about the behaviour of "CDC", by which I assume you mean the usbser.sys driver... Or are you saying you are implementing the CDC protocol in WinUSB manually?

 

Also, in your follow-up post, you talk about libusb and WinUSB being protocols... They are not. libusb is just a generic USB access library that wraps various OS functions; it doesn't do any protocol work above the standard USB endpoints/pipes layer. WinUSB isn't a protocol either; it's similar to libusb. There is some stuff you can do with extra custom Microsoft descriptors, but that isn't actually tied to WinUSB and can be used with any USB device. It's just that WinUSB has specific support for accessing devices via GUID rather than the usual VID/PID.

 

As for SendCommBreak, my solution was the same as yours: Call it and then wait a few milliseconds to ensure that it gets sent before continuing to write to the virtual COM port.

#33

mojo-chan wrote:

 But you are talking about the behaviour of "CDC", by which I assume you mean the usbser.sys driver... Or are you saying you are implementing the CDC protocol in WinUSB manually?

No. I don't mix CDC and winusb. The context was "Microsoft is clueless about communications", and I gave an example. The "Most devices do not have this requirement" (meaning auto-ZLP) is nonsense. It's probably why Microsoft doesn't allow you to send a ZLP when using CDC.

 

No devices require it. But all devices that want to use multipacket occasionally need a ZLP. I don't think winusb actually needs it, because I think you can send a data packet with no data. I'm not sure of this, though.

 

Regarding SendCommBreak,  I guess it's more important to wait before sending it than afterwards.  That's because I have a suspicion that the control endpoint has a greater priority than the data endpoint.  But I'd have to look it up to be sure. 

 

 

 

#34

steve17 wrote:

The context was "Microsoft is clueless about communications", and I gave an example. The "Most devices do not have this requirement" (meaning auto-ZLP) is nonsense. It's probably why Microsoft doesn't allow you to send a ZLP when using CDC.

 

That doesn't make sense.

 

Quote:
No devices require it.  But all devices that want to use multipacket occasionally need a ZLP.

 

If you send an exact multiple of the endpoint size you don't need a ZLP. That's how I managed to get the maximum 8Mb/sec in my test.

 

Quote:
Regarding SendCommBreak,  I guess it's more important to wait before sending it than afterwards.  That's because I have a suspicion that the control endpoint has a greater priority than the data endpoint.  But I'd have to look it up to be sure. 

 

That is technically correct, the data endpoint is bulk which gets lowest priority. However, I wouldn't rely on it.

#35

mojo-chan wrote:

If you send an exact multiple of the endpoint size you don't need a ZLP. That's how I managed to get the maximum 8Mb/sec in my test.

That's interesting.  I hadn't thought of that.  I take it you need high throughput from the PC to the device.

 

I have another technique that works for me.  I don't need to send a lot of data and high speed isn't necessary.  I just send and receive various messages that are from 20 to 120 bytes.  All my messages are structs.  The structs are in a Messages folder.  The PC software and the AVR software share the same structs.  Very handy for sending binary data.  If it should happen that one of them has a size of 64 or a multiple, I just add a dummy byte in the struct.
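A sketch of that shared-header idea (all names invented): packed so the PC and AVR compilers agree on the layout, and padded so the size is never a multiple of 64:

#include <stdint.h>

typedef struct __attribute__((packed)) {
    uint8_t  msg_id;
    uint8_t  flags;
    uint16_t value[30];   /* 62 bytes so far...                          */
    uint8_t  pad;         /* ...63 total: never a multiple of 64, no ZLP */
} sensor_msg_t;

_Static_assert(sizeof(sensor_msg_t) % 64 != 0, "message would need a ZLP");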

 

Various schemes can work if you write the software on both ends. It's when others write the PC software that it could be problematic. I use a USB CDC bootloader with avrdude. I've used it hundreds of times without a problem. That's because, due to the nature of avrdude's commands, it's quite unlikely that it will send one of the wrong size. It will probably happen sometime, though.

#36

steve17 wrote:
That's interesting.  I hadn't thought of that.  I take it you need high throughput from the PC to the device.

 

That was sending device to PC.

 

Quote:
I have another technique that works for me.  I don't need to send a lot of data and high speed isn't necessary.  I just send and receive various messages that are from 20 to 120 bytes.  All my messages are structs.  The structs are in a Messages folder.  The PC software and the AVR software share the same structs.  Very handy for sending binary data.  If it should happen that one of them has a size of 64 or a multiple, I just add a dummy byte in the struct.

 

That's what I do too. It's actually mandatory for some protocols, e.g. you can't send half an HID report. You can't send less than the endpoint size either, it has to be either an exact multiple or you send a ZLP afterwards. Padding gives the best performance.