virtual serial ports


I'm trying to use a virtual serial port with the USBtoSerial demo from MyUSB. The device manager says I have a healthy COM8 and that it comes from MyUSB. Both lights are green on the AT90USBKey. Bray's terminal can't find it: it's not listed as a COM option when I click on COMS; only COM1 is listed.

What are the magic words to make this work?

EDIT:
I can make Bray's terminal see it now, but it still doesn't work. I can type two characters into Bray's terminal, and it says it has transmitted two characters. If I type a third character, Bray's terminal hangs; it doesn't even refresh its window. I have to kill it by clicking on the x in the corner.

"Demons after money.
Whatever happened to the still beating heart of a virgin?
No one has any standards anymore." -- Giles


Have you tried the same tests with Hyperterm? It's a horrible program but generally reliable...


barnacle wrote:
Have you tried the same tests with Hyperterm? It's a horrible program but generally reliable...
It just hangs.

EDIT:
Eventually, i.e. just after I got bored enough to type another reply, it told me it was unable to open COM8.

"Demons after money.
Whatever happened to the still beating heart of a virgin?
No one has any standards anymore." -- Giles


skeeve wrote:
barnacle wrote:
Have you tried the same tests with Hyperterm? It's a horrible program but generally reliable...
It just hangs.

Disable and re-enable the COM port. Something may be using it.


Adam_Y wrote:
skeeve wrote:
barnacle wrote:
Have you tried the same tests with Hyperterm? It's a horrible program but generally reliable...
It just hangs.

Disable and re-enable the COM port. Something may be using it.
Not any more. The first time I tried it, I got error code 10. The next time, the virtual serial port went away.

"Demons after money.
Whatever happened to the still beating heart of a virgin?
No one has any standards anymore." -- Giles


Do I need a particular baud rate or something?

"Demons after money.
Whatever happened to the still beating heart of a virgin?
No one has any standards anymore." -- Giles


skeeve wrote:
Do I need a particular baud rate or something?

No. As long as the baud rates of the device and the computer match, it should work.


No, any baud - turns out I missed calling the ReconfigureUSART() function from the SetLineCoding request anyway. Right, added to my todo list for this evening -- perhaps I screwed something up when I modified it for USBXX2 compatibility.

Does the CDC example work for you?

- Dean :twisted:



abcminiuser wrote:
No, any baud - turns out I missed calling the ReconfigureUSART() function from the SetLineCoding request anyway. Right, added to my todo list for this evening -- perhaps I screwed something up when I modified it for USBXX2 compatibility.

Does the CDC example work for you?

Yes.

Does "any baud" mean that the virtual COM will accept any baud rate, including 12 megabaud?

"Demons after money.
Whatever happened to the still beating heart of a virgin?
No one has any standards anymore." -- Giles


Since changing the baud doesn't currently *do* anything, yes - use any settings you like ;). Once the ReconfigureUSART() call is added to the SetLineCoding request (after loading in the new line coding options), the baud will actually have some effect. You'll only get readable data for baud rates that the 8 MHz AVR can handle, but you won't break anything by choosing other baud rates.

I'll check out the CDC and USBtoSerial examples on my XP machine tonight, and see if it works for me. Please try the CDC example and see if you encounter any problems with that.

- Dean :twisted:
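
As a rough illustration of the fix described above, here is a minimal sketch of where the ReconfigureUSART() call slots into the SetLineCoding handler, and of the kind of thing it might do with the new settings. Only RequestType, LineCoding, ReconfigureUSART() and the request-type macros are taken from code in this thread; the SET_LINE_CODING constant, the BaudRateBPS field name and the control-transfer housekeeping (shown as comments) are assumptions, so treat this as a sketch rather than the actual MyUSB code.

case SET_LINE_CODING:
	if (RequestType == (REQDIR_HOSTTODEVICE | REQTYPE_CLASS | REQREC_INTERFACE))
	{
		/* ...acknowledge the setup packet and read the new LineCoding
		   structure from the control endpoint (see the byte-by-byte
		   read posted further down this thread)... */

		ReconfigureUSART();   /* apply the new settings to the real USART */

		/* ...complete the status stage of the control transfer... */
	}
	break;

/* Hypothetical sketch of ReconfigureUSART() itself, assuming F_CPU is
 * defined as 8000000 and that LineCoding holds the requested baud rate
 * in a BaudRateBPS field (double-speed mode, 8N1 on USART1):           */
void ReconfigureUSART(void)
{
	UBRR1  = (F_CPU / (8UL * LineCoding.BaudRateBPS)) - 1;
	UCSR1A = (1 << U2X1);
	UCSR1B = (1 << RXCIE1) | (1 << RXEN1) | (1 << TXEN1);
	UCSR1C = (1 << UCSZ11) | (1 << UCSZ10);
}

At 8 MHz the UBRR value above only lands close enough to the requested rate for the usual baud rates, which is why only some settings will give readable data once the call is in place.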



abcminiuser wrote:
Since changing the baud doesn't currently *do* anything, yes - use any settings you like ;). Once the ReconfigureUSART() call is added to the SetLineCoding request (after loading in the new line coding options), the baud will actually have some effect. You'll only get readable data for baud rates that the 8 MHz AVR can handle, but you won't break anything by choosing other baud rates.
Eventually it will become USBtoSPI. As currently intended, are controls applied to the virtual COM, e.g. baud rate changes, supposed to be propagated to the real USART?
Quote:
I'll check out the CDC and USBtoSerial examples on my XP machine tonight, and see if it works for me. Please try the CDC example and see if you encounter any problems with that.
The CDC example works for me.

"Demons after money.
Whatever happened to the still beating heart of a virgin?
No one has any standards anymore." -- Giles


Quote:

As currently intended, are controls applied to the virtual COM, e.g. baud rate changes, supposed to be propagated to the real USART?

Yes, I just forgot to call the reconfiguration routine after the new settings were loaded from the host. Dumb mistake!

Quote:

The CDC example works for me.

Excellent - the USBtoSerial demo is built off the same codebase. That narrows it down a little, will check it out tonight.

- Dean :twisted:



abcminiuser wrote:
Quote:

The CDC example works for me.

Excellent - the USBtoSerial demo is built off the same codebase. That narrows it down a little, will check it out tonight.

Also, I made the attached version. It avoids doing anything at all with the USART; it is just supposed to change an LED when it reads something from USB. The light doesn't change. As before, Bray's terminal hangs after three characters and only announces transmission of the first two.

Attachment(s): 

"Demons after money.
Whatever happened to the still beating heart of a virgin?
No one has any standards anymore." -- Giles


Hi Dean, the following code:

case GET_LINE_CODING:
	if (RequestType == (REQDIR_HOSTTODEVICE | REQTYPE_CLASS | REQREC_INTERFACE))
	{

should be:

case GET_LINE_CODING:
	if (RequestType == (REQDIR_DEVICETOHOST | REQTYPE_CLASS | REQREC_INTERFACE))
	{

Otherwise, the GET_LINE_CODING request will never be handled.

One thing I'm not sure about is the Endpoint_Read_Stream_LE function; I've changed that line to multiple Endpoint_Read_Byte calls, and it seems to work better.

It still hangs here and there, but I'm working on it too.
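
To make the direction fix above concrete, here is a sketch of what the corrected GET_LINE_CODING branch is meant to do: return the current LineCoding structure to the host. The byte-by-byte write mirrors the read loop Dean posts later in the thread, and the control-transfer housekeeping is reduced to comments because the exact helper names vary between MyUSB versions, so this is illustrative only.

case GET_LINE_CODING:
	if (RequestType == (REQDIR_DEVICETOHOST | REQTYPE_CLASS | REQREC_INTERFACE))
	{
		/* ...acknowledge the setup packet... */

		/* Send the current line coding back to the host, one byte at a time */
		uint8_t* LineCodingStream = (uint8_t*)&LineCoding;

		for (uint8_t LineCodingB = 0; LineCodingB < sizeof(LineCoding); LineCodingB++)
			Endpoint_Write_Byte(*(LineCodingStream++));

		/* ...send the IN data and complete the status stage... */
	}
	break;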


Darn, you're right -- missed that on both examples. Interesting that the host doesn't complain (or at least, mine doesn't) when the request isn't correctly handled. Apparently the USB2SER.sys driver tolerates the GET_LINE_CODING request going unanswered.

The stream functions are new, and so might have some bugs. But all they really do is read in the given number of bytes from the USB endpoint, over multiple packets if required.

- Dean :twisted:
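
As a rough picture of what such a stream read amounts to on a bulk OUT endpoint: the loop below is illustrative only, not the actual MyUSB implementation. The function name Stream_Read_Sketch is made up, and the endpoint helpers are the ones used elsewhere in this thread.

static void Stream_Read_Sketch(uint8_t* Buffer, uint16_t Length)
{
	while (Length--)
	{
		/* If the current packet has been fully read, release the bank
		   and wait for the next packet to arrive from the host */
		if (!(Endpoint_ReadWriteAllowed()))
		{
			Endpoint_FIFOCON_Clear();
			while (!(Endpoint_ReadWriteAllowed()));
		}

		*(Buffer++) = Endpoint_Read_Byte();
	}
}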



BUGGER. I just realised what I've done -- didn't read my own notes. The stream functions don't work on control requests; only on IN and OUT endpoints.

Replace the stream read with:

uint8_t* LineCodingStream = (uint8_t*)&LineCoding;

for (uint8_t LineCodingB = 0; LineCodingB < sizeof(LineCoding); LineCodingB++)
	*(LineCodingStream++) = Endpoint_Read_Byte();

And I'll patch the demos tonight.

- Dean :twisted:



Try the latest code at:

http://www.fourwalledcubicle.com...

I've patched it to have the above correction, as well as be slightly more reliable by checking the buffers and endpoints to see if they're full before use.

- Dean :twisted:



abcminiuser wrote:
Try the latest code at:

http://www.fourwalledcubicle.com...

I've patched it to have the above correction, as well as be slightly more reliable by checking the buffers and endpoints to see if they're full before use.

Thanks. Good things are happening now.

EDIT:
I've managed to write code that controls the blink rate of an LED from USB input. It doesn't use the ring buffers or the USART.

"Demons after money.
Whatever happened to the still beating heart of a virgin?
No one has any standards anymore." -- Giles

Last Edited: Fri. Mar 21, 2008 - 10:11 PM

Hi Dean, I think I found the reason why it was hanging up from time to time.

First, I changed ISR(USART1_RX_vect) a little to check for framing and data-overrun errors, and found that at start-up the first bytes have a lot of framing errors.

But the 'bug' sits in the RingBuff routines: when updating the pointers and the element counter (you can guess it already), one must disable interrupts, otherwise the UART interrupt will mess things up.

I've used a simpler buffer to test it all, and so far it works without any problems. The buffer doesn't even fill up much after testing it a little.

The ISR routine I’ve used to test it:

ISR(USART1_RX_vect) 
{
	uint8_t bad_data;

	if ((UCSR1A & (FRAMING_ERROR | DATA_OVERRUN | PARITY_ERROR))==0)
	{
		uart_rx_buffer[uart_rx_wr_index] = UDR1;	// save the character 
		if (++uart_rx_wr_index == UART_RX_BUFFER_SIZE) uart_rx_wr_index = 0;
		if (++uart_rx_count == UART_RX_BUFFER_SIZE)
		{
			uart_rx_count = 0;
			uart_rx_buffer_overflow = 1;
		}
		PORTD ^= (1 << 4);
	}
	else {
		bad_data = UDR1;
		uart_rx_bad_data++;
	}
}

The part where I retrieve from the buffer:

		/* Select the Serial Tx Endpoint */
		Endpoint_SelectEndpoint(CDC_TX_EPNUM);
		/* Check if the uart rx buffer contains anything to be sent to the host */
		if (uart_rx_count) {
			/* Wait until Serial Tx Endpoint Ready for Read/Write */
			while (!(Endpoint_ReadWriteAllowed()));

			/* Write the transmission buffer contents to the received data endpoint */
			while (uart_rx_count  && (Endpoint_BytesInEndpoint() < CDC_TXRX_EPSIZE)){
			    Endpoint_Write_Byte (uart_rx_buffer[uart_rx_rd_index]);  // get char from buffer
				cli();
				if (++uart_rx_rd_index == UART_RX_BUFFER_SIZE) uart_rx_rd_index = 0;
				--uart_rx_count;
				sei();
			}
			// Send the data 
			Endpoint_FIFOCON_Clear();	
		}

I’ve used the same buffer and principle for the TX ISR, just to test it.

Anyway, keep up the good work, Dean. This helped me (and probably many people) a lot to understand a little part of the USB specification, but there's still a lot to learn. Thanks!


Quote:

But the 'bug' sits in the RingBuff routines: when updating the pointers and the element counter (you can guess it already), one must disable interrupts, otherwise the UART interrupt will mess things up.

Darn, it does too! The code only kicks in the atomic blocks when the buffer is declared volatile, and also when the buffer contents are larger than a char. Of course, that causes problems, as you've seen, when the buffer stores chars but an interrupt occurs while the pointers are being manipulated...

Fixed the code, and I'll update the ringbuffer on my site. Cheers!

- Dean :twisted:
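
For anyone else hitting the same problem, here is a minimal sketch of the kind of guard that fixes it, using avr-libc's ATOMIC_BLOCK; the buffer layout below is illustrative and not the actual RingBuff code.

#include <util/atomic.h>   /* avr-libc interrupt-safe block macros */
#include <stdint.h>

#define BUFF_LENGTH 64

typedef struct
{
	uint8_t          Buffer[BUFF_LENGTH];
	uint8_t          InPtr;
	uint8_t          OutPtr;
	volatile uint8_t Elements;
} ExampleBuff_t;

/* Store a byte from mainline code. The pointer and element-count update
   must not be interrupted by the USART ISR that touches the same buffer,
   so the whole update runs with interrupts disabled and then restored.  */
static void ExampleBuff_Store(ExampleBuff_t* Buff, uint8_t Data)
{
	ATOMIC_BLOCK(ATOMIC_RESTORESTATE)
	{
		Buff->Buffer[Buff->InPtr] = Data;

		if (++Buff->InPtr == BUFF_LENGTH)
			Buff->InPtr = 0;

		Buff->Elements++;
	}
}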



Hi Dean, I noticed the atomic blocks and assumed they would disable interrupts somehow, but to understand that part I need to learn more about AVRGCC.

One last thing I noticed:

#define CDC_NOTIFICATION_EPSIZE        	8

This should be greater than 8: the header alone already takes 8 bytes, so you need at least two more for the data. I've set it to 16 instead.


The current demo code doesn't use the notification endpoint, so I just made it as small as possible.

If an endpoint is smaller than the data being pushed into it, the data can be sent in multiple packets. So in that case you'd shove in the header, send it, write the data to the endpoint and then send that.

- Dean :twisted:
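
A sketch of that two-packet approach, using the endpoint helpers that appear earlier in this thread; CDC_NOTIFICATION_EPNUM and the Header/Data arrays are placeholders, so take the shape rather than the names.

/* Push a 10-byte notification through an 8-byte endpoint as two packets */
Endpoint_SelectEndpoint(CDC_NOTIFICATION_EPNUM);

/* Packet 1: the 8-byte notification header */
for (uint8_t HeaderByte = 0; HeaderByte < 8; HeaderByte++)
	Endpoint_Write_Byte(Header[HeaderByte]);

Endpoint_FIFOCON_Clear();                  /* send the first packet */

while (!(Endpoint_ReadWriteAllowed()));    /* wait for the bank to free up */

/* Packet 2: the remaining two data bytes */
for (uint8_t DataByte = 0; DataByte < 2; DataByte++)
	Endpoint_Write_Byte(Data[DataByte]);

Endpoint_FIFOCON_Clear();                  /* send the second packet */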



OK, I see. I hadn't thought about that; good to know.


Quote:

OK, I see. I hadn't thought about that; good to know.

That's why I'm writing a book on this stuff! ;)

- Dean :twisted:
