ATmega16M1 LIN/UART bit error

#1

Hello everyone,

First of all I want to introduce myself. I'm currently developing a LIN communication board, and as I don't have much experience with AVRs, I've been running into problems with it.

 

I'm having trouble using the ATmega16M1 LIN/UART controller in LIN mode.

I'm using the Atmel application note AVR286: LIN Firmware Base for LIN/UART Controller to test the controller.

The problem is that when I try to transmit a frame using lin_tx_header() or lin_tx_response(), the LIN bit error (LBERR) flag in the LINERR register is set.

 

According to the datasheet:

"LBERR: LIN Bit Error. A unit that is sending a bit on the bus also monitors the bus. A LIN bit error will be flagged when the bit value that is monitored is different from the bit value that is sent. After detection of a LIN bit error the transmission is aborted."
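Read literally, the datasheet rule is a per-bit compare of what the transmitter drives against what it reads back on RXLIN. A rough host-side model of that rule (plain C, no hardware involved; the function name is made up for illustration):

```c
#include <stdbool.h>

/* Illustrative model of the rule quoted above: the transmitter samples
 * the bus while driving it and flags LBERR on the first mismatch.
 * This only compares bit values; real hardware samples at a defined
 * point within the bit time, so gross timing errors also show up as
 * value mismatches. */
static bool lin_bit_error(const int sent[], const int seen[], int n)
{
    for (int i = 0; i < n; i++)
        if (sent[i] != seen[i])
            return true;   /* mismatch -> LBERR, frame aborted */
    return false;
}
```

With nothing feeding the transmitted bits back to RXLIN (no transceiver echo, no jumper), every dominant bit the controller sends reads back recessive, so the very first dominant bit of the break field already trips this check.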

 

I don't know exactly what this means: whether it only compares bit values, or whether it takes bit timing into account as well.

 

So after that I decided to try the listening mode of the controller:

"The listening mode connects the internal Tx LIN and the internal Rx LIN together. In this mode, the TXLIN output pin is disabled and the RXLIN input pin is always enabled. The same scheme is available in UART mode."
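For reference, a minimal sketch of how listening mode gets selected. If I'm reading the datasheet correctly, LCONF1:0 = 11b in LINCR selects listening mode (in both LIN and UART configurations). The bit positions below are from the ATmega16M1 datasheet but worth double-checking, and LINCR is mocked as a plain variable so the snippet compiles off-target:

```c
#include <stdint.h>

/* Mocked register -- on real hardware, include <avr/io.h> and drop this. */
static uint8_t LINCR;

/* LINCR bit positions per the ATmega16M1 datasheet (verify against yours). */
enum { LCMD0 = 0, LCMD1 = 1, LCMD2 = 2, LENA = 3, LCONF0 = 4, LCONF1 = 5 };

static void lin_enter_listening_mode(void)
{
    LINCR |= (1 << LCONF1) | (1 << LCONF0); /* 11b: internal Tx->Rx loop */
    LINCR |= (1 << LENA);                   /* enable the controller */
}
```

Because the internal loopback guarantees that every transmitted bit is read back unchanged, the bit-error monitor can never fire in this mode, which is consistent with TXOK being set here but not in normal mode.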

And, surprise: it works fine! (Or at least that's what I suppose, because the TXOK interrupt flag was set.)

 

I also tried using UART commands instead of LIN, and the communication works fine too (in normal mode), so I guess the problem is the LIN mode itself.

 

Debugging with a JTAGICE mkII and IAR, the bit error interrupt flag is set as soon as the program writes a LIN Tx command.

 

I've also tried pulling up the Tx pin externally, but that doesn't help.

 

The ATmega is programmed via ISP, and the programming MOSI is the same pin as TXLIN (PD3: PCINT19/TXD/TXLIN/OC0A/SS/MOSI_A). I've tried disconnecting this pin from the programmer once the device is programmed, but the error doesn't disappear.

 

Thanks a lot for your time.

#2

Now I've noticed that if I connect LINTX (pin 2) to LINRX (pin 12), the transmission is correct, but I can't understand why. A LIN transceiver is supposed to be connected to Tx and Rx, so it makes no sense to connect them together at the microcontroller.

The only explanation I can come up with is that when the LIN transceiver is transmitting, the frame is also echoed back to the microcontroller on the Rx pin, so the monitoring unit doesn't flag a bit error.
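That echo path is exactly what the LBERR monitor needs: with no transceiver and no Tx-Rx jumper, the transmitted bits never come back on RXLIN, so the read-back compare fails on the first dominant bit. A hypothetical post-transmit check for this case, with the registers mocked so it compiles on a host (bit positions are from the datasheet but worth verifying):

```c
#include <stdint.h>

/* Mocked registers -- real code would use <avr/io.h> instead. */
static uint8_t LINSIR, LINERR;

enum { LERR  = 3 };  /* LIN error interrupt flag, in LINSIR */
enum { LBERR = 0 };  /* LIN bit error flag, in LINERR */

/* Returns 1 if the last transfer ended with a bit error specifically. */
static int lin_tx_bit_error(void)
{
    if (LINSIR & (1 << LERR))            /* any LIN error pending?  */
        return (LINERR >> LBERR) & 1;    /* was it a bit error?     */
    return 0;
}
```

Checking LINERR like this distinguishes a wiring/echo problem (LBERR) from framing, checksum, or parity errors, which land in other LINERR bits.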

#3

When I was working with the LIN bus on my mega32m1, all the problems disappeared when I added a LIN transceiver. The thing is, the microcontroller reads the message back as it sends it out, so there can't be any errors. If an error does occur, the transmission is aborted and the error flag is set.

 

So what you need to do is add the transceiver, or connect the Rx and Tx pins directly.

#4

espinete951 wrote:

Now I've noticed that if I connect LINTX (pin 2) to LINRX (pin 12), the transmission is correct, but I can't understand why. A LIN transceiver is supposed to be connected to Tx and Rx, so it makes no sense to connect them together at the microcontroller.

The only explanation I can come up with is that when the LIN transceiver is transmitting, the frame is also echoed back to the microcontroller on the Rx pin, so the monitoring unit doesn't flag a bit error.

 

I know this is an old post, but did you ever find a solution? I have the same problem: I can only transmit if I jumper the Tx/Rx pins.