There's already plenty of information (and code samples and opinions) about CRC16 out there, but hardly anything about which of these implementations are compatible with each other. What I've done so far doesn't work, and I can't find an elegant solution.
I have two devices and applications. One is an ATtiny1614 programmed in C++ with Atmel Studio. There's the util/crc16.h header with the _crc16_update function, and I understand how to call it. I feed it a number of bytes, including two final bytes that are 0, and then write the result into those final bytes (as a number, so little endianness applies here). This message is sent via RS-485 to a PC.
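This is roughly what the device side does; the struct layout, the names and the initial value 0xFFFF are simplified stand-ins for my real code:

```cpp
#include <stddef.h>
#include <stdint.h>
#include <util/crc16.h>

struct Message {
    uint16_t fieldA;
    uint16_t fieldB;
    uint16_t crc;   // the two final bytes, set to 0 before computing
};

void appendCrc(Message &msg)
{
    msg.crc = 0;
    uint16_t crc = 0xFFFF;   // not sure this is the right initial value
    const uint8_t *p = reinterpret_cast<const uint8_t *>(&msg);
    for (size_t i = 0; i < sizeof(msg); i++)
        crc = _crc16_update(crc, p[i]);
    msg.crc = crc;   // AVR is little-endian, so the low byte is stored first
}
```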
The other is a PC application written in C# with Visual Studio. There's sample code available to compute the CRC16 of some bytes, and I follow the same procedure: I set a uint16 field and take its memory bytes, so here, too, little endianness applies. To verify the checksum, I should just be able to compute the value over all received bytes and get 0.
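The C# verification logic is equivalent to this (written in C++ here so it's easier to compare with the device side; crc16_update stands in for the sample routine I found and follows the bit-by-bit loop documented for _crc16_update):

```cpp
#include <stddef.h>
#include <stdint.h>

// Same polynomial (0xA001, reflected) as avr-libc's _crc16_update.
uint16_t crc16_update(uint16_t crc, uint8_t data)
{
    crc ^= data;
    for (int i = 0; i < 8; i++)
        crc = (crc & 1) ? (crc >> 1) ^ 0xA001 : (crc >> 1);
    return crc;
}

// My verification assumption: recompute over every received byte,
// including the two trailing CRC bytes, and expect the result to be 0.
bool verifyFrame(const uint8_t *frame, size_t length)
{
    uint16_t crc = 0xFFFF;   // same (assumed) initial value as the sender
    for (size_t i = 0; i < length; i++)
        crc = crc16_update(crc, frame[i]);
    return crc == 0;
}
```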
But it's not. The check doesn't even come out as 0 when I add the checksum in C# and immediately verify it with the same code. Both sides, C and C#, consider the other side's CRC16 wrong.
So what's going on here? Is my assumption about how to use a CRC correct? Are the two algorithms compatible with each other? Is the verification result always supposed to be 0? Should I write my own CRC code and port it to the other language, to guarantee that both sides use the same algorithm? And could I still call that "CRC16" if I decide to publish my communication protocol and allow third-party implementations?
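One thing I considered, to at least pin down which variant each side implements: feed the standard test string "123456789" through both and compare against the published check values, 0xBB3D for CRC-16/ARC (initial value 0x0000) and 0x4B37 for CRC-16/MODBUS (initial value 0xFFFF), both of which use the same reflected polynomial 0xA001 as _crc16_update. A self-contained sketch of that test, repeating the update routine from above:

```cpp
#include <stdint.h>
#include <stdio.h>
#include <string.h>

// Same bit-by-bit routine as above, repeated so this compiles on its own.
uint16_t crc16_update(uint16_t crc, uint8_t data)
{
    crc ^= data;
    for (int i = 0; i < 8; i++)
        crc = (crc & 1) ? (crc >> 1) ^ 0xA001 : (crc >> 1);
    return crc;
}

int main()
{
    const char *test = "123456789";
    uint16_t arc = 0x0000;    // CRC-16/ARC starts at 0x0000
    uint16_t modbus = 0xFFFF; // CRC-16/MODBUS starts at 0xFFFF
    for (size_t i = 0; i < strlen(test); i++) {
        arc = crc16_update(arc, (uint8_t)test[i]);
        modbus = crc16_update(modbus, (uint8_t)test[i]);
    }
    printf("init 0x0000: 0x%04X (CRC-16/ARC check value is 0xBB3D)\n", (unsigned)arc);
    printf("init 0xFFFF: 0x%04X (CRC-16/MODBUS check value is 0x4B37)\n", (unsigned)modbus);
    return 0;
}
```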
How do you use CRC16 in your applications? This is the first time I need a solution that's compatible across platforms. The rest of the message, a C struct of a few uint16 values, is sent and received correctly on both ends; it's just the checksum that causes trouble.