I'm communicating with a BT module that requires a reset during initialization. After triggering RESET, it holds the TX line low for some time, which I think is interpreted as a LIN break. After that I get an SD_BREAK_DETECTED event and I have to call chIQResetI(&SD2.iqueue) to clear that condition. If I don't, I never get any events with the CHN_INPUT_AVAILABLE flag again.
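For reference, my current workaround looks roughly like this (a sketch, not my full event loop; `el` and the event mask value are just illustrative names):

```c
static event_listener_t el;

/* Listen for both incoming data and break events on SD2. */
chEvtRegisterMaskWithFlags(chnGetEventSource(&SD2), &el, EVENT_MASK(0),
                           CHN_INPUT_AVAILABLE | SD_BREAK_DETECTED);

while (true) {
  chEvtWaitAny(EVENT_MASK(0));
  eventflags_t flags = chEvtGetAndClearFlags(&el);

  if (flags & SD_BREAK_DETECTED) {
    /* Without this reset, no further CHN_INPUT_AVAILABLE events arrive. */
    chSysLock();
    chIQResetI(&SD2.iqueue);
    chSysUnlock();
  }
  if (flags & CHN_INPUT_AVAILABLE) {
    /* ...read and process the incoming bytes... */
  }
}
```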
From what I can read in the STM32F1 reference manual:
If the LIN mode is disabled (LINEN=0), the receiver continues working as normal USART,
without taking into account the break detection.
If the LIN mode is enabled (LINEN=1), as soon as a framing error occurs (i.e. stop bit
detected at ‘0’, which will be the case for any break frame), the receiver stops until the break
detection circuit receives either a ‘1’, if the break word was not complete, or a delimiter
character if a break has been detected.
With the LINEN flag cleared, the receiver should continue without any special action. So I changed the default configuration to make sure that flag is disabled:
const SerialConfig serialConf = {
  SERIAL_DEFAULT_BITRATE,
  0,                     /* USART CR1 */
  USART_CR2_STOP1_BITS,  /* USART CR2: USART_CR2_LINEN not set */
  0                      /* USART CR3 */
};
....
sdStart(&SD2, &serialConf);
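As a sanity check I also read the register back after starting the driver, to confirm that LIN mode really is disabled in hardware (assuming here that USART2 is the peripheral backing SD2):

```c
sdStart(&SD2, &serialConf);

/* Verify the driver did not re-enable LIN mode behind my back. */
if (USART2->CR2 & USART_CR2_LINEN) {
  /* unexpected: LINEN is still set */
}
```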
However, the behaviour is still the same (SD_BREAK_DETECTED handling is still required). Is this the correct behaviour? Why?
Thanks