Here we go.
I'm almost done, just need your feedback on some points.
First, let's explain how it works.
The interface is a copy of the Serial Driver's, with the exception of the LLD part, which does not exist here for obvious reasons (there is no hardware to drive).
The driver is built around a tick function, ssdTickI, that the user must call at a fixed rate of at least 2 times the desired bit rate, 4 times being ideal. This function increments a counter (ssdp->tick) and processes two state machines, one for TX and another for RX.
Every operation on the line (TX/RX) is done in tick slots, which are calculated from two events: a start bit on RX, and a byte being inserted into the TX queue.
The operation is really straightforward. RX detects a start bit, waits one bit-rate multiplier plus half a multiplier (the start bit plus half a bit), samples a bit, shifts it in, and so on until all data bits are in; it then waits for the stop bit and, if it is not received, flags a framing error, otherwise it adds the received byte to the queue. TX starts when a byte is added to the queue: it sends a start bit, waits one bit-rate multiplier, sends a bit, shifts, and so on through the stop bit, then restarts the process for the next byte.
I've tested this code on a Bluepill (STM32F103) at bit rates up to 115200. At higher bit rates (57600 and up), if debug is enabled (CH_DBG_STATISTICS, CH_DBG_SYSTEM_STATE_CHECK, CH_DBG_ENABLE_CHECKS, CH_DBG_ENABLE_ASSERTS, CH_DBG_TRACE_MASK, CH_DBG_ENABLE_STACK_CHECK, CH_DBG_FILL_THREADS) it could not keep up with the interrupts and got stuck. Without debug it works great. Also, with debug enabled I get a lot of jitter in the PAL ISR; even at slow bit rates I got some shifted samples (see point #3, regarding externalizing this function).
Here are some points I'd like feedback on to finish things up:
1 - I'm reusing status flags from the Serial Driver (with different names), but only 2 of them. Is it OK to keep the same values, or should mine start from 32 as the Serial Driver's do?
I believe I should keep the same values, so that if I implement other features in the future, like parity, I can reuse the corresponding Serial Driver values.
Code:
/**
 * @name    Serial status flags
 * @{
 */
#define SD_PARITY_ERROR     (eventflags_t)32   /**< @brief Parity.     */
#define SD_FRAMING_ERROR    (eventflags_t)64   /**< @brief Framing.    */
#define SD_OVERRUN_ERROR    (eventflags_t)128  /**< @brief Overflow.   */
#define SD_NOISE_ERROR      (eventflags_t)256  /**< @brief Line noise. */
#define SD_BREAK_DETECTED   (eventflags_t)512  /**< @brief LIN Break.  */
#define SD_QUEUE_FULL_ERROR (eventflags_t)1024 /**< @brief Queue full. */
/** @} */
Code:
/**
 * @name    Software Serial status flags
 * @{
 */
#define SSD_FRAMING_ERROR    (eventflags_t)64   /**< @brief Framing.    */
#define SSD_QUEUE_FULL_ERROR (eventflags_t)1024 /**< @brief Queue full. */
/** @} */
2 - I added a function (ssdTickI) which must be called at X times the desired bit rate (I'm using 4 times); it keeps track of the driver tick and also runs the state machines for the driver. Is that OK? I used the I-class because of the lock needed to keep the internal state consistent.
3 - To detect the start bit I'm using palEnableLineEvent, but if the code is running in debug mode I get a lot of jitter in the callback, which affects the bit sampling. I could get rid of this by adding another function, like ssdStartBitI, so the user could use other methods to detect the start bit, such as a fast IRQ. Would this approach be more likely to be accepted? It would also remove the dependency on PAL.