> No, a buggy dongle that doesn't signal to the driver that its buffer
> is full, or a driver that doesn't respect a signal and/or manage the
> buffer properly. A UART has a tiny 16-byte buffer, and whatever is
> managing that has to ensure not to overrun it. It's a super simple
> piece of hardware, and if something isn't managing it properly then
> you just end up with garbage going out the serial line.
I don't see concrete evidence of a buffer overflow or a dropped character in this thread. Not ruling it out, but I wouldn't assume that until there's proof.
FWIW, I've never seen a USB dongle drop characters, but I have seen buffering issues a lot. USB devices have much more overhead in device driver <-> HW communication, and buffering more aggressively increases CPU and bus efficiency to counter that overhead.
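To make the FIFO point above concrete, here is roughly what "managing" a 16-byte UART FIFO from software can look like, using the inter-character-delay approach mentioned later in this thread. A minimal pyserial sketch; the 1 ms delay is an illustrative assumption, not a measured value:

import time
import serial  # pyserial

# Minimal sketch: pace writes so a 16-byte UART FIFO is never overrun.
# The 1 ms inter-character delay is an illustrative assumption.
def paced_write(ser: serial.Serial, data: bytes, delay: float = 0.001):
    for i in range(len(data)):
        ser.write(data[i:i + 1])  # one byte at a time
        time.sleep(delay)         # give the FIFO time to drain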
> Have you looked at the code? That's not what is happening. We're
> waiting for an ack after every block we write to the device. The
> inter-character delays being introduced basically mean we never fill
> the tiny buffer in the UART.
Just tossing out potentially helpful ideas. Are you sure the failure happens while CHIRP is waiting for the "ack" from the radio? I didn't see anyone report that specifically. Does the radio actually send the ack?
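For readers following along, the write-block/wait-for-ack pattern being described might look something like this. A minimal pyserial sketch, not CHIRP's actual code; the block size, ack byte, and error handling are all assumed:

import serial  # pyserial

ACK = b"\x06"      # assumed ack byte
BLOCK_SIZE = 16    # assumed block size

def upload(ser: serial.Serial, data: bytes):
    for offset in range(0, len(data), BLOCK_SIZE):
        ser.write(data[offset:offset + BLOCK_SIZE])
        resp = ser.read(1)  # waits up to ser.timeout for the radio's ack
        if resp != ACK:
            raise IOError("no ack for block at offset %d" % offset)

If the dongle is holding the radio's ack in its RX buffer until its latency timer fires, ser.read(1) is exactly where a too-short timeout would give up.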
Here's a situation I experienced: I used a USB serial dongle to communicate between a laptop and a car ECU. A Windows application would send a request character to the ECU, then wait for a response. It did this as fast as possible over and over to constantly receive telemetry data.
A lot of USB serial dongles would break this communication because they wouldn't forward the ECU's response quickly enough, and the application would give up. The application thought the connection was broken when in fact the ECU had responded; the dongle received the character and buffered it, but then didn't send the response to the host in time. It was simply waiting for more characters because it wanted a fuller buffer before sending anything to the host.
There are two ways to fix this problem: wait longer in the application for the response, or allow the USB dongle to send its RX data sooner (by shortening its latency timer).
HW UARTs don't suffer this problem because they don't do the same greedy buffering. Like you said, they're simpler: the driver <-> HW communication is much more efficient, so aggressive buffering isn't needed.
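Here's a rough pyserial sketch of that scenario and the first fix; the request byte, baud rate, and timeout values are all hypothetical:

import serial  # pyserial

REQUEST = b"\x01"  # hypothetical request byte

# A timeout shorter than the dongle's latency timer can misdiagnose a
# buffered-but-delayed response as a dead connection.
ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=0.005)  # 5 ms: too short
# ser.timeout = 0.050  # fix 1: wait longer than the dongle's latency timer

while True:
    ser.write(REQUEST)
    resp = ser.read(1)
    if not resp:
        print("gave up -- but the reply may just be sitting in the dongle")
        break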
> I'm not really sure what the "latency" setting is affecting, and I'm
> not aware of any such equivalent on non-Windows platforms.
The latency setting controls how long the driver waits for the buffer to fill before sending it to the dongle. The latency timer starts when the first character arrives at the driver from an application. When the timer expires, the buffer is sent no matter how full it is; if the buffer fills before the timer expires, it is sent immediately. The same scheme is used for dongle -> host buffering.
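To put numbers on it: FTDI's documented default latency timer is 16 ms, so a protocol that waits for a 1-byte ack after every block can pay up to that full 16 ms per block. A back-of-the-envelope sketch, where the block count is hypothetical:

# Worst-case added delay for an ack-per-block protocol when each 1-byte
# ack sits in the dongle until the latency timer expires.
LATENCY_MS = 16      # FTDI's documented default
NUM_BLOCKS = 1000    # hypothetical number of blocks in a radio image

print("up to %.1f s of added delay" % (NUM_BLOCKS * LATENCY_MS / 1000.0))
# -> up to 16.0 s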
Looks like you can set latency_timer via sysfs on Linux for an FTDI chipset. I doubt this is consistent across different dongles. See:
https://askubuntu.com/questions/696593/reduce-request-latency-on-an-ftdi-ubs...
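On a Linux box with an FTDI adapter, that sysfs knob can be written directly; the ttyUSB0 name below is an assumption, so substitute your device. Newer pyserial versions also expose set_low_latency_mode() on POSIX, though driver support varies:

# Lower the FTDI latency timer via sysfs (needs write permission).
with open("/sys/bus/usb-serial/devices/ttyUSB0/latency_timer", "w") as f:
    f.write("1")  # milliseconds; the FTDI default is 16

# Or, with newer pyserial on POSIX:
import serial
ser = serial.Serial("/dev/ttyUSB0")
ser.set_low_latency_mode(True)  # sets ASYNC_LOW_LATENCY; support varies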
> I also think that the non-Windows platforms seem to be unproblematic,
> IIRC.
Pavel Milanes said this issue happens on both Linux and Windows in his post at 2:00 PM MDT.
Maybe the best solution is to increase the serial port timeout value in CHIRP. I can't think of a downside other than sluggishness when trying to open invalid ports.
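Assuming CHIRP opens ports through pyserial, that's a small change wherever the port is constructed; this is an illustrative sketch, not CHIRP's actual code:

import serial  # pyserial

# A more forgiving read timeout covers a dongle that sits on the radio's
# ack for its full latency timer; the 2 s value is an assumption.
ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=2.0)

The only cost is the one noted above: reads against a dead or wrong port take the full timeout before failing.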
-Nathan