[chirp_devel] Serial port parameters / initialization
Hi, I need to update the TG-UV2+ driver to use 2 stop bits on the UART. I've added the following line in the do_ident(radio) function:
radio.pipe.stopbits = serial.STOPBITS_TWO (after the radio.pipe.timeout setting and before the first write to the radio)
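For reference, here is a minimal sketch of where that line sits. This is not the actual driver code: the FakePipe class is a stand-in for the pyserial port CHIRP hands to a driver, the timeout value and the ident byte are hypothetical placeholders, and the STOPBITS_* constants mirror pyserial's values.

```python
# Illustration only: a fake "pipe" standing in for the pyserial Serial
# object, showing where the stopbits override goes in a do_ident()-style
# helper (after the timeout, before the first write).
STOPBITS_ONE = 1   # pyserial's serial.STOPBITS_ONE
STOPBITS_TWO = 2   # pyserial's serial.STOPBITS_TWO


class FakePipe:
    """Minimal stand-in for a pyserial Serial object."""

    def __init__(self):
        self.timeout = None
        self.stopbits = STOPBITS_ONE  # pyserial's default: one stop bit
        self.written = b""

    def write(self, data):
        self.written += data


def do_ident(pipe):
    # Existing driver code sets the timeout first...
    pipe.timeout = 1
    # ...then the fix: switch to two stop bits before the first write,
    # matching the original C application's 8N2 configuration.
    pipe.stopbits = STOPBITS_TWO
    # Hypothetical ident byte, not the radio's real magic string.
    pipe.write(b"\x02")
```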
The fix works. However, before submitting the patch, I wanted to know if there is a need to reset this back to "normal" (1 stop bit) somewhere. Or, is the serial port re-initialized every time a new download starts (e.g. if a user would download from one radio type and then from another)?
(I tested this with the 2 radio types I have, first downloading from the TG-UV2+ with 2 stop bits and then from a UV-5R (which I imagine uses 1 stop bit), and it worked fine, but that could be just luck....)
Thanks! Ran
Before submitting the patch, I wanted to know if there is a need to clean this back to "normal" (1 stop bit) somewhere? Or, is the serial port re-initialized every time a new download starts (e.g. if a user would download from one radio type and then from another)?
Nope, it's initiated fresh every time so you don't need to reset it.
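To illustrate the point (this is a toy sketch, not CHIRP's actual internals, and all names in it are made up): if every download constructs a brand-new port object, any per-driver overrides die with it, so no reset is needed.

```python
# Toy illustration (not CHIRP internals): each download gets a fresh
# port object, so settings from a previous driver cannot leak into the
# next one.
STOPBITS_ONE = 1
STOPBITS_TWO = 2


class FreshPipe:
    """Stand-in for a newly opened serial port with default settings."""

    def __init__(self):
        self.stopbits = STOPBITS_ONE  # default: one stop bit


def start_download(driver_tweak=None):
    pipe = FreshPipe()       # re-initialized for every download
    if driver_tweak:
        driver_tweak(pipe)   # e.g. a TG-UV2+-style driver sets 8N2
    return pipe


# First download: the driver switches to two stop bits...
first = start_download(lambda p: setattr(p, "stopbits", STOPBITS_TWO))
# ...second download (e.g. a different radio) still sees the default.
second = start_download()
```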
Just to be clear, can you explain why you think this is necessary? I mean other than that it works with the change, presumably the original author didn't need that to be in place... Do you observe the factory software setting those parameters? Have you read and written the radio image to make sure it's not getting mangled?
--n
Hi Dan
I think this is necessary for 2 reasons:
1. The original C application which I based the driver on does use this setting. I did the CHIRP driver development on a Mac and missed this setting, probably because I didn't see any issue with the default setting for some reason.
2. A couple of weeks ago, a user reported a bug where he was getting bad responses from the radio. He was using Windows. After looking into it and trying a Windows machine myself, I got similar errors / "garbage" bytes back from the radio. Digging a bit deeper and going back to the original C app mentioned above, I found the stop bits setting to be the cause of the problem.
After the fix, reads and writes work on both my Mac and Windows machines, as well as for the user who reported the bug (on a Windows machine).
I still do not understand why/how the Mac is agnostic to this setting, and whether it is a pyserial Mac vs. Windows implementation difference or something in the "COM port" OS framework.
Ran
- The original C application which I based the driver on does use this setting.
Ack, this is a good sign :)
I did the CHIRP driver development on a Mac, and missed this setting, probably due to the fact that I didn’t see any issue with the default setting for some reason.
Sorry, I forgot this was yours. I'm kinda suspect of anything modern that isn't 8N1, but I know there are some.
- A couple of weeks ago, a user reported a bug where he was getting bad responses from the radio. He was using Windows.
After looking into it and trying a Windows machine myself, I got similar errors / "garbage" bytes back from the radio. Digging a bit deeper and going back to the original C app mentioned above, I found the stop bits setting to be the cause of the problem.
After the fix, reads and writes work both on my Mac and windows machines as well as for the user who reported the bug (on a windows machine).
I still do not understand why/how the Mac is agnostic to this setting, and whether it is a pyserial Mac vs. Windows implementation difference or something in the "COM port" OS framework.
Yeah, definitely confusing. Does this radio use a cable with the USB chip in it, or is the USB-to-serial device inside the radio itself? If the latter, then it's possible that the radio defaults to the right settings and the macOS drivers aren't actually overriding them, but the Windows ones are. That's less likely if it's a generic cable, of course.
Anyway, as long as it seems consistent with the OEM software, then that's fine. I just want to make sure.
Thanks!
--Dan
participants (2)
- Dan Smith
- Ran Katz