Hello, I've another serial port challenge, if anyones got any tips or ideas.
I'm writing an application that updates the firmware on an embedded microcontroller, and so I'm working to a protocol specification set by the device developer.
In the specification, I have to transmit bytes to the micro with a nominal interbyte delay of 5 milliseconds; the lower limit is 3 milliseconds and the upper limit is 20 milliseconds.
If I send bytes outside these limits, the whole command being transmitted is discarded (even if some of its bytes are within the limits) and I receive no response.
So when transmitting a command, I store the data in a byte array called TXBuff(), and when it comes time to transmit I use a delay routine like this:
VB.NET:
For i = 0 To messagelength - 1
    Delay(5)
    SerialPort.Write(TXBuff, i, 1)
Next i
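For what it's worth, one thing I've been wondering about: the loop above waits a fixed 5 ms before each byte, so any time spent inside SerialPort.Write is added on top of the delay. A variant that schedules each byte against an absolute deadline instead (just a sketch, reusing the same TXBuff and messagelength names) would absorb small hiccups rather than accumulate them:

VB.NET:
'Sketch: pace bytes against an absolute schedule rather than a fixed
'per-byte delay, so time spent inside Write() is not added on top.
Dim sched As Stopwatch = Stopwatch.StartNew()
For i = 0 To messagelength - 1
    'Busy-wait until this byte's 5 ms slot arrives
    Do While sched.ElapsedMilliseconds < CLng(i) * 5
        'spin
    Loop
    SerialPort.Write(TXBuff, i, 1)
Next i

This still can't stop the OS pre-empting the thread mid-transfer, but at least a late byte doesn't push every following byte later as well.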
My delay routine looks like this:
VB.NET:
Public Sub Delay(ByVal DelayMilliseconds As Integer)
    'Busy-wait on a high-resolution Stopwatch, pumping the
    'message queue while waiting
    Dim delaywatch As New Stopwatch
    delaywatch.Start()
    Do Until delaywatch.ElapsedMilliseconds >= DelayMilliseconds
        Application.DoEvents()
    Loop
    delaywatch.Stop()
End Sub
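One suspicion I have is that Application.DoEvents() itself is part of the problem: it pumps the message queue, so any UI work (like handling that screenshot keypress) can run inside the delay and blow the budget. A tighter variant I've been considering busy-waits on Stopwatch ticks instead (a sketch only, not proven on the real hardware; needs Imports System.Threading):

VB.NET:
Public Sub DelayTicks(ByVal DelayMilliseconds As Integer)
    'Busy-wait on raw Stopwatch ticks without pumping messages;
    'the GUI freezes, but UI work can no longer run inside the delay.
    Dim target As Long = Stopwatch.Frequency * DelayMilliseconds \ 1000
    Dim delaywatch As Stopwatch = Stopwatch.StartNew()
    Do While delaywatch.ElapsedTicks < target
        Thread.SpinWait(10) 'stay hot without yielding the timeslice
    Loop
End Sub

Raising the thread priority for the duration of the transfer might help too, though none of this stops the scheduler pre-empting the thread entirely.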
It's not perfect, but it's actually the most accurate delay routine I've been able to dig up.
The sequence of commands used to write the firmware to the micro all runs on the GUI thread (I don't use a BackgroundWorker or anything). I don't mind if the GUI locks up; in fact that would be close to ideal, as I'd rather users couldn't use the computer for other tasks while the process runs, since it's quite critical.
Anyway, it all works rather well 99% of the time, but with one major flaw: I miss messages now and then. After a lot of hair-tearing and debugging I've realised there's no actual data loss on transmit or receive; the computer simply has a spike of activity large enough to push the delay between transmitted bytes over the 20 ms threshold, and the microcontroller doesn't respond.
I can replicate the issue by pressing the screenshot button during communication; that causes enough of a 'hiccup' to throw the message out.
Now I'm going to add code to detect messages that are not acknowledged and retransmit if necessary, but I'd rather transmit with more accurate timing in the first place (during the actual writing process data is streamed, so I can't always tell when the error occurs). Reducing the interbyte delay to the minimum accepted has helped, but not solved the problem.
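The retransmit logic I have in mind is roughly this (SendCommand and WaitForAck are placeholder names for routines I'd still have to write, and the retry count and timeout are guesses):

VB.NET:
'Sketch: retry a command a few times if no acknowledgement arrives.
'SendCommand and WaitForAck are placeholders, not existing routines.
Const MaxRetries As Integer = 3
Dim acked As Boolean = False
For attempt As Integer = 1 To MaxRetries
    SendCommand(TXBuff, messagelength)
    If WaitForAck(100) Then 'timeout in milliseconds
        acked = True
        Exit For
    End If
Next attempt
If Not acked Then
    Throw New TimeoutException("No response after " & MaxRetries & " attempts")
End If

This only works for the command/response phase, though; it doesn't cover the streamed writing phase where I can't see individual acknowledgements.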
So, the bottom line is: can anyone think of a creative way to guarantee bytes will be sent not less than 3 ms apart and not more than 20 ms apart?
I've a feeling VB.NET might be a poor choice of language for this.
thanks
Kris