Calibration of the antenna delay

Hello! I apologize in advance for my English. I'm trying to build a reference device for calibrating the antenna delay, following the instructions in APS014. I can measure the difference between the known distance and the measured one, but I don't understand how to convert that difference from nanoseconds into the value used in the program code. Could you please tell me the formula?
I would also appreciate an explanation of the range offset: what it is and how to calculate it.
Thank you for your attention!!!

Hey there. In the code examples you can see that the antenna delay value ranges from 1 to 65535, and one unit (one UWB device time unit) equals about 15.65 picoseconds. So you measure the real distance, take a lot of measurements at that position (as you did), and work out how much you have to correct the antenna delay value. Don't forget that there are separate RX and TX delays. In my case I used a separate correction for each tag-to-anchor link, because I didn't have enough equipment to do a proper calibration of each individual device.
Hope I could help.


Thanks! Now I just need to figure out the range offset =)