Are you sure your inputs to that function are correct?
What the function does is calculate the time between the tag sending a message and receiving a reply (round1), and the time the gateway took to process the tag's message and send its reply (reply1).
In an ideal world, (round1 - reply1) would give you the time the signal took to travel there and back. If you then divided by 2 to get the one-way flight time, multiplied by the clock period to get time in seconds, and multiplied by the speed of light, you would get distance in meters.
Unfortunately the world isn't ideal. The problem with this method is that the tag's clock and the gateway's clock will be running at very slightly different speeds. If the clocks were 1 part per million apart (a reasonable amount to assume they are wrong by), that would be 1 ns of error for every ms of elapsed time. It takes at least 1 ms to exchange 4 ranging packets, and that is pushing it. Light travels 30 cm in 1 ns, which means using this simple approach you would be getting range errors of 30 cm purely due to clock issues.
The way around this is to use what is called Two Way Ranging (TWR). You measure the range in both directions and average the results.
This implies that you should be doing:
dist = (tagToGwRange + gwToTagRange) / 2
dist = ((round1 - reply1) / 2 + (round2 - reply2) / 2) / 2
The line used in that code,
dist = (round1 * round2 - reply1 * reply2) / (round1 + reply1 + round2 + reply2)
comes from an analysis of the error sources and how to minimise the impact of clock errors on the result, but in effect it gives the same answer.
About the only change I can think of for your code would be to add some extra brackets, since the order of operations isn't as clear as it could be:
dist = ((round1 * round2 - reply1 * reply2) / (round1 + reply1 + round2 + reply2)) * DW1000_TIME_UNITS * SPEED_OF_LIGHT