Calculating Coordinates Given A Bearing And A Distance


Answer:

It seems like these are the issues in your code:

  1. You need to convert lat1 and lon1 to radians before calling your function.
  2. You may be scaling radialDistance incorrectly.
  3. Testing floating-point numbers for exact equality is dangerous. Two numbers that are equal after exact arithmetic might not be exactly equal after floating-point arithmetic. Thus abs(x - y) < threshold is safer than x == y for comparing two floating-point numbers x and y (a small sketch follows this list).
  4. I think you want to convert lat and lon from radians back to degrees.
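
To illustrate point 3, here is a minimal sketch; the epsilon value and the cos(pi/2) example are just arbitrary choices for this illustration:

    from math import cos, pi

    epsilon = 0.000001  # arbitrary threshold chosen for this example

    a = cos(pi / 2)     # mathematically 0, but evaluates to ~6.12e-17
    print(a == 0)                  # False: exact comparison fails
    print(abs(a - 0) < epsilon)    # True: threshold comparison succeeds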

Here is my implementation of your code in Python:

#!/usr/bin/env python

from math import asin, cos, pi, sin

rEarth = 6371.01    # Earth's average radius in km
epsilon = 0.000001  # threshold for floating-point equality


def deg2rad(angle):
    return angle * pi / 180


def rad2deg(angle):
    return angle * 180 / pi


def pointRadialDistance(lat1, lon1, bearing, distance):
    """
    Return final coordinates (lat2, lon2) [in degrees] given initial coordinates
    (lat1, lon1) [in degrees], a bearing [in degrees] and a distance [in km]
    """
    rlat1 = deg2rad(lat1)
    rlon1 = deg2rad(lon1)
    rbearing = deg2rad(bearing)
    rdistance = distance / rEarth  # normalize linear distance to radian angle

    rlat = asin(sin(rlat1) * cos(rdistance) + cos(rlat1) * sin(rdistance) * cos(rbearing))

    if cos(rlat) == 0 or abs(cos(rlat)) < epsilon:  # endpoint is a pole
        rlon = rlon1
    else:
        rlon = ((rlon1 - asin(sin(rbearing) * sin(rdistance) / cos(rlat)) + pi) % (2 * pi)) - pi

    lat = rad2deg(rlat)
    lon = rad2deg(rlon)
    return (lat, lon)


def main():
    print("lat1 \t lon1 \t\t bear \t dist \t\t lat2 \t\t lon2")
    testcases = []
    testcases.append((0, 0, 0, 1))
    testcases.append((0, 0, 90, 1))
    testcases.append((0, 0, 0, 100))
    testcases.append((0, 0, 90, 100))
    testcases.append((49.25705, -123.140259, 225, 1))
    testcases.append((49.25705, -123.140259, 225, 100))
    testcases.append((49.25705, -123.140259, 225, 1000))
    for lat1, lon1, bear, dist in testcases:
        (lat, lon) = pointRadialDistance(lat1, lon1, bear, dist)
        print("%6.2f \t %6.2f \t %4.1f \t %6.1f \t %6.2f \t %6.2f" % (lat1, lon1, bear, dist, lat, lon))


if __name__ == "__main__":
    main()

Here is the output:

lat1     lon1        bear    dist        lat2        lon2
  0.00     0.00       0.0       1.0        0.01        0.00
  0.00     0.00      90.0       1.0        0.00       -0.01
  0.00     0.00       0.0     100.0        0.90        0.00
  0.00     0.00      90.0     100.0        0.00       -0.90
 49.26   -123.14     225.0      1.0       49.25      -123.13
 49.26   -123.14     225.0    100.0       48.62      -122.18
 49.26   -123.14     225.0   1000.0       42.55      -114.51
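
As a quick sanity check on the magnitudes: one degree of arc on this sphere is rEarth * pi / 180 ≈ 111.2 km, so 100 km should correspond to roughly 100 / 111.2 ≈ 0.90°, which matches the 100 km rows above.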

I think there is a problem with the algorithm provided in message 5.

It works, but only for the latitude; for the longitude there is a problem with the sign.

The data speak for themselves:

49.26 -123.14 225.0 1.0 49.25 -123.13

If you start from -123.14° and head WEST, you should end up somewhere farther WEST. Here we go back EAST (-123.13)!

The formula should include somewhere:

degreeBearing = ((360-degreeBearing)%360)

before the conversion to radians.
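
If I read that suggestion correctly, a minimal sketch of the adjustment is a thin wrapper around the pointRadialDistance function above (the wrapper name is mine, not from the original post):

    def pointRadialDistanceFixed(lat1, lon1, bearing, distance):
        # Mirror the bearing before it is converted to radians, as suggested
        # above, so the longitude moves in the expected direction.
        return pointRadialDistance(lat1, lon1, (360 - bearing) % 360, distance)

    # With this adjustment the 1 km test case heads south-WEST as expected:
    # pointRadialDistanceFixed(49.25705, -123.140259, 225, 1)
    # -> approximately (49.25, -123.15) instead of (49.25, -123.13)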


Fundamentally, it appears that your problem is that you are passing latitude, longitude and bearing as degrees rather than radians. Try ensuring that you are always passing radians to your function and see what you get back.
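
For example, assuming the trigonometric formula itself expects radians, a minimal sketch of that conversion using the standard math module would be:

    from math import radians, degrees

    lat1_deg, lon1_deg, bearing_deg = 49.25705, -123.140259, 225

    # Convert everything to radians before doing any trigonometry ...
    lat1, lon1, bearing = radians(lat1_deg), radians(lon1_deg), radians(bearing_deg)

    # ... and convert any angular results back to degrees only at the very end.
    print(degrees(lat1))   # 49.25705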

PS: see similar issues discussed here and here.

