Let’s assume you like cruise ships, tankers, or ferries, or you’re fortunate enough to own a fleet of vessels cruising the oceans. But where the heck are the ones you’re interested in? You can visit MarineTraffic and search for the vessels you care about, but what if you want to keep track of those vessels, or put them on your “own” map? This is where Python comes in handy, and I’ll show you how to gather coordinates and put them on a map using the ArcGIS API for Python.
The “Platform” Used
For this task I am using a Jupyter Notebook together with the ArcGIS API for Python, but the main steps can be done without the Esri part. We will grab all the data from a website called vesselfinder.com.
The Process of Getting Vessel Data
First of all, we will need some modules for getting the web data and parsing it. Furthermore, we will need some pandas magic to prepare the data for the map:
#module import
import urllib.request
from bs4 import BeautifulSoup
import re
import pandas as pd
from datetime import datetime
import locale
locale.setlocale(locale.LC_ALL, 'en_US') # we need English month names when parsing the report dates later on
import os
As we have all dependencies now, we need some input for our workflow. Therefore I’ve prepared a list of so-called IMO numbers of the ships I would like to track. If you don’t know the IMO number of the ship of your choice… balticshipping might help.
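A minimal sketch of such a list could look like this (the IMO numbers below are just placeholders, put in the ships you actually want to track):
#list of IMO numbers we want to track (placeholder values)
IMOS = [9226188, 9397559]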
Now that everything is set up, we will make web calls to the vesselfinder website and grab the details from each page:
hdr = {'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Charset': 'ISO-8859-1,utf-8;q=0.7,*;q=0.3',
'Accept-Encoding': 'none',
'Accept-Language': 'en-US,en;q=0.8',
'Connection': 'keep-alive'}
items = []

def dms2dd(degrees, direction):
    # turn a degree value plus a compass direction into signed decimal degrees
    dd = float(degrees)
    if direction == 'S' or direction == 'W':
        dd *= -1
    return dd

def parse_dms(dms):
    # split a string like "53.54633 N" into value and direction
    parts = re.split(' ', dms)
    return dms2dd(parts[0], parts[1])

for IMO in IMOS:
    # the vessel name part of the URL is just taken from one example ship; vesselfinder finds the vessel by its IMO number
    url = r'https://www.vesselfinder.com/en/vessels/VOS-TRAVELLER-IMO-' + str(IMO)
    req = urllib.request.Request(url, None, hdr)
    with urllib.request.urlopen(req) as response:
        the_page = response.read()
    parsed_html = BeautifulSoup(the_page, 'html.parser')
    tables = parsed_html.findAll("table")
    for table in tables:
        if table.findParent("table") is None:
            for row in table.findAll('tr'):
                aux = row.findAll('td')
                try:
                    if aux[0].string == "Coordinates":
                        coords = aux[1].string
                    if aux[0].string == "Vessel Name":
                        name = aux[1].string
                    if aux[0].string == "Last report":
                        zeit = datetime.strptime(aux[1].contents[1], ' %b %d, %Y %I:%M %Z')
                except (IndexError, ValueError):
                    print("strange table found")
    coordsSplit = coords.split("/")
    lat = parse_dms(coordsSplit[0])
    lng = parse_dms(coordsSplit[1])
    items.append((lat, lng, name, zeit))
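To illustrate what the two helper functions do: vesselfinder reports the position as a string of the form "value direction/value direction" (the numbers below are made up), which gets split at the slash and converted into signed decimal degrees, with southern and western positions becoming negative:
#hypothetical position string as it would be scraped from the page
sample = "53.54633 N/9.96472 E"
lat_part, lng_part = sample.split("/")
print(parse_dms(lat_part), parse_dms(lng_part)) #53.54633 9.96472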
To keep track of our vessels, we will store the found coordinates in a simple text file:
filename = 'ship_positions.txt'
if os.path.exists(filename):
    append_write = 'a' # append if already exists
    fw = open(filename, append_write)
else:
    append_write = 'w' # make a new file if not
    fw = open(filename, append_write)
    fw.write("lat;lng;name;time\n")
for item in items:
    fw.write("%3.5f;%3.5f;%s;%s\n" % item)
fw.close()
This will create a text file with timestamps of the ships and their associated coordinates.
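If you run the notebook repeatedly, this file accumulates a little position history. Reading it back in later is straightforward (a small sketch, assuming the semicolon-separated layout written above):
#read the accumulated positions back into a dataframe
history = pd.read_csv(filename, sep=';')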
Putting it on a Map
As we have the coordinates in a list of tuples, we can create a dataframe from those items and put them on a map:
#get it on a map:
from arcgis.gis import GIS
gis = GIS() # anonymous connection to ArcGIS Online
map = gis.map()
df = pd.DataFrame.from_records(items)
df.columns = ['y', 'x', 'name', 'zeit'] # rename so import_data finds the coordinate columns
ships = gis.content.import_data(df)
map.add_layer(ships)
map.center = [lat, lng] # center on the last vessel processed above
map
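If you would rather see the accumulated track from ship_positions.txt instead of only the latest positions, the same import_data pattern can be reused (a sketch building on the history dataframe read back above; the column renaming mirrors the step before):
#optional: add the full position history as an additional layer
track_df = history.rename(columns={'lat': 'y', 'lng': 'x'})
track_layer = gis.content.import_data(track_df)
map.add_layer(track_layer)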