Note: If you are starting from scratch, download the ".db" data file from our AISdb Tutorial GitHub repository so that you can follow this guide properly.
Python Environment and Installation
To work with the AISdb Python package, please ensure you have Python version 3.8 or higher. If you plan to use SQLite, no additional installation is required, as SQLite is included with Python by default. However, if you prefer a PostgreSQL server, you must install it separately and enable the TimescaleDB extension for AISdb to function correctly.
User Installation
The AISdb Python package can be conveniently installed using pip. It's highly recommended that a virtual Python environment be created and the package installed within it.
Linux
python -m venv AISdb # create a python virtual environment
source ./AISdb/bin/activate # activate the virtual environment
pip install aisdb # from https://pypi.org/project/aisdb/
The Python code in the rest of this document can be run in the Python environment you created.
Development Installation
To use nightly builds (optional), you can install AISdb from source:
source AISdb/bin/activate # On Windows use `AISdb\Scripts\activate`
# Cloning the Repository and installing the package
git clone https://github.com/AISViz/AISdb.git && cd aisdb
# Windows users can instead download the installer:
# - https://forge.rust-lang.org/infra/other-installation-methods.html#rustup
# - https://static.rust-lang.org/rustup/dist/i686-pc-windows-gnu/rustup-init.exe
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs > install-rust.sh
# Installing Rust and Maturin
/bin/bash install-rust.sh -q -y
pip install --upgrade maturin[patchelf]
# Building AISdb package with Maturin
maturin develop --release --extras=test,docs
Alternatively, you can use nightly builds (optional) on Google Colab as follows:
import os
# Clone the AISdb repository from GitHub
!git clone https://github.com/AISViz/AISdb.git
# Install Rust using the official Rustup script
!curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
# Install Maturin to build the packages
!pip install --upgrade maturin[patchelf]
# Set up environment variables
os.environ["PATH"] += os.pathsep + "/root/.cargo/bin"
# Install wasm-pack for building WebAssembly packages
!curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh
# Install wasm-pack as a Cargo dependency
!cargo install wasm-pack
# Setting environment variable for the virtual environment
os.environ["VIRTUAL_ENV"] = "/usr/local"
# Change directory to AISdb for building the package
%cd AISdb
# Build and install the AISdb package using Maturin
!maturin develop --release --extras=test,docs
Database Handling
AISdb supports SQLite and PostgreSQL databases. Since version 1.7.3, AISdb requires TimescaleDB over PostgreSQL to function properly. To install TimescaleDB, follow these steps:
Install TimescaleDB (PostgreSQL Extension)
$ sudo apt install -y timescaledb-postgresql-XX # XX is the PostgreSQL version
Enable the Extension in PostgreSQL
> CREATE EXTENSION IF NOT EXISTS timescaledb;
Verify the Installation
> SELECT * FROM timescaledb_information.version;
Restart PostgreSQL
$ sudo systemctl restart postgresql
Connecting to a PostgreSQL database
This option requires the optional dependency psycopg for interfacing with Postgres databases. Note that the keyword arguments below are passed through to Postgres. Alternatively, a connection string may be used; see the Postgres documentation for details on connection strings and the URI format.
from aisdb.database.dbconn import PostgresDBConn
# [OPTION 1]
dbconn = PostgresDBConn(
hostaddr='127.0.0.1', # Replace this with the Postgres address (supports IPv6)
port=5432, # Replace this with the Postgres running port (if not the default)
user='USERNAME', # Replace this with the Postgres username
password='PASSWORD', # Replace this with your password
dbname='DATABASE', # Replace this with your database name
)
# [OPTION 2]
dbconn = PostgresDBConn('postgresql://USERNAME:PASSWORD@HOST:PORT/DATABASE')
Attaching a SQLite database to AISdb
Querying SQLite is as easy as providing the name of a ".db" file with the same entity-relationship model as the databases supported by AISdb, as detailed in the SQL Database section. We prepared an example SQLite database, example_data.db, based on Marine Cadastre AIS data from a small region near Maine, United States, in January 2022; it is available in the AISdb Tutorial GitHub repository.
from aisdb.database.dbconn import SQLiteDBConn
dbpath='example_data.db'
dbconn = SQLiteDBConn(dbpath=dbpath)
If you want to create your database using your data, we have a tutorial with examples that show you how to create an SQLite database from open-source data.
The following query will return vessel trajectories from a given 1-hour time window:
import aisdb
import pandas as pd
from datetime import datetime
from collections import defaultdict
dbpath = 'example_data.db'
start_time = datetime.strptime("2022-01-01 00:00:00", '%Y-%m-%d %H:%M:%S')
end_time = datetime.strptime("2022-01-01 00:59:59", '%Y-%m-%d %H:%M:%S')
def data2frame(tracks):
# Dictionary mapping each vessel MMSI to a DataFrame of derived track features
ais_data = defaultdict(lambda: pd.DataFrame(
columns = ['time', 'lat', 'lon', 'cog', 'rocog', 'sog', 'delta_sog']))
for track in tracks:
mmsi = track['mmsi']
df = pd.DataFrame({
'time': pd.to_datetime(track['time'], unit='s'),
'lat': track['lat'], 'lon': track['lon'],
'cog': track['cog'], 'sog': track['sog']
})
# Sort by time in ascending order so that the diffs below are positive
df = df.sort_values(by='time').reset_index(drop=True)
# Compute the time difference in seconds
df['time_diff'] = df['time'].diff().dt.total_seconds()
# Compute RoCOG (Rate of Change of Course Over Ground)
delta_cog = (df['cog'].diff() + 180) % 360 - 180
df['rocog'] = delta_cog / df['time_diff']
# Compute Delta SOG (Rate of Change of Speed Over Ground)
df['delta_sog'] = df['sog'].diff() / df['time_diff']
# Fill NaN values (first row) and infinite values (division by zero cases)
df[['rocog', 'delta_sog']] = df[['rocog', 'delta_sog']].replace([float('inf'), float('-inf')], 0).fillna(0)
# Drop unnecessary column
df.drop(columns = ['time_diff'], inplace=True)
# Store in the dictionary
ais_data[mmsi] = df
return ais_data
with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
qry = aisdb.DBQuery(
dbconn=dbconn, start=start_time, end=end_time,
callback=aisdb.database.sqlfcn_callbacks.in_timerange_validmmsi,
)
rowgen = qry.gen_qry()
tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
ais_data = data2frame(tracks) # re-use previous function
# Display DataFrames
for key in ais_data.keys():
print(ais_data[key])
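The modular-arithmetic expression used above for RoCOG maps raw course differences into [-180, 180), so a heading change from 350° to 10° counts as +20° rather than -340°. A quick standalone check:

```python
def wrapped_course_delta(prev_cog, cog):
    # Map the raw course difference into the range [-180, 180)
    return (cog - prev_cog + 180) % 360 - 180

print(wrapped_course_delta(350, 10))  # crossing north: +20, not -340
print(wrapped_course_delta(10, 350))  # -20, not +340
```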
A specific region can be queried for AIS data using aisdb.gis.Domain or one of its sub-classes to define a collection of shapely polygon features. For this example, the domain contains a single bounding box polygon derived from a longitude/latitude coordinate pair and radial distance specified in meters. If multiple features are included in the domain object, the domain boundaries will encompass the convex hull of all features.
# a circle with a 100km radius around the location point
domain = aisdb.DomainFromPoints(points=[(-69.34, 41.55)], radial_distances=[100000])
with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
qry = aisdb.DBQuery(
dbconn=dbconn, start=start_time, end=end_time,
xmin=domain.boundary['xmin'], xmax=domain.boundary['xmax'],
ymin=domain.boundary['ymin'], ymax=domain.boundary['ymax'],
callback=aisdb.database.sqlfcn_callbacks.in_validmmsi_bbox,
)
rowgen = qry.gen_qry()
tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
ais_data = data2frame(tracks) # re-use previous function
# Display DataFrames
for key in ais_data.keys():
print(ais_data[key])
The above generator can be input into a processing function, yielding modified results. For example, to model the activity of vessels on a per-voyage or per-transit basis, each voyage is defined as a continuous vector of positions where the time between observed timestamps never exceeds 24 hours.
from datetime import timedelta
# Define a maximum time interval
maxdelta = timedelta(hours=24)
with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
qry = aisdb.DBQuery(
dbconn=dbconn, start=start_time, end=end_time,
xmin=domain.boundary['xmin'], xmax=domain.boundary['xmax'],
ymin=domain.boundary['ymin'], ymax=domain.boundary['ymax'],
callback=aisdb.database.sqlfcn_callbacks.in_validmmsi_bbox,
)
rowgen = qry.gen_qry()
tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
# Split the generated tracks into segments
track_segments = aisdb.split_timedelta(tracks, maxdelta)
ais_data = data2frame(track_segments) # re-use previous function
# Display DataFrames
for key in ais_data.keys():
print(ais_data[key])
Data cleaning and MMSI deduplication
A common problem with AIS data is noise, where multiple vessels might broadcast using the same identifier (sometimes simultaneously). In such cases, AISdb can denoise the data:
(1) Denoising with Encoder: The aisdb.denoising_encoder.encode_greatcircledistance() function checks the approximate distance between each vessel's positions. It separates vectors where a vessel couldn't reasonably travel using the most direct path, such as speeds over 50 knots.
(2) Distance and Speed Thresholds: A distance and speed threshold limits the maximum distance or time between messages that can be considered continuous.
(3) Scoring and Segment Concatenation: A score is computed for each position delta, with sequential messages nearby at shorter intervals given a higher score. This score is calculated by dividing the Haversine distance by elapsed time. Any deltas with a score not reaching the minimum threshold are considered the start of a new segment. New segments are compared to the end of existing segments with the same vessel identifier; if the score exceeds the minimum, they are concatenated. If multiple segments meet the minimum score, the new segment is concatenated to the existing segment with the highest score.
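The distance and speed checks in steps (1) and (2) can be illustrated with a minimal, self-contained sketch (hypothetical helper names, not AISdb's actual implementation):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance between two positions, in meters."""
    lon1, lat1, lon2, lat2 = map(radians, (lon1, lat1, lon2, lat2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def implied_speed_knots(p1, p2):
    """Speed a vessel would need to travel directly between two timestamped reports."""
    dist_m = haversine_m(p1['lon'], p1['lat'], p2['lon'], p2['lat'])
    dt_s = p2['time'] - p1['time']
    return (dist_m / dt_s) * 1.9438 if dt_s > 0 else float('inf')

# Two reports one hour apart but ~111 km distant imply roughly 60 knots,
# which exceeds a 50-knot threshold and would start a new segment
p1 = {'lon': -69.0, 'lat': 41.0, 'time': 0}
p2 = {'lon': -69.0, 'lat': 42.0, 'time': 3600}
print(implied_speed_knots(p1, p2) > 50)  # True
```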
Notice that processing functions may be executed in sequence as a chain or pipeline, so after segmenting the individual voyages as shown above, results can be input into the encoder to remove noise and correct for vessels with duplicate identifiers.
distance_threshold = 20000 # the maximum allowed distance (meters) between consecutive AIS messages
speed_threshold = 50 # the maximum allowed vessel speed (knots) between consecutive AIS messages
minscore = 1e-6 # the minimum score threshold for track segment validation
with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
qry = aisdb.DBQuery(
dbconn=dbconn, start=start_time, end=end_time,
callback=aisdb.database.sqlfcn_callbacks.in_timerange,
)
rowgen = qry.gen_qry()
tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
# Encode the track segments to clean and validate the track data
tracks_encoded = aisdb.encode_greatcircledistance(tracks,
distance_threshold=distance_threshold,
speed_threshold=speed_threshold,
minscore=minscore)
ais_data = data2frame(tracks_encoded) # re-use previous function
# Display DataFrames
for key in ais_data.keys():
print(ais_data[key])
Interpolating, geofencing, and filtering
Building on the above processing pipeline, the resulting cleaned trajectories can be geofenced and filtered for results contained by at least one domain polygon and interpolated for uniformity.
# Define a domain with a central point and corresponding radial distances
domain = aisdb.DomainFromPoints(points=[(-69.34, 41.55),], radial_distances=[100000,])
# Filter the encoded tracks to include only those within the specified domain
tracks_filtered = aisdb.track_gen.zone_mask(tracks_encoded, domain)
# Interpolate the filtered tracks with a specified time interval
tracks_interp = aisdb.interp_time(tracks_filtered, step=timedelta(minutes=15))
Additional processing functions can be found in the aisdb.track_gen module.
Exporting as CSV
The resulting processed voyage data can be exported in CSV format instead of being printed.
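For instance, the per-vessel DataFrames produced by the data2frame() helper above can be written out with pandas (a minimal sketch; the sample data and one-file-per-MMSI naming convention are our own):

```python
import pandas as pd

# Hypothetical stand-in for the ais_data dict returned by data2frame()
ais_data = {
    316001234: pd.DataFrame({
        'time': pd.to_datetime(['2022-01-01 00:00:00', '2022-01-01 00:10:00']),
        'lat': [41.55, 41.56],
        'lon': [-69.34, -69.33],
        'sog': [9.8, 10.1],
    }),
}

# Write one CSV file per vessel, named by its MMSI
for mmsi, df in ais_data.items():
    df.to_csv(f'{mmsi}.csv', index=False)
```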
AISdb supports integrating external data sources such as bathymetric charts and other raster grids.
Bathymetric charts
To determine the approximate ocean depth at each vessel position, the aisdb.webdata.bathymetry module can be used.
import aisdb
# Set the data storage directory
data_dir = './testdata/'
# Download bathymetry grid from the internet
bathy = aisdb.webdata.bathymetry.Gebco(data_dir=data_dir)
bathy.fetch_bathymetry_grid()
Once the data has been downloaded, the Gebco() class may be used to append bathymetric data to tracks in the context of a TrackGen() processing pipeline like the processing functions described above.
tracks = aisdb.TrackGen(qry.gen_qry(), decimate=False)
tracks_bathymetry = bathy.merge_tracks(tracks) # merge tracks with bathymetry data
Similarly, arbitrary raster coordinate-gridded data may be appended to vessel tracks:
tracks = aisdb.TrackGen(qry.gen_qry())
raster_path = './GMT_intermediate_coast_distance_01d.tif'
# Load the raster file
raster = aisdb.webdata.load_raster.RasterFile(raster_path)
# Merge the generated tracks with the raster data
tracks = raster.merge_tracks(tracks, new_track_key="coast_distance")
Visualization
AIS data from the database may be overlaid on a map, such as the one shown above, using the aisdb.web_interface.visualize() function. This function accepts a generator of track dictionaries, such as those produced by aisdb.track_gen.TrackGen().
from datetime import datetime, timedelta
import aisdb
from aisdb import DomainFromPoints
dbpath='example_data.db'
def color_tracks(tracks):
''' set the color of each vessel track using a color name or RGB value '''
for track in tracks:
track['color'] = 'blue' # or an RGB value such as 'rgb(0,0,255)'
yield track
# Set the start and end times for the query
start_time = datetime.strptime("2022-01-01 00:00:00", '%Y-%m-%d %H:%M:%S')
end_time = datetime.strptime("2022-01-31 00:00:00", '%Y-%m-%d %H:%M:%S')
with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
qry = aisdb.DBQuery(
dbconn=dbconn,
start=start_time,
end=end_time,
callback=aisdb.database.sqlfcn_callbacks.in_timerange_validmmsi,
)
rowgen = qry.gen_qry()
# Convert queried rows to vessel trajectories
tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
# Visualization
aisdb.web_interface.visualize(
tracks,
visualearth=False,
open_browser=True,
)