
Quick Start

A hands-on quick start guide for using AISdb.

If you are new to AIS topics, click here to learn about the "Automatic Identification System (AIS)".

Note: If you are starting from scratch, download the ".db" data file from our AISdb Tutorial GitHub repository so that you can follow this guide properly.

Python Environment and Installation

To work with the AISdb Python package, please ensure you have Python version 3.8 or higher. If you plan to use SQLite, no additional installation is required, as SQLite is included with Python by default. However, if you prefer a PostgreSQL server, you must install it separately and enable the TimescaleDB extension for AISdb to work correctly.
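Before installing, you can verify that your interpreter meets the minimum version requirement with a short check (a minimal sketch, not part of AISdb itself):

```python
import sys

# AISdb requires Python 3.8 or higher; fail early with a clear message.
if sys.version_info < (3, 8):
    raise RuntimeError(f"Python 3.8+ is required, found {sys.version.split()[0]}")
print("Python version OK:", sys.version.split()[0])
```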

User Installation

The AISdb Python package can be conveniently installed using pip. We highly recommend creating a virtual Python environment and installing the package inside it.

You can test your installation by running the following commands:

Note that if you are running Jupyter, ensure it is installed in the same environment as AISdb:

The Python code in the rest of this document can be run in the Python environment you created.

Development Installation

To use nightly builds (optional), you can install AISdb from source:

Alternatively, you can build from source on Google Colab as follows:

Database Handling

AISdb supports SQLite and PostgreSQL databases. Since version 1.7.3, AISdb requires TimescaleDB over PostgreSQL to function properly. To install TimescaleDB, follow these steps:

Install TimescaleDB (PostgreSQL Extension)

Enable the Extension in PostgreSQL

Verify the Installation

Restart PostgreSQL

Connecting to a PostgreSQL database

This option requires the optional dependency psycopg for interfacing with PostgreSQL databases. Note that Postgres accepts these keyword arguments. Alternatively, a connection string may be used; information on connection strings and the Postgres URI format can be found here.

Attaching a SQLite database to AISdb

Querying SQLite is as simple as providing the name of a ".db" file that follows the same entity-relationship model as the databases supported by AISdb, detailed in the SQL Database section. We prepared an example SQLite database, example_data.db, containing AIS data from a small region near Maine, United States, in January 2022, sourced from Marine Cadastre; it is available in the AISdb GitHub repository.

If you want to create a database from your own data, we have a tutorial with examples showing how to create an SQLite database from open-source data.

Querying the Database

Parameters for the database query can be defined using aisdb.database.dbqry.DBQuery. Iterate over rows returned from the database for each vessel with aisdb.database.dbqry.DBQuery.gen_qry(). Convert the results into a generator yielding dictionaries with NumPy arrays describing position vectors, e.g., lon, lat, and time, using aisdb.track_gen.TrackGen().

The following query will return vessel trajectories from a given 1-hour time window:

A specific region can be queried for AIS data using aisdb.gis.Domain or one of its sub-classes to define a collection of shapely polygon features. For this example, the domain contains a single bounding box polygon derived from a longitude/latitude coordinate pair and a radial distance specified in meters. If multiple features are included in the domain object, the domain boundaries will encompass the convex hull of all features.

Additional query callbacks for filtering by region, timeframe, identifier, etc. can be found in aisdb.database.sql_query_strings and aisdb.database.sqlfcn_callbacks.

Processing

Voyage Modelling

The above generator can be input into a processing function, yielding modified results. For example, to model the activity of vessels on a per-voyage or per-transit basis, each voyage is defined as a continuous vector of positions where the time between observed timestamps never exceeds 24 hours.
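To illustrate the 24-hour rule, here is a minimal, self-contained sketch (not AISdb's internal implementation) that splits a sorted sequence of timestamps into voyage segments wherever the gap between consecutive observations exceeds the threshold:

```python
def split_voyages(timestamps, max_gap=24 * 3600):
    """Split a sorted sequence of UNIX timestamps into segments where
    consecutive observations are never more than max_gap seconds apart."""
    segments, current = [], []
    for t in timestamps:
        if current and t - current[-1] > max_gap:
            segments.append(current)
            current = []
        current.append(t)
    if current:
        segments.append(current)
    return segments

# Three fixes within one day, then a 2-day gap -> two voyages
times = [0, 3600, 7200, 7200 + 2 * 86400]
print(split_voyages(times))  # [[0, 3600, 7200], [180000]]
```

In AISdb itself, this segmentation is handled by aisdb.split_timedelta, as shown in the code for this section.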

Data cleaning and MMSI deduplication

A common problem with AIS data is noise, where multiple vessels might broadcast using the same identifier (sometimes simultaneously). In such cases, AISdb can denoise the data:

(1) Denoising with Encoder: The aisdb.denoising_encoder.encode_greatcircledistance() function checks the approximate distance between each vessel's positions. It separates vectors where a vessel could not reasonably have traveled along the most direct path, such as at speeds over 50 knots.

(2) Distance and Speed Thresholds: Distance and speed thresholds limit the maximum distance or speed between consecutive messages that can be considered continuous.

(3) Scoring and Segment Concatenation: A score is computed for each position delta, with sequential messages that are nearby at short intervals receiving a higher score. This score is calculated from the Haversine distance and the elapsed time. Any deltas with a score below the minimum threshold are considered the start of a new segment. New segments are compared to the ends of existing segments with the same vessel identifier; if the score exceeds the minimum, they are concatenated. If multiple segments meet the minimum score, the new segment is concatenated to the existing segment with the highest score.
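As a minimal illustration of the 50-knot plausibility check described above (a sketch, not AISdb's internal encoder), the speed implied by two consecutive fixes can be computed from the haversine distance and elapsed time:

```python
from math import radians, sin, cos, asin, sqrt

KNOT_MS = 0.514444  # one knot in meters per second

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle (haversine) distance between two positions, in meters."""
    lon1, lat1, lon2, lat2 = map(radians, (lon1, lat1, lon2, lat2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def implied_speed_knots(p1, p2):
    """Speed a vessel would need to travel between two (lon, lat, epoch) fixes."""
    (lon1, lat1, t1), (lon2, lat2, t2) = p1, p2
    return haversine_m(lon1, lat1, lon2, lat2) / max(t2 - t1, 1) / KNOT_MS

# Two fixes one hour apart but ~111 km apart imply ~60 knots: likely noise
p1 = (-69.34, 41.55, 0)
p2 = (-69.34, 42.55, 3600)
print(implied_speed_knots(p1, p2) > 50)  # True
```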

Notice that processing functions may be executed in sequence as a chain or pipeline, so after segmenting the individual voyages as shown above, results can be input into the encoder to remove noise and correct for vessels with duplicate identifiers.

Interpolating, geofencing, and filtering

Building on the above processing pipeline, the resulting cleaned trajectories can be geofenced and filtered for results contained by at least one domain polygon and interpolated for uniformity.

Additional processing functions can be found in the aisdb.track_gen module.

Exporting as CSV

The resulting processed voyage data can be exported in CSV format instead of being printed:

Integration with external metadata

AISdb supports integrating external data sources, such as bathymetric charts and other raster grids.

Bathymetric charts

To determine the approximate ocean depth at each vessel position, the aisdb.webdata.bathymetry module can be used.

Once the data has been downloaded, the Gebco() class may be used to append bathymetric data to tracks in a processing pipeline, like the TrackGen() processing functions described above.

Also, see aisdb.webdata.shore_dist.ShoreDist for determining the approximate nearest distance to shore from vessel positions.

Rasters

Similarly, arbitrary coordinate-gridded raster data may be appended to vessel tracks:

Visualization

AIS data from the database may be overlaid on an interactive map using the aisdb.web_interface.visualize() function. This function accepts a generator of track dictionaries, such as those output by aisdb.track_gen.TrackGen().

For a complete plug-and-play solution, you may clone our Google Colab Notebook.

[Figure: Visualization of vessel tracks within a defined time range]
Linux
python -m venv AISdb   # create a python virtual environment
source ./AISdb/bin/activate  # activate the virtual environment
pip install aisdb  # from https://pypi.org/project/aisdb/
Windows
python -m venv AISdb
./AISdb/Scripts/activate  
pip install aisdb
python
>>> import aisdb
>>> aisdb.__version__  # should return '1.7.3' or newer
source ./AISdb/bin/activate
pip install jupyter
jupyter notebook
source AISdb/bin/activate  # On Windows use `AISdb\Scripts\activate`

# Cloning the Repository and installing the package
git clone https://github.com/AISViz/AISdb.git && cd AISdb

# Windows users can instead download the installer:
#   - https://forge.rust-lang.org/infra/other-installation-methods.html#rustup
#   - https://static.rust-lang.org/rustup/dist/i686-pc-windows-gnu/rustup-init.exe
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs > install-rust.sh

# Installing Rust and Maturin
/bin/bash install-rust.sh -q -y
pip install --upgrade maturin[patchelf]

# Building AISdb package with Maturin
maturin develop --release --extras=test,docs
import os
# Clone the AISdb repository from GitHub
!git clone https://github.com/AISViz/AISdb.git
# Install Rust using the official Rustup script
!curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
# Install Maturin to build the packages
!pip install --upgrade maturin[patchelf]
# Set up environment variables
os.environ["PATH"] += os.pathsep + "/root/.cargo/bin"
# Install wasm-pack for building WebAssembly packages
!curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh
# Install wasm-pack as a Cargo dependency
!cargo install wasm-pack
# Setting environment variable for the virtual environment
os.environ["VIRTUAL_ENV"] = "/usr/local"
# Change directory to AISdb for building the package
%cd AISdb
# Build and install the AISdb package using Maturin
!maturin develop --release --extras=test,docs
$ sudo apt install -y timescaledb-postgresql-XX  # XX is the PG-SQL version
> CREATE EXTENSION IF NOT EXISTS timescaledb;
> SELECT * FROM timescaledb_information.version;
$ sudo systemctl restart postgresql
from aisdb.database.dbconn import PostgresDBConn

# [OPTION 1]
dbconn = PostgresDBConn(
    hostaddr='127.0.0.1',  # Replace this with the Postgres address (supports IPv6)
    port=5432,  # Replace this with the Postgres running port (if not the default)
    user='USERNAME',  # Replace this with the Postgres username
    password='PASSWORD',  # Replace this with your password
    dbname='DATABASE',  # Replace this with your database name
)

# [OPTION 2]
dbconn = PostgresDBConn('postgresql://USERNAME:PASSWORD@HOST:PORT/DATABASE')
from aisdb.database.dbconn import SQLiteDBConn 

dbpath='example_data.db'
dbconn = SQLiteDBConn(dbpath=dbpath)
import aisdb
import pandas as pd
from datetime import datetime
from collections import defaultdict

dbpath = 'example_data.db'
start_time = datetime.strptime("2022-01-01 00:00:00", '%Y-%m-%d %H:%M:%S')
end_time = datetime.strptime("2022-01-01 00:59:59", '%Y-%m-%d %H:%M:%S')

def data2frame(tracks):
    # Dictionary mapping each MMSI to its own DataFrame
    ais_data = defaultdict(lambda: pd.DataFrame(
        columns = ['time', 'lat', 'lon', 'cog', 'rocog', 'sog', 'delta_sog']))

    for track in tracks:
        mmsi = track['mmsi']
        df = pd.DataFrame({
            'time': pd.to_datetime(track['time'], unit='s'),
            'lat': track['lat'], 'lon': track['lon'],
            'cog': track['cog'], 'sog': track['sog']
        })

        # Sort by time in ascending order so the diffs below are positive
        df = df.sort_values(by='time', ascending=True).reset_index(drop=True)
        # Compute the time difference in seconds
        df['time_diff'] = df['time'].diff().dt.total_seconds()
        # Compute RoCOG (Rate of Change of Course Over Ground)
        delta_cog = (df['cog'].diff() + 180) % 360 - 180
        df['rocog'] = delta_cog / df['time_diff']
        # Compute Delta SOG (Rate of Change of Speed Over Ground)
        df['delta_sog'] = df['sog'].diff() / df['time_diff']
        # Fill NaN values (first row) and infinite values (division by zero cases)
        df[['rocog', 'delta_sog']] = df[['rocog', 'delta_sog']].replace([float('inf'), float('-inf')], 0).fillna(0)
        # Drop unnecessary column
        df.drop(columns = ['time_diff'], inplace=True)
        # Store in the dictionary
        ais_data[mmsi] = df
    return ais_data

with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
    qry = aisdb.DBQuery(
        dbconn=dbconn, start=start_time, end=end_time,
        callback=aisdb.database.sqlfcn_callbacks.in_timerange_validmmsi,
    )

    rowgen = qry.gen_qry()
    tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
    ais_data = data2frame(tracks)  # re-use previous function

# Display DataFrames
for key in ais_data.keys():
    print(ais_data[key])
# a circle with a 100km radius around the location point
domain = aisdb.DomainFromPoints(points=[(-69.34, 41.55)], radial_distances=[100000])

with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
    qry = aisdb.DBQuery(
        dbconn=dbconn, start=start_time, end=end_time,
        xmin=domain.boundary['xmin'], xmax=domain.boundary['xmax'],
        ymin=domain.boundary['ymin'], ymax=domain.boundary['ymax'],
        callback=aisdb.database.sqlfcn_callbacks.in_validmmsi_bbox,
    )
    rowgen = qry.gen_qry()
    tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
    ais_data = data2frame(tracks)  # re-use previous function

# Display DataFrames
for key in ais_data.keys():
    print(ais_data[key])
from datetime import timedelta

# Define a maximum time interval
maxdelta = timedelta(hours=24)

with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
    qry = aisdb.DBQuery(
        dbconn=dbconn, start=start_time, end=end_time,
        xmin=domain.boundary['xmin'], xmax=domain.boundary['xmax'],
        ymin=domain.boundary['ymin'], ymax=domain.boundary['ymax'],
        callback=aisdb.database.sqlfcn_callbacks.in_validmmsi_bbox,
    )
    rowgen = qry.gen_qry()
    tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)

    # Split the generated tracks into segments
    track_segments = aisdb.split_timedelta(tracks, maxdelta)
    ais_data = data2frame(track_segments)  # re-use previous function
    
    # Display DataFrames
    for key in ais_data.keys():
        print(ais_data[key])
distance_threshold = 20000  # the maximum allowed distance (meters) between consecutive AIS messages
speed_threshold = 50        # the maximum allowed vessel speed in consecutive AIS messages
minscore = 1e-6             # the minimum score threshold for track segment validation

with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
    qry = aisdb.DBQuery(
        dbconn=dbconn, start=start_time, end=end_time,
        callback=aisdb.database.sqlfcn_callbacks.in_timerange,
    )
    rowgen = qry.gen_qry()
    tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
    
    # Encode the track segments to clean and validate the track data
    tracks_encoded = aisdb.encode_greatcircledistance(tracks, 
                                                      distance_threshold=distance_threshold, 
                                                      speed_threshold=speed_threshold, 
                                                      minscore=minscore)
    ais_data = data2frame(tracks_encoded)  # re-use previous function
    
    # Display DataFrames
    for key in ais_data.keys():
        print(ais_data[key])
# Define a domain with a central point and corresponding radial distances
domain = aisdb.DomainFromPoints(points=[(-69.34, 41.55),], radial_distances=[100000,])

# Filter the encoded tracks to include only those within the specified domain
tracks_filtered = aisdb.track_gen.zone_mask(tracks_encoded, domain)

# Interpolate the filtered tracks with a specified time interval
tracks_interp = aisdb.interp_time(tracks_filtered, step=timedelta(minutes=15))
aisdb.write_csv(tracks_interp, 'ais_processed.csv')
import aisdb

# Set the data storage directory
data_dir = './testdata/'

# Download bathymetry grid from the internet
bathy = aisdb.webdata.bathymetry.Gebco(data_dir=data_dir)
bathy.fetch_bathymetry_grid()
tracks = aisdb.TrackGen(qry.gen_qry(), decimate=False)
tracks_bathymetry = bathy.merge_tracks(tracks) # merge tracks with bathymetry data
tracks = aisdb.TrackGen(qry.gen_qry())
raster_path = './GMT_intermediate_coast_distance_01d.tif'

# Load the raster file
raster = aisdb.webdata.load_raster.RasterFile(raster_path)

# Merge the generated tracks with the raster data
tracks = raster.merge_tracks(tracks, new_track_key="coast_distance")
from datetime import datetime, timedelta
import aisdb
from aisdb import DomainFromPoints

dbpath='example_data.db'

def color_tracks(tracks):
    ''' set the color of each vessel track using a color name or RGB value '''
    for track in tracks:
        track['color'] = 'blue'  # or an RGB string such as 'rgb(0,0,255)'
        yield track

# Set the start and end times for the query
start_time = datetime.strptime("2022-01-01 00:00:00", '%Y-%m-%d %H:%M:%S')
end_time = datetime.strptime("2022-01-31 00:00:00", '%Y-%m-%d %H:%M:%S')

with aisdb.SQLiteDBConn(dbpath=dbpath) as dbconn:
    qry = aisdb.DBQuery(
        dbconn=dbconn,
        start=start_time,
        end=end_time,
        callback=aisdb.database.sqlfcn_callbacks.in_timerange_validmmsi,
    )
    rowgen = qry.gen_qry()
    
    # Convert queried rows to vessel trajectories
    tracks = aisdb.track_gen.TrackGen(rowgen, decimate=False)
    tracks = color_tracks(tracks)  # assign a display color to each track
    
    # Visualization
    aisdb.web_interface.visualize(
        tracks,
        visualearth=False,
        open_browser=True,
    )