Recall live tonnage list
Setup
Install the Signal Ocean SDK:
In [ ]:
!pip install signal-ocean
Set your subscription key, acquired here: https://apis.signalocean.com/profile
In [30]:
signal_ocean_api_key = "" # replace with your subscription key
First, we need to create an instance of the TonnageListAPI:
In [31]:
from signal_ocean import Connection
from signal_ocean.tonnage_list import TonnageListAPI
connection = Connection(signal_ocean_api_key)
api = TonnageListAPI(connection)
Retrieving a live tonnage list
Retrieving a live tonnage list is almost exactly the same as retrieving a historical one, except that you call the get_tonnage_list method instead of get_historical_tonnage_list and you don't pass a DateRange as an argument. The get_tonnage_list method returns a single TonnageList that contains live vessel data.
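For a side-by-side feel of the two calls, here is an illustrative sketch (not part of the original notebook). It uses the filters defined in the next cell, and it assumes the historical call takes the same positional arguments plus a DateRange, as in the HTL example notebook; check that notebook for the exact signature in your SDK version.

```python
from datetime import date, timedelta
from signal_ocean.tonnage_list import DateRange

# Historical: one tonnage list per day in the requested DateRange
# (argument order assumed to match the HTL example notebook).
htl = api.get_historical_tonnage_list(
    port, vessel_class, laycan_end_in_days,
    DateRange(date.today() - timedelta(days=30), date.today()),
    vessel_filter,
)

# Live: same parameters, no DateRange, a single TonnageList back.
tonnage_list = api.get_tonnage_list(
    port, vessel_class, laycan_end_in_days, vessel_filter
)
```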
Because of this similarity, we can reuse the parameters from our historical tonnage list (HTL) queries:
In [32]:
from signal_ocean.tonnage_list import (
    VesselClassFilter,
    PortFilter,
    VesselFilter,
    PushType,
    MarketDeployment,
    CommercialStatus,
    VesselSubclass,
)

# Resolve the vessel class and the loading port by partial name match.
vessel_class_filter = VesselClassFilter(name_like="aframax")
vessel_class = api.get_vessel_classes(vessel_class_filter)[0]

port_filter = PortFilter(name_like="ceyhan")
port = api.get_ports(port_filter)[0]

# Laycan end no more than 6 days out.
laycan_end_in_days = 6

# Keep only pushed spot/relet dirty Aframax candidates with the selected
# commercial statuses and a recent AIS position.
vessel_filter = VesselFilter(
    push_types=[PushType.PUSHED],
    market_deployments=[MarketDeployment.RELET, MarketDeployment.SPOT],
    commercial_statuses=[
        CommercialStatus.AVAILABLE,
        CommercialStatus.CANCELLED,
        CommercialStatus.FAILED,
    ],
    vessel_subclass=VesselSubclass.DIRTY,
    latest_ais_since=5,
)

tonnage_list = api.get_tonnage_list(
    port, vessel_class, laycan_end_in_days, vessel_filter
)

tl_data_frame = tonnage_list.to_data_frame()
tl_data_frame
Out[32]:
|  | name | vessel_class | ice_class | year_built | deadweight | length_overall | breadth_extreme | subclass | market_deployment_point_in_time | push_type_point_in_time | ... | open_prediction_accuracy_point_in_time | open_country_point_in_time | open_narrow_area_point_in_time | open_wide_area_point_in_time | availability_port_type_point_in_time | availability_date_type_point_in_time | fixture_type_point_in_time | current_vessel_sub_type_id_point_in_time | current_vessel_sub_type_point_in_time | willing_to_switch_current_vessel_sub_type_point_in_time |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 9488011 | Safeen Elona | Aframax | NaN | 2012 | 105258 | 244.38 | 42 | Dirty | Spot | Pushed | ... | Narrow Area | Greece | East Mediterranean | Mediterranean | Source | Source | NaN | 1 | Source | False |
| 9592305 | Nissos Delos | Aframax | NaN | 2012 | 115691 | 248.97 | 44 | Dirty | Spot | Pushed | ... | Narrow Area | Egypt | Red Sea | Red Sea | Source | Source | NaN | 1 | Source | False |
| 9291262 | Themis 1 | Aframax | 1C | 2005 | 114834 | 253.50 | 44 | Dirty | Spot | Pushed | ... | Narrow Area | Saudi Arabia | Red Sea | Red Sea | Source | Source | NaN | 1 | Source | False |
| 9407457 | Matilda | Aframax | NaN | 2009 | 112935 | 249.96 | 44 | Dirty | Spot | Pushed | ... | Narrow Area | Turkey | Sea of Marmara | Black Sea / Sea Of Marmara | Source | Source | NaN | 1 | Source | False |
| 9458016 | Delta Star | Aframax | NaN | 2013 | 115618 | 249.97 | 44 | Dirty | Spot | Pushed | ... | Port | Croatia | Central Mediterranean | Mediterranean | Source | Prediction | NaN | 1 | Prediction | False |
| 9253325 | Nurkez | Aframax | NaN | 2004 | 105650 | 248.00 | 43 | Dirty | Relet | Pushed | ... | Narrow Area | Saudi Arabia | Red Sea | Red Sea | Source | Source | NaN | 1 | Source | False |
| 9330599 | Lambada | Aframax | NaN | 2006 | 104866 | 243.56 | 42 | Dirty | Spot | Pushed | ... | Port | Saudi Arabia | Red Sea | Red Sea | Source | Prediction | NaN | 1 | Prediction | False |
| 9370848 | Anafi Warrior | Aframax | NaN | 2009 | 107593 | 243.80 | 42 | Dirty | Spot | Pushed | ... | Port | Italy | Central Mediterranean | Mediterranean | Source | Prediction | NaN | 1 | Prediction | False |
8 rows × 29 columns
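The result is an ordinary pandas DataFrame, so standard pandas operations apply directly. As a small illustration (not part of the original query), the snippet below counts the live candidates per open narrow area, using only a column visible in the output above:

```python
# Count candidates per open narrow area (column name as shown in the output above).
tl_data_frame["open_narrow_area_point_in_time"].value_counts()
```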
In [34]:
import pandas as pd

# Excel cannot store timezone-aware datetimes, so strip the time zone
# information from the timestamp columns before exporting. Work on a copy
# to leave the original frame untouched.
without_time_zones = tl_data_frame.copy()
without_time_zones["open_date_point_in_time"] = pd.to_datetime(without_time_zones["open_date_point_in_time"]).dt.tz_localize(None)
without_time_zones["eta_point_in_time"] = pd.to_datetime(without_time_zones["eta_point_in_time"]).dt.tz_localize(None)
without_time_zones["latest_ais_point_in_time"] = pd.to_datetime(without_time_zones["latest_ais_point_in_time"]).dt.tz_localize(None)
without_time_zones.to_excel('Ceyhan_Afra_6days_live.xlsx')
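Note that to_excel needs an Excel writer engine such as openpyxl installed. If Excel output is not required, a CSV export is a lighter alternative and handles timezone-aware timestamps without the tz_localize step; a minimal sketch (the file name here is just an example):

```python
# CSV keeps the timezone-aware timestamps as ISO 8601 strings,
# so no tz_localize(None) step is needed. Hypothetical file name.
tl_data_frame.to_csv("Ceyhan_Afra_6days_live.csv")
```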