The 4th generation modelling framework for passenger transport in Flanders is fully agent-based in design, estimation and implementation.
The main principle behind the overall demand model is the explicit modelling of each individual inhabitant or agent, with all travel choices determined by their own household or personal characteristics. Most importantly, this approach completely avoids the use of average behaviour or sensitivities and instead captures the heterogeneity of the decision makers, thus allowing for all sorts of unique agents in the model. The high level of detail results in the construction of individual tours and trip chains as modelling objects, where interactions between the agents' own choices as well as with other household members influence the number, type and combination of activities and tours. The complete demand model is implemented as a micro-simulation to obtain an efficient framework in which to handle discrete results, with special care taken to quantify and minimize simulation noise by means of advanced variance reduction techniques.
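The specific variance reduction techniques are not detailed here; one common approach in agent-based micro-simulation is to give each agent its own fixed random stream (common random numbers), so that differences between a base and a policy run reflect behavioural response rather than re-drawn noise. A minimal sketch of that idea in Python, assuming a hypothetical per-agent seeding scheme and a simple logit mode choice:

```python
import numpy as np

def agent_rng(agent_id: int, salt: int = 0) -> np.random.Generator:
    """Deterministic per-agent random stream (common random numbers).

    Re-using the same seed for an agent in the base and policy runs means the
    Monte Carlo draws cancel out when comparing scenarios, so the difference
    between runs reflects behavioural response, not simulation noise.
    """
    return np.random.default_rng(hash((agent_id, salt)) % (2**32))

def simulate_mode_choice(agent_id: int, utilities: dict[str, float]) -> str:
    """Draw a mode from logit-type probabilities using the agent's own stream."""
    rng = agent_rng(agent_id)
    modes = list(utilities)
    expu = np.exp(np.array([utilities[m] for m in modes]))
    return rng.choice(modes, p=expu / expu.sum())

# Same agent, identical draws: only the changed utility drives a different outcome.
base   = simulate_mode_choice(42, {"car": 1.2, "pt": 0.8, "bike": 0.3})
policy = simulate_mode_choice(42, {"car": 0.9, "pt": 0.8, "bike": 0.3})
```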
All input data concerning networks, PT services, the synthetic population, socio-economic data, price settings and costs, traffic observations and counts, …, are collected in a centralized data warehouse, each at the highest relevant resolution. In this way, the complete 4G framework is set up as a modelling structure and approach from which dedicated modelling instances can be created, at all scales and levels.
At the highest level, the strategic passenger model for the Flanders region is implemented as such an instance and provides an overall traffic model for supra-regional analyses. Moreover, this instrument offers a sandbox in which more structural developments and new functionalities can be designed and tested, and in which incidental out-of-the-box model applications can be explored. In its first release, this modelling instrument was used extensively to support the Mobility Scheme Flanders 2030, for which additional road-pricing functionalities were developed. Furthermore, other ad-hoc applications have been carried out, for example a prospective study on the impact of the COVID-19 pandemic under different exit strategies, and an optimization study on remote sensing aimed at testing as many individual private vehicles as possible for emissions.
The current v4.2 release integrates the Visum modelling suite to manage the multimodal networks and to assign and skim levels of service for the motorized modes and public transport. Notwithstanding the ambitious size of the strategic models, with several thousand zones, an advanced assignment technique with blocking back is incorporated, combined with consecutive modelling periods with spill-over queueing. For PT, a practical rooftopping procedure is used to better match the timetable-based Visum approach with the agent-based demand model. Almost all Visum facilities are further extended with in-house developed .NET applications.
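The rooftopping procedure itself is not specified further here; conceptually, each desired departure minute is assigned to the timetable connection with the lowest generalized cost, including a penalty for shifting the departure time, and the period skim is the average over those minutes. A simplified sketch under these assumptions (the departure times, penalty rate and connection costs below are purely illustrative):

```python
from dataclasses import dataclass

@dataclass
class Connection:
    dep_min: float      # scheduled departure time [min after period start]
    cost_min: float     # generalized cost of the connection itself [min]

def rooftop_cost(desired_dep: float, connections: list[Connection],
                 shift_penalty: float = 2.0) -> float:
    """Generalized cost for one desired departure minute.

    Each minute of the period is attached to the connection that minimizes
    connection cost plus a penalty for deviating from the desired departure,
    which produces the characteristic 'rooftop' cost profile over time.
    """
    return min(c.cost_min + shift_penalty * abs(c.dep_min - desired_dep)
               for c in connections)

def period_skim(connections: list[Connection], period_end: float) -> float:
    """Average rooftop cost over the period, assuming uniform desired departures."""
    minutes = range(int(period_end))
    return sum(rooftop_cost(float(t), connections) for t in minutes) / len(minutes)

# Two departures in a 60-minute period
conns = [Connection(dep_min=10, cost_min=25), Connection(dep_min=40, cost_min=27)]
avg_cost = period_skim(conns, period_end=60)
```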
To cope with the increasing complexity and computational stress of the calibration process, Significance developed an advanced in-house calibration tool: SigKal offers a polyvalent and flexible optimization module that calibrates OD patterns over dozens of matrices simultaneously against a broad range of target conditions such as traffic and ridership counts, mobile phone data, survey distributions, … Through a profound reconsideration of the calibration problem and a strict translation thereof into dedicated algorithms, SigKal can handle extremely large and complex calibrations in which the user has complete freedom in defining large sets of target conditions.
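SigKal's internal algorithms are not described here; the underlying matrix estimation problem can nevertheless be illustrated as a regularized least-squares fit of OD cell adjustments against counts, linked through assignment proportions. A deliberately simplified sketch (the incidence matrix, prior matrix, counts and bounds are hypothetical):

```python
import numpy as np
from scipy.optimize import least_squares

# Assignment proportions: share of each OD cell's flow on each counted link.
# Rows = counted links, columns = OD cells (flattened matrix).
P = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5]])
od_prior = np.array([800.0, 600.0, 400.0])   # prior OD flows
counts   = np.array([1100.0, 750.0])         # observed link counts

def residuals(x):
    """x holds multiplicative cell adjustments; the residuals penalize deviation
    from the counts and (softly) deviation from the prior matrix."""
    assigned = P @ (x * od_prior)
    return np.concatenate([(assigned - counts) / 50.0,   # count residuals
                           (x - 1.0) * 2.0])             # regularization to prior

sol = least_squares(residuals, x0=np.ones_like(od_prior), bounds=(0.2, 5.0))
calibrated_od = sol.x * od_prior
```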
Given these flexible and powerful functionalities of SigKal, and its lack of predefined dimensions or hard-coded methods, the calibration tool can take the overall calibration process well beyond the inherent shortcomings of optimizing separate modal period matrices. The latest generation of agent-based demand models, as implemented in the Flanders 4G framework, offers a far more detailed and richer set of results in the form of consistent trip chains per individual agent.
These outcomes can be transformed into classic modal OD matrices for further calibration, but this destroys the main advantage of trip consistency within the discrete tours. For this reason, the agent-based calibration method shifts the focus away from OD matrices and brings the OD and route information into the separate tours and trip chains: each tour of every agent in the model forms a calibration object in the optimization, taking into account its own complexity of interconnected trips.
As such, every tour can relate to a series of conditions and targets: a traffic count on the outbound trip in the morning peak as well as another traffic count on the way home in the evening peak can connect to the same discrete tour, just as target modal distributions from an urban survey can link to that tour if the agent lives in the city concerned. The optimization then no longer operates on separate modal trips per model period but intervenes on discrete tours with their related targets. As an outcome, the optimization procedure quantifies a relative weight for each tour that entered the calibration: a weight above 1 means that the tour in question should ideally occur with a higher frequency in order to meet the given targets, while a weight below 1 signifies that the tour should occur less often than modelled.
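A rough illustration of this principle, not the actual SigKal formulation, is to determine the tour weights by minimizing the deviation of the weighted tour contributions from all targets simultaneously, with a regularization term that keeps the weights close to 1; the incidence matrix, expansion factors and target values below are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Columns: (groups of identical) synthetic tours, each with an expansion factor.
# A[t, k] = 1 if tour k contributes to target t (e.g. its outbound AM trip passes
# a count site, its PM return passes another, or it falls under a survey target).
A = np.array([[1, 0, 1, 0],     # AM traffic count on a corridor
              [1, 0, 0, 1],     # PM traffic count on the way back
              [0, 1, 1, 0]])    # modal target from an urban survey
expansion = np.array([150.0, 120.0, 90.0, 60.0])
targets = np.array([260.0, 230.0, 190.0])

def objective(w, lam=5.0):
    modelled = A @ (w * expansion)           # weighted, expanded tour frequencies
    fit = np.sum((modelled - targets) ** 2)  # all targets enter simultaneously
    reg = lam * np.sum((w - 1.0) ** 2)       # keep weights close to 1
    return fit + reg

res = minimize(objective, x0=np.ones(A.shape[1]),
               bounds=[(0.1, 10.0)] * A.shape[1])
tour_weights = res.x   # >1: tour should occur more often; <1: less often
```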
In practice, this set of weights is of course limited in use to the base set of synthetic tours that have been calibrated and cannot be used directly in forecasts, as the set of modelled tours will differ. Therefore, the evolutions or tendencies present in the calibration result need to be transformed into internal choice model corrections: a supervised data-mining process searches for structural patterns in the calibrated weights, taking care of aspects such as the number of tours, the distribution over time periods, modal market shares, … In this process, specific corrections or modifications to the probability formulation of the choice models are extracted in such a manner that minimal corrections provide maximal impact on the fit with the targets. The actual result of the calibration phase is in this way translated into corrective parameters that become a structural part of the heart of the choice models and that align the model results directly with the required targets. This corrected model is then used in further operations and forecasting.
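The data-mining step is described only in general terms; one possible, hedged illustration of the principle is to aggregate the calibrated log-weights over a segmentation (for example mode and time period) and to carry these aggregates into the choice models as additive utility corrections, shifting the probabilities in the direction indicated by the weights. The segment definitions and the logit form below are assumptions:

```python
import math
from collections import defaultdict

# Calibrated tours: (mode, period, weight) triples from the tour-level calibration.
calibrated = [("car", "AM", 0.85), ("car", "AM", 0.90),
              ("pt",  "AM", 1.30), ("pt",  "AM", 1.20),
              ("car", "PM", 1.05)]

# Structural pattern: the average log-weight per (mode, period) segment becomes
# an additive correction on the systematic utility of that alternative.
sums, counts = defaultdict(float), defaultdict(int)
for mode, period, w in calibrated:
    sums[(mode, period)] += math.log(w)
    counts[(mode, period)] += 1
corrections = {seg: sums[seg] / counts[seg] for seg in sums}

def corrected_probability(utilities: dict[str, float], period: str) -> dict[str, float]:
    """Logit probabilities with the calibrated corrections added to the utilities."""
    v = {m: u + corrections.get((m, period), 0.0) for m, u in utilities.items()}
    denom = sum(math.exp(x) for x in v.values())
    return {m: math.exp(x) / denom for m, x in v.items()}

probs = corrected_probability({"car": 1.1, "pt": 0.6}, period="AM")
```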