wlxiong / PyMarkovActv

A Markov Decision Process (MDP) model for activity-based travel demand modeling

Summary

This project proposes a unified modeling framework for the traveller's choice of activity type, timing and duration. In the decision-making process, the traveller exhibits forward-looking behavior: he or she recognizes the impact of the current choice on future utility and takes that future utility into account. The activity scheduling behavior is therefore formulated as a Markov Decision Process. Dynamic programming is used to solve the problem, and the maximum likelihood method is employed to estimate the model parameters.
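To make the formulation concrete, the sketch below shows how such a finite-horizon activity-scheduling MDP can be solved by backward-induction dynamic programming. The activity set, the utility function, the duration cap and the time discretisation are illustrative assumptions, not the actual specification used in PyMarkovActv.

```python
import numpy as np

# Hypothetical daily horizon discretised into 15-minute slots (illustrative only).
N_SLOTS = 72
ACTIVITIES = ["home", "work", "shopping"]   # assumed activity types
N_ACT = len(ACTIVITIES)
MAX_DUR = 16                                # assumed cap on tracked duration (slots)

def utility(t, activity, duration):
    """Placeholder marginal utility of spending slot t in `activity` after
    having already spent `duration` slots in it (diminishing returns)."""
    base = {"home": 1.0, "work": 2.0, "shopping": 1.5}[activity]
    return base / (1.0 + 0.1 * duration)

def backward_induction():
    """Solve the finite-horizon MDP by dynamic programming.

    State: (time slot, current activity, elapsed duration in that activity).
    Action: continue the current activity or switch to another one.
    Returns the value function V and the greedy policy."""
    V = np.zeros((N_SLOTS + 1, N_ACT, MAX_DUR + 1))
    policy = np.zeros((N_SLOTS, N_ACT, MAX_DUR + 1), dtype=int)

    for t in range(N_SLOTS - 1, -1, -1):          # backward in time
        for a in range(N_ACT):
            for d in range(MAX_DUR + 1):
                best_val, best_act = -np.inf, a
                for a_next in range(N_ACT):
                    if a_next == a:               # continue: duration grows
                        d_next = min(d + 1, MAX_DUR)
                    else:                         # switch: duration clock resets
                        d_next = 0
                    val = utility(t, ACTIVITIES[a_next], d_next) + V[t + 1, a_next, d_next]
                    if val > best_val:
                        best_val, best_act = val, a_next
                V[t, a, d] = best_val
                policy[t, a, d] = best_act
    return V, policy

if __name__ == "__main__":
    V, policy = backward_induction()
    # Roll the optimal policy forward from "home" to obtain a daily schedule.
    a, d, schedule = 0, 0, []
    for t in range(N_SLOTS):
        a_next = policy[t, a, d]
        d = min(d + 1, MAX_DUR) if a_next == a else 0
        a = a_next
        schedule.append(ACTIVITIES[a])
    print(schedule)
```

In the paper's estimation step, the parameters of the utility function would then be fitted by maximum likelihood against observed activity diaries; the sketch above only covers the dynamic programming solution for a fixed set of parameters.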

Publication

Yiliang Xiong and William H.K. Lam (2011). Modeling Within-day Dynamics in Activity Scheduling: A Markov Decision Process Approach. Journal of the Eastern Asia Society for Transportation Studies, Vol. 9, pp. 452-467.


Languages

Python 100.0%