The ability to generate realistic and adaptive synthetic human mobility data is vital for applications ranging from urban planning and epidemiology to disaster response. However, most existing generative models are static, limiting their usefulness in dynamic, real-world scenarios. We introduce a paradigm shift by treating human mobility as a language. We represent sequences of locations and inter-event times as discrete tokens and train a Transformer (GPT-2) model from scratch to learn the underlying grammar of movement. Our approach offers two key capabilities. (1) Conditional generation: by prepending special tokens that encode personal attributes (e.g., gender, age) and environmental context (e.g., weekday/weekend, weather), the model produces trajectories consistent with subgroup-specific mobility patterns. (2) Rapid adaptation: a pre-trained mobility model can be fine-tuned to new, anomalous conditions (e.g., post-disaster mobility) using a small amount of data, achieving faster convergence and higher final accuracy than training from scratch. Across large-scale datasets, our Transformer outperforms Markov-chain and autoregressive baselines in long-horizon location prediction and inter-event time modeling, while closely matching real-world distributional statistics. These findings establish mobility-as-language as a powerful, flexible paradigm for controllable and adaptive trajectory simulation in social physics.
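The tokenization scheme sketched above (discretized locations and inter-event times, with condition tokens prepended for controllable generation) might look as follows. This is an illustrative sketch only: the token names, vocabulary layout, grid size, and time bins are assumptions, not the paper's actual encoding.

```python
# Hypothetical sketch of mobility-as-language tokenization.
# Vocabulary layout (illustrative, not the paper's actual scheme):
#   condition tokens (personal attributes / context) come first,
#   then one token per discretized location cell,
#   then one token per inter-event time bin.

COND_TOKENS = ["<male>", "<female>", "<weekday>", "<weekend>"]
N_CELLS = 100                        # assumed number of spatial grid cells
TIME_BINS = [15, 30, 60, 120, 240]  # minutes; gaps above the last edge share one bucket

def build_vocab():
    vocab = {tok: i for i, tok in enumerate(COND_TOKENS)}
    for c in range(N_CELLS):
        vocab[f"loc_{c}"] = len(vocab)
    for b in range(len(TIME_BINS) + 1):
        vocab[f"dt_{b}"] = len(vocab)
    return vocab

def time_bin(minutes):
    """Map a continuous inter-event gap (minutes) to a discrete bin index."""
    for b, edge in enumerate(TIME_BINS):
        if minutes <= edge:
            return b
    return len(TIME_BINS)

def encode(trajectory, conditions, vocab):
    """trajectory: list of (cell_id, gap_minutes) pairs;
    conditions: attribute/context tokens prepended for conditional generation."""
    ids = [vocab[c] for c in conditions]          # prepended condition tokens
    for cell, gap in trajectory:
        ids.append(vocab[f"loc_{cell}"])          # location token
        ids.append(vocab[f"dt_{time_bin(gap)}"])  # inter-event-time token
    return ids

vocab = build_vocab()
# A two-event trajectory for a female user on a weekend:
seq = encode([(12, 20), (47, 95)], ["<female>", "<weekend>"], vocab)
```

A sequence encoded this way can be fed to any autoregressive Transformer (e.g. a GPT-2 trained from scratch); because the condition tokens occupy the start of the context, generation is steered toward subgroup-specific movement patterns.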



