Advancement in the automation of paved roadways performance patrolling: A review

Muhammad Saiful Islam*, Ahmed Mohamed Ibrahim, Kazi Ekramul Hoque, Karama Abdullah Bakhuraisa, Usman Ali, Martin Skitmore

*Corresponding author for this work

Research output: Contribution to journal › Review article › Research › peer-review


This review critically analyzes available tools and techniques for road damage assessment and monitoring and identifies areas for further research. Several tools are found to be in use for image and sensor data collection, such as smartphones, cameras, accelerometers, and Unmanned Aerial Vehicles (UAVs), with smartphones the most frequently used. Data processing tools include Machine Learning (ML) algorithms, Convolutional Neural Networks (CNNs), and Support Vector Machines (SVMs), which perform well in detecting and measuring various forms of road surface damage, including potholes, bumps, ruts, and patches. The study of crack identification, by contrast, is comparatively lacking. Moreover, some studies attempt to capitalize on crowdsourcing, with road users collecting data connected to a server while driving over road networks. However, physical observation and smartphone-based image posting to a server, backed by appropriate ML tools for automatic and dynamic road damage detection, are yet to be demonstrated or developed and therefore offer further research potential. Finally, from a national perspective, India and China are ahead of other countries. Future research can thus focus on image processing-based automatic road performance monitoring in any country whose transportation system is highly dependent on paved roads.
Original language: English
Article number: 114734
Pages (from-to): 1-13
Number of pages: 13
Journal: Measurement: Journal of the International Measurement Confederation
Publication status: Published - 15 Jun 2024
