In announcing a partnership with the city of Stockholm and the KTH Royal Institute of Technology, IBM is putting a powerful new analytics technology on display to help city officials manage traffic congestion, decrease air pollution and make operational improvements within their transportation infrastructure.
According to a report from the Texas Transportation Institute, traffic congestion costs the United States $78 billion annually in lost time and extra fuel use. And some estimates attribute 45 percent of the world’s air pollution problem to US traffic. In Europe, the problems are not much better, and an older patchwork of roads makes getting around town more complicated.
For the past year, IBM has worked with Stockholm, Sweden, to monitor traffic flow during peak hours. The congestion management system has reduced traffic in the Swedish capital by 20 percent, cut average travel times by almost half and decreased emissions by 10 percent.
In an extension of that partnership, IBM is announcing today a plan to leverage new analytics capabilities in conjunction with the KTH Royal Institute of Technology. Building on the previous project in the city, IBM is implementing its InfoSphere Streams software to help city officials manage traffic congestion, decrease air pollution and improve operations across the transportation infrastructure. But perhaps more important, IBM is looking to give Stockholm traffic officials a window into the future based on real-time data.
The Streams software will tie together information from over a thousand taxicabs’ GPS devices with data from delivery trucks, traffic sensors, transit systems, pollution monitors and weather services to provide real-time analysis of traffic flow, travel times and best commuting options, company officials said in an interview with CivSource.
“The strength of InfoSphere Streams is that it can handle very large volumes of data and do real-time analysis as it happens,” Naveen Lamba, global lead for intelligent transportation at IBM Global Business Services, said. InfoSphere Streams can sustain throughput rates of millions of events or messages per second, and it supports both structured and unstructured streaming data sources, including images, audio, voice, video, radio, email, satellite data, and badge swipes, among others.
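To make the idea concrete, here is a minimal, generic sketch of that kind of streaming analysis: events arrive continuously, are kept in a short rolling window per road segment, and slow-moving segments are flagged as they happen. This is illustrative Python only, not IBM InfoSphere Streams code; the event fields (segment_id, speed_kph, timestamp) and the congestion threshold are hypothetical stand-ins for the GPS and sensor readings the article describes.

```python
from collections import defaultdict, deque
from dataclasses import dataclass

WINDOW_SECONDS = 300  # analyze a rolling five-minute window per road segment


@dataclass
class GpsEvent:
    segment_id: str   # road segment the vehicle is on (hypothetical id)
    speed_kph: float  # reported speed
    timestamp: float  # seconds since some epoch


class RollingSpeedMonitor:
    """Keeps a sliding window of recent speed readings per road segment
    and flags segments whose average speed drops below a threshold."""

    def __init__(self, congestion_threshold_kph: float = 15.0):
        self.threshold = congestion_threshold_kph
        self.windows: dict[str, deque] = defaultdict(deque)

    def ingest(self, event: GpsEvent) -> None:
        window = self.windows[event.segment_id]
        window.append(event)
        # Evict readings that have fallen out of the time window.
        while window and event.timestamp - window[0].timestamp > WINDOW_SECONDS:
            window.popleft()

    def congested_segments(self) -> list:
        flagged = []
        for segment_id, window in self.windows.items():
            if not window:
                continue
            avg = sum(e.speed_kph for e in window) / len(window)
            if avg < self.threshold:
                flagged.append(segment_id)
        return flagged


if __name__ == "__main__":
    # Feed in a few simulated taxi readings and check for congestion.
    monitor = RollingSpeedMonitor()
    for t, speed in enumerate([14, 12, 10, 8, 6]):
        monitor.ingest(GpsEvent("E4-north", float(speed), float(t)))
    print(monitor.congested_segments())  # ['E4-north']
```

A production system would of course distribute this across many parallel operators and data types; the sketch only shows the core pattern of analyzing data in motion rather than querying it after it lands in a database.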
But according to James Kobielus, Senior Analyst in Business Process and Applications with Forrester Research, the magic in InfoSphere’s sauce is its ability to embed complex predictive modeling into its real-time analysis.
“InfoSphere is not the first or only CEP [complex event processing engine] for BI [business intelligence], although it’s a good one with sub-second latency…its key differentiation is its embedded predictive models.”
This means that not only can transportation officials understand traffic flows and pinpoint problems, they can also predict where problem spots are likely to occur, or develop a clear picture of what might happen if they choose to ignore a problem. That information can be used to project conditions an hour or even weeks into the future, which gives it policy implications well beyond the operational level.
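As a rough illustration of that predictive step, the toy Python below fits a linear trend to a segment’s recent average speeds and extrapolates an hour ahead to flag segments likely to fall below a congestion threshold. It is a hedged stand-in for the embedded predictive models Kobielus describes, not the actual InfoSphere Streams modeling machinery; the segment names and numbers are made up for the example.

```python
def linear_forecast(samples, horizon):
    """Least-squares fit of speed vs. time over (timestamp, avg_speed_kph)
    samples, evaluated `horizon` seconds past the last sample."""
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_v = sum(v for _, v in samples) / n
    var_t = sum((t - mean_t) ** 2 for t, _ in samples)
    if var_t == 0:
        return mean_v  # no trend information; assume speeds hold steady
    slope = sum((t - mean_t) * (v - mean_v) for t, v in samples) / var_t
    intercept = mean_v - slope * mean_t
    last_t = samples[-1][0]
    return slope * (last_t + horizon) + intercept


def likely_congested(history, horizon=3600.0, threshold_kph=15.0):
    """Return segment ids whose forecast speed one hour out is below threshold."""
    return [seg for seg, samples in history.items()
            if len(samples) >= 2 and linear_forecast(samples, horizon) < threshold_kph]


if __name__ == "__main__":
    # Speeds on one (hypothetical) segment have fallen steadily over the last hour,
    # while the other has held steady.
    history = {
        "Essingeleden-S": [(0.0, 60.0), (1200.0, 50.0), (2400.0, 40.0), (3600.0, 30.0)],
        "Centralbron-N":  [(0.0, 55.0), (1200.0, 56.0), (2400.0, 54.0), (3600.0, 55.0)],
    }
    print(likely_congested(history))  # ['Essingeleden-S']
```

A real deployment would use richer models that weigh weather, transit loads and historical patterns, but the principle is the same: act on where traffic is heading, not just where it is.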
“The question of how to get more use out of existing assets is important in today’s economic environment,” Lamba said. “If you can avoid building extra lanes or another bridge – these huge capital-intensive and costly projects – through information-based solutions, you’ll get a huge benefit.”
In addition to the high-level performance management reports and predictive capabilities, InfoSphere Streams can also drive operational improvements at the traveler level by allowing travelers to participate in congestion management themselves, Lamba said.
But despite the complex nature of the software, Michael Curry, Director of Strategy for Information Management at IBM, said it’s really a generic technology. “It can be applied to any industry – traffic, financial transactions, risk analysis, healthcare – the concept of streaming analytics and predictive analysis for data-in-motion is really powerful.”
“Stockholm is showing how this technology can be mainstream. It’s at the forefront of how people are making decisions and driving next levels of efficiency,” Mr. Lamba added.