NYC Transit is building a fleet of chatbots
- By Matt Leonard
- May 04, 2018
Want to know how many traffic accidents there were in midtown Manhattan last Tuesday? There’s a bot for that. Are you a New York City Metropolitan Transportation Authority employee who needs next week’s schedule? There’s a bot for that too. Or at least there will be soon.
The city's MTA is creating chatbots that will allow agency employees to ask natural-language questions and get answers drawn from its databases.
The agency is currently working on five bots, and Suhas Uliyar, the vice president of AI bots and mobile product management at Oracle, one of the city’s technology partners for the chatbots, said at least three will leave the pilot phase and become operational this summer.
The bots will address five use cases:
- Scheduling: providing bus drivers, train operators and other MTA employees access to their duties for the day, schedules, routes and other information.
- Citizen reporting: streamlining the intake of citizen-reported incidents, like malfunctions on a bus or subway.
- Collisions: allowing agency users to query the city's collision accident reporting system to answer questions about the number of traffic accidents on a particular day, the number of collisions that shared the same cause, or other accident-related inquiries. This bot will also tap into weather data, so it can tell the user how weather may have affected accidents.
- Overtime and leave: giving managers answers to questions about their workers' sick leave and time off.
- Traffic: answering questions related to traffic. It could tell users how much traffic a bridge experiences on a given day and tap into weather data.
These bots will help to democratize data analysis within the organization, Uliyar said.
The chatbots use natural language processing to understand what users are asking, then leverage deep neural networks to build a query that pulls the data from the back-end databases and returns a response.
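The article doesn't describe the MTA's implementation in detail, but the pipeline it outlines can be sketched in miniature. In the hypothetical example below, a regex stands in for the NLP model, and the table schema and sample data are invented for illustration:

```python
import re
import sqlite3

# Illustrative stand-in for the pipeline the article describes:
# understand the question, build a query, return the answer.
# Schema, data and intent pattern are all hypothetical.

def setup_demo_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE collisions (day TEXT, borough TEXT)")
    conn.executemany(
        "INSERT INTO collisions VALUES (?, ?)",
        [("Tuesday", "Manhattan"), ("Tuesday", "Manhattan"), ("Monday", "Queens")],
    )
    return conn

def answer(question, conn):
    # A production bot would use a trained NLP model here;
    # a regex plays that role in this sketch.
    m = re.search(r"how many (?:accidents|collisions) .*? on (\w+)", question, re.I)
    if not m:
        return "Sorry, I didn't understand that."
    day = m.group(1).capitalize()
    # Parameterized query built from the extracted intent.
    count = conn.execute(
        "SELECT COUNT(*) FROM collisions WHERE day = ?", (day,)
    ).fetchone()[0]
    return f"There were {count} collisions on {day}."

conn = setup_demo_db()
print(answer("How many accidents were there on Tuesday?", conn))
```

A real deployment would add intent classification, entity extraction and access control, but the flow — question in, structured query out, answer back — is the same.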
The bots will allow untrained users to “ask questions in a very natural way,” Uliyar said. Before, if managers wanted analysis on employee sick days, for example, they'd have to identify the relevant data, find someone trained in the software to run the analysis and get help interpreting the results. When the back-end systems were first built, "nobody envisioned that you’d be able to interact with a small device in your palm, or be able to speak to this device, or be able to type into Skype or Facebook Messenger and get data back,” he added.
But getting this system up and running wasn’t as simple as standing up some chatbots. The data informing the bots' responses resides in legacy systems, and unlocking those systems was the first challenge.
About three years ago, the MTA began looking at updating parts of its highly customized Collision Accident Reporting System (CARS). With the old system, built on Oracle Forms technology, enforcement officers collected accident information on paper, which they then took back to the office and keyed in at their desktop computers. The plan was to give the officers tablets with a mobile version of CARS so they could record the data in the field. That data was sent from the tablets to Oracle's Mobile Cloud Service, which then relayed it to the legacy system.
This last step required working with AuraPlayer, a company that builds web services for Oracle business systems. To let mobile devices reach the MTA’s back-office systems, AuraPlayer used robotic process automation to screen-scrape the required process from the legacy system, capture the necessary fields and expose them as a REST application programming interface optimized for mobile devices and bots, according to Uliyar.
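The general pattern — padded, fixed-width fields scraped from a legacy screen, remapped into a clean JSON payload a REST API can serve — can be sketched as follows. The field names and record layout are invented; AuraPlayer's actual tooling is not public in this article:

```python
import json

# Hypothetical legacy-screen record: fixed-width, padded fields of the
# kind an RPA tool might scrape from an Oracle Forms screen.
LEGACY_SCREEN = [
    ("RPT_NO  ", "000482    "),
    ("RPT_DT  ", "2018-05-01"),
    ("LOC     ", "MIDTOWN   "),
]

def to_rest_payload(screen_fields):
    """Strip legacy padding and rename fields for API consumers."""
    rename = {"RPT_NO": "reportNumber", "RPT_DT": "reportDate", "LOC": "location"}
    record = {rename[key.strip()]: value.strip() for key, value in screen_fields}
    return json.dumps(record)

print(to_rest_payload(LEGACY_SCREEN))
```

The payload would then be served from a REST endpoint that mobile apps and bots can call, sparing them any knowledge of the legacy screen layout.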
With that access in place, the fleet of chatbots can become a reality. The next step is to combine all of these bots into one “master bot” so people won’t have to know which bot to ask, but will instead have one entry point for all bot inquiries.
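A master bot of this kind is essentially a router in front of the specialist bots. The sketch below uses keyword matching for illustration; the bot names and keyword lists are assumptions, and a production router would more likely use an NLP intent classifier:

```python
# Hypothetical routing table: which specialist bot handles which topic.
ROUTES = {
    "scheduling": {"schedule", "shift", "duty", "roster"},
    "collisions": {"accident", "collision", "crash"},
    "overtime":   {"overtime", "sick", "leave"},
    "traffic":    {"traffic", "bridge", "congestion"},
}

def route(question):
    """Pick the specialist bot whose keywords appear in the question."""
    words = set(question.lower().replace("?", "").split())
    for bot, keywords in ROUTES.items():
        if words & keywords:
            return bot
    return "fallback"

print(route("How many sick days did my team take?"))  # routes to "overtime"
```

The user sees a single conversation; the master bot forwards each question to the matching specialist and relays the answer back.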
Matt Leonard is a former reporter for GCN.