FDIC division builds system for storing bank data

Special to GCN

The most critical job at the Federal Deposit Insurance Corp.'s Division of Research and Statistics is to maintain timely information about the 10,623 FDIC-insured banking institutions.

After 2,689 banks failed in the late 1980s and early 1990s, and FDIC inherited responsibility for tracking thrift institutions, the complexity of the source data made analysis increasingly difficult. Banks submitted quarterly financial reports in several formats, and the set of reported items often changed.

'Determining even rudimentary financial items on a nationwide basis was difficult. Most data requests required custom programming,' said Jon Wisnieski, special assistant to the associate director.

Public confidence in the banking system eroded, Wisnieski said. 'Our only means of disseminating data was in hard copy. Reports were expensive, often delayed and difficult to come by.'

Rather than attempt a sweeping upgrade, the division adopted an approach known as value-effective implementation, which fits system design closely to an organization's actual business needs and computing environment.

'To build a data warehouse for its own sake doesn't make sense,' Wisnieski said. 'What worked for us was to position our business needs senior to the storage technology. Once we fully understood what the system needed to provide, we set about assembling it from scratch.'

The division built a data warehouse that now contains more than 25 years' worth of quarterly financial data with 2,800 variables, totaling a massive 60G. Regardless of the types of forms or the way they were reported, everything is consolidated into one source using data files, macros and views created by SAS Institute Inc. of Cary, N.C.
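Consolidating reports filed in different layouts means renaming each layout's fields to one canonical schema before the records are merged. A minimal sketch of that idea follows; the layout names, field names and values are invented for illustration and do not come from the RIS system itself:

```python
# Two hypothetical report layouts carrying the same underlying items.
CALL_REPORT_1995 = [{"cert": "123", "tot_assets": 50000}]
CALL_REPORT_1998 = [{"certificate": "123", "assets_total": 51500}]

# Per-layout mapping from source field names to one canonical schema.
SCHEMA_MAPS = {
    "1995": {"cert": "cert_id", "tot_assets": "total_assets"},
    "1998": {"certificate": "cert_id", "assets_total": "total_assets"},
}

def consolidate(reports_by_layout):
    """Rename each record's fields to the canonical schema and merge."""
    warehouse = []
    for layout, records in reports_by_layout.items():
        mapping = SCHEMA_MAPS[layout]
        for rec in records:
            warehouse.append({mapping[k]: v for k, v in rec.items()})
    return warehouse

rows = consolidate({"1995": CALL_REPORT_1995, "1998": CALL_REPORT_1998})
print(rows)
```

Once every record shares the canonical field names, a single set of queries and views can span all 25 years of filings regardless of how they were originally reported.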

FDIC's Research Information System (RIS), which runs under the IBM OS/390 mainframe operating system, is accessible from 200-MHz Dell OptiPlex GX Pro systems with 128M of RAM and dual 2.1G hard drives. About 100 users work over a LAN running Microsoft Windows NT Server 4.0.

Translation ability

Data normalization occurs at the metadata layer through a data dictionary that translates the different sources and formats into a single set of items. For example, the dictionary would recognize that entries for banks in 'New Jersey' and 'N.J.' belong in the same data set.
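The dictionary lookup the article describes can be sketched as a simple mapping from variant source values to a canonical code. This is an illustrative stand-in, not the actual RIS dictionary; the table contents and function name are assumptions:

```python
# Hypothetical data-dictionary fragment: variant state entries map to
# one canonical code, so 'New Jersey' and 'N.J.' land in the same set.
STATE_DICTIONARY = {
    "new jersey": "NJ",
    "n.j.": "NJ",
    "nj": "NJ",
    "new york": "NY",
    "n.y.": "NY",
}

def normalize_state(raw):
    """Translate a raw state entry into its canonical code."""
    key = raw.strip().lower()
    try:
        return STATE_DICTIONARY[key]
    except KeyError:
        raise ValueError(f"unrecognized state entry: {raw!r}")

print(normalize_state("New Jersey"))  # NJ
print(normalize_state("N.J."))       # NJ
```

Unrecognized entries raise an error rather than passing through silently, which is the behavior a warehouse load step generally wants so that bad source values are caught at normalization time.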

Delivery mechanisms go beyond making the data sets available on the mainframe and LAN. A SAS application called RISTABS lets users retrieve and download custom data into their PC spreadsheets via a point-and-click interface. All the applications can use SAS object-oriented tools and a SAS decision-support system for predictive modeling, trend analysis and data mapping.

The system now supplies as much as 95 percent of the data FDIC analysts need. Other FDIC divisions and agencies such as the Office of the Comptroller of the Currency also use RIS.

'Since 1996, we've been supplying individual bank statistics on the Internet for free' at www.fdic.gov, Wisnieski said.
