AI adoption: Don't ignore the fundamentals
- By Gary Newgaard
- Oct 12, 2018
Artificial intelligence is the next frontier, and the government is on board. According to a recent study, 77 percent of federal IT managers say this technology will change the way government thinks about and processes information, while 61 percent say AI could solve one or more of the challenges their agencies face today. The White House has jumped in as well with the creation earlier this year of an Interagency Select Committee on Artificial Intelligence, which aims to improve the coordination of federal efforts related to AI to ensure continued U.S. leadership in this emerging and potential-rich field.
This committee has made progress by understanding the potential of AI and highlighting the need for a coordinated effort around the development of AI initiatives in the federal government. As the committee continues its work, it is important to consider the AI challenge holistically, starting with infrastructure.
The new "big bang" of AI adoption is fueled by a perfect storm of three key technologies: deep learning software, graphics processing units and big data. Both DL and GPUs are major breakthroughs and game-changing technologies, and when they are applied to big data, the potential for innovation is incredible. However, while DL and GPUs are progressing, many storage technologies have lagged. Consequently, there has been a performance gap between the compute element (DL and GPUs) and the storage, limiting the ability to capitalize on data that has been growing at an exponential rate.
As they determine how best to incorporate AI initiatives, agency leaders should already be focusing on collecting and cleaning data in preparation for AI implementation and ensuring they are equipped to handle this explosive growth of data. As the size of datasets has increased exponentially, moving and replicating data has become a prohibitive expense and a bottleneck for innovation. A new model is needed: Enter the data-centric architecture.
This modern design puts data at the core of an infrastructure. Data-centric architecture eliminates the need for data to be moved between old and new systems and keeps data in place while the technology is built around it. The aim is to bring the compute element to the data, enabling IT managers to spend less time and expense moving data and more time innovating and making use of datasets.
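The contrast between moving data to compute and bringing compute to the data can be made concrete with a toy sketch. The class and method names below (`DataStore`, `fetch_all`, `sum_where`) are illustrative assumptions, not any real product's API; the point is only that pushing the computation to where the data lives means a small request and a small result instead of a bulk transfer.

```python
class DataStore:
    """Toy stand-in for a storage system that keeps records in place."""

    def __init__(self, records):
        self._records = records  # data never leaves this layer

    # Data-to-compute model: ship every record to the caller,
    # who then filters and aggregates on its own hardware.
    def fetch_all(self):
        return list(self._records)

    # Compute-to-data model: the store evaluates the predicate and
    # aggregation itself, so only one number crosses the "network."
    def sum_where(self, field, predicate):
        return sum(r[field] for r in self._records if predicate(r))


store = DataStore([
    {"site": "A", "mb": 120},
    {"site": "B", "mb": 300},
    {"site": "A", "mb": 80},
])

# Old model: move all the data first, then compute.
moved = store.fetch_all()
total_a = sum(r["mb"] for r in moved if r["site"] == "A")

# Data-centric model: one small request, one small result.
total_a_in_place = store.sum_where("mb", lambda r: r["site"] == "A")

assert total_a == total_a_in_place == 200
```

At this toy scale the two paths are equivalent; the difference matters when the record set is petabytes and the aggregate is a few bytes.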
With this new architecture, agencies can share and deliver data anywhere, any time, creating a data hub that can be the way forward for a modern government. A true data hub must offer four things: a massively parallel architecture, high throughput for both file and object storage, true scale-out and multi-dimensional performance built to respond to any data type with any access pattern.
These four features are essential to unifying data. Too much data remains stuck in a complex sprawl of silos. While each silo is useful for its original task, silos are counterproductive in a data-first world: data trapped in a silo sits idle instead of doing work whenever it is not being actively used.
For agencies to truly benefit from a data-centric architecture, systems must work in real time, providing the performance needed for the next-generation analytics that make AI so powerful. They must also be available on demand and be self-driving. Consolidating and simplifying this architecture through flash makes it far easier for teams to support the technology that is fueling tomorrow's growth. Storage has a unique opportunity to become much more than a siloed repository for the constant deluge of generated data: it can be a platform that shares and delivers data to create value.
Computing power, networking and storage are the foundational elements that enable the incredible work of AI, but unless these elements move forward at the same rate, the process will be off balance and other technologies will be held back.
Two developments in AI are forcing a change in data center architecture. First is the explosion of data creation. Second, software now makes it possible to mine that data. Data-centric infrastructure design can help turn AI dreams into reality by fundamentally transforming data center architecture with data at its core.
Because government IT managers are working with limited resources, they are often asked to do more with less. AI has the potential to transform the current landscape, but data must be at the center of IT's approach before the full benefits of this technology can be realized. Truly successful AI depends on this perfect partnership of data, compute power and storage. Without the adoption of a data-centric architecture, the full potential of AI won't be realized.
Gary Newgaard is vice president, public sector, at Pure Storage.