

Compliance Impediments for Government Data Centers

AI and big data can enhance an agency's system capabilities, but those systems must remain secure and optimally configured.

In the government sector, several initiatives have been designed to upgrade and modernize government data centers, including the Federal Information Technology Acquisition Reform Act (FITARA), the Data Center Optimization Initiative (DCOI), and the Modernizing Government Technology (MGT) Act. Although these mandates have been in place for several years, government agencies have struggled to comply. Both the previous administration and the current one have emphasized the need to upgrade government data infrastructure, yet numerous difficulties have slowed progress, even though the MGT Act gave agency CIOs additional authority over infrastructure decisions.


One of the chief issues for these agencies is that the current pressure to upgrade centers on the implementation of cloud computing and artificial intelligence. Agencies need to support big data, analytics, and machine learning, and these areas require difficult migrations from antiquated data storage and management solutions.

After our discussion about big data and infrastructure with Jim D'Arezzo, CEO of Condusiv Technologies, and Mark Gaydos, chief marketing officer of Nlyte Software, we asked both executives for comments on the unique situation of government agencies in their infrastructure modernization efforts.

Boosting Performance on a Budget

"Many organizations want to extend the life of their current infrastructure as opposed to installing a wholesale upgrade," says D'Arezzo. "This is especially true in government where they are constrained fiercely on budget and need to get more performance out of existing funds. The U.S. government's Federal Information Technology Acquisition Reform Act (FITARA), put in place two years ago, has had a low compliance rate, with only 5 of 24 agencies meeting performance targets. Regulations will increasingly push government centers toward modernization.

"The thing about government -- state, federal, or local -- is that budget is certainly a priority. Very few are flush with money. Also, government institutions have never been on the bleeding edge of technology. Most of the time federal and state agencies are on legacy platforms for longer than expected, making it hard to implement changes with current budgets."

Condusiv helps government agencies get more from their legacy systems, D'Arezzo notes. Government applications on these systems often face compatibility issues, with many still operating in outdated server environments. This creates problems as agencies move into the big data and AI era. Some of these applications may require a wholesale rewrite to support big data and analytics, in addition to meeting new requirements for networks and storage.

"It's important to start with storage infrastructure and the network, looking at I/O in particular," D'Arezzo advises. "To meet the needs of big data, agencies are moving to the cloud, but the government has been slow in adoption due to security issues. As for FITARA, it's important to improve server utilization and go to virtualization. Both of these are areas where our solutions can improve overall performance.

"In general, people are more open to new ideas than in past years. They are looking for performance enhancement solutions. The old default for performance issues was to add more hardware. Now, because of budget and demand for performance, agency CTOs are willing to consider software-based performance enhancement."

Can Government Actually Drive Improvement?

According to Mark Gaydos of Nlyte, "The current environment is demanding that government organizations get more value for the money. This is a perfect environment to adopt machine learning and deep analytics capabilities in order to improve cost-efficiency and harden data center systems."

As to the modernization regulations, Gaydos believes that these pressures are having a positive effect in driving initiatives for improvement. "Many government organizations are leading the way in terms of maximizing the value they get out of their computing infrastructure," he says. "In many ways, the government is leading in terms of driving and measuring the value they receive from their infrastructure investments. The private sector does not have such driving initiatives, and as a result many sectors lag behind in terms of efficiency and cost-effectiveness of their data centers."

Looking at the overall picture of government agency efficiency and consolidation, Gaydos considers the most effective and least expensive steps to upgrade. "Many organizations still use manual methods, such as spreadsheets, to manage their data centers. These organizations would reap significant benefits by adopting software that allows individuals to collaborate, visualize resources, and optimize what they are already doing with existing assets."

Ultimately, every industry and every sector is different in its modernization requirements and the depth of improvement that might be required. "Every industry has a bell curve of participants," Gaydos points out. "Some government and private sector groups are going to lead the way and improve efficiency by using software and processes to optimize their performance. Others will be laggards and will get by with what they are doing today while living with the inefficiencies of doing things the way they always have."

A Final Word

Of course, in the present political environment, government budgets are under even closer scrutiny and efficiency is critical. Agencies are under serious pressure to consolidate and modernize operations. At the same time, they are being urged to take on advanced infrastructures and high-speed, data-intensive processes.

Artificial intelligence and big data can add capabilities to agencies that routinely deal with huge databases. However, it is essential that these systems remain secure and are optimally configured to handle the challenges of new technology and provide the most efficient operation for the data and applications they already have.

 

About the Author

Brian J. Dooley is an author, analyst, and journalist with more than 30 years' experience in analyzing and writing about trends in IT. He has written six books, numerous user manuals, hundreds of reports, and more than 1,000 magazine features. You can contact the author at bjdooley.query@yahoo.com.
