Federal Agencies Try to Cope with the Challenges Posed by Big Data: Report
FREMONT, CA: Unisys, a global IT services provider, in its recent survey has highlighted the state of Big Data and analytics programs in the federal government, reports Jessica Davis writing for informationweek.com.
The survey shows that 60 percent of agencies are using Big Data to reduce costs, including capital and operating expenses. One such use case is finding inadvertently incorrect invoices or payments: root-cause analysis can then determine what caused the problem and correct the flaws in the system.
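One simple way an analytics program might surface incorrect payments is by scanning payment records for duplicates. The sketch below is purely illustrative; the field names (`vendor`, `invoice_id`, `amount`) are assumptions, not drawn from any specific agency system.

```python
# Hypothetical sketch: flagging duplicate payments in invoice records.
# Field names are illustrative assumptions only.
from collections import Counter

def find_duplicate_payments(payments):
    """Return (vendor, invoice_id) pairs that were paid more than once."""
    counts = Counter((p["vendor"], p["invoice_id"]) for p in payments)
    return [key for key, n in counts.items() if n > 1]

payments = [
    {"vendor": "Acme", "invoice_id": "INV-001", "amount": 1200.00},
    {"vendor": "Acme", "invoice_id": "INV-001", "amount": 1200.00},  # duplicate
    {"vendor": "Beta", "invoice_id": "INV-042", "amount": 350.00},
]
print(find_duplicate_payments(payments))  # the Acme invoice is flagged
```

Once a duplicate is flagged, a root-cause analysis can trace the record back through the payment pipeline to find where the error entered the system.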
According to the survey, 55 percent of agencies are using big data to improve their IT security. Analytics can identify threats by automating the detection of inconsistencies in machine data, giving agencies accurate information about an attack.
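A minimal sketch of automated inconsistency detection in machine data might look like the following, using a simple standard-deviation test on hourly event counts. The data, field meaning, and threshold are illustrative assumptions, not the method described in the survey.

```python
# Hypothetical sketch: flagging anomalous spikes in machine data
# (e.g., failed-login counts per hour) via a z-score test.
import statistics

def flag_anomalies(counts, threshold=2.0):
    """Return indices of counts more than `threshold` std devs from the mean."""
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:  # all values identical: nothing stands out
        return []
    return [i for i, c in enumerate(counts) if abs(c - mean) / stdev > threshold]

hourly_failed_logins = [4, 5, 3, 6, 4, 5, 120, 5]  # hour 6 is a spike
print(flag_anomalies(hourly_failed_logins))  # → [6]
```

Real deployments would use far richer features and models, but the principle is the same: establish a baseline from machine data, then surface deviations for analysts to investigate.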
The problem, however, seems to be extracting the most out of big data analytics: some agencies have not yet realized its true potential, while the rest feel that making the most of big data would be an expensive affair, as pointed out by the survey.
Lack of staff resources is one of the fundamental problems the agencies face. One in three government officials surveyed said it was difficult to find experts with the necessary experience, but 68 percent said that their agencies were hiring more data analysts. Moreover, 73 percent of respondents said they were concerned about the strain such projects put on their existing IT storage, computing, and networking infrastructure. To cope with these challenges, the federal agencies plan to either maintain or increase their use of outside consultants on these projects, informs Davis.