The National Capital Region Medical Directorate (NCR MD), part of the Defense Health Agency, is the first organization outside the Army to fully integrate into the General Fund Enterprise Business System (GFEBS).
GFEBS is one of the largest enterprise resource planning (ERP) systems in the world, processing 1 million transactions a day for some 79,000 end users at more than 200 sites worldwide.
Accenture fully deployed GFEBS for the Army in 2012 and recently delivered the solution to NCR MD.
The web-enabled financial, asset and accounting management system standardizes, streamlines and shares accurate, up-to-date financial and accounting data across the Active Army, the Army National Guard and the Army Reserve. It also streamlines business processes by creating a single source for financial, real property, cost management and performance data, as well as a core system of record for the Army General Fund.
The NCR MD exercises control over the largest military health market: seven military and joint-service military treatment facilities, along with related facilities such as family health clinics, in the region. Its focus is to integrate services to provide convenient and accessible healthcare for the area’s military community.
With this integration, NCR MD officials will be able to access and analyze financial data in real time, providing more reliable and accessible data for improved decision making. GFEBS also will make it easier for NCR MD to work more closely with business partners, including the Army and the Defense Finance and Accounting Service.
“GFEBS is a solid foundation to help NCR MD meet the medical needs of our servicemen and women, enhancing support capabilities for our wounded warriors while addressing Congressional mandates for greater financial accountability of tax dollars,” said Joe Chenelle, who leads Accenture’s defense and intel business.
Posted on Nov 12, 2014 at 11:29 AM
Verizon Enterprise Solutions’ cloud-computing platform has received an Authority to Operate (ATO) from the Department of Health and Human Services under the Federal Risk and Authorization Management Program (FedRAMP).
Verizon’s Enterprise Cloud Federal Edition (ECFE) is an Infrastructure as a Service (IaaS) solution, supported by an enterprise-class computing architecture that features virtualization technology from VMware, and compute and network infrastructure from Cisco and NetApp.
ECFE is delivered from cloud-enabled data centers in Culpeper, Va., and Miami, where core infrastructure components are shared by government customers. The service is available in multitenant and dedicated configurations, and addresses the stringent security, reliability and flexibility requirements of federal agencies and their mission-critical workloads, Verizon said.
ECFE gives government customers the ability to provision virtual servers, storage, virtual load balancers and virtual firewalls in deploying their specific applications.
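As an illustration of that self-service provisioning model only, here is a generic IaaS sketch; the endpoint, resource names and fields below are hypothetical and are not Verizon’s actual ECFE API:

```python
import requests

# Hypothetical IaaS control-plane endpoint and token -- illustrative only,
# NOT Verizon's actual ECFE API.
BASE_URL = "https://iaas.example.gov/api/v1"
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

def provision(resource_type: str, spec: dict) -> dict:
    """POST a resource specification and return the created resource."""
    resp = requests.post(f"{BASE_URL}/{resource_type}", json=spec,
                         headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Provision the building blocks the article lists: servers, storage,
# load balancers and firewalls for a specific application.
server = provision("servers", {"name": "app-01", "cpus": 4, "memory_gb": 16})
volume = provision("volumes", {"name": "app-01-data", "size_gb": 500})
lb = provision("load-balancers", {"name": "app-lb", "members": [server["id"]]})
fw = provision("firewalls", {"name": "app-fw",
                             "rules": [{"allow_port": 443, "cidr": "0.0.0.0/0"}]})
```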
“Verizon operates one of the most mature and secure enterprise-class cloud-computing platforms used today by U.S. federal government agencies,” said Michael Maiorana, senior vice president of public sector markets, Verizon Enterprise Solutions.
“We are seeing accelerating interest in cloud computing across our public sector business,” Maiorana added, “and achieving FedRAMP authorization underscores our commitment to providing reliable, flexible and high-performance on-demand computing solutions that enable the business of government.”
With ECFE’s authorization, Verizon becomes the ninth cloud service provider to receive a FedRAMP ATO.
Posted on Nov 10, 2014 at 10:32 AM
Editor's note: This post was changed to correct the likely location of the NSA's quantum cryptology research.
The federal government is concentrating more of its scientific assets in an effort to build a quantum computer, the next stage in computing that promises breakthroughs in medical and scientific research as well as in code-breaking and encryption.
The Commerce Department’s National Institute of Standards and Technology (NIST) and the University of Maryland (UMD) just announced the creation of the Joint Center for Quantum Information and Computer Science (QuICS).
QuICS is being launched with the “support and participation” of the National Security Agency/Central Security Service, according to the announcement. It will also complement quantum research performed at the Joint Quantum Institute (JQI), established in 2006 by UMD, NIST and the NSA.
The center will act as a “venue for groundbreaking basic research to build our capacity for quantum research,” NIST Acting Director Willie May said in announcing the center. Scientists at the center will conduct basic research to understand how quantum systems can best be used to store, transport and process information.
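As a toy illustration of what “storing and processing” quantum information means (a minimal numerical sketch, not QuICS research code), a single qubit and one quantum gate can be simulated with ordinary linear algebra:

```python
import numpy as np

# A qubit's state is a unit vector in C^2; |0> and |1> are the basis states.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measuring collapses the state; outcome probabilities are the squared
# amplitudes (the Born rule) -- a 50/50 split here.
print(np.abs(psi) ** 2)  # -> [0.5 0.5]
```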
The center will also likely further the NSA's interests in pursuing quantum technology in the race to create a computer capable of breaking existing public-key encryption and many forms of web security.
According to documents provided by former NSA contractor Edward Snowden, the effort to build “a cryptologically useful quantum computer” is part of a research program called “Penetrating Hard Targets” that is likely being conducted at the Laboratory for Physical Sciences at UMD.
According to reports on the Snowden leaks, NSA believes it is running even with the European Union and Switzerland in the race for a breakthrough in quantum computing capabilities.
QuICS will bring even more academic and government resources to bear on NSA’s goal. To get there, QuICS researchers will initially examine topics that include:
- Understanding how quantum mechanics informs computation and communication theories.
- Determining what insights computer science can shed on quantum computing.
- Investigating the consequences of quantum information theory for fundamental physics.
- Developing practical applications for theoretical advances in quantum computation and communication.
Creation of the center will enable some of the most experienced researchers in government and academia to pursue these challenges, according to its organizers.
Dianne O'Leary, a computer science professor at UMD, and Jacob Taylor, a NIST physicist, will serve as co-directors of the new center.
“The capabilities of today's embedded and high-performance computer architectures have limited advances in critical areas, such as modeling the physical world, improving sensors and securing communications,” they said in an announcement.
“Quantum computing could enable us to break through some of these barriers.”
UMD and NIST have a history of collaboration, noted UMD President Wallace Loh, who said the new quantum program “will team some of the best minds in physics, computer science and engineering to overcome the limitations of current computing systems.”
Posted on Nov 05, 2014 at 5:55 AM
The House of Representatives is looking for co-located data center space to support its data center operations and the needs of other legislative agencies, including the Library of Congress, the Architect of the Capitol, the U.S. Capitol Police, the Congressional Budget Office, the Government Accountability Office and the Government Printing Office.
The request for proposals for the six-year contract requires the data center to be maintainable per the Uptime Institute’s Tier III or Tier IV specifications, and the contractor must provide 24/7 environmental monitoring of the data center and critical infrastructure.
The contractor must provide each agency with co-location cage assemblies managed with auditable biometric access control. The ability to achieve and sustain NIST 800-series security standards throughout the life of the contract, and to pass an annual House audit against those standards, is also a requirement.
Rather than a local site, the RFP requires the facility to be between 300 and 350 miles from Capitol Hill, more than 100 miles from the coastline and less than 100 miles from a military base. The facility must also be able to support long-term staff during disaster recovery or other events via government-provided on-premises office trailers.
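Those siting rules reduce to simple distance checks; as a minimal sketch (the coordinates and candidate site below are hypothetical), a great-circle screening test might look like:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in statute miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # mean Earth radius ~3,958.8 mi

CAPITOL = (38.8899, -77.0091)  # U.S. Capitol, Washington, D.C.

def meets_rfp_siting(site, coastline_mi, military_base_mi):
    """Apply the RFP's rules: 300-350 mi from Capitol Hill, more than
    100 mi from the coastline, less than 100 mi from a military base."""
    d = haversine_miles(*CAPITOL, *site)
    return 300 <= d <= 350 and coastline_mi > 100 and military_base_mi < 100

# Hypothetical inland candidate roughly 330 miles from Capitol Hill, with
# surveyed distances to the nearest coastline and military base.
print(meets_rfp_siting((39.96, -82.99), coastline_mi=400, military_base_mi=70))
```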
Proposals are due Nov. 25, 2014.
Posted on Nov 04, 2014 at 1:45 PM
The National Oceanic and Atmospheric Administration has developed a new online visualization and mapping tool designed to help communities along the Great Lakes plan for changes in water levels associated with climate change.
The Lake Level Viewer uses high-resolution elevation data that lets users accurately visualize water levels ranging from zero to six feet above and below average lake level. Users can view elevation models, determine water depths at specific locations, examine data confidence and view economic impacts.
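Conceptually, that depth calculation is a simple raster operation; as a minimal sketch (the elevation grid below is synthetic, not NOAA's data), comparing an elevation model against a lake-level offset yields per-cell water depths:

```python
import numpy as np

# Synthetic elevation grid in feet relative to the long-term average lake
# level (negative cells sit below the average water surface).
elevation = np.array([[-4.0, -1.5, 0.5],
                      [-2.0,  1.0, 3.0],
                      [ 0.0,  2.5, 6.0]])

def water_depth(elevation_ft, lake_level_offset_ft):
    """Water depth over each cell for a lake-level offset in [-6, +6] feet."""
    depth = lake_level_offset_ft - elevation_ft
    return np.where(depth > 0.0, depth, 0.0)  # dry cells get zero depth

# A scenario three feet above the average lake level.
print(water_depth(elevation, +3.0))
```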
The tool was developed by the NOAA Office for Coastal Management as part of its Digital Coast initiative.
“The Lake Level Viewer provides planners and decision makers with visual lake level scenarios for rise and drop information before it happens,” said Jim Schwab, a certified planner and the manager of the Hazards Planning Center for the American Planning Association.
“Lake level scenarios can be incorporated into land use decisions, along with economic, social and environmental considerations, to make wise investments in public infrastructure and develop livable, resilient communities,” he added.
More than 4,900 miles of U.S. shoreline ring the Great Lakes, of which 3,800 miles are currently mapped in the Lake Level Viewer, according to NOAA. The tool covers areas in Illinois, Indiana, Michigan, Minnesota, New York, Ohio, Pennsylvania and Wisconsin.
Great Lakes water levels are continuously monitored by U.S. and Canadian agencies in the region through a binational partnership. The annual rise and fall cycle of the Lakes’ water levels can be seen online for particular time periods beginning in 1917 via the Great Lakes Water Level Dashboard.
“In light of rapidly changing water levels, it is even more important to have a tool like the viewer to help communities visualize and plan for scenarios,” NOAA said in announcing the tool.
Posted on Nov 03, 2014 at 12:56 PM