Pulse

By GCN Staff


State seeks info on asset discovery tools

The State Department is looking for vendors who can provide asset discovery tools to track IT equipment and installed software at the agency’s domestic and overseas locations, according to a presolicitation notice on FedBizOpps.

The asset discovery tools will help the department better understand its computer environment and reduce unnecessary license fees and maintenance costs. Among the desired capabilities (a rough per-host inventory sketch follows the list) are:

  • A summary of all computer hardware and software found on the network.
  • A summary of which computers on the network are ready for migration and which already meet the hardware requirements.
  • Automatic identification of distributed software activity to help manage increasingly complex license compliance.
  • A summary of infrequently used software to help reduce unnecessary license fees and maintenance costs.
  • A summary of hardware currently in use and what software is installed on it.
  • Software usage metering.
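
The notice does not say how discovery should be implemented. As a loose illustration of the kind of per-host inventory such tools aggregate, the sketch below collects installed software and basic hardware identity on a single Debian/Ubuntu host; the function names and output format are illustrative assumptions, not part of State's requirements.

```python
# Hypothetical per-host inventory sketch; assumes a Debian/Ubuntu host
# where dpkg-query is available. Not taken from the State Department RFI.
import platform
import subprocess

def installed_packages():
    """Return (name, version) tuples for software installed via dpkg."""
    out = subprocess.run(
        ["dpkg-query", "-W", "-f=${Package}\t${Version}\n"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [tuple(line.split("\t", 1)) for line in out.splitlines() if line]

def host_summary():
    """Summarize hardware identity and installed software for this host."""
    pkgs = installed_packages()
    return {
        "hostname": platform.node(),
        "os": f"{platform.system()} {platform.release()}",
        "machine": platform.machine(),
        "package_count": len(pkgs),
        "packages": pkgs,
    }

if __name__ == "__main__":
    s = host_summary()
    print(f"{s['hostname']}: {s['package_count']} packages on {s['os']} ({s['machine']})")
```

A fleet-wide tool would roll up reports like this from domestic and overseas sites and layer on usage metering, which the sketch omits.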

The State Department said it intends the RFI to be a living, ongoing process until the information obtained meets its needs. At that point, a formal request for quotation will be initiated.

Posted on Jul 01, 2014 at 7:48 AM


Puppet Labs partners to automate enterprise networking

Puppet Labs, a provider of IT automation software, has set up a cooperative program with several leading IT vendors to develop joint solutions to extend the automation of IT systems across the enterprise.

The new program aims to speed provisioning of networking and storage services, components of the enterprise that are usually set up manually and thus often create data center bottlenecks.

In setting up the program, Puppet has allied with network and storage players Arista Networks, Brocade, Cisco, Cumulus Networks, Dell, EMC, F5, Huawei and NetApp.

“This collaboration will allow organizations to deploy software faster and with fewer errors, iterate more quickly and rapidly adapt to fast-changing business needs,” Puppet said.

The company cited Gartner estimates that the system can cut provisioning times for new applications by more than 80 percent, “reducing typical turnaround times from weeks to days.”

Automated provisioning of the services also leads to higher network reliability because “manual provisioning mistakes are significantly reduced,” Puppet said in its announcement of the project.

"Leveraging Puppet Labs’ IT automation solution, Nexus switches automate network provisioning, patch management and configuration tasks,” said Cisco marketing vice president Colin Kincaid. “Automating these manual and error-prone tasks gives DevOps teams the ability to accelerate application delivery.”

Ultimately, “our goal is to extend the reach of automation to every device in the data center, making it easier than ever for organizations to deliver great software quickly and reliably. That's a big competitive advantage in today's technology-driven business environment,” said Luke Kanies, founder and CEO of Puppet Labs.
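
Puppet expresses that desired state in its own declarative manifest language, which the announcement does not reproduce. As a loose illustration of the declare-then-converge model behind this kind of automation (not Puppet code, and not any vendor's real API), the sketch below uses hypothetical Switch and SwitchPort types:

```python
# Rough illustration of declarative, idempotent provisioning: declare the
# desired state of a device, compare it with what is running, and apply
# only the differences. The device "API" here is entirely hypothetical.
from dataclasses import dataclass, field

@dataclass
class SwitchPort:
    name: str
    vlan: int
    enabled: bool = True

@dataclass
class Switch:
    """Stand-in for a real network device; in practice this would be a
    client for the vendor's management interface."""
    ports: dict = field(default_factory=dict)

    def apply(self, port: SwitchPort):
        print(f"configuring {port.name}: vlan={port.vlan} enabled={port.enabled}")
        self.ports[port.name] = port

def converge(device: Switch, desired: list[SwitchPort]):
    """Apply only the ports whose running state differs from the declared state."""
    for want in desired:
        have = device.ports.get(want.name)
        if have != want:          # idempotent: no difference, no action
            device.apply(want)

desired_state = [SwitchPort("Ethernet1", vlan=10), SwitchPort("Ethernet2", vlan=20)]
switch = Switch()
converge(switch, desired_state)   # first run configures both ports
converge(switch, desired_state)   # second run is a no-op
```

Because re-applying an unchanged declaration does nothing, repeated runs cannot introduce the manual provisioning mistakes the announcement cites.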

Posted on Jun 30, 2014 at 11:40 AM


NASA launches challenges using OpenNEX data

NASA is launching two challenges to give the public an opportunity to create innovative ways to use data from the agency’s Earth science satellites.

The open data challenges will use the Open NASA Earth Exchange (OpenNEX), a data and supercomputing platform hosted on Amazon Web Services where users can share knowledge and expertise.

A component of the NASA Earth Exchange, OpenNEX also features a large collection of climate and Earth science satellite data sets, including global land surface images, vegetation conditions, climate observations and climate projections.

“OpenNEX provides the general public with easy access to an integrated Earth science computational and data platform,” said Rama Nemani, principal scientist for the NEX project at NASA's Ames Research Center in Moffett Field, Calif.

“These challenges allow citizen scientists to realize the value of NASA data assets and offer NASA new ideas on how to share and use that data.”

To educate citizen scientists on how the data on OpenNEX can be used, NASA is releasing a series of online video lectures and hands-on lab modules.

The first stage of the challenge offers as much as $10,000 in awards for ideas on novel uses of the data sets. The second stage, beginning in August, will offer between $30,000 and $50,000 for the development of an application or algorithm that promotes climate resilience using the OpenNEX data and builds on ideas from the first stage. NASA will announce the overall challenge winners in December.

OpenNEX is hosted on the Amazon Web Services cloud and available to the public through a Space Act Agreement.
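
For readers who want a head start on the challenges, public Earth-science collections on AWS can typically be browsed with anonymous S3 access. The sketch below uses boto3 that way; the bucket name and prefix are assumptions for illustration, so check the OpenNEX documentation for the actual layout.

```python
# Minimal sketch of browsing a public Earth-science dataset on AWS with
# anonymous (unsigned) access. The bucket and prefix below are assumed
# for illustration, not taken from NASA's announcement.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# List the first few objects under an assumed climate-projection prefix.
resp = s3.list_objects_v2(Bucket="nasanex", Prefix="NEX-DCP30/", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(f"{obj['Key']}  ({obj['Size'] / 1e6:.1f} MB)")
```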

Posted on Jun 25, 2014 at 12:18 PM


Commerce secretary pledges full embrace of open data

“America’s data agency.” That’s how Bruce Andrews, the Commerce Department’s acting deputy secretary, described the business affairs agency before a meeting of the Open Data Roundtable this week.

In a guest blog post on Commerce.gov, Andrews outlined the agency’s mission and vowed to redouble its efforts to put more data, as well as the tools to manage it, into the hands of business and industry.

“Our goal is to unleash even more government data to help business leaders make the best possible decisions, while creating fertile ground for more startups,” Andrews said.

“The best way to do that is to listen to suggestions from those already using our data – and to get the private sector’s guidance on where the federal government can unlock the greatest value in our data sets.”

The Open Data Roundtable was organized by the GovLab at New York University, the Commerce Department and the White House Office of Science and Technology Policy. It is the first of several planned events with federal agencies, including the Agriculture, Labor, Transportation and Treasury departments.

Andrews told the 21 companies convened that the department is working with other federal agencies to improve data interoperability and dissemination.

“We understand the necessity of ensuring that data is easy to find, understand, and access,” Andrews wrote.

“We recognize the urgent need to get this right, and we know that only by listening to the business community, partnering with industry, and collaborating with fellow government agencies, can we best serve our customers and unleash the full power and potential of open data.”

Posted on Jun 24, 2014 at 8:17 AM


NIST forms new cloud working groups

The National Institute of Standards and Technology announced three public working groups to address cloud services, the federated community cloud, and interoperability and portability. The working groups are being formed to address requirements laid out in the Cloud Computing Standards and Technology Roadmap.

The Cloud Services working group will study cloud services and methodologies for determining their properties, so that cloud services can be categorized clearly and consistently.

Besides the cloud computing definition from NIST that categorizes cloud services as software as a service (SaaS), platform as a service (PaaS) and infrastructure as a service (IaaS), dozens of new types of cloud services and acronyms for them have popped up in the marketplace.

The new public working group will use the NIST Cloud Computing Reference Architecture to provide consistent categories of cloud services so buyers know what they are committing to before signing potentially costly, long-term contracts.

The Federated Community Cloud working group will define the term "federated cloud" and develop a path to its implementation. It is charged with developing a framework to support disparate community cloud environments – including those that access internal and external cloud resources from multiple providers.

The Interoperability & Portability for Cloud Computing working group will identify the types of interoperability and portability needed for cloud computing systems, the relationships and interactions between interoperability and portability, and the circumstances in which interoperability and portability are relevant in cloud computing.

Posted on Jun 23, 2014 at 11:51 AM