Data centers have become a hot topic in government lately. At a conference I attended last week, it was standing-room-only for a talk about how to get the most efficiency out of government data centers. And while people do seem to be thinking outside the box, the physical data center itself seems most unlikely to change. Or is it?
We’ve reported on attempts to make efficient data centers that cool their servers using the ice built up on their sidewalks, underground bunkers that remain naturally cool (as well as bomb-proof) and a new data center campus that doesn't even have a roof. But the one thing they all have in common is that they are permanent structures.
But that might change. We’re seeing a new trend toward modular data centers, prefabricated units that can fit onto a flatbed truck, allowing an organization to expand or shrink its capacity as needed.
It’s not like calling for a pizza — systems can still take months to set up — but the concept is pretty reasonable. A trailer is loaded up with data center computers, all configured and equipped with the cooling and power systems the equipment inside needs. When a facility needs to add capacity, the new trailer is driven out to the data center and plugged into the matrix using standard components, bringing the extra capacity online. Removing capacity is just as easy: just decouple the trailer and go.
And an agency or organization saves money by avoiding the construction and engineering costs of building a physical data center.
The military has been using, and refining, this concept for some time, because it needs to rapidly deploy mobile data centers. But it’s still a new concept for most civilian agencies used to having their data centers firmly planted on concrete.
One company that specializes in this new type of mobile, scalable data center, IO, even says that its data centers are better protected than the average data center facility, Network World reports. The units are self-contained and software-defined, so they don't rely on a large industrial infrastructure that could be vulnerable to physical or cyber-based attacks, or simply mechanical breakdowns, Bob Butler, the company’s chief security officer, said in a company video. In March, the Securities and Exchange Commission outsourced the data center services powering its EDGAR database to IO, a move expected to save the agency $18 million in capital expenses.
Mobile data centers also simplify operations, which could improve efficiency. Many of the data center efficiency tips at that recent conference I attended involved the physical infrastructure and making sure that the IT staff and the facility staff work together. With the IO solution, you almost just have an IT staff.
Another possible advantage is that, if needed, an entire data center could simply be moved to a new location, perhaps to get out of the way of a hurricane. Though this is not as easy as simply jumping in a car and driving away, it's at least possible with the modular data centers, whereas a traditional facility is nearly impossible to relocate without months, if not years, of planning.
One of the factors holding back greater use of mobile data centers is simply that the technology is not very mature, and companies that offer it have greatly varying skill levels, FCW reported. But in an era of tight budgets and shifting IT requirements, it seems likely to catch on. Who would have thought that data centers, which remained largely unchanged for the past 10 or 15 years, would suddenly become a hotbed of innovation?
Posted by John Breeden II on Jun 11, 2013 at 9:39 AM | 0 comments
Earlier this week I visited the Xperience Efficiency convention put on by Schneider Electric. Bob Massie gave an excellent presentation about how to make federal data centers more efficient and from the number of people in the packed house taking notes, it’s a hot topic.
I say hot because much of the electric power in a data center is used for cooling. Massie made an excellent point that no matter how much you spend on a new facility, the operating costs are always going to be more. And oddly enough, becoming efficient is more than just dimming the lights and turning back the AC. Doing that might actually make you less efficient in the long run.
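The standard metric for that cooling and power overhead is power usage effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. A minimal sketch with invented load numbers, just to illustrate the arithmetic:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power.

    1.0 is the theoretical ideal (every watt goes to the IT gear);
    facilities with heavy cooling overhead run much higher.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 500 kW of IT load, plus 400 kW of cooling
# and 100 kW of lighting and power-distribution losses.
print(pue(500 + 400 + 100, 500))  # → 2.0
```

In this invented example, half the facility's power goes to overhead rather than computing, which is why cooling dominates the efficiency conversation.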
It was a lot to think about, looking at efficiencies in new ways. For example, one of the biggest causes of downtime, according to Massie, is human error. Downtime generally means that you are still powering your entire facility but not getting any work done. It’s zero efficiency.
And that happens sometimes because employees aren’t trained properly, but other times because they are trying to do things that they have not been trained to tackle at all.
One way to prevent this is to add biometric access controls to equipment or to the facility, instead of just to computers. Schneider’s XB5S Biometric Switch, for example, not only attaches to equipment, but it can be programmed for security control. So an operator might be able to access a machine’s basic functions, whatever they need to get their job done, but may not be able to shut it down or install new tasks.
An engineer or administrator could log into the same machine using his fingerprint and have total access, or at least more access as needed.
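The tiered access described above boils down to mapping each authenticated role to a set of permitted actions. A minimal sketch, with hypothetical role names and actions (not Schneider's actual firmware or API):

```python
# Hypothetical sketch of tiered equipment access control.
# Role names and action names are illustrative assumptions.
PERMISSIONS = {
    "operator": {"view_status", "run_job"},
    "engineer": {"view_status", "run_job", "shutdown", "install_task"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the authenticated role may perform the action."""
    return action in PERMISSIONS.get(role, set())

# An operator can do day-to-day work but cannot power the machine down.
print(is_allowed("operator", "run_job"))   # → True
print(is_allowed("operator", "shutdown"))  # → False
print(is_allowed("engineer", "shutdown"))  # → True
```

Unknown roles fall through to an empty permission set, so anyone whose fingerprint isn't enrolled can do nothing at all.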
Uptime also means having good equipment. A data center can sometimes be thought of as a factory floor, though hopefully a bit cleaner. But there are diesel backup generators and control panels too. It might surprise people to see just how industrial many data centers are.
We might think of everything being controlled by mouse clicks on a computer, but data centers have buttons that perform important functions, such as cutting the power in an emergency, activating backup power or sounding an alarm. Employees tend to be a little rough with them, so to maintain uptime, and therefore efficiency, there is a MIL-STD-compliant line of rugged buttons, called the Harmony line.
Finally, it won’t change your data center’s power usage effectiveness (PUE), but extending that out-of-the-box thinking to vehicles can improve your bottom line as well as help the environment.
Interestingly enough, living on the East Coast, I had never seen an electric charging station for vehicles before, at least not in person. I hear they are all the rage in California, but I doubt there are very many in D.C. I was surprised how easy they were to operate. The plug is shaped like a gas handle, though I suspect that’s mostly for aesthetics.
When you get it close to the “tank” of an electric car, a powerful magnet locks it in place. I’d say it’s probably idiot proof.
Also dealing with vehicles, one of the coolest things I saw at the event was a hybrid motorcycle that looked like a pretty powerful chopper -- except that when moving around at low speeds, it was just about completely silent. That might be effective for tooling around a large government campus, but you don’t get that cool engine revving power sound, which I suspect some people would miss. I would still love to ride one, especially now that I know how to fill up both the gas and the electric tank.
Posted by John Breeden II on Jun 07, 2013 at 9:39 AM | 0 comments
Building websites to work on every computer at every resolution isn't an easy task anymore. It used to be that you could set your Web page to display optimally at 800 by 600 resolution, and 95 percent of the time it would look fine. Today, if your Web page was at that resolution, it would look like something out of the Flintstones.
Sean Herron, a technology strategist for NASA, blogged about running into this problem: he needed the NASA logo to display in the corner of a webpage but didn't know what size to make it. Too small and it would shrink to almost nothing on a high-resolution screen. Too large and it would take up most of the page for someone viewing at a lower resolution. Not only that, but the logo needed to appear on pages across the website, so its fixed size was messing with the overall site design.
The solution Herron found was Scalable Vector Graphics. SVG is an emerging format for Web page design because just a few lines of code create a logo or graphic that scales to whatever size a user's monitor needs. SVG images are typically paired with a PNG fallback so that an older browser that does not support SVG will have something to display in its place.
SVG, an open standard developed by the World Wide Web Consortium, is XML-based, so images can be searched and indexed, and they can even be created in a text editor. W3C has a working group for anyone who needs to learn how to make the best use of the new technology.
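Because SVG is plain XML, a complete image can be written by hand and then searched like any other document. A small sketch using Python's standard library; the circle graphic and its attributes are arbitrary examples, not Herron's actual markup:

```python
import xml.etree.ElementTree as ET

# A complete SVG image, written by hand as plain text.
svg_source = """<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
  <title>Sample logo</title>
  <circle cx="50" cy="50" r="40" fill="steelblue"/>
</svg>"""

root = ET.fromstring(svg_source)
ns = {"svg": "http://www.w3.org/2000/svg"}

# Because the image is XML, its contents can be searched and indexed.
title = root.find("svg:title", ns).text
circle = root.find("svg:circle", ns)
print(title)               # → Sample logo
print(circle.get("fill"))  # → steelblue
```

In practice the same markup is embedded directly in a page or referenced from an image tag, with a PNG supplied as the fallback for older browsers.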
Herron said he came across FC Webicons, “resolution independent” social icons from Fairhead Creative using SVG graphics. He adapted the open-source code to solve his logo problem for the NASA site he was creating.
But he didn't stop there. He created 41 government logos and flags from every sovereign country on the planet using the new format, calling them Gov Webicons, which can be downloaded and used freely by anyone who needs them. All of the government logos and flags created by Herron are open-source and hosted on GitHub, the open-source development site being used by government agencies like the Health and Human Services Department to quickly make websites whose data can be shared freely.
The actual SVGs are elegant in their simplicity, taking up only two lines of code, not much more than simply inserting a normal graphic onto a page. There is a tutorial that explains how it all works for designers looking to implement them or use any of the 41 government logos Herron created.
SVGs are certainly going to be part of the larger picture of how webpages will be designed in the future. It's great to see a government techie like Herron not only already using them, but also helping out other government designers who may be faced with the same problems.
Posted by John Breeden II on Jun 06, 2013 at 9:39 AM | 2 comments
June is upon us, which means hurricanes may not be far behind. And according to the National Oceanic and Atmospheric Administration, this could be a bad year.
NOAA’s Atlantic Hurricane Season Outlook says there is a 70 percent likelihood of between 13 and 20 named storms, of which seven to 11 could become hurricanes with winds of 74 mph or higher. We are looking at the real possibility of between three and six major hurricanes of Category 3 or stronger, with winds of 111 mph or more.
Of course, NOAA doesn't know if any of these possible storms will make landfall even if they do form. Many hurricanes spin around out in the ocean and don't cause any trouble other than forcing ships and aircraft to route around them. But it only takes one good hit to affect thousands of lives and do millions of dollars in property damage. One only needs to look as far back as Hurricane Katrina or Superstorm Sandy to see what can happen. And the die is cast for a bad year.
“This year, oceanic and atmospheric conditions in the Atlantic basin are expected to produce more and stronger hurricanes,” wrote Gerry Bell, lead seasonal hurricane forecaster with NOAA’s Climate Prediction Center, in a briefing about the pending storms. “These conditions include weaker wind shear, warmer Atlantic waters and conducive wind patterns coming from Africa.”
NOAA does a good job these days using technology to predict dangerous storms, but this year it may have its work cut out for it with so much bad weather predicted, which is bad news for those of us living along the East Coast, and especially for those in the South, the typical target zone. Thankfully, NOAA will have some new prediction weapons at its disposal, if it can get them online and working before the storms hit.
Right now, the predictions are pretty good, as evidenced by the extremely accurate modeling of Superstorm Sandy. NOAA predicted the storm's intensity and impact, and projected the time of its landfall to within six hours.
To improve predictions even more, NOAA is working to bring a new supercomputer online in July, before hurricane season reaches its peak. The new system will run an upgraded hurricane weather research and forecasting model that provides significantly enhanced depiction of storm structure and improved storm intensity forecast guidance.
The National Hurricane Center uses several dozen modeling programs combining factors such as historical data and atmospheric information to predict a storm’s path.
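The simplest way to combine several model tracks is a consensus forecast: average each model's predicted position for the same forecast hour. A toy illustration with invented track points; real consensus products select, correct and weight their member models far more carefully:

```python
# Toy consensus forecast: average the (latitude, longitude) position
# each model predicts for the same forecast hour. Points are invented.
model_tracks = {
    "model_a": (25.0, -78.0),
    "model_b": (25.4, -77.2),
    "model_c": (24.8, -77.6),
}

def consensus(tracks: dict) -> tuple:
    """Unweighted mean position across all model tracks."""
    lats = [lat for lat, _ in tracks.values()]
    lons = [lon for _, lon in tracks.values()]
    return sum(lats) / len(lats), sum(lons) / len(lons)

lat, lon = consensus(model_tracks)
print(f"{lat:.2f}N, {abs(lon):.2f}W")  # → 25.07N, 77.60W
```

Averaging tends to cancel out the individual models' errors, which is why consensus tracks are often more reliable than any single member.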
And adding Doppler radar to its fleet of sensor-laden, storm-hunting aircraft could mean a 10 to 15 percent increase in prediction accuracy and speed, according to NOAA.
At the moment, the uncertain element in NOAA’s big picture is its satellites. The GOES 13 satellite that NOAA uses to watch the East Coast is having some problems, and this is not the first time. There are plans to reposition GOES 14 to watch over the hurricane target zone if GOES 13 can't get fully operational before the hurricane season begins in earnest. With a violent season predicted, NOAA needs to make sure that all of its resources are working and in place, if not by June 1, then certainly by August, when the season traditionally heats up. We can't fight hurricanes, but with enough warning people can prepare for them or, if nothing else, get the heck out of the way.
To learn what you can do to prepare for a hurricane, visit the government's storm preparation website at http://www.ready.gov/hurricanes and take its sage advice.
Posted by John Breeden II on May 31, 2013 at 9:39 AM | 3 comments
NASA is investing $125,000 to bring 3D printing somewhere it's never been before: into the kitchen.
We've all seen the replicator from “Star Trek.” You just walk up to it and say something like (in your best Jean-Luc Picard voice) “Tea, Earl Grey, hot.” And presto, it makes a perfect cup for you.
NASA would love to have something like that, but will have to start with a more basic model. That's where Anjan Contractor and his company, Systems and Materials Research Corporation, come into play. They showed NASA a demo of a food-based 3D printer making a chocolate pastry, and that got them a six-month, $125,000 grant to build a complete prototype system.
NASA is interested in this because the substrate used by the printer could be stored as a dry powder. So there might be one tube for the printer filled with sugar, another with protein powder, another with specific dried foods and several with flavoring ingredients. Stored in the right conditions, the raw materials, the stuff the printer needs to make food, might be good for 30 years.
In its proposal summary, NASA says the idea is to “test a complete nutritional system for long duration missions beyond low Earth orbit,” which, theoretically at least, could include an eventual manned mission to Mars. NASA also notes that a successful 3D food printer also could be used by the military, providing optimal nutrition to warfighters while cutting down on logistical challenges and waste.
Printing out a food item on the printer, anything from a hot dog to a chocolate cake, would be a two-step process. First, the powdered fuel would need to be reconstituted with the right amount of water and in the right ratios. Apparently the food can be baked as it's being reconstituted.
Then it would need to be sprayed out of the nozzle in a pattern to shape the food item into whatever it's supposed to look like, or at least into something edible. That second part is almost no different than how other 3D printers work when building a model from a CAD file out of ABS plastic or harder substrates. In this case, you just eat the finished product.
NASA struggles not only with how to keep astronauts alive in space, but also with how to keep them happy. Eating the type of food one might pack for a long campout would get very old after a few weeks. Eating the same thing for years on end might be maddening. So a machine that can prepare different kinds of food and still maintain a nutritional balance has a lot of appeal. In fact, pizza is on NASA’s early menu, because it’s made in layers, which would be conducive to printing. It would depend on the quality of the ingredients, but it would probably be no worse than eating at most cafeterias.
On a large scale, NASA and Contractor envision food printers helping to end hunger around the world. NASA’s summary notes that the world’s population is expected to reach 12 billion by the end of the century, and that effective 3D printing of food, “may avoid food shortage, inflation, starvation, famine and even food wars.”
Contractor told Quartz that if a cheap food base, the tubes of fuel and flavoring, could be loaded up and printed out in homes, it would not only eliminate waste but also go a long way toward stopping hunger. Though it might take some getting used to, the current unchecked population boom might create a real need for something like a food printer.
“I think, and many economists think, that current food systems can’t supply 12 billion people sufficiently,” Contractor told Quartz. “So we eventually have to change our perception of what we see as food.”
Contractor says he will keep the software that drives his food printer open-source — his design uses the RepRap open-source printer — so that people can look at the code, see how the machine works and create their own food recipes.
I don't think Contractor's vision of our food future is all that bad. I'd really like to be able to print out a nice dinner without having to fire up the stove or head out to the store, other than to occasionally refill my food printer's bins. As long as I can create potato chips along with all the healthy fare, I think I'll be happy. Oh, and a cup of hot Earl Grey tea would be lovely too, if that's not too much to ask.
Posted by John Breeden II on May 23, 2013 at 9:39 AM | 5 comments
Leap Motion has released a brief video showing how the Leap controller will work with Windows 7 and, especially, Windows 8. The company expects to launch the controller on July 22 for $80.
The video is fairly short, but it shows how a desktop computer monitor can be turned into something like a touch screen, except of course that the user doesn’t actually touch the screen. Swipes, pinches and other gestures made in front of the screen get the job done, keeping monitors free of smudges from users’ taps and swipes.
Back when I made my 2013 predictions, I said that the market for touch screens at the desktop level would rapidly grow because Windows 8 works best with a touch screen. For people using Windows 8 with traditional monitors, the user experience is a little clunky because the interface and its conventions invite them to touch the screen as if they were working on a tablet, even if they don't have that capability.
Now I may have to adjust that prediction: desktop users may just skip touch screens in favor of gesture control. Leap Motion says that its controller is in many ways better than a touch screen, which, combined with a low price and the proliferation of Windows 8 in government, could make the Leap standard equipment.
"Everything you can do with a touch-based system, like Windows 8, can now be accomplished with Leap Motion technology," said David Holz, co-founder and CTO of Leap Motion. "But this is only the beginning. The potential for our 3D interaction technology is really unleashed by applications built specifically for Leap Motion, helping drive the future of computing." NASA, for example, is considering using Leap Motion to control robots on Mars.
Back when we did our Windows 8 review, we expected there would be a bit of a learning curve for new users. Although a device like the Leap Motion controller won't eliminate that curve altogether, it certainly will flatten it out. Users of Windows 8 on a tablet will have an easier time with the new OS because of their touch screen, but the Leap could put desktop users on equal, if not better, footing.
The Leap controller also works with Mac OS. As with a Windows PC, the controller just plugs into a free USB port. And we are starting to see some computer manufacturers integrate Leap directly into their products.
GCN plans to officially review the Leap Motion controller once it's available, but for now, everything seems to point toward it being a transformative experience when used in conjunction with most modern operating systems.
Posted by John Breeden II on May 22, 2013 at 9:39 AM | 0 comments