5 technologies that will change the market

What you need to know to survive the disruptions ahead


If you measured the state of the economy by the number of new and potentially disruptive technologies on the market, you would think we were in an economic boom.

Despite the worst economic downturn since the Great Depression, the flow of disruptive technologies to market has held steady, if not increased, during the past couple of years.

In large part, that trend is fueled by side effects of the terrible economy: increased competition, decreased supply and a stronger need to do more with less.

The new trend is focused on creating tools and services that challenge, rethink and dramatically change the way that we use existing technologies. The goal is to provide convenience, security and improved universal service to customers. And make no mistake, some of the products coming out of these new technologies are disruptive and truly innovative, while others are fads.

Now more than ever it requires a symphony of technologies marching in rhythm to make a technology certifiably disruptive. The timing, economy, infrastructure, price and need all must be there at the same time.

Another growing trend with disruptive technologies is the need for those technologies to be familiar to consumers. The factors that led Apple Computer Inc. to change the way we listen to music were not new ideas. Apple was far from the first company to create a portable digital music player, nor was it the first to create software that aggregates your music into a library.

What Apple did was disrupt the industry by focusing on the software to make it easy, free and downloadable and then providing an easy, clean and idiot-proof hardware component to match. The result was a paradigm shift in the way we use all forms of media.

The success of a disruptive technology product often depends on the same characteristic that is vital to successful systems integration. The products that can break down the silos that separate disruptive technologies and produce a fully integrated experience will survive to become a sustaining technology and mainstream product.

We’ve examined the following five disruptive technologies to explore upcoming products and trends so you can better prepare for the changes that are coming in the next couple of years. To no surprise, the products to look out for take advantage of multiple technologies to offer users solutions rather than tools.

NEXT: Data, data everywhere


1. Mobile

The mobile market is a personal favorite of mine to study because the evolution is true disruption: from carrying a beeper, a roll of quarters for pay phones, an address book, a good paperback, a wallet and a watch to carrying a single device that stores your contacts, lets you read and buy books, makes and receives calls, and handles point-of-sale transactions while playing music.

In the coming years, you can expect the consolidation trend in mobile computing to accelerate. But instead of reducing the number of items we carry, the focus is, and will continue to be, on centralizing services. Apple and Google started this trend with apps for their mobile platforms, and retail, financial, insurance, medical and government entities are all moving toward those types of mobile platform services.

The second large trend in mobile will be the way we communicate with those services. The advent of two-way, high-resolution video, mixed with more sophisticated data compression, intelligent video/call routing and robust wireless bandwidth, is creating a new dynamic in the way we communicate with one another and our service providers.

It’s a marriage of disruptive collaboration technology, such as videoconferencing, with our day-to-day activities, one that lets you physically see a person instead of just hearing them. In the next couple of years, you’ll be able to show an injury to a triage nurse at a local clinic for advice before making an appointment.

A chief beneficiary in the government market will be teleworkers. Telework will become more popular as it becomes as easy and inexpensive to communicate with someone across the country as with someone down the hall.

According to a 2006 CDW Government Inc. study on telework, 41 percent of federal employees indicated they telecommute, up from 19 percent at the same time in 2005. Information technology professionals have significantly expanded technical support for telework programs, and 32 percent of IT professionals indicated that their agency had started or expanded such a program since 2006.

In recent years, operational savings coupled with advances in mobile computing and collaboration technology have strengthened the telework trend, which I expect to make telework the most common work environment in the next five years.

The two main problems with disruptive mobile technology are security and the limits of data mining. It’s nice that I can have a virtual face-to-face conversation with my boss and forward him an e-mail message he requested while we meet. But it becomes a moot point if I can’t find that message among the gigabytes of data I must sift through first. Even now, I know colleagues who often prefer to re-create data rather than hunt for the original because the original is buried under mountains of old documents, multiple versions of files and gigabytes of e-mail messages.
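That needle-in-a-haystack problem is exactly what search tools solve with an inverted index: instead of re-reading every message for every query, they map each word to the messages that contain it. Here is a minimal, hypothetical sketch of the idea (the messages and names are invented for illustration):

```python
from collections import defaultdict

def build_index(documents):
    """Map each word to the set of document IDs that contain it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return the IDs of documents containing every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = set(index.get(words[0], set()))
    for word in words[1:]:
        results &= index.get(word, set())  # keep only docs with all words
    return results

# Toy mailbox standing in for gigabytes of messages.
mail = {
    "msg1": "budget forecast for the telework program",
    "msg2": "telework policy update from the boss",
    "msg3": "lunch menu for friday",
}
index = build_index(mail)
print(search(index, "telework program"))  # {'msg1'}
```

The index is built once, so each query touches only the handful of word lists it needs rather than the whole archive — the reason desktop search can answer in milliseconds.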

Part of the security challenge comes from attackers’ increasing ability to snoop on transmitted data packets and decrypt information. The rates of data theft and integrity breaches will rise in the coming years. And the onus isn’t just on mobile device manufacturers to include proper security software; service providers and users also need to take the proper precautions to protect data integrity.
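One standard precaution — not tied to any particular product mentioned here — is a message authentication code, which lets a receiver detect whether a packet was altered in transit. A minimal sketch with Python’s standard library, assuming a hypothetical pre-shared key:

```python
import hashlib
import hmac

SECRET_KEY = b"shared-secret"  # hypothetical key both parties hold

def sign(message: bytes) -> str:
    """Compute an HMAC-SHA256 tag for the message."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Check the tag; compare_digest resists timing attacks."""
    return hmac.compare_digest(sign(message), tag)

packet = b"transfer $100 to account 42"
tag = sign(packet)

print(verify(packet, tag))                        # True: untampered
print(verify(b"transfer $900 to account 7", tag)) # False: tampered
```

An eavesdropper who modifies the packet cannot forge a matching tag without the key, so tampering is caught on arrival — though the payload still needs encryption if it must also stay confidential.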

I see biometrics making a strong comeback in the coming years as people who are sick of remembering passwords swipe a finger or scan an iris to authenticate their identity.

NEXT: The coming search revolution



2. and 3. Search and the Semantic Web

The relationship between search technology and the Semantic Web is a perfect illustration of how a small sustaining technology, such as a basic search feature on an operating system, will eventually be eaten up by a larger disruptive technology, such as the Semantic Web. The Semantic Web has the potential to act like a red giant star, expanding at an exponential rate and swallowing whole planets of existing technology in the process.

The technology started as a simple group of secure, trusted, linked data stores. Now Semantic Web technologies let people create data stores on the Web, build vocabularies for them and write rules for handling the data. Because all the data is trusted by definition, security is often less of a problem.
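The core data model behind those stores is the triple: every fact is a (subject, predicate, object) statement, and rules derive new facts from existing ones. The sketch below illustrates the idea in plain Python rather than real RDF tooling, with invented names throughout:

```python
# Facts as (subject, predicate, object) triples.
triples = {
    ("NIH", "is_a", "agency"),
    ("NIH", "publishes", "clinical_guidelines"),
    ("clinical_guidelines", "is_a", "document"),
}

def query(s=None, p=None, o=None):
    """Match triples, treating None as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Rule: anything published by an agency is an official document.
for subj, _, obj in query(p="publishes"):
    if (subj, "is_a", "agency") in triples:
        triples.add((obj, "is_a", "official_document"))

# The machine now "knows" a fact no one stated directly.
print(("clinical_guidelines", "is_a", "official_document") in triples)  # True
```

Real Semantic Web stacks do the same thing at Web scale with RDF, shared vocabularies and standardized query languages, but the mechanic — explicit facts plus rules that infer new ones — is the same.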

The task of turning the World Wide Web into a giant dynamic database is causing a shift among traditional search engines because products such as Apture, by Apture Inc. of San Francisco, Calif., let content publishers include pop-up definitions, images or data whenever a user hovers over a word on a Web site. The ability to categorize content in this manner could have significant implications not only for Web searches but also for corporate intranets and your desktop PC.

These types of products will continue to expand, initially in the publishing industry and then to most industries on the Web in the next two to three years.

For example, human resources sites could use them to pop up a picture and a résumé summary when a recruiter mouses over an applicant's name. Medical and financial sites, such as the National Institutes of Health's, could use them to break down jargon and help with site exploration.

In the next three to five years, I would expect desktop operating system search engines to operate in a way that is similar to the Semantic Web and even have their search engines connect to the Web for an expanded search.

Government sites around the world, such as Zaragoza, Spain's, and medical facilities, such as the Cleveland Clinic, are using the Semantic Web's vocabulary features to create search engines that reach across complex jargon and technology silos to offer a high degree of automation, full integration with external systems and various terminologies, and the ability to accurately answer users' queries.
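The vocabulary trick that makes those search engines work can be sketched simply: map jargon, abbreviations and plain-language synonyms to one canonical term before matching, so a query in everyday language finds documents tagged in specialist terminology. All terms and documents below are invented for the example:

```python
# Hypothetical vocabulary mapping synonyms to one canonical term.
VOCABULARY = {
    "heart attack": "myocardial_infarction",
    "mi": "myocardial_infarction",
    "myocardial infarction": "myocardial_infarction",
}

# Documents tagged with canonical terms, as a specialist site might be.
documents = {
    "doc1": ["myocardial_infarction", "treatment"],
    "doc2": ["influenza", "vaccine"],
}

def canonical(term: str) -> str:
    """Normalize a query term via the vocabulary; pass unknowns through."""
    return VOCABULARY.get(term.lower(), term.lower())

def vocab_search(query: str):
    term = canonical(query)
    return [doc for doc, tags in documents.items() if term in tags]

print(vocab_search("heart attack"))  # ['doc1']
print(vocab_search("MI"))            # ['doc1']
```

A plain keyword engine would return nothing for "heart attack" against a corpus tagged with clinical terms; routing every query through a shared vocabulary is what lets one search reach across jargon silos.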

Despite my excitement about the possibilities of turning a large body of data into an integrated and dynamic database, there are some downsides. During the past three years, the economy has slowed the development of Semantic Web products and services. And this disruptive technology is like any open-source initiative: It requires the public’s help to develop it into something useful, functional and truly disruptive.

There is some danger in using this technology to commoditize data and push content to users. The Semantic Web would essentially allow a large retailer to track your interests, purchasing patterns and behavior to increase sales. That could be a slippery slope to privacy invasion.

The enormous amount of data on the Internet has made it increasingly difficult to find, access and maintain the information that most users require. The Semantic Web can solve that problem. That fact is enough to justify the Semantic Web as a necessary disruptive technology that will spawn both useful and wasteful products.

NEXT: What happens when cloud computing takes off?


4. and 5. Virtualization and cloud computing

Virtualization and cloud computing are tied together in much the same way as search functions and the Semantic Web.

Search functions are as likely to be absorbed into the Semantic Web as virtualization is to act as a steppingstone to cloud computing.

Whether it happens at the operating system level or via the virtual machine or paravirtual machine model, virtualization is the process of turning one physical machine into a domain of several independent virtual machines.

Cloud computing, which is essentially the outsourcing of IT activities to an Internet provider, would let IT managers erase maintenance, licensing and training from their list of costs while improving the speed with which they can deliver their products or services.

Thomas Bittman, vice president and distinguished analyst at Gartner, said it can take an average of four to six weeks to deploy a server. A virtual server cuts that time by a factor of 30, and a cloud server network could be even faster.

However, despite those benefits, I don’t see all major enterprises moving toward a cloud, especially during the next five years.

Most government agencies, financial institutions and some areas of medical services might never buy into true cloud computing because, at the end of the day, they need to know that all of their data in Richmond, Va., or Toledo, Ohio, is resting comfortably in a secure location that they can access at any time.

Despite that concern, many of those industries will eventually migrate to some sort of private cloud in which they retain control of the logistics and mechanics behind their data.

If working with outsourcing has taught me anything, it’s that latency and outages will always be problems. Suppose your enterprise loses network connectivity or the cloud provider loses power: You’re in deep trouble and left with little to no control over the fix. That creates a need for redundancy, which erodes the cost benefits of a cloud.

Virtualization isn’t immune to drawbacks either. Just like cloud computing, virtualization carries a single point of failure. If that one machine crashes, you could lose hundreds of services and thousands of clients. Additionally, virtualization is power hungry and sometimes leads to slower performance.

However, companies such as Red Hat are investing heavily in operating system virtualization and have made huge strides in mitigating some of virtualization's risks. For example, Red Hat Enterprise Linux 6 will be more efficient and autonomic and offer a more secure, independent service.

NEXT: What have we learned? 


Lessons Learned

In investments, a diversified portfolio is usually the least risky solution. Technology is no different. To weather the disruption of game-changing technologies, the best approach is to take small, diversified steps.

With mobile technology, the movement toward collaborative integration, such as videoconferencing, is inevitable but should be counterbalanced with the knowledge that the technology will be plagued by latency for the next two to three years.

Search functions online and on your PC will continue to strengthen, but so should your patience, because the amount of available data to sift through is growing exponentially. 

Enterprises should ignore the hype of cloud computing and virtualization and treat those new technologies like any other new advancement: Use them in small, strategic steps to make sure the technologies work for the enterprises' needs and purposes — and have a backup plan and proper procedures in place to deal with potential drawbacks.

Reader Comments

Tue, Aug 17, 2010 Herr Swiss

The future has been created a while back in 2002. "The mobile unit may be used alone or as a plug-in to another device such as a cell phone; An autonomous and portable smartcard reader device incorporates a high level of embedded security countermeasures. Data transfers are encrypted with two specific input devices, namely a light sensor and PIN or other keyboard entry, and at the output through the use of a dual-tone encoder-decoder. The unit may be used alone or as a plug-in to another device such as a PDA, cell phone, or remote control. The reader may further be coupled to various biometric or plug-in devices to achieve at least five levels of authentication, namely, (1) the smartcard itself; (2) the smartcard reader; (2) the PIN; (3) private-key cryptography (PKI); and (5) the (optional) biometric device. These five levels account for an extremely strong authentication applicable to public networking on public/private computers, and even on TV (satellite, cable, DVD, CD AUDIO, software applications. Transactions including payments may be carried out without any risk of communication tampering, authentication misconduct or identity theft. In essence, the device is a closed box with only two communication ports. The emulation of the device is therefore extremely complex due to the fact that it involves PKI, hardware serialization for communication and software implementation, in conjunction with a specific hardware embodiment and service usage infrastructure component that returns a response necessary for each unique transaction."

Mon, Aug 9, 2010 Tom Folkes DC

You are so clueless with respect to the semantic web it boggles the mind. I have been working on this problem for about 35 years. When I was at the University of Maryland we referred to it as context. The current design of the semantic web may just be the thing which crashes our civilization. Guys like you have a) not analyzed the human overhead b) have not been paying attention to how Tim B-L keeps altering the target. I found this article using a semantic search tool which I built. www.alexlib.info

Tue, Aug 3, 2010 Dan Washington, DC

RE: "Virtualization isn’t immune to drawbacks either. Just like cloud computing, virtualization carries a single point of failure. If that one machine crashes, you could lose hundreds of services and thousands of clients." Carlos: One of the strengths of virtualization is that if you lose a server hosting multiple services/apps, you can simply provision from Gold disks and fail over to another server, bringing services back up as fast as you can boot up a new server...in minutes. You can also set up a "hot-hot" configuration with high availability (HA) to minimize the possibility of failure. Additionally, multiple physical sites can share the load for added DR assurance, including a mix of private cloud and public cloud. Also, a failed server doesn't have to mean that "thousands" of clients are out of service either. Properly architected, a system will also have a backup for user profiles of virtualized desktops/clients that only then need to connect to a newly provisioned server. In a cloud-based configuration, this is for all practical purposes infinitely scalable and rapidly deployable in the case of either a failure of servers or DR.

Tue, Aug 3, 2010 Srideep Mitra Colorado

Dear Carlos, Great insights into the technologies that are changing the world. However, I should add that one of the underlying technologies common across these 5 technologies is "testing" of these technologies. In the commercial world, it is estimated that between 30% and 50% of the total cost of the introduction of a software application is linked to its testing. In the defense, aerospace and civil aviation industries the significance or consequence of failure is much higher and consequently testing is necessarily more rigorous and can account for up to 75% of costs. I hope we will see more interesting stuff from you and team Washington Technology. Thanks sri

