Ray Kane


The way we were: Cloud's roots in the '60s

From time sharing to cloud services: evolution or revolution?

Let's peer into the past for a minute. Time sharing as we knew it began in the early 1960s. In Phoenix, Ariz., General Electric had a central processing unit, the GE 225, and a control or switching unit, the Datanet 30. Each unit had 16K of core memory (yes, 16). Strapped together, they could support up to 40 simultaneous users.

This resource did not become commercially available until John Kemeny and Thomas Kurtz of Dartmouth College developed the BASIC language. Just as VisiCalc and Excel later did for the PC, BASIC let ordinary users put the system to productive work. The project was a huge success, and in 1963 GE opened information processing centers across the country.

In time, Fortran and ALGOL were offered alongside BASIC. Hundreds of library programs (sets of market-specific subroutines) were developed. The network, simultaneous usage, library programs, languages and pricing "by the drink" seem like precursors to the cloud.


Competitors soon joined the party, but until 1970, when GE sold its computer business to Honeywell, GE dominated market share domestically and internationally, with 14 international businesses facing little or no competition.

Just as in early time-sharing, cloud security and pricing clearly still need to be resolved to achieve customer acceptance. Today, cloud security is the most significant problem, with equitable pricing a close second.

The security issues in the 1960s were mostly a concern of corporations not wanting information shared across the company. Discussions centered on keeping corporate information on the 225 segmented to inhibit leaks within the internal organization. Procter & Gamble's Duz and Tide teams, for example, wanted assurance that neither had access to the other's data. One solution, spreading customer groups across geographically dispersed centers, partially solved the security problem. Today, of course, cloud customers need more sophisticated security solutions and must be assured that their information will be secure both internally and externally.

Pricing shared resources

In the pricing arena, GE began with an hourly connect charge plus a $100 minimum monthly charge per terminal. This scheme did not differentiate among beginning or casual customers, intermediate customers and the more knowledgeable customers who could gobble up system resources.

In 1968 GE introduced the Computer Resource Unit (CRU), which attempted to level the playing field. By adding CRU charges for scarce resources, GE priced Fortran, storage, input/output and library programs at a premium, allowing the dials to be adjusted so charges tracked system usage more fairly. Competitors joined in with Computer Utilization Units (CUUs) and other pricing schemes to ensure users paid their fair share.
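To make the mechanics concrete, here is a minimal sketch of how CRU-style metered billing might be computed. All rates, resource names and the CRU price here are hypothetical illustrations, not GE's actual 1968 tariff; only the hourly connect charge plus $100 monthly minimum structure comes from the article.

```python
# Hypothetical CRU-style billing sketch. Each scarce resource accrues
# Computer Resource Units; the bill is connect-time charges plus CRU
# charges, floored at the monthly minimum per terminal.

HYPOTHETICAL_CRU_RATES = {
    "cpu_seconds": 0.5,     # CRUs per CPU-second (illustrative)
    "storage_kchars": 0.1,  # CRUs per thousand characters stored
    "io_operations": 0.02,  # CRUs per input/output operation
    "library_calls": 1.0,   # CRUs per premium library-program call
}

def monthly_bill(connect_hours, usage, hourly_rate=10.0,
                 cru_price=0.25, monthly_minimum=100.0):
    """Return one terminal's monthly charge in dollars.

    usage maps resource names to quantities consumed this month.
    """
    crus = sum(HYPOTHETICAL_CRU_RATES[r] * qty for r, qty in usage.items())
    total = connect_hours * hourly_rate + crus * cru_price
    return max(total, monthly_minimum)  # never below the monthly minimum

# A casual user pays only the $100 minimum ...
light = monthly_bill(2, {"cpu_seconds": 30, "io_operations": 100})
# ... while a resource hog pays in proportion to what it consumes.
heavy = monthly_bill(40, {"cpu_seconds": 5000, "storage_kchars": 2000,
                          "io_operations": 10000, "library_calls": 50})
print(light, heavy)  # 100.0 1137.5
```

The point of the design, then as now, is that a flat connect charge alone cannot distinguish the casual user from the one consuming scarce shared resources; per-resource metering does.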

The new pricing, along with a push into market segments ranging from manufacturing and petrochemicals to banking and finance, scientific computing and electric utilities, brought in profitable cash generation.

The time-sharing experience raises many questions that are immediately relevant to today's rapidly developing market for cloud computing services.

Resources are plentiful, but will pricing be put in place to achieve equity for all users? I can envision many resource hogs, such as Monte Carlo simulations and other computationally intensive routines, complete system modeling and capacity studies, being sent "to the cloud" to keep corporate resource costs down.

Will the cloud be the place to send large, one-time requirements? Will we see contention for bandwidth and resources? Will CRU-like pricing be used to charge for usage? Will the cloud be segmented to address specific markets? And will the market ultimately move predominantly to "private" clouds, eventually brought in-house just as time sharing was?

The initial reaction that cloud is an idea du jour has changed. When federal agencies embrace a single cloud e-mail service, a gigantic step in agency collaboration will occur. What's next: social media? Time and attendance? Other candidates?

A backlash could come from in-house organizations currently providing these and other candidate services. Any move to the cloud will be thoroughly vetted by the current suppliers, with security the main issue.

To date, we have had very little dialogue or information on pricing. As competitive offerings hit the marketplace, the ability to review pricing in a standard, consistent way will evolve in the pursuit of best value.

Meanwhile, industry players are offering their opening gambits. A few private sector firms are positioning themselves to gain market share and the federal government itself could be a major player.

Trusting others with your information began with time sharing, and now we have the cloud. Is it an evolution or a revolution? I vote for the former. What a great time to be on the sidelines observing the future of what was once just an "idea du jour."

About the Author

Ray Kane is a consultant who has worked in the federal market for more than 40 years.

