Recently, I had to do some work on a remote Linux server. Usually, in such cases, I get command-line access to the box through a Secure Shell (SSH) session, using the free PuTTY client for Microsoft Windows.
At the time, however, someone nearby had a MacBook, so I decided to use that machine instead. The nice thing about the newer Macs is that, underneath the snazzy OS X user interface, they are built on the Darwin base operating system, a Unix OS based on the Portable Operating System Interface (POSIX), a set of standards that specify how an implementation of Unix should operate. I could use the SSH client built into the Mac.
Ultimately, I was foiled by the Mac's security features. I found that SSH attempted to log me in as the account owner of the Mac itself, rather than letting me supply my own log-in name and associated password. In effect, I couldn't log on as anyone except the owner of the Mac account, at least by default. Because I didn't have an account on that Mac, and my friend with the Mac didn't have an account on my Linux box, I couldn't log in.
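For those curious about the mechanics: OpenSSH, the SSH implementation that ships with OS X, falls back to the local account name when no remote user is specified. The host name below is hypothetical, purely for illustration.

```shell
# With no user name given, OpenSSH tries the local account name:
ssh linuxbox.example.gov              # attempts to log in as $(whoami)

# A remote user name can be supplied explicitly, in either form:
ssh someuser@linuxbox.example.gov
ssh -l someuser linuxbox.example.gov
```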
Sure, this was a roadblock for me, but I appreciated how SSH was tied directly into the OS on the Mac. This could prevent someone from using the Mac as a launching point for malicious activities. The Windows/PuTTY combo offered no such checks. (Windows' own Telnet client, an older and less secure alternative to SSH, does not supply the local log-in name to the destination.)
While a small example, it nonetheless shows one way that Macs may be more locked down by default, security-wise, than Microsoft Windows.
Are Macs inherently more secure than Windows? We hear this claim both from Apple and from Mac enthusiasts. But is it true?
"We like to think of OS X, both the client and the server, as being, by default, a very secure OS," Apple senior worldwide product manager Eric Zelenka told us in a recent interview. "By default" seems to be the operative phrase here.
Zelenka pointed to the Mac's strict control of user permissions as an example of such security, which I had learned about first-hand in my aborted SSH session. Macs have a fine-grained set of permissions that determine which applications a user can run and which files and directories they can see.
Macs do not, by default, have an enabled root account, the all-powerful account you would use to make whatever changes you want on a computer. In contrast, Windows accounts have historically been full-privilege administrator accounts by default. Of course, an administrator can easily configure a Windows computer to limit which actions a user can execute. But Macs come like that out of the box. They follow the old Unix tradition of restricting users to their own workspaces, keeping them, and any surreptitiously planted programs operating within their accounts, away from the sensitive parts of the OS.
"The system’s default configuration is one of the most important security features provided by Mac OS X," noted an OS X 10.3 security configuration guide posted by the National Security Agency. "The root account comes disabled in Mac OS X. Second, network services are all initially disabled. Third, the initial logging setup is consistent with good security practice."
Another advantage Zelenka pointed out is that the underlying OS, Darwin, is open source. In theory, that means more developers are combing through the source code looking for incorrectly written code, which is a major source of vulnerabilities.
"It is not a closed-source environment where only Apple knows the inner workings of the OS and only Apple can improve it — it is available for the entire world to see," Zelenka said. Moreover, many of the programs and utilities included in the OS package (such as SSH) also come from the open-source community. They have been battle-hardened in the many Unix, Linux and Berkeley Software Distribution deployments out there.
Apple's security guide for OS X 10.5 mentions a number of other advanced security features designed to thwart malicious activity, including the sandboxing of applications within controlled environments, the use of mandatory access controls, and the Keychain service for managing credentials.
But mitigating factors must also be considered. As Laura DiDio, principal at analysis firm Information Technology Intelligence Corp., pointed out, Macs have not been used as widely as Microsoft Windows. Macs have not attracted the attention of either malicious hackers or the more noble-minded security researchers, both of whom wish to make a name for themselves by finding new vulnerabilities in popular software products.
In other words, the reason that we don't see as many vulnerabilities in Macs as in Microsoft Windows is that less attention is being paid to them, not because they are inherently more secure.
This may change as Macs grow more popular. In fact, we are already starting to see this in play. At the upcoming Black Hat D.C. conference, at least one researcher will take aim at Macs. Italian security expert Vincenzo Iozzo promises to show how to have a Mac program execute entirely within the memory space of another program, thereby thwarting any efforts to detect the program through process tracing.
So only as Macs inch more and more into the enterprise will their mettle be truly tested.
Posted on Jan 30, 2009 at 9:39 AM
Earlier this week, GCN covered how an Alabama county district attorney's office is using Apple hardware in the workplace. One factor in the hardware selection was the notion that sleek Macs may spur workers to become more creative than they would be with plain-vanilla PCs, as Apple's "Think different" ad campaign suggested. Or, if the computers are not spurring innovation, at least they are not slowing down workers' creativity.
Many aspects of information technology are easy to quantify and, hence, specify in a request for proposals, such as the price of the machines or how quickly the processors run. Murkier are the psychological benefits or detractions that come with IT products. Should IT managers consider them?
At the Embedded Systems Conference in Boston last fall, influential software developer Joel Spolsky talked about many of the intangibles that come with IT products, and how fuzzy perceptions can end up drawing sharp divisions on how IT is used.
For instance, he looked at the iPhone and the MacBook Air, both of which look totally seamless. "From an engineering perspective, you get the feeling if you accidentally swallowed [an iPhone] it'd go right down," he said. Was it an engineering necessity to remove all the protrusions that break the lines of other laptops and phones? No. But the psychological effect does translate into greater sales and greater perceived satisfaction among users.
"Developers often think of these things as just being lipstick. They are concerned about functionality," he said. And yet it is the lipstick that can determine the success or failure of a technology—and this is as true for enterprise applications as it is for consumer gadgets.
Such subliminal influences can be designed not only into the technology itself, but also into the language used to describe it.
Spolsky spoke about something called "culture code," a term coined by a French anthropologist to describe how people's perceptions of things can be influenced by the language they hear about them. The language creates an "emotional space" that listeners can identify with.
He looked at the culture code around two different programming languages, Ruby and Python. Both languages can be used for the same sorts of Web application development, and each has its own distinct set of users.
"It's hard to imagine that culture code applies to programming languages but it does," he said. "I thought the Ruby users were kind of Emo, while the Python users were kind of nerds, and I didn't know why this was happening because the languages were very, very similar." (Emo is the name of a youth subculture that favors dark clothing and moody, emotional music.)
Part of the reason for these two different perceptions of two similar programming languages may lie in the way each language was introduced to the developer community, Spolsky argued.
Python was created by Guido van Rossum in the late 1980s as an easy-to-use language. Its popularity grew by word of mouth; van Rossum was never one for hyperbole and always stressed Python's practical nature.
Ruby, on the other hand, seemed to benefit from a fair amount of flash. Notably, one online book was critical to its success, "Why's (Poignant) Guide to Ruby" by someone writing under the nom de plume Why the Lucky Stiff. The Web book includes found clip art, cartoons and fancy typography, as well as asides about family matters and other issues not pertinent to the language. Overall, it had an artsy, cutting-edge feel, a vibe that carried over to the language itself for some of its readers.
"The fact that he had these goofy cartoon foxes helped establish the culture," Spolsky said.
Another voice that propelled Ruby was David Heinemeier Hansson, a developer who created the Ruby on Rails Web framework while working for 37signals, a Chicago-based Web application firm. Spolsky pointed out how the language Hansson used to describe Ruby on Rails was heavy with emotionally positive verbiage. He quoted one statement from Hansson about how the Ruby on Rails story is:
"one of beauty, happiness, and motivation. Taking pride and pleasure in your work and in your tools. That story simply isn’t a fad, it’s a trend. A story that allows for words like passion and enthusiasm to be part of the sanctioned vocabulary of developers without the need to make excuses for yourself. Or feel embarrassed about really liking what you do.”
"He is hammering onto Ruby words like 'beauty,' 'happiness,' 'motivation,' 'pride,' 'pleasure,' 'enthusiasm'," Spolsky commented. By associating these words with Ruby, Hansson was also subtly suggesting that words with the opposite meanings (pain, ugliness, apathy) might apply to other programming languages. (And for Java, such words could actually be appropriate, he quipped.)
In any case, Hansson's words started showing up in other people's discussions, talking points and PowerPoint presentations about Ruby, and this, in the long run, influenced how developers thought of Ruby.
"The idea [is] that you can create a culture around a product no matter how technical it is, and get people to fall in love with it, even though it really is syntactic sugar on top," he said. "It is a way to get to be best rather than most suitable."
For IT managers considering which among a competing set of products to purchase, identifying the culture code around each technology could be a useful way to factor out all the corporate marketing that influences user perceptions. On the other hand, if positive perceptions on the part of users actually spur greater productivity, should they be dismissed outright? And that's a question you can ask the closest Mac user.
Posted on Jan 22, 2009 at 12:00 AM
The role of technology on Inauguration Day, like many of the logistical and security issues surrounding the event, got more media attention than usual, even if only as a background story to the day’s primary event.
Much of that attention revolved around the volume of online streaming video traffic that reached unprecedented heights on Inauguration Day. CNN alone reported that as of 6 p.m. Jan. 20, CNN.com Live had served more than 25 million live streams globally—and CNN.com had served more than 158.5 million pages so far that day.
Moreover, CNN.com Live estimates it served more than 1.3 million concurrent live streams at its peak, which occurred immediately before President Obama’s inaugural address. It was forced to limit the number of viewers of its online stream during the speech, but it had prepared for peak demand with a waiting-room strategy, in which viewers were queued in order and fed live video streams as capacity became available, a CNN.com spokesperson said.
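The waiting-room idea can be sketched in a few lines of Python. CNN's actual implementation is not public; this is a toy illustration of the general admission scheme, with a hypothetical capacity and viewer identifiers.

```python
from collections import deque

class WaitingRoom:
    """Toy sketch of a waiting-room admission scheme: viewers queue
    up in arrival order and are admitted as stream slots free up."""

    def __init__(self, capacity):
        self.capacity = capacity   # maximum concurrent streams
        self.watching = set()      # viewers currently being served
        self.queue = deque()       # viewers waiting, in arrival order

    def arrive(self, viewer):
        """Admit the viewer if a slot is free; otherwise queue them."""
        if len(self.watching) < self.capacity:
            self.watching.add(viewer)
            return "streaming"
        self.queue.append(viewer)
        return "waiting"

    def leave(self, viewer):
        """Free the viewer's slot and admit the next queued viewer."""
        self.watching.discard(viewer)
        if self.queue and len(self.watching) < self.capacity:
            self.watching.add(self.queue.popleft())
```

The appeal of the scheme is graceful degradation: rather than refusing connections outright under load, the service keeps order fair and fills every slot the moment it opens.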
But perhaps the more impressive technology story on Inauguration Day was the interactive use of Microsoft software called Photosynth, which gathered thousands of photos of the Inauguration Day crowd, transmitted to CNN via cell phones, and stitched them together into a powerful 3-D interactive mosaic, available online and featured during CNN’s broadcasts.
Photosynth, as many who follow the work of Microsoft Live Labs know, uses two remarkable technical tools in one product: a viewer for downloading and navigating these complex visual spaces and a "synther" for creating them in the first place. The combination makes it possible to position and reconstruct slices of a 3-D world from flat photographs by mapping common image elements.
The technologies stem from Seadragon, a startup Microsoft acquired in 2006 whose technology is “capable of delivering a buttery-smooth experience browsing massive quantities of visual information over the Internet. It is all the detail you want, exactly when you want it, with predictable performance regardless of the amount of data—from megapixels to gigapixels,” according to a Microsoft information page.
The other technology relies on the work of Noah Snavely, Steve Seitz and Richard Szeliski, and a prototype they developed called Photo Tourism. The idea was simple: Given a few dozen or a few hundred photos of a place, is there enough information to reconstruct a 3-D model of it? The advanced computer vision techniques pioneered in pursuit of this goal formed the basis of the synther.
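The core intuition behind the synther, finding common image elements shared across photos, can be illustrated with a deliberately simplified sketch. This is not Photosynth's actual algorithm (which uses sophisticated feature detection and geometric reconstruction); here each photo is just a set of numeric feature identifiers, and photos sharing enough features are linked, forming the graph along which relative camera positions could then be estimated.

```python
def link_photos(photos, min_shared=2):
    """Toy stand-in for the synther's matching step.

    photos: dict mapping photo name -> set of detected feature IDs.
    Returns the list of photo pairs that share at least min_shared
    features, i.e., pairs that could plausibly be positioned
    relative to one another in a reconstruction.
    """
    names = sorted(photos)
    links = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            # Set intersection counts features visible in both photos.
            if len(photos[a] & photos[b]) >= min_shared:
                links.append((a, b))
    return links
```

For example, two crowd photos that both capture the same flagpole and the same podium corner would share those features and be linked; a photo with no overlap stays disconnected until some intermediate photo bridges it.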
The CNN demonstration for Inauguration Day viewers is likely to spark new interest in, and applications of, Photosynth for a variety of government and military work.
Posted on Jan 22, 2009 at 9:39 AM