Showing posts with label cloud computing. Show all posts

Saturday, March 26, 2011

Trust and security in the cloud

The Register has published a new 16-page whitepaper on trust and security in cloud computing, with the key findings being:

  • Many companies could do much better when it comes to in-house security
  • SaaS adoption is limited currently, but there is increasing interest from the business
  • The biggest impediment to SaaS adoption is a perception of security issues
  • Companies with experience of SaaS are positive about provider security
  • SaaS is likely to help with shortcomings of on-premise security capabilities

The whitepaper was written from data gathered in an online survey with over 500 participants. Amongst the many tabulated responses to the survey there is an interesting list of the ways data can exit from corporate boundaries.


Saturday, August 28, 2010

Searching an Encrypted Cloud

There is a post over at the Enterprise Search blog with some pointers to encrypted search, including my own overview. There is a link to a whitepaper from Seny Kamara and Kristin Lauter of the Microsoft Research Cryptography Group, proposing an architecture for a virtual private storage service which supports the following properties:

  • confidentiality
  • integrity
  • non-repudiation
  • availability
  • reliability
  • efficient retrieval
  • data sharing

Tuesday, May 18, 2010

Two universities rethink Gmail migration plans

The University of California at Davis (UCD) and Yale University were considering moving their email systems onto Gmail, but both have put those plans on hold for the moment. The CIO of UCD, Peter Siegel, said that he was not prepared to risk the security or privacy of the school’s 30,000 faculty and staff.

Yale has delayed a more general migration to Google apps, including Gmail, citing security and privacy concerns over cloud-based management of their data. Michael Fischer, a computing professor, said that
Google stores every piece of data in three centers randomly chosen from the many it operates worldwide in order to guard the company’s ability to recover lost information — but that also makes the data subject to the vagaries of foreign laws and governments, Fischer said. He added that Google was not willing to provide ITS with a list of countries to which the University’s data could be sent, but only a list of about 15 countries to which the data would not be sent.
So there is a concern that the personal data of students and faculty is being stored outside US jurisdiction. However, neither UCD nor Yale has ruled out migrating to Google cloud applications once there is adequate transparency for the protection of data.

Wednesday, May 5, 2010

The power limit of Cloud Computing

In February I posted on Lew’s law, a prediction by former SUN CTO Lew Tucker stating that IT expenses will increasingly track the cost of electricity. Tucker gave a keynote presentation on The Ultimate Cost of Computing at the recent Cloud Connect conference, where he offered more insight into his views on the evolving cost model for cloud computing.

Tucker begins by stating that the driving forces of cloud computing are technology and the market, symbolised by Gordon Moore and Adam Smith (who gave us the metaphor of the invisible hand of the market). He shows that Moore’s law, the doubling of computing power every 1 to 2 years, continues to be achieved by the microprocessor industry as a whole, with computing power increasing by a factor of one million over the last 40 years.


In the last few years these gains have been sustained by multi-core processors, as issues with power consumption, chip cooling and production costs invalidate the assumption that smaller components are the most cost-effective way to increase processing capability. The future then probably lies with more chips of a given complexity rather than with chips of increased complexity. So Moore's law may be maintained, but not for the reason Moore predicted (increased chip density).

A key question for Lew is whether cloud service providers can pass on the benefits of Moore’s law to customers. Already the cost per hour of a CPU (instance) has dropped from $1 to less than two hundredths of a cent over the last 15 years.


But what are the real costs of cloud computing? Are faster computers the deciding factor? Apparently not - it's administration and power consumption.


Cloud computing wins by leveraging automation, virtualization, dynamic provisioning, massive scaling and multi-tenancy, which all lead to power becoming the dominant cost (mainly for scaling and cooling). And data centre power consumption has already doubled in the last 5 years.

So Lew’s law can now be stated as:
In the cloud, the cost of computing will continue to fall, bounded only by the cost of energy.
Being an ex-SUN man, Lew must take some delight in this final slide



Saturday, May 1, 2010

What is the LINPACK rating of Conficker?

Rodney Joffe, senior vice president and senior technologist at the infrastructure services firm Neustar, gave a keynote presentation on Cloud Computing for Criminals at the recent Cloud Connect conference. Joffe presents some figures which show that the computational size of the Conficker botnet dwarfs the current commercial offerings, based on measuring the number of systems, the number of CPUs and available bandwidth. For Conficker these values are given (estimated?) as

  • 6,400,000 systems
  • 18,000,000+ CPUs
  • 28 Terabits of bandwidth

The corresponding measures for Google are 500,000 systems, 1,500,000 CPUs and 1,500 Gbps of bandwidth, with Amazon and Rackspace providing significantly fewer resources. So Conficker is a massive ad hoc computational structure. But is Conficker really like a cloud service? Joffe says yes, because:

  • It’s available for rent
  • Choose your geographies
  • Choose your networks
  • Choose your bandwidth
  • Choose your OS Version
  • Choose your specialty (DDoS, Spam, Data Exfiltration)

and, further, the vendor has good qualifications:

  • Much more experience (1998)
  • Larger footprint (Millions of systems)
  • Unlimited new resources (New malware)
  • No costs
  • No moral, ethical, or legal constraints

This all reminds me of a 2007 mailing list post by Peter Gutmann called World's most powerful supercomputer goes online, referring to the Storm botnet:

This doesn't seem to have received much attention, but the world's most powerful supercomputer entered operation recently. Comprising between 1 and 10 million CPUs (depending on whose estimates you believe), the Storm botnet easily outperforms the currently top-ranked system, BlueGene/L, with a mere 128K CPU cores. Using the figures from Valve's online survey

http://www.steampowered.com/status/survey.html

for which the typical machine has a 2.3 - 3.3 GHz single core CPU with about 1GB of RAM, the Storm cluster has the equivalent of 1-10M (approximately) 2.8 GHz P4s with 1-10 petabytes of RAM (BlueGene/L has a paltry 32 terabytes). In fact this composite system has better hardware resources than what's listed at http://www.top500.org.

This may be the first time that a top 10 supercomputer has been controlled not by a government or megacorporation but by criminals. The question remains, now that they have the world's most powerful supercomputer system at their disposal, what are they going to do with it?

And I wonder: what are the LINPACK ratings for Storm and for Conficker?


Sunday, March 21, 2010

In Search of Encrypted Search

Last December Moxie Marlinspike announced a $34 cloud service for auditing the strength of WPA passwords. The service tries to reconstruct the password from the user-supplied WPA session material and a hand-crafted dictionary of 135 million passwords. Marlinspike's cloud service searches through the optimised dictionary in just 20 minutes, a computation that would otherwise take 5 days or so on a standard desktop. Notice then that Moxie learns the password if it is found in his dictionary. This might be an example of where customers would prefer Moxie to perform an encrypted search on their behalf: the parameters are encrypted and Moxie learns only a simple yes or no as to whether the corresponding password was found.

This is an encrypted search problem that could be solved in principle by the “fully homomorphic” encryption system devised by IBM researcher Craig Gentry in June of last year. There was much fanfare around the breakthrough and googling “ibm homomorphic encryption” returns over 21,000 results today. There is a long and informative article in Forbes under the headline IBM's Blindfolded Calculator, and you can also see a lecture by Gentry on his result here, given at the Fields Institute in Toronto last December.

Homomorphisms in RSA

But what is homomorphic encryption? Well RSA gives us a nice example, and recall that the “textbook” RSA encryption of a message M is given as

E(M) = M^e mod N

It is not hard to verify that

E(M1) * E(M2) = E(M1 * M2)

which in words states that the product of two encryptions is equal to the encryption of their product. And this is the defining property of a homomorphism – given a function (here encryption) and two inputs (here messages) with a binary operation defined on those inputs (here multiplication), we can interchange the order of applying the function and the operation to get the same result. So we say that the RSA function is a homomorphism with respect to multiplication, or more simply, just a multiplicative homomorphism.
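These identities are easy to check in Python with the classic toy parameters p = 61 and q = 53 (giving N = 3233, e = 17, d = 2753); this is a sketch of unpadded "textbook" RSA only, with numbers chosen purely for illustration:

```python
# Textbook RSA with the classic toy parameters p = 61, q = 53.
# Illustration only: real RSA uses huge primes and padding.
N, e, d = 3233, 17, 2753

def E(m):            # encryption: m^e mod N
    return pow(m, e, N)

def D(c):            # decryption: c^d mod N
    return pow(c, d, N)

m1, m2 = 42, 7

# The product of two encryptions equals the encryption of the product.
assert (E(m1) * E(m2)) % N == E(m1 * m2)

# But the sum of two encryptions is not the encryption of the sum.
assert (E(m1) + E(m2)) % N != E(m1 + m2)
```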

So for an unknown message M2, if we are given E(M2) we can multiply E(M2) by E(M1) to produce E(M1*M2) without needing to decrypt M2. In the early days of RSA this homomorphic property was deemed undesirable, due to posited scenarios like the following one (which in fact seems more plausible today than it did 20 years ago). Let’s say Alice is going to transfer $M to the account of Bob and sends an encrypted message E(M) instructing her bank to perform the transfer. But Bob intercepts the message and multiplies it by E(2), the encryption of 2. The new message becomes E(2)*E(M) = E(2*M), and Bob doubles his money without ever having to decrypt E(M). The PKCS #1 format removes the threat of manipulating messages in this way by adding padding and fixed fields to messages, so that such (multiplicative) manipulation can be detected.
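Bob's tampering can be sketched with toy textbook-RSA parameters (N = 3233, e = 17, d = 2753, from p = 61 and q = 53); the scenario and amounts are of course hypothetical:

```python
# Bob tampers with Alice's encrypted transfer without ever decrypting it.
# Toy unpadded textbook-RSA parameters; illustration only.
N, e, d = 3233, 17, 2753
encrypt = lambda m: pow(m, e, N)
decrypt = lambda c: pow(c, d, N)

amount = 100
c = encrypt(amount)               # Alice: "transfer $100 to Bob"
tampered = (c * encrypt(2)) % N   # Bob multiplies by E(2) in transit

assert decrypt(tampered) == 2 * amount   # the bank now transfers $200
```

Proper padding defeats this precisely because the padded plaintext 2*M no longer has the expected structure after decryption.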

While RSA is multiplicatively homomorphic it is not additively homomorphic, since in general

E(M1) + E(M2) != E(M1 + M2).

Getting back to the encrypted search problem, does RSA being a multiplicative homomorphism help us? Well not really. Say we are searching for a string S in some data D, where we are only given E(D). We could encrypt S to get E(S), and then compute

E(S) * E(D)  =  E(S*D)

but E(S*D) won’t tell us that S is somewhere in D because treating S and D as numbers then multiplying them is not related to searching. So we need to find an encryption scheme that has more homomorphic properties than just multiplication if we are to perform encrypted search. And this is what Mr. Gentry did.

Fully Homomorphic Encryption

This elusive property is exactly what Gentry achieved in his “fully homomorphic” encryption system, a system which supports the secure homomorphic evaluation of any function on an input, such as search over data, and not just simple arithmetic operations like multiplication. The underlying computational problem that furnished this property came neither from RSA nor from discrete logarithm systems but rather from a geometric construction referred to as an ideal lattice. Lattices are special vector spaces that supply computationally difficult problems, which have been used in several other cryptosystems such as NTRU, and they provide an alternative to the “standard” hard problems from number theory.

Using the ideal lattice Gentry was able to show how to compute a “circuit” for problems such as encrypted search that can be evaluated without decrypting the data. Both the encrypted term and data are used as inputs to the circuit, and the trick is to devise a method for securely and correctly evaluating the gates of the circuit. Your computer uses built-in gates and circuits to evaluate computations, such as performing the multiplications and divisions to evaluate an RSA encryption or decryption during an SSL session for example. And as such the computations are small and fast. However for encrypted search the corresponding circuit needs to be built and represented in software, which leads to large and inefficient computations. So this is why the result is a breakthrough but not a practical one at the moment. Gentry has estimated that building a circuit to perform an encrypted Google search with encrypted keywords would multiply the current computing time by around 1 trillion.
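To give a concrete flavour of homomorphic evaluation, here is a toy "somewhat homomorphic" scheme over the integers, loosely in the spirit of the simplified construction later published by Gentry and co-authors. This is my own illustrative sketch: the parameters are hopelessly insecure, and the growing noise limits it to a handful of operations before decryption fails.

```python
import random

def keygen():
    # Secret key: a large odd integer p (toy size, utterly insecure).
    return 2 * random.randrange(500_000, 1_000_000) + 1

def encrypt(p, bit):
    q = random.randrange(10**8, 10**9)   # large random multiplier of p
    r = random.randrange(1, 10)          # small noise, hidden in the even part
    return p * q + 2 * r + bit

def decrypt(p, c):
    # Correct as long as the accumulated noise stays below p.
    return (c % p) % 2

p = keygen()
for b1 in (0, 1):
    for b2 in (0, 1):
        c1, c2 = encrypt(p, b1), encrypt(p, b2)
        assert decrypt(p, c1 + c2) == b1 ^ b2   # addition acts as XOR
        assert decrypt(p, c1 * c2) == b1 & b2   # multiplication acts as AND
```

With XOR and AND gates on encrypted bits, any circuit can in principle be evaluated; the catch, as above, is that each gate inflates the noise, which is exactly the obstacle Gentry's construction had to overcome.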

The Outlook for Encrypted Search

It remains to be seen whether Gentry’s theoretical breakthrough can be converted into a practical service, in the cloud or otherwise. Gentry has worked with some other researchers from IBM and MIT to remove the requirement of using lattices and to just treat the fully homomorphic encryption as operating over the integers, which provides a simpler conceptual framework. Making the central ideas more accessible will increase the chances of finding more practical solutions. There is a recent review of homomorphic encryption by Craig Stuntz, giving more details and promising code in an upcoming post. Stuntz also points to an article by Gentry in the respected Communications of the ACM which begins

Suppose that you want to delegate the ability to process your data, without giving away access to it. We show that this separation is possible: we describe a "fully homomorphic" encryption scheme that keeps data private, but that allows a worker that does not have the secret decryption key to compute any (still encrypted) result of the data, even when the function of the data is very complex. In short, a third party can perform complicated processing of data without being able to see it. Among other things, this helps make cloud computing compatible with privacy.

Reading further requires a subscription, but it is clear that the work of Gentry and his co-authors is being positioned as a core service for cloud platforms, and I suspect IBM’s platform in particular. However, just at the moment, you will have to trust Moxie with your WPA password.

Saturday, February 20, 2010

Lew's Law: IT expenses converge to the cost of electricity

Interesting:

Last week I heard Sun Microsystems Cloud CTO Lew Tucker predict that IT expenses would increasingly track to the cost of electricity. “Lew’s Law” (as described to a room of thought leaders) is a brilliant theorem that weaves a microcosm of IT trends and recent reports into a single and powerful concept.

Lew’s Law is a powerful idea whose time has come, with profound and far reaching impacts, including the automation of the network.

Full story here from Gregory Ness with some additional remarks here.


Monday, December 7, 2009

WPA Password Cracking in the Cloud for $34


Following on from the recent results of a project on the feasibility of password cracking using cloud computing, a new cloud service for cracking WPA passwords has been announced by researcher Moxie Marlinspike, best known for his work on discovering subtle flaws in the SSL protocol. The cloud service attempts to crack uploaded passwords against a 135 million word WPA-optimized dictionary on a 400 CPU cluster.

The stated purpose of the service is to assist penetration testers and network auditors who need to test the strength of WPA passwords that use pre-shared keys (PSK, or personal mode). In this mode the pre-shared master key is combined with several parameters from the initial wireless handshake, including the MAC addresses of the client and base station, to derive a session key. In practice the PSK is derived from a password and is therefore exposed to brute force or dictionary attacks.

Verifying a guess for the PSK password is relatively costly, roughly the equivalent of processing a megabyte of data, since the session key is derived from 4096 iterated hash operations. This high iteration count, or spin, limits the number of password trials to several hundred per second on a standard desktop. Marlinspike's cloud service searches through a 135 million word dictionary in just 20 minutes, a computation that would otherwise take 5 days or so on a standard desktop.
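That derivation is PBKDF2-HMAC-SHA1 with the network name as salt and 4096 iterations, which Python's standard library can reproduce directly (the passphrase and network names below are made up):

```python
import hashlib

def wpa_pmk(passphrase: str, ssid: str) -> bytes:
    # WPA/WPA2 personal mode: the 256-bit pairwise master key is
    # PBKDF2-HMAC-SHA1(passphrase, ssid, 4096 iterations).
    return hashlib.pbkdf2_hmac('sha1', passphrase.encode(), ssid.encode(),
                               4096, 32)

# The SSID acts as a salt: the same passphrase on two networks yields
# two different master keys, so precomputed tables must be rebuilt
# per network.
pmk1 = wpa_pmk('correct horse battery', 'CoffeeShopWifi')
pmk2 = wpa_pmk('correct horse battery', 'HomeNetwork')
assert len(pmk1) == 32 and pmk1 != pmk2
```

The thousands of HMAC invocations behind each call are what hold a standard desktop to a few hundred guesses per second.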

And all this for just $34, after a client has uploaded a capture of the WPA handshake on the network of interest. If the cloud service does not find a password then a client can be confident that the submitted password is resistant to dictionary attacks – no refund, you still pay for the assurance!

The Church Of Wifi has a project to create public rainbow tables for WPA based on common passwords, but the resulting tables only encode one million passwords because the tables must be recomputed for each network identifier (ESSID), which acts as a salt in the password derivation function. Marlinspike opted instead to create his own WPA-optimized dictionary, which includes word combinations, phrases, numbers, symbols, and “elite speak”.

In Security Muggles I posted about finding ways to connect with management about security. Robert Graham, CEO of penetration testing company Errata Security, has commented: "When I show this to management and say it would cost $34 to crack your WPA password, it's something they can understand. That helps me a lot."

PC World has somewhat dramatically reported the announcement as New Cloud-based Service Steals Wi-Fi Passwords, suggesting that the cloud is reaching into wireless networks and grabbing passwords. In fact, the service is designed to trade off a 5-day desktop computation against a 20-minute cloud computation for $34. A better name would therefore be WPA Auditor rather than WPA Cracker.


Tuesday, November 24, 2009

Navigation map of the Cloud Ecosystem

Appirio has created a nice RIA map for navigating the cloud computing ecosystem. The map is a 3 x 3 grid with applications, platforms and infrastructure on one axis, and public, private or hosted cloud solutions on the other. The complexity of the ecosystem does not bode well for decision makers, nor for the support to be provided by risk management. Still, it looks positively friendly compared to the details of SOA.


See ReadWriteWeb for more commentary and background on Appirio.

Thursday, November 19, 2009

Not so sunny for Whit Diffie

Renowned cryptographer Whitfield Diffie has apparently left his position as chief security officer at SUN, according to a recent article at the MIT Technology Review, which interviewed Diffie on the security of cloud computing. The Register speculates over the reasons for Diffie’s departure from SUN after 18 years of service, suggesting that Oracle “is a company known for making its dollars count rather than indulging meta thinking”. Diffie is currently a visiting professor at Royal Holloway, University of London, which runs perhaps the most respected IT security graduate program in Europe, while also maintaining an excellent group of researchers.

And what are Diffie’s thoughts on cloud computing? His first statement is quite telling:

The effect of the growing dependence on cloud computing is similar to that of our dependence on public transportation, particularly air transportation, which forces us to trust organizations over which we have no control, limits what we can transport, and subjects us to rules and schedules that wouldn't apply if we were flying our own planes. On the other hand, it is so much more economical that we don't realistically have any alternative.

Cloud computing turns all our conventional security assumptions inside out, but Diffie, like others, sees the economic sense, if not the economic certainty. A recent brief on cloud computing by the Economist could spare no more than a few sentences to discuss the security risks. The large economic wheels are turning inexorably toward adoption. Diffie goes on to say that

The whole point of cloud computing is economy: if someone else can compute it cheaper than you can, it's more cost effective for you to outsource the computation.

At the moment companies face an unsatisfying choice: either encrypt data for secure storage in the cloud, forgoing the benefits of cloud computations, or leave it in the clear for maximum computational utility but with a risk of loss or exposure. Diffie mentioned a third alternative, computing with encrypted data, but at present this alternative is not viable. I assume he is referring to the recent encryption breakthrough by Craig Gentry of IBM which could be used to perform searches on encrypted data, albeit 1 trillion times more slowly than Google does today.

In the short term (and maybe the longer term as well) Diffie sees the cloud as a matter of trust. He advises picking your supplier the way you would pick your accountant.