The proliferation of datacentres around the world has made the Cloud not only accessible, but also affordable. However, the issue of data sovereignty, the question of where data is physically stored, has been an inhibitor.
In particular, data sovereignty has been a sticking point for governments and their data, but as APC by Schneider Electric Pacific vice-president, Paul Tyrer, points out, many highly regulated organisations, such as those in the financial sector, are also concerned.
“The risks and benefits of controlling your own datacentre versus outsourcing to a co-located provider have been debated by the industry in detail,” he said.
According to Zettagrid general manager, Nicki Pereira, the hesitation by government to dive head-first into the Cloud can also be attributed to the relative newness of the technology.
“The transition to the Cloud is still in its infancy and there are many unknowns regarding data sovereignty,” he said. “The lowest risk option is to specify that data should remain under Australian sovereignty prior to any adverse situations arising.”
As such, Pereira expects that governments want the ability to “step in” in the event that something should happen.
Working for the man
The data sovereignty issue has meant that datacentres have had to be established locally to meet the demands of governments operating in the region. However, Verizon Business Asia Pacific Cloud and IT solutions practice manager, Ray McQuillan, points out that the localisation of stored data is not the only attraction for governments to work with local providers.
“From a network perspective, latency is a major factor attracting governments to choose local datacentres,” he said. “Having their systems performing nearby, rather than being thousands of miles away and worrying about issues impacting user experience, is a decisive factor for governments.”
The other benefit McQuillan sees in having data onshore is that it gives governments the assurance that their business is being conducted domestically and within reach.
“Scalability and feet-on-the-ground also allow for deeper relationships between them and their local providers,” he said. “It also provides a face-to-face accountability and a compelling story around disaster recovery and business continuity.”
Additionally, if a local provider has a “critical mass of customers nearby,” McQuillan sees this as providing another layer of reassurance to governments, which allows for “greater confidence in the chosen local provider to deliver its services and that it will be there for a long time.”
Having strong local security certifications, such as ASIO T4, DSD HP, and PCI DSS, is something that Earthwave founder and CEO, Carlo Minassian, sees as being essential in attracting work from a domestic government. In particular, Minassian cannot overstate the importance of the “human factor” and physical proximity.
“It matters that a government agency can see where their data physically resides, that they can visit datacentre sites, get to know the people delivering their services and have access to senior engineers in their time zones,” he said.
A datacentre being located locally does not mean that governments have nothing else to be wary of. In fact, Victorian Privacy Commissioner, Anthony Bendall, recently discussed some of the security concerns that he saw with Cloud computing.
What stood out for Equinix Australia country manager, Tony Simonsen, were the issues Bendall raised beyond the obvious ones of lack of control over stored data and privacy at overseas Cloud service providers.
“When government agencies consider moving information and services to the Cloud, Bendall felt that questions of data security, accountability in the case of a data breach, and differing privacy laws needed to be addressed,” he said.
But why is the government so sensitive about its data? HP South Pacific Government chief technology officer, Scott Cassin, sees it coming down to avoiding the compromise of citizen data and critical or sensitive information.
“Security vulnerabilities may reside or infiltrate at the system, application and code levels,” he said. “Thus, Australian government agencies need to consider if datacentre services meet Australian national security standards.”
According to Cassin, these standards are quite thorough and extend to security clearance of personnel, the physical security of the site and building access controls, and information security.
“The Australian government defines these standards through the Information Security Manual, as part of its overall Protective Security Policy Framework,” he said.
Network security, performance, availability and reliability are all key factors in governments’ evaluation of offshore services, but Cassin warns that Australian government agencies should first understand the legal implications before pushing ahead.
“The law of the country where the data resides will take precedence regardless of Australian requirements or laws,” he said.
Securing sensitive data
In addition to physical security, datacentres are being attacked electronically via external hacks. F5 Networks Australia and New Zealand managing director, Kurt Hansen, sees cyber attacks on datacentres no longer confined to the network layer, with distributed denial-of-service (DDoS) attacks now targeting the firewalls that protect them.
“We’re seeing an increasing amount of attacks on Web services and on the application layer,” he said. “That is what has driven us to have network, DDoS and datacentre-level protection.”
The issue Hansen now sees is that attacks are occurring at the Web application level, and that has become a “real concern” for governments.
“Enterprises have been putting their core applications online for many years, but governments only started a few years ago,” he said. “But as they put more and more of their processes online to cut operational costs, those processes and information they want to share with customers can now be the subject of attacks at an application layer.”
The increasing sophistication, frequency and diversity of attacks have meant that traditional network firewall protection of datacentres is now seen as somewhat outdated and ineffectual.
Hansen compares this situation to the olden days, when a moat would be dug around a castle to keep attackers out, with a drawbridge over the moat to let people in and out.
“But now the attacks are coming in from the air, so the drawbridge and the moat don’t help you anymore,” he said. “You still need the drawbridge and the moat in the datacentre, but you also need air defences now.”
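Hansen’s “air defences” amount to inspecting traffic at the application layer rather than only at the network edge. As a rough illustration (not F5’s product; the class name and thresholds here are hypothetical), a minimal per-client sliding-window rate limiter of the kind used to blunt application-layer floods might look like this:

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Illustrative sketch: refuse a client that exceeds a request budget
    within a sliding time window, a basic application-layer flood check."""

    def __init__(self, max_requests=100, window_seconds=10.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # client_ip -> request timestamps

    def allow(self, client_ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[client_ip]
        # Discard timestamps that have aged out of the sliding window
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # budget exhausted: looks like an app-layer flood
        q.append(now)
        return True

limiter = RateLimiter(max_requests=3, window_seconds=1.0)
# First three requests in the window pass; the fourth is refused.
results = [limiter.allow("203.0.113.7", now=t) for t in (0.0, 0.1, 0.2, 0.3)]
```

A real web application firewall layers many such checks (request-rate limits, payload inspection, protocol validation) in front of the application itself, which is the “air defence” sitting alongside the network-layer moat.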
In the past, Oracle Australia government, education and health applications general manager, Ian McAdam, has seen local regulations around data sovereignty somewhat hindering public and financial sector customers from moving ahead with Cloud computing.
“Organisations worldwide have ranked IT security as one of their top priorities as increasingly sophisticated attacks, new data protection regulations, and most recently insider fraud and data breaches, threaten to disrupt and irreparably damage their businesses,” he said.
However, the evolution of the industry as a whole has meant that providers are able to overcome these concerns and meet specific demands through robust Cloud solutions hosted at local datacentres.
As such, the providers that McAdam feels are the most successful are those which “uniquely safeguard” a customer’s information throughout its entire lifecycle and help organisations achieve “security inside out”.