Aujas US

An IDG Ventures Company

Amazon EC2 Failures Are a Wakeup Call for Cloud Customers


Building Cloud-friendly applications can help your company manage risk and avoid losses when the host's data center crashes

Early in the morning of April 21, Amazon’s EC2 data center in Virginia crashed, bringing down many popular websites, small businesses and social networking sites.

Strangely, even this outage did not breach the 99.95% availability commitment defined in the SLA (Service Level Agreement). Though the SLA makes for quite an interesting debate, we will leave that to the legal experts and instead focus on Cloud services and the new generation of programmers and businesses who use them.

More often than not, when we discuss building applications in the Cloud, the basic assumption is 24×7 service availability. While Cloud service providers strive to live up to this expectation, the onus of designing a system resilient to failures is on the application architects. SLA-driven approaches, on the other hand, are reactive by nature. In the purest sense, SLAs are just a means of establishing trust between the user and the service provider; they can never fully repay losses. It is up to the architect and the CIO to build systems that tolerate such risks (Cloud system failures, connectivity failures, SLA breaches, etc.).

On Cloud infrastructure, we often end up building traditional, tightly coupled systems that take no advantage of the platform's availability features. These shortcomings may be part and parcel of a software world where functionality takes precedence over all other aspects, but such tolerance cannot be expected in the Cloud paradigm. A failure on the part of the Cloud service provider will bring down the business, and getting the data back becomes a nightmare when all the affected businesses are trying to do the same.

Accommodating and managing these factors are business risks that need to be identified. Businesses that do not envision these risks are sure to suffer large-scale losses. The truth is that building such resilient systems is not a very complex task. The basics of software design remain the same whether systems are built for the Cloud or for enterprise-owned hardware. Mitigating as many risks as possible requires several basic design and business decisions, made with the service provider in mind, such as:

  • Loosely couple the application
  • Make sure the application follows “Separation of Concerns”
  • Distribute the applications
  • Backup application & user data
  • Setup DR sites with a different Cloud service provider

These decisions require software that follows these basic designs and business managers who identify multiple service providers to mitigate such risks. Cloud services will force business managers to recognize that availability should not, and cannot, be taken for granted.
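
As a minimal sketch of the "backup data and set up DR with a different provider" decisions above, the following Python stub replicates an object across independent storage sites and tolerates one of them being down. The `StorageError`, `InMemoryProvider` and `replicated_put` names are illustrative stand-ins, not any real vendor SDK:

```python
class StorageError(Exception):
    pass

class InMemoryProvider:
    """Toy stand-in for a cloud object store (e.g. a primary or DR site)."""
    def __init__(self, name, available=True):
        self.name = name
        self.available = available
        self._objects = {}

    def put(self, key, data):
        if not self.available:
            raise StorageError(f"{self.name} is down")
        self._objects[key] = data

def replicated_put(key, data, providers):
    """Write the object to every provider; succeed if at least one accepts it."""
    successes = []
    for p in providers:
        try:
            p.put(key, data)
            successes.append(p.name)
        except StorageError:
            continue  # tolerate a failed site; others may still hold the copy
    if not successes:
        raise StorageError("all providers failed")
    return successes

primary = InMemoryProvider("primary", available=False)  # simulate an outage
dr_site = InMemoryProvider("dr-site")
print(replicated_put("backup/users.db", b"...", [primary, dr_site]))  # → ['dr-site']
```

The point is not the toy classes but the shape: the application depends on an abstract storage interface, so a second provider can be swapped in without touching business logic.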

These failures will not stop Cloud adoption, but they will make customers aware of the potential risks and of the need for mitigation plans. A Cloud failure has a serious impact on the CTO, the CIO and the operations head. In a non-Cloud model, the CIO's role has often been limited, with little day-to-day interaction between the CIO and the CTO. These two executives need to work more closely to protect the business and reduce risk.

The best practices for the Cloud application builders are:

  • Build Cloud applications, not applications in the Cloud
  • Design fault-tolerant systems, so that nothing fails
  • Design for scalability
  • Loosely couple application stacks (IoC)
  • Design for dynamism
  • Design distributed
  • Build security into every component

These best practices are essential for every architect building Cloud applications. Do not simply port a traditional application to the Cloud: the two are architecturally different, and a ported application will not take advantage of the underlying services and, most often, will fail.
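
One concrete piece of "design for failure" is retrying transient errors with exponential backoff. A minimal Python sketch, where the hypothetical `flaky()` function simulates a dependency that fails twice before recovering:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying with exponential backoff; re-raise after the last attempt."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # 0.01s, 0.02s, ...

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky))  # succeeds on the third attempt → ok
```

In production one would also cap the delay, add jitter, and retry only on errors known to be transient; this is the minimal shape of the idea.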

Remember “Everything fails, all the time.” It is time to think and manage risks and not let the SLA stare at you when you are losing business. Be proactive; build Cloud-friendly applications.

The new world of the Cloud looks more promising than ever. However, failures remind us that functionality without a proper foundation and thought process can have serious repercussions. It is essential for every business to review its risks and redefine its new perimeter in the Cloud.

For more information on how Team Aujas is assisting clients with security risk in the Cloud, please contact Karl Kispert, our Vice President of Sales. He can be reached at karl.kispert@aujas.com or 201.633.4745.

April 27, 2011 Posted by | Cloud Security, Data Loss Prevention, Data protection, IT security | Comments Off on Amazon EC2 Failures Are a Wakeup Call for Cloud Customers

Cloud Computing – Security Threats and More…

Companies that struggle to maintain their IT infrastructure often look to cloud computing to provide significant cost savings. However, you must look into the clouds and understand what risks are swirling around when it comes to storing your data.

In a recent survey by CIO Research, respondents rated their greatest concerns about cloud adoption. Security was their top concern, with loss of control over data number two:

  • Security  45%
  • Loss of control over data  26%
  • Integrations with existing systems 26%
  • Availability concerns 25%
  • Performance issues 24%
  • IT governance issues 19%
  • Regulatory/compliance concerns 19%
  • Dissatisfaction with vendor 12%
  • Ability to bring systems back in 11%
  • Lack of customization opportunities 11%
  • Measuring ROI 11%
  • Not sure 7%

Is there security in the cloud?
Security is often an afterthought for cloud service providers: it isn't built into their applications and is often added as a plug-in. What's more, if a cloud storage system crashes, millions and millions of pieces of information can be lost, often in spite of backup procedures. In contrast, in the thick-client world, lost information can be more easily tracked by the number of PCs or notebooks affected or stolen.

How different should security be in the cloud world?
Business technologies may change, but security fundamentals and lessons learned are still applicable. Some areas to consider for the cloud:

Physical security is a must for any strong security program. The data center should have a high level of physical security. If sensitive data is being stored, consider deploying biometrics, surveillance cameras monitored by professionals, and very stringent policies for physical access to the system.

Authentication is crucial; whether in the cloud or on the corporate network, individual authentication remains just as important. Given the processing power of the cloud, you may choose to implement two-factor authentication, one-time passwords or other authentication tools. Even in a highly secured processing environment, a weak password has the potential to ruin all other safeguards, so maintaining password standards is a must.
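
One-time passwords of the kind mentioned above need nothing beyond the standard library. A minimal sketch of an RFC 6238 time-based OTP (TOTP), checked against the RFC's published test vector; real deployments would use a vetted library and per-user secrets:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: float, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    counter = int(for_time // step)                     # 30-second time window
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                             # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: key "12345678901234567890", T = 59 s, 8 digits
print(totp(b"12345678901234567890", 59, digits=8))  # → 94287082
```

Because both sides derive the code from a shared secret and the current time, a stolen password alone is not enough to authenticate.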

Access rights are critical for all the objects inside the cloud. From the user's point of view, this part of security will not change, but some changes are required to manage access for multiple corporate customers within a single cloud service provider's environment.

Strong firewalls are another integral part of today's security. Even in the cloud, the same rule applies: cloud clients should secure their own networks. The advantage is that they have less information to secure within their network. The cloud service provider should likewise secure its network with firewalls.

Data integrity is one of the key aspects of security. Today, for example, it is hard for every notebook to compute a cryptographic checksum or hash of its data, but in a cloud service this could become commonplace.
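
A cryptographic checksum of the kind described is straightforward to compute. A minimal Python sketch that streams a file through SHA-256, so a customer can verify integrity after a transfer by comparing digests:

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Stream a file through SHA-256 so even large files can be checksummed."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

If the digest computed after download matches the digest recorded before upload, the file was neither corrupted nor tampered with in transit or at rest.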

Security threats in the cloud

Security threats can come in all forms; let's consider some of them here. In a cloud-based service, the provider decides where your data is stored and how your data is accessed. If your provider offers virtual boxes, a mischievous user can gain control over a virtual box, attack your data and exploit it. Another security threat in cloud computing is an attack on the perimeter of the cloud, which may range from a simple ping sweep to a DoS attack. A cloud service provider must ensure the data of each company is properly isolated and partitioned; if not, data leakage can be expected.

Another important factor that has to be addressed in the cloud world is the privileges of the power user. How do we handle administrators and data access? Administrator rights no longer reside with the customer; they belong to the cloud service provider. There should be clear transparency and access records to prevent any misuse by an administrator.

Implementing security in the cloud environment is different from what we are used to in a traditional environment. However, remembering the fundamentals of information risk management and lessons learned, along with an understanding of cloud provider risks, may help you weather the storms looming in a dark Cloud.

Why should the cloud customer implement security?

Though the cloud promises high security, it is essential for the cloud customer to implement their own security and maintain standards. An unsecured customer network will attract hackers and provides an easy entrance to the cloud.

Data transfer between the cloud service provider and the customer should take place over a secured connection, and the customer should take the necessary steps to protect his network from attacks such as man-in-the-middle (MITM).
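
A minimal Python sketch of such a secured connection: `ssl.create_default_context()` verifies the server certificate against the system CA store and checks the hostname, both of which are needed to defeat a basic MITM with a forged certificate. The function here is an illustrative helper, and which host to connect to is up to the caller:

```python
import socket
import ssl

def open_verified_tls(host: str, port: int = 443, timeout: float = 10.0) -> ssl.SSLSocket:
    """Open a TLS connection that verifies the server certificate and hostname."""
    ctx = ssl.create_default_context()   # loads system CAs, enables verification
    raw = socket.create_connection((host, port), timeout=timeout)
    # server_hostname enables SNI and hostname checking against the certificate
    return ctx.wrap_socket(raw, server_hostname=host)
```

The common mistake is disabling verification (`verify_mode = CERT_NONE`) to silence certificate errors; that is precisely what reopens the MITM door.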

The applications hosted on the customer network should also be secured. Customers using the cloud to deploy applications should ensure that their software is secured. Unsecured applications can be dangerous for both the cloud service provider and customer.

Cloud security can help little if an unmaintained or unpatched system sits on the network.

Virus attacks will not change just because you move your data into the cloud.

How can you do business securely over the cloud?

Before you decide to buy a cloud service, go security shopping. We always bargain based on price, but that is not enough here. You need to bargain for security rights, transparency and privacy.

The legal agreement is the first level of security that you will always require, no matter where you do business. A well-prepared agreement can secure your legal rights over your data in the cloud. Make sure it establishes ownership of the following:

  • Data
  • Data backups
  • Log files

Your day-to-day business runs with the help of data. It is essential that the cloud service provider be transparent about its data center locations, physical security, containment measures, and the time taken to recover from any catastrophe.

End-to-end encryption is a must in cloud computing to ensure the security of data transfer. The customer should require this capability from the provider.

Authentication and proper access rights must also be secured. Given that cloud applications can be accessed from anywhere, it is essential to disable former employees' accounts entirely. This has to be an integral part of the customer's HR policies.

Patch management is also very important. Though the cloud can feel like a versionless world, it is essential that the service provider either inform you about the patches required to access its network or provide automatic patch management. If you use third party clients to access the application, you should ensure that these clients are up to date with security-related patches.

You should also require log analysis reports, user accounts and privileges reports, uptime/downtime reports, and penetration test/vulnerability assessment reports from the service provider on a regular basis. To ensure more transparency, require that these reports be provided by a third party security company. You should also demand real time security alerts from the service provider.
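
Even simple log analysis of the kind suggested above can surface attacks in those reports. A minimal sketch that tallies failed logins per source address; the log format here is invented for illustration:

```python
import re
from collections import Counter

FAILED_LOGIN = re.compile(r"Failed login for (\S+) from (\S+)")

def failed_logins_by_source(log_lines):
    """Tally failed logins per source IP; a sudden spike is worth an alert."""
    counts = Counter()
    for line in log_lines:
        m = FAILED_LOGIN.search(line)
        if m:
            counts[m.group(2)] += 1
    return counts

sample = [
    "2011-02-23 10:01:02 Failed login for alice from 203.0.113.7",
    "2011-02-23 10:01:05 Failed login for alice from 203.0.113.7",
    "2011-02-23 10:02:11 OK login for bob from 198.51.100.4",
]
print(failed_logins_by_source(sample))  # → Counter({'203.0.113.7': 2})
```

Real-time alerting would layer thresholds and time windows on top of a tally like this, but the raw material is the same: provider-supplied logs.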

The last level of security, and one that is often exploited, is application security. How secure is the cloud service provider's application? There is no real way of knowing. Third party security companies and tools are available to certify application security; this should be done on a routine rather than a one-off basis.

Social engineering is another threat that has to be addressed. It is essential for the cloud service provider and customer to be aware of such threats and educate their employees.

Phishing attacks will also target cloud consumers. Strong phishing filters should be deployed.

You will also want to involve third party security companies as partners to verify the cloud service provider's security policies and its reports.

Summary

Security should be built as an integral part of the cloud. This is a must for the cloud service provider to gain trust from their customers. Gaining customer trust is the key to winning the cloud service game. Security is an ongoing measure to protect and deal with everyday threats. No matter where you do business you should secure yourself with the best practices.

February 23, 2011 Posted by | Cloud Security, Data Loss Prevention, IT security | Leave a comment

Security Breaches Continue to Grow

What do Tulane University, the South Carolina State Employee Insurance Program, National Guard Headquarters in Santa Fe, NM, BlueCross/BlueShield of Michigan, Seacoast Radiology, and the University of Connecticut's HuskyDirect.com have in common?  They were just a few of the organizations that reported security breaches in January 2011.

Information management is critically important to all of us – as employees and consumers. For that reason, the Identity Theft Resource Center has been tracking security breaches since 2005, looking for patterns, new trends and any information that may better help us protect data and assist companies in their activities.

In prior issues of Risky Business, I posted this brief article and supporting statistics about security breaches.  I was curious to see how the data changed.  You can see for yourself below in the last line.

The following data was collected from Identity Theft Resource Center® website idtheftcenter.org and refers to the number of total data breaches that were reported with an estimate of how many records were exposed:

2005 Breach List: Breaches: 157 Exposed: 66,853,201
2006 Breach List: Breaches: 321 Exposed: 19,137,844
2007 Breach List: Breaches: 446 Exposed: 127,717,024
2008 Breach List: Breaches: 656 Exposed: 35,691,255
2009 Breach List: Breaches: 498 Exposed: 222,477,043

2010 Breach List: Breaches: 662 Exposed: 16,167,542

You must understand that the majority of the reported breaches do not reveal the actual number of exposed records, so the true figures are MUCH larger than what is listed here.

Your call to action is to ensure your Information Risk Management Program is as secure as you think it is, and as secure as your stakeholders, customers, and Board of Directors believe it to be.  Aujas is helping organizations manage risk and enhance information value with practical, innovative solutions!

January 31, 2011 Posted by | Data Loss Prevention, Identity Theft, IT security | Leave a comment

Effective Data Protection Requires More than Technology

More companies are finding that despite their technology investments, effective data protection remains elusive. Data protection technology has become as commonplace as anti-malware technology, and most organizations implement it as standard desktop endpoint and gateway security. The technology works using a combination of document 'fingerprinting', key words, and policies defining what is allowed and what is not. It has matured to cover endpoint and email data leakage risks as well as social networking risks. Yet even with a mature technology and a rigorous implementation, organizations often find their data protection ineffective.

IT departments are able to implement a data protection technology quickly, but struggle with effectiveness. They are unable to bridge the gap between implementation and effectiveness, and end up with large numbers of data leakage 'incidents' that usually turn out to be false positives. In many cases, organizations end up operating DLP tools in 'audit only' mode, which completely defeats the tools' purpose.

This gap is usually due to the approach taken to data protection, not to the organization itself. Most organizations identify data protection as a risk, and the IT/IS department chooses a vendor for implementation. The vendor usually 'scans' the file stores for 'important' files, and policies are created to safeguard those files deemed important. While this approach seems simple enough, it is the root of the problem: IT organizations are basing policies on their own interpretation, rather than on what is important or appropriate for the business.

Data, even if critical, may need to be exchanged with outsiders for valid business reasons. The challenge is to establish policies that allow the business to operate seamlessly while stemming the data leakage.  Another challenge is to build an ecosystem that supports this on an ongoing basis. The solution ideally integrates technology, process and a governance framework.  

The first step is a data classification policy that clearly establishes how to classify data within the organization; users should be made aware of how the classification policy applies. Next, the data flow within business processes should be understood, to identify the type and nature of data, its classification, and the authorized movement of 'important' data across organizational boundaries. The important files, templates and database structures identified during this exercise should then be 'fingerprinted', and the policies configured and applied based on the authorized movement of data.
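
Exact-match fingerprinting of the kind described can be sketched in a few lines: hash every protected file, then check outbound payloads against the digest set. Real DLP products also do partial and fuzzy matching; this illustrative sketch shows only the simplest exact-match form:

```python
import hashlib
from pathlib import Path

def fingerprint_store(root):
    """Hash every file under root; the digests act as exact-match 'fingerprints'."""
    prints = set()
    for path in Path(root).rglob("*"):
        if path.is_file():
            prints.add(hashlib.sha256(path.read_bytes()).hexdigest())
    return prints

def is_fingerprinted(data, prints):
    """Would this outbound payload match a protected document exactly?"""
    return hashlib.sha256(data).hexdigest() in prints
```

The limitation is obvious and is exactly the article's point: a one-byte change defeats an exact match, which is why policies must be grounded in business rules, not just a one-time scan of 'important' files.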

Taking these two steps will improve data protection effectiveness because it incorporates business rules for data. However, it is still a point-in-time exercise that does not address the fluid business data environment. To sustain data protection, a governance process is required. One approach is to integrate with the data governance framework, if one exists within the organization; if not, a similar structure can be created. An additional benefit of this approach is close integration with data governance when such a framework is eventually created.

The governance function should be responsible, at a high level, for both the strategic and the operational management of data protection. At a strategic level, the function should look at how data flows and is managed, and at the impact on the data protection technology employed. At an operational level, it should look at how data protection incidents are managed, how false positives are reduced, and how user awareness of classification and protection is improved. Many organizations also employ active data protection using data/digital/information rights management tools, which require users to 'protect' content based on allowed rights, time limits and expiry dates. Though the approach above remains the same for these technologies, organizations have to spend more effort on user awareness, as user cooperation defines the success or failure of the technology.

Though data protection technologies have changed the data confidentiality playing field completely, effective data protection cannot be achieved by the technology alone. It requires a focused lifecycle management approach for it to be more effective and sustainable.

January 24, 2011 Posted by | Data Leak Prevention, Data Loss Prevention, Risk management | Leave a comment

Operating in the Cloud – Sunny with a Chance of RISK!

Here is a list of some of the most important risks of operating in the cloud today:

  • Loss of governance
  • Data protection
  • Service provider lock-in
  • Compliance risks
  • e-Discovery and litigation support
  • Management interface compromise
  • Network management failure
  • Isolation failure
  • Insecure/incomplete data deletion
  • Malicious insider

A risk-based approach is the only way to assess a cloud computing deployment decision.

Establish detective and preventive controls specific to each cloud deployment model:

  • SaaS – Browser patching, endpoint security, access reports
  • PaaS – Browser patching, hardening, endpoint security, access reports and vulnerability scanning
  • IaaS – VPN, configuration and patch management, host IDS/IPS, VirtSec appliance, access reports, vulnerability scanning, logging & event management

Identity management is a key area of preventive control focus for all service models.

For more information on how Team Aujas is assisting clients with security risks in the Cloud, please email me at karl.kispert@aujas.com.

January 4, 2011 Posted by | Cloud Security, Data Loss Prevention, Enterprise Security, IT security | Leave a comment