Cloud Security: Five Lessons from the Yahoo Password Breach

This one-off incident should not dampen enterprise enthusiasm for a road map to hybrid computing adoption

Last week, one news item that attracted media attention was the theft of nearly 450,000 passwords from a Yahoo service called 'Yahoo Voice'. Communications on the incident state that SQL injection was the primary technique the hackers used to extract the information from the databases and publish it.

Subsequent communications show the affected company taking more precautions to ensure that security is its highest priority. Incidents like these also tend to shake cloud adoption at the enterprise level, increasing the fear, uncertainty and doubt in the minds of CIOs.

However, the following are best practices and guidelines that any enterprise should follow when adopting hybrid cloud computing, and a one-off incident should not derail the road map to hybrid computing adoption.

In other words, the incident was not really caused by the adoption of cloud or a multi-tenant model, but by a failure to follow age-old best practices. It also underlines the significance of cloud brokers and intermediaries with substantial knowledge of the enterprise computing paradigm, who can play an increasing role in ensuring that enterprises adopt cloud computing securely.

Lessons Learned In Yahoo Password Hacking

1. Not Following Security Fundamentals During Development
Whatever the changes in technology, the fundamentals of development and coding remain the same. SaaS vendors are often under heavy time-to-market pressure, which may tempt them to cut corners on security fundamentals; that is never acceptable. If you accept input from a user, it must always be validated before it is acted upon. Lack of input validation is the root cause of attacks like cross-site scripting and SQL injection. In a multi-tenant world the risk of SQL injection increases many-fold, because a hacker can always enter the system as a legitimate user by creating a valid account and then start harvesting other customers' data.

I elaborated on SQL injection in SaaS in an article back in 2010, and that thinking is still valid when developing SaaS and multi-tenant applications.
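The validation point above can be illustrated with a minimal sketch. The table, column names and payload below are illustrative assumptions, not details from the Yahoo incident; sqlite3 stands in for whatever RDBMS a SaaS product actually uses.

```python
import sqlite3

# Minimal sketch: parameterized queries vs. string concatenation.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

def find_user_unsafe(name):
    # VULNERABLE: user input is concatenated straight into the SQL text,
    # so input like "' OR '1'='1" changes the meaning of the query.
    return conn.execute(
        "SELECT email FROM users WHERE username = '" + name + "'"
    ).fetchall()

def find_user_safe(name):
    # SAFE: the driver binds the value separately from the SQL text,
    # so malicious input is treated as data, never as SQL.
    return conn.execute(
        "SELECT email FROM users WHERE username = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # leaks every row
print(find_user_safe(payload))    # returns nothing
```

The unsafe variant returns every user's e-mail address for a crafted input; the parameterized variant returns nothing, because the payload never becomes part of the SQL statement.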

2. Not Encrypting at the Database Level
Encryption of key data is one of the most important security aspects of any database in general, and of a multi-tenant database in particular. However, enterprises often take the path of encrypting at the virtual machine or volume level, which means the entire disk is useless to an attacker even if it is physically stolen.

While this is very useful, that level of encryption does not help when the hacker gains legitimate access to the virtual machine in which the database is hosted. So database-level encryption, which further enforces protection at the user level (i.e., only users who have been granted READ permissions on the database can view the data), provides an added layer of security.

In my earlier article on Protecting Data At Rest In Public Clouds, I compared the options of protecting data in the middleware versus the RDBMS. As that comparison shows, a combination of both makes a multi-tenant database highly secure.
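A closely related application-tier measure (my addition, not a claim about how Yahoo stored its data) is to never keep passwords in readable form at all: store a salted, slow hash so that even a fully leaked table does not reveal the original passwords. A minimal sketch using Python's standard library:

```python
import hashlib, hmac, os

# Sketch: store a salted PBKDF2 hash instead of the password itself.
def hash_password(password):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest  # both are persisted; the password is not

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, stored = hash_password("s3cret")
print(verify_password("s3cret", salt, stored))  # True
print(verify_password("wrong", salt, stored))   # False
```

The iteration count slows brute-force attempts, and the per-user random salt defeats precomputed rainbow tables; this complements, rather than replaces, encryption at the database level.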

3. Exposing Too Much Metadata
How many times have you seen a database table storing credit card information named 'CREDIT_CARD', or the password column in a user table named PIN or PASSWORD? While past database design best practices called for descriptive column names and metadata such as database-level comments, in today's world they can be detrimental to sensitive data.

It is always advisable to keep such metadata away from sensitive columns and place it in the supporting documentation instead. No rule states that credit card information cannot be stored in a table named temp_t1, with the application alone knowing that temp_t1 is indeed the table containing credit card data.
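One way to keep this workable for developers is a small mapping layer in the application, so code still uses meaningful logical names while the schema exposes neutral ones. A sketch under assumed, illustrative names (`temp_t1`, `c1`, `c2` are taken from the example above):

```python
# Sketch: the application maps logical names to neutral physical names,
# so the schema itself reveals nothing about what the columns contain.
SCHEMA_MAP = {
    "credit_card": {"table": "temp_t1", "number": "c1", "expiry": "c2"},
}

def select_sql(logical_table, logical_cols):
    m = SCHEMA_MAP[logical_table]
    cols = ", ".join(m[c] for c in logical_cols)
    return "SELECT {} FROM {}".format(cols, m["table"])

print(select_sql("credit_card", ["number", "expiry"]))
# SELECT c1, c2 FROM temp_t1
```

The mapping itself then lives in the application configuration or documentation, not in the database catalog an attacker would browse after a breach.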

4. Not Using Best of Features in the Databases
Today most software vendors are under pressure to make their products run against multiple databases, which gives them much wider market reach. While this is important from a business perspective, the restriction means we often see products using powerful RDBMS systems as mere data stores, without utilizing their best security features. The result is reduced security at the database level, because none of those features are actually used.

In my earlier article on Implementing Multi Tenancy Using Oracle Virtual Private Database, I elaborated on how features like Oracle VPD, properly applied, ensure that data is not visible to a user unless the application sets the appropriate context. Such features can also mask certain columns so they are not visible when queried directly, and features like fine-grained auditing provide strong auditing against database security breaches.

Also, if database-level security is properly applied, roles, grants and privileges can be arranged so that database connections get only EXECUTE privileges on stored procedures and no direct READ access to the underlying tables. Design patterns of this kind protect the data further.

This points to the fact that product vendors should concentrate on the unique security features of databases like Oracle, SQL Server and DB2 and implement them in their designs, in addition to application-level security. The need to port the application to multiple databases should not reduce its security.
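When a product must stay portable and cannot rely on Oracle VPD itself, the same row-filtering idea can at least be emulated at the application tier. The sketch below is my own illustration of that fallback, not Oracle's mechanism: a session object appends the tenant predicate to every query, so callers can never omit it.

```python
import sqlite3

# Sketch emulating VPD-style row filtering in the application tier:
# every query is automatically restricted to the tenant bound to the
# session, so one tenant cannot read another tenant's rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (tenant_id TEXT, item TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("t1", "laptop"), ("t2", "phone")])

class TenantSession:
    def __init__(self, conn, tenant_id):
        self.conn, self.tenant_id = conn, tenant_id

    def query_orders(self):
        # The tenant predicate is added by the session, not the caller.
        return self.conn.execute(
            "SELECT item FROM orders WHERE tenant_id = ?",
            (self.tenant_id,)).fetchall()

print(TenantSession(conn, "t1").query_orders())  # [('laptop',)]
print(TenantSession(conn, "t2").query_orders())  # [('phone',)]
```

Unlike a true VPD policy, this is only as strong as the application code that enforces it, which is exactly why using the database's native features is preferable where possible.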

5. Not Masking the Data
Masking data means taking data in its current form and converting it into a different format for storage. It differs slightly from encryption in that the data remains readable but makes no sense unless you know the de-masking algorithm. Traditionally, masking is applied only when production data is ported to a test or development machine, or to an offshore development location.

However, we have not seen many instances of live production systems using masking techniques to protect sensitive information.

There is no golden rule that a social security number must be stored as a single, contiguous nine-character column. What if it were stored in three different columns with the digits flipped, and the application reversed them and reassembled the proper social security number? This is just an example; the same principle can be applied to a credit card number, a password or a PIN, so that only the application knows how to make sense of the masked sensitive data.
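The flipped-columns example above can be sketched directly. This is illustrative only; real masking schemes should use vetted, keyed transformations rather than rely on obscurity alone.

```python
# Sketch of the masking idea above: a 9-digit SSN is split into three
# reversed fragments for storage, and only the application knows how to
# reassemble it. Illustrative only -- not a substitute for encryption.
def mask_ssn(ssn):
    assert len(ssn) == 9 and ssn.isdigit()
    return ssn[0:3][::-1], ssn[3:5][::-1], ssn[5:9][::-1]

def unmask_ssn(parts):
    return "".join(p[::-1] for p in parts)

parts = mask_ssn("123456789")
print(parts)              # ('321', '54', '9876')
print(unmask_ssn(parts))  # '123456789'
```

An attacker dumping the three columns sees fragments that do not look like an SSN and cannot be joined correctly without knowing the transformation.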

Summary
As indicated at the beginning of the article, there will be a few security breaches when so many new services are delivered over the public Internet in a short time. However, enterprises need not worry that these are flaws in the fundamental concepts of cloud or multi-tenancy themselves; they stem from the way those concepts are implemented. Adopting sound design patterns such as the DAO pattern, which abstracts database calls from the application, together with proper input validation and the database-level protection techniques explained above, will prevent these incidents in the future.
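The DAO pattern mentioned above can be sketched briefly; the class and method names here are assumptions for illustration. The rest of the application never builds SQL itself: it calls the DAO, which validates input and uses parameterized statements internally.

```python
import sqlite3

# Sketch of a DAO: validation and parameterized SQL live in one place,
# behind an interface the rest of the application must go through.
class UserDAO:
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS users (username TEXT, email TEXT)")

    def add_user(self, username, email):
        if not username.isalnum():  # validate before acting on input
            raise ValueError("invalid username")
        self.conn.execute("INSERT INTO users VALUES (?, ?)",
                          (username, email))

    def email_for(self, username):
        row = self.conn.execute(
            "SELECT email FROM users WHERE username = ?",
            (username,)).fetchone()
        return row[0] if row else None

dao = UserDAO(sqlite3.connect(":memory:"))
dao.add_user("bob", "bob@example.com")
print(dao.email_for("bob"))  # bob@example.com
```

Centralizing data access this way means a single audit of the DAO covers every query path, instead of hunting for ad hoc SQL scattered across the codebase.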

More Stories By Srinivasan Sundara Rajan

Highly passionate about utilizing digital technologies to enable the next-generation enterprise. Believes in enterprise transformation through the natives (Cloud Native & Mobile Native).

