Riding Shotgun with Your Data

Historically, money and valuables have been locked away in safes to keep them secure and prevent theft. When valuables are out of the safe, being used or transferred, the risk of loss goes up. In the 1800s, when stagecoaches were used in the United States to transport gold, silver, and cash, a person would “ride shotgun” to help protect the valuables outside the safe.

Encryption in today’s world is a well-established way to help keep data secure, especially when it is “at rest,” or stored. However, to unlock the real value of data, businesses need to use it and interact with it in some way. This is where operational risk arises: someone leaving the door open, abusing data access privileges, or not knowing where and what kind of data is being accessed. To help customers protect data and reduce this operational risk, Oracle recently introduced Oracle Data Safe, which you might say allows them to “ride shotgun” on their cloud databases.

I recently spoke with Vipin Samar, who leads database security at Oracle. An expert on database security, he has spent years working to protect some of the world’s most sensitive information. He recently launched Oracle Data Safe, a new cloud security service that helps customers automate security for their cloud databases.

Fred: What are some of the challenges that companies face considering that they have valuable and sensitive data that must be used in their business yet face potential liability and financial loss if any of this data is breached?

Vipin: Data is now recognized as one of the most valuable assets businesses have. But when its security is compromised, it can become a great liability, as we’ve seen in some of the recent and very public data breaches. Organizations are in a catch-22 situation as they have to use their sensitive data to operate their business, but they must reduce the risk of that data being breached or misused.

Fred: When it comes to understanding threats and risks with cloud databases, what are customers most concerned about? Is it with the cloud infrastructure, the cloud provider, the database, or something else?

Vipin: When I talk with customers about moving their databases to the cloud, I hear several concerns. First, they express concern about the underlying infrastructure: the network, virtual images, operating systems, and databases. Oracle addresses these concerns with next-generation cloud infrastructure, automated security patching, and always-on data encryption. The next concern is protection from the cloud provider, which obviously is us. We address this with strong separation of duties for our cloud administrators and with activity monitoring. Their last big area of concern is how they can secure their own data, users, and configurations, something that only they can do. They worry about privileged users with broad access to all data, not knowing where their sensitive data is, lack of clarity about the security policies for their data, and maintaining secure configurations.

Fred: People working on cloud security are becoming familiar with the shared responsibility model, which distinguishes who is responsible for what in cloud security, yet confusion remains. What can a cloud provider do to help customers in the areas of security where the customer has responsibility?

Vipin: At some level, we can all empathize with what is happening. It is difficult for cloud customers to detect all security gaps and to understand how to turn all the security knobs and levers for their own data and users. Note that many turned to the cloud precisely because they lacked the time or expertise in the first place. The motivation for Data Safe was to provide automated, integrated security capabilities so that customers can more easily meet their share of the security responsibilities.

Vipin talks much more about Data Safe and its capabilities in a recent blog post, and you can view a demo here.

Related:

Take Control of your Cloud Strategy with a Data First Approach

The cloud can be a powerful tool for optimizing business outcomes, which is why our customers are actively mapping out their cloud strategies. When embarking on the journey to the cloud, you must carefully consider how your data is managed, as data is increasingly becoming your business’s most valuable asset. A cloud strategy therefore requires a data-first approach, one that ensures your data is secured, protected, available where and when you need it, delivered at the performance levels your applications require, and compliant with your company policies. Dell … READ MORE

Related:

How Cloud Computing and Virtualization can innovate Education in Panama

Cloud Computing has been evolving over the last decade, transforming businesses and giving them a new direction, not only through cost reduction but through better planning, the ability to add new resources to existing infrastructure, resource pooling, and on-demand self-service. Designing a cloud project raises concerns about performance, capacity, resource planning, availability, and security. Years ago, end users and IT professionals did not feel the cloud was reliable or secure enough for their organizations, and expectations ran ahead of reality. After years of refining cloud models, hardware vendors began developing solutions purpose-built for cloud computing, and the security concern was tackled by adding firewalls, ACLs, and honeypots into the solutions. With these security measures implemented, customers can feel more confident using a cloud solution. But choosing hardware, setting service levels, and managing resources are now the new challenges for cloud vendors and users.

When a system architect chooses hardware, it is necessary to follow each vendor’s best practices: following best practices and consulting the compatibility matrix for applications, operating systems, and other hardware components leads to the desired performance and resource utilization. Service levels, or SLAs, are also critical for cloud providers because organizations need the service to be 99.99% available. If a business unit is affected by a failed hard drive or a faulty hardware component, the resulting downtime can impact customers and cause lost profits. It is important for vendors to set realistic goals so customers can trust the platform’s reliability and security, and so stakeholders see zero or minimal impact when a component fails.

Last but not least, there is the challenge of resource management. Why is it a challenge? It is true that cloud computing offers flexibility over resources, but in a large environment it becomes challenging because the cloud must support elasticity and high demand, depending on the complexity of the solution. Hardware resources such as memory, CPUs, and hard drives need to be allocated according to the priority of each service. Performance and capacity issues are not limited to hardware; they can also stem from a lack of resource management. How can this be prevented? By developing resource policies and mechanisms. A well-known example of a resource policy is load balancing, which distributes workloads across compute systems so that all systems within the cluster carry a similar load without affecting performance or services (a minimal sketch of this policy follows below). Each cloud vendor offers resource-management tools that let IT administrators monitor, schedule, and assign resources within a cloud project. Although the cloud is challenging, it brings benefits, which will be described later; this paper also states why cloud computing and virtualization can drive innovation in Panama.
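To make the load-balancing policy concrete, here is a minimal sketch in Python. It assumes a simple least-loaded placement rule; the node names and workload costs are invented for illustration and do not correspond to any specific vendor tool.

```python
# Minimal sketch of a load-balancing resource policy: each incoming
# workload is placed on the compute node currently carrying the least
# load, so all nodes in the cluster stay roughly even.

import heapq

def balance(workloads, node_names):
    """Assign each (workload, cost) pair to the least-loaded node."""
    # Heap of (current_load, node_name): the lightest node pops first.
    heap = [(0.0, name) for name in node_names]
    heapq.heapify(heap)
    placement = {}
    for workload, cost in workloads:
        load, node = heapq.heappop(heap)
        placement[workload] = node
        heapq.heappush(heap, (load + cost, node))
    return placement

if __name__ == "__main__":
    # Hypothetical workloads for a school cloud: (name, relative cost).
    jobs = [("vdi-session-1", 2.0), ("grades-db", 5.0),
            ("mail", 1.0), ("learning-portal", 3.0)]
    print(balance(jobs, ["node-a", "node-b", "node-c"]))
```

Real cloud schedulers add constraints such as affinity rules and capacity limits, but the core idea of keeping every node at a similar load is the same.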

Cloud Computing simplifies the delivery of applications, which opens more opportunities for a business or a government entity. In addition, cloud computing can be combined with virtualization in a Software-Defined Data Center, allowing IT administrators to abstract the data center and manage resources according to their business needs. Other notable features of cloud computing are flexibility and mobility, which can increase collaboration within an organization. An example of both is remote access, which allows employees to reach email, spreadsheets, documents, or any non-critical application from home or from any place with a secure Internet connection. Cloud computing is also flexible because IT administrators can work with the application programming interfaces of different platforms such as OpenStack, Google Cloud, Amazon Web Services, Microsoft Azure, and many others.

Cloud Computing is also scalable, which makes it very useful for education and can transform the learning experience in schools. A notable virtualization technology is Virtual Desktop Infrastructure, or VDI. With VDI, information can be accessed promptly and resources are easier to manage. End users can use a tablet, a thin client, or a remote desktop to reach data or applications over a secure connection or from a public network. Applications delivered through VDI are stored in a private, public, or hybrid cloud rather than on the end user’s desktop, so a university or institution’s data is not put at risk. Some universities and research institutions use cloud computing with VDI to support homework, study materials, and research work. One example is Blackboard, whose material students and professors can access from a remote or local device. Such an application can be deployed on hardware like a PowerEdge server along with storage products like Unity or any other Dell EMC storage product.

Education in Panama can benefit from cloud computing because it offers the flexibility of resources that is needed and can adapt to end users’ needs. Cloud computing offers several advantages for the Ministry of Education: data can be stored in a cloud, it is accessible, and it is friendly for students and staff. Panama’s technology infrastructure is growing, and this is a great opportunity for the country to innovate in technology and education through cloud computing. The purpose of this Knowledge Sharing article is to build an effective cloud infrastructure for Panama’s Ministry of Education (MEDUCA), allowing students at public elementary and high schools to access updated learning material. The suggested cloud model for this project is Infrastructure as a Service (IaaS), in which IT administrators manage the operating systems and develop their own applications on the current network infrastructure. The IaaS will be deployed in a private or hybrid cloud environment through which public schools will access grades, homework, study material, interactive learning games, and evaluations. The project aims to connect 580 public schools across the country by implementing suitable Dell EMC products, applying virtualization techniques to manage resources, and following Dell EMC product best practices to achieve optimal performance and create new opportunities for the country.

Related:

Native Hybrid Cloud Makes Waves With Latest Release

Less than a year since we announced general availability, Native Hybrid Cloud is quickly becoming the most feature-rich way to run cloud-native applications on-premises. At VMworld, we announced a number of new enhancements to Native Hybrid Cloud that broaden our production capabilities, bring new use cases for high availability and disaster recovery, and solidify our position as the best way to run Pivotal Cloud Foundry on-premises.

As part of our core mission to help our customers succeed with their digital transformation initiatives, we’ve been working closely with our customers, taking direct feedback on how to improve Native Hybrid Cloud. Our conversations with customers have reflected the evolution of digital maturity in the market. A year ago, our customers were struggling simply to stand up a cloud-native platform to begin experimenting and developing next generation applications. Today, they are looking for more robust, production ready platforms to bring these applications to market.

Hyper-Converged Infrastructure and Elastic Cloud Storage Deliver High Availability to the Enterprise

To deliver a fault-tolerant, production-ready environment for Pivotal Cloud Foundry, we are offering out-of-the-box deployments for multiple levels of high availability. In the smallest configuration, customers can run multiple availability zones on a single site with physical and logical separation of resources. Pivotal recommends running multiple availability zones to assure platform health and performance, and this architecture lets you choose the level of redundancy needed to run mission-critical applications with minimal risk of applications going down.
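As a rough illustration of why multiple availability zones reduce risk, the sketch below spreads an application’s instances across zones round-robin, so the loss of any one zone leaves surviving instances. It assumes nothing about Pivotal Cloud Foundry internals; the zone and application names are hypothetical.

```python
# Round-robin placement of app instances across availability zones:
# no single zone holds all instances, so one zone failing does not
# take the application down.

from itertools import cycle

def spread_instances(app, instance_count, zones):
    """Place `instance_count` instances of `app` across zones round-robin."""
    placement = {}
    for i, zone in zip(range(instance_count), cycle(zones)):
        placement[f"{app}/{i}"] = zone
    return placement

placement = spread_instances("orders-api", 4, ["az1", "az2", "az3"])
print(placement)

# Simulate the loss of az1: the app still has instances in az2 and az3.
survivors = [inst for inst, zone in placement.items() if zone != "az1"]
assert survivors
```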

To help avoid downtime due to a site failure, multi-site deployments provide the ability to run a full Native Hybrid Cloud in each site in an active-active configuration. Multi-site architectures add a level of fault tolerance and resiliency such that even if Site 1 goes down, the Native Hybrid Cloud foundation at Site 2 takes over and applications continue to run. This architecture offers built-in layers of disaster recovery to ensure zero disruption and full application availability even with the loss of an entire site.

In a geo-replicated environment with multiple sites, ECS replicates chunks from the primary site to a remote site to provide high availability.

Elastic Cloud Storage’s robust fault tolerance and innate high availability standards ensure customers can recover from virtually any disaster scenario. Today, Native Hybrid Cloud components are backed up to Elastic Cloud Storage, while the Pivotal Cloud Foundry Elastic Runtime Blobstore lives on the platform as well. This ensures no single point of failure for applications running on Pivotal Cloud Foundry, and a secure backup for your logging data.

For customers running multi-site deployments, Elastic Cloud Storage’s geo-replication capabilities enable recovery of their logging and monitoring data even in the event of complete hardware loss. When used in combination with Pivotal’s recommended backup practices for the Pivotal Stack, it allows customers to easily recover, rebuild, and resume their Pivotal Cloud Foundry instances with no data loss.
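The geo-replication idea is easy to see in miniature. The toy sketch below copies each written chunk to a remote site and falls back to the remote copy when the primary is lost. This is not the actual ECS protocol, just an illustration of the concept; all names are invented.

```python
# Toy model of geo-replicated chunk storage: writes land on the primary
# site and are copied to a remote site, so data survives loss of the
# primary. Replication is synchronous here purely for simplicity.

class Site:
    def __init__(self, name):
        self.name = name
        self.chunks = {}  # chunk_id -> bytes

class GeoReplicatedStore:
    def __init__(self, primary, remote):
        self.primary = primary
        self.remote = remote

    def write(self, chunk_id, data):
        self.primary.chunks[chunk_id] = data
        self.remote.chunks[chunk_id] = data  # replicate to remote site

    def read(self, chunk_id):
        # Prefer the primary; fall back to the remote copy on failure.
        if chunk_id in self.primary.chunks:
            return self.primary.chunks[chunk_id]
        return self.remote.chunks[chunk_id]

store = GeoReplicatedStore(Site("site-1"), Site("site-2"))
store.write("chunk-001", b"application logs")
store.primary.chunks.clear()  # simulate total loss of the primary site
assert store.read("chunk-001") == b"application logs"
```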

Network Isolation provides an additional level of security and compliance for customers who deal with sensitive applications and data. It allows operators to segment portions of the network and limit who has access to different areas of the platform, such as data services, applications, and the management plane. By giving operators greater control and visibility over platform network traffic, secure network isolation improves the security of both the applications and the infrastructure for customers who require this level of security and compliance.

Simplifying Enterprise Deployments With Developer and Operator Tools

With additional levels of redundancy, availability, and security comes the risk of increased system complexity. Managing multiple foundations and multiple sites of a cloud-native platform is challenging for even the most seasoned operators and organizations, and even more so for those just starting their digital initiatives. To help manage these challenges and fulfill our commitment to our customers, we developed the Workbench within Native Hybrid Cloud, a series of tools designed to help developers and operators become productive on Pivotal Cloud Foundry on day one.

The Workbench includes two tools: the Access Testing Tool and our newest, the Deployment Management Tool. Codenamed ‘Fractal’, the Deployment Management Tool further automates the CI/CD pipeline by managing the application deployments and configurations of multiple foundations simultaneously. With it, developers simply submit their code, and when it’s ready to go live, operators can use a single ‘fractal push’ to automatically push the application across their entire production environment in a blue-green deployment, ensuring minimal disruption. The Access Testing Tool focuses on helping developers easily diagnose and resolve connectivity issues during the development process. Developers simply select the data and web services their applications need access to, and see whether their applications will have permission to reach those services in a production environment.
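Fractal’s actual interface isn’t shown here, but the blue-green pattern behind a ‘fractal push’ can be sketched generically: deploy the new version to the idle slot, health-check it, and switch traffic only if the checks pass. The deploy and health-check functions below are hypothetical stand-ins, not the actual Fractal API.

```python
# Generic blue-green cutover: the new version is staged on the idle
# slot and traffic switches only after health checks pass, so the old
# version keeps serving if anything goes wrong.

def blue_green_deploy(router, deploy, health_check, new_version):
    idle = "green" if router["live"] == "blue" else "blue"
    deploy(idle, new_version)      # stage the new version on the idle slot
    if health_check(idle):
        router["live"] = idle      # cut traffic over in one step
        return True
    return False                   # old version keeps serving traffic

if __name__ == "__main__":
    router = {"live": "blue"}
    ok = blue_green_deploy(
        router,
        deploy=lambda slot, v: print(f"deploying {v} to {slot}"),
        health_check=lambda slot: True,  # stand-in for real checks
        new_version="v1.4",
    )
    print("live slot:", router["live"], "success:", ok)
```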

Native Hybrid Cloud 1.4 is packed with production-ready features, solidifying it as the best way to run Pivotal Cloud Foundry on-premises. Learn more about Native Hybrid Cloud and get hands-on with our interactive demos to see just how easy it is to deploy and manage applications with our platform.

Thanks, and keep following!
Related:

Increasing tech is decreasing entry barrier for cyber criminals

Surendra Singh, Country Director, Forcepoint

Over the past few years, cloud technology has established itself as a secure, modern alternative to legacy on-premises digital infrastructure, but this has also expanded the cyberattack surface and the potential impact of attacks. In an interview with Mohd Ujaley, Surendra Singh, Country Director, Forcepoint, shares his views on security in the age of cloud and the impact of artificial intelligence and cyber warfare on the physical world. He says, “As we move forward, the number of security breaches in the cloud will vastly increase from what we see today, so cloud providers and enterprises need to be careful about their asset security.”

There is growing adoption of cloud technology. What are your views on security in the cloud?

Technology is reducing the entry barrier for cyber criminals. The reason we say this is that every organisation is talking about the cloud, and most CIOs and CISOs feel that once things are in the cloud, their responsibility for security shrinks because the cloud service provider will use a specialist team to secure their assets. But in reality there is a flip side: if all organisations go to one provider for their cloud security, then even though a team of specialists secures the cloud data, a single breach can lead to a vast amount of data being stolen in one go. That is why cloud providers have to be careful about this.

As we move forward, the number of security breaches in the cloud will vastly increase from what we see today. So cloud providers should be careful, and the organisations that depend on the cloud should know this too. A lot of automation is happening with the help of artificial intelligence, which will enable organisations to find potential breaches quickly. Therefore, firms creating a new business are advised to make security an integral part of their core business architecture.

In the recent past, we have seen nation states involved in cyber warfare. What impact do you see cybersecurity having on physical security?

You are right; the convergence of cyber and physical is affecting foreign policy these days. We have seen limited inter-country cyber warfare happening, especially between Russia and the US or between the US and China. This carries the risk of nations getting into full-fledged warfare, because of late there have been attempts to attack countries’ internal infrastructure. A recent example is Russia’s attempt to completely cripple Ukraine’s power system. Fortunately, Ukraine was able to avert it, but any attack on national infrastructure could have a catastrophic impact.

In the cyber world it is very difficult to find the source of an attack, because attackers can manipulate attribution. For example, ISIS could spark a war between two countries just by staging cyberattacks: each country would be under the impression that the other had done it. We have to be careful, as these things can be easily manipulated.

And this challenge is compounded by the fact that the majority of individuals’ personal details are available online through social media sites. An attacker can use a combination of demographic details and artificial intelligence technology to impersonate someone.

There are more than 500 active cybersecurity solution providers in the world, and about 42 of them are working aggressively in India. Yet we still see cyberattacks and data breaches happening regularly. Why?

I agree with you; the buck must stop with industry and enterprises. They should take responsibility. As a security firm, our job is to make our customers’ lives simpler and easier. Today, most enterprises deal with multiple security vendors, but unfortunately the vendors’ security solutions do not fully integrate with each other. The result is that customers and their security tools remain mismatched, leading to loopholes.

We all have to work on integrating our own products rather than making separate plans to secure different organs of the enterprise. The whole idea is that more companies should merge and integrate their products and thereby give the customer a unified solution. Also, at the company level, not only the CSO but other key leaders should come together to ensure robust security for their business, just as most of them do for sales, taxation, and so on.

What kind of impact will AI platforms have on cybersecurity?

With AI, more and more information is held in digital form, and as it gets digitized it becomes vulnerable to external attack. As artificial intelligence becomes mainstream, the security challenges will grow many-fold, and we have to track and mitigate them right from the beginning. What is more important is that we make security easy to implement, so that customers and security teams have more time to understand the threats that are coming, spending their time analyzing future threats rather than resolving ones that have already happened.

Related:

Want More from Cloud? New Exadata Subscription-Based On-Premise Services

By: Edgar Haren

Principal Product Marketing Director

Today we have a guest blog from our peers on the Oracle Cloud Machine team. Anne Plese, Director, Product Marketing, Cloud Infrastructure Database Solutions, discusses the value of Oracle Exadata Cloud Machine and the use cases behind this solution.

Cloud-integrated engineered systems for Oracle Databases address many of the key concerns that inhibit enterprises from making a swifter transition away from legacy on-premises infrastructures. It’s fair to say that the vast majority of companies are well aware of the benefits provided by cloud-hosted applications. But for businesses that have real or perceived concerns about implementing cloud-hosted solutions for their more complex database, analytics, or OLTP needs, many of the benefits this technology provides are just out of reach.

Cloud Concerns

The primary worries about the public cloud include data residency, security, and vendor lock-in. For private cloud implementations, lock-in is the biggest concern, followed by a lack of appropriate skills, and then security.

Data security does appear to be the primary inhibitor, however. According to a Cloud Industry Forum report, privacy and security worries remain as big a barrier to adoption as they were five years ago. In this video, for example, Juan Loaiza, SVP at Oracle, reviews the most common objections companies communicate to us, including:

  • Regulatory or corporate policies that require data to be local to the company or territory
  • Limited resources or IT skills needed to manage database infrastructure
  • Lack of proximity to public cloud data centers, or latency issues that compel many organizations to demand the performance of a LAN infrastructure
  • Legacy database and infrastructure complexity
  • Risk associated with data security and worry about the cost and quality of services cloud providers offer

The Confident Transition

Oracle appreciates the lingering concerns that some companies have about hosting mission-critical business processes in the cloud. That’s why we offer more choices of database infrastructure services to help companies transition to the cloud at their own pace. Central to that is the Oracle Exadata Cloud Machine. This fully integrated engineered system is delivered to your data center to address latency and data sovereignty requirements, with all of the benefits of a modern cloud service. Here’s how it works: the services are physically delivered from inside your firewall, next to systems and data where compliance and operational policies have already been defined. Oracle conforms to regulatory, privacy, and legal frameworks, and meters only the services you consume.

A Good Fit if You Are New to Exadata

Exadata Cloud Machine is identical to Oracle’s public cloud service in that it is managed for you by Oracle cloud experts. It’s a cloud-based service that remains under your control, but all of the resource-consuming activities associated with managing infrastructure components, including servers, storage, storage software, networking, and firmware, are handled by Oracle Cloud Operations. As with our Exadata Cloud Services public cloud offering, companies avoid the costs of hiring, training, and retaining staff with specialized skills. Exadata Cloud Machine enables easy migration of existing databases across the LAN and simplifies database backup to existing data center infrastructure. Plus, it provides a maximum-availability architectural configuration with built-in best practices. For companies that want more from the cloud, the Exadata Cloud Machine is a proven path away from cumbersome and costly traditional on-premises infrastructures.

Learn more about how our engineered systems can address any concerns you have with migrating to the cloud.

Related:

Deploying DLP on private cloud

I need a solution

Hi.

I have a client who wants to deploy DLP in Google’s private cloud (Google Cloud Platform). I would like to know whether it is possible to do a two-tier or three-tier deployment on Google Cloud Platform, just like the two-tier deployment on Amazon (AWS), what requirements we would need, and whether this is supported by Symantec.

Thanks


Related: