Top three security tutorials (May 2016)

We know that you care first and foremost about protecting your assets. developerWorks Security continues to bring you the content you need to keep your data out of harm's way. Here are the top articles from the Security zone for Q1 2016. You don't want to miss out!

Related:

Predictions for Big Data Security in 2016

Leading into 2016, Oracle made ten big data predictions, one of which focused on security. Nearly four months into the year, we have already seen these predictions coming to light.

Increase in regulatory protections of personal information

Early February saw the creation of the Federal Privacy Council, “which will bring together the privacy officials from across the Government to help ensure the implementation of more strategic and comprehensive Federal privacy guidelines. Like cyber security, privacy must be effectively and continuously addressed as our nation embraces new technologies, promotes innovation, reaps the benefits of big data and defends against evolving threats.”

The European Union General Data Protection Regulation is a reform of the EU's 1995 data protection rules (Directive 95/46/EC). The EU's Big Data fact sheet was put forth to help promote the new regulations: “A plethora of market surveys and studies show that the success of providers to develop new services and products using big data is linked to their capacity to build and maintain consumer trust.” As for the timeline, the EU expects adoption in Spring 2016, with enforcement beginning two years later in Spring 2018.

Earlier this month, the Federal Communications Commission announced a proposal to restrict Internet providers’ ability to share with advertisers and other third parties the information they collect about what their customers do online.

Increased use of classification systems that categorize data into groups with pre-defined policies for access, redaction, and masking

An Infosecurity Magazine article highlights the challenge of data growth and the need for classification: “As storage costs dropped, the attention previously shown towards deleting old or unnecessary data has faded. However, unstructured data now makes up 80% of non-tangible assets, and data growth is exploding. IT security teams are now tasked with protecting everything forever, but there is simply too much to protect effectively – especially when some of it is not worth protecting at all.”

The three benefits of classification highlighted include the ability to raise security awareness, prevent data loss, and address records-management regulations. All of these are legitimate benefits that organizations should consider. Case in point: Oracle customer Union Investment increased agility and security by automatically processing investment fund data within its proprietary application, including complex asset classification with up to 500 data fields that were previously distributed to IT staff using spreadsheets.
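As a toy sketch of what such pre-defined, per-tier policies can look like in practice (all tier names, field names, and the helper function here are invented for illustration):

```python
# Toy sketch: classification tiers mapped to pre-defined access and
# masking policies (all names invented for illustration).

POLICIES = {
    "public":       {"access": "anyone",       "mask": False},
    "internal":     {"access": "employees",    "mask": False},
    "confidential": {"access": "need-to-know", "mask": True},
}

def apply_policy(record: dict, classification: str) -> dict:
    """Return a copy of `record`, masked if its tier requires it."""
    policy = POLICIES[classification]
    if not policy["mask"]:
        return dict(record)
    # Mask all but the last four characters of each field value.
    return {
        field: "*" * max(len(str(value)) - 4, 0) + str(value)[-4:]
        for field, value in record.items()
    }

masked = apply_policy({"card": "4111111111111111"}, "confidential")
```

The point of the pattern is that the policy lives with the classification tier, not with each application, so a record's handling follows automatically from how it is classified.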

Continuous cyber-threats will prompt companies both to tighten security and to audit access to and use of data

This is sort of a no-brainer. We know more breaches are coming, such as here, here, and here. And we know companies increase security spending after they experience a data breach or witness one close to home. Most organizations now know that completely eliminating the possibility of a data breach is impossible; appropriate detective capabilities are therefore more important than ever. We must act as if the bad guys are already on our network, detect their presence, and respond accordingly.

See the rest of the Enterprise Big Data Predictions, 2016.

Image Source: http://www.informationisbeautiful.net/visualizations/worlds-biggest-data-breaches-hacks/

Related:

Symantec Backup 2012 Disaster Recovery to different hardware, or other testing routes

I need a solution

Okay, I know this post is going to sound like a broken record. I have searched and read through other threads, but they either didn't address the issue directly or gave an unclear explanation that pertained to that specific person's problem. Here goes, and please accept my apologies if this post seems redundant.

Okay, so I have an HP ProLiant DL380 G4 server running Symantec Backup Exec 2012 (I know, I know, it's an old server lol). I'm used to using PureDisk, but administering backups with SBE 2012 seems quite easy... until it's time to do recovery testing. I noticed I didn't have an adequate testing environment for my recovery tests, so I grabbed the best computer I could find: a Dell Tower 5810 with a 1TB backup drive and a 250GB boot drive, not set up for RAID. When I tried doing a recovery of my HP ProLiant server to the Dell Tower, I was met with a number of messages indicating the recovery might not go smoothly; I will list those messages at the end.

A little FYI: I do full SDR backups on our server here; the server is set up in a RAID 0 with a C: and D: drive, and the Dell Tower boots in AHCI mode. Basically, I am not able to recover my backups to the Dell Tower. It's obvious that either this scenario won't work at all, or I will need to do quite a bit of tweaking to make my test recoveries successful. I would really like to find a way to recover to the Dell Tower, because if that HP server did crash, that may very well be all I have to use. So I have two questions, if anyone can please answer:

1. How can I get this scenario to work? In other words, how can I restore my backups to different hardware?

2. In the future, what is the best test environment to set up for conducting test recoveries?

Thanks, thanks, thanks to anyone who can help me with this. It has been a headache going on three days trying to figure this out on my own. Below are the messages I ran into while trying to test-recover my backups to a different computer:

– Your computer may not start successfully, if your system or boot volume resides on one of the hard disks attached to an inactive controller

– Drivers for one or more controllers are not installed. The computer may not start because the required system…

– If the recovered computer contains multiple hard disks, ensure that the computer’s BIOS is configured to start the computer from the hard disk that contains the Windows Operating System

– invalid partition table (I received this message after rebooting from a restore)


Related:

Tape backups not showing in BAR

I need a solution

I noticed in the BAR, when I'm browsing the backups, that my duplicated jobs to tape are no longer showing as a source to restore from. I spotted this because my tape jobs have a 7-year retention and my standard backups have 1 year. Starting in February, none of the 7-year jobs are showing. This is around the same time I bumped my primary backups from 3 months to 1 year in the SLP.

I ran a report on the past tape numbers, and the images are on the tapes with the 7-year retention.

Are they not showing because Vault is not working properly, or am I just not understanding how this works? That is, because my primary backup has a 1-year retention, will the actual tape copy not show until the 1-year retention expires?


Related:

Question about FP2 upgrade

I need a solution

We have 1 CAS + 40 MMS + 60 Windows agents + 10 DB backup agents + 5 Linux backup agents, all running BE15.

1. To upgrade BE15 to FP2: if the CAS is upgraded to FP2 first but the MMS are still running BE15, can backup jobs running on the CAS work normally even though the MMS are not yet upgraded?

2. To upgrade BE15 to FP2, do I need to reinstall the existing Windows and Linux backup agents?

3. To upgrade BE15 to FP2, is a server restart required?

4. Do I have to back up the CAS database for jobs before upgrading to FP2?


Related:

Trend Micro Unveils Unprecedented Report Dissecting Data Breaches

DALLAS

With organizations across all industries experiencing cyberattacks, Trend Micro Incorporated (TYO: 4704; TSE: 4704), a global leader in security software and solutions, today released a ground-breaking report, “Follow the Data: Dissecting Data Breaches and Debunking Myths.” The report analyzes each element of a data breach, including attack methods, motivations, and how stolen data is used, providing key insights for businesses to understand the nature and likelihood of breaches in their industry.


Related:

Trend Micro Launches Attack Scanner App for Splunk to Protect Against Next-Gen Threats

DALLAS

As businesses strive to expand their customer insight through big data analytics, Trend Micro Incorporated (TYO: 4704; TSE: 4704), a global leader in security software, today announced the launch of the Trend Micro™ Attack Scanner app for Splunk (NASDAQ: SPLK), a provider of the leading software platform for real-time operational intelligence.


Contact:

For further information: Lindsey Lockhart, for Trend Micro, lindsey.lockhart@hck2.com, 972-499-6614

Press contacts: Scott Perry, Director of Marketing, +1 (613) 599-4505 x2274, scott_perry@trendmicro.com


Related:

How to rebuild MS Agent Jobs from MSDB table?

This morning our DB server died.

They restored backups to another server but these didn’t include the many MS Agent jobs.

Anyway, I have managed to get a backup of the old msdb database, where all the tables used to create the MS Agent jobs are held, restored onto our new server.

Therefore I need a script to re-create them on the new server.

There must be a way in MS SQL to script them out, since you can do it from the management console. So if anyone knows of a script to do this, or where to find the Microsoft one, please let me know.

As no manual backups were created, a lot of jobs will be missing, and people won't know how to re-create them manually (which is why I think being able to add them to the nightly backup process would be good). I'm a web dev, so it's not my job; I just happen to be the only person around to do this lovely task.

Any help would be much appreciated.
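For what it's worth, the job definitions in a restored msdb live in the system tables msdb.dbo.sysjobs and msdb.dbo.sysjobsteps, and the rows can be turned back into sp_add_job / sp_add_jobstep calls. A rough Python sketch of that idea (the rows are hard-coded here for illustration; in practice you would fetch them from the restored database, and the helper name and sample job are invented):

```python
# Rough sketch: regenerate sp_add_job / sp_add_jobstep / sp_add_jobserver
# calls from rows pulled out of a restored msdb (msdb.dbo.sysjobs and
# msdb.dbo.sysjobsteps). Rows are hard-coded dicts for illustration.

def script_job(job: dict, steps: list) -> str:
    """Build the T-SQL that re-creates one Agent job on the new server."""
    lines = [f"EXEC msdb.dbo.sp_add_job @job_name = N'{job['name']}';"]
    for step in steps:
        lines.append(
            f"EXEC msdb.dbo.sp_add_jobstep @job_name = N'{job['name']}', "
            f"@step_name = N'{step['step_name']}', "
            f"@subsystem = N'{step['subsystem']}', "
            f"@command = N'{step['command']}';"
        )
    # Without sp_add_jobserver the job exists but is not assigned to run.
    lines.append(
        f"EXEC msdb.dbo.sp_add_jobserver @job_name = N'{job['name']}';"
    )
    return "\n".join(lines)

sql = script_job(
    {"name": "NightlyETL"},
    [{"step_name": "Run ETL", "subsystem": "TSQL",
      "command": "EXEC dbo.RunNightlyEtl"}],
)
```

A real version would also need to carry over schedules (sysschedules / sysjobschedules) and escape any quotes inside the step commands; this only shows the shape of the mapping.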

Related:

How do I responsibly dispose of backup tapes?

Our old tape drives have failed, and we are not using tapes for backup anymore. We still have a stack of DLT tapes with backups that may contain sensitive information such as credit card numbers, social security numbers, etc.

How do I responsibly dispose of these backup tapes?

If I had a working drive, I would be tempted to dd from /dev/urandom to the tape device, but the drives have failed. Would this have been a good method if a drive were still working? What do you recommend I do with these tapes, given that I have no working drive for them?

Related: