Aug 27
Classify All Files in a Folder (Local Path)

Introduction

On occasion it may not be possible to classify based solely on the actual content of a file. The scenario arises when all files in a folder need to be classified as HR, Finance, etc., so that data can be tracked (tagged) back to the source location from which it originated. The following procedure shows how to persistently classify all files in a folder.

Requirements

This process requires the following prerequisites:

  • Current versions of the SDM product(s).
  • All Files "SDD".
  • "Workflow" rule.
  • Search "Policy".

Important Notes

Note that proceeding with the following could override the true content-based classifications for any file in these locations.

Process

STEP 1 (SDD)

  1. Import or create a RegEx sensitive data type that captures all files in a location, using the expression below (a quick verification sketch follows this list).
  2. From within the Spirion SDM Console: Admin > Sensitive Data Types > Add > Select Data Type = Regular Expression > Name = Classify All Files > Expression = (\S|\s|\w|\d)*
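
To convince yourself the expression matches any file content, here is a minimal Python sketch (illustrative only; Spirion evaluates the expression internally):

import re

# (\S|\s) alone already covers every character, so this pattern
# matches any content, including an empty file.
CLASSIFY_ALL = re.compile(r"(\S|\s|\w|\d)*")

for sample in ["", "plain text", "Numbers 123", "lines\nand\ttabs"]:
    result = "matches" if CLASSIFY_ALL.fullmatch(sample) else "no match"
    print(repr(sample), "->", result)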

STEP 2 (Workflow)

  1. Import or create a new classification from within the Spirion SDM Console: Admin > Classification > Add > Name = XXXX.
  2. Optionally select a color, icon, or weight.
    1. Click OK.
  3. Highlight the new Classification, from within the Spirion ribbon Select Rule > Add.
    1. Workflow Rule
      1. Provide a workflow rule name and other options
    2. Definition
      1. Select the following from the dropdown options:
        1. "Location", "Contains", and the path to the folder whose files should all be classified as XXXX.

    1. Endpoints
      1. Select the appropriate endpoint from where the local files are being searched.
    2. Actions
      1. The Classification tag will be automatically selected; if not, select the XXXX classification name created above from the "Classify results as:" dropdown.
      2. For "Execute classification rules:" select "Directly on the endpoint".
    3. Click Finish
      1. Located at the bottom right of the page; this saves the new rule.

STEP 3 (Search Policies)

  1. Import or create a new search policy from within the Spirion SDM Console: Admin > Policies.
  2. Create a new Policy by clicking Policy > Create.
    1. On the Policy Tab provide a name and optionally a description.
      1. For Policy Type select Scheduled Task.
      2. On the Endpoints Tab select the same endpoint chosen for the workflow created previously.
      3. On the Data Types Tab deselect all.
      4. On the Location Tab deselect all.
    2. Click Finish
  3. Select the policy name just created > expand the tree view > expand Search Locations and Select Custom Folders.
    1. Click Add from the Ribbon.
    2. In the new Folder Location field, enter the same folder path used in the workflow, for example (C:\Location\PoC_Test_Data).
    3. Select "Include in Search" to the right of the folder patch for the Scope.
    4. Click the green check mark to the left of the folder path to save the changes.
  4. Select the policy name just created > expand the tree view > Select Sensitive Data Types.
    1. From within the resulting list select "Classify All Files" as created in Step 1 of this document.
  5. Select the policy name just created > expand the tree view > Select "Scheduled Tasks" or "Search > Initiate Search" to search and classify all files from within the target folder location.

Outcomes

As a result of the preceding steps, all files within the folder will be classified as XXXX so that data can be tracked (tagged) back to the source location from which it originated.

Aug 14
Credit Card Validation

In this post I explain the numerous ways to identify and validate a credit card number (CCN). The main point is to articulate the complex nature of identifying sensitive data. Identifying these types of sensitive data manually is not practical, hence the need to automate using tools such as Spirion.com.

Validation of CCN

Below are techniques that can be used to perform cursory checks on CCNs, along with an explanation of the most common validation techniques.

Luhn Algorithm Check

The Luhn Algorithm is a simple checksum formula used to validate a variety of identification numbers, including credit card numbers as well as:

  • IMEI numbers
  • US National Provider Identifier (NPI) numbers
  • Canadian Social Insurance Numbers
  • Israeli ID Numbers
  • South African ID Numbers
  • Greek Social Security Numbers (ΑΜΚΑ)
  • McDonald's survey codes
  • Taco Bell receipts
  • Tractor Supply Co. receipts


In addition, most credit cards and government identification numbers use this algorithm as a simple method of distinguishing valid numbers from mistyped or otherwise incorrect numbers.
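
To make the checksum concrete, here is a minimal Python sketch of the Luhn check (for illustration; tools like Spirion pair this with many other validations):

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    total = 0
    # Double every second digit from the right; subtract 9 when the
    # doubled value exceeds 9 (same as summing its digits).
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4111 1111 1111 1111"))  # True - well-known Visa test number
print(luhn_valid("4111 1111 1111 1112"))  # False - last digit altered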

Major Industry Identifier

The first digit of a credit card number represents the category of entity which issued the card.

Issuer identification number

The first six digits of a card number identify the institution that issued the card to the card holder.

Personal Account Number

Digits 7 through the second-to-last digit identify the individual account (the final digit is the Luhn checksum).

How many digits in a Credit Card Number?

  • Visa and Visa Electron: 13 or 16
  • MasterCard: 16
  • Discover: 16
  • American Express: 15
  • Diner's Club: 14 (including enRoute, International, Blanche)
  • Maestro: 12 to 19 (multi-national Debit Card)
  • Laser: 16 to 19 (Ireland Debit Card)
  • Switch: 16, 18 or 19 (United Kingdom Debit Card)
  • Solo: 16, 18 or 19 (United Kingdom Debit Card)
  • JCB: 15 or 16 (Japan Credit Bureau)
  • China UnionPay: 16 (People's Republic of China)
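
Putting the structure together, here is a small Python sketch that splits a card number into the parts described above (illustrative only; a real issuer lookup requires a maintained BIN database):

def describe_ccn(ccn: str) -> dict:
    digits = "".join(d for d in ccn if d.isdigit())
    return {
        "length": len(digits),
        "major_industry_identifier": digits[0],      # first digit
        "issuer_identification_number": digits[:6],  # first six digits
        "account_identifier": digits[6:-1],          # digit 7 to last-1
        "check_digit": digits[-1],                   # Luhn checksum digit
    }

print(describe_ccn("4111 1111 1111 1111"))
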
Jul 13
Migrating a MSSQL DB to AWS RDS MSSQL

To import a MSSQL .bak file from another location into an Amazon AWS RDS MSSQL instance you must follow these instructions; there is currently no other option for RDS MSSQL.

Create an S3 bucket and verify the RDS MSSQL instance can access it; this can be accomplished by modifying the VPC appropriately or by granting public access to the bucket. After this, verify you can connect to the instance on port 1433 using SSMS, or by using telnet to the instance name, such as the way mine looked (a scripted reachability check follows the example):

  • nameofdb.b6rfjiaj2jhj.us-east-1.rds.amazonaws.com
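
If you'd rather script the reachability check than use telnet, here is a minimal Python sketch (the hostname is the example instance name above; substitute your own):

import socket

HOST = "nameofdb.b6rfjiaj2jhj.us-east-1.rds.amazonaws.com"  # example endpoint
PORT = 1433  # default MSSQL port

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"Reachable: {HOST}:{PORT}")
except OSError as exc:
    print(f"Cannot reach {HOST}:{PORT} -> {exc}")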

When accessible, run the following query within SSMS to import the .bak from S3 into RDS MSSQL.

-- @restore_db_name is the name of the database to create, not the .bak file name.
exec msdb.dbo.rds_restore_database
@restore_db_name='DBNAME',
@s3_arn_to_restore_from='arn:aws:s3:::s3bucketname/DBNAME.bak',
@with_norecovery=0,  -- 0 = restore WITH RECOVERY (database online afterwards)
@type='FULL';


You can track the import status using the following Native Tracking guide (a scripted status check follows the link):

https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/SQLServer.Procedural.Importing.html#SQLServer.Procedural.Importing.Native.Tracking
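
If you want to poll the status from a script instead, here is a sketch using pyodbc (connection details are placeholders; rds_task_status is the AWS-documented procedure for tracking native backup/restore tasks):

import pyodbc  # pip install pyodbc; requires a SQL Server ODBC driver

# Placeholder connection string; substitute your endpoint and login.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=nameofdb.b6rfjiaj2jhj.us-east-1.rds.amazonaws.com,1433;"
    "UID=admin;PWD=yourpassword"
)

conn = pyodbc.connect(CONN_STR, autocommit=True)
cursor = conn.cursor()
cursor.execute("exec msdb.dbo.rds_task_status @db_name = ?", "DBNAME")
columns = [col[0] for col in cursor.description]
for row in cursor.fetchall():
    print(dict(zip(columns, row)))
conn.close()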

If you receive the following error during import, you must create an RDS "Option Group".

Msg 50000, Level 16, State 0, Procedure msdb.dbo.rds_restore_database, Line 80 [Batch Start Line 0]
Database backup/restore option is not enabled yet or is in the process of being enabled. Please try again later.
USAGE:
EXECUTE msdb.dbo.rds_restore_database @restore_db_name, @s3_arn_to_restore_from, [@kms_master_key_arn], [@type], [@with_norecovery]
@restore_db_name
: Name of the database being restored
@s3_arn_to_restore_from
: S3 ARN of the backup file used to restore database from.
@kms_master_key_arn
: KMS customer master key ARN to decrypt the backup file with.
@type
: The type of restore. Valid types are FULL or DIFFERENTIAL. Defaults to FULL.
@with_norecovery
: The recovery clause to use for the restore operation. Set this to 0, to restore with RECOVERY (database will be online after the restore).
Set this to 1, to restore with NORECOVERY (database will be left in the RESTORING state allowing for subsequent differential or log restores).
For FULL restore, defaults to 0.
For DIFFERENTIAL restores, you must specify 0 or 1.

Navigate to the Amazon RDS portal.


Click Options Group > Create Group

  • Provide a name with no spaces or capital letters (for example, mssqlse)
  • Provide a description (for example, MSSQL Standard Edition)
  • Choose "sqlserver-se" for Standard Edition MSSQL
  • Choose Engine Version (14.00 in this case)


  • Click Create

The new Options Group is now displayed in the available Options Groups for your Amazon RDS portal page.

  • Select the newly created Options Group and Add Option.
  • Choose SQLSERVER_BACKUP_RESTORE for the Option Details name.
  • Choose "Create Custom" from the IAM dropdown option.
  • Choose Immediately for the Scheduling option.
  • Select Add option.

Back on the Amazon RDS DB Portal Page

  • Select your DB instance and select Modify.
  • Scroll down the Modify DB Instance page to Database Options
    • Change the Option group to your newly created group.
  • Click Continue
  • Under the Scheduling of modifications section, select Apply Immediately.
  • Click Modify DB Instance.
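
The same console steps can be scripted; here is a minimal boto3 sketch (group name, engine version, instance identifier, and IAM role ARN are placeholders; the role must already grant access to your S3 bucket):

import boto3  # assumes AWS credentials are already configured

rds = boto3.client("rds")

rds.create_option_group(
    OptionGroupName="mssqlse",
    EngineName="sqlserver-se",
    MajorEngineVersion="14.00",
    OptionGroupDescription="MSSQL Standard Edition",
)

# SQLSERVER_BACKUP_RESTORE requires an IAM role that can read the bucket.
rds.modify_option_group(
    OptionGroupName="mssqlse",
    OptionsToInclude=[{
        "OptionName": "SQLSERVER_BACKUP_RESTORE",
        "OptionSettings": [{
            "Name": "IAM_ROLE_ARN",
            "Value": "arn:aws:iam::123456789012:role/rds-backup-restore",
        }],
    }],
    ApplyImmediately=True,
)

# Attach the option group to the instance.
rds.modify_db_instance(
    DBInstanceIdentifier="your-db-instance",
    OptionGroupName="mssqlse",
    ApplyImmediately=True,
)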


This will associate an option group that permits the import of a database into the AWS RDS MSSQL instance.

I have done this successfully for a vended application and SharePoint 2019 thus far. Happy importing!

Mar 01
Coronavirus COVID-19 Report

Here is a map of the Coronavirus COVID-19 global cases by the Center for Systems Science and Engineering (CSSE) at Johns Hopkins University. It is a very helpful interactive graphic to better understand the virus trends globally.

https://gisanddata.maps.arcgis.com/apps/opsdashboard/index.html#/bda7594740fd40299423467b48e9ecf6

https://covid-19.splunkforgood.com/coronavirus__covid_19_

Feb 17
Disconnect Bing and Cortana Online Services from Windows 10 Search

To disconnect the Bing and Cortana online services from Windows 10 search:

  • Run Regedit.exe
  • Navigate to HKEY_CURRENT_USER > SOFTWARE > Microsoft > Windows > CurrentVersion > Search
  • Look for 'BingSearchEnabled'; if you don't see it, create it (right-click in a blank area, pick New > DWORD (32-bit) Value, and name it 'BingSearchEnabled').
  • Open BingSearchEnabled, set it to 0, press OK.
  • Look for 'CortanaConsent'; again, create it if it doesn't exist using the method above, and also set it to 0.
  • Reboot.
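
The same registry edit can be scripted with Python's standard winreg module; a minimal sketch (Windows only; edit the registry at your own risk):

import winreg

KEY_PATH = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Search"

# Create the key if needed, then set both DWORD values to 0.
with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "BingSearchEnabled", 0, winreg.REG_DWORD, 0)
    winreg.SetValueEx(key, "CortanaConsent", 0, winreg.REG_DWORD, 0)

print("Done - reboot for the change to take effect.")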


Dec 30
Data discovery management to meet governance, regulation and compliance.

Sensitive data has considerable economic value to businesses, and they must manage it appropriately. Protecting that data is of utmost concern because organizations risk substantial fines from regulators or damage to their brand reputation (Yahoo, Marriott, Equifax, etc.). As a result, organizations are primarily driven to protect their networks because of the sensitive data they store.

At the core of all successful data security management programs is an automated data discovery process that can accurately locate sensitive data so that it can be most effectively managed and protected. Generally, most of that sensitive data is stored on workstations, file servers, databases, and cloud repositories, and the breadth of where data can be stored has grown significantly from just a few years ago. The exceedingly large number of individuals involved in handling sensitive data makes appropriate data management very challenging. You can't protect these assets unless you know where the sensitive data resides.

In my 20+ years of operational data management and security, the most successful data management and governance programs have included a unified set of information security tools and a governance and compliance process. This process includes many components, but the most effective ones generally involve good user awareness of the sensitive data in use, the classification or sensitivity of the data, a data governance process that includes policy, best practices, and standards, and IT systems used to augment data security protection strategies. With these key processes in place an organization can be most successful and meet internal IT policy, governance, regulation, and compliance needs.

IT Data Management

The most effective data protection programs include the ability to report on the location and types of sensitive data. There isn't a single best process or tool to protect any network; however, there are numerous technologies, including DLP, firewall, encryption, patch management, IDS, antivirus, and most importantly a data discovery mechanism. The most effective approach to good data hygiene and the protection of sensitive data is to know what it is you're protecting and to what extent to leverage those IT security processes. With the effective use of a data discovery tool you can complement and extend your data protection stance, with the core of all these processes being a data discovery toolset such as Spirion Sensitive Data Management. The following demonstrates how a sensitive data discovery process can complement your existing data security posture.

DLP – These technologies can protect sensitive data in motion; however, they are inundated with false positives and require considerable resources to manage. Complement and extend a DLP's capabilities by augmenting it with an at-rest data discovery tool like Spirion, so that the data can be more accurately discovered, and leverage the persistent meta tags embedded in that sensitive data to more accurately accomplish the data-in-motion protection process.

Firewall – This is the most common type of technology used. However, knowing exactly which file server(s) contain the most sensitive (PCI, CCPA, GDPR, PII, etc.) data allows the most effective use of FTE resources to more granularly define those firewalls and comply with regulators' compensating control requirements. It's imperative to know what you're protecting to be successful at blocking access from users or applications that should not have it.

Encryption – Identifying what really needs to be protected through encryption will reduce the complications of managing various data repositories.

Patch Management – Which storage devices do we need to be more aggressive with patching when a zero day occurs?

Intrusion Detection (IDS) – What sensitive data was on the device when the intrusion occurred?

Antivirus – What sensitive data was on the device when the Trojan infected it six weeks ago?

Sensitive data management is not solely an internal IT security or policy driven process. Effective data management involves people, process, and technology. Good governance of this data is achieved when you can report on where and what types of sensitive data are being stored. Being able to do so will help with avoiding those costly mistakes and damage to a company's reputation.

Compliance and Regulation

This is a people, process, and technology driven challenge, and the value of sensitive data discovery extends beyond the IT management of data. Data discovery tools are most useful in the compliance and regulation landscape. Should the situation arise, a data discovery tool will reveal what sensitive data was on a compromised, lost, or stolen device months ago, which allows an organization to react more favorably in a legal, compliance, or IT security process. Should a PCI audit occur, you can work with a QSA to scope those devices under the PCI umbrella. If your organization must comply with GDPR, CCPA, or any number of US state laws and you receive a DSAR request, you can act on it. The value gained by having a process at the center of all sensitive data discovery is considerable.

Data has grown exponentially in size and complexity, and having an automated data discovery and reporting process is essential to be successful. New privacy regulations are being implemented around the globe at an increasing pace, and while not all prescribe identical requirements, all share a common one: know where and what sensitive data is being stored. Plan ahead and have a data discovery tool in place!

CCPA takes effect January 1st of 2020; however, you still need to know what and where that data was one year previous. GDPR has been in effect since May 25th, 2018, and we have already seen huge fines for noncompliance. The NY SHIELD Act is very similar in scope to CCPA, along with numerous other state data protection laws that have emerged in just the first 6 months of 2019 (see the US State Data Protection Laws data sheet).

With so much more to consider in the data management governance, regulation and compliance landscape it can be overwhelming to decide where to start. More to come on that!

I welcome your feedback and look forward to your different perspectives.

Dec 30
Spirion Single Sign-On

We all use SSO in many ways, such as when a user logs into their desktop, then opens Microsoft Outlook, Microsoft Lync, and other applications without providing additional credentials; this is an SSO process. SSO is also used for web applications: Google, LinkedIn, Twitter, and Facebook all offer popular SSO services that allow you to log into one application and then into others using their social media authentication credentials.

Using SSO for an enterprise application such as Spirion provides great value to organizations.

  • A user only has to remember one password at all times.
  • Credentials for Spirion and other systems only need to be entered occasionally, so significantly less effort is needed.
  • The back-end SSO provider can capture logging such as user activity as well as monitor user accounts, a desirable outcome for applications such as Spirion using SSO.
  • Reduces risk by minimizing bad password habits.
  • The combination of a user ID and password is no longer a strong enough protection strategy for access to an organization's most vulnerable information; SSO provides an additional layer to strengthen this process.
  • Many modern organizations such as government (DOD, NASA, etc.) and enterprises require SSO to protect web application access.
  • Extra security can be added to the initial single sign-on, for example requiring biometric authentication or access via an RSA token or similar encryption device, independent of Spirion, and our product integrates into these processes.


Because Spirion supports SAML Single Sign-On, many organizations can use Microsoft MFA push notifications to devices for sign-in. This industry standard protocol empowers customers to use their own Azure identity management system for authenticating users of the CenturyLink Control Portal.

Now, with the addition of the Require SAML for Login option provided by Control Portal, customers can force users to authenticate through their identity providers to enable additional identity management features like multi-factor authentication (MFA) and user provisioning. This way, the CenturyLink Cloud platform can provide flexible, standards-based capabilities while allowing an organization to keep the nuts-and-bolts of their IdM configurations in their pre-existing systems.

For more details on how SAML works in general and how to specifically set up an ADFS IdP for use with Control Portal, refer to Using SAML for Single-Sign-On.

Dec 30
What is a Data Subject Access Request (DSAR)

A Data Subject Access Request (DSAR) is a process by which a subject (individual) requests, through an electronic process (e-mail, phone call, or web contact form) or a physical visit, a copy of their personal data.

Most organizations collect information about individuals through their normal business practice. For example, financial institutions or airlines will collect the name, DOB, age, address, and various other required information about an individual, all of which are legitimate business practices. However, over the life of those accounts, additional information may be collected during customer service interactions or other touch points, adding to what has already been collected. In some scenarios this information is also shared with third-party businesses that may service those accounts, or through solicitation practices such as airline discounts and offers for new credit cards. Organizations will predictably lose track of what has been collected due to the complex processes and distributed data collection practices.

With new regulations such as GDPR, CCPA, and various other US state laws which protect individuals' rights, organizations are now required to be able to provide the data that has been collected to individuals requesting it through a data subject access request.

DSAR Process

The DSAR process starts when a request is received from an individual, which is then forwarded to the data protection officer or whoever is appropriate for the organization that collects that data. The office handling these requests will respond by asking for proof of the customer's identity and begin the process of collecting pertinent data to fulfill the request.

In addition to providing proof of identity, the customer will also need to provide a valid reason why they are asking for this information. Common reasons include debt consolidation, financing, audits, credit disputes, identity theft, etc., and companies must be able to find that data or be subject to large regulatory fines.

Once this process has completed, the DSAR must be fulfilled "without undue delay", typically within 30 days of receipt. A DSAR can be extended in rare cases, when the scope of data to be searched exceeds the physical limits of the technologies to meet these deadlines; however, this process must already be in place. The overall DSAR process is typically handled by the organization's legal and data protection authorities.

To facilitate the process most effectively, Spirion is used to search across various data repositories and identify what data is personally identifiable and to whom it belongs. Spirion can be leveraged to search for these types of custom data, which frequently include the name of the individual, address, DOB, phone number, or data more specific to the organization collecting it, such as a client ID.
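
To make the idea of searching for custom data concrete, here is a toy Python sketch of pattern-based discovery (purely illustrative; the patterns, name, and path are made up, and Spirion's engine does this at scale with validated data types):

import re
from pathlib import Path

PATTERNS = {
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "dob": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
    "name": re.compile(r"\bJane Doe\b"),  # the data subject's name
}

def scan_folder(folder: str) -> None:
    for path in Path(folder).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for label, pattern in PATTERNS.items():
            for match in pattern.finditer(text):
                print(f"{path}: {label} -> {match.group()}")

scan_folder(r"C:\Data")  # placeholder repository path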

Deciding where to search, and the relative size of the search, has to be reasonable in order to fulfill the request within the timeframe given. It requires that organizations have the ability to scan at scale; all searches take time, so plan ahead and have a process in place. There are numerous factors which will impact the DSAR scan, such as the data composition (images, text, etc.) and the performance of the systems being scanned. Because every company's data composition is different, it's important to be in front of the process and have a solution like Spirion in place.

This includes tools that integrate with Spirion, such as OneTrust, a partner tool used to further expand on what Spirion delivers. Spirion provides expansive abilities to accurately discover sensitive data, classify the data, and provide reports on the locations and types of sensitive data. A tool such as OneTrust then expands upon those capabilities by providing further governance, regulatory, and compliance processes to meet the stringent requirements of the regulators. Spirion provides data with the appropriate bulk report formatting to digest and extend these capabilities and satisfy DSAR requests and data mapping.

Feb 14
ISACA North East Presentation

Cory Retherford (www.coryretherford.com)

Solutions Engineer, Spirion

Specializing in security architecture and data management.
Twenty years as an IT professional with a focus on data security and operational data security risk reduction.
Real-world solutions implementation experience in large and complex environments.

Abstract

This presentation will discuss the critical steps and fundamentals in protecting sensitive data against data leaks.  Narrowing the project scope and creating data awareness is critical to a security program's success.  It will cover an approach to implementing a data steward project and technical automation that helps drive information worker security awareness, concentrating resources on protecting critical systems containing personally identifiable information (PII).

Download the Presentation

TBD_ISACA_Presentation.pdf

Feb 10
Windows Admin Center Required Ports

When accessing a server through Windows Admin Center you may receive a Connection Error.

To resolve it, open inbound port TCP 5985 on the target server.
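
One way to script this on the target server is to add the firewall rule via netsh; a minimal sketch wrapped in Python (run from an elevated prompt; the rule name is arbitrary, and you can equally run the netsh command directly):

import subprocess

# Adds an inbound allow rule for WinRM over HTTP (TCP 5985).
subprocess.run(
    ["netsh", "advfirewall", "firewall", "add", "rule",
     "name=Windows Admin Center (WinRM 5985)",
     "dir=in", "action=allow", "protocol=TCP", "localport=5985"],
    check=True,
)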

