Sep 29
Data discovery management to meet governance, regulation and compliance.

Sensitive data has considerable economic value, and businesses must manage it appropriately. Protecting that data is of utmost concern because organizations risk substantial fines from regulators and damage to their brand reputation (Yahoo, Marriott, Equifax, etc.). As a result, organizations are driven to protect their networks primarily because of the sensitive data they store.

At the core of every successful data security management program is an automated data discovery process that can accurately locate sensitive data so it can be effectively managed and protected. Most of that sensitive data is stored on workstations, file servers, databases, and cloud repositories, and the breadth of where data can be stored has grown significantly in just a few years. The large number of individuals involved in handling sensitive data makes appropriate data management very challenging. You simply cannot protect these assets unless you know where the sensitive data resides.

In my 20+ years of operational data management and security, the most successful data management and governance programs have combined a unified set of information security tools with a governance and compliance process. Many components contribute to that success, but the most effective generally include strong user awareness of the sensitive data in use, classification of data by sensitivity, a data governance process that includes policy, best practices, and standards, and IT systems that augment data security protection strategies. With these key processes in place, an organization can meet internal IT policy, governance, regulation, and compliance needs.

IT Data Management

The most effective data protection programs can report on the location and types of sensitive data. There isn't a single best process or tool to protect a network; there are numerous technologies, including DLP, firewalls, encryption, patch management, IDS, antivirus, and, most importantly, a data discovery mechanism. The most effective approach to good data hygiene and the protection of sensitive data is to know what you're protecting and to what extent to leverage those IT security processes. Effective use of a data discovery toolset, such as Spirion Sensitive Data Management, complements and extends your data protection stance and sits at the core of all of these processes. The following demonstrates how a sensitive data discovery process can complement your existing data security posture.
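As a simple illustration of what such a tool automates, the sketch below walks a directory tree and flags files matching sensitive-data patterns. The patterns and the regex-only approach are illustrative assumptions on my part, not how Spirion actually performs discovery:

```python
import os
import re

# Illustrative patterns only; real discovery tools use far more
# sophisticated detection (validation, context, checksums, OCR, etc.).
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_file(path):
    """Return the names of the patterns found in a single text file."""
    try:
        with open(path, "r", encoding="utf-8", errors="ignore") as f:
            text = f.read()
    except OSError:
        return set()
    return {name for name, rx in PATTERNS.items() if rx.search(text)}

def discover(root):
    """Walk a directory tree and report files containing sensitive patterns."""
    findings = {}
    for dirpath, _dirs, files in os.walk(root):
        for fname in files:
            path = os.path.join(dirpath, fname)
            hits = scan_file(path)
            if hits:
                findings[path] = sorted(hits)
    return findings
```

Even a toy like this makes the reporting point concrete: once you can enumerate where the matches live, every downstream control below has a target list to work from.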

DLP – These technologies can protect sensitive data in motion, but they are inundated with false positives and require considerable resources to manage. Complement and extend a DLP's capabilities by augmenting it with an at-rest data discovery tool like Spirion so that data can be discovered more accurately, and leverage the persistent metadata tags embedded in that sensitive data to make the data-in-motion protection process more accurate.

Firewall – This is the most common type of technology used. However, knowing exactly which file servers contain the most sensitive data (PCI, CCPA, GDPR, PII, etc.) allows the most effective use of FTE resources to define those firewall rules more granularly and comply with regulators' compensating control requirements. It's imperative to know what you're protecting in order to successfully block access from the users or applications that should not have it.

Encryption – Identifying what really needs to be protected through encryption reduces the complications of managing various data repositories.

Patch Management – Which storage devices do we need to be more aggressive with patching when a zero day occurs?

Intrusion Detection (IDS) – What sensitive data was on the device when the intrusion occurred?

Antivirus – What sensitive data was on the device when malware infected it with a Trojan six weeks ago?

Sensitive data management is not solely an internal IT security or policy-driven process. Effective data management involves people, process, and technology. Good governance of this data is achieved when you can report on where and what types of sensitive data are being stored. Being able to do so helps avoid costly mistakes and damage to a company's reputation.

Compliance and Regulation

This is a people, process, and technology challenge, and the value of sensitive data discovery extends beyond the IT management of data. Data discovery tools are often most useful in the compliance and regulation landscape. For example, should the situation arise, a data discovery tool can reveal what sensitive data was on a compromised, lost, or stolen device months ago, allowing an organization to respond more effectively through its legal, compliance, and IT security processes. Should a PCI audit occur, you can work with a QSA to scope the devices under the PCI umbrella. If your organization must comply with GDPR, CCPA, or any number of US state laws and you receive a DSAR, you can act on it. Having a process at the center of all sensitive data discovery delivers value across this entire scope.

Data has grown exponentially in size and complexity, and an automated data discovery and reporting process is essential for success. New privacy regulations are being implemented around the globe at an increasing pace, and while they do not all prescribe identical requirements, they share one: know where and what sensitive data is being stored. Plan ahead and have a data discovery tool in place!

CCPA takes effect January 1, 2020, but you still need to know what data you held, and where, during the previous year. GDPR has been in effect since May 25, 2018, and has already produced huge fines for noncompliance. The NY SHIELD Act is very similar in scope to CCPA, along with numerous other state data protection laws that emerged in just the first six months of 2019 (see the US State Data Protection Laws data sheet).

With so much to consider in the data management, governance, regulation, and compliance landscape, it can be overwhelming to decide where to start. More to come on that!

I welcome your feedback and look forward to your different perspectives.

Sep 22
What is a Data Subject Access Request (DSAR)

The Data Subject Access Request

A Data Subject Access Request (DSAR) is a process in which a data subject (an individual) requests a copy of their personal data, either electronically (e-mail, phone call, or web contact form) or through a physical visit.

Most organizations collect information about individuals through their normal business practices. For example, financial institutions or airlines collect the name, date of birth, age, address, and various other required details about an individual, all of which are legitimate business practices. Over the life of those accounts, however, additional information may be collected during customer service interactions or other touch points, adding to what has already been gathered. In some scenarios this information is also shared with third-party businesses that service those accounts or run solicitation practices such as airline discounts and offers for new credit cards. Organizations inevitably lose track of what has been collected due to these complex and distributed data collection practices.

With new regulations such as GDPR, CCPA, and various other US state laws that protect individuals' rights, organizations are now required to disclose, through a data subject access request, what data they have collected about an individual who requests it.

DSAR Process

The DSAR process starts when a request is received from an individual and forwarded to the data protection officer, or whoever is appropriate for the organization that collects the data. The office handling these requests will ask for proof of the customer's identity and begin collecting the pertinent data to fulfill the request.

In addition to providing proof of identity, the customer may also need to supply a valid reason for requesting the information. Common reasons include debt consolidation, financing, audits, credit disputes, identity theft, etc., and companies must be able to find that data or face large regulatory fines.

Once this process is complete, the DSAR must be fulfilled "without undue delay", typically within 30 days of receipt. The deadline can be extended in rare cases when the scope of data to be searched exceeds what the technology can physically process in time, but that process must already be in place. The overall DSAR process is typically handled by the organization's legal and data protection authorities.

To facilitate the process most effectively, Spirion is used to search across various data repositories and identify what data is personally identifiable and to whom it belongs. Spirion can be leveraged to search for custom data types, which frequently include the individual's name, address, date of birth, and phone number, or data more specific to the collecting organization, such as a client ID.

Deciding where to search, and keeping the relative size of the search reasonable, is necessary to fulfill the request within the given timeframe. Organizations need the ability to scan at scale; all searches take time, so plan ahead and have a process in place. Numerous factors impact a DSAR scan, such as the data's composition (images, text, etc.) and the performance of the systems being scanned. Because every company's data composition is different, it's important to get in front of the process and have a solution like Spirion in place.

This includes tools that integrate with Spirion, such as OneTrust, a partner tool that builds on what Spirion delivers. Spirion provides expansive abilities to accurately discover sensitive data, classify it, and report on the locations and types of sensitive data. A tool such as OneTrust then expands on those capabilities with further governance, regulatory, and compliance processes to meet regulators' stringent requirements. Spirion supplies data in appropriate bulk report formats that such tools can digest to extend its capabilities and satisfy DSAR requests and data mapping.

ADDITIONAL RESOURCES

http://coryretherford.com/Lists/Posts/Post.aspx?ID=380

Jul 16
SAML (SSO) using Spirion

Spirion now provides Single Sign-On (SSO). An SSO process permits a user to perform a single (master) sign-on to authenticate to all of an organization's included applications and gain access to those systems. When a user browses to the Spirion web console, the user is redirected to the organization's portal to sign in with a single username and password.

The company's SSO website verifies the user's identity with an identity provider, such as Microsoft's Active Directory Federation Services or Azure Active Directory, which then redirects them to the Spirion web console automatically. With SSO, a user logs in once and gains access to all systems without being prompted to log in again at each of them. Spirion can now be part of that security process.

We all use SSO in many ways. When a user logs into their desktop and then opens Microsoft Outlook, Microsoft Lync, and other applications without providing additional credentials, that is an SSO process. SSO is also used for web applications: Google, LinkedIn, Twitter, and Facebook all offer popular SSO services that let you log into one application and then others using the same social media authentication credentials.

Using SSO for an enterprise application such as Spirion provides great value to organizations:

  • A user only has to remember one password.
  • Credentials for Spirion and other systems only need to be entered occasionally, so significantly less effort is needed.
  • The back-end SSO provider can capture logging, such as user activity, and can monitor user accounts, a desirable outcome of applications such as Spirion using SSO.
  • It reduces risk by minimizing bad password habits.
  • The combination of a user ID and password is no longer strong enough protection for an organization's most vulnerable information; SSO provides an additional layer to strengthen this process.
  • Many modern organizations, including government agencies (DOD, NASA, etc.) and enterprises, require SSO to protect access to web applications.
  • Extra security can be added to the initial single sign-on, for example biometric authentication or access via an RSA token or similar device. This is independent of Spirion, but our product integrates into these processes.
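For the curious, the SP-initiated redirect described above can be sketched in a few lines. This is a minimal illustration of the SAML HTTP-Redirect binding encoding (raw DEFLATE, then base64, then URL-encode); the endpoint URLs are hypothetical, and a real deployment adds request signing and a full AuthnRequest document:

```python
import base64
import urllib.parse
import zlib

# Hypothetical endpoints for illustration only.
IDP_SSO_URL = "https://idp.example.com/sso"

def build_redirect_url(authn_request_xml: str) -> str:
    """Encode a SAML AuthnRequest for the HTTP-Redirect binding:
    raw DEFLATE, then base64, then URL-encode as the SAMLRequest parameter."""
    # Strip the 2-byte zlib header and 4-byte checksum to get raw DEFLATE.
    raw = zlib.compress(authn_request_xml.encode("utf-8"))[2:-4]
    b64 = base64.b64encode(raw).decode("ascii")
    return IDP_SSO_URL + "?SAMLRequest=" + urllib.parse.quote(b64)

def decode_saml_request(url: str) -> str:
    """Reverse the encoding (what the identity provider does on receipt)."""
    b64 = urllib.parse.unquote(url.split("SAMLRequest=", 1)[1])
    raw = base64.b64decode(b64)
    return zlib.decompress(raw, -15).decode("utf-8")
```

The takeaway is that the browser only ever carries an opaque, compressed request between the service provider and the identity provider; credentials are entered at the portal alone.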
May 29
Presidio Carb Day Event Presentation

Cory Retherford (www.coryretherford.com)

Senior Solutions Engineer, Spirion

Specializing in security architecture and data management.
Twenty years as an IT professional with focus in data security and operational data security risk reduction.
Real world solutions implementation experience in large and complex environments.

Abstract

Discussed the critical steps in protecting sensitive data and the importance of data classification.  Discussed implementing technical automation to help drive information policy, compliance, and regulation, and concentrating resources on protecting critical systems holding personally identifiable sensitive information.

Download the Presentation

Presidio Carb Day Presentation.pdf


Feb 14
ISACA North East Presentation

Cory Retherford (www.coryretherford.com)

Solutions Engineer, Spirion

Specializing in security architecture and data management.
Twenty years as an IT professional with focus in data security and operational data security risk reduction.
Real world solutions implementation experience in large and complex environments.

Abstract

Will discuss the critical steps and fundamentals of protecting sensitive data against data leaks.  Narrowing project scope and creating data awareness is critical to a security program's success.  Will discuss an approach to implementing a data steward project and technical automation to help drive information worker security awareness, concentrating resources on protecting critical systems with personally identifiable information (PII).

Download the Presentation

TBD_ISACA_Presentation.pdf

Feb 10
Windows Admin Center Required Ports

When accessing a server through Windows Admin Center, you may receive a Connection Error.

To resolve it, open inbound port TCP 5985 (WinRM over HTTP) on the target server.
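Before and after changing the firewall rule, it can help to confirm whether the port is actually reachable from the machine running Windows Admin Center. A minimal check (the hostname is a placeholder):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical server name):
# port_open("server01", 5985)
```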

 

Feb 08
Update Windows Admin Center Certificate


Windows Admin Center provides a self-signed certificate that is valid for 60 days; after that, the browser accessing the console will generate access errors when authenticating. To avoid this, you can create a new, year-long certificate and also install it on the desktops accessing the console.

  1. Install IIS on the Windows Admin Center server; this will be used to generate a self-signed certificate.
  2. Open IIS, select the server name, and double-click "Server Certificates".
  3. This opens the Actions pane for creating the "Self-Signed Certificate".
  4. Name the certificate "Windows Admin Center" and select "Personal". Click OK and the certificate is created in the certificate store on the Windows Admin Center server.
  5. In the IIS Server Certificates view you will see the newly created TLS certificate.
  6. Right-click the certificate name and choose "Export" from the menu options.
    1. Export it to your desktop and provide a passphrase. Don't lose this passphrase; you will need it to install the certificate on the machines accessing the Admin web portal.
  7. Double-click the same certificate in the Server Certificates view.
    1. This opens the certificate properties; choose "Details".
      1. Scroll down to the field named "Thumbprint".
        1. Copy this value; it will be used to update the thumbprint used by Windows Admin Center.
  8. Open "Apps & features" on the server.
  9. Select "Windows Admin Center" and choose Modify > Next > Change.
    1. Change the thumbprint to the value copied previously.
    2. Click "Change".
      1. This updates Windows Admin Center to use the newly created certificate.

 

Windows Admin Center Certificate Installation on the Desktop

This process enables you to browse to Windows Admin Center over a valid TLS "HTTPS" connection. This will reduce the number of password prompts and secure your connection.

 

  1. On the desktop, right-click the certificate exported from IIS in step 6 and select "Install PFX".
    1. Choose Local Machine > Next > Next > supply the password you created in step 6 > choose "Place all certificates in the following store" > click Browse and select "Trusted Root Certification Authorities" > OK > Next > Finish.
  2. When you browse to Windows Admin Center, you will now have a valid TLS "HTTPS" connection.
Jan 02
Windows Server HTTP/2

In Windows Server 2019, a new set of features is available within IIS; I mention some of them here:

  • Improved coalescing of connections to deliver an uninterrupted and properly encrypted browsing experience.
  • Upgraded HTTP/2's server-side cipher suite negotiation for automatic mitigation of connection failures and ease of deployment.
  • The default TCP congestion provider has been changed to Cubic to give you more throughput!

HTTP/2

HTTP/2 provides faster and safer web browsing as a result of new IIS hosting features. Windows Server 2016 originally added support for HTTP/2 in the native HTTP server, and Windows Server 2019 delivers further performance and security benefits to your web site deployments with HTTP/2. HTTP/2 is a rework of how HTTP semantics flow over TCP connections. It is a major upgrade after nearly two decades of HTTP/1.1 use, and it reduces the impact of latency and connection load on web servers. The major advance of HTTP/1.1 was the use of persistent connections to service multiple requests in a row; in HTTP/2, a persistent connection can service multiple simultaneous requests. In the process, HTTP/2 introduces several additional features that improve the efficiency of HTTP over the network.

With HTTP/1.1, each concurrent request required a dedicated TCP connection, potentially imposing several round trips to establish each connection. HTTP/2 improves this by allowing a single TCP connection to be shared across many requests to the same web site, a technique called multiplexing.

From within IIS there is an option to disable HTTP/2. Do not do it!

HTTP exchanges typically carry many HTTP headers, which often represent far more data than the actual payload. HTTP/2 uses HPACK, a header compression scheme built explicitly for HTTP header compression. This drastically reduces the amount of data exchanged between client and server, which can also save round-trip times.
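HPACK itself uses static and dynamic header tables rather than a general-purpose compressor, but the redundancy it exploits is easy to illustrate. The sketch below uses zlib purely as a stand-in to show how compressible a stream of repeated request headers is:

```python
import zlib

# Typical request headers, repeated verbatim on every HTTP/1.1 request.
HEADERS = (
    "Host: www.example.com\r\n"
    "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64)\r\n"
    "Accept: text/html,application/xhtml+xml\r\n"
    "Accept-Encoding: gzip, deflate, br\r\n"
    "Cookie: session=abc123; prefs=dark\r\n"
)

def raw_bytes(num_requests: int) -> int:
    """Bytes spent on headers when each request repeats them verbatim."""
    return len(HEADERS.encode()) * num_requests

def compressed_bytes(num_requests: int) -> int:
    """Bytes after compressing the whole header stream with shared state
    (zlib here; HPACK works differently but exploits the same redundancy)."""
    return len(zlib.compress((HEADERS * num_requests).encode(), 9))
```

Across, say, 50 requests the compressed stream is a small fraction of the raw one, which is why header compression matters so much on chatty pages.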

Dec 23
NHS (UK) Number Identity

Everyone registered with the National Health Service (NHS) in England, Wales, and the Isle of Man has a unique patient identifier called an NHS Number. The modern style of NHS Number was generally introduced in 1996, has been allocated to every newborn since July 1995, and became mandatory on 1 April 1997.

The NHS Number helps healthcare staff and service providers identify you correctly and match your details to your health records. The number will appear on most official documents.

Each NHS Number consists of 10 digits shown in a 3-3-4 format. NHS numbers in the older formats are no longer valid.

The Validation

The NHS Number "943 476 5919" is used for the following validation example.

  • The first digit is 9. This is multiplied by 10.
  • The second digit is 4. This is multiplied by 9.
  • And so on until the ninth digit (1) is multiplied by 2.
  • The result of this calculation is summed. In this example:
    • 9*10+4*9+3*8+4*7+7*6+6*5+5*4+9*3+1*2 = 299.
  • The remainder when dividing this number by 11 is calculated, yielding a number in the range 0–10, which would be 2 in this case.
  • Finally, this number is subtracted from 11 to give the checksum in the range 1–11, in this case 9, which becomes the last digit of the NHS number.
  • A checksum of 11 is represented by 0 in the final NHS number. If the checksum is 10 then the number is not valid.
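The steps above translate directly into code. A minimal validator for this mod-11 scheme:

```python
def nhs_number_valid(nhs: str) -> bool:
    """Validate a 10-digit NHS Number using the mod-11 check described above."""
    digits = [int(c) for c in nhs if c.isdigit()]
    if len(digits) != 10:
        return False
    # Multiply the first nine digits by weights 10 down to 2 and sum.
    total = sum(d * w for d, w in zip(digits[:9], range(10, 1, -1)))
    check = 11 - (total % 11)
    if check == 11:        # a checksum of 11 is represented by 0
        check = 0
    if check == 10:        # a checksum of 10 means the number is invalid
        return False
    return check == digits[9]

# nhs_number_valid("943 476 5919")  ->  True
```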
Dec 23
Passport Numbers Identities

In this blog, I explain the numerous ways to identify sensitive data. The main point of this post is to articulate how complex identifying sensitive data is when complying with regulation, compliance, data governance, and data hygiene practices. In scenarios such as these, the advantage of using automated tools such as Spirion.com to augment manual approaches is obvious.

Passports and passport cards build numerous technologies into the process of validating a subject such as myself, "Cory Retherford". Passports use numerous codes, which I will discuss in the following paragraphs, as well as watermarks, steganography, RFID technologies similar to those certificate authorities use when validating website TLS certificates ("HTTPS"), and other approaches I'll address.

This information is not at all intended to help you create fake identities; it is intended to explain how identities are secured and to inform you as a cyber security expert ("white hat"). For everyone else, use TOR, where the DOJ can track your bad habits.

Contextual Validation

The first two numbers indicate which passport office issued your passport or where you applied for the passport.

Pre-Fix        Passport Office
40             New Orleans
1              Washington
15, 20, 21     New Hampshire
60             Military
90             Diplomatic
Z or 70        Temporary

 

The format of the first row

Positions   Length   Characters    Meaning
1           1        alpha         P indicates a passport, C indicates a passport card
2           1        alpha+<       Type (for countries that distinguish between different types of passports)
3–5         3        alpha+<       Issuing country or organization
6–44        39       alpha+<       Surname, followed by two filler characters (<<), followed by given names

 

In the name field, spaces, hyphens, and other punctuation are represented by <, except apostrophes, which are skipped. If the names are too long, they are abbreviated to their most significant parts. In that case, the last position must contain an alphabetic character to indicate possible truncation, and if there is a given name, the two fillers and at least one character of it must be included.
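A minimal sketch of these name-field rules (real implementations follow ICAO Doc 9303's abbreviation rules for overly long names, which are simplified to plain truncation here):

```python
def mrz_name_field(surname: str, given_names: str, length: int = 39) -> str:
    """Encode a surname and given names into an MRZ name field:
    spaces, hyphens, and other punctuation become '<', apostrophes are
    dropped, surname and given names are separated by '<<', and the
    field is padded with '<' to the fixed length."""
    def clean(part: str) -> str:
        out = []
        for ch in part.upper():
            if ch.isalpha():
                out.append(ch)
            elif ch == "'":
                continue           # apostrophes are skipped
            else:
                out.append("<")    # spaces, hyphens, other punctuation
        return "".join(out)
    field = clean(surname) + "<<" + clean(given_names)
    return field[:length].ljust(length, "<")

# mrz_name_field("Retherford", "Cory") begins "RETHERFORD<<CORY<..."
```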

The format of the second row is:

Positions   Length   Characters     Meaning
1–9         9        alpha+num+<    Passport number
10          1        numeric        Check digit over digits 1–9
11–13       3        alpha+<        Nationality (ISO 3166-1 alpha-3 code with modifications)
14–19      6        numeric        Date of birth (YYMMDD)
20          1        numeric        Check digit over digits 14–19
21          1        alpha+<        Sex (M, F, or < for male, female, or unspecified)
22–27       6        numeric        Expiration date of passport (YYMMDD)
28          1        numeric        Check digit over digits 22–27
29–42       14       alpha+num+<    Personal number (may be used by the issuing country as it desires)
43          1        numeric+<      Check digit over digits 29–42 (may be < if all characters are <)
44          1        numeric        Check digit over digits 1–10, 14–20, and 22–43

 
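The check digits in the table above are computed with ICAO's 7-3-1 weighting scheme: digits keep their value, A–Z map to 10–35, < counts as 0, each value is multiplied by the repeating weights 7, 3, 1, and the sum is taken modulo 10. A minimal sketch:

```python
def mrz_check_digit(field: str) -> int:
    """Compute an MRZ check digit using the ICAO 7-3-1 weighting.
    Digits keep their value, A-Z map to 10-35, and '<' counts as 0."""
    weights = (7, 3, 1)
    total = 0
    for i, ch in enumerate(field):
        if ch.isdigit():
            value = int(ch)
        elif ch.isalpha():
            value = ord(ch.upper()) - ord("A") + 10
        else:                      # filler '<'
            value = 0
        total += value * weights[i % 3]
    return total % 10

# ICAO Doc 9303's worked example: document number "L898902C3" -> check digit 6.
```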

U.S. Passport numbers must be between six and nine alphanumeric characters (letters and numbers).

The "C" that precedes a U.S. Passport Card number is no longer case sensitive.
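A loose format check for the rules above can be expressed as a regular expression. This is an illustrative pattern only; matching it proves nothing about whether a number was actually issued:

```python
import re

# Six to nine alphanumeric characters, per the rule above; a leading
# "C" (either case) on a Passport Card number fits within this pattern.
US_PASSPORT_RE = re.compile(r"^[A-Za-z0-9]{6,9}$")

def looks_like_us_passport(value: str) -> bool:
    """Loose format check only; not a validity check."""
    return bool(US_PASSPORT_RE.fullmatch(value))
```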

RFID Verification Process

If you have ever been through airport security or customs, TSA first visually inspects and/or scans the MRZ of the passport. This printed information contains the basic access control keys needed to "unlock" the embedded chip.

  • The scanning device then sends this info to the chip via RFID.
  • The chip responds with all pertinent verification data, which includes a cryptographic signature.
  • The verification process checks the signature against public keys belonging to the US State Department, maintained by ICAO.
  • This process also includes checking the revocation list, also maintained by ICAO.
  • The passport is then verified much as a secure website (HTTPS) is verified by a CA using a TLS certificate.

Other nations vary the format; in India and the Maldives, for example, the first character is alphabetic and the remaining seven are numeric.

Many organizations can verify Passports using services such as - https://protect.hooyu.com/document/verify/passport

Copyright © CoryRetherford, LLC | Network Storage and Security Solutions, LLC. All Rights Reserved.®
TLS 1.2, AES with 256 bit encryption
