Data Pipeline Tools: Mapping a Path Through Digital Convergence

Copado DevSecOps - Blog Series

Originally published by New Context.

Data is a top-of-mind issue for many organizations today because of the need to balance access with security. Organizations collect data for a reason: to improve their business. A breach, however, could do the exact opposite. By embracing automation and governance as data pipeline tools, organizations can create an end-to-end solution that complies with the most stringent standards while still providing access as needed.

Balancing Access and Protection Under the GDPR

Many organizations are reconsidering their enterprise data protection policies because of the European Union’s General Data Protection Regulation (GDPR). This regulation, which took effect in 2018, dictates requirements for collecting, storing, and transmitting personal data in the European Union and the European Economic Area. Because so much of modern business is global, most major companies will need to contend with it in some fashion.

One area where it expands on prior data regulation is the “why” of collection. Most rules simply require the organization to disclose the reason behind the collection. The GDPR goes a step further by requiring the reason to fit one of six categories, known as the six lawful bases.

 

CONSENT: Individuals give explicit, unambiguous approval for the collection of specific data. Consent can be withdrawn at any time, for any reason.

CONTRACT: The data collected is necessary to complete a contract. On expiration or completion of the agreement, the data is purged.

PUBLIC TASK: The data supports a task completed in an official capacity while acting in the public interest.

VITAL INTEREST: This narrow provision applies only to emergencies, such as hospital care when someone cannot consent.

LEGITIMATE INTEREST: This flexible provision permits collection for the benefit of the subject. Demonstrable evidence of the benefit is required.

LEGAL REQUIREMENT: EU or UK law mandates the collection, and the organization’s overall purpose is to comply with that law.
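One way a pipeline can make the “why” of collection auditable is to require every dataset registration to declare its lawful basis before ingestion. Below is a minimal Python sketch of that idea; the DatasetRecord class and its field names are illustrative assumptions, not part of any specific tool.

```python
from dataclasses import dataclass
from enum import Enum


class LawfulBasis(Enum):
    """The six lawful bases for collection under the GDPR."""
    CONSENT = "consent"
    CONTRACT = "contract"
    PUBLIC_TASK = "public_task"
    VITAL_INTEREST = "vital_interest"
    LEGITIMATE_INTEREST = "legitimate_interest"
    LEGAL_REQUIREMENT = "legal_requirement"


@dataclass
class DatasetRecord:
    """Illustrative dataset registration; a real catalog would carry more fields."""
    name: str
    purpose: str
    lawful_basis: LawfulBasis  # registration fails unless a basis is declared


def register_dataset(name: str, purpose: str, basis: str) -> DatasetRecord:
    # A ValueError here surfaces datasets whose stated reason fits none of the six bases.
    return DatasetRecord(name=name, purpose=purpose, lawful_basis=LawfulBasis(basis))
```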

 

These classifications are essential because the first key to data management is ensuring appropriately scoped collection. Once lawful collection is established, the GDPR sets out standards for:

  • Control: The controller has appropriate measures in place to protect information and limit access to relevant parties. Data may require encryption, anonymization, or pseudonymization (a brief pseudonymization sketch follows this list).
     
  • Records: Records of processing activities are required for most organizations. They must list details on how data is accessed, the purpose of processing, transfers to third parties, and other pertinent actions. For many organizations, this provision also necessitates appointing a data protection officer.
     
  • Remedies: In the event of a violation of key GDPR provisions, agencies may receive written warnings or fines and penalties. Remedies may also include oversight periods with regular audits to ensure corrective action.
     
  • Access: Individuals can see the data and how it’s used. In some instances, they also have the right to request its disposal. Individuals can object if data is collected for marketing, sales, or other purposes not related to performing the agreed-to services.
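As one concrete example of the Control requirement, pseudonymization replaces a direct identifier with a token that cannot be reversed without a separately held secret. Here is a minimal sketch using Python’s standard library; the key handling is deliberately simplified and would live in a secrets manager in practice.

```python
import hashlib
import hmac

# In practice this key comes from a secrets manager, never from source code.
PSEUDONYMIZATION_KEY = b"replace-with-a-managed-secret"


def pseudonymize(identifier: str) -> str:
    """Derive a stable, keyed token for an identifier (e.g., an email address).

    The same input always yields the same token, so joins across datasets still
    work, but reversing the mapping requires the key held by the controller.
    """
    digest = hmac.new(PSEUDONYMIZATION_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()


print(pseudonymize("jane.doe@example.com"))  # stable token, no raw PII stored
```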

All organizations must familiarize themselves with the requirements of the GDPR and incorporate them into their data governance policies, as the regulation reaches beyond EU member states: its provisions apply to any organization that processes information for individuals within the region. Of course, not all data management decisions are driven by compliance. To determine the best way to approach data security, organizations should complete a thorough audit.

Completing a Data Audit to Establish Necessary Tools

Before an organization adopts data pipeline tools, it needs a complete understanding of what’s under its control. This process starts by asking six critical questions as part of a data security risk assessment.

 

WHAT IS THE DATA?

Some information doesn’t require maximum protection, and the stronger the safeguards, the higher the cost and the lower the productivity. Organizations should tag their data by sensitivity and focus attention on the most critical information: social security numbers and other personally identifiable information, credit card numbers, and similarly confidential details.
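A first tagging pass can be as simple as pattern-matching field values against known high-sensitivity formats. The sketch below assumes plain string records and two illustrative patterns (US social security and credit card numbers); a production classifier would cover far more formats and validate its matches.

```python
import re

# Illustrative detectors only; real classifiers validate matches (e.g., Luhn checks for cards).
SENSITIVITY_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}


def tag_sensitivity(value: str) -> list[str]:
    """Return the sensitivity tags that match a field value."""
    return [tag for tag, pattern in SENSITIVITY_PATTERNS.items() if pattern.search(value)]


print(tag_sensitivity("Card on file: 4111 1111 1111 1111"))  # ['credit_card']
```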

ARE THERE APPLICABLE REGULATIONS?

Most organizations will balance multiple regulations concerning data. Records should indicate whether the information is covered under specific rules like the GDPR, HIPAA, GLBA, or other industry- or region-specific regulations.

WHERE IS THE DATA?

The most sophisticated infrastructures track where data sits at every point in its lifecycle, thinking of it in terms of the four-stage value chain: collection, publication, uptake, and impact. This model becomes a tool to manage data production and use while also enhancing monitoring capabilities.
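One lightweight way to operationalize the value chain is to record which stage each dataset is in, so monitoring can differ by stage. The enum and register below are an illustrative assumption about how such tracking might look, not a reference to any specific product.

```python
from enum import Enum


class ValueChainStage(Enum):
    COLLECTION = 1
    PUBLICATION = 2
    UPTAKE = 3
    IMPACT = 4


# Hypothetical stage register: dataset name -> current stage in the value chain.
stage_register: dict[str, ValueChainStage] = {
    "customer_profiles": ValueChainStage.COLLECTION,
    "quarterly_report": ValueChainStage.PUBLICATION,
}


def advance(dataset: str) -> ValueChainStage:
    """Move a dataset to the next stage, stopping at IMPACT."""
    current = stage_register[dataset]
    nxt = ValueChainStage(min(current.value + 1, ValueChainStage.IMPACT.value))
    stage_register[dataset] = nxt
    return nxt
```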

HOW IS IT ACCESSED?

Individuals will have to access the data, so it’s vital to understand the paths they use to obtain it. Some areas are riskier than others and require more management. As an example, data in storage may be rarely accessed, while data exposed through an API may see constant use. The API will require additional measures.
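Because API paths see constant use, one of those additional measures could be an audit trail wrapped around every read. The decorator below is a minimal sketch of that idea; the logger name and accessor function are assumptions for illustration.

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("data_access_audit")


def audited(func):
    """Log who requested which record through an API data accessor."""
    @functools.wraps(func)
    def wrapper(user: str, record_id: str):
        audit_log.info("user=%s accessed record=%s via %s", user, record_id, func.__name__)
        return func(user, record_id)
    return wrapper


@audited
def fetch_customer(user: str, record_id: str) -> dict:
    # Hypothetical lookup; a real accessor would query a datastore.
    return {"id": record_id}


fetch_customer("analyst_7", "cust-001")
```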

WHO CAN ACCESS IT?

In our experience, 60% of cyber attacks come from inside the organization, typically because employees have access levels they shouldn’t. A zero-trust approach is best: individuals should always hold the least-privileged credentials needed to complete their tasks.
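In code, least privilege usually reduces to denying by default and granting narrowly. A minimal sketch, assuming a simple role-to-permission map (the roles and permissions here are purely illustrative):

```python
# Deny-by-default permission map; each role holds only what its tasks require.
ROLE_PERMISSIONS = {
    "support_agent": {"read:tickets"},
    "billing_clerk": {"read:invoices", "write:invoices"},
}


def is_allowed(role: str, permission: str) -> bool:
    """Zero-trust style check: anything not explicitly granted is denied."""
    return permission in ROLE_PERMISSIONS.get(role, set())


assert is_allowed("billing_clerk", "write:invoices")
assert not is_allowed("support_agent", "write:invoices")  # denied by default
```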

HOW IS IT TRANSMITTED?

Data is especially vulnerable when it’s transmitted from one point to another. Companies should use encryption, anonymization, and pseudonymization to protect it as it travels.
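For application-level protection of a payload before it leaves the service, symmetric encryption is a common building block. A minimal sketch using the third-party cryptography package follows; TLS on the transport itself is assumed and not shown, and the inline key generation stands in for a managed key shared with the receiver.

```python
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# In practice the key is provisioned from a secrets manager and shared with the receiver.
key = Fernet.generate_key()
cipher = Fernet(key)

payload = b'{"customer_id": "cust-001", "email": "jane.doe@example.com"}'
token = cipher.encrypt(payload)   # safe to transmit
restored = cipher.decrypt(token)  # a receiver holding the key recovers the payload

assert restored == payload
```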

 

A complete data audit will tell organizations where they need to focus efforts as it exposes their potential blind spots. It also prepares them to take advantage of two critical tools in the data pipeline: automation and governance.

Automation and Governance as Data Pipeline Tools

Automation and governance are necessary to balance access with security. Both are broad concepts that establish tools and frameworks for enhancing data protection at a fundamental level.

 

GOVERNANCE

Data governance sets the rules, responsibilities, and processes for securely managing data assets in an organization. Data governance software may:
  • Catalog data
  • Manage metadata
  • Standardize data sets
  • Set compliance rules
  • Establish responsibilities
  • Assign data owners
  • Verify data integrity

In short, data governance is the knowledge base for how the information within an organization is treated and processed.
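Much of that knowledge base can be captured as structured catalog entries. The dataclass below sketches one plausible shape; the field names are assumptions rather than any particular governance product’s schema.

```python
from dataclasses import dataclass, field


@dataclass
class CatalogEntry:
    """Illustrative data-catalog record tying governance metadata to a dataset."""
    name: str
    owner: str                      # accountable data owner
    sensitivity: str                # e.g. "public", "internal", "restricted"
    compliance_rules: list[str] = field(default_factory=list)  # e.g. ["GDPR"]
    schema_version: int = 1         # supports standardization and integrity checks


entry = CatalogEntry(
    name="customer_profiles",
    owner="alice@example.com",
    sensitivity="restricted",
    compliance_rules=["GDPR"],
)
```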

AUTOMATION

Automation takes the data governance policies and implements them without human intervention. It provides:
  • Data lineage analysis
  • Enforced compliance
  • End-to-end visibility
  • Data access and usage tracking
  • Data-level tagging
  • Impact analysis
  • Necessary code generation

Automation executes the requirements established under governance. It ensures consistent, well-monitored, and maintained security.
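Enforcement can then be a scheduled job that scans catalog entries against the governance rules and flags violations without human intervention. A sketch building on the hypothetical CatalogEntry above, with two illustrative rules:

```python
def find_violations(entries: list[CatalogEntry]) -> list[str]:
    """Flag catalog entries that break illustrative governance rules."""
    violations = []
    for e in entries:
        if e.sensitivity == "restricted" and "GDPR" not in e.compliance_rules:
            violations.append(f"{e.name}: restricted data lacks a GDPR rule binding")
        if not e.owner:
            violations.append(f"{e.name}: no data owner assigned")
    return violations


# Run on a schedule (e.g., nightly) so policy drift is caught automatically.
print(find_violations([entry]))  # [] for the compliant example above
```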

 

Automation and governance are essential data pipeline tools because they provide end-to-end, often turnkey solutions. They make it possible to manage a wide range of data with varying requirements while providing the access needed to keep productivity high. Any organization ready to embrace automation and governance should consider working with a highly experienced consultant to build a customized program.