Swedish Data Protection Authority Receives 3,500 Privacy Breach Reports Since Start Of GDPR*:

 

  • The Swedish Data Protection Authority said that since GDPR came into force on 25 May 2018, it has received 3,000 complaints and 3,500 reports of privacy breaches.
  • Most of the complaints concerned video surveillance and direct marketing.
  • It said six out of ten of the reported privacy breaches were caused by human error.
  • The authority said in its first national data integrity report that three out of four Swedes are worried about how their personal information is used.
  • It added that only half of Swedish businesses and authorities keep working systematically on data protection.
  • Eight out of ten Swedes are familiar with GDPR and know that it provides them with more rights.
  • One in six people has taken advantage of these rights, such as by asking a company to delete their information.
  • Three out of four private and public organizations consider that implementing GDPR has gone well.

*Source: TelecomPaper, May 20, 2019

 

6 Best Practice Principles For Client And Counterparty Data Management*:

 

  • The role of client and counterparty risk management has assumed greater strategic importance over the last few years in light of heightened regulatory scrutiny and the introduction of new rules around AML/KYC, ultimate beneficial ownership, global data privacy and investor protection.
  • With the growth of AI/RPA-enabled compliance processes, the quest for accurate and high-quality data has become the holy grail for corporate and investment banks.
  • And yet, according to Fenergo’s recent Industry Trends report, 74% of banks surveyed believe that data management is overlooked strategically despite it being among the top three most critical business concerns, with only 15% of respondents stating that they had fully automated the collection of client data.
  • The prevalence of siloed, fragmented solutions has created pools of disparate, unconnected data, resulting in issues around data quality, transparency and visibility of ultimate beneficial ownership structures.
  • One reason for this is disjointed, siloed, semi-structured and unstructured customer data.
  • While effective client and counterparty data management has become a vital aspect of regulatory compliance, the industry lacks the ability to execute policies, procedures and solutions for a well-designed client and data management system.
  • The importance of well-executed data management
  • Many financial institutions are busily trying to remedy the current situation by integrating internal data repositories and external data providers to improve the flow of clean, golden-source data throughout their organization.
  • According to Chartis research, the biggest area of risk tech spend for Tier 1 banks is on risk, governance and integration technology, with $25bn spent in this area alone in 2018.
  • Each organization will have different transformation needs and, although there are technologies that assist in these transformations, experience has proven that the data management process should be underpinned by robust, transparent best practice principles that are centered on empowering the business user in the overall process.
  • Data Profiling
  • Start with production data as soon as possible! 
  • A key component of the data management process is profiling existing data across a number of dimensions.
  • These include completeness and accuracy.
  • Data profiling is predicated on the availability of production data.
  • Basing the data management effort on a snapshot of production data leads to an early understanding of the content and semantics of the source data.
  • In addition, outlier data can be detected early.
  • The transformation process is iterative in nature; therefore, production or production-like data is important.
  • If necessary, the data can be masked to ensure security compliance; however, masking key data elements such as names may make certain processes, such as de-duplication, more difficult (a minimal profiling and masking sketch follows this principle's bullets).
  • Full data snapshots are initially not necessary and representative subsets may be used.
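A minimal illustration of the profiling and masking ideas above, assuming a production-like snapshot in a CSV file; the file name and the "country"/"client_name" columns are purely illustrative, not a prescribed schema:

```python
# Profile a production-like snapshot for completeness and outliers, then mask a key field.
import hashlib
import pandas as pd

snapshot = pd.read_csv("counterparty_snapshot.csv")  # a representative subset is fine

# Completeness: share of non-null values per column.
completeness = snapshot.notna().mean().sort_values()
print("Completeness by column:")
print(completeness)

# Simple accuracy/outlier check: values outside an expected reference set.
valid_countries = {"SE", "GB", "US", "DE"}  # illustrative reference data
outliers = snapshot[~snapshot["country"].isin(valid_countries)]
print(f"{len(outliers)} rows with unexpected country codes")

# Masking a key element. Deterministic hashing keeps exact-match
# de-duplication possible, but fuzzy name matching no longer works.
def mask(value: str) -> str:
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

snapshot["client_name"] = snapshot["client_name"].fillna("").map(mask)
```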
  • Divide the Data Management Effort
  • A data management exercise is a complex effort touching on numerous systems and business units within an organization.
  • To ease this complexity, it is often necessary to delineate the project into discrete functional areas or domain areas and approach the effort in a granular manner.
  • This approach simplifies the process and allows isolated testing to commence early in the data management process.
  • Robust Transaction Process and Logging
  • Strong transaction processing ensures that transforming, moving and creating data in new systems is not halted by the failure of a few records.
  • Logging failed transactions without failing an entire run drives reliability into the process (see the sketch after these bullets).
  • Aligned to transaction processing is an actionable, accurate and timely logging mechanism.
  • Failed transactions should be logged and made available to departments and staff members that are empowered to take action.
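A sketch of what record-level transaction handling with failure logging could look like; transform() and load() are hypothetical stand-ins for the real migration steps, and the log file name is illustrative:

```python
# Process records one at a time: a bad record is logged and skipped,
# never aborting the whole run.
import logging

logging.basicConfig(filename="migration_failures.log", level=logging.INFO)
log = logging.getLogger("data_migration")

def transform(record: dict) -> dict:
    # placeholder transformation; raises on malformed data
    return {**record, "name": record["name"].strip().upper()}

def load(record: dict) -> None:
    pass  # placeholder for the write to the target system

def run(records: list[dict]) -> tuple[int, int]:
    ok, failed = 0, 0
    for i, record in enumerate(records):
        try:
            load(transform(record))
            ok += 1
        except Exception as exc:  # log and continue; never fail the entire run
            failed += 1
            log.error("record %d failed: id=%r (%s)", i, record.get("id"), exc)
    return ok, failed

if __name__ == "__main__":
    good, bad = run([{"id": 1, "name": " Acme "}, {"id": 2, "name": None}])
    print(f"{good} records loaded, {bad} routed to the failure log")
```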
  • Establish Quality Metrics and Unit Tests
  • Establishing metrics in conjunction with the business drives transparency into the processes.
  • These metrics often become the cornerstone of decision-making in the data management effort.
  • Problem areas are revealed earlier, and decisions can be made between mutually exclusive objectives such as accuracy and completeness.
  • Defining the expected quality metrics assists in defining the expected outcome and understanding the point where the process has reached an acceptable quality level.
  • Unit tests play an important role in ensuring quality as they may reveal unintended changes in the semantics of the transformed data (a minimal sketch follows these bullets).
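One possible shape for such metrics and unit tests, sketched below; the completeness threshold, field names and normalization mapping are illustrative assumptions, not values from the article:

```python
# A quality metric agreed with the business, plus a unit test that guards
# the semantics of a transformation (here, country-code normalization).
import unittest

def completeness(records: list[dict], field: str) -> float:
    """Share of records where `field` is populated."""
    if not records:
        return 0.0
    return sum(1 for r in records if r.get(field) not in (None, "")) / len(records)

def normalize_country(value: str) -> str:
    """Transformation under test: map free-text country names to ISO codes."""
    return {"sweden": "SE", "united kingdom": "GB"}.get(value.strip().lower(), value)

class QualityTests(unittest.TestCase):
    def test_completeness_threshold(self):
        sample = [{"lei": "X1"}, {"lei": ""}, {"lei": "X2"}]
        self.assertGreaterEqual(completeness(sample, "lei"), 0.6)  # agreed threshold

    def test_country_semantics_unchanged(self):
        # Fails loudly if someone later changes what the mapping means.
        self.assertEqual(normalize_country(" Sweden "), "SE")
        self.assertEqual(normalize_country("unknownland"), "unknownland")

if __name__ == "__main__":
    unittest.main()
```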
  • Establish a Reporting Cadence
  • Business users involved in decisions related to data quality are drawn from functional areas across the institution.
  • These users are tasked with fulfilling their day-to-day activities in addition to making time available for the data management effort.
  • The reporting cadence and triage procedure is, therefore, important.
  • Reporting must occur at a tempo that allows the data management effort to move forward in a way that does not overwhelm the decision maker.
  • Volume Testing
  • Volume tests need to be conducted relatively early in the process.
  • Metrics and extrapolations derived from the volume tests are used to predict the timing of live production runs.
  • These metrics are especially important if the principle of division is followed (a small extrapolation sketch follows these bullets).
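A back-of-the-envelope way to turn volume-test metrics into a production estimate, assuming roughly linear throughput; the record counts and timings are invented for illustration:

```python
# Extrapolate production run time from a timed volume test.
from datetime import timedelta

def estimate_runtime(test_records: int, test_seconds: float,
                     production_records: int) -> timedelta:
    """Assume roughly linear throughput; refine with repeated volume tests."""
    rate = test_records / test_seconds  # records per second
    return timedelta(seconds=production_records / rate)

if __name__ == "__main__":
    # e.g. 250k records migrated in 40 minutes during the volume test
    eta = estimate_runtime(250_000, 40 * 60, 8_000_000)
    print(f"Estimated production run: {eta}")  # roughly 21 hours at the same rate
```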
  • Data management is a vital first step towards achieving compliance.
  • The ability to link all legal entity data and documentation together to achieve a single, comprehensive view that is securely accessible from one location holds many advantages for the compliance and risk management teams, as well as other parts of the institution (e.g. business development/customer onboarding).
  • From a compliance perspective, it cements the relationship between risk management and data management.
  • Higher quality data management gives financial institutions the ability to accurately measure risk exposure and comply with an ever-growing list of regulations, all of which aim to mitigate or prevent risk in some way.
  • By making information and data easily accessible, in the correct standardized format and available on demand, financial institutions can ensure they are meeting their obligations under various regulations.

*Source: Bobs Guide, May 20, 2019

 

Data Security: Think Beyond Endpoint*:

  • Endpoint security is a common concern among organizations, but security teams should be thinking more broadly about protecting data wherever it resides.
  • “If you’re just focusing on device protection and not data protection, you’re missing a lot,” said Shawn Anderson, executive security advisor for Microsoft’s Cybersecurity Solutions Group, at the Interop conference held this week in Las Vegas.
  • Rather than add multiple endpoint security products to corporate machines, he urged his audience of IT and security pros to think about protecting their data.
  • An estimated 60% of data is leaked electronically, Anderson said, and 40% is leaked physically.
  • When an organization is breached, the incident costs an average of $240 per record.
  • The average cost of a data breach was $4 million in 2017, a year when hackers stole more than 6 billion records.
  • As more devices jump online, the risk to businesses and their information continues to grow.
  • An estimated 9 billion devices equipped with microcontrollers are deployed in appliances, equipment, and toys each year.
  • Fewer than one percent are now connected. But that number will grow, and “highly secured” IoT devices require properties many devices don’t have: certificate-based authentication, automatic security updates, a hardware root of trust, and a computing base protected from bugs in other code.
  • All computers within an organization – laptops, smartphones, tablets, a rapidly growing pool of IoT devices – are collecting larger amounts of data.
  • Some of it is kept on the machine, but more of it is moving to the cloud, which is driving up the number of alerts companies handle.
  • Microsoft analyzes 6.5 trillion threat signals daily, Anderson pointed out, up from 1.2 trillion a few years ago.
  • The cloud is accelerating how companies can collect, process, store, and use information.
  • As companies transition to hybrid infrastructure, and their data moves across cloud-based and on-prem systems, they should evaluate their endpoint security strategies to make sure data is protected where it resides.
  • In his talk, Anderson discussed what he called the four pillars of infrastructure security: identity and access management, threat protection, information protection, and security management.
  • Companies should have a strategy in place to secure hybrid infrastructure and protect data from internal and external threats.
  • Identity protection is a critical component of threat protection.
  • Businesses should strengthen users’ credentials by enabling MFA, block legacy authentication to reduce the attack surface, increase visibility into why identities are blocked, monitor and act on security alerts, and automate threat remediation with solutions like risk-based conditional access.
  • Data must be protected in use, in transit, and at rest.
  • Businesses should discover and classify sensitive data as it enters the environment, apply protection based on policy, monitor and remediate threats, and remain compliant as data travels throughout the organization before it’s retired and deleted.
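As a rough illustration of the “discover and classify” step, the sketch below tags text containing likely-sensitive values with simple regular expressions; the patterns are illustrative and no substitute for a real classification or DLP product:

```python
# Tag records containing likely-sensitive values so policy-based
# protection (encryption, access restriction) can be applied.
import re

DETECTORS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> set[str]:
    """Return the set of sensitive-data labels found in a piece of text."""
    return {label for label, pattern in DETECTORS.items() if pattern.search(text)}

if __name__ == "__main__":
    sample = "Contact jane.doe@example.com, SSN 123-45-6789, card 4111 1111 1111 1111"
    labels = classify(sample)
    print(labels)  # e.g. {'email', 'ssn', 'card_number'}
    if labels:
        print("apply protection policy: encrypt at rest, restrict access")
```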

*Source: Dark Reading, May 21, 2019

 

 

Five Questions Database Admins Should Ask About GDPR*:

  • The GDPR (General Data Protection Regulation), introduced a year ago, is a regulation in EU law on data protection and privacy for all citizens of the European Union and the European Economic Area.
  • It also addresses the export of personal data outside the EU and EEA, and carries potentially huge fines for non-compliance (4 per cent of annual worldwide turnover or €20 million, whichever is higher).
  • It’s sparking a fundamental rethink of security and compliance, and, as a database administrator (DBA), this can be a challenging change to understand and adapt to.
  • DBAs are finding themselves on the front lines of protecting data, which is impacting their other daily roles and responsibilities.
  • Unless DBAs and their teams, working in partnership with a Data Protection Officer (DPO), take the time to understand sensitive data across their systems, their company can run the risk of being non-compliant, leaving them exposed to internal and external threat vectors and vulnerable to fines.
  • Here are five questions for DBAs to think about:
    1. Do you know where your personal and sensitive data lives?

      • “Sensitive personal data” has a specific meaning under GDPR, but you may have other “sensitive” data outside the scope of GDPR that also needs to be identified and protected such as financial and accounting records, purchasing contracts, etc.
      • If your business has customers and end users in the EU, make sure you know where your personal and sensitive data lives and that you’re defining it properly to be prepared for compliance.
    2. Does your company have a dedicated Data Protection Officer?

      • While the DPO role is only mandatory for certain organizations (unless you can demonstrate that you don’t require one), it may be helpful to at least identify who would be responsible for that task if and when it’s needed.
      • The DBA will need to work with the appointed DPO as they are tasked with identifying personal data in your systems and implementing the appropriate data protection measures.
    3. Are your systems built for automation and with data security in mind?

      • GDPR mandates that data controllers perform data protection impact assessments when certain types of processing of personal data are likely to present a “high risk” to the data subject.
      • Each assessment must include a systematic and extensive evaluation of the organization’s processes and profiles, including how they safeguard the personal data.
      • When you’re adjusting for GDPR and other data regulation requirements, it’s helpful to automate the discovery of sensitive data in all your databases and to have a tool, process or system running reports for you (a minimal discovery sketch follows this question’s bullets).
      • Your goal is to monitor that data in real time, and notify database developers of potential breaches before they deploy schema and code changes into production systems.
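A minimal sketch of such automated discovery, using sqlite3 so it is self-contained; on a production RDBMS the same idea would run against INFORMATION_SCHEMA, and the keyword list is an illustrative assumption:

```python
# Scan a database catalogue for column names that suggest personal data,
# as a first step toward automated discovery and reporting.
import sqlite3

SENSITIVE_HINTS = ("name", "email", "phone", "ssn", "dob", "address", "iban")

def scan_sqlite(path: str) -> list[tuple[str, str]]:
    """Return (table, column) pairs whose names hint at personal data."""
    findings = []
    with sqlite3.connect(path) as conn:
        tables = [r[0] for r in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        for table in tables:
            for _, column, *_ in conn.execute(f"PRAGMA table_info({table})"):
                if any(hint in column.lower() for hint in SENSITIVE_HINTS):
                    findings.append((table, column))
    return findings

if __name__ == "__main__":
    with sqlite3.connect("demo.db") as conn:  # small demo schema
        conn.execute("CREATE TABLE IF NOT EXISTS customers "
                     "(id INTEGER, full_name TEXT, email TEXT, notes TEXT)")
    for table, column in scan_sqlite("demo.db"):
        print(f"review {table}.{column}: possible personal data")
```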
    4. Are you equipped to prevent personal data breaches?

      • There are two main ways to protect personal data: pseudonymization and anonymization.
      • Pseudonymization enhances privacy by replacing most identifying fields within a data record by one or more artificial identifiers, or pseudonyms.
      • Methods for pseudonymization include encryption and masking.
      • Encryption is typically used to protect data as it is moved and can be decrypted afterward with the right key.
      • Masking (and redaction) are often used for data at rest, for example in a non-production database, where data still needs to be usable for testing, etc.
      • Anonymization obscures personal data by masking it, for example.
      • Once anonymized so that the individual is no longer identifiable, the data is safe.
      • The method of anonymization needs to be irreversible for it to be truly anonymized, typically using masking or redaction (see the sketch below).
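A toy sketch contrasting the two approaches; the field values are invented, and a real deployment would use encryption or a tokenization service rather than this in-memory mapping:

```python
# Pseudonymization keeps a separately stored mapping so authorised
# re-identification is possible; anonymization (here, redaction) is irreversible.
import secrets

pseudonym_vault: dict[str, str] = {}  # must be stored and protected separately

def pseudonymize(value: str) -> str:
    """Reversible: replace the value with a random token, remembering the mapping."""
    token = "PSN-" + secrets.token_hex(8)
    pseudonym_vault[token] = value
    return token

def anonymize(value: str) -> str:
    """Irreversible: redact everything except a coarse, non-identifying hint."""
    return value[0] + "***" if value else value

if __name__ == "__main__":
    record = {"name": "Anna Svensson", "city": "Stockholm"}
    pseudo = {"name": pseudonymize(record["name"]), "city": record["city"]}
    anon = {"name": anonymize(record["name"]), "city": record["city"]}
    print(pseudo)                           # token can be mapped back via the vault
    print(anon)                             # 'A***' cannot be reversed
    print(pseudonym_vault[pseudo["name"]])  # authorised re-identification
```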

 

    5. Do you know how to monitor data for potential breaches — when data is constantly moving?

      • Traditionally, data has been stored in one place — the database — with backup copies on physical media.
      • But in this era of data protection strategies that include high availability (HA) and disaster recovery (DR) systems, data is continuously replicated to other locations and to the cloud (DBaaS or IaaS).
      • That continuous movement makes it difficult to identify and protect personal data — especially as DevOps and cloud initiatives are built around constant movement.
      • With database activity monitoring and auditing tools, you can monitor and track important aspects of user behavior.
  • Even if you think your business isn’t immediately impacted by GDPR, it’s worth knowing that it applies to the vast majority of companies that handle any personal data of EU citizens.
  • Ultimately, the quicker you act on asking yourself the right questions, the quicker and more prepared you’ll be for maintaining business operations — while your competition scrambles to adapt.

 

*Source: IT Pro Portal, May 27, 2019

 

 

885 Million Records Exposed Online: Bank Transactions, Social Security Numbers, and more*:

  • Several million records, said to include bank account details, Social Security numbers, wire transactions, and other mortgage paperwork, were found publicly accessible on the server of a major U.S. financial services company.
  • More than 885 million records in total were reportedly exposed, according to Krebs on Security.
  • The data was taken offline on Friday.
  • Ben Shoval, a real-estate developer, reportedly discovered the files online and notified security reporter Brian Krebs.
  • Krebs said that he contacted the server’s owner, First American Corporation, prior to reporting the incident.
  • A leading title insurance and settlement services provider, First American is a large company headquartered in California with more than 18,000 employees. 
  • Its total assets in 2017 were reported at over $9.5 billion.
  • A company spokesperson told Gizmodo it learned about the issue on Friday and that the unauthorized access was caused by a “design defect” in one of its production applications.
  • It immediately blocked external access to the documents, they said, and began evaluating, with the help of an outside forensics firm, what effect, if any, the exposure had on the security of its customers’ information.
  • According to Krebs, Shoval said that the millions of documents, which appeared to date back as far as 2003, included “all kinds of documents from both the buyer and seller, including Social Security numbers, drivers’ licenses, account statements, and even internal corporate documents if you’re a small business.”
  • Krebs reported that the files were accessible without any kind of authentication.

 

*Source: Gizmodo, May 24, 2019

 

 

HCL Exposes Customer, Personnel Info In Wide-Ranging Data Leak*:

  • IT services provider HCL Technologies has inadvertently exposed passwords, sensitive project reports and other private data of thousands of customers and internal employees on various public HCL subdomains.
  • HCL, an $8 billion conglomerate with more than 100,000 employees, specializes in engineering, software outsourcing and IT outsourcing.
  • On May 1, researchers discovered several publicly accessible pages on varying HCL domains, leaving an array of private data out in the open for anyone to look at.
  • That includes personal information and plaintext passwords for new hires, reports on installations of customer infrastructure, and web applications for managing personnel from thousands of HCL customers and employees within the company.
  • The data was secured on May 8.
  • It’s unclear whether malicious actors accessed the data, but researchers stressed that the credentials and internal IDs could be used to log into other HCL systems, while other customer and employee data could be used for nefarious purposes such as phishing attacks.
  • Several subdomains were included in the set of resources with personnel-specific information from HCL, researchers said – including the information for hundreds of new hires and thousands of employees.
  • One such subdomain, containing pages for various HR administrative tasks, allowed anonymous access to a dashboard for new hires.
  • This included records for 364 personnel, dating from 2013 to 2019 (In fact, 54 of the new hire records were as recent as May 6).
  • Most critically, the data exposed cleartext passwords for new hires, which could be used to access other HCL systems to which these employees would be given access.
  • Also exposed were candidate IDs, names, mobile numbers, joining dates, recruiter SAP codes, recruiter names and a link to the candidate form.
  • Another personnel management page listed the names and SAP codes for over 2,800 employees.
  • In addition to HCL employees, the company was also accidentally exposing thousands of records for customers.
  • That’s because a reporting interface for HCL’s “SmartManage” reporting system – which facilitates project management for customers – exposed information regarding project statuses, sites, incidents and more to anyone, unprotected by authentication measures.
  • That included internal analysis reports, detailed incident reports, network uptime reports, order shipment reports and more for over 2,000 customers.
  • The data was first discovered on May 1 when an UpGuard researcher, who was monitoring for the exposure of sensitive customer information, discovered a publicly accessible file on an HCL domain.
  • On May 6, after reaching a reasonably complete level of analysis of the public pages and data, the researcher notified HCL.
  • On May 8, the data was fully secured.

 

*Source: ThreatPost, May 21, 2019