Fintech businesses handle vast amounts of personal data, including KYC data, and are thus subject to the General Data Protection Regulation (GDPR). This applies not only to companies based in the European Union but also to those established outside it that provide services to customers in the EU and process their personal data, even where no payment is involved. The European Data Protection Board's Guidelines 3/2018 on the territorial scope of the GDPR, adopted on 16 November 2018, seek to clarify this concept by stressing that a clear intention to target customers in the EU is essential in determining whether non-EU entities fall under GDPR obligations.1
In some cases, the processing of personal data requires the customer's consent. Pre-ticked opt-in or opt-out boxes are no longer permissible: consent must be expressed through a clear statement or affirmative action. The GDPR also imposes rigorous accountability obligations on data controllers, which marks a major shift in the data protection regime. These obligations include carrying out impact assessments for higher-risk processing (such as processing of data that could be used to commit financial crime) and implementing data protection by design and by default.1
Beyond these general data protection rules, fintech companies must also comply with banking secrecy and anti-money laundering regulations when providing services to their clients.1
A client's personal data protected by banking secrecy (including in cross-border transfers) may be disclosed only with the client's consent or where disclosure is necessary to achieve one of the following:
According to the Portuguese Data Protection Authority (CNPD), all personal data processed by a bank is subject to banking secrecy.1
Regarding the use of clients' data for Anti-Money Laundering (AML) reporting, the data subject's consent is not required, since the disclosure of specific information rests on a legal obligation. Nonetheless, financial institutions may choose to request client authorisation for information covered by banking secrecy as part of their general terms and conditions, a practice that diverges from the GDPR's consent requirements in this regard.1
A key element of fintech data processing is the profiling of clients, business segmentation, and automated decision-making based on those profiles. Decisions based solely on the automated processing of data intended to evaluate certain personal aspects of the data subject are not permissible where they produce legal effects concerning the data subject or similarly significantly affect him or her.1
The GDPR contains provisions addressing the risks of profiling and automated decision-making. Such decision-making may be carried out under the GDPR only if it is necessary to enter into or perform a contract, is authorised by EU or Member State law applicable to the controller, or is based on the individual's explicit consent. Where one of these grounds applies, additional safeguards must be in place and specific information about the automated decision-making must be disclosed to the people affected. This was highlighted in the European Data Protection Board's (EDPB) response to a January 2020 letter from Member of the European Parliament Sophie in 't Veld: the EDPB noted that controllers must assess any risks that the creation or use of an algorithm may pose to individuals' rights and freedoms, and take action where necessary.1
In addition, further restrictions apply to the processing of special categories of data (such as health-related or biometric data). These can ultimately affect how fintech companies implement strong customer authentication under the PSD II Regulatory Technical Standards, since those standards recommend using the biometric data of payment service users. The CNPD has stated that financial information qualifies as sensitive data protected by the Portuguese Constitution, because it can provide insight into an individual's personal life. This position requires data controllers and processors to strengthen the technical and organisational measures protecting such information, and may necessitate a data protection impact assessment (DPIA). The processing of financial data may therefore require a DPIA under the CNPD's Regulation 1/2018, which lists the processing activities subject to a mandatory DPIA; four of the nine cases in the Regulation involve the processing of highly personal data.1
Portuguese Law No. 58/2019 implements the GDPR at the national level, with specific rules on the handling of deceased individuals' personal data, data storage periods, and minors' consent to data processing. It also allows controllers and processors to retain personal data until any legal or contractual retention obligations have expired.1