Over the past few years, consumers have gained more control of their financial lives, largely due to the rise of FinTech companies, which provide financial services through technology. These companies offer a wide array of services, such as alternative credit scoring (the use of non-traditional data sources to determine creditworthiness), payments (alternatives to cash), and lending (the use of technology to grant loans directly to consumers).
To provide these services, FinTech companies use their customers' data. They collect clients' names, addresses, dates of birth, gender, nationality, passwords, PINs, bank account details, social security details, and more. Recently, FinTech companies have also begun to use alternative data, which is drawn from non-traditional sources such as web search history, social network behavior (e.g., how users interact, how they respond to certain issues online, and what they post), and psychological profiles.
The use of alternative data yields additional insights for marketing, sales, and decision-making. To illustrate: lenders must assess the risk level and creditworthiness of loan applicants, which is difficult when an applicant has little or no credit history. By mining a would-be borrower's web history, however, lenders can now estimate how creditworthy that person is.
Web search history contains information about a person that no traditional approach can capture. It can, for instance, help lenders infer a person's employment status. If an adult user is searching for video games in the middle of the day, that user most probably does not have a job. If the same user's history contains no searches typical of students, companies may further deduce that the person is probably not a student. An unemployed adult who is not a student would not be considered creditworthy, as there is no indication of an ability to repay.
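The kind of inference described above can be sketched as a toy rule-based screen. To be clear, this is a hypothetical illustration only: the keyword lists, the "middle of the day" time window, and the thresholds are all invented for this sketch, not taken from any real lender's model, and a crude heuristic like this carries exactly the discrimination risks the DPA is concerned with.

```python
from datetime import datetime

# Invented keyword categories for illustration; a real system would be
# far more complex (and far harder to audit).
STUDENT_TERMS = {"tuition", "thesis", "exam schedule", "scholarship"}
LEISURE_TERMS = {"video games", "gaming", "streaming"}

def infer_employment_signal(searches):
    """Classify a list of (timestamp, query) pairs into a crude signal.

    Returns one of: "likely_student", "possibly_unemployed", "no_signal".
    """
    student_hits = 0
    daytime_leisure_hits = 0
    for ts, query in searches:
        q = query.lower()
        if any(term in q for term in STUDENT_TERMS):
            student_hits += 1
        # "Middle of the day" taken here as 9:00-17:00 on a weekday.
        if ts.weekday() < 5 and 9 <= ts.hour < 17:
            if any(term in q for term in LEISURE_TERMS):
                daytime_leisure_hits += 1
    if student_hits > 0:
        return "likely_student"
    if daytime_leisure_hits >= 3:  # arbitrary threshold for the sketch
        return "possibly_unemployed"
    return "no_signal"
```

Even this few-line sketch shows why such processing is sensitive: the decision turns entirely on assumptions baked into keyword lists and thresholds that the data subject never sees.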
This aggressive and extensive data collection and use introduces a number of privacy and security issues. It raises questions as to whether customers are aware that their online behavioral data is being harvested. Moreover, FinTech companies' use of automated decision-making processes may discriminate against vulnerable groups, and their use of personal and sensitive information may affect an individual's eligibility for housing, employment, and credit. Finally, the presence of valuable personal information makes FinTech companies attractive targets for cyber-criminals. This is where the Data Privacy Act (DPA) comes into play. The DPA was passed in the Philippines to protect the fundamental right of privacy of individuals, while ensuring the free flow of information.
The Data Privacy Act
The DPA applies to anyone who collects and uses personal information. Since FinTech companies handle the personal information of their customers, their activities are covered by the provisions of the DPA. Generally, the law mandates FinTech companies, as controllers or processors of personal information, to observe the data privacy rights of their clients and to adhere to the general principles of data privacy. There are, however, five key areas in which FinTech companies are particularly affected by the DPA.
Consent: The DPA requires FinTech companies to obtain their customers' express consent to the processing of personal information, unless circumstances permit otherwise. FinTech companies should never assume that their customers have consented to the processing of their personal information, because implied consent is not valid under the law. Moreover, companies must ensure that their clients are aware of, and agree to, all the purposes of processing. Clients are also given the right to withdraw consent where there are no overriding legal grounds.
Right to be Forgotten: The DPA grants individuals the right to order the removal or destruction of their data. Companies must respect this right upon discovery or substantial proof of any of the following grounds:
1. That the personal information is incomplete, outdated, false, or unlawfully obtained;
2. That the personal information is being used for unauthorized purposes;
3. That the personal information is no longer necessary for the purposes for which it was collected;
4. That the individual withdraws his or her consent or objects to its processing;
5. That the personal information is prejudicial to the individual;
6. That the processing is unlawful; or
7. That the personal information controller or processor violated the rights of the individual concerned.
Notification and Registration of Automated Decision-Making Processes: For FinTech companies utilizing automated decision-making processes, the Implementing Rules of the DPA require notification to the National Privacy Commission (NPC) when the processing is the sole basis for the decision. In addition, FinTech companies are required, in certain instances, to register their processing systems. While the DPA does not provide a penalty for the lack of notification and registration, the NPC may take such factors into account in determining the company's liability in case of a data breach.
Appointment of a DPO: FinTech companies using personal data are required to appoint a data protection officer (DPO). The DPO's main responsibility is to ensure and monitor compliance with the DPA, its implementing rules and regulations (IRR), and the issuances of the NPC.
Data Breach Notification: FinTech companies that control sensitive personal information (e.g., race, ethnicity, marital status, age, religious or political affiliations) are required to notify the NPC within 72 hours of acquiring knowledge of, or forming a reasonable belief that, a breach has occurred. Notification is also required when the breach poses a real risk of serious harm to the client.
Indeed, complying with the DPA is a challenge. However, it also provides a competitive advantage to those who do. Compliance builds trust by showing consumers that their data is protected, which helps companies win new clients and retain existing ones, and it could also result in increased sales and profitability. Hence, FinTech companies should view compliance not only as a responsibility, but also as a selling point.