We previously wrote about Unlocking Compliance with the Japanese Data Privacy Act (APPI) using Private AI and are now following up with an assessment of the Interim Report on Considerations for the Triennial Review of the Act on Protection of Personal Information (Interim Report). The Interim Report results from a review of the APPI that the act itself mandates every three years.
We focus on the contemplated introduction of administrative fines to increase the effectiveness of enforcement, strengthened opt-out rights for individuals regarding third-party disclosures, and a lightened breach-reporting burden. We also consider what these changes could mean for compliance efforts and how Private AI can help.
Administrative Fines
The Interim Report proposes establishing an administrative fine system, a significant shift from the current enforcement framework. Unlike jurisdictions whose data protection laws actively provide for administrative fines, such as Europe’s General Data Protection Regulation (GDPR), Japan currently lacks such measures under the APPI.
Enforcement under the APPI presently includes the Personal Information Protection Commission (PPC) conducting investigations and issuing guidance or recommendations. If a business fails to comply with these recommendations without valid reasons and poses a significant threat to individuals’ rights, the PPC may issue cease and desist orders. However, despite the existence of criminal penalties for non-compliance with these orders, enforcement is infrequent.
Criminal penalties have thus far been reserved for the most egregious cases. One example referenced in the Interim Report involved information about a bankrupt individual being published on the Internet alongside a map showing the individual’s address.
The proposed administrative fines aim to bolster the PPC’s ability to monitor and enforce compliance with the APPI more effectively. The Report nevertheless cautions that the system should be introduced carefully, given the substantial opposition voiced by stakeholders at public hearings on the matter.
This development would likely mean that enforcement expands to APPI violations that fall short of criminal misconduct but nevertheless endanger individuals’ privacy. One case highlighted in the Interim Report involved a call center business engaged by 38 local governments: a person hired to maintain and operate the system repeatedly extracted residents’ and customers’ personal information. The only enforcement action was a recommendation to correct deficiencies in the organizational security control measures, together with guidance on how to do so. In the future, such cases may attract fines as well.
Private AI’s Privacy Layer for Call & Contact Centers can help identify and remove more than 50 types of personal information from customer relationship management systems, including unstructured data such as transcripts and chat logs. Its machine learning models operate with state-of-the-art accuracy in 53 languages, including Japanese, and across all common file formats.
Storing less personal data in an organization’s environment not only reduces the potential for misuse by internal or external actors; it also shrinks the volume of data exposed to regulatory scrutiny and data breaches. By minimizing the exposure of personal information, Private AI’s solution helps organizations comply more effectively with data protection laws such as the APPI and mitigate the risks of non-compliance, including the proposed administrative fines.
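To illustrate the redaction workflow in the abstract, the sketch below replaces detected personal information in a call transcript with entity-type placeholders. The regex patterns are a deliberately simplified, hypothetical stand-in: production solutions such as Private AI’s use machine learning models rather than regular expressions and cover far more entity types and languages.

```python
import re

# Hypothetical, simplified detectors — illustrative only, not an actual
# product API. Real systems use ML-based entity recognition.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{2,4}-\d{2,4}-\d{4}\b"),  # common Japanese formats
}

def redact(text: str) -> str:
    """Replace detected personal information with entity-type placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

transcript = "Customer: my number is 03-1234-5678, email tanaka@example.jp"
print(redact(transcript))
# Customer: my number is [PHONE], email [EMAIL]
```

The key point is that redaction happens before the transcript is stored, so the personal data never accumulates in the environment in the first place.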
Opt-Out of Third-Party Disclosures
A second focus of the Interim Report is the strengthening of opt-out rights that have existed in the APPI since 2015. The PPC observed that businesses are not properly scrutinizing data recipients or their intended use, which has in the past led to the sale of personal data to criminal groups. In addition to increased due diligence obligations regarding the circumstances of acquisition, the PPC also contemplates how to make individuals more aware of their opt-out rights and how to facilitate exercising them.
These measures may lead to an increase in opt-outs and, consequently, to additional resources being dedicated to filtering out personal information from data sets intended for disclosure to third parties.
Private AI’s personal data detection and redaction software can easily automate this process and ensure that customers’ opt-out decisions are honored by not disclosing their personal information.
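Conceptually, honoring opt-outs before a third-party disclosure is a filtering step: records of individuals who have exercised their opt-out right are withheld from the outgoing data set. The sketch below shows that step with an illustrative, hypothetical schema; field names and structure are assumptions, not any actual system.

```python
# Illustrative records; "opted_out" marks individuals who exercised
# their APPI third-party-disclosure opt-out right.
records = [
    {"id": "u1", "name": "Sato",   "opted_out": False},
    {"id": "u2", "name": "Suzuki", "opted_out": True},
    {"id": "u3", "name": "Tanaka", "opted_out": False},
]

def disclosable(records):
    """Keep only records whose subjects have not opted out of disclosure."""
    return [r for r in records if not r["opted_out"]]

print([r["id"] for r in disclosable(records)])
# ['u1', 'u3']
```

Automating this check at the point of export, rather than relying on manual review, is what keeps the cost of rising opt-out volumes manageable.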
Shift to Risk-Based Incident Reporting
Since 2022, organizations have been required to make both a preliminary and a final incident report to the PPC for every instance of a breach of personal information, even where only a single individual is affected and regardless of the type or sensitivity of the personal data involved. A breach in this context means accidental disclosure, such as sending medication or credit cards to the wrong individual, or unauthorized access, such as a successful cyberattack.
The PPC seems inclined to allow for the substitution of the preliminary report with an externally approved reporting system and procedure that provides assurance that the affected individuals are being properly informed of the incident and measures they can take to protect themselves from any harm.
However, the PPC is hesitant to shift to a risk-based reporting regime, under which a report would be required only above a higher risk threshold than the mere possibility of harm, even though a significant-risk threshold is prevalent in many other jurisdictions. For now, the PPC deems it necessary to assess actual cases and conduct further studies in cooperation with businesses to clarify the necessary requirements.
If businesses are successful with their argument that the current reporting regime is overburdening them, and if evidence suggests that a report to the PPC may suffice in instances of elevated risks, the question will become how businesses can easily determine the risk. Commonly, the number of affected individuals and the type of personal information breached will be informative.
Private AI can support this determination across all databases in an organization’s environment. In the event of a breach of an information technology system, pointing the software at the affected system produces a report of the amount and type of personal information it contains, greatly facilitating the risk-level assessment based on sensitivity and the number of affected individuals.
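As a minimal sketch of what such a post-breach summary could look like: count the distinct affected individuals and the entity types present in the compromised data, then map that to a coarse risk level. The thresholds and the set of “sensitive” entity types here are illustrative assumptions, not regulatory guidance or an actual product output.

```python
# Illustrative assumption: which entity types count as sensitive, and a
# headcount threshold for elevated risk. Real assessments would follow
# PPC guidance once a risk-based regime is defined.
SENSITIVE = {"HEALTH", "CREDIT_CARD"}

def risk_report(dataset):
    """Summarize a breached data set for a risk-based reporting decision."""
    individuals = {row["subject_id"] for row in dataset}
    entity_types = {e for row in dataset for e in row["entities"]}
    elevated = bool(entity_types & SENSITIVE) or len(individuals) > 1000
    return {
        "affected_individuals": len(individuals),
        "entity_types": sorted(entity_types),
        "risk": "elevated" if elevated else "low",
    }

breached = [
    {"subject_id": "u1", "entities": ["NAME", "EMAIL"]},
    {"subject_id": "u2", "entities": ["NAME", "CREDIT_CARD"]},
]
print(risk_report(breached))
# {'affected_individuals': 2, 'entity_types': ['CREDIT_CARD', 'EMAIL', 'NAME'], 'risk': 'elevated'}
```

A report like this gives compliance teams the two inputs the Interim Report points to — the number of affected individuals and the type of personal information breached — in one place.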
Conclusion
The impending revisions to Japan’s APPI suggest a significant enforcement shift with the introduction of administrative fines, signalling a stricter compliance landscape similar to global standards like the GDPR. This evolution underscores the necessity for organizations to leverage advanced solutions like Private AI to manage and minimize personal data effectively. By employing such technologies, companies can reduce risks of non-compliance and enhance data security, aligning with heightened regulatory expectations and mitigating potential financial penalties.
To see the tech in action, try our web demo, or get an API key to try it yourself on your own data.