Singapore to Greenlight Personal Data Use for AI Development
Exceptions to Help AI Developers Avoid Getting Customer Consent to Use Data

Singapore's data protection watchdog on Tuesday released for public consultation its proposed guidelines for organizations' collection and use of personal data to develop and deploy artificial intelligence-enabled systems.
The Personal Data Protection Commission said organizations that need to process the personal data of citizens to develop and deploy artificial intelligence-enabled systems must obtain clear and precise consent from data subjects for the use and processing of their personal data, barring some exceptions.
Under the Business Improvement Exception, the commission said, organizations need not obtain customers' consent to use their personal information to develop or enhance existing products with artificial intelligence capabilities, to improve operational efficiency using an AI system, or to offer new personalized products or services to customers.
Businesses operating in Singapore can use AI systems to process customer data with the aim of understanding an individual's behavior or preferences and customizing goods or services for that individual. They also may not require customer consent when sharing customers' personal data with related companies within a corporate group or between departments of the same company, as long as the personal data is processed to refine or develop existing products or services.
The data protection watchdog said organizations have the option of using anonymized data to develop AI-enabled products or services, but where the limitations of anonymized data - such as reduced model accuracy or poorer repeatability and reproducibility of results - make the use of live personal data necessary, they must process only as much of it as specific processes require.
"Organizations should carefully weigh the pros and cons of using both types of data and clearly document internally the reasons for choosing to use personal data instead of anonymized data. Organizations should employ appropriate corporate governance methods to make such decisions, including consulting relevant stakeholders and having such decisions made at an appropriately senior management level," PDPC said.
The commission added that businesses choosing to rely on the research exception to avoid obtaining prior consent must establish that the research project cannot be accomplished without the use of personal data, that there is a clear public benefit to using the personal data for the research purpose, that the personal data will not be used to make decisions that may affect the individual, and that published research results will not identify the individual.
Avoiding a Heavy-Handed Approach to AI
Josh Lee Kok Thong, APAC managing director at privacy policy research group Future of Privacy Forum, said the exceptions reflect the Singapore government's view of artificial intelligence as "a key strategic enabler in developing its economy and improving the quality of life of its citizens." "This explains why Singapore is not taking a heavy-handed approach in regulating AI lest it stifles innovation and investment," he said.
PDPC also proposed a Research Exception, giving organizations the option of not obtaining customer consent when conducting "broader research and development that may not have any immediate application on their products, services, business operations or market." Unrelated organizations that participate in joint projects to develop new AI systems may not need customer consent prior to sharing and processing their data unless it is practical to obtain such consent.
PDPC also advised organizations developing AI systems to provide authorities with information on data quality and governance measures without compromising the commercial confidentiality or security of their projects. Such information may include steps taken to ensure the quality of personal data, safeguards adopted to limit access to testers, whether data minimization was practiced, and whether it was necessary to process personal data to perform bias assessments.
The data protection watchdog said organizations will not automatically qualify for the business improvement or research exceptions. For the most part, they will still be required to obtain a customer's prior consent before collecting, using or disclosing personal information, and they must simplify the accompanying documentation to ensure a customer can provide "meaningful consent."
The proposed guidelines will be available for public feedback until Aug. 31.