On 18 July 2023, Singapore’s Personal Data Protection Commission (PDPC) issued a proposed set of advisory guidelines (the Guidelines) to clarify how Singapore’s comprehensive data protection law, the Personal Data Protection Act (PDPA), applies to the processing of personal data in the design, development and deployment of AI systems.

The Guidelines are intended to be advisory in nature, i.e. not legally binding. Instead, they set out how the PDPC will interpret, apply and enforce the PDPA in contexts where personal data is used in or for AI systems that embed machine learning models to make decisions, recommendations or predictions.

For instance, the Guidelines cover how organisations, in collecting or using personal data in the development or deployment of AI models or systems: 

  • Can benefit from existing exceptions in the PDPA
  • Should comply with the consent, notification and accountability obligations in the PDPA

The Guidelines are open for public consultation to solicit views and comments on whether there are additional issues or scenarios that should be addressed. The consultation will close at midnight on 31 August 2023 (Singapore time).


The Guidelines are structured in three “stages” of AI system implementation, namely:

  • Development, testing and monitoring
  • Deployment
  • Procurement

Development, Testing and Monitoring

In using personal data to train an AI model, organisations can either obtain consent or seek to rely on one of the following two exceptions to consent afforded under the PDPA:

Business Improvement

  • When the organisation is enhancing an existing product, developing a new product, improving operational efficiency, or offering more or new personalised products and/or services, including through offering recommendations to users
  • Limited to data sharing within a corporate group (i.e. cannot be relied upon for disclosure to an external third party)
  • Purpose cannot be reasonably achieved without using individually identifiable data
  • Reasonable person would consider such use appropriate in the circumstances
  • Some use cases include:
    • Recommendation engines in social media services offering users content more aligned to their browsing history
    • Job assignment systems that automatically assign jobs to platform workers
    • Internal HR systems used to recommend potential job candidates by providing a first cut in matching candidates to job vacancies
    • Use of AI systems or machine learning models to provide new product features and functionalities to improve the competitiveness of products and services
  • The Guidelines list some relevant considerations for organisations seeking to rely on the business improvement exception for developing, testing and monitoring AI systems, and for bias assessment.


Research

  • When an organisation is conducting commercial research to advance science and engineering without a product development roadmap, including where the research is conducted jointly with a separate legal entity


Organisations are encouraged to carry out a data protection impact assessment, and adopt a “privacy-by-design” approach. This entails taking steps to mitigate risks within the AI system. They are reminded to adopt appropriate technical, process and legal controls for protecting personal data.

In determining what kind of controls should be implemented, organisations should consider:

  • The risks of unauthorised disclosure or theft to which the personal data could be subject
  • The sensitivity and volume of personal data used

Where possible, data should be pseudonymised or de-identified. In any case, data minimisation should be practised: organisations should use only such volume of data as is necessary to train the AI system or machine learning model, taking into account the relevant time periods and other key filters, such as market/customer segmentation and attributes.

From a corporate governance standpoint, decisions should be made at an appropriately senior management level, and assessments should be documented, including the reasons for the organisation’s choice to use personal data instead of anonymised data.


Deployment

Organisations are reminded that the consent and notification obligations in the PDPA are complementary. In crafting their notifications for AI systems, organisations are encouraged to provide the following information:

  • The function of their product that requires collection and processing of personal data (e.g. recommendation of movies)
  • A general description of types of personal data that will be collected and processed (e.g. movie viewing history)
  • An explanation of how the processing of personal data collected is relevant to the product feature (e.g. analysis of users’ viewing history to make movie recommendations)
  • Identification of specific features of personal data more likely to influence the product feature (e.g. whether movie was viewed completely, multiple times, etc.)

Such notifications can be provided by way of pop-ups, or via more detailed policies that are made available or accessible to end users. Notices can also be “layered”. The PDPC further noted that industry has been developing disclosure best practices, such as model cards and system cards, which could also be useful in delivering these notifications.

With regard to accountability, organisations that process personal data in Singapore are already required to develop and maintain policies to comply with the PDPA. For AI, these policies should make transparent the organisation’s relevant practices and safeguards for achieving fairness and reasonableness, and the level of detail should be proportionate to the risks of each use case (e.g. the potential harm to the individual and the level of autonomy of the AI system).


Procurement

The Guidelines also provide guidance for organisations that engage service providers, such as system integrators, for the development and deployment of bespoke or customisable AI systems. Service providers that act as data intermediaries (i.e. data processors) must adopt the following practices:

  • At the pre-processing stage, use techniques such as data mapping and labelling to keep track of data used to form the training datasets
  • Maintain a provenance record to document the lineage of the training data, identifying the source of training data and tracking how it has been transformed during data preparation

Other best practices include understanding the information that customers would require, based on their needs and the impact on end users, and designing systems so that customers can obtain such relevant information easily. These practices will enable all parties concerned to comply with their respective PDPA obligations (including the consent, notification and accountability obligations) in the procurement of AI systems.


In issuing these proposed Guidelines and engaging with industry via a public consultation, the PDPC has signalled its commitment to adopt an inclusive, multistakeholder approach in shaping its regulatory stance over AI.

It also does not appear that Singapore will enact new standalone AI legislation, as other jurisdictions such as the EU are doing. Instead, existing regulation will be applied to AI systems, and the PDPC makes numerous cross-references to its pre-existing guidance, including the PDPC’s “Advisory Guidelines on Key Concepts in the PDPA”, the “Advisory Guidelines for Selected Topics”, the “Guide to Data Protection Practices for ICT Systems”, the “Guide to Basic Anonymisation”, the “Guide on Responsible Use of Biometric Data in Security Applications”, the “Model AI Governance Framework”, and the “Implementation and Self-Assessment Guide for Organisations”.

At the same time, however, Singapore appears to be keeping a close, watchful eye on developments globally, and will no doubt continue to review and refine its position on regulating AI on an ongoing basis.

As AI continues to develop at tremendous speed across the globe, join us in the fourth session of our AI webinar series, “Focus on APAC”, where we will discuss AI legal and regulatory developments in Asia.

Disclaimer: While every effort has been made to ensure that the information contained in this article is accurate, neither its authors nor Squire Patton Boggs accepts responsibility for any errors or omissions. The content of this article is for general information only, and is not intended to constitute or be relied upon as legal advice.