
Communications of the ACM

BLOG@CACM

Auditing AI and Autonomous Systems; Building an Infrastructure of Trust


Ryan Carrier

Few would disagree with the statement "We need to be able to trust our artificial intelligence and autonomous systems." But achieving that trust, rather than merely desiring it, demands an understanding of what trust is: how it is achieved, and what factors need to be in place for it to be assured. What processes, standards, behaviors, and accountability need to be guaranteed in order to confidently say, "I trust this"?

This post explores those questions by outlining key elements of building and preserving trust, then goes on to explore how these elements have applied, and continue to apply, to a range of industries and historic events. It also suggests some solutions for building trust in AI and autonomous systems, solutions in which we have enough confidence that we are beginning to operationalize them now.

On its face, Enron, the world's most famous audit/accounting disaster, might seem to highlight the failure of auditing and point to the struggles facing any attempt to audit AI and autonomous systems. But that would be to look and not really see.

The vast majority of audits do not uncover disasters, large-scale frauds, or malfeasance. Indeed, thousands of public entities are audited without fanfare every year. The Enron case is an outlier of the most extreme kind which, if studied correctly and reconciled with the 99.99% of regular audits, can inform what correct and robust structures are needed for an #infrastructureoftrust. By designing and creating Independent Audit of AI Systems (IAAIS), ForHumanity will help build a system of trust in our AIs and autonomous systems.

Trust has been studied by many for centuries, and there is no universal and perfect process that guarantees it. So, instead, we look for instances where trust has been secured and see which elements they have in common. Here we submit some of the elements that consciously and subconsciously combine to allow us to trust:

  1. Predictability (a belief or perception about the likelihood of future outcomes from an interaction)

  2. Transparency (access to information allowing for decision-making, especially changes from current trajectories)

  3. Understanding (awareness of a sufficient level of information and knowledge to feel comfortable with a decision or choice)

  4. Control (power equating to a sense of being in-charge and the belief that a decision has multiple viable and beneficial choices)

  5. Security (protection from harm: physical, mental, spiritual, financial or emotional)

  6. Fairness (the belief the interaction has a balanced set of pros and cons for both/all parties)

  7. Equity (the belief an individual's interaction is comparable to others with similar circumstances)

  8. Morality (the belief the trusted party/entity cares about their impact on us or at least they are not intending to harm people)

Each of us 'trusts' in different ways. Depending on the interaction in question, we ascribe more or less value to these listed attributes. Monetary interactions might score lower on Morality and higher on Fairness, while trust in manual driving scores high on Predictability (if not, it would be nearly impossible to share a road). If we are to build an #infrastructureoftrust that serves all people, then all of these elements must be accounted for in a robust way.
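To make that weighting concrete, the idea can be sketched as a simple scoring model. This is an illustrative sketch only: the 0-1 scale, the weights, and the example scores below are assumptions for demonstration, not part of any ForHumanity audit criteria.

```python
# Illustrative sketch: the eight trust variables from the list above,
# weighted differently depending on the kind of interaction.

TRUST_VARIABLES = [
    "predictability", "transparency", "understanding", "control",
    "security", "fairness", "equity", "morality",
]

def trust_score(scores: dict, weights: dict) -> float:
    """Weighted average of trust-variable scores; scores and weights lie in [0, 1]."""
    total_weight = sum(weights.get(v, 0.0) for v in TRUST_VARIABLES)
    if total_weight == 0:
        return 0.0
    weighted = sum(scores.get(v, 0.0) * weights.get(v, 0.0) for v in TRUST_VARIABLES)
    return weighted / total_weight

# A monetary interaction might weight Fairness heavily and Morality lightly,
# while sharing a road weights Predictability above everything else.
monetary_weights = {"fairness": 0.9, "security": 0.7, "transparency": 0.6, "morality": 0.2}
driving_weights = {"predictability": 1.0, "security": 0.8, "control": 0.5}

observed = {"predictability": 0.9, "control": 0.8, "security": 0.7,
            "fairness": 0.6, "transparency": 0.5, "morality": 0.4}

print(round(trust_score(observed, monetary_weights), 2))  # trust in the monetary interaction
print(round(trust_score(observed, driving_weights), 2))   # trust in shared-road driving
```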

From this foundation, we tackled AIs and Autonomous Systems. We didn't have to start from scratch: in our purview was an existing #infrastructureoftrust with a nearly 50-year track record — the system of Independent Audit already well established in the world of Financial Reporting and Financial Accounting.

We took the 'Trust Variables' above and considered them in relation to the three key parties (auditor, compliant entity, society) engaged in an existing financial audit.

  1. Predictability

    1. A third-party auditor will examine numbers and financial accounts, each year, in a nearly identical manner, notwithstanding changes to the rules necessary to properly maintain the system.

    2. The compliant entity can plan on compliance and even build compliance-by-design. The Committee of Sponsoring Organizations of the Treadway Commission (COSO) system of Internal Risk and Controls was built on this very premise.

    3. Society knows what compliance means. It can anticipate how usual and unusual events will be treated, which makes the results comparable over extended time periods for valuable evaluation.

  2. Transparency

    1. These rules are publicly available to the auditor, and the auditor's work is also reviewed. The auditor can participate in but not control changes to those rules which may facilitate their ability to assure compliance.

    2. The compliant entity knows the rules and has the ability to participate in but not control changes to those rules which may facilitate their ability to comply and accurately represent their work.

    3. Society knows the rules (even if a specialized profession largely deploys the results on their behalf) and has the ability to review the result.  Society has an ability to participate in but not control changes to those rules which may facilitate their ability to understand the entity and the outputs.

  3. Understanding

    1. Understanding requires professional expertise about these numbers and what they mean. Those explanations are widely available and consistently delivered to all who seek knowledge about how to audit.

    2. The compliant entity has equal knowledge of how to comply with the audit.

    3. Society understands how financial audits are conducted, what compliance means, and understands the output.

  4. Control

    1. A third-party auditor has complete and unconflicted responsibility for assuring compliance with the rules and standards; assuring compliance is in their sole discretion based on pre-defined and well-understood proofs supplied by the compliant entity.

    2. The compliant entity reduces its own risk and dictates the manner in which it provides compliance satisfaction. Compliance satisfaction is well known and understood to avoid failed compliance. Satisfactory compliance will equate to assurance by the auditor.

    3. Society has more limited control here, except that a vital feedback loop is created as the rules are established by trusted industry specialists. If society does not use these outputs (ultimate choice) then the whole system fails and folds back upon itself. As long as the system functions positively, then society continues to use the outputs, and the system repeats.

  5. Security

    1. The Auditor has the contractual structure and 'power' to demand the information required to assure compliance. They have the imprimatur to do so because they are not authors of the rules. They have full transparency through their contract with the compliant entity.

    2. The compliant entity protects its intellectual property from the whole world, including its competitors. It is responsible for all aspects of compliance and all information, except disclosures, maximizing the protection of that information vis a vis omnibus transparency. The compliant entity maximizes security by being compliant.

    3. Society protects itself by establishing the rules which raise transparency, increase governance, oversight, and protection/safety mechanisms for humans. When laws are codified into audit rules we achieve proactive compliance rather than the traditional reactive compliance courtesy of laws and punishment.***

  6. Fairness

    1. Auditors are independent (legal term) and only receive commensurate compensation for conducting audits. The Auditor has liability for certifying compliance where it is not earned and proved by the compliant entity. This creates a natural "tension" between auditor and compliant entity where the auditor can effectively imply "prove it" in regards to the audit criteria.

    2. The compliant entity knows that if it is compliant and can reasonably prove compliance, then assurance will occur. The fees associated with the audit do not outweigh the economic benefit of compliance.

    3. Society benefits from an unconflicted process and the leverage provided by an audit firm acting as an unbiased proxy for verification and assurance of an entity's financial reports. It acts as 'shorthand' for all members of society doing their own work.

  7. Equity

    1. Auditors provide the same assurance for the same compliance based upon a set of accepted third-party rules (avoiding the inherent conflict of interest of auditing your own rules) and are simply applying them in return for a fair wage.

    2. All compliant entities receive the same assurance for the same compliance based upon a set of rules they did not create and receive public recognition for complying with the rules demanded by society on a level playing field.

    3. Society establishes a set of rules for all: a level playing field of compliance and assurance on a set of rules that treat the individual in a fair way vis a vis the compliant entity during the exchange of goods and services.

  8. Morality

    1. The Auditor understands they act as a proxy for society and have a duty of care to assure compliance where it exists and withhold assurance when compliance is not achieved.

    2. The compliant entity willfully complies with societal rules in exchange for the right to earn a profit or the right to continue to deliver its goods and services.

    3. Society makes rules and processes that satisfy their needs for trust, balanced against the economic needs of the compliant entity's ability to profitably or successfully deliver their goods and services.

Using the trust framework derived from financial accounting, it is instructive to examine a monstrous failure of the system to understand the ways in which it was manipulated. This gigantic fraud was facilitated by fundamental failures of the infrastructure (now rectified).

The Enron Story

Before 2001, Enron employed approximately 29,000 staff and was a major electricity, natural gas, communications, and pulp and paper company, with claimed revenues of nearly $101 billion during 2000 [1]. Fortune named Enron "America's Most Innovative Company" for six consecutive years. Its stock market capitalization peaked at roughly $70 billion. Enron's auditor was Arthur Andersen. After the scandal was uncovered, all of that equity was destroyed.

In the next section we return to the elements of the framework of trust and map the Enron scandal against each. Here, we will skip the role of one of the three key parties, Society, because it is not unreasonable to conclude that society was universally and completely neglected and subsequently harmed by this fraud and malfeasance. Later, we address the role of fraud and malfeasance in this system.

  1. Predictability

    1. A third-party auditor will examine numbers and financial accounts, each year, in a nearly identical manner, notwithstanding changes to the rules necessary to properly maintain the system.

    2. The compliant entity can plan on compliance and even build compliance-by-design. The Committee of Sponsoring Organizations of the Treadway Commission (COSO) system of Internal Risk and Controls was built on this very premise.

Enron and Arthur Andersen conspired to use this predictability against society. They manipulated the rules, maximized loopholes, and tailored contracts and business functions to fit into those loopholes. Collusion and communication were required to maximize these manipulations of Predictability.

  2. Transparency

    1. The rules are publicly available to the auditor, and the auditor's work is also reviewed. The auditor can participate in but not control changes to those rules which may facilitate their ability to assure compliance.

    2. The compliant entity knows the rules and has the ability to participate in but not control changes to those rules which may facilitate their ability to comply and accurately represent their work.

As a result of collusion between the two entities, application of the rules was manipulated.  The  clarity of the rules was used against the system to exploit the loopholes. This was only possible because auditor independence was compromised.

  3. Understanding

    1. Understanding requires professional expertise about these numbers and what they mean. Those explanations are widely available and consistently delivered to all who seek knowledge about how to audit.

    2. The compliant entity has equal knowledge of how to comply with the audit.

In the Enron story, "specialized" and industry expertise was used against society. Insufficient scrutiny led to these manipulations. However, this specialized and industry expertise, in the shape of sophisticated investors and analysts studying disclosure and transparency requirements, eventually called out the fraud, and investigations were launched. Understanding remains key, and the more widely that understanding is shared, the greater the defense against fraud and malfeasance.

  4. Control

    1. A third-party auditor has complete and unconflicted responsibility for assuring compliance with the rules and standards; assuring compliance is in their sole discretion based on pre-defined and well-understood proofs supplied by the compliant entity.

    2. The compliant entity reduces its own risk and dictates the manner in which it  provides compliance satisfaction.  Compliance satisfaction is well known and understood to avoid failed compliance. Satisfactory compliance will equate to assurance by the auditor.

Arthur Andersen was utterly compromised in its role: being paid as consultant, advisor, and auditor made it impossible for the firm to act independently in the best interests of the audit process. This failure was rectified by the Sarbanes-Oxley Act (2002), passed in direct response to this incident. Later, we highlight how this law remedied key shortcomings in the infrastructure that resulted in this fraud. Enron exerted control by manipulating its accounting to magnify earnings and other related assets, making the company appear more valuable than it would have appeared under standard accounting practices.

  5. Security

    1. The Auditor has access to all of the information they require to assure compliance and need not assure compliance until they are satisfied. They have the authority to request all relevant information and the imprimatur to do so because they are not the authors of the rules. They are availed of full transparency through their contract with the compliant entity.

    2. The compliant entity protects its intellectual property from complete transparency to the whole world, including its competitors. It is responsible for all aspects of compliance and all information, except disclosures, maximizing the protection of that information vis a vis omnibus transparency.  The compliant entity maximizes security by being compliant.

By violating the principle of Independence, the colluding auditor and compliant entity conspired to manipulate the accounting practices. They actively worked together to manufacture numbers and accounts into beneficial expressions in public reports. Security was never at risk for either firm because they worked together to manage the risk.  However, the harm to Society was monumental as many employees and investors lost their entire investments.

  6. Fairness

    1. Auditors are independent (legal term) and only receive commensurate compensation for conducting audits. The Auditor has liability for certifying compliance where it is not earned and proved by the compliant entity. This creates a natural "tension" between auditor and compliant entity where the auditor can effectively imply "prove it" in regards to the audit criteria.

    2. The compliant entity knows that if it is compliant and can reasonably prove compliance, then assurance will occur. The fees associated with the audit do not outweigh the economic benefit of compliance.

During 2000, Andersen earned $25 million in audit fees and $27 million in consulting fees from Enron alone. Under such an arrangement, it is difficult to see how an auditor could be expected to deny compliance. Further, that compliance was dictated by the auditor's own advice and actions, matching its audit to the guidance it had given for audit compliance. For Enron, there was no natural tension between itself and its auditor; there was nothing to "prove" because the proof was often constructed by the auditor.

  7. Equity

    1. Auditors provide the same assurance for the same compliance based upon a set of accepted third-party rules (avoiding the inherent conflict of interest of auditing your own rules) and are simply applying them in return for a fair wage.

    2. All compliant entities receive the same assurance for the same compliance based upon a set of rules they did not create and receive public recognition for complying with the rules demanded by society on a level playing field.

Arthur Andersen conspired with senior executives at Enron, providing Special Purpose Vehicles (SPVs) and unusual accounting treatments compared with the rest of the marketplace. Special treatment was provided by the auditor. Non-standard and inherently conflicted solutions were developed by Enron, such as creating SPVs and executing hedges with itself: the opposite of Equity.

  8. Morality

    1. The Auditor understands they act as a proxy for society and have a duty of care to assure compliance where it exists and withhold assurance when compliance is not achieved.

    2. The compliant entity willfully complies with societal rules in exchange for the right to earn a profit or the right to continue to deliver its goods and services.

Arthur Andersen did not behave as a proxy for Society. It failed to consider the consequences and risk to people, notably the employees, many of whom lost everything from pensions to 401(k)s and jobs. Enron intentionally subverted rules, regulations, and transparency in order to increase profits. The opposite of Moral behavior.

It is important to state here that fraud and malfeasance perpetrated by bad actors will ALWAYS defeat a transparent system. It is impossible to build a transparent set of rules without inadvertently explaining to bad actors how to beat them. Rather, build the system for willful compliance, for entities that want to perpetuate compliance. Bad actors will eventually be caught and terminated when the law, punishment, and markets finish with them. Disclosure, transparency, checks-and-balances, the prohibition of conflicts-of-interest, and robust incentive schemes will dissuade most bad actors, but no system is foolproof, and this one is no exception.

To paraphrase Sarbanes-Oxley, passed to legally define "independence":

If you audit, then you cannot receive any other upside benefits from a relationship with the auditee. If you provide service, consult, or otherwise help a client prepare or comply with an audit, then you cannot be the auditor. Finally, if you audit or serve a client, you must use a third party to establish audit rules.

This law, combined with the system we propose, has a good chance of establishing an #infrastructureoftrust for our AIs and autonomous systems.
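As a rough illustration only, these paraphrased independence conditions can be expressed as a mechanical conflict check. The field names and function below are hypothetical constructs for this sketch; they are not drawn from the text of Sarbanes-Oxley or from ForHumanity's license agreement.

```python
from dataclasses import dataclass

@dataclass
class EngagementRelationship:
    """Hypothetical description of a firm's relationship with an auditee."""
    audits: bool                  # firm performs the audit
    consults_on_compliance: bool  # firm helps the client prepare for or comply with the audit
    other_upside_benefits: bool   # firm receives benefits beyond the audit fee
    writes_audit_rules: bool      # firm authors the audit rules it applies

def independence_violations(rel: EngagementRelationship) -> list:
    """Return which of the paraphrased independence rules this relationship breaks."""
    violations = []
    if rel.audits and rel.other_upside_benefits:
        violations.append("auditor receives upside benefits beyond the audit fee")
    if rel.audits and rel.consults_on_compliance:
        violations.append("auditor also provides compliance/consulting services")
    if (rel.audits or rel.consults_on_compliance) and rel.writes_audit_rules:
        violations.append("audit rules are not set by an independent third party")
    return violations

# An Enron/Andersen-style arrangement: audit plus consulting plus bespoke treatments.
conflicted = EngagementRelationship(audits=True, consults_on_compliance=True,
                                    other_upside_benefits=True, writes_audit_rules=True)
print(independence_violations(conflicted))
```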

To end, we take one last walk through our trust device, this time mapping Independent Audit of AI Systems against it, with a summary paragraph after each section noting the changes from financial auditing:

  1. Predictability

    1. A third-party auditor will examine AI audit criteria, each year, in a nearly identical manner, notwithstanding changes to the rules necessary to properly maintain the system.

    2. The compliant entity can plan on compliance and even build compliance-by-design. The Committee of Sponsoring Organizations of the Treadway Commission (COSO) system of Internal Risk and Controls will need to be adapted to these audits.

    3. Society knows what compliance means. It can anticipate how usual and unusual events will be treated, which makes the results comparable over extended time periods for valuable evaluation.

ForHumanity has developed a process and is expanding audit criteria for all AIs and Autonomous systems that impact humans (over 3,400 lines of audit criteria already). These rules will be publicly available as they are approved by regulators around the world. Companies and auditors will know exactly what the compliance criteria are and will be encouraged to participate in the development process. This marketplace is more dynamic than that of tax law and financial accounting, and so the processes need to be regularly iterated. A reasonable time period will be allowed for compliance against newly adopted rules.
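As a purely hypothetical sketch of what compliance-by-design against such criteria could look like in practice, audit lines can be represented as structured records and tracked per category. The fields, identifiers, and statuses below are illustrative assumptions, not ForHumanity's actual audit format; only the five category names are taken from this post.

```python
from dataclasses import dataclass
from collections import Counter

# Categories taken from the ForHumanity audit areas named in this post.
CATEGORIES = {"Ethics", "Bias", "Privacy", "Trust", "Cybersecurity"}

@dataclass
class AuditCriterion:
    """Hypothetical representation of a single audit line."""
    criterion_id: str
    category: str         # one of CATEGORIES
    requirement: str      # what the compliant entity must demonstrate
    status: str = "open"  # "open", "evidence-submitted", or "assured"

def compliance_summary(criteria: list) -> dict:
    """Count criterion statuses per category, e.g. to gauge audit readiness."""
    summary = {c: Counter() for c in CATEGORIES}
    for crit in criteria:
        summary[crit.category][crit.status] += 1
    return summary

criteria = [
    AuditCriterion("PRIV-001", "Privacy", "Data-protection impact assessment on record", "assured"),
    AuditCriterion("BIAS-014", "Bias", "Disparate-impact testing documented", "evidence-submitted"),
    AuditCriterion("ETH-003", "Ethics", "Ethics committee reviews high-risk decisions"),
]

for category, counts in compliance_summary(criteria).items():
    if counts:
        print(category, dict(counts))
```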

  2. Transparency

    1. These rules are publicly available to the auditor, and the auditor's work is also reviewed. The auditor can participate in but not control changes to those rules which may facilitate their ability to assure compliance.

    2. The compliant entity knows the rules and has the ability to participate in but not control changes to those rules which may facilitate their ability to comply and accurately represent their work.

    3. Society knows the rules (even if a specialized profession largely deploys the results on their behalf) and has the ability to review the result. Society has an ability to participate in but not control changes to those rules which may facilitate their ability to understand the entity and the outputs.

ForHumanity operates a fully-inclusive process for developing the rules. It is crowdsourced, iterated, and transparent. All may participate in drafting audit criteria in the areas of Ethics, Bias, Privacy, Trust, and Cybersecurity. All qualified firms will be licensed to audit or provide compliance services. The only requirements for participation in the ForHumanity audit drafting process are a contact mechanism (such as email) and a willingness to sign our code of conduct asking humans to act respectfully to other humans during the process.

  3. Understanding

    1. Understanding requires professional expertise about these criteria and what they mean. Those explanations are widely available and consistently delivered to all who seek knowledge about how to audit.

    2. The compliant entity has equal knowledge of how to comply with the audit.

    3. Society understands how AI audits are conducted, what compliance means, and understands the output.

While auditing AIs and Autonomous systems remains in its infancy, ForHumanity has launched training and learning objectives available for all who want to learn how to audit using our approved certification schemes. People can learn how to execute these audits and pass an exam to audit or build systems designed to facilitate audit compliance.

  4. Control

    1. A third-party auditor has complete and unconflicted responsibility for assuring compliance with the rules and standards; assuring compliance is in their sole discretion based on pre-defined and well-understood proofs supplied by the compliant entity.

    2. The compliant entity reduces its own risk and dictates the manner in which it provides compliance satisfaction. Compliance satisfaction is well known and understood to avoid failed compliance. Satisfactory compliance will equate to assurance by the auditor.

    3. Society has more control here: unaffiliated individuals, whose mission is dedicated solely to mitigating risk to Society, are drafting the rules. If society does not use these outputs (ultimate choice), then the whole system fails and folds back upon itself. As long as the system functions positively, then society continues to use the outputs, and the system repeats.

Compliance assurance remains solely in the domain of the auditor, identical to financial audit. Courtesy of our crowdsourced expertise, people can be involved in the drafting process as much or as little as they choose. All are welcome inside ForHumanity; we welcome your perspective, wisdom, culture, and your passion for building trustworthy AIs and autonomous systems. You will make a difference. One word, one definition, one improved audit line will make a difference. The ubiquity of AI systems requires all perspectives to make it work for everyone, everywhere.

  5. Security

    1. The Auditor has the contractual structure and 'power' to demand the information required to assure compliance. They have the imprimatur to do so because they are not authors of the rules. They have full transparency through their contract with the compliant entity.

    2. The compliant entity protects its intellectual property from the whole world, including its competitors. It is responsible for all aspects of compliance and all information, except disclosures, maximizing the protection of that information vis a vis omnibus transparency. The compliant entity maximizes security by being compliant.

    3. Society protects itself by establishing the rules which raise transparency, increase governance, oversight, and protection/safety mechanisms for humans. When laws are codified into audit rules we achieve proactive compliance rather than the traditional reactive compliance courtesy of laws and punishment.***

Independent Audit of AI Systems will not provide ultimate security: all transparent systems are at the mercy of bad actors. However, combined with the laws of independence, the checks and balances, oversight, governance, disclosures, transparency, and a robust audit process, it introduces a substantially heightened level of security for our AIs and Autonomous systems, certainly above our current, unregulated free-for-all.

  6. Fairness

    1. Auditors are independent (legal term) and only receive commensurate compensation for conducting audits. The Auditor has liability for certifying compliance where it is not earned and proved by the compliant entity.

    2. The compliant entity knows that if it is compliant and can reasonably prove compliance then assurance will occur.  The fees associated with the audit do not outweigh the economic benefit of compliance.

    3. Society benefits from an unconflicted process and the leverage provided by an audit firm acting as an unbiased proxy for verification and assurance of an entity's financial reports. It acts as 'shorthand' for all members of society doing their own work.

This section is unchanged and expresses exactly how Independent Audit of AI Systems maintains fairness. ForHumanity has one advantage over financial accounting: we have incorporated the Sarbanes-Oxley legal structure of independence into our license agreement to ensure fairness and independence in the audit system.

  7. Equity

    1. Auditors provide the same assurance for the same compliance based upon a set of accepted third-party rules (avoiding the inherent conflict of interest of auditing your own rules) and are simply applying them in return for a fair wage.

    2. All compliant entities receive the same assurance for the same compliance based upon a set of rules they did not create and receive public recognition for complying with the rules demanded by society on a level playing field.

    3. Society establishes a set of rules for all: a level playing field of compliance and assurance on a set of rules that treat the individual in a fair way vis a vis the compliant entity during the exchange of goods and services.

ForHumanity is a non-profit organization whose mission is dedicated to mitigating risk to humans from AIs and autonomous systems, so equity for people, not for auditors and compliant entities, is our focus. However, a system that does not consider the economic impact of audit compliance on compliant entities and their ability to deliver goods and services, or the practical impact on auditors and their ability to achieve satisfactory assurance that an entity is compliant, would be useless. Equity exists in that balance, and in this balance and this structure a system exists to bring an #infrastructureoftrust to people.

  8. Morality

    1. The Auditor understands they act as a proxy for society and have a duty of care to assure compliance where it exists and withhold assurance when compliance is not achieved.

    2. The compliant entity willfully complies with societal rules in exchange for the right to earn a profit or the right to continue to deliver its  goods and services.

    3. Society makes rules and processes that satisfy their needs for trust, balanced against the economic needs of the compliant entity's ability to profitably or successfully deliver their goods and services.

AIs and Autonomous Systems facilitate a heightened level of human and personal inclusion in the actual processing. In financial transactions, only numbers flow through. Here, data, which is often representative of who we are, flows through the process. Our privacy is at stake, many of our biases are encoded into the systems, and some of our ethical decisions are abdicated to these systems. The work of Independent Audit of AI Systems endeavors to mitigate bias risk and uncover the ethics embedded in the systems. These mitigations are accomplished via the certification criteria. The criteria require disclosure, transparency, and explainability so that we may continue to increase our awareness of embedded bias, embedded ethical decisions, risks to privacy, potential trust issues, and areas of insufficient security for these systems. This way, we increase our ability to rectify these shortcomings and continue to mitigate risk to humans.

No system is perfect; there are no illusions about that among ForHumanity's Contributors and Fellows. We collectively believe that Independent Audit of AI Systems provides our best opportunity to build a dynamic process, with robust guardrails and transparent, human-centric rules designed to grow, change, and adapt to an extremely dynamic marketplace. We feel that this is the best plan to establish an #infrastructureoftrust for all of our AIs and autonomous systems. If you agree, come join us and help. If you disagree, come join us and make us better, because in the end, we are ForHumanity, and we are sure you are too.

About ForHumanity

ForHumanity is a 501(c)(3) tax-exempt public charity formed to examine and analyze the downside risks associated with the ubiquitous advance of AI and automation. To this end, we engage in risk control and mitigation and deploy the lens and filter of Ethics, Bias, Privacy, Trust, and Cybersecurity to ensure the optimal outcome…ForHumanity.

ForHumanity is an interdisciplinary group of dedicated expert volunteers, with over 200 contributors and 30 Fellows. Its collective expertise spans the AI field, ranging from ethics to algorithmic risk to security. Our team is drawn from the academic, legal, policy, corporate, and public sectors of over 35 countries around the world. Our mission is to help create an 'infrastructure of trust' for all autonomous systems that directly impact humans.

ForHumanity drafts comprehensive, pragmatic, and implementable audit rules and standards for autonomous systems in every corner of the economy. Our experts collaborate with industry practitioners to ensure these audits achieve our mission of mitigating AI risk to humans. This system of audit rules and standards, adapted to local jurisdictional laws and regulations, is called Independent Audit of AI Systems (IAAIS).

 

*** Laws are reactive. They are designed to deter would-be wrongdoers and punish criminals; however, they do not prevent people from being hurt. When ForHumanity talks about the word "audit," we do not mean simply a detailed, deep examination of financial records, accounts, or mechanisms. While this form of the word is the common understanding of audit, there is a bigger "audit," and that is the one we refer to here. When audit codifies the law (such as GAAP and IFRS codifying tax law, or ForHumanity codifying GDPR on behalf of the Information Commissioner's Office), it becomes a proactive application of the law. When audits are mandated by law (as is the case for most publicly traded companies), then compliance with the law happens before people are hurt, before laws are broken. Knowing an independent third party will be examining your compliance has a way of increasing adherence to the law. When audit rules are drafted to codify existing law, it is a more proactive application of the law and more protective of people.

 

References

[1] https://archive.fortune.com/magazines/fortune/fortune500_archive/snapshots/2001/478.html

 

Ryan Carrier is executive director of ForHumanity, a non-profit organization created to examine and mitigate the downside risks associated with Artificial Intelligence and Automation. Independent Audit of AI Systems is one such risk mitigation tool.


 
