✍️ Article by Tess Buckley and Luke Patterson
Tess Buckley’s primary research interests include the intersection of AI and disability rights (thesis), AI governance and corporate digital responsibility (EthicsGrade), amplifying marginalized voices in data through AI literacy training (HumansforAI), and the ethical and philosophical implications of AI music systems (Computational Creativity Collective). In particular, she seeks to use philosophical principles to make emerging technologies explainable and ethical.
Luke Patterson is a Senior Insights Analyst at EthicsGrade, a SaaS platform that assesses supplier and portfolio ESG risks of digitalisation, such as AI governance, data privacy, and cyber-security.
Overview: This is an explanatory piece on ‘Corporate Digital Responsibility (CDR).’ We will answer the following questions: What is CDR? Why does it matter? And why is it a priority?
What is CDR?
Simply put, Corporate Digital Responsibility (CDR) offers a framework through which we can standardize how we assess which corporations are using new technologies for good and which are not.
There are seven principles contained within CDR:
- Purpose and trust
- Fair and equitable access for all
- Promote societal wellbeing
- Consider economic and societal impact
- Accelerate progress with an impact economy
- Create a sustainable planet to live on
- Reduce tech impact on climate and the environment
As an example, one of the areas CDR principles seek to address is the ethical use of data. For companies such as Google and Facebook, data forms an integral revenue stream and can be repurposed in many ways to generate value. However, events such as the Cambridge Analytica scandal in 2018 showed how personal data can be used in ways that are politically and socially destabilizing and that infringe upon the human rights of users. This is a perfect example of one area of the digital world that CDR seeks to redress. By committing to the principles of CDR, a company is committing to using personal data ethically.
However, the principles of CDR extend beyond a set of reactive policies to protect society from the harms of digital technology. Companies following the principles of CDR are also committing to proactively harnessing the significant potential of digital technologies for social good, for example, by using machine learning (ML) technologies to reduce environmental impact. Often, digital ethics discourse is too heavily centered on mitigating and protecting against risk (e.g., the misuse of personal information, loss of jobs due to automation, and the perpetuation of harmful social biases). Yet digital ethics is just as much about realizing the enormous positive potential of new technologies (e.g., democratizing knowledge and the provision of education, making new, otherwise impossible human connections, and streamlining the workflows of businesses and public services). The principles of CDR are designed to help streamline the actualization of this potential.
It is important that we understand CDR as a set of principles that both inform and are used in conjunction with digital regulation. CDR is not a substitute for digital regulation. However, one of CDR’s benefits is that it places the onus on companies to engage with, design, and deploy new technologies with considerations of social and environmental risk and benefit integrated from the beginning. The development and enforcement of regulation cannot keep up with the pace of technological innovation, so it is essential that we have a set of principles and standards that hold corporations accountable before long-term regulatory frameworks arrive. Not only does this ensure companies act in a socially and environmentally responsible manner from the beginning, but it also helps inform how future regulation should be shaped for maximum impact.
Why does CDR matter?
CDR is both a moral imperative and a business strategy
CDR is a moral imperative because digital technologies are integral to our daily lives, embedded in and defining our culture, from how we communicate and access information to the products and services we provide and use. The public is becoming increasingly aware of the underbelly of AI and other technological risks, such as cyberattacks, perpetuating social biases, and privacy breaches.
By considering the tangible impact of technologies on their daily lives, individuals can prioritize and support the development of digital technologies in ways that benefit society rather than a few powerful stakeholders. Involving the public in the discussion of CDR helps promote greater social justice and equality. CDR matters for the ‘Average Joe’ because it affects their daily life in significant ways, and becoming involved in CDR communities helps promote greater accountability and democratic participation in the production of technology.
How can the ‘Average Joe’ get involved in CDR?
The challenge of getting involved in corporate digital responsibility is spelled out in its name: “corporate.” It is sometimes hard to see how one fits in unless one is a board or executive member who can directly influence a corporation’s digital responsibility. This is not to discourage everyone’s involvement, but to acknowledge that some positions allow for more tangible influence on the issue. The ‘Average Joe’ still has a part to play in the ecosystem; here is a short list of things to consider when engaging with digital technologies responsibly…
· AI literacy training: An ongoing problem in AI ethics is the information gap between the makers and users of technology. It is becoming ever more important to be an informed consumer.
· Exercising consumer habits: Once you have researched the digital practices of the products or services you engage with, you are empowered to make financial decisions that align with your values. In the same way that we may withhold financial support from a company because of its use of child labor, we may also choose not to support a company because it is not prioritizing digital responsibility or protecting user privacy.
· Supporting CDR advocacy: You can also choose to donate your time, energy, or finances to organizations that promote CDR. Organizations promoting corporate accountability can be great places to attend events, or you can simply spread the word about their work.
· Protecting your online presence: Now that you have learned the “101 of CDR,” consider using strong passwords and enabling two-factor authentication on all your online accounts to protect your digital security.
· Educate others: Now that you know more about CDR, share the news! Educate your friends, family, or coworkers about digital responsibility and encourage them to act accordingly. It is important to be open to discussing how the digital practices of businesses affect everyone’s digital security and privacy.
The business case for CDR can be seen in the trust earned from customers after demonstrating a commitment to ethical, accountable, and explainable digital practices. As the public becomes increasingly aware of and concerned about the risks of emerging technologies, corporations that fail to address those concerns risk reputational damage and loss of customers. Companies prioritizing CDR are more likely to avoid costly legal and regulatory action by anticipating the unintended consequences of deployed technologies through defensive design. Taking a responsible approach to digital technology and demonstrating a commitment to digital responsibility through products, services, and business practices is critical for protecting consumers’ privacy and security, ensuring equity, mitigating unintended consequences of systems, and fostering sustainable trust.
CDR allows for…
- Protecting consumers’ privacy and security through the responsible safeguarding of personal data such as email addresses and online activity tracking. Safeguarding can take the form of considering how data is used, stored, and shared. We now expect, and deserve, to have more control over our personal data. Companies often fail to respect this, and governments (with notable exceptions such as the GDPR) have yet to enact hard laws making it non-negotiable.
- Digital accountability can help ensure equity. As we increase our use of emerging technologies, we must ensure they are free of unintended and harmful biases, which could be amplified at scale. Corporations that fail to account for fairness in practice risk perpetuating or exacerbating societal inequalities.
- Defensive design offers further protection against the unintended consequences of systems. Companies that take a responsible approach to digital technology are more likely to anticipate and mitigate these consequences, such as misinformation and tech scandals.
- Trust and long-term sustainability between makers and users of technology. When corporations consider their digital responsibility to stakeholders, they build strong relationships, which can lead to increased loyalty and, eventually, revenue. Additionally, CDR supports sustainability over time, as companies that practice it are less likely to be subject to public backlash, regulatory action, or other risks associated with irresponsible digital practices.
Why is CDR a priority?
CDR is now a necessity rather than a luxury
The pace of technological change calls for a proactive approach to digital responsibility to avoid negative impacts on society and the environment. When prioritized, CDR addresses urgent issues facing companies, such as the protection of personal data, the exacerbation of bias, and political polarization.
Urgent issues facing companies today that are addressed with CDR best practices…
- The need to protect personal data. With the increasing prevalence of cyber-attacks and data breaches, the public faces growing concern about the security of their information. Companies that fail to implement strong security measures, as set out in CDR best practice, risk losing their customers’ trust and their positive reputation.
- The perpetuation or exacerbation of societal inequalities. If these technologies are not designed to avoid biases based on disability, sex, race, or other factors, they could perpetuate human bias at scale, ingraining our injustices in machines and lending them a veneer of scientific credibility.
- The unintended ESG consequences of digital technology. This can be seen in social media platforms facilitating echo chambers, which exacerbate the spread of disinformation and political polarization. Companies that fail to anticipate and mitigate these unintended consequences with pre-production initiatives (defensive design) risk contributing to social and environmental harm and acquiring reputational damage.
CDR is a priority that must appear on board agendas. Companies must take a proactive approach to digital responsibility to avoid negative impacts on the public, society, and the environment, as well as on themselves through legal and reputational risks. By prioritizing CDR, companies can build trust with their customers, create long-term value in responsible AI systems, and help ensure a sustainable future for their business and society.