✍️ Column by Natalie Klym, who has been leading digital technology innovation programs in academic and private institutions for 25 years including at MIT, the Vector Institute, and University of Toronto. Her insights have guided the strategic decisions of business leaders and policy makers around the world. She strives for innovation that is open, creative, and responsible.
This is part 4 of Natalie’s Permission to Be Uncertain series.
The interviews in this series explore how today’s AI practitioners, entrepreneurs, policy makers, and industry leaders are thinking about the ethical implications of their work, as individuals and as professionals. My goal is to reveal the paradoxes, contradictions, ironies, and uncertainties in the ethics and responsibility debates in the growing field of AI.
I believe that validating the lack of clarity and coherence may, at this stage, be more valuable than prescribing solutions rife with contradictions and blind spots. This initiative instead grants permission to be uncertain if not confused, and provides a forum for open and honest discussion that can help inform tech policy, research agendas, academic curricula, business strategy, and citizen action.
Interview with Alex Shee, Corporate Development & Strategy, Sama
The bulk of machine learning activities involves the tedious work of data preparation, including annotation and labeling. Somewhat ironically, much of this work is performed by humans. It is typically outsourced to companies – data suppliers – operating in low-wage countries with little to no standards or regulations.
More fundamentally, the occupation itself has been obscured and undervalued. As Iva Gumnishka, Founder and CEO of Humans in the Loop, writes in her insightful piece, Who is labeling your data? (which I highly recommend), “data is the critical infrastructure necessary to build AI systems… And yet, novel model development is the most glamorized and celebrated work in AI, while data labeling is widely considered grunt work.”
These circumstances are similar to the early days of digital tech when unacknowledged and underpaid women engaged in the mental labor of computation before computer machines emerged. (See Margot Lee Shetterly’s Hidden Figures and Claire Evans’s Broad Band for accounts of this history.)
As awareness of these exploitative conditions in today’s AI industry grows, organizations are pushing for responsible sourcing practices — pressure aimed both at the data suppliers themselves and at the AI companies for whose business they compete.
Beyond providing living wages, employee benefits, and opportunities for career advancement, the efforts of these organizations increasingly expose, validate, and professionalize the occupation, making it less vulnerable to unfair practices.
In this interview, I explore how one such company, Sama (formerly Samasource), is leading these initiatives.
NK: Alex, tell me more about your mission to “help lift tens of thousands out of poverty.” Where did this idea come from/what was the motivating factor?
Sama was founded on the premise that talent is equally distributed, but opportunity is not. We aim to level the playing field for women and youth from historically marginalized backgrounds by providing training and employment opportunities in the digital economy through a social business model. It has helped over 60,000 people break the cycle and thrive.
The idea for Sama came around 2008, when our late founder Leila Janah was working as a manager at a call center in Mumbai. During a conversation with one of her colleagues, she learned that he commuted to the office daily from his home in Dharavi, one of India’s largest slums. While Dharavi had an active informal economy, access to steady, formal employment was limited. Leila asked herself a simple question: what if we could locate the call center in a community where jobs were most needed? From there, Samasource — later to become Sama — emerged.
From those early beginnings, Sama developed the first version of our platform using social enterprise models, which were cutting-edge at the time. And now, 14 years after those initial thoughts, the platform has prospered, and we have operations across three continents.
The heart of our business model is simple. We want to do our part to level the playing field by providing people from historically marginalized communities with the opportunity to build skills and earn a living wage in the digital economy. We envision Sama as a bridge, providing an entry point into formal employment that supports a sustained lift out of poverty over time.
We work continuously to impact communities and our team members positively. In 2020, we were proud to be the first AI firm to become a Certified B Corp, joining a growing community of businesses that meet the highest standards of social and environmental performance, public transparency, and legal accountability. In 2021, we were recognized as one of B Corp’s “Best for the World” for our commitment to our workforce. This was an enormous honor.
We intentionally provide training and employment opportunities to individuals who, for reasons beyond their control, have been left out of the formal economy. Many members of our workforce lived below the poverty line making $2 a day, were unemployed, or did only casual work before joining Sama. Opportunities to work in the digital economy are rare.
We are incredibly proud to say that we have impacted the lives of 60,000 people — that is, employees and their dependents — since we started. When they join us, new staff members typically see a 360% increase in their earnings. We pay a living wage and also provide healthcare and other benefits.
I have seen firsthand the incredibly positive impact on team members in Kenya and Uganda. Their stories are the guiding light for the work that we do. They want opportunities to make their lives better. Sama wants to help them transition into the digital economy so that they can flourish and provide themselves and their families with educational, health, and work opportunities.
NK: What has the impact been on your employees? On the local communities more broadly?
This increase in earnings has a ripple effect that goes beyond our staff. Since many of them support their siblings, their parents, and their children, new possibilities open up. School fees can now be paid, as can rent and other living expenses.
Since we started in 2008, we have been tracking the impact of our efforts. Beginning in 2017, we partnered with MIT and Innovations for Poverty Action to understand that impact in the most objective way possible: through a randomized controlled trial. The study found that people trained in Sama’s hiring pool outearned others from similar backgrounds and with similar education levels. And the women employed by Sama earned an average of 60% more than other women in the study group.
NK: How do you reconcile critiques of cheap data labeling work with your vision of economic development? (See, for example, MIT Technology Review’s article from April 2022, “How the AI industry profits from catastrophe.”)
The minimum wage and the cost of living do not necessarily align. Let’s look at Kenya. In Nairobi, the minimum monthly wage last year was $119 USD, but the estimated cost of living for a family of five was $394.
To bridge that gap and help shift people out of poverty, we want to pay all members of our workforce a living wage standard for the region where they live. To do this, we conduct annual living wage benchmarking exercises using widely accepted methods from the field and international standards.
But better wages alone are not enough to make an impact. Team members in East Africa are full-time employees with access to health insurance, pension plans, subsidized meals, paid leave for new parents, and more. As part of the Sama team, employees also have access to health and wellness programs, such as psychological support and wellness breaks. In the office, we offer healthy meals and snacks, meditation rooms, new mother rooms, and game rooms.
This is in stark contrast to our competitors, who offer contractual work with no job security, benefits, or safety net. We want to be a counter to our competitors’ model and ensure that the people who work for Sama have the stability they need to find a path toward long-term work opportunities.
NK: Have you experienced pushback from your employees or labor activists? If so, how have you responded?
Any new idea, approach, or dream is bound to encounter opposition. And Sama’s mission, especially given its ambition, is no different. We welcome criticisms and actively work to address them and improve.
We are always looking for the opportunity to have more conversations about this topic. Our feedback loop is intentionally designed to ensure transparency and safety for all Sama team members. This includes open-floor meetings for collaborative conversation and more discreet processes to protect anonymity whenever desired.
We work off a set of guidelines around who we will work with and under what conditions — key to ensuring that our core values are always maintained. Our Series B lead investor, CDPQ’s Equity 25³, holds us accountable to targets. At the same time, we work with industry leaders such as the Haas Center for Equity, Gender, and Leadership and Partnership on AI to advocate for more ethical supply chains, responsible sourcing, and dignified employment for humans-in-the-loop.
Despite the measurable positive impact we have on our workers, there has been one well-publicized complaint by a former employee who worked on our content moderation team. We take feedback from our employees about their experience very seriously. Still, it is frustrating — most of the claims are entirely untrue, and the coverage has presented a distorted view of our operations.
We can’t say much about it as it’s an open case, but it relates to our content moderation work, which is a small fraction of our operation (less than 2%). Content moderation work is a necessary job foundational to social media, but it is also challenging for the workers carrying it out. Though we’ll likely move away from it soon as it’s not our core business, we want to be part of the solution to improve it.
NK: One scholarly article, The Limits of Global Inclusion in AI Development (which mentions Samasource), proposed solutions for how to transform a system of exploitation into one of genuine economic development, e.g., providing opportunities for upward mobility (from data labeling to model development and from model development to managerial roles). Does Sama provide opportunities for training and development for its employees?
Absolutely. There is an extensive system of informal employment in Africa, where an estimated 8 out of every 10 workers lack the protection and stability of the formal sector. Coming to work for Sama is an entry point to formal employment for many of our workforce.
Before they join, we ask staff about their backgrounds to see how working for us could positively impact their lives.
We also have a professional development platform, Sama U, where the team can get training in everything from basic digital literacy and the future of AI to how to use tools like Slack and Google Workspace.
And research shows this is working. The MIT and Innovations for Poverty Action randomized controlled trial found that after three years, individuals who were trained and included in Sama’s hiring pool had lower unemployment rates and higher average monthly earnings than those who received training only.
There is also upward mobility within Sama. We see people who come in at entry-level positions and go on to lead large teams and projects for Fortune 100 companies. This reflects Sama’s active work to create an environment with genuine growth opportunities. We want our staff members to have the chance to grow and acquire the skills they need to thrive.
NK: Any first-hand accounts of the market for data labeling would be fascinating.
We are very proud of the work we do with customers and the impact that it has on their industries. Look, for example, at the Dutch company Orbisk. It gives the people who run professional kitchens a clear picture of how much food is wasted so that they can make decisions about their supply chains.
And Project Guideline — this is an early-stage research project by Google that explores how on-device machine learning can help people with reduced vision walk and run for exercise independently.
Meanwhile, PolyPerception provides a waste management platform to plastics and material recovery facilities, analyzing their waste streams to operate more efficiently and responsibly.
Finally, Vulcan is committed to better protecting wild plant and animal species and their habitats by using AI for wildlife conservation.
NK: Anything else you want to share before you go?
We want to see a future where all companies, large and small, embrace a triple bottom line. We want AI companies to embrace responsible sourcing across their supply chains. Regulation is coming, and suppliers who are not actively thinking about the ethical treatment of data professionals will be left behind.