Workplace AI Wants to Help You Belong – Stanford Social Innovation Review


Can the new suite of digital surveillance tools help to create more just and equitable workplaces?
By Genevieve Smith & Ishita Rustagi Sep. 14, 2022
(Illustration by iStock/Who_I_am)
Picture it: At 11 a.m. on a Thursday, you get a personalized Slack notification prompting you to connect with a colleague you haven’t seen in a while. Then, at a midday team meeting on Zoom, you are alerted about who is speaking up less, so you can invite them to contribute. Later that day, while you are writing, an AI-powered plugin prompts you to use “chairperson” instead of “chairman.” The next day, in preparation for a quarterly check-in with a supervisee, you look at a dashboard that shows how people on your team are doing (data from pulse surveys and “listening tools” like text analysis, video, and always-on surveys suggest that your team feels highly connected to you and other teammates through one-on-ones, but may be feeling burnt out).
Welcome to a new era of workplace digital surveillance and AI. Are you ready to belong?
When so many in-person offices went remote after the pandemic—with meetings and communications abruptly facilitated digitally—it became newly possible to collect, analyze, and leverage incredible amounts of workplace data. And with this employee data has come a major uptick in new digital tools to inform employee engagement and performance management. At the same time, organizations have been responding to new and louder calls for diversity, equity, inclusion, and belonging (DEIB) at work: Persistent disparities related to who is represented within organizations, and particularly leadership roles, continued to reinforce and illustrate longstanding systemic inequities in society and organizations along lines of race, gender, sexual orientation, socio-economic status, and more. Unsurprisingly, then, tech companies have begun exploring the role that technology and the newly available data troves could play in measuring and/or enhancing organizational DEIB efforts, surveilling employees in order to enhance belonging.
Belonging goes further than inclusion: It is about feeling meaningfully connected to and part of the organization. And the importance of belonging cannot be denied. In the past, survival literally depended on building connections with others to overcome threats and stresses, and humans thus have an evolutionary need to belong. In the last few years, isolation and lack of belonging have fueled a growing mental health crisis, while the lack of belonging has been identified as a key driver behind the “great resignation.”
Is the answer AI-powered tools for workplace surveillance? What are these tools and what are the opportunities they provide? What, if any, are the unintended consequences that their use might bring? To what extent are these tools “for good” also legitimizing employee over-surveillance? Can we ensure that such DEIB tools actually advance equitable and just outcomes?
Digital workplace surveillance to monitor employee productivity has long been ubiquitous for warehouse and logistics workers, such as UPS drivers, but employee engagement and productivity tools are now expanding rapidly among knowledge workers. The New York Times found, for example, that eight of the 10 largest private US employers track the productivity of individual workers. Some of these tools are building in features related to advancing internal DEIB, while new tools focus explicitly on DEIB goals.
Our analysis of workplace technological tools—especially those using AI—focused on those with stated goals around “belonging” (given its centrality to advancing equity in the workplace). The 34 tools we mapped vary in size and scope, but all have stated goals linked to belonging and are currently reaching employees and workplaces across the globe. Their customers span a variety of industries, ranging from startups with fewer than 1,000 employees (such as Axios) to companies with 5,000–10,000 employees globally (such as Spotify, Twilio, and Virgin Atlantic), as well as large corporations like Microsoft, Unilever, and Vodafone, which have over 100,000 employees.
Three types of tools emerge:
Data analytics tools that measure or assess belonging collect real-time information for organizations to understand who employees are connected to and communicating with, their levels of inclusion, how engaged they are, and how they are feeling. They do this through a range of tools, which may provide surveys and assess responses, capture regular pulse checks, and/or track meeting data. More technically complex services include tracking and analyzing communication metadata (ranging from internal emails and messages to external reviews on sites like Glassdoor), using sentiment analysis to assess emotions in qualitative survey data, and mapping employee networks to assess who is connected to whom. While only some of these tools currently use AI, many continue to explore ways to integrate AI into their solutions.
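To make the mechanics concrete, here is a minimal, illustrative sketch (not any vendor’s actual implementation) of two techniques described above: mapping who communicates with whom from message metadata, and lexicon-based sentiment scoring of open-ended survey responses. The message records and word lists are hypothetical, and real tools would use trained models rather than a hand-built lexicon.

```python
# Illustrative sketch of employee network mapping and sentiment scoring.
# All data and word lists below are invented for demonstration.
from collections import Counter
from itertools import combinations

def build_network(messages):
    """Count undirected sender-recipient ties from message metadata."""
    ties = Counter()
    for msg in messages:
        # Every pair of participants in a message strengthens a tie.
        for pair in combinations(sorted([msg["from"], *msg["to"]]), 2):
            ties[pair] += 1
    return ties

# Toy sentiment lexicon; production tools use trained models instead.
POSITIVE = {"connected", "supported", "valued", "great"}
NEGATIVE = {"isolated", "burnt", "overwhelmed", "ignored"}

def sentiment_score(text):
    """Naive lexicon score in [-1, 1]; 0.0 when no lexicon words appear."""
    words = text.lower().split()
    hits = [1 for w in words if w in POSITIVE] + [-1 for w in words if w in NEGATIVE]
    return sum(hits) / len(hits) if hits else 0.0

messages = [
    {"from": "ana", "to": ["ben"]},
    {"from": "ben", "to": ["ana", "chris"]},
]
network = build_network(messages)
print(network[("ana", "ben")])   # tie strength between ana and ben
print(sentiment_score("I feel valued and connected"))
```

Even this toy version shows why such tools raise privacy questions: the network map alone reveals who talks to whom, before any message content is analyzed.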
Tools that seek to enhance belonging in organizations encourage behavior change, often through digital “nudges.” “Nudge theory” is a behavioral-economics concept holding that positive reinforcement and indirect suggestions can influence people’s actions and thinking. These nudges—sent over email, text message, Slack, and more—can be customized and context-based. A majority of digital nudging tools leverage machine learning to personalize nudges based on individual communications, meeting information, and other internal data. These nudges can deliver tips on DEIB and wellbeing topics, prompt inclusive interpersonal behavior or learning around DEIB, and suggest inclusive language and work practices specific to certain roles or functions. Beyond nudges, some tools also provide a platform for employees and managers to share recognition, praise, and other forms of positive reinforcement.
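A personalized nudge, at its simplest, is a function from an employee’s behavioral signals to a prompt. The sketch below is a hypothetical rule-based version of the kind of selector described above; the vendors in our mapping reportedly use machine learning on meeting and communication data, and the thresholds, signal names, and messages here are invented.

```python
# Hypothetical rule-based nudge selector; thresholds and messages are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Signals:
    days_since_one_on_one: int   # derived from calendar metadata
    speaking_share: float        # fraction of meeting airtime, 0..1

def pick_nudge(name: str, s: Signals) -> Optional[str]:
    """Return a personalized prompt for a manager, or None if none is warranted."""
    if s.days_since_one_on_one > 14:
        return f"You haven't met 1:1 with {name} in a while - schedule a check-in?"
    if s.speaking_share < 0.05:
        return f"{name} spoke little in today's meeting - invite them to contribute?"
    return None

print(pick_nudge("Sam", Signals(21, 0.30)))
print(pick_nudge("Riya", Signals(3, 0.02)))
print(pick_nudge("Lee", Signals(3, 0.30)))  # no nudge needed
```

Note that even this simple selector depends on continuously collected calendar and meeting data, which is precisely the surveillance trade-off discussed below.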
These technologies have the potential to deepen understanding of and advance DEIB efforts within organizations, while also making those efforts more efficient, cost-effective, and scalable. However, there are also critical concerns about tools that leverage personal data to draw insights and drive personalized behavior change.
At a higher level, we are concerned about over-surveillance conducted in the name of DEIB. While these tools collect data with the positive aim of advancing DEIB, they are still acting as surveillance tools in personal spaces. Even when developed for purposes of “good,” surveillance can be an invasion of privacy and ultimately fuel workplace control. Moreover, surveillance has long disproportionately targeted marginalized communities, particularly Black and Brown communities in the United States, perversely enabling more precise discrimination.
To be clear, not every tool we mapped falls prey to these concerns. Everyday Inclusion, for example, provides employees with un-customized, science-based “inclusion nudges” while Donut simply randomizes employee connections, and thus, tools like these do not raise the concerns we outline here. It is when tools start to leverage personal data to draw insights and drive personalized behavior change that we urge leaders to consider the potential pitfalls in addition to their potential.
Social change leaders must be attentive to the types of technologies they are using, supporting, investing in, or funding in the name of DEIB. While tools to advance belonging can be helpful, they must be developed and managed with extreme consideration and caution if they are to result in more just and equitable outcomes. Social change leaders must ask critical questions.
It’s easy to believe that technology can solve intractable issues like lack of belonging and inequality at work across different identities. However, we must be careful regarding the promises of technology and AI. Tools like these can indeed be helpful, but as social change leaders we must demand more and ask critical questions to better understand what the potential implications of such tools can be, and how power is replicated within and through such technologies. We can also support innovations and teams that center justice as a core value and priority from design through management.
Ultimately, increasing surveillance and AI in the name of DEIB is a dangerous game. Thoughtful, curious, and intentional social change leadership and investment is required to help advance and push for tools that can truly create more just and equitable workplace environments. But in some cases, the main question is: Should this tool be developed at all?
Read more stories by Genevieve Smith & Ishita Rustagi.
Genevieve Smith is the associate director at the Center for Equity, Gender & Leadership (EGAL) at the UC Berkeley Haas School of Business. For more than a decade, she has conducted research at the intersection of gender equity, justice, and technology. She is the lead author of EGAL’s playbook on mitigating bias in AI and playbook on advancing belonging in the workplace.
Ishita Rustagi is a senior analyst at the Center for Equity, Gender & Leadership (EGAL) at the UC Berkeley Haas School of Business, where she supports the development of resources, tools, and thought leadership to advance diversity, equity, and inclusion. She is the co-author of EGAL’s playbook on mitigating bias in AI and playbook on advancing belonging in the workplace.


