Elicitation and Empathy with AI-enhanced Adaptive Assistive Technologies (AATs)

How to Cite

McDonald, N., Massey, A., & Hamidi, F. (2023). Elicitation and Empathy with AI-enhanced Adaptive Assistive Technologies (AATs): Towards Sustainable Inclusive Design Method Education. Journal of Problem Based Learning in Higher Education, 11(2), 78–99. https://doi.org/10.54337/ojs.jpblhe.v11i2.7667

Abstract

Efforts to include people with disabilities in design education are difficult to scale, and the dynamics of participation must be carefully planned to avoid placing unnecessary burdens on users. However, given the scale of emerging AI-enhanced technologies and their potential to create new vulnerabilities for marginalized populations, new methods are needed for generating empathy and self-reflection in technology design students, the future creators of such technologies. We report on a study in which Information Systems graduate students used a participatory elicitation toolkit to reflect on two cases of end-user privacy perspectives towards AI-enhanced tools in the age of surveillance capitalism: their own perspectives when using tools to support learning, and those of older adults using AI-enhanced adaptive assistive technologies (AATs) that help with pointing and typing difficulties. By drawing on the experiences of students with intersectional identities, our exploratory study aimed to incorporate intersectional thinking into privacy elicitation and to further understand its role in enabling sustainable, inclusive design practice and education. While aware of the risks to their own privacy and of the role of identity and power in shaping experiences of bias, students who used the toolkit were more sanguine about the risks faced by AAT users, assuming that more data equates to better technology. Our toolkit proved valuable for eliciting reflection but not empathy.



Articles published in the Journal of Problem Based Learning in Higher Education are licensed under the Creative Commons Attribution 4.0 International License (CC-BY). Authors retain copyright and grant the journal right of first publication, with the work simultaneously licensed under CC-BY.