Digital Technology in Human Services
Professor Michael Lithgow
MAIS 623: Digital Humanities
In the last fifty years, digital technology has modified entrenched methods of providing human services. Using examples from my own experience, I demonstrate that technological applications can be beneficial but can also marginalize already fragile populations. In the future, the digital humanities could intervene in this kind of degradation.
Technology as a material tool, and meaning as symbolic construction, through relationships of production/consumption, experience, and power, are the fundamental ingredients of human action – an action that ultimately produces and modifies social structure (Castells, 2000, 9).
I was born in the late 1950s; colour TV arrived in the mid-1960s. In the late 1970s, cable TV came to our remote Yukon community. Fifty years later, society is virtually networked; the world has “entered a new technological paradigm, centred around microelectronics-based, information/communication technologies…” (Castells, 2000, 9).
In the 1980s, the municipal office bought two Apple II computers for word processing and bookkeeping. The Apple II computers ran Peachtree Software, the first bookkeeping software developed for small accounting applications, released in 1978 (https://careertrend.com/about-6328213-history-computerized-accounting.html). Backups were performed on Friday afternoons by transferring computer files to floppy disks, in between numerous bouts of Hitchhiker’s Guide to the Galaxy, a text adventure (or interactive fiction) game released in 1984.
Social changes followed: users were shaped by learning this technology, operating computers, simplifying bookkeeping procedures, and allowing relatively untrained workers to take on preparing financial documents and later, on other machines, even to work in DOS. The ramifications spread outside the office: the municipal accountant and senior government staff were circumvented as they became overseers rather than preparers of documents that were now completed in-house. In 1982, an astute marketing push placed Apple computers in schools; Apple had a strong (albeit, it eventually seemed, purely commercial) interest in ensuring that updates were based on the needs of users as more and more people came to use the machines in different circumstances.
Human Services Work
In the 1990s, I worked at a storefront school that catered to marginalized youth and adults who had not completed high school. Although textbooks supported the curriculum, the primary educational tools were digital: linked and online computers. With an English major and a background in bookkeeping and municipal government issues, I found myself furiously flipping from task to task: English 9, Physics 12, Accounting, Communication 11, Literature 12. Traditional school time frames were circumvented; I worked days, evenings and sometimes summers. High school education crossed traditional demographics; students ranged from fifteen-year-old single moms to one fellow who graduated in his eighties, from intellectually capable people to residents of group homes for people with mental disabilities, people who came to school at night or in the morning, home-schooled children, and children from religious and cultural communities who did not attend public high schools. I became a peer with the students as we simultaneously explored the myriad mysteries of algebraic calculators. Exams were largely online; some essay work was required, but students were largely responsible for their own education. Strict job categories and teaching protocols were circumvented; I did not have to be the know-it-all, because the information was on the machines. It was not always clear who was helping whom, as I frequently had to ask students for help or look through other schools’ online curricula for help with my scholastic arch-nemesis: any math or physics above the Grade Nine level. In other cases, I co-taught, two teachers hunkered down over an inexplicable Physics 12 question while working with one student. Students loved it; teachers loved it. In this case, “…social classes, as constituted, and enacted in the Industrial Age, cease to exist in the network society” (Castells, 2000, 9).
As Lanier (2010) states, “it is impossible to work with information technology without also engaging in social engineering” (4). The boundaries between previously separate areas were being easily crossed through commonality of cause and use of tools: educators took on the role of facilitators, all the students studied in the same room on any topic, and students helped each other, worked largely at their own pace, took breaks to visit their infants in the day care attached to the facility, or worked in small, self-regulated groups of learners. Teachers did not sit in loco parentis over a class of thirty largely unengaged individuals.
During the five or so years I worked there, teachers were constantly tasked with changing written and technological curriculum materials to better support the students; websites and login applications changed, mail-order courses were phased out, courses were amended to better meet students’ individual aspirations, and more and more courses became available online.
Then, in a time of education cutbacks in BC, this avant-garde school was shut down, I became unemployed and, ironically, I became an employment counsellor. The instructional format was initially much the same: online and textbook curriculum supported by human facilitators. Over the next twenty years, however, the unchecked and inconsiderate use of digital technology, coercion in the guise of shaping, had negative ramifications in the world of human services work.
Understanding involves a flow of influence from the domain of interest to a scientist, altering how the scientist views the domain. Shaping is a creative activity that goes in the reverse direction, with influence flowing from the scientist to the domain, resulting in alterations to the domain itself (Rosenbloom, 2016, 223).
Perhaps it was not the scientists themselves who turned our clients into digits in a database. It was the application of the principles of Taylorization without consideration of the humans in human services work. Digital technology does “not preclude exploitation, social differentiation and social resistance” (Castells, 2000, 9). Or, as Galloway (2010) asserts, there is a price to be paid for inclusion in the network (293).
Part of the job was to assist older, unemployed workers to re-learn “reading, ’riting and ’rithmetic” on some type of digital technology or lose their employment prospects. These individuals grew up in the one-job-for-life environment and were uncomfortable with all this flexibility and technology. They balked when thirty years of skills needed to morph into an online, pared-down resume tailored to hide one’s age. They felt reduced; they had no real choice; their skills were no longer relevant, and government assistance would not be forthcoming unless they changed their habits and skills. Although Lanier (2010) states that “Technologists don’t use persuasion to influence you” (5), people were coerced by the technology applied inside the various employment and assistance programs to reduce themselves into statistics in a database or face the threat of diminished immediate financial and future employment prospects. Although Lanier (2010) is here referring to Facebook, the same principle applies: “Am I accusing all those …users of …networking sites of reducing themselves in order to be able to use the services? Well, yes I am” (53). But what choice did these older workers have?
In 2014, in one of my last positions in human services, I worked for an agency that specialized in helping people with disabilities who were interested in finding employment. As a worker, I taught classes, lecture style, five hours a day, to groups of eight to ten people. One class had a blind student, a few students on probation and individuals with intellectual disabilities; for the two weeks I lasted in this position, I felt entirely inadequate to meet the needs of the individuals in the room with any compassion or integrity. Record keeping at the end of the day consisted of each individual receiving an x for attendance along with some reductionist anecdotal information, twenty words or less, which I put into the computer. Contractors were paid by the x’s.
I also watched potential BC income assistance clients walk away from the five-page online digital document they were expected to fill out as the first step in applying for income assistance, a form that initially befuddled me as well. Was this difficult digital form providing cost savings for government by effectively discriminating against the people who might be eligible for benefits? It reminds me now of the problems with Florida’s voting system in the last US federal election, a system that was purportedly set up to disenfranchise less educated voters.
Although Lanier (2010) is talking about social media, I feel the principles he espouses are relevant here. In “The Abstract Person Obscures the Real Person” he states:
…there is a brittleness to the types of connections people make online. This is a side effect of the illusion that digital representations can capture much about actual human relationships…. set up a rigid representation of human relationships on a digital network …and that reduction of life is what gets broadcasted…. (70-71)
There was an illusion of ethics (Lanier, 2010, 65); the government was pouring resources into new, updated digital technology, supposedly to be more efficient at helping marginalized people find work or access benefits. The actual result was that individuals’ lives were turned into a database. Lanier (2010) goes on to say that this degradation is a philosophical mistake, a belief that computers can represent human thoughts or relationships (69).
David Berry and Anders Fagerjord (2017) echo this when they say:
These are not merely theoretical concerns but are of utmost importance for digital humanists to be aware of, if they are not to allow these new forms of Taylorist practices to colonize the humanities. Digital humanists cannot pretend that contemporary contestation over automation, proletarianization and precarity are not relevant to their practices and discipline. Rather, as the vanguard of many of these new technologies and techniques, it is paramount that they are able to balance these powerful and new methods with the continuities required for ensuring that the ways of knowing developed in the humanities are preserved even as they are extended. (14)
I’ve been out of the game for several years, but my ex-coworkers tell me it hasn’t changed. Human services work, commonly called ‘the industry’ (but let’s not do that anymore), has come to the place where “computational education … focuses perhaps too heavily on the instrumental dimension of computation…”; to change that, there must be a change in focus, to “supplement and broaden them [existing technologies] so that they include humanistic modes of thought and practice” (Berry & Fagerjord, 2017, 2).
The Next Five Years
To effect positive change, “…actors will have to challenge the network from the outside and in fact destroy it by building an alternative network around alternative values” (Castells, 2000, 19). Part of the issue is the ‘one size fits all’ mentality. For some people, digital technology is wonderful; for others, it is a demon. Perhaps, through the construction of alternative networks, society will see practitioners successfully provide cost savings by averting face-to-face specialist referrals while still providing end-user satisfaction (Lai et al., 2018).
Digital technology in human services expedited service delivery at the cost of the actual service: engaging with humans. Human services work needs to be rife with opportunities to put the ‘human’ back in. Not only does technology change the social world of humans; humans must also engage with technology to ensure that they are not entirely dehumanized by these computational approaches.
Castells (2000) refers to the information networks of “capital, production, trade, science, communication, human rights, and crime” (18) that have endangered the sovereignty of the nation-state by bypassing it. Digital humanities can be a catalyst for change in human services. “Digital humanities remains focused on the research questions that are drawn from the humanities, even whilst working in and through computational approaches” (Berry & Fagerjord, 2017, 8). I look forward to those changes.
Berry, D. & Fagerjord, A. (2017). “On the way to computational thinking,” in Digital Humanities: Knowledge and Critique in a Digital Age. Malden, MA: Polity.
Castells, M. (2000). Materials for an exploratory theory of the network society. British Journal of Sociology, 51(1), 5-24. doi:10.1080/000713100358408
Galloway, A. R. (2010). Networks. In Mitchell, W.J.T. & Hansen, M. B. N. (Eds.) Critical Terms for Media Studies. (pp. 280-291). Chicago: University of Chicago Press.
Lai, L., Carsen, S., Kontio, K., Fournier, A., O’Connor, M., Gandy, H., … Kurzawa, J. (2018). The impact of electronic consultation on a Canadian tertiary care pediatric specialty referral system: A prospective single-center observational study. PLoS ONE, 13(1), 1-13. doi:10.1371/journal.pone.0190247
Lanier, J. (2010). You Are Not a Gadget: A Manifesto. New York: Alfred A. Knopf.
Rosenbloom, P. (2016). Toward a conceptual framework for the digital humanities. In M. Terras, J. Nyhan & E. Vanhoutte (Eds.), Defining Digital Humanities: A Reader (pp. 219-236). Burlington, VT: Ashgate.