Current Issue

Volume 8 Issue 1/2 (Fall 2016) 


Issue Preview by Special Issues Editors

Lee-Ann Kastman Breuch, University of Minnesota

Victoria Sadler, Metropolitan State University

Programmatic Research in Technical Communication: An Interpretive Framework for Writing Program Assessment

Nancy W. Coppola, New Jersey Institute of Technology

Norbert Elliot, New Jersey Institute of Technology

Faye Newsham, Scope Infotech, Inc.

Postscript by Andrew Klobucar, New Jersey Institute of Technology

Abstract. Important advances have been demonstrated in the assessment of writing programs. In this paper, we identify an application of programmatic research based on an accountability framework for writing program assessment. Termed Design for Assessment (DFA), the application is a form of relational modeling that allows a postsecondary institution to identify and fashion the variables that impact its writing program. To demonstrate its benefits, we review contemporary views of program assessment, explicate the interpretive features of the framework, and describe a case study application. We close with heuristic questions for program assessment attentive to stakeholder contributions. A postscript from the current program director at our institution provides a reflective statement on the need for evidence-based responsiveness in writing program design.

Keywords. Design for Assessment (DFA); empirical methods; program assessment; technical, scientific, and professional communication; variable modeling


Connecting Programmatic Research with Social Media: Using Data from Twitter to Inform Programmatic Decisions

Chris Lam, University of North Texas

Mark Hannah, University of Arizona

Erin Friess, University of North Texas

Abstract. Traditional data sources provide technical communication programs with a variety of useful decision-making metrics. However, many of these sources are constrained by factors such as the misalignment of institutional and programmatic goals and the summative nature of programmatic data and its subsequent application. Therefore, we argue for the use of data from Twitter to inform decisions about curriculum, assessment, and long-term programmatic vision. In this article, we outline relevant research questions related to programmatic decision making and then describe how to collect, analyze, and apply Twitter data to answer those questions.

Keywords. social media, Twitter, curriculum, assessment, data analysis

Disempowered Minority Students: The Struggle for Power and Position in a Graduate Professional Writing Program

Susan Popham, Indiana University Southeast

Abstract. To meet the complex realities of our contemporary society, academic programs in technical writing should diligently examine the current lack of African-American participation and explore possible ways in which programs may be marketed, revised, and shaped to meet the expectations and needs of potential African-American students. This study used individual interviews with five African-American women while they were pursuing graduate degrees (MA and PhD) in a professional writing program. Drawing on a theoretical framework of positionality and agency, this article describes in detail how these women faced complex challenges: choosing such a degree program, learning needed academic and career skills, negotiating invisible racial difficulties, and creating support systems for themselves. The article concludes with suggestions for creating programs that may be more responsive to the challenges faced by minority graduate students, like the women in this study.

Keywords. African-American women, minority graduate students, position, agency, power, powerlessness

Towards a Participatory Action Research Model for Extending Programmatic Assessment with Industry Advisory Boards

John M. Spartz, University of Wisconsin-Stout

Julie Watts, University of Wisconsin-Stout

Abstract. As a commentary on how professional, technical, and scientific communication programs might extend traditional approaches to programmatic assessment, this article details a conceptual model for participatory action research (PAR) that draws on a combination of data sources: an industry advisory board and reflective portfolios. We also offer “proof of concept” reflections on that framework and our own intentional advisory board engagement by describing both the process and our results from PAR at the University of Wisconsin-Stout. We further describe ways that the iterative process of PAR can be, and has been, instrumental in informing program development and revision that leads to student employment in industry.

Keywords. advisory boards, program assessment, participatory action research, external stakeholders, industry, curriculum


Student-Centered Assessment Design in a Professional Writing Minor

Denise Tillery, University of Nevada, Las Vegas

Ed Nagelhout, University of Nevada, Las Vegas

Abstract. This article describes an approach to student-centered assessment design for a new minor in Professional Writing at the University of Nevada, Las Vegas. This approach creates seamless connections among courses, breaking down course barriers to promote broader community engagement in learning networks and correlating competencies, student activities, digital assets, and program assessments. To enhance both short-term and long-term course and program assessment strategies, we take a more ethnographic approach by identifying what types of texts to analyze, what features to consider within those texts, and how those features typically evolve as students acquire, use, and eventually master the variety of skills involved. Moreover, the student-centered assessment encourages a more open and process-oriented approach among students and teachers that increases learner control, learner choice, and learner independence.

Keywords. program assessment, personal learning, project development, strategies

Book Reviews

Exploding Technical Communication: Workplace Literacy Hierarchies and Their Implications for Literacy Sponsorship. Author: Dirk Remley. Baywood Publishing Company, Inc., 2014, 194 pp.

Reviewed by Geoffrey Clegg, The Pennsylvania State University


Science and the Internet: Communicating Knowledge in a Digital Age. Editors: Alan G. Gross and Jonathan Buehl

Afterword by Charles Bazerman

Baywood Publishing Company, Amityville, New York, 2016, 323 pp.

Reviewed by Ryan Eichberger, University of Minnesota