Appendix D: Courses in the social sciences

In the social sciences, new forms of generative AI (GAI) are both a topic of study and a tool for teaching and inquiry. Tools such as ChatGPT may provide new case studies for our classes and assignments. But they also touch upon the very notion of what it means to learn about society, given that these systems tend to reproduce and mimic many of the analytic strategies and methods at the heart of the social science disciplines, including literature reviews, textual analysis, and exploratory writing, as well as the collection, analysis, and interpretation of data.

Opportunities

There are several ways in which GAI offers opportunities for education in the social sciences. Given that some students have already begun to incorporate GAI tools into their study habits, finding ways to integrate GAI into courses might be a fruitful way forward for many instructors (while keeping the concerns about these tools in mind). Doing so would also help students build a potentially useful “real-world” skill. Possible ways in which GAI may be incorporated into a course or sanctioned as a way of studying include (see Appendix B for detailed case studies):

  • to summarize specific publications/books;
  • to develop paper outlines;
  • to brainstorm hypotheses;
  • to garner editing feedback;
  • to serve as an interlocutor for developing specific arguments.

Instructors can also go beyond merely updating assignments to incorporate GAIs by using them pedagogically to strengthen metacognitive strategies, such as critical thinking and careful evaluation of source materials. This might involve asking students to critically evaluate GAI-generated content, reflecting on the quality of its ideas, feedback, outlines, and so on.

Concerns

The widespread use of GAI raises a number of concerns specific (though not necessarily exclusive) to the social sciences. Most importantly, these systems have been shown to be unreliable as sources of analysis and information, often asserting non-existent facts, misattributing ideas, or inventing sources that may appear deceptively authentic. Therefore, using these tools effectively for learning requires a deep understanding of their strengths and limitations, not unlike the situation students and instructors faced in the early days of Google Search or Wikipedia.

The most pressing concerns arguably pertain to out-of-class activities and assignments. Many of the traditional social science methods and techniques focus on the analysis, interpretation, and production of text, tasks that tools like ChatGPT are claimed to do well but, in practice, often perform unreliably. In addition to the challenges of accuracy and reliability already mentioned, concerns include:

  • the loss of writing as a method and a way of thinking in its own right, including the devaluation of seemingly “tedious” work of reading, writing, and rewriting;
  • the reification of existing problems in academic scholarship, such as the uncritical adoption of biased citation practices, clunky prose, and formulaic methods;
  • the possibility of students taking “shortcuts,” especially under time pressure, and the limited ability of instructors to verify original work;
  • the blurring of authorship in exercises and assignments, especially when assistive devices are heavily used in the production of essays, short answers, and data analysis;
  • the loss of critical capacities when confronted with elegantly written but methodologically flawed AI-generated reviews of literature and data.

If students passively rely on GAI merely to generate literature reviews or other written content, they risk losing the opportunity to build crucial research and analytical skills.

Recommendations

Individual instructors will need to work out if and how they might incorporate GAI use into their courses, ideally emphasizing critical use of GAIs given their limitations at any given point in time. If a prohibitive approach is adopted, assignments and prelims that allow computer use might still be possible if the instructor takes advantage of GAI limitations. For example, asking students to provide their own reflections on class materials or to integrate insights from different lectures is likely to make GAI less useful. A potential caveat, though, is that such limitations are moving targets as GAI improves over time. Instructors will need to keep abreast of such changes to prevent potential academic integrity violations.

Integrating GAI into our pedagogy via classroom exercises and discussions might address some of the concerns above, helping us to emphasize thinking and writing as a method and process, and not merely an output. Given the speed of GAI development and the proliferation of more specialized applications, courses that incorporate GAI use will likely need to be routinely updated. Clear and explicit instructions about attribution of GAI use will need to be in place.

Students’ knowledge of the kinds of data used in specific social science areas, and of how such data is interpreted in light of theories, is often the focus of in-class evaluations or homework. In this context, assignments can be designed to encourage students to engage critically with GAIs. Given the fast pace of GAI development, it might be prudent to have students always note the date of their responses on such assignments.
