This post lists the leading Generative AI tools that produce outputs based on natural language text prompts. This page is inspired by @aaronsiim’s Generative AI list, but focuses on the text-based tools and provides accessible links to them.
Text-based tools
Text-to-audio
These tools turn text into audio using AI to produce more human-like speech, including tone and inflexion.
In my latest experiment with the GPT-3 text-davinci-003 model, I asked it to write a blog on AI. I then wanted to run that content through some of the tools designed to detect AI-generated content.
Step 1: Adjusting the AI model’s settings
Temperature: 1 (This controls randomness. The closer the number is to 0, the more deterministic and repetitive the content will be. I wanted this as far away from 0 as it would accept.)
Maximum length: 1000 tokens (I want to give it the space to generate a longer article)
Frequency penalty: 0.5 (Decreases likelihood of a repeated line)
Presence penalty: 0.5 (Increases likelihood of talking about new topics)
Best of: 3 (This will run the completion three times server side and display the best)
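For anyone who wants to reproduce this setup outside the Playground, here is a minimal sketch of those settings as API parameters. It assumes the (now legacy, pre-1.0) OpenAI Python library and the Completions endpoint that text-davinci-003 used at the time; the API key placeholder and the printed output handling are illustrative, not part of my original workflow.

```python
# A minimal sketch of the settings above, assuming the legacy (pre-1.0) OpenAI
# Python library and the Completions endpoint used by text-davinci-003.
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

response = openai.Completion.create(
    model="text-davinci-003",
    prompt=(
        "Write a high quality, but witty and amusing blog post that explains "
        "the benefits of artificial intelligence in the context of "
        "university-level education"
    ),
    temperature=1,          # randomness: as far from the deterministic 0 end as accepted
    max_tokens=1000,        # space to generate a longer article
    frequency_penalty=0.5,  # decrease the likelihood of repeated lines
    presence_penalty=0.5,   # increase the likelihood of new topics
    best_of=3,              # run three completions server-side, return the best
)
print(response["choices"][0]["text"])
```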
Step 2: Prompts for the AI
To produce the blog article, I sent the following prompts:
Write a high quality, but witty and amusing blog post that explains the benefits of artificial intelligence in the context of university-level education
The above copy is great. But nowhere near long enough. Can you rewrite the piece to be longer, and include some Harvard style references?
That's almost perfect. Can you make it even longer and funnier - in a dry humor kind of way.
Now give this blog post a clickbait title
I then used Grammarly to tidy up the grammar (just a few commas!).
Step 3: The final result – AI output:
Discover the Surprising Benefits of AI in University-Level Education!
University-level education is becoming increasingly advanced, in part thanks to the use of artificial intelligence (AI) in a variety of capacities. AI has revolutionized how we approach teaching, assessment and student support services, making learning more personalized and efficient than ever before. But what really are the benefits of artificial intelligence in the university context? Well, let me explain…
To begin with, AI technology can be used to assess student work quickly and accurately by offering automatic marking tools that give instant feedback on assignments (Lee & Voyer, 2019). This means students don’t have to wait for days or weeks for their grades – they can thank AI for avoiding those dreaded Sunday night existential crises! Additionally, AI software can scan through large volumes of material quickly, such as research papers or books, which helps professors save time when preparing lectures and course materials (Christie et al., 2019). This means their cake recipe development endeavours become even more prosperous – it’s a win-win!
Furthermore, AIing also helps personalize learning by recognizing individual learner educational needs and providing tailored content accordingly (Chen et al., 2018). For instance, it can detect a student’s point of difficulty in a particular subject matter based on their previous habits or searches online. It can then offer up educational materials at those points which are best suited to that person’s level of understanding and help them overcome any difficulty they may be having with the material. The best part? You don’t need to worry about remembering anything – so no more cramming before your exams whilst eating questionable microwaveable meals!
Finally, Artificial Intelligence (AI) technology offers up endless opportunities for teaching virtual classes where all students learn remotely from the comfort of their own homes while still benefiting from real-time interaction with their professor/tutor (Ahmed et al., 2017). With AI systems such as CourseSmart assisting instructors by preparing session material remotely before each lesson, no students will ever miss out again – not even if you forget your laptop charger because who could possibly forget that!?
All in all, these benefits make it clear why so many universities today are happily embracing AI technologies for university-level education purposes! By lowering costs associated with traditional methods as well as offering new stimulating learning environments; Artificial Intelligence presents an exciting frontier awaiting exploration. So why not get started today? Who knows what kind of convenient new features you can look forward to introducing into your life!?
References:
Ahmed I., Ullah K. A., Khan S., & Abbas M. (2017). Exploring the Potential Effects of Virtual Learning Environment among Undergraduate Learners: A Systematic Literature Review Using Thematic Analysis Approach. Computers & Education Journal. 112:204-222.
Chen C.-S., Chen T.-H., Chiu H.-Y., & Tsai C.-C.(2018). Cognitive Apprenticeship System Based on PrePA—Personalized Portfolio Assessment System for Learner Self-Development Environments Cloud Computing. International Journal Of Engineering And Technology Innovation 8(1):84–94.
Christie T., Wright C., Matthaei J ., Mureithi P .& Hobbs V .(2019). OERHub Final Report – Compendium Of Tools And Resources For Open Educational Resources Practitioner Support In Higher Education Institutions Around The World: Progress Report, Version 1 Harvard Graduate School Of Education
Lee S.-K& Voyer v.(2019). Soft Computing In Education: New Trends And Approaches: Learning Through Technology For All | Qsearch Ltd
NOTE: All of those references are garbage. Those sources don’t exist – though it looks convincing…
Testing the above article: Was it written by AI?
Well – we know the answer. It was, indeed, written by an AI. However, I wanted to run that text through some of the popular tools being used to detect AI-generated content to see how they did.
I can see why. Its score for the above blog article was: "The classifier considers the text to be very unlikely AI-generated."
GPTZero
GPTZero works by measuring the text’s perplexity and randomness. In this case, it did a better job than the AI Text Classifier, though it failed to detect the whole piece as written by AI: it only flagged four sentences at the top as AI-generated and nothing in the rest of the text.
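To give a feel for what "perplexity" means in practice, here is a hedged sketch of the idea using GPT-2 as a stand-in scoring model. GPTZero's actual model and thresholds are not public, so nothing here reproduces its scoring; the sentences and numbers are purely illustrative.

```python
# Illustrative perplexity calculation with GPT-2 via Hugging Face transformers.
# Lower perplexity means the text is more predictable to the model, which
# perplexity-based detectors treat as a hint that it may be machine-generated.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(ids, labels=ids).loss  # mean cross-entropy per token
    return float(torch.exp(loss))

for sentence in [
    "University-level education is becoming increasingly advanced.",
    "My cat insists on chairing every Teams call I schedule.",
]:
    print(f"{perplexity(sentence):8.1f}  {sentence}")
```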
GPT-2 Output Detector Demo
In fairness, this was designed for GPT-2, and the model used for the above article was substantially enhanced. Unsurprisingly, GPT-2 Output Detector scored the piece as real:
Writer
Next, I tried the Writer AI Content Detector. This tool is designed to identify AI text so authors can tweak their content to avoid detection. It is not intended for unfair means, but to stop search engines from penalising website page rankings: content entirely produced by an AI can see websites penalised in Google search results. Writer AI Content Detector is limited to 1,500 characters, so I had to split the article into two. Both halves were scored 100% for human-generated content…
Giant Language model Test Room (GLTR)
GLTR (glitter) “enables forensic inspection of the visual footprint of a language model on input text to detect whether a text could be real or fake”. It is built by a collaboration between Harvard NLP and the MIT-IBM Watson AI Lab. Similar to the GPT-2 Output Detector Demo, it was designed for GPT-2. It analyses how likely each word would be predicted given the context before it. It is pretty cool, as you can see word-by-word likelihood predictions for the next word:
Words highlighted in green are in the top 10 most likely predictions. Yellow words are in the top 100, and red words the top 1,000. A violet word is even less likely to be predicted. In essence, while green should be the most common colour for both AI and human-written pieces, there should be a proportionally higher number of yellow/red/violet words in something written by a human, as we are more random.
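For the curious, those colour buckets can be approximated in a few lines of code. This is a hedged sketch of the underlying idea using GPT-2 via Hugging Face transformers rather than GLTR's own code, so treat the output as illustrative only.

```python
# Approximate GLTR-style colouring: for each token, find the rank of the actual
# token within the model's predictions for that position, then bucket the rank.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def token_ranks(text: str):
    """Return (token, rank) pairs, where rank 1 = the model's top prediction
    given the preceding context."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits  # shape: (1, seq_len, vocab_size)
    pairs = []
    for pos in range(1, ids.shape[1]):
        scores = logits[0, pos - 1]          # predictions for position `pos`
        actual_id = int(ids[0, pos])
        rank = int((scores > scores[actual_id]).sum()) + 1
        pairs.append((tokenizer.decode([actual_id]), rank))
    return pairs

def bucket(rank: int) -> str:
    # GLTR-style buckets: green (top 10), yellow (top 100),
    # red (top 1,000), violet (everything else).
    if rank <= 10:
        return "green"
    if rank <= 100:
        return "yellow"
    if rank <= 1000:
        return "red"
    return "violet"

for token, rank in token_ranks("University-level education is becoming increasingly advanced."):
    print(f"{token!r:>20}  rank={rank:<6} {bucket(rank)}")
```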
In this case, I was really shocked by the output. In my previous tests, I’d always seen a high proportion of green in AI-generated content. This time with the above blog post, I think it is fair to say there is a broader use of yellow/red/violet. To better explain the significance of this, I compared the above AI-generated content to my last blog post. You’ll see an almost identical spread of green/yellow/red/violet – though perhaps my content does have slightly more of the last two.
I hadn’t expected that. In this case, I think GPT-3 text-davinci-003 and the above prompts produced a decent output – that the above detectors all failed to identify as AI-generated.
Human detection
I think it’s important to consider human detection. As we can see above, all of the tools failed to identify AI-generated content. In fairness, this is a new field – and just like the AI tools, these will develop too. In this case, human detection certainly wins. As you can see from the article, it has made up a load of references. They look convincing – and those journals, volumes and issues exist. The articles, however, do not.
Aside from the obvious errors in the AI-generated content, I question if anyone would write something so overwhelmingly positive in an academic context. Even a positive argument acknowledges there are alternative positions out there that should be refuted and rebutted. Part of this was perhaps down to my prompt, but it is still something important that the model missed in generation.
I do think it did the funny, witty part well. The line ‘This means their cake recipe development endeavours become even more prosperous – it’s a win-win!’ seems highly unlikely to have been written by an AI – and yet the model above produced it.
Conclusion: AI-generators can trick AI-detectors
As you can see, there is still a long way to go in developing GPT-3 detection – which is perhaps concerning given that GPT-4 is on the way. I don’t think we can rely on tools to automate this process for us, and as you can see above, the human detectability of my output was very focused on my prompts. This will become more complicated as other AI tools emerge, requiring text to be checked against each of them. Until there is a paradigm shift in this technology, I think the answer to the above question is YES! An AI generation tool can certainly trick an AI detection tool.
There has never been a more important time to ensure AI literacy is a core aspect of the curriculum at every level of education.
It’s an honest question. Everywhere I look, there are discussion threads, social media posts and emails from Higher Education professionals obsessing over artificial intelligence. Most of these communications focus on ChatGPT, but some acknowledge other tools exist. These ‘new developments’ in artificial intelligence have prompted a dramatic response from the education sector. It has been described as a crisis, a moral panic, an ‘end to homework‘ and a threat to higher education. I think my favourite contribution this year comes from The Mail, which announces artificial intelligence could make ‘mankind extinct’.
Okay. Some perspective is needed. But my real question is this: How did the Higher Education (HE) sector not see this coming?
Artificial Intelligence has powered your work for years
In the UK, the vast majority of Universities use Microsoft (Office) 365 and the Windows operating system. Microsoft’s Outlook powers our emails, SharePoint/OneDrive stores our files, Teams manages our collaboration, and Office keeps us productive. Since 2016, Microsoft’s ‘Office Intelligent Services‘ have seamlessly integrated artificial intelligence-powered features into our everyday working lives. For most HE practitioners, the developments in artificial intelligence have been staring us in the face. Literally. The documents we write, the slides we develop, the emails we read, and the Teams calls we make have all been enhanced by Artificial Intelligence for YEARS.
Artificial intelligence in Microsoft (Office) 365:
Read aloud has turned text-to-speech, enhanced to use tone and inflexion.
Dictate has enabled speech-to-text, allowing people to talk instead of type.
Optical Character Recognition has helped turn image-based text into readable characters.
Presenter Coach has analyzed people’s speech, language and body language to deliver real-time presentation feedback in PowerPoint.
Slide Designer has taken draft slides and automatically added design elements and images to make slides more effective.
Accessibility Checker has allowed the automatic generation of ALT text for images, using computer vision.
Microsoft Viva has provided detailed insights: reading your emails to identify unfinished tasks and checking your calendar to provide useful documents for meetings – in real-time.
Subtitles and Transcription have enabled PowerPoint and Teams to provide real-time subtitles for presentations, calls and recordings.
Excel has offered enhanced chart types (i.e. Maps) and real-time, streamed data (i.e. Stocks).
Editor has offered enhanced spelling and grammar advice, and has extended to use text prediction to save time when writing.
Translate has offered real-time translation from text, images and speech across up to 100 languages (and variants).
Scheduler has coordinated meetings between people – and even booked rooms.
Natural language queries in Excel have allowed people to use questions, not formulas.
Search has been enhanced with AI when using Bing.com.
The examples above are just workplace, education and consumer applications. In industry, Microsoft-powered AI has been detecting faces, monitoring crops, enhancing video games, fighting fraud and detecting faults across hundreds of sectors. I can understand people not being aware of some of these applications – but the stuff listed above has been right in front of our eyes.
How can any of these artificial intelligence developments be a surprise?
So. Reflecting on the list above, ‘Intelligent Services’ have supported reading and writing across the Microsoft (Office) 365 platform for over eight years(!!!). If you’ve been using Microsoft Office productivity applications like Outlook, Word and PowerPoint – I cannot understand how ChatGPT can be a surprise. Office applications have started:
correcting your writing and predicting what you will say,
reading your emails to manage your diary and tasks,
listening to you, so you don’t need to write,
automatically making things accessible with subtitling and computer vision.
The list goes on. If artificial intelligence has been doing all this for years – how is ChatGPT such a leap?
I can understand how ChatGPT feels like a significant step up from previous chatbots. But I don’t see how it can be all that surprising when we reflect on those daily developments and how artificial intelligence has slowly become part of the everyday. It isn’t just at work or in education. Your last test at the hospital might have been screened by artificial intelligence. Every time you make a purchase, the transactions are scrutinised by artificial intelligence for anomalies. It really is everywhere. I get how the quality of written response is shocking – but given what we’ve seen happen in Microsoft (Office) 365 over the last few years – I don’t think we can call it a surprise.
Why the last-minute response?
I honestly do not know the answer to this question. The radical potential of artificial intelligence has been staring everyone in the face for years. Every email. Every document. Every Teams call. Every PowerPoint. Artificial intelligence has been prompting, pushing, helping and enhancing for years. How can ChatGPT be such a surprise? I am absolutely shocked that schools, colleges, and universities are so late in reacting to the challenge artificial intelligence poses to traditional assessment. This should not be a surprise. Not at all. I cannot understand how future scanning and business planning did not identify this as part of long-term strategies. Emergency planning and task groups should not be necessary! Educational policies should have been prepared years ago.
But they weren’t.
As such, the kneejerk reaction has been to ban artificial intelligence in many educational establishments. Given the circumstances – this is probably the right answer. For now. In future, I urge educational leaders to reconsider artificial intelligence and look at how it can be used alongside the curriculum.
Artificial intelligence in the curriculum
Our students will be leaving the educational world to join workplaces that use artificial intelligence. It might not even be labelled as such – but it will be there in the background. Microsoft (Office) 365, discussed in this very post, is a great example of this in application. As such, we need to equip our students for this. Not just the practicalities of using artificial intelligence – but the difficult questions around ethics and responsible use too. These skills cannot be developed under a ban.
Artificial intelligence reproduces the biases of society. Our students must be equipped to challenge this.
Artificial intelligence uses machine learning – but this is often trained on a wide range of information, including high-quality, questionable and poor-quality sources. Students need to be able to tell fact from fiction.
Artificial intelligence supports accessibility: it can translate, see and write for anyone who needs support with it. While not perfect, those features cannot be turned off.
It’s fair to say – the news isn’t all doom and gloom. It’s been nice to see some new articles discussing how artificial intelligence won’t hinder creativity, will help schools teach skills for university and how it can be integrated into teaching. It is also interesting to see reflections on the OpenAI website (ChatGPT’s creator) about how it can support streamlined and personalised teaching – amongst many opportunities and threats.
The last word – The future of Higher Education with ChatGPT
I couldn’t help but close this post by handing over the conclusion to ChatGPT. I asked it to tell me the best-case scenario for the inclusion of artificial intelligence in university education.
The best-case scenario
In the best-case scenario, artificial intelligence (AI) will have a positive impact on university education. AI will help to improve the quality and efficiency of education by providing students with personalized learning experiences. AI-driven tutoring systems will be able to analyze a student’s performance and give immediate feedback and recommendations for improvement. AI can also be used to create adaptive learning systems that adjust the difficulty of tasks based on a student’s progress, allowing students to move at their own pace and make the most of their learning experience.
AI can also be used to automate administrative tasks, such as grading, making it easier for professors to focus on teaching and providing students with the best possible learning experience. AI-enabled analytics can be used to identify students at risk of dropping out, so appropriate interventions can be put in place to increase their chances of success. Finally, AI can be used to offer more engaging learning experiences, such as interactive simulations and augmented reality. All of this will enable universities to provide a higher quality of education while reducing costs and increasing student satisfaction.
(ChatGPT, 2023 – Published by OpenAI)
…and the worst-case scenario?
In the worst-case scenario, artificial intelligence (AI) could have a detrimental impact on university education. AI-driven tutoring systems could be overly rigid, unable to adapt to individual students’ needs, and lead to a one-size-fits-all approach to learning. AI-enabled analytics could be used to track student progress and create an environment of surveillance and control, where students feel that their every move is being monitored and judged. In addition, AI-driven automation of administrative tasks could lead to job losses, as universities reduce their staff numbers in favor of automated systems. This could lead to a decrease in the quality of education, as universities focus on cutting costs rather than on providing students with the best possible learning experience. Finally, AI-enabled simulations and augmented reality could lead to a decrease in critical thinking and problem-solving skills, as students become reliant on pre-programmed systems. All of this could lead to a decrease in student satisfaction and an overall decline in the quality of university education.
(ChatGPT, 2023 – Published by OpenAI)
Okay – I had a bit of fun with the conclusion. But ChatGPT makes a valid point… there is a lot of potential disruption – good or bad – coming to HE. The question still remains:
How did most HE practitioners not see this coming?
This blog post shares my latest conference poster, titled Learning Development in a Time of Disruption. The poster presents the research I have undertaken for my Postgraduate Certificate in Academic Practice (PCAP). The full version of the poster is hosted on the National Teaching Repository and can be accessed below:
Abstract
The Covid-19 Pandemic had (and continues to have) a significant, worldwide impact on Higher Education. This research project analysed a special issue of the Journal of Learning Development in Higher Education (JLDHE) to identify how third space professionals respond to this challenge.
The special issue of JLDHE contained 102 peer-reviewed reflections, identifying numerous challenges and responses to teaching in Covid-19. All reflections were analysed using structural, topic and thematic coding to identify common responses and challenges to pandemic teaching.
The findings of this study have identified a range of individual and shared challenges for both students and third space professionals. One of the core findings relates to the diversity of responses that have been designed to meet these challenges, with over 100 distinct pedagogic and technical solutions to pandemic teaching. From these, five core themes have emerged: emergency remote teaching; reflective practice and evaluation; pedagogy and technology support; collaboration and shared practice; and, course design for the long-term.
There are four core implications for future practice. Firstly, it is important to develop flexible policies and procedures to allow practice to develop in times of disruption. Secondly, rich educational research and case studies can support innovative and adaptive practices in times of disruption. Thirdly, university staff need support and training to ensure online pedagogies and technology are supported. Finally, reflective practice and evaluation are required to ensure continuous monitoring and improvement.
Keywords: Covid-19, Higher Education, pandemic, teaching, third space professionals
As part of undertaking my Postgraduate Certificate in Academic Practice, I had to produce a Teaching Philosophy Statement. I’m drawn to the approach the University of San Diego (2023) takes to introduce the Teaching Philosophy Statement, which I think sets the tone well:
The life of a teacher is an extremely busy one. From early morning until long after dark, teachers dedicate the better part of their day to their students. Amid the lesson planning, the snack breaks, the recess duty, grading and the myriad other daily tasks, it can be easy to lose sight of the why of teaching.
Why are you drawn to the classroom, and what is it about your love of teaching that makes it a fulfilling career? What’s the overarching philosophy that guides your teaching practice? Even on the busiest school days, every teacher should be able to explain their “why” by returning to their teaching philosophy.
(University of San Diego, 2023)
A Teaching Philosophy Statement is absolutely something all Higher Education practitioners can reflect upon. It is not just the realm of academics. A teaching philosophy statement sets out core beliefs about the purpose of teaching, describes an individual’s approach and justifies why they take that approach. I found it a thoroughly enjoyable activity!
Lee – why share your Teaching Philosophy Statement now?
This is a good question! After all, I’ve technically left Learning Development. But, I have two very good reasons for sharing this now.
I want to call on Learning Developers to write their Teaching Philosophy Statements and share them. Do it now! This will provide a rich discussion about the teaching philosophy of Learning Development. It will reflect the diversity of the profession, and allow further discussion around the values established by ALDinHE and how they are applied in the profession.
I must acknowledge my teaching philosophy is changing. I am now a Lecturer in Education Studies. I want to share my Teaching Philosophy Statement as it stood a year ago. I promise to revisit this in another blog post and update my statement for my new context. It will provide an opportunity to reflect on any similarities and differences. I think this is an exciting way to continue reflecting on that transition from third space professional towards academic.
This is why my post is titled the ‘teaching philosophies’ of Learning Development. I think we need to acknowledge these statements will be numerous and diverse – just like the profession. Learning Development is a profession I still very much care about. While I may now work as a Lecturer in Education Studies, a core part of my scholarship and research will be dedicated to Learning Development. It’s why I am still involved closely with ALDinHE – and am a member of LearnHigher and the JLDHE Editorial Board.
So! Here it is – my teaching philosophy from my time as a Learning Developer…
My Teaching Philosophy Statement (2022)
My aspirations
As a Learning Developer, I feel my aspirations are very driven by my profession. Hilsdon (2011:14) defines Learning Development as the “teaching, tutoring, research, design and production of learning materials, as well as involvement in staff development, policy-making and other consultative activities” in support of student academic success. However, I believe student success can never be built on dependency, so for me, effective Learning Development must also build student independence and self-efficacy. As much as I aspire to help students, my true goal is for students to become self-sufficient so they do not need me.
Philosophical underpinning
The signature pedagogy of learning development is arguably academic literacies (Lea & Street, 1998; 2006). This approach acknowledges that writing, learning, and other academic practices are not isolated ‘skills’, but are complicated literacies situated within disciplinary discourses and power frameworks (Lea & Street, 1998). Acknowledging these complexities is vital for me to identify and confront what is not taught or is assumed – constituents of the null (Kazemi et al., 2020) and hidden curriculum (Hinchcliffe, 2020). As a Learning Developer, I work within the hidden curriculum to expose and challenge it. My role as a third space professional (see: McIntosh & Nutt, 2022) is very much an enabler.
Teaching methods and assessment
The most established teaching method in learning development is the one-to-one appointment. As learning is both a complicated and individual process, such appointments allow students to engage in these complexities with full acknowledgement of what they already know and understand. As represented by Webster (2018), both students and Learning Developers bring knowledge to such appointments and operate with different levels of agency. Depending on the appointment, this can frame my role as mentor, listener, teacher, and coach. I often have to informally assess students to determine how to best support them, and which of those roles I might need to take.
The most significant challenge has been scaling Learning Development beyond appointments to help more students, and there are three approaches I have taken. Firstly, there is workshop-based instruction, which allows similar principles from appointments to be applied in a group situation, extending capacity. Second is the creation of self-support resources like University of Hull (2021) SkillsGuides. These allow students to access help at a time that is convenient. Finally, there is ‘integrated practice’ which involves directly teaching in timetabled sessions as part of the curriculum. Integrated practice is arguably the best and most inclusive way to increase access to Learning Development.
While I do not set or mark student work, I do have a role in assessment. One common task involves helping students prepare for an assessment set as part of their course. This can be as simple as demystifying the essay in an appointment or teaching a whole class the principles of public communication to help them write a wiki article. I also support students with formative feedback to help them develop their response to an assessment, or provide them summative feedback on a previously marked piece of work to help them develop further. For me, this is all about supporting student learning.
Inclusivity at the heart
Inclusivity is a core value of my practice. I have worked hard to promote inclusive practices, helping ensure students can be successful no matter their background, neurodiversity, or protected characteristics. This goes beyond legal obligation – it is simply the only ethical approach to teaching. Furthermore, I aspire to uphold the ALDinHE[1] (2018) Manifesto for Learning Development, which strives to increase participation in HE and legitimise different forms of student knowledge. As I work across all disciplines taught at Hull, I need to respect different approaches to knowledge too.
Looking to the future
For me, the PCAP is an opportunity to improve my teaching further. I now have over 10 years of experience working in HE, and I still have things to learn. I’m proud of my Senior Fellowship with the HEA and fully intend to work towards Principal Fellowship in the future. I’m also keen to maintain my professional accreditations with ALDinHE and Microsoft Education, as well as gain my accreditation with the Association for Learning Technology (ALT).
[1] Association for Learning Development in Higher Education – the professional body for learning developers
Writing your own Teaching Philosophy Statement
As I shared in my introduction, I’d love to see more Teaching Philosophy Statements shared from Learning Developers. If you want to know where to start, check out this guide: What Is a Teaching Philosophy? Examples and Prompts. Please share yours and pop the link in the comments section below.
Hilsdon, J. (2011) What is learning development?, in Hartley, P., Hilsdon, J., Keenan, C., Sinfield, S. & Verity, M. (eds), Learning development in higher education. Basingstoke: Palgrave Macmillan, 13-27.
Hinchcliffe, T. (ed), (2020) The Hidden Curriculum of Higher Education. Advance HE.
Kazemi, S., Ashraf, H., Motallebzadeh, K. & Zeraatpishe, M. (2020) Development and validation of a null curriculum questionnaire focusing on 21st century skills using the Rasch model. Cogent Education, 7(1), 1736849.
Laurillard, D. (2002) Rethinking university teaching: A conversational framework for the effective use of learning technologies. London: Routledge.
Lea, M. & Street, B. (1998) Student writing in higher education: An academic literacies approach. Studies in Higher Education, 23(2), 157-172.
Lea, M. & Street, B. (2006) The “Academic Literacies” Model: Theory and Applications. Theory into Practice, 45, 368-377.
McIntosh, E. & Nutt, D. (eds) (2022) The Impact of the Integrated Practitioner in Higher Education: Studies in Third Space Professionalism. Oxon: Taylor & Francis.
Webster, H. (2018) How to implement effective 1:1 tutorials, Association of Learning Development in Higher Education Annual Conference. University of Leicester, 26th – 28th March. Leicester: Association of Learning Development in Higher Education.
I’m delighted to be this week’s guest on Cup of TEA – the Teaching Excellence Academy’s official podcast, exploring learning, teaching and assessment at the University of Hull.
Season 1 – Episode 3: Lee Fallin
This week we’re speaking to Lee Fallin from the School of Education. We talked about a range of topics including his research on learning spaces and also his recent transition from professional services to academia. Lee is really passionate about education and learning development and we think this comes across brilliantly in this really interesting chat. Lee has also kindly provided a list of related links to topics we discuss which can be found below.
My contribution aside, I highly recommend you check out the Cup of TEA podcast and bookmark it in your favourite app. In week 1, my colleague Kelly Dockerty and student Jess Gleisinger discussed PBL and authentic experiences, with week 2 showcasing Liz Wells and her transition from clinical practice to academia. Both are a great listen!
There have been some interesting discussions around the use of Artificial Intelligence (AI) – and the fear of how it will impact assessment in HE. I will reflect on this myself in time, but for now, I want to muse on a possible use case. I’ve been following an email thread on AI in higher education via a publicly accessible JiscMail list. One participant had acknowledged how sprawling the debate had become – and asked if ChatGPT could lend a hand. I decided to give this a go using OpenAI‘s text-davinci-003 model (and it should be noted that this post’s feature image was drawn by OpenAI’s DALL·E 2)!
First, I downloaded the content text from 30 publicly accessible emails on the topic of ChatGPT. Second, I removed all personally identifiable information (i.e. names, emails, links, institutions) from the text. This was an ethical consideration: while the email text is in the public domain via jiscmail.ac.uk, the tool I used is allowed to retain submitted text to train the model further. Third, I split the text into two separate queries to fit within the 4,000-token limit of the text-davinci-003 model. Finally, I took the two text-davinci-003 outputs and asked the model to summarise the discussion from each text sample and re-write it for a knowledgeable audience.
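For anyone who wants to try something similar, the sketch below captures the shape of that manual process in code. It assumes the legacy (pre-1.0) OpenAI Python library, a hypothetical jiscmail_thread.txt file, and crude regex-based anonymisation plus character-based chunking – none of which are exactly what I did by hand.

```python
# A hedged sketch of the process described above: strip obvious personal
# details, split the thread to stay inside the model's context window, then
# ask text-davinci-003 for a summary of each chunk.
import re
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

def anonymise(text: str) -> str:
    """Crude removal of emails and URLs; names and institutions were removed by hand."""
    text = re.sub(r"\S+@\S+", "[email removed]", text)
    text = re.sub(r"https?://\S+", "[link removed]", text)
    return text

def chunk(text: str, max_chars: int = 8000) -> list[str]:
    """Split on paragraph boundaries so each request fits the 4,000-token limit
    (roughly 4 characters per token, leaving headroom for the summary)."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) > max_chars:
            chunks.append(current)
            current = ""
        current += para + "\n\n"
    if current:
        chunks.append(current)
    return chunks

def summarise(text: str) -> str:
    prompt = ("Summarise the following email discussion for a knowledgeable "
              "audience:\n\n" + text + "\n\nSummary:")
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        temperature=0.3,
        max_tokens=400,
    )
    return response["choices"][0]["text"].strip()

thread_text = anonymise(open("jiscmail_thread.txt").read())  # hypothetical file
summaries = [summarise(c) for c in chunk(thread_text)]
print("\n\n".join(summaries))
```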
Artificial Intelligence summary of the discussion:
This email thread is discussing a variety of topics related to ChatGPT, a new technology, in higher education assessment. Ideas have been suggested to mitigate any short-term disruption to assessment this semester, such as informing teachers of the technology, revising current assessment regulations, teaching students about academic misconduct, incorporating the model into assessment design, encouraging the greater use of oral-based assessment, group project work, and classroom-based assessment, using browser lockdown capabilities, testing the model against a variety of tasks, and promoting assessment (re)design that requires the creation of multi-media artefacts. There has also been discussion about curating resources on the topic, as well as different types of assessment that could be used, such as handwritten assignments and orals, and the ethics of the technology, with suggestions to use Turnitin to cross-reference essay submissions and to teach students to question the production and consumption of their cultural landscape. Additionally, it has been suggested to have an event which focuses on the implications of GPT for educational development and developers, as well as a SIG on AI and Academic Development. It was also noted that OpenAI (the company behind ChatGPT) is likely to become an organisation within the Microsoft ‘empire’, and that there is software that can detect ChatGPT-generated text.
text-davinci-003 summary of academic development discussion
Evaluating text-davinci-003’s output
…and do you know what? It isn’t a bad summary. It has missed some of the nuance and detail – and obviously the weight of some arguments over others. Some of the more powerful arguments (that were buried in the thread) were lost. For me, the summary was much poorer for excluding some of the deeper ethical concerns, for example, OpenAI Used Kenyan Workers on Less Than $2 Per Hour.
Yet, for someone who is ‘time-poor’, such a summary presents an exciting way to engage with significant chunks of information in little time. This summary took about five minutes – including both my time editing the text and that of text-davinci-003. This cost around 20 USD cents ($0.20). The original text was computed to take 18 minutes to read – the refined output was 60 seconds. Paying 20 cents to save 17 minutes of reading… not a bad cost for some.
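As a rough back-of-the-envelope check, the numbers stack up something like the sketch below. The per-token rate and the total token count are assumptions for illustration (text-davinci-003 was advertised at roughly $0.02 per 1,000 tokens around this time), not figures from my actual bill.

```python
# Back-of-the-envelope cost/time check. The rate and token count are
# assumptions for illustration, not billing figures from my account.
rate_per_1k_tokens = 0.02   # assumed USD price for text-davinci-003
tokens_used = 10_000        # rough guess: two ~4,000-token prompts plus completions
cost = tokens_used / 1000 * rate_per_1k_tokens
minutes_saved = 18 - 1      # an 18-minute read reduced to roughly a 1-minute summary
print(f"~${cost:.2f} to save ~{minutes_saved} minutes of reading")
```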
Caution is, however, needed. When I asked text-davinci-003 to summarise the above for a 3rd grader – it was quite off the mark!
ChatGPT is a new technology that can help teachers assess students. It can help teachers check if students are cheating and can help them create different types of assessments. It can also help teachers create projects and activities that involve creating multi-media artefacts. There are also some ethical considerations to think about when using this technology.
text-davinci-003 3rd grader summary of academic development discussion
At this point, it has over-simplified the discussion to the point it is incorrect. At no point was ChatGPT pitched as something to help assess students or help create multi-media artefacts.
It was quite right on the ethical mark, though. I took great care to only use public-domain information with no personal information in this experiment. But even then – I imagine some people would feel uncomfortable about this.
Artificial Intelligence summaries as an automated process
This was a manual process – but I wonder how far off an automated future could be. If you use Microsoft 365/Outlook for email, you may have received Microsoft Viva briefings. These emails bring together data from across your Microsoft 365 account (including your email) to provide connections and insights. This all uses a form of Artificial Intelligence to provide things like helpful reminders on outstanding actions, advice on how to manage your time in the context of your diary and information it feels might be helpful. Given Microsoft are rumoured to want OpenAI’s technology further integrated into Windows and Microsoft 365 – perhaps email summaries (on this scale) would be a natural extension of Microsoft Viva.
What are your thoughts? Let me know in the comments:
Last year I had the great pleasure of joining the Editorial Board of the Journal of Learning Development in Higher Education (JLDHE). This started as a guest editor for the Special Edition, ALDinHE Conference Proceedings and Reflections. For this special issue I, alongside a team of other guest editors, had the opportunity to weave together the peer-reviewed conference abstract with community and author reflections. Based on my experience as a guest, I was excited to apply and interview for a permanent editor role. As you can safely guess from the title of this post – I was successful 😁.
Become an editor
I’ve been an editor for a handful of months. I’m still early on in my journey, but have already found it very rewarding. It’s a fantastic opportunity to support and mentor new authors, alongside helping more established writers get their work into press. The biggest surprise, for me, has to be the dedication and generosity of our peer reviewers. I am often overwhelmed by the interest in reviewing, and find it challenging to choose whom to allocate.
It was stupid of me to be surprised. After all, over the last couple of years, it had become clear to me that peer review was a form of community building. This particularly resonated for the Compendium of Innovative Practice where, as a reviewer, I had myself really felt part of something. This is something that I feel is very special about JLDHE. It is a community – an argument well established in Alicja’s contribution to the conference proceedings.
Teamwork = dream work
Perhaps the single most rewarding part of being on the Editorial Board is the opportunity to work with an amazing set of colleagues. I have thoroughly enjoyed learning from them all, and getting to know them more through our work. I have to give a HUGE shout-out to Gita Sedghi, who has been my mentor. Gita has been the one who has answered my 200-300 questions about the journal, editing and our journal system. (Thanks Gita!).
Our Editorial Board is a great team (yes – I’m biased). We meet monthly to ensure everything is on track for upcoming issues, and there is a lot of work to keep on top of between these meetings. I’ve found the pace and challenge really refreshing. As you can see above – it’s also a rewarding role 😁.
The year ahead
Today our Editorial Board Winter Residential drew to a close – and so I write this post sat on a train back to Hull after what has been an intense couple of days working on journal strategy and plans. It’s got me really excited for the year ahead! I also got a chance to use the LEGO that had been sat in my suitcase (a staple for any work trip away!).
I’m really looking forward to developing in my role further. I’m also happy for the opportunity to take on more responsibility, and start to focus on different aspects of the journal’s work. Stay tuned! I really look forward to sharing more about this journey over the coming year.
Get involved!
I hope this post is encouraging for anyone who is thinking about getting more involved in journals – especially JLDHE. I recommend peer review as a great place to start getting involved (it’s where I started!). It helps you contribute to a journal – and get a feel for what is within its scope. JLDHE is always looking for more reviewers, so watch the LDHEN and SEDA JiscMail networks for future calls to review!
Widening Participation is an important topic, and something cemented into Higher Education Policy through Access and Participation Agreements. Yesterday I had the great pleasure of attending the University of Hull’s inaugural Widening Participation conference. The main theme and question of the conference asked: ‘whose job is Widening Participation anyway?’.
Widening Participation is something I am passionate about. It is about ensuring someone’s circumstances do not impact their ability to enrol at a University and be successful. The end result should see more students enrolling from under-represented groups. This includes, for example, care leavers, students from low-participation postcodes, disabled students, mature students, and students from some ethnic groups. For social justice – it is an absolute no-brainer. While the crisis around student fees and the option for Universities to raise them from £6k to £9k has been disastrous for some, one good consequence was the requirement for institutions charging over the basic fee (£6k) to have an Access and Participation Plan.
All providers that are required to have an Access and Participation Plan need to ensure their plan addresses several key points. The plan needs to show how the provider will raise participation from under-represented groups, including its ambition for change, the plans for that change and the targets that have been set. It also needs to be clear how the plan will be delivered and what investment it will take. While £9k fees will put off some prospective students (even though the repayments are more affordable than under the old scheme), one good outcome was the absolute requirement to address access for any provider charging a higher rate.
Widening Participation: My journey to university
It is fair to say that Widening Participation is something that is personal to me. Technically, my own background would have been widening participation. While the postcode I lived in had high rates of participation, no one else in my family had ever gone to University – no one could ‘sell it to me’ or tell me what it is like. My mum was also severely disabled and out of work. While my dad did work as a manager, he had worked his way through the ranks to get there – though at this point he no longer lived in the family home. I was fortunate that my school raised those university aspirations, and my teachers helped me understand the importance of a degree and the experience of studying for it.
It’s also fair to say I’ve gone beyond that base expectation. My postgraduate certificates, my job in higher education and my doctorate — they are all things that people from my background did not do (certainly at the time I started out).
To return to the question – whose widening participation is it? For me, in my experience, it was MY widening participation. Obviously, the question is broader than personal experience – but I wanted to reflect on this for one reason. If I had anything less than an absolute commitment to widening participation, I would be pulling the drawbridge up to prevent people like me from having the same success. Here is where the situation can be insidious. Imagine if I were from a privileged background and did not fight to widen participation in Higher Education. Well… I’d be working to pull that drawbridge up to stop people not like me from being successful. On that reflection – it is appropriate to fully answer the question:
Whose Widening Participation is it?
Everyone’s.
That is because access and participation is fundamentally an issue of equality, diversity and inclusion.
Social justice requires progress in this area. The right to Higher Education should not be based on where someone is born, or what needs they have. It should be based on ensuring everyone can reach their potential. As such – everyone working in Higher Education has a duty to Widening Participation, no matter what their own background is.
And if we fail? Well. Not only are we not widening participation, but we are not being equal, inclusive and supportive of diversity.
This somehow failed to publish in May – but better late than never, eh?
Earlier this week I had the pleasure of attending a Flood Resilience Workshop designed to help inform best practices in flood recovery. One of the distinctive aspects of this session was that the substantive part of it was based around a board game. The Flood Recovery Game was built to facilitate dialogue with flood recovery stakeholders. With representatives from the Environment Agency, major insurers, Fire and Rescue, academics and more — it definitely delivered.
The Flood Recovery Game is a ‘serious game’ – serious as it is a learning and debate tool. There were several modes of play, all getting players to consider different scenarios and how they would deploy scarce resources. Money, emergency services, volunteers, council workers and recovery workers could be deployed to address the scenario. As the game developed, those resources became more scarce — and were deployed in different forms.
The games begin!
For the entire morning, we worked through The Flood Recovery Game in groups. The game started off with quite an idealistic response. Resources were fairly unlimited — and you could deploy what you wanted. It reminded me a bit of those card games where all players pitch a response to a given scenario. The winner is chosen by the rest of the table, voting on their preferred response. The game starts to ramp up in difficulty when resources become ‘spent’ and you start to earn random resources back. At this point it’s important to collaborate, especially when you have an uneven hand. At one point, I had lots of money and workers – but no council workers or emergency services – so I just had to support the plans of others, to their success!
I think my favourite modes of play came later in the game. At that stage, resources become finite and you don’t get them back (even at random)! At these end stages of the game, you get to role play one of the key stakeholders – the council, business, insurers, flood groups, NGOs, emergency workers and others. This is played on the second side of the board (see below) and gave much more opportunity for bartering resources.
The Flood Recovery Game
Reflecting on the game
I really enjoyed working with the others on my table. I was along to bring an ‘educational perspective’ – I’m still not 100% sure what that meant — but my geographical background and experience with local politics really helped me get stuck in. I even won the first part of the game (😉). Collaborating with a student, an academic, a representative from Fire and Rescue and an insurer made for really interesting dialogue. Some of the participants noted that it was an excellent conversation starter and wanted to try it outside the Humberside region (we’re not bad for flood awareness apparently!).
One of the more useful aspects of this game was the opportunity to identify gaps – and perhaps, opportunities (see below). For example, our group identified the potential to leverage Fire and Rescue data to help insurance companies prioritise their response to vulnerable customers. With some legal consideration or consent – that data could make all the difference in a disaster. There is even potential for that idea to leave the session — an example of how the session also worked as a form of knowledge exchange. It allowed academic knowledge to breach the walls of the university and reach a place where it could impact people, business and government.
GAP! Identifying gaps
Games in my practice
Developing games like this takes a lot of time – and money too! Fancy printed boxes, boards and game cards don’t come cheap. For this reason, I’ve never had the inclination to develop something like this, even though I’ve always believed such games to be pedagogically effective (thinking of teaching specifically here). The quality of the discussion from playing the Flood Recovery Game, however, has made me consider their potential for teaching critical thinking. I’ve seen many structured approaches to debate, teamwork and so on in a business context — there is clearly educational potential too. I’d certainly like to see a criticality game – it may provide an alternative approach to just another workshop.